Useful ragraph usage
Carter Bullard
carter at qosient.com
Mon Nov 6 13:53:50 EST 2006
Hey Jim,
One thing to think about: ragraph() uses a program called rabins() to get
the data structured and aligned to the time bins that you specify for the
graph. It builds intermediate clustered data sets, which perl then uses to
build the arrays that feed into rrdgraph(). A daily graph, rendered in say
5-minute bins, generates an intermediate data file with all the data
clustered by the flow model and the 5-minute time chunks. It's much easier
to graph a week's worth of data from these intermediate data sets than from
the original data (which will take forever to go through).
So I would suggest that in your scripts you build these intermediate files,
graph from them, reuse them for your weekly graphs, and then use the weekly
intermediate files to generate your monthly graph (something like the
sketch below).
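
Roughly, that pipeline could look like the following. This is only a sketch
from memory: the exact rabins()/ragraph() option syntax may differ across
argus-clients versions (check the man pages), and the file names and
aggregation fields here are just placeholders:

   # 5-minute-binned intermediate file built from one day of raw argus data
   rabins -M time 5m -m saddr daddr proto -r argus.2006.11.06 \
          -w argus.2006.11.06.5min

   # daily graph drawn from the intermediate file rather than the raw data
   ragraph bytes -M 5m -r argus.2006.11.06.5min -w daily.png

   # weekly: re-bin the daily intermediates into coarser 1-hour bins ...
   rabins -M time 1h -m saddr daddr proto -r argus.2006.11.0[1-7].5min \
          -w argus.week45.1h

   # ... and graph from that, so the raw data never has to be reread
   ragraph bytes -M 1h -r argus.week45.1h -w weekly.png
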
This is part of the idea behind my 'graph of the week': it shows that you
get pretty good results when you work from aggregated data.
Hope this is useful, and if you want to pursue this approach, let's get the
mailing list to come up with something we can generalize.
Carter
On Nov 6, 2006, at 12:55 PM, Jim O'Gorman wrote:
> Sorry about the delay in replying - brief vacation.
>
> What you describe sounds very interesting, and would be a great
> help in my situation.
>
> Currently I am producing a series of graphs every 5 minutes that cover
> the last hour of use, and another set every hour that covers the last
> 24 hours. This is working just fine for me at this point.
>
> I have tried to produce the same graphs covering a week's worth of
> use, but it sucks down all my RAM, then the process dies, and that's
> that. Being able to just feed an RRD and then pull the graphs from it
> should allow me to generate graphs for a much longer range of time
> than I currently can.
>
> To give you an idea of the graphs I am pulling: I have a general
> usage graph, then separate graphs for the local LAN being src or dst.
> I have a couple of others that I produce based on which remote
> network is being spoken to, and a couple more that cover special
> services whose usage I want to track.
>
> Thanks again for this addition! It is great to have the ability to
> graph these arbitrary data sets that I am pulling.
>
> Thanks
> Jim
>
> On 10/26/06, carter at qosient.com <carter at qosient.com> wrote:
> Hey Jim,
> Thanks!!! One thing that I am interested in: ragraph() uses perl to
> load RRDs, we then use rrd_graph() to generate the graph, and then we
> delete all the intermediate files. There is no reason we couldn't
> just feed an RRD and leave it around, the way MRTG does. Since there
> are web-based 'on demand' graph scripts that simply load up RRDs, we
> could feed that kind of system (see the rrdtool sketch below).
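>
> Just to make that concrete, here is a minimal rrdtool sketch of the
> sort of thing we could feed. The DS/RRA layout, step, and file names
> are only illustrative assumptions, not what ragraph() actually
> creates today:
>
>    # one persistent RRD, created once, with a 5-minute step
>    rrdtool create traffic.rrd --step 300 \
>        DS:bytes:GAUGE:600:0:U \
>        RRA:AVERAGE:0.5:1:2016 RRA:AVERAGE:0.5:12:1488
>
>    # periodic updates fed from the binned argus data
>    rrdtool update traffic.rrd N:1234567
>
>    # graphs generated on demand from the RRD, MRTG-style
>    rrdtool graph weekly.png --start -1w \
>        DEF:b=traffic.rrd:bytes:AVERAGE LINE1:b#0000ff:bytes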
>
> Is this something that would be interesting?
>
> Carter
>
>
> Carter Bullard
> QoSient LLC
> 150 E. 57th Street Suite 12D
> New York, New York 10022
> +1 212 588-9133 Phone
> +1 212 588-9134 Fax
>
> -----Original Message-----
> From: "Jim O'Gorman" <jogorman at gmail.com>
> Date: Wed, 25 Oct 2006 11:42:51
> To:argus-info at lists.andrew.cmu.edu
> Subject: [ARGUS] Useful ragraph usage
>
> I have been using argus for a long time now to collect traffic data
> on various lines I have to maintain. With ragraph now, I am excited
> about finding new ways to use this collected data.
>
> Working off and on with the ragraph tool over the last couple of
> days, there seem to be many useful ways to put it to use. I am
> interested in using it to replace some other tools I use, such as
> MRTG. I am still up in the air on whether this is practical at this
> point, and am currently putting together a series of graphs to run
> over the next month to see how the data compares.
>
> Along those lines, I would be interested to see how anyone is using
> ragraph, what sort of data they are finding useful, what command
> lines, etc. If anyone has anything they wish to share, that would
> be wonderful.
>
> I have been looking at graphing the basics of the last day's/week's/
> month's traffic, along with pulling out the top 10 talkers and
> graphing them as well. Perhaps also pulling out specific networks and
> graphing them separately to watch for additional trends. That is what
> is so exciting about this tool: the flexibility to trend all of these
> arbitrary data sets. At this point, I have not found any other tool
> that allows this degree of flexibility.
>
> I have been finding the graph of the week very useful as well.
> Thank you for that.
>
> --
> Jim
> jameso at elwood.net
> jogorman at gmail.com
> http://www.elwood.net
>
>
>
> --
> Jim
> jameso at elwood.net
> jogorman at gmail.com
> http://www.elwood.net