FW: Big files.

Carter Bullard carter at qosient.com
Mon Jan 29 08:10:42 EST 2001


Hey Scott,
   Yes, "ra -r */*.gz" works very well, so you can
pump in a whole day's worth, or a whole year's worth,
of records all at one time.
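For scripted runs, the same shell glob can be built up programmatically before handing the file list to ra. A minimal sketch, assuming a hypothetical one-directory-per-day archive layout (the layout and function name are illustrative, not part of argus):

```python
import glob

def ra_for_day(archive_dir, day):
    """Collect one day's compressed argus files and build an ra
    command line, mirroring what the shell does for 'ra -r */*.gz'
    but restricted to a single day's directory.  The per-day
    directory layout is an assumption for illustration."""
    files = sorted(glob.glob("%s/%s/*.gz" % (archive_dir, day)))
    # The shell expands the glob into multiple arguments after -r;
    # we do the same here.
    return ["ra", "-r"] + files
```

The resulting list can be passed straight to something like Perl's system() or Python's subprocess, which avoids any shell quoting surprises.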

   We don't have keywords like "last week", "yesterday",
or "1 hour ago", but we could add some minimal support,
such as "-t -3h" or "-t -3d+1h".  Once we have a syntax
and semantics, I can put something together, but I'd
like to do it this week!
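To make the proposal concrete, here is one possible reading of that syntax, sketched in Python: a signed token like "-3h" is an offset from now, and "-3d+1h" is a start offset plus a duration. This is only a sketch of a semantic under discussion, not anything argus implements; the function names are hypothetical.

```python
import re
from datetime import timedelta

# Proposed units for the hypothetical "-t -3h" / "-t -3d+1h" syntax.
_UNITS = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days"}

def parse_offset(token):
    """Turn one signed token like '-3h' or '+1h' into a timedelta."""
    m = re.fullmatch(r"([+-])(\d+)([smhd])", token)
    if not m:
        raise ValueError("bad offset: %r" % token)
    sign, amount, unit = m.groups()
    delta = timedelta(**{_UNITS[unit]: int(amount)})
    return -delta if sign == "-" else delta

def parse_range(spec):
    """Split a spec like '-3d+1h' into its component offsets,
    e.g. a start offset followed by a duration."""
    tokens = re.findall(r"[+-]\d+[smhd]", spec)
    if "".join(tokens) != spec:
        raise ValueError("bad range: %r" % spec)
    return [parse_offset(t) for t in tokens]
```

Under this reading, "-t -3h" selects records from three hours ago until now, and "-t -3d+1h" selects the one-hour window starting three days ago.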

   Russell has a perl script in his perl module
Argus::Archive that is a great start on some of this
support.  The script finds argus files in the archive
that span a given time range.  I suspect that a perl
front end would be the best way to do what you want
with time, since the syntax is already done.

   What do you think?

Carter

Carter Bullard
QoSient, LLC
300 E. 56th Street, Suite 18K
New York, New York  10022

carter at qosient.com
Phone +1 212 813-9426
Fax   +1 212 813-9426




> -----Original Message-----
> From: Scott A. McIntyre [mailto:scott at xs4all.nl]
> Sent: Monday, January 29, 2001 2:13 AM
> To: Carter Bullard
> Subject: Re: Big files.
> 
> 
> >    The third process is the one that is actually
> > writing into the file.  It must have exited, due
> > to some error.  If you look in your /var/log/messages
> > file you may find a message.
> 
> Nope, sadly, there was no clue.  I did not compile LFS
> into Argus when I built it, but am going to do so now.
> This may be something you want to test for, possibly --
> if one is running 2.4 kernels, it's implemented, but
> user-space applications must be compiled with
> "-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE"...
> 
> 
> >    I process the file every hour, and some process
> > it every 10 minutes.  It tends to generate files
> > that are easily indexed and searched, rather than
> > having one huge file for the whole day.
> 
> Yes, that would be my preference as well -- I don't suppose there's a
> hidden feature of the -r toggle to ra() that will allow it to read
> multiple files at one time?
> 
> I tend to use a lot of alert-generating applications and
> go back to the argus logs to do further tallies of
> specific traffic patterns by hosts/networks, etc.  In
> other words, if I'm going to rotate more often, I need
> an easy way to go to a day's worth of data and search
> all files in one command.
> 
> And here's a really really useful idea (well, for me) --
> the ability to use GNU date --date type time structures.
> Specifically I'd love to be able to do one of two things:
> 
> o  Specify a time to search argus data records in a
> relative context from "now" (ra -t '3 hours ago') to
> find all matching records from three hours ago until
> present
> 
> o  Specify a time search range in "date" terminology.
> ra -t `date --date "9 hours ago"`-`Mon Jan 29 08:00:00
> CET 2001` or something similar.
> 
> I could be the only person interested in such things, so feel free to
> ignore.  I have written perl scripts which accomplish similar anyway.
> 
> Thanks, as always.
> 
> Scott
> 
> 
> 
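Scott's "N hours ago" idea above can be sketched without shelling out to GNU date. A minimal sketch, assuming a deliberately tiny phrase grammar (only "N <unit>s ago" is handled; the function name is hypothetical, and a real front end would defer to something like Perl's Date::Manip):

```python
import re
from datetime import datetime, timedelta

_UNITS = {"second": "seconds", "minute": "minutes",
          "hour": "hours", "day": "days"}

def ago(phrase, now=None):
    """Resolve a phrase like '3 hours ago' to an absolute datetime,
    in the spirit of GNU 'date --date'.  Only this one pattern is
    supported in this sketch."""
    m = re.fullmatch(r"(\d+)\s+(second|minute|hour|day)s?\s+ago",
                     phrase.strip())
    if not m:
        raise ValueError("unsupported phrase: %r" % phrase)
    amount, unit = int(m.group(1)), _UNITS[m.group(2)]
    if now is None:
        now = datetime.now()
    return now - timedelta(**{unit: amount})
```

The resolved datetime could then be formatted into whatever absolute -t syntax ra already accepts, which keeps the natural-language parsing entirely in the wrapper.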

