raconvert

CS Lee geek00l at gmail.com
Tue Aug 3 08:32:28 EDT 2010


hi guys,

I'm going to release the argus-to-splunk stuff - argus2splunk-v1.tar.gz

For the splunk part, these files live in the NSM/local directory:

app.conf
data/ui/views/Argus.xml
inputs.conf
props.conf
savedsearches.conf
transforms.conf
viewstates.conf

inputs.conf defines which directory to monitor for argus data
props.conf describes the argus data
transforms.conf extracts the argus data fields
savedsearches.conf contains all the reports
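As an illustration of the first of these (the monitored path matches the setup below, but the index name is my assumption, not the released file), the inputs.conf stanza would look roughly like:

```ini
# inputs.conf - watch the directory the probes rsync their CSV into
[monitor:///var/log/argus]
sourcetype = argus
index = main
```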

data/ui/views/Argus.xml contains the links to the Argus reports. I'm going
to discuss the setup here -

For the argus part, I use argus, rastream, racluster and ra to export the
data in CSV format, and rsync it to the /var/log/argus directory on the
splunk server, which splunk monitors. I will also release all the scripts I
have to make the whole thing work. Basically the setup is

Network Link <------ Argus Probe (monitor, log in the argus default format,
and export the data in CSV format) ---------- rsync at a specific time
interval -----------> Splunk Server (process the CSV data, index it, make
graphs, and generate reports)
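A minimal sketch of the probe-side export step (the hostname, paths, and field handling here are my assumptions, not the released scripts; `build_export_cmd` only assembles the pipeline string so the data flow is visible without argus installed):

```shell
#!/bin/sh
# Probe-side export sketch. racluster aggregates a closed binary argus
# file, then ra prints it as CSV with its single-character delimiter
# option (-c ,), matching the ra usage elsewhere in this thread.

SPLUNK_HOST=splunk.example.com   # hypothetical splunk server
SPLUNK_DIR=/var/log/argus        # directory splunk monitors for CSV

build_export_cmd() {
    # $1 = a closed binary argus file
    printf 'racluster -r %s -w - | ra -r - -c , > %s.csv' "$1" "$1"
}

# A cron job (or rastream's on-close handler) would then run something like:
#   eval "$(build_export_cmd /argus/data/argus.2010.08.03.08.00.00)"
#   rsync -az /argus/data/*.csv "$SPLUNK_HOST:$SPLUNK_DIR/"
```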

Okay, the reason I have most of the processing done on the sensor, and only
then send the data to the splunk server, is that splunk usually ends up
under high load if it has to process everything, and it's not wise to add
more load to the splunk server by running the argus client tools there.

If you have more argus probes, you can consider a radium setup. I will
release all the stuff together in one shot; however, my current
documentation for the setup just sucks ;)
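For the multi-probe case, radium acts as the aggregator: each probe's stream is attached as a source and re-served as one feed for the CSV export step. A rough sketch (the hostnames and exact options are my assumptions - check radium(8)):

```shell
# Attach to two remote probes and serve the merged stream on port 561,
# so a single collector feeds the export and rsync step.
radium -S probe1.example.com:561 -S probe2.example.com:561 -P 561 -d
```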

Carter and the rest of you, if you have better ideas on how to implement
the whole thing, that would be great.

Cheers!






On Fri, Jul 30, 2010 at 11:32 PM, Paul Schmehl <pschmehl_lists at tx.rr.com>wrote:

> Yep, still in edu, and still trying to rise to your level of geekiness.
>  :-)
>
> I'd be glad to test it out.  We're making increased use of argus, but
> searching the logs is time-consuming.  Being able to search in Splunk and
> locate exactly which log I need to go to would be quite helpful.
>
> We're presently storing about 15 days of logs, capturing the first 400
> bytes of every packet.  It's been quite useful.
>
>
> --On Friday, July 30, 2010 12:20:03 +0800 CS Lee <geek00l at gmail.com>
> wrote:
>
>> hi paul,
>>
>> Btw, how's life and are you still in .edu?
>>
>> Yes, I can send you the argus stuff I have and you can test it. I have it
>> deployed on one production site, and on another testing site now.
>>
>>
>> On Fri, Jul 30, 2010 at 5:19 AM, Paul Schmehl <pschmehl_lists at tx.rr.com>
>> wrote:
>>
>> CS, we are *very* interested in this.  Is your argus to splunk app far
>> enough along to do testing?
>>
>>
>> --On Thursday, July 29, 2010 23:57:39 +0800 CS Lee <geek00l at gmail.com>
>> wrote:
>>
>>
>>
>>
>>
>> hi Carter,
>>
>> I was having the problem as well until I tried to get argus data into
>> splunk; in fact I have almost all the argus fields extracted and sent to
>> splunk. I always put the suser data and duser data as the last fields. My
>> argus data is in CSV form, and this is how I have it done with splunk -
>>
>> In props.conf (the properties config):
>> [argus]
>> sourcetype = argus
>> REPORT-argus = argus-fields, argus-suser-data, argus-duser-data
>>
>> In transforms.conf (the data transformation config):
>> [argus-fields]
>> DELIMS = ","
>> FIELDS = "stime","flags","proto","src_ip","src_port","direction","dst_ip","dst_port","state","duration","pkts","bytes","appbytes","pps","bps","src_pkts","dst_pkts","src_bytes","dst_bytes","src_appbytes","dst_appbytes","src_pps","dst_pps","src_bps","dst_bps"
>>
>> [argus-suser-data]
>> REGEX = ,s\[\d+\]=(?<suser_data>.{0,64}),?
>>
>> [argus-duser-data]
>> REGEX = ,d\[\d+\]=(?<duser_data>.{0,64})
>>
>> I don't expect everyone to get the idea at first glance; however, if you
>> are familiar with splunk or regex this won't be too hard.
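To see what those two REGEX transforms pull out, here is a quick sketch outside splunk. The sample record is made up, PCRE grep stands in for splunk's extractor, and the simplified `[^,]*` assumes no commas inside the user data - which is exactly the tricky case this thread is about:

```shell
# A made-up ra CSV record ending with the suser/duser fields:
line='2010-08-03 08:32:28,e,tcp,10.0.0.1,52100,->,10.0.0.2,80,CON,s[32]=GET / HTTP/1.1,d[18]=HTTP/1.1 200 OK'

# Same idea as the [argus-suser-data] and [argus-duser-data] transforms,
# with \K dropping the ",s[NN]=" / ",d[NN]=" prefix from the match:
suser=$(printf '%s\n' "$line" | grep -oP ',s\[\d+\]=\K[^,]*')
duser=$(printf '%s\n' "$line" | grep -oP ',d\[\d+\]=\K[^,]*')
echo "suser_data=$suser"
echo "duser_data=$duser"
```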
>>
>> I'm not trying to promote splunk here, but since the two can be glued
>> together so well, I just want to be able to perform analysis on every
>> field I can obtain from an argus record, graph them, and further generate
>> reports. On top of that you can still keep the argus records in their own
>> format, to be processed by ra-like tools when you need to do some other
>> post-processing that splunk web doesn't offer.
>>
>> I have the argus app for splunk done and plan to release it soon.
>>
>> Cheers ;)
>>
>>
>>
>> On Thu, Jul 29, 2010 at 11:15 PM, Carter Bullard <carter at qosient.com>
>> wrote:
>>
>>
>> Hey CS Lee,
>> Yes, the user buffers do need some work.  So how do other systems, like
>> csv,
>> deal with delimiters in the output?  Is there a universal escape strategy?
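One common answer here is RFC 4180-style CSV quoting: wrap any field containing the delimiter (or a quote) in double quotes, and double the embedded quotes. A minimal sketch of that rule (simplified: it ignores embedded newlines):

```shell
# RFC 4180-style field quoting for comma-delimited output
quote_field() {
    case $1 in
        *[,'"']*)
            # contains a comma or a quote: double the quotes, then wrap
            printf '"%s"' "$(printf '%s' "$1" | sed 's/"/""/g')" ;;
        *)
            printf '%s' "$1" ;;
    esac
}
```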
>>
>>
>>
>> Good to see you around.
>>
>> Carter
>>
>>
>>
>>
>>
>>
>> On Jul 28, 2010, at 11:23 AM, CS Lee wrote:
>>
>> hi Carter,
>>
>> How's life, think I'm back and will blog more about argus and flow stuffs!
>>
>> Regarding raconvert, the tricky part I see would be converting the user
>> data field that is printed, because I used to have the problem where,
>> when using ',' or another character as the delimiter, I ended up needing
>> additional parsing to get the user data extracted properly from the
>> ASCII flow records.
>>
>> Gentle people,
>> There is a new program in the clients distribution, raconvert(), with a
>> manpage.
>>
>> This program is designed to convert ASCII based argus files to binary
>> argus data records.  The ASCII must have a single-character delimiter,
>> such as a ',', but you can specify the delimiter using the "-c char"
>> option.
>>
>>    ra -r argus.file -c ,  > /tmp/ra.txt
>>    raconvert -r /tmp/ra.txt -w - | ra
>>
>> raconvert() is not complete.  Currently I'm handling maybe 50 out of the
>> 180-something fields that we can print out, but it's time to put it out
>> there, so if you try to use it and some fields don't get converted, send
>> me a sample ASCII file, and I'll add support for your field.
>>
>> The records that we generate may not be complete.  It depends on how much
>> information you provide in the ASCII records.  For instance, if you only
>> have the "StartTime" field, without the "LastTime" field, the resulting
>> binary argus record will have a duration of 0, so you want to ensure that
>> the ASCII output has enough information to convey all that you want.
>>
>> Also, the name suggests that it should be able to do conversion, which may
>> imply that it converts more than just one thing to another, so, ......,
>> if you have any ideas as to what you would like to convert, just holler,
>> and
>> I'll see what I can do.
>>
>> I will try to add XML conversion before the summer is done.
>>
>> So why this program?  The primary reason is to support moving argus data
>> around in environments that don't like binary data.  You convert the
>> records to ASCII, printing as many fields as practical, move the file to
>> the next location, and then convert them back to binary records so you
>> can work with them.  Some high-security places need this type of support.
>> But you could also use it as a means to create an argus data editor, if
>> you wanted.
>>
>> Hope you find this useful,
>>
>> Carter
>>
>> --
>> Best Regards,
>>
>> CS Lee<geek00L[at]gmail.com>
>>
>> http://geek00l.blogspot.com
>> http://defcraft.net
>>
>>
>>
>>
>> --
>>
>>
>>
>> Paul Schmehl, Senior Infosec Analyst
>> As if it wasn't already obvious, my opinions
>> are my own and not those of my employer.
>> *******************************************
>> "It is as useless to argue with those who have
>> renounced the use of reason as to administer
>> medication to the dead." Thomas Jefferson
>>
>
>
>
> --
> Paul Schmehl, Senior Infosec Analyst
> As if it wasn't already obvious, my opinions
> are my own and not those of my employer.
> *******************************************
> "It is as useless to argue with those who have
> renounced the use of reason as to administer
> medication to the dead." Thomas Jefferson
>
>


-- 
Best Regards,

CS Lee<geek00L[at]gmail.com>

http://geek00l.blogspot.com
http://defcraft.net