Argus 2.0 wishes

Peter Van Epp vanepp at sfu.ca
Thu Mar 9 11:44:57 EST 2000


	While I don't think ra will currently do this, feeding the output of
ra into a perl script will (with a relatively trivial amount of new perl).
	Open two output files, one for each stream, and code the network
expression into a perl if statement of the form

if (expression) {
	print FLOOD "$_\n";    # or a subset of the fields as you choose 
} else {
	print NONFLOOD "$_\n"; 
}

	As a bonus, here is a copy of my latest parsing script (although not
yet ready for submission to the contrib directory, which I hope to get around
to at some point in the future :-))), which makes the above easier:

First, a trivial example script which shows the use of parse_argus_log()
(the is_our_net() function, defined in parse_argus_log.pl below, checks whether
the destination address falls in one of our address ranges, to allow picking
out probes directed at our net). In your case you would put the if statement
above in place of the

	if (&is_our_net($dest_net)) {

statement (and add two open statements for the output files right after the
open(STDIN)); a sketch of that modification follows the example script.

#!/usr/local/bin/perl

require "parse_argus_log.pl";

open(STDIN,$ARGV[0]) || die "Can't open $ARGV[0]: $!\n"
	if $ARGV[0];
$line = 0;
while (<STDIN>) {
	$line ++;
	if (($line % 10000) == 0) {
		print STDERR "Processing $line\n";
	}
	chop;
	&parse_argus_log("$_");
	if (&is_our_net($dest_net)) {
		print "$_\n";
	}
}

# Helper subs (not used in this trivial example).
sub numerically {$b <=> $a;}
sub commas {
        local($_) = @_;
        1 while s/(.*\d)(\d\d\d)/$1,$2/;
        $_;
}                             
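
	For the flood case in the message quoted below, a sketch of what that
modification might look like follows (the flooded host address 10.1.2.3 and
the output file names flood.out and normal.out are only placeholders; the
test in the if statement can be any expression on the parsed variables):

#!/usr/local/bin/perl

require "parse_argus_log.pl";

# Placeholder address for the flooded host; substitute the real one.
$flood_host = "10.1.2.3";

open(STDIN,$ARGV[0]) || die "Can't open $ARGV[0]: $!\n"
	if $ARGV[0];
# The two output files, one per stream.
open(FLOOD,">flood.out") || die "Can't open flood.out: $!\n";
open(NONFLOOD,">normal.out") || die "Can't open normal.out: $!\n";
while (<STDIN>) {
	chop;
	&parse_argus_log("$_");
	# Records involving the flooded host go to one file, everything
	# else goes to the other.
	if (($source_ip eq $flood_host) || ($dest_ip eq $flood_host)) {
		print FLOOD "$_\n";
	} else {
		print NONFLOOD "$_\n";
	}
}
close(FLOOD);
close(NONFLOOD);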

	Here is parse_argus_log.pl. It needs to be in the include path
(putting it in the same directory as the script is probably easiest); it
parses the ra output into perl variables that you can select on. The block of
variables starting with $date is public and can be accessed from the main
script above for testing whether the line is of interest. In your case
$source_ip, $dest_ip, and possibly $src_port and $dst_port are probably what
you want to trip your if statement on (a short example of such a test follows
the file below).

--- cut here filename: parse_argus_log.pl ----
sub main'parse_argus_log {

	# Parse an argus log file line into the variables listed below to
	# make it easy to scan log files from a perl script.
	
	local ($_) = @_;
	local ($rest, $t, $p, $a,$b,$c,$d,$e,$f,$g,$h);

	$date = ""; 
	$flag = "";
	$mid_flag = "";
	$end_flag = "";
	$type = "";
	$source_ip = "";
	$dest_ip = "";
	$src_pkt = "";
	$dest_pkt = "";
	$src_bytes = "";
	$dest_bytes = "";
	$source_net ="";
	$dest_net ="";
	$src_port = "";
	$dst_port = "";

	($date, $flag, $rest) = unpack("A18 A5 A200",$_);
	if ($start_time eq "") {
		$start_time = $date;
	}
	($type, $rest) = split(' ',$rest,2);
	if ($type eq "man") {
		$mid_flag = ' ';
	 	($source_ip, $dest_ip, $src_pkt, $dest_pkt, $src_bytes, 
		 $dest_bytes, $end_flag) = split(' ',$rest,7); 
	} elsif ($type eq "icmp") {
	 	($source_ip, $mid_flag, $dest_ip, $src_pkt, $dest_pkt, 
		 $end_flag) = split(' ',$rest,6); 
		if ($end_flag =~ /port/) {
			($t, $p, $dst_port, $rest) = split(' ',$end_flag);
		}
		($a,$b,$c,$d)= split(/\./,$source_ip);
		($e,$f,$g,$h)= split(/\./,$dest_ip);
	} else {
	 	($source_ip, $mid_flag, $dest_ip, $src_pkt, $dest_pkt, 
		 $src_bytes, $dest_bytes, $end_flag) = split(' ',$rest,8); 
		($a,$b,$c,$d,$src_port)= split(/\./,$source_ip);
		($e,$f,$g,$h,$dst_port)= split(/\./,$dest_ip);
	}
	$source_ip = "$a.$b.$c.$d";
	$source_net = "$a.$b.$c";
	$dest_ip = "$e.$f.$g.$h";
	$dest_net = "$e.$f.$g";
}

	# Return true if the argument is one of SFU's networks
sub main'is_our_net {

	local ($_) = @_;

	if (($_ =~ /^142\.58\./) || 
	    ($_ =~ /^199\.60\.[1-9]\.|^199\.60\.1[0-8]\./) ||
	    ($_ =~ /^192\.75\.24[0-7]\./) ||
	    ($_ =~ /^204\.239\.18\./) ||
	    ($_ =~ /^206\.12\.128\./) ||
	    ($_ =~ /^209\.87\.31\./)) {
		return (1);
	} else {
		return (0);
	}
}
1;
---- cut here ---
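
	For example (the address and port below are placeholders, not anything
from the log), a test on those public variables that picks out traffic aimed
at a single service on a single host could be as simple as:

	# select records whose destination is port 25 on host 10.1.2.3
	if (($dest_ip eq "10.1.2.3") && ($dst_port == 25)) {
		print FLOOD "$_\n";
	}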

Peter Van Epp / Operations and Technical Support 
Simon Fraser University, Burnaby, B.C. Canada


> 
> hello
> 
> We had a host targeted last night for a flood - one suggestion for the wish
> list arose.
> 
> Using ra to reduce the data I can process (-w)
> 
> a) for a given host or net or expression
> or
> b) not for the given host or net or expression
> 
> So splitting the large data set up into 2 new files (normal and flood) needs
> 2 passes.
> 
> It would be very useful to feed the result of the filter into one file and
> the rest to a second file, thus making it a one pass operation.
> 
> Now you can tell me it is already there .......
> 
> I can see that choosing a letter for the operation is getting difficult (how
> about -W?)
> 
> Cheers
> Neil
> 
> 


