Host accounting?

Neil Long neil.long at computing-services.oxford.ac.uk
Fri Feb 4 11:15:42 EST 2000


Hello

We have been trying to identify heavy-usage hosts (well, students plus MP3
and/or video/audio, etc.). The argus data logs are very useful, but
extraction is tedious.
We average about 15M per hour (excluding ports 80, 8080 and 25), and I use
simple shell scripts to extract the net block or blocks which identify
particular units over a 24-hour period.
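
Roughly this sort of thing, with placeholder names and addresses (the exact
ra options and filter syntax here are from memory, so check the man page
before copying it):

    # pull one unit's net block out of a day's log into a smaller file;
    # assumes ra takes -r (read file), -w (write argus records) and a
    # trailing tcpdump-style filter after the "-" option terminator
    ra -r argus.log.yesterday -w unit-a.arg - net 163.1.2 \
        and not \( port 80 or port 8080 or port 25 \)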

I then either use radjacency to look for heavy usage by port (always nice to
find a favourite port!) or run through the net blocks, looping over each IP
with racount into a summary file.
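
The loop is nothing clever - something like the following, with placeholder
names again, and assuming racount takes -r and a trailing filter expression
like the other ra clients:

    #!/bin/sh
    # one racount pass per address -- this is the slow part, since the
    # (already reduced) data file is re-read for every host in the block
    i=1
    while [ $i -le 254 ]
    do
        echo "== 163.1.2.$i ==" >> summary.txt
        racount -r unit-a.arg - host 163.1.2.$i >> summary.txt
        i=`expr $i + 1`
    done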

The latter is the slowest, as the (albeit reduced) data file is read n*254
times. There wasn't anything obvious in the contrib directory, but I wondered
if someone has already invented this wheel and has code to generate bytes
per host - still a lot of work to identify who, what and where (let alone
why).

The problem I have with ra -c is that there is a variable number of data
columns depending on the various flags such as s, d, *, M. I suppose I could
suppress them with a mod to ra and then pump the data through perl, but I
would have thought this would be a common task.
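
What I have in mind is roughly the sketch below - a single pass over ra -c
output, summing bytes per source host in perl. The column positions are
guesses (that is exactly the variable-column problem above), so they would
need adjusting to whatever your ra options actually print:

    #!/usr/bin/perl
    # bytes-per-host.pl - rough sketch, not a finished tool.
    # Reads ra output on stdin and sums src+dst bytes per source host.
    use strict;

    # column guesses: source address in field 2, byte counts in the last
    # two fields -- adjust these to match your ra output format
    my ($SRC_COL, $SBYTES_COL, $DBYTES_COL) = (2, -2, -1);
    my %bytes;

    while (<>) {
        next if /^\s*$/;                 # skip blank lines
        my @f = split;
        my $host = $f[$SRC_COL];
        next unless defined $host;
        # $host =~ s/\.\d+$//;           # uncomment if ra appends the port
        next unless defined $f[$SBYTES_COL] && $f[$SBYTES_COL] =~ /^\d+$/
                 && defined $f[$DBYTES_COL] && $f[$DBYTES_COL] =~ /^\d+$/;
        $bytes{$host} += $f[$SBYTES_COL] + $f[$DBYTES_COL];
    }

    # heaviest hosts first
    foreach my $h (sort { $bytes{$b} <=> $bytes{$a} } keys %bytes) {
        printf "%-16s %12d\n", $h, $bytes{$h};
    }

Called as something like

    ra -n -c -r unit-a.arg | perl bytes-per-host.pl > per-host.txt

which would at least avoid the n*254 re-reads.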

Even so, racount is very useful at the end for convincing the support people
that Mr X downloaded 2G of whatever yesterday, and would they please have a
word with him as to its academic nature.... ;-)

PS: does anyone know why port 115 might be so interesting to the port-scanner
fraternity these days? Is SFTP ever used, or has SFTP become Secure FTP
rather than Simple?

regards
Neil
