argus memory usage
Peter Van Epp
vanepp at sfu.ca
Sun Aug 19 13:54:14 EDT 2007
On Sun, Aug 19, 2007 at 11:39:16AM -0400, Carter Bullard wrote:
> Hey Peter,
> I still think you may be starving your ArgusOutput thread, which of course
> would cause the problem where you don't get any records, and no file
> recreation on moving the file.
>
> So the memory strategy is: "the ArgusModel allocates buffers that are to be
> written out of the argus (flow status records) and passes them to the ArgusOutput
> processor, who writes them to all the clients that want a copy, and then
> the ArgusOutput process deletes them". So two threads have to share
> a message queue, with one allocating and one deleting.
>
> For efficiency, we are keeping a pool of buffers, so that argus can respond
> to quick surges in flows, and so you won't necessarily see the ArgusOutput
> delete the records until later. Its holding onto them is not bad, but 500K
> records is probably too much.
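The pooled-buffer scheme Carter describes can be sketched roughly as below. This is a minimal, hypothetical C sketch: the structure names, the POOL_MAX cap, and the locking are illustrative assumptions, not the actual argus code.

```c
#include <pthread.h>
#include <stdlib.h>
#include <assert.h>

/* Sketch: the modeler thread takes records from a free pool
   (allocating when the pool is empty) and the output thread returns
   them after the write, so the pool absorbs surges without a
   malloc/free per record.  Names are illustrative only. */

struct record {
    struct record *next;   /* free-list link */
    char payload[256];     /* flow status data would live here */
};

static struct record *free_pool = NULL;
static int pool_len = 0;
static pthread_mutex_t pool_lock = PTHREAD_MUTEX_INITIALIZER;

#define POOL_MAX 1024      /* cap so the pool cannot grow without bound */

/* Modeler side: reuse a pooled record or allocate a fresh one. */
struct record *record_alloc(void)
{
    struct record *r;
    pthread_mutex_lock(&pool_lock);
    r = free_pool;
    if (r) {
        free_pool = r->next;
        pool_len--;
    }
    pthread_mutex_unlock(&pool_lock);
    return r ? r : malloc(sizeof(*r));
}

/* Output side: return the record to the pool, or really free it
   once the pool is already at its cap. */
void record_release(struct record *r)
{
    pthread_mutex_lock(&pool_lock);
    if (pool_len < POOL_MAX) {
        r->next = free_pool;
        free_pool = r;
        pool_len++;
        r = NULL;
    }
    pthread_mutex_unlock(&pool_lock);
    free(r);               /* no-op when the record was pooled */
}
```

The point of the cap is exactly the problem being chased here: without one (or with a buggy pool manager), records that the output side returns but the pool never reuses look like a steady leak.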
>
> What I'm going to do tonight is remove the .threads tag in the distribution,
> and then change the buffer pool manager to be a little dumber, to see if
> that doesn't help.
>
> Could you run an argus without any clients, to see whether the bug is in the
> ArgusModeler or the ArgusOutputProcessor? If it is running fine, then
> run with only one client, and then with two (remember this is how we
> got here with the memory problems :o)
>
> That would help a great deal.
>
> Carter
>
OK, I ran an argus all night (no threads and a single client). It
ended up eating all the memory as usual. I've just killed and restarted argus
with no clients listening and we will see what happens (I have to go out for
a few hours anyway so it will sit and cook :-)). It may be that we can get
argus debug to tell us where it is allocating the memory, which may be useful.
This is part of the printout of the memory pool, where the number is
incremented on each alloc or free. Note how many have a count of 1, which
should be candidates for not being freed (but may also just be long-lasting
structures).
0x16aa8ae0 1
0x12733430 1
0x114ae6b0 7
0x133abca0 1
0x16224110 1
0x10e4db40 3
0x124f2200 1
0x11b46c30 5
0x15d7f030 1
0x110669f0 5
0x111d65e0 7
0x14793290 1
0x1191ec60 3
0x1458aac0 1
0x1450e860 1
0x112c5140 5
0x15afeda0 1
0x15e0a650 1
0x129b0d80 1
0x136f9700 1
0x139b0b00 3
0x1688a830 1
0x153e1ec0 1
0x116b24d0 1
0x1100d0b0 3
0x140e88c0 1
0x16969b20 1
0x12fbc660 1
0x12dabd30 3
0x10dbb350 1
0x1165ed70 2
0x15b1cb90 1
0x16247120 1
0x1577d260 3
0x11d33f20 3
0x14751bf0 2
0x163088c0 1
0x11e80da0 1
0x1061ee00 3
0x131aff50 1
0x10542480 3
0x107c1ed0 7
0x118b0080 5
0x146a3f80 1
0x1666fc10 1
0x14e9cfd0 1
0x119ba0f0 2
0x16a818a0 1
0x13556180 1
0x13da0050 5
0x11c05c70 1
0x14962640 3
0x15ef9c30 1
0x16688240 1
0x11d35750 1
0x1634edc0 1
0x13b8f7e0 1
0x11cf27c0 1
0x12278630 1
0x16514180 1
0x124d0b70 1
0x150d3a30 2
0x11426570 1
0x119397e0 1
0x14e19ae0 1
0x10f5da60 1
0x138b0060 1
0x163a9d80 1
0x10ff0400 2
0x113ae360 27
0x1604cde0 1
0x14ef4090 3
0x117f67d0 4
0x16394660 1
0x12059fc0 3
0x12e77730 3
0x13242f20 1
0x13332ef0 1
0x138bbd30 3
0x11a73400 4
0x12411170 1
0x14b84280 1
0x15557990 1
0x15445580 1
0x15db6bc0 1
0x126fc020 1
0x1420e4a0 1
0x163ae5e0 3
0x11ecea10 1
0x11a6a820 5
0x12e7f860 1
0x11596a60 1
0x11c2afc0 1
0x14a05bd0 1
0x143621f0 1
0x16024fa0 1
0x14dfedb0 1
0x14eeeb70 1
0x16696a90 1
0x12a114a0 1
0x1537d840 1
0x1323b720 5
0x15358a90 2
0x1619a5c0 1
0x112b7ac0 1
0x15cad7b0 1
0x1063aee0 1
0x142b05c0 3
0x13e43a90 1
0x11f49dd0 1
0x10b1d240 2
0x15b362d0 1
0x147a78a0 1
0x1637b190 3
0x1247d470 1
0x15200330 1
0x11a06d70 7
0x167ed2a0 2
0x12eb8740 1
0x11c81d80 1
0x12844580 1
0x1274baf0 1
0x10626510 5
0x12e577b0 1
0x13044920 1
0x137140d0 1
0x12b22890 1
0x13700ba0 3
0x1390ec20 1
0x11588fc0 5
0x13d2d540 1
0x157b1940 1
0x14555720 1
0x1114c700 1
0x152ee980 1
0x152716a0 1
0x10785cc0 5
0x14882840 1
0x10d6dc80 3
0x1159bb00 2
0x114ef2d0 5
0x15c60ef0 1
0x168fe9f0 1
0x15790010 3
0x15d33130 1
0x116d5ae0 5
0x10b716f0 9
0x14b8cb00 1
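The parity trick behind the dump above (one counter bumped on both alloc and free of an address, so an odd final count flags an address that was allocated but never freed) can be sketched as follows. The table layout and function names here are assumptions for illustration, not the actual argus debug code.

```c
#include <stdio.h>
#include <stdint.h>
#include <assert.h>

/* Toy per-address event counter: call note_event() on every alloc
   AND every free.  An even final count means allocs were matched by
   frees; an odd count marks a candidate leak (or a long-lived
   structure that simply hasn't been freed yet). */

#define NSLOTS 4096

struct slot { uintptr_t addr; unsigned count; };
static struct slot table[NSLOTS];

static struct slot *find(uintptr_t addr)
{
    size_t i = (addr >> 4) % NSLOTS;      /* drop alignment bits */
    while (table[i].addr && table[i].addr != addr)
        i = (i + 1) % NSLOTS;             /* linear probing */
    table[i].addr = addr;
    return &table[i];
}

void note_event(void *p)                  /* call on alloc and on free */
{
    find((uintptr_t)p)->count++;
}

/* Print addresses with odd counts, in the same "address count"
   format as the dump above. */
void report_candidates(void)
{
    for (size_t i = 0; i < NSLOTS; i++)
        if (table[i].addr && (table[i].count & 1))
            printf("0x%lx %u\n",
                   (unsigned long)table[i].addr, table[i].count);
}
```

Note the caveat from the dump: a count of 1 only means "allocated once and not yet freed", so long-lived structures show up alongside genuine leaks, and a count like 27 means the same address was recycled through many alloc/free cycles with one allocation still outstanding.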
Peter Van Epp / Operations and Technical Support
Simon Fraser University, Burnaby, B.C. Canada