argusarchive segmentation faults

Peter Van Epp vanepp at sfu.ca
Tue Feb 1 14:01:20 EST 2005


	I've finally gotten tired of trying to keep Perl from running out of
memory when post-processing a whole day's data and have just installed a copy
of MySQL on one of my test machines. It looks to be time to let someone else
deal with running out of memory on big data sets :-) Of course the downside is
that I have to learn SQL and MySQL (but we at least have database folks, and
they run MySQL, so I can ask dumb questions :-)).
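
	Something along these lines is what I have in mind. The database name,
table layout, field list, and the flows.csv input (delimited ra output) are
just guesses for illustration, not a tested setup:

# load a day's worth of delimited ra output into MySQL for sorting/reporting
# (assumes a database named "argus" already exists and LOCAL INFILE is enabled)
mysql argus <<'EOF'
CREATE TABLE IF NOT EXISTS flows (
    stime  DATETIME,
    proto  VARCHAR(8),
    saddr  VARCHAR(15),
    sport  INT,
    daddr  VARCHAR(15),
    dport  INT,
    bytes  BIGINT
);
LOAD DATA LOCAL INFILE 'flows.csv' INTO TABLE flows
    FIELDS TERMINATED BY ',';
-- let the database do the sorting that used to blow up Perl
SELECT saddr, SUM(bytes) AS total
    FROM flows GROUP BY saddr ORDER BY total DESC LIMIT 10;
EOF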

Peter Van Epp / Operations and Technical Support 
Simon Fraser University, Burnaby, B.C. Canada

On Tue, Feb 01, 2005 at 12:39:12PM -0500, John Nagro wrote:
> Peter,
> 
> I appreciate the advice. We generally just have a lot of data to work
> with, i can live without the sort.
> 
> Thanks again!
> 
> -John
> 
> 
> On Tue, 1 Feb 2005 09:35:31 -0800, Peter Van Epp <vanepp at sfu.ca> wrote:
> >         It's not a bug per se; sorting a big file just takes a lot of
> > memory. You can boost system memory and perhaps the per-task memory
> > allocation parameters in the kernel (at least on FreeBSD) so that your task
> > can actually use the additional memory, which will probably make it work
> > depending on how large your files get.
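> > 
> >         For example, on FreeBSD something like the following in
> > /boot/loader.conf raises the per-process data segment limits. The values
> > are only examples, and the tunable names vary by release, so check
> > loader(8) and tuning(7) for your version before relying on them:
> > 
> > # /boot/loader.conf -- example values only
> > kern.maxdsiz="1073741824"   # hard cap on a process's data segment (1 GB)
> > kern.dfldsiz="536870912"    # default data segment size (512 MB)
> > 
> > You may also need to raise the datasize limit in login.conf, or run
> > "ulimit -d unlimited" in the shell that starts the sort.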
> > 
> > Peter Van Epp / Operations and Technical Support
> > Simon Fraser University, Burnaby, B.C. Canada
> > 
> > 
> > On Tue, Feb 01, 2005 at 12:29:43PM -0500, John Nagro wrote:
> > > Peter,
> > >
> > > Thanks for the response. It was in fact rasort; I lost the specific
> > > message, but I remember that part. I will try commenting that out and
> > > see what happens. If it still fails, I will try that debugging and
> > > send more info to the list.
> > >
> > > Is there a particular bug with rasort that could be fixed?
> > >
> > > Thanks a lot!
> > >
> > > -John
> > >
> > >
> > > On Tue, 1 Feb 2005 09:23:10 -0800, Peter Van Epp <vanepp at sfu.ca> wrote:
> > > >         Replacing #!/bin/sh with #!/bin/sh -x -v as the first line will
> > > > turn on debugging and tell you where the seg fault occurs (likely one of
> > > > the programs being called running out of memory). If it is generating
> > > > core files, the core should tell you which program is getting the seg
> > > > fault. rasort is always a good bet; you may wish to comment out the
> > > > RASORT line below if your files are large (and/or feed one of the files
> > > > that segfaults into rasort and see if it seg faults on the command
> > > > line).
> > > >
> > > > if [ -f $ARCHIVE.tmp ]; then
> > > > #  $RAGATOR -VRr $ARCHIVE.tmp -w - | $RASORT -w $ARCHIVE
> > > >    $RASORT -r $ARCHIVE.tmp -w $ARCHIVE
> > > > fi
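> > > >
> > > > If you do get a core file, something like this will usually name the
> > > > culprit (the binary path is a guess; use wherever your install put
> > > > rasort):
> > > >
> > > >    file rasort.core                       # reports which binary dumped core
> > > >    gdb /usr/local/bin/rasort rasort.core  # then "bt" for a backtrace
> > > >
> > > > If it is memory related, the backtrace will often end at code
> > > > dereferencing a pointer from a failed allocation.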
> > > >
> > > > Peter Van Epp / Operations and Technical Support
> > > > Simon Fraser University, Burnaby, B.C. Canada
> > > >
> > > > On Tue, Feb 01, 2005 at 12:12:31PM -0500, John Nagro wrote:
> > > > > I have been getting seg faults when using the argusarchive script
> > > > > recently. Any ideas?
> > > > >
> > > > > --
> > > > > John Nagro
> > > > > john.nagro at gmail.com
> > > >
> > >
> > >
> > > --
> > > John Nagro
> > > john.nagro at gmail.com
> > 
> 
> 
> -- 
> John Nagro
> john.nagro at gmail.com


