I am running perl -v:

This is perl, v5.8.6 built for darwin-thread-multi-2level
(with 2 registered patches, see perl -V for more detail)

on Mac OS X 10.4, a dual-CPU machine with 2 GB of RAM and 230 GB free
on the drive. I am getting the following output from a well-tested
program as I scale up a data processing problem:
perl(2013) malloc: *** vm_allocate(size=8421376) failed (error code=3)
perl(2013) malloc: *** error: can't allocate region
perl(2013) malloc: *** set a breakpoint in szone_error to debug
Out of memory!
I know exactly where I can short-circuit inside the major loop to make
the error occur or not, and the error occurs at the same point in my
data stream every time (it is not the data -- a less powerful Linux
machine runs the same job fine). The next line after my early break in
the major loop is another push(@array, $something). About 1 GB of RAM
is used as the program spins up and reads in all external data. VM
then grows to about 1.4-1.5 GB, which does surprise me. I do not have
permanently growing arrays -- each time through the major loop, data
is written to a file and the arrays are cleared. As the process does
its thing, RAM for the process actually diminishes while VM grows.
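
For reference, the shape of the major loop is roughly as follows
(read_next_record, process_record, pass_complete, and write_results
are placeholder names standing in for my actual routines, not the
real code):

    my @array;
    while ( my $record = read_next_record() ) {
        # last;   # <-- my debugging short-circuit goes here
        push @array, process_record($record);  # the push right after the break

        if ( pass_complete($record) ) {
            write_results( \@array );  # flush this pass's results to a file
            @array = ();               # clear the array for the next pass
        }
    }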
With this much machine, I should not have memory problems at this
level. Have I hit a wall with Perl? Do I need to "tune" the machine to
allow much larger memory usage? Any advice on how to pursue this
matter further would be appreciated.
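
One way I could instrument this, assuming the CPAN module Devel::Size
is available (its total_size() walks nested structures and reports the
real footprint of a variable), would be something like the following
sketch; the work inside the loop is a stand-in for my real processing:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Devel::Size qw(total_size);   # CPAN module, not in the 5.8.6 core

    my @array;
    for my $pass ( 1 .. 10 ) {
        # Stand-in for the real per-pass work: grow the array, then clear it.
        push @array, 'x' x 1_000_000 for 1 .. 10;
        printf STDERR "pass %d: array holds %d bytes\n",
            $pass, total_size( \@array );
        @array = ();                  # cleared each pass, as in my program
    }

If the reported size stays flat from pass to pass while the process's
VM keeps growing, that would point at fragmentation or at memory perl
holds for reuse rather than at a leak in my data structures.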