Perl - Memory Issues

  • James B.

    Perl - Memory Issues

    I have written a Perl script to parse some large flat-text logs. The
    logs are bzipped and come in through a pipe to STDIN. The script then
    applies some regular expressions to the incoming data, then prints the
    results to STDOUT.

    The script works great, but the issue I have is that the script uses
    as much memory as the data coming into it. Therefore if I pipe an 800MB
    file into it, the memory usage grows and grows until it reaches
    approximately 800MB, and it doesn't appear to release any of the memory
    until it is completely finished. Is there a way to have Perl not use
    so much of the system memory? Here is a sample of what I am doing.



    #!/usr/bin/perl -w

    use strict;

    if ($ARGV[0]) {
        open STDIN, "bzcat $ARGV[0]|" or die "Can't uncompress file as a pipe\n$!\n";
    }

    foreach (<STDIN>) {
        chomp;
        if ($_ =~ /(somedata)/) {
            print "$1\n";
        }
    }
  • Jürgen Exner

    #2
    Re: Perl - Memory Issues

    James B. wrote:
    > The script works great, but the issue I have is that the script uses
    > as much memory as the data coming into it. Therefore if I pipe an 800MB
    > file into it, the memory usage grows and grows until it reaches
    > approximately 800MB, and it doesn't appear to release any of the memory
    > until it is completely finished. Is there a way to have Perl not use
    > so much of the system memory? Here is a sample of what I am doing.
    [...]
    > foreach (<STDIN>) {

    Well, here you are reading the whole file into a list of all lines from
    STDIN first, and only then does foreach loop through them.
    Why don't you use the more typical
    while (<STDIN>)
    which will process the input line by line?

    jue



    • Shawn Zabel

      #3
      Re: Perl - Memory Issues

      "James B." <james.q.berry1@jsc.nasa.gov> wrote in message
      news:75f179b3.0403091015.31505800@posting.google.com...
      > I have written a Perl script to parse some large flat-text logs. The
      > logs are bzipped and come in through a pipe to STDIN. The script then
      > applies some regular expressions to the incoming data, then prints the
      > results to STDOUT.
      >
      > The script works great, but the issue I have is that the script uses
      > as much memory as the data coming into it. Therefore if I pipe an 800MB
      > file into it, the memory usage grows and grows until it reaches
      > approximately 800MB, and it doesn't appear to release any of the memory
      > until it is completely finished. Is there a way to have Perl not use
      > so much of the system memory? Here is a sample of what I am doing.
      >
      > #!/usr/bin/perl -w
      >
      > use strict;
      >
      > if ($ARGV[0]) {
      >     open STDIN, "bzcat $ARGV[0]|" or die "Can't uncompress file as a pipe\n$!\n";
      > }
      >
      > foreach (<STDIN>) {
      >     chomp;
      >     if ($_ =~ /(somedata)/) {
      >         print "$1\n";
      >     }
      > }


      Change

      foreach (<STDIN>) {

      to

      while (<STDIN>) {

      The foreach construct reads every line into a list first, then loops
      through them one at a time.
      The while construct reads one line at a time and executes the loop body
      for it.
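
      Putting it together, the whole script with that one-word fix would look
      something like this (a sketch, keeping the same bzcat pipe and the
      placeholder /(somedata)/ pattern from the original post):

      ```perl
      #!/usr/bin/perl -w

      use strict;

      # If a filename was given, reopen STDIN as a pipe from bzcat,
      # exactly as in the original script.
      if ($ARGV[0]) {
          open STDIN, "bzcat $ARGV[0]|" or die "Can't uncompress file as a pipe\n$!\n";
      }

      # In scalar (boolean) context, <STDIN> returns one line per
      # iteration, so memory use stays flat regardless of input size.
      while (<STDIN>) {
          chomp;
          if (/(somedata)/) {
              print "$1\n";
          }
      }
      ```

      The difference is that foreach evaluates <STDIN> in list context
      (slurping every line into memory at once), while the while condition
      evaluates it in scalar context (one line per loop pass).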

      --
      Shawn




      • James B.

        #4
        Re: Perl - Memory Issues

        Wow, what a simple solution. Works great, thanks.

        James
