out of memory error on python 2.5.4 XP

  • pyuser91
    New Member
    • Dec 2009
    • 2

    out of memory error on python 2.5.4 XP

    Hello,
    I hope someone can help.
    I am using Python 2.5.4 (under the python(x,y) umbrella) on XP SP3, running a grammatical evolution program that grows equations via nested recursive loops. The program is PonyGE, from its Google Code repository; I tried contacting the authors, to no avail.

    The program works, but after a small number of populations it hits the 'dreaded' memory error once the equations grow too large. If I run it under IPython I get an additional clue: s_push: parser stack overflow.

    An example equation that was halted with this error is:

    Code:
      return (4 * (2 - (4 * ((((2 + (2 + 0)) + 3) * (((4 + 3) - ((((2 * (3 * 1)) + (4 - (((0 - 4) + (((4 * (((((3 + 0) + (0 + 2)) + 3) * (((0 - 3) * 3) + (5 - (4 + (2 + 1))))) - 3)) - 3) * 5)) + 1))) * 4) - 2)) + 4)) + 1))))
    XXXeval_or_exec_outputXXX=sum([f(x) for x in range(10)]); 8264400


    I did some research on other people with this problem, and they said MAXSTACK in parser.c needed to be raised. The problem is I have no parser.c in my Python directory. The other odd thing is that the program halts while using only 15 MB of RAM out of the 1.5 GB on my system (3.3 GHz AMD).
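    For context: parser.c (and its MAXSTACK constant) lives in the CPython source tree, not in an installed Python directory, so raising it means rebuilding the interpreter. A workaround that avoids a rebuild is to reject over-deep candidate expressions before they ever reach eval/compile. Below is a minimal sketch of that idea; the function names, the depth limit, and the assumption that PonyGE individuals are evaluated as parenthesized expression strings are all mine, not from PonyGE itself:

```python
def max_paren_depth(expr):
    """Return the deepest parenthesis nesting level in an expression string."""
    depth = peak = 0
    for ch in expr:
        if ch == '(':
            depth += 1
            peak = max(peak, depth)
        elif ch == ')':
            depth -= 1
    return peak

# Assumed safety margin: old CPython parsers overflowed somewhere in the
# low hundreds of nesting levels, so cap well below that.
MAX_DEPTH = 90

def safe_eval(expr):
    """Evaluate expr, but refuse expressions deep enough to overflow the parser."""
    if max_paren_depth(expr) > MAX_DEPTH:
        raise ValueError("expression too deeply nested for the parser")
    return eval(expr)
```

    In a GE loop you would catch that ValueError and assign the individual a worst-case fitness instead of letting the interpreter die, which has the side benefit of selecting against bloated equations.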

    One other hint: someone else ran it on Linux with no problems. They mentioned it might have to do with how XP allocates memory versus Linux.
    Can anyone help?

    Thanks so much.

    P.S. Sorry, is there a sticky on how to insert code tags? I tried <code></code> and it didn't enclose any code.
    Last edited by bvdet; Dec 19 '09, 02:16 AM. Reason: Add code tags
  • bvdet
    Recognized Expert Specialist
    • Oct 2006
    • 2851

    #2
    Thanks for trying to add code tags. The guidelines for posting questions can be found here.


    • annon desu
      New Member
      • Sep 2010
      • 1

      #3
      I'm having the same problem

      Did you ever figure this out? I get the same problem trying to download a file of about half a GB, but I have plenty of RAM etc. on my machine.

