Long CLI-process consumes too much memory

  • fnatter@gmx.net


    hi,

    I am using an adapted (CLI) version of the MediaWiki wikitext->HTML
    parser on several thousand articles. For performance reasons we don't
    want to start the PHP interpreter for each article, so we loop over
    the articles in the PHP script that calls the parser.

    The problem is that some memory is never freed: for every ~100
    articles, memory usage grows by about 10 MB. We recreate the Parser
    object in each iteration to avoid accumulating article data, but
    maybe there are still references to some or all of it. Is there a way
    to make sure that no references to $parser (or parts of it) remain,
    so that it really gets freed?
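
    One common cause is reference cycles (e.g. parser-internal objects
    pointing back at the parser), which plain refcounting never frees.
    Below is a minimal, self-contained sketch of the loop with an
    explicit unset() plus gc_collect_cycles() (available since PHP 5.3).
    StubParser and its parse() method are placeholders standing in for
    the adapted MediaWiki parser, not its real API:

    <?php
    // StubParser stands in for the adapted MediaWiki Parser; parse()
    // is a placeholder that keeps some per-article state, as the real
    // parser does.
    class StubParser {
        private $articleData;
        public function parse(string $wikitext): string {
            $this->articleData = $wikitext;   // simulated per-article state
            return '<p>' . htmlspecialchars($wikitext) . '</p>';
        }
    }

    gc_enable();                              // cycle collector, PHP >= 5.3

    $articles = ['first article', 'second article'];
    foreach ($articles as $wikitext) {
        $parser = new StubParser();           // fresh parser per article
        $html   = $parser->parse($wikitext);
        // ... write $html to disk here ...
        unset($parser, $html);                // drop our own references
        gc_collect_cycles();                  // free reference cycles that
                                              // refcounting alone cannot
    }

    Note that on PHP 5.2 and earlier there is no cycle collector at all,
    so cyclic structures leak until the process exits; on 5.3+ the
    collector also runs automatically, but forcing it per iteration keeps
    the high-water mark down in a tight loop like this.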

    Can I recursively free the memory of an object, including all of its
    subobjects?

    Finally, is there a way to debug memory usage (broken down by object)?
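
    PHP itself has no built-in per-object breakdown, but you can at least
    track the trend per iteration with memory_get_usage() and
    memory_get_peak_usage(); for a function-level breakdown, an external
    tool such as Xdebug's function traces (which can record memory
    deltas) is one option. A small self-contained sketch of the logging,
    with arbitrary illustrative numbers:

    <?php
    // Log memory use every 100 iterations; str_repeat() simulates the
    // per-article work of the real parser loop.
    $buf = [];
    for ($i = 0; $i < 500; $i++) {
        $buf[] = str_repeat('x', 1024);       // stand-in for parsing work
        if ($i % 100 === 0) {
            fprintf(STDERR, "iter %d: %.2f MB used, %.2f MB peak\n",
                $i,
                memory_get_usage(true) / 1048576,
                memory_get_peak_usage(true) / 1048576);
        }
    }

    If the "used" figure climbs steadily even though you unset everything
    per iteration, something is still holding references (often a static
    property, a global, or a cycle).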

    thanks!

    --
    Felix Natter
