max_execution_time and fork

This topic is closed.
  • sandy

    max_execution_time and fork

    I use recursive readdir as the engine for
    numerous file processing routines, including
    generating thumbnails using getimagesize,
    imagecreatetruecolor, imagejpeg, etc.,
    where each resize operation involves instantiating
    a new resizer class instance.

    .....but my (shared host) website is large, so I
    run past max_execution_time if I try to make
    all thumbnails at once, from the top of
    my document_root. My virtual file system
    includes a php.ini which I can edit, but
    bumping max_execution_time (although the file saves)
    seems to have no effect, and I do not have
    permission to run /usr/sbin/apachectl restart.

    Perhaps the php.ini change
    will start to work the next time the server reboots.

    In the meantime, is there some way to fork a new process
    to instantiate the resizer?

    I could try to use exec to run command-line
    PHP, but there must be a more elegant way
    to do this. I'd use Perl and ImageMagick,
    but my shared host server doesn't have the
    right Perl libs installed, and that's another
    can of worms.
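
    (One thing worth checking first, as a sketch under the assumption the host
    permits it: PHP can sometimes raise the limit per request from inside the
    script, with no php.ini edit or Apache restart needed. Many shared hosts
    disable this, so the helper below reports whether the new value actually
    took effect. The function name is illustrative.)

    ```php
    <?php
    // Attempt to lift the execution limit for this request only.
    // Shared hosts often disable set_time_limit()/ini_set(), so the
    // return value reports whether the new limit actually took effect.
    function tryRaiseTimeLimit(int $seconds): bool
    {
        @set_time_limit($seconds);                      // also resets the running timer
        @ini_set('max_execution_time', (string) $seconds);
        return (int) ini_get('max_execution_time') === $seconds;
    }
    ```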
  • randyaa@gmail.com

    #2
    Re: max_execution_time and fork

    Why not just generate the thumbnails for the current directory, then
    provide navigation links to the subdirectories? I just started on
    a photo album system that works in exactly that way:
    http://code.google.com/p/zippyphotos/.
    Check it out; maybe we can work out some requirements together.
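
    (A minimal sketch of that per-directory approach; the filename filter and
    function name are illustrative, not taken from zippyphotos. Each request
    only handles one directory's worth of work, so it stays well under the
    execution limit:)

    ```php
    <?php
    // Handle one directory per request: collect the images to thumbnail
    // here and the subdirectories to offer as navigation links.
    function listDirectory(string $dir): array
    {
        $images  = [];
        $subdirs = [];
        foreach (scandir($dir) as $entry) {
            if ($entry === '.' || $entry === '..') {
                continue;
            }
            if (is_dir("$dir/$entry")) {
                $subdirs[] = $entry;                          // becomes a nav link
            } elseif (preg_match('/\.(jpe?g|png|gif)$/i', $entry)) {
                $images[] = $entry;                           // thumbnail these now
            }
        }
        return ['images' => $images, 'subdirs' => $subdirs];
    }
    ```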

    On Apr 28, 12:00 am, sandy <relativ...@isrelative.com> wrote:
    > [original post quoted in full; snipped]


    • shimmyshack

      #3
      Re: max_execution_time and fork

      On Apr 28, 5:00 am, sandy <relativ...@isrelative.com> wrote:
      > [original post quoted in full; snipped]
      You could obtain a list of the jobs (or folders) to be processed, put
      it in a session, then use an iframe within a main page.
      The main page loads the iframe and a JavaScript function which can
      refresh the iframe.
      When the iframe finishes, it calls processjob.php, which completes job1.
      Job1 is removed from the array.
      The PHP script outputs <script>parent.refreshIframe()</script> to the
      iframe.
      The iframe reloads and starts the next job.

      This is of course a single-thread approach; if you want to emulate
      more threads, use more iframes, where the "parent iframe refresher" now
      takes the argument iframename.
      Don't use too many or you will probably hit other hard limits on
      average CPU usage within a certain time, or just RAM.
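
      (A sketch of what the processjob.php side of that could look like,
      reduced to a testable helper. The job format and the refreshIframe
      name are assumptions taken from the description above; in
      processjob.php you would call this with $_SESSION['jobs'] and echo
      the result:)

      ```php
      <?php
      // Complete one job per request, then emit markup that either tells
      // the parent page to reload the iframe (more jobs left) or stops.
      function nextJobOutput(array &$jobs, callable $process): string
      {
          if (count($jobs) === 0) {
              return 'All jobs done.';
          }
          $process(array_shift($jobs));   // run job1 and remove it from the array
          return count($jobs) > 0
              ? '<script>parent.refreshIframe()</script>'  // iframe reloads, next job starts
              : 'All jobs done.';
      }
      ```

      Each request stays short, so max_execution_time is never hit; the
      reload gives every job a fresh timer.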


      • sandy

        #4
        Re: max_execution_time and fork

        shimmyshack wrote:
        > [iframe job-queue suggestion quoted; snipped]
        >
        I like that idea. I'll try it.
        The best idea would be to get a co-located machine,
        or a virtual one, so I control the server. In the
        meantime that sounds like a workable hack. Thank you.


        • shimmyshack

          #5
          Re: max_execution_time and fork

          On Apr 28, 2:00 pm, sandy <relativ...@isrelative.com> wrote:
          > [quoted iframe suggestion snipped]
          > I like that idea. I'll try it.
          > The best idea would be to get a co-located machine,
          > or a virtual one, so I control the server. In the
          > meantime that sounds like a workable hack. Thank you.
          I like circumventing the restrictions placed on a shared host; it's
          cheaper!
          Although the above approach uses jobs of unit size "folder",
          you could have a more fine-grained approach where you list every job
          using a database. The iframes would be included in each webpage
          provided there are jobs to do; the visitors then start a job if there
          are any. Provided you have low-volume usage and record the last job
          start time in the db, this would be a great "cron" emulator, which
          might be something else your shared service doesn't allow access to.
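
          (The timing check for that "cron" emulator might look like the
          sketch below; the names are invented for illustration. Writing the
          new start time back to the db *before* running the job keeps two
          near-simultaneous visitors from both starting it:)

          ```php
          <?php
          // Decide whether a visitor-triggered page load should kick off
          // the background job, based on the last start time in the db.
          function jobIsDue(int $lastStart, int $now, int $intervalSeconds): bool
          {
              return ($now - $lastStart) >= $intervalSeconds;
          }
          ```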


          • sandy

            #6
            Re: max_execution_time and fork

            shimmyshack wrote:
            > the iframes would be included in each webpage
            > provided there are jobs to do; the visitors then start a job if there
            > are any. Provided you have low-volume usage and record the last job
            > start time in the db, this would be a great "cron" emulator, which
            > might be something else your shared service doesn't allow access to.
            Not sure about this. I use a home-rolled CMS that reads a
            source directory structure, looking for images, image captions,
            HTML fragments, link references, etc., and then initializes
            a schema. Then I write out the whole website as static HTML.
            That way I can manage hundreds of sources and pages.

            Pages that *have* to be dynamic remain that way. But most
            of the site ends up as static HTML.

            .....so this is admin functionality for me only, not something
            I want to let users invoke in any way.
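
            (For reference, the recursive readdir walk described here can
            also be written with the SPL iterators, which avoids hand-rolled
            recursion; the extension filter is an assumption:)

            ```php
            <?php
            // Walk a source tree and return every image path, the kind of
            // pass a recursive-readdir engine makes before generating
            // thumbnails or static pages.
            function findImages(string $root): array
            {
                $found = [];
                $iter = new RecursiveIteratorIterator(
                    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
                );
                foreach ($iter as $file) {
                    if (preg_match('/\.(jpe?g|png|gif)$/i', $file->getFilename())) {
                        $found[] = $file->getPathname();
                    }
                }
                sort($found);    // deterministic order across filesystems
                return $found;
            }
            ```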
