can't read large files - help

This topic is closed.
  • comp.lang.php

    can't read large files - help

    [PHP]
    if (!function_exists('bigfile')) {
        /**
         * Works like file() in PHP except that it will work more efficiently
         * with very large files
         *
         * @access public
         * @param mixed $fullFilePath
         * @return array $lineArray
         * @see actual_path
         */
        function bigfile($fullFilePath) {
            // MULTIPLY YOUR MEMORY TEMPORARILY (10X LARGER SHOULD
            // *DEFINITELY* BE ENOUGH UNTIL END!)
            @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
            $lineArray = array();
            $fileID = @fopen(actual_path($fullFilePath), 'r');
            while (!@feof($fileID)) {
                $buffer = @fgets($fileID, 4096);
                $lineArray[] = $buffer;
            }
            @fclose($fileID);
            return $lineArray;
        }
    }
    [/PHP]

    I even temporarily increase memory (I know, bad idea, but it's all I can
    think of to do). However, requirements stipulate that files smaller than
    the (arbitrarily set) maximum file size be sent as email attachments
    (assuming the SMTP server accepts them, of course).

    I can't think of any other trick to make this work, or at least to throw
    an error/warning instead of timing out.
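
    (Aside on the time-out part: PHP's execution-time limit is separate from
    memory_limit and can be raised on its own; a minimal sketch, assuming a
    long-running file job:)

```php
<?php
// The execution-time limit is separate from memory_limit; raising it
// stops the "maximum execution time exceeded" fatal error on long jobs.
set_time_limit(0); // 0 removes the limit for the rest of this request

echo ini_get('max_execution_time'); // reports 0 after the call
```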

    Help!

    Thanx
    Phil

  • ZeldorBlat

    #2
    Re: can't read large files - help


    comp.lang.php wrote:
    > [snip: original post and bigfile() code]

    What exactly are you trying to achieve? What do you mean by "it
    doesn't work?" Some more details will help us suggest a solution...


    • comp.lang.php

      #3
      Re: can't read large files - help


      ZeldorBlat wrote:
      > comp.lang.php wrote:
      > > [snip: original post and bigfile() code]
      >
      > What exactly are you trying to achieve? What do you mean by "it
      > doesn't work?" Some more details will help us suggest a solution...

      At the moment I can break large files up into an array by using my
      function above, bigfile(), instead of file(), and by temporarily
      increasing memory, so it seems I solved it after all; I can open and
      parse larger files this way, so thanx!

      Phil


      • ZeldorBlat

        #4
        Re: can't read large files - help


        comp.lang.php wrote:
        > ZeldorBlat wrote:
        > > [snip: earlier exchange and bigfile() code]
        >
        > At the moment I am able to allow for large files to be broken up into
        > an array by not using file() but by using my function above, bigfile(),
        > by increasing memory temporarily, so it seems I solved it after all; I
        > can accomplish the opening and parsing of larger files this way, so
        > thanx!
        >
        > Phil

        I ask the question because you're trying to break it up into an array
        of lines -- which suggests that you're doing something with the data on
        a line-by-line basis. If that's the case, why not read a single line,
        do something with it, then read the next line? Then you don't need to
        load the whole thing into memory first.

        As I said before, though, it all depends on what you're trying to do.
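
        (The read-a-line, process, discard loop described above can be
        sketched like this; the temp file and the line counting are
        illustrative stand-ins for real per-line work:)

```php
<?php
// Sketch of line-by-line processing: memory use stays constant no
// matter how big the file is, because only one line is held at a time.
// The temp file and the counting below are illustrative only.

$path = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($path, "alpha\nbeta\ngamma\n");

$count = 0;
$handle = fopen($path, 'r');
while (($line = fgets($handle)) !== false) {
    // Do the real per-line work here instead of appending to an array.
    $count++;
}
fclose($handle);
unlink($path);

echo $count; // prints 3
```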


        • comp.lang.php

          #5
          Re: can't read large files - help


          ZeldorBlat wrote:
          > [snip: earlier exchange and bigfile() code]
          >
          > I ask the question because you're trying to break it up into an array
          > of lines -- which suggests that you're doing something with the data on
          > a line-by-line basis. If that's the case, why not read a single line,
          > do something with it, then read the next line? Then you don't need to
          > load the whole thing into memory first.
          >
          > As I said before, though, it all depends on what you're trying to do.

          What I am trying to do is attach the file to an auto-generated
          email.

          Phil


          • ZeldorBlat

            #6
            Re: can't read large files - help


            comp.lang.php wrote:
            > [snip: earlier exchange and bigfile() code]
            >
            > What I am trying to do is to load the file as an attachment to an
            > auto-generated email.
            >
            > Phil

            So let me make sure I understand this. You're trying to take a file
            that's so large that the normal file-handling mechanisms can't deal
            with it, then send that massive file as an email attachment?


            • comp.lang.php

              #7
              Re: can't read large files - help


              ZeldorBlat wrote:
              > [snip: earlier exchange and bigfile() code]
              >
              > So let me make sure I understand this. You're trying to take a file
              > that's so large that the normal file-handling mechanisms can't deal
              > with it, then send that massive file as an email attachment?

              No trying involved, I can do it now. I just use my own function,
              bigfile(), instead of file(), and temporarily increase memory.

              Business requirement, plain and simple.

              Phil
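
              (For what it's worth, attaching a file doesn't require splitting
              it into lines at all: it can be streamed straight into base64 for
              the MIME body, so the raw file is never held in memory all at
              once. A sketch, with a throwaway temp file standing in for the
              real one and an illustrative boundary string and filename:)

```php
<?php
// Sketch: base64-encode a file for a MIME attachment in fixed-size
// chunks instead of building a line array first.
// The temp file, filename and boundary below are illustrative only.

$path = tempnam(sys_get_temp_dir(), 'demo');
$original = str_repeat("payload data\n", 1000);
file_put_contents($path, $original);

$in = fopen($path, 'rb');
$encoded = '';
while (!feof($in)) {
    // 8190 is a multiple of 3, so each chunk encodes to whole base64
    // quads and the pieces concatenate into one valid base64 stream.
    $chunk = fread($in, 8190);
    if ($chunk !== false && $chunk !== '') {
        $encoded .= chunk_split(base64_encode($chunk));
    }
}
fclose($in);
unlink($path);

$boundary = 'example-boundary';
$body  = "--$boundary\r\n";
$body .= "Content-Type: application/octet-stream; name=\"report.txt\"\r\n";
$body .= "Content-Transfer-Encoding: base64\r\n";
$body .= "Content-Disposition: attachment; filename=\"report.txt\"\r\n\r\n";
$body .= $encoded . "--$boundary--\r\n";

// base64 decoders ignore the line breaks chunk_split() inserts.
echo base64_decode($encoded) === $original ? 'round-trip ok' : 'mismatch';
```

              $body would then be sent via mail() with a matching
              multipart/mixed header naming the same boundary.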
