readfile_and_downloads

This topic is closed.
  • Cousin Stanley

    readfile_and_downloads

    Greetings ....

    I'm a dinosaur-age programmer but a php neophyte
    trying to put together some server-side php code
    that regulates the availability of a set of files
    for downloading, provides the download, and logs
    related data to a MySQL DB that will be used
    to assist with subsequent regulation decisions ....

    The readfile() function seems to be a convenient way
    to provide the download ....

    $num_bytes = readfile( $file_path ) ;

    The php docs for the readfile() function says ....

    Reads a file and writes it to the output buffer.

    Does readfile slurp the entire file into memory
    before beginning to write or does it read-a-little
    and then write-a-little chunk-wise via internal io
    buffers ?

    In the case where the files to be downloaded
    are fairly large, e.g. full CD sized or larger,
    trying to slurp the whole file before beginning
    the write phase seems potentially problematic,
    e.g. high server loads and swap-prone ....

    I don't know whether this could really be a problem
    or if I'm overly concerned with something that could
    take care of itself via normal system io buffering
    and individual task processing mechanisms ....

    Would coding a function using a < fread|fwrite > loop
    where only a chunk at a time is processed in each pass
    work out any better for downloading large files in cases
    where multiple users are simultaneously downloading ?


    --
    Stanley C. Kitching
    Human Being
    Phoenix, Arizona


    ----== Posted via Newsfeeds.Com - Unlimited-Unrestricted-Secure Usenet News==----
    http://www.newsfeeds.com The #1 Newsgroup Service in the World! 120,000+ Newsgroups
    ----= East and West-Coast Server Farms - Total Privacy via Encryption =----
  • Sean

    #2
    Re: readfile_and_downloads

    Something like this should work; you'll have to add the logging
    features yourself.

    function download_file($filename, $content_type = 'application/octet-stream', $isDel = false)
    {
        $file = basename($filename);

        // IE mangles filenames containing multiple dots;
        // encode all but the last one
        if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
        {
            $file = preg_replace('/\./', '%2e', $file, substr_count($file, '.') - 1);
        }

        // make sure the file exists before sending headers
        if (!$fdl = @fopen($filename, 'rb'))
        {
            die("<br>Cannot Open File!<br>");
        }
        else
        {
            header("Cache-Control: "); // leave blank to avoid IE errors
            header("Pragma: ");        // leave blank to avoid IE errors
            header("Content-Type: $content_type");
            header("Content-Disposition: attachment; filename=\"$file\"");
            header("Content-Length: " . (string) filesize($filename));
            fpassthru($fdl);
            fclose($fdl);
        }
        if ($isDel)
        {
            @unlink($filename);
        }
    }

    download_file('./test.csv');

    That should work across Firefox and IE (I don't know about Safari or
    Opera), and the code can be tidied up a bit.

    I think processing only chunks of a file at a time could get
    unnecessarily complex. If you are worried about performance, why not
    set up your downloads as torrents?


    • Sjoerd

      #3
      Re: readfile_and_downloads

      @Sean: He is not worried about performance, he is worried that he will
      not have enough memory to let 10 users download a 1 GB file.

      @Cousin: Please read the comments on
      http://www.php.net/manual/en/function.readfile.php. They say that
      readfile is slower, modifies the "last-modified" time and reads the
      entire file in memory. You are probably better off with a fread loop:

      $fp = fopen("filename", "r");
      while ($data = fread($fp, 100000)) {
          echo $data;
      }
      fclose($fp);

      One more thing to think about: normally, PHP pages have a maximum
      execution time of 30 seconds. Use set_time_limit to allow transfers
      that take longer than 30 seconds.
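      Putting those two points together, a chunked sender might look like
      this (a minimal sketch; the function name and chunk size are
      illustrative, not from the thread):

      ```php
      <?php
      // Minimal sketch of a chunked sender: set_time_limit() lifts the
      // 30-second cap, and only one chunk is held in memory at a time.
      function send_file_chunked($path, $chunk_size = 8192)
      {
          set_time_limit(0);            // allow transfers longer than 30 seconds
          $fp = fopen($path, 'rb');     // binary-safe mode
          if ($fp === false) {
              return false;
          }
          $sent = 0;
          while (!feof($fp)) {
              $data = fread($fp, $chunk_size);
              echo $data;               // hand the chunk to the output buffer
              flush();                  // push it to the client instead of buffering
              $sent += strlen($data);
          }
          fclose($fp);
          return $sent;                 // byte count, handy for the MySQL log
      }
      ```

      The Content-Type / Content-Disposition headers from Sean's example
      would still have to be sent before calling this.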

      Another possibility is to have the PHP script do logging and what not,
      and then redirect to the actual file:

      header("Location: http://my.host.com/actual.file");
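
      The log-then-redirect pattern could be sketched like this (the
      logging helper and file paths are made up for illustration; the
      thread's MySQL table would take the place of the flat file):

      ```php
      <?php
      // Sketch: record the request in PHP, then let the web server do
      // the transfer. A flat-file log stands in for the MySQL table.
      function append_download_log($logfile, $ip, $file)
      {
          $line = date('c') . "\t" . $ip . "\t" . $file . "\n";
          return file_put_contents($logfile, $line, FILE_APPEND);
      }

      // Usage at the top of the download script (illustrative):
      //   append_download_log('/var/log/downloads.log',
      //                       $_SERVER['REMOTE_ADDR'], 'actual.file');
      //   header("Location: http://my.host.com/actual.file");
      //   exit;
      ```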


      • Chung Leong

        #4
        Re: readfile_and_downloads

        The behavior of readfile() is platform dependent. On operating
        systems that support memory-mapped files (e.g. Win32), the entire
        file is mapped into the process's memory space and sent in one
        chunk (hence the complaints that the function reads in the whole
        file). On operating systems that do not support memory-mapped
        files, the function dispatches the data in 8K chunks.


        • Cousin Stanley

          #5
          Re: readfile_and_downloads

          > Something like this would work,
          > you have to put some logging features in.
          >
          > function download_file( .... )
          > {
          > ....
          > fpassthru( $fdl ) ;
          > ....
          > }
          >
          > download_file( './test.csv' ) ;

          Sean ....

          Thanks for the reply and the example code ....

          As a php rookie I wasn't aware of the fpassthru() function
          so I checked my local php docs ....

          Reads to EOF on the given file pointer
          from the current position and writes
          the results to the output buffer ....

          This seems to work in a similar manner
          to the readfile() function slurping up
          the entire file ....
          > That should work across Firefox and IE
          > ( I don't know about Safari or Opera ).

          Dealing with differences in various browsers
          always hurts my head ....

          Thanks for the reminder that I need
          to keep these differences in mind ....
          > ....
          > I think processing only chunks of a file at a time
          > could get unnecessarily complex.

          It will require a few more lines of code to write
          and maintain, but in cases where the size of the
          files to be downloaded is large and there are
          multiple users attempting to get them at the same time,
          maybe chunking the data across could help to keep
          the server running smoothly ....
          > If you are worried about performance
          > why not set up your downloads as torrents?

          I think there will be some alternate sites
          offering torrent downloads ....

          We have a limited monthly bandwidth limit
          and want to try and distribute it out
          evenly over the entire month ....


          --
          Stanley C. Kitching
          Human Being
          Phoenix, Arizona




          • Cousin Stanley

            #6
            Re: readfile_and_downloads

            > @Sean: He is not worried about performance,
            > he is worried that he will not have enough memory
            > to let 10 users download a 1 GB file.

            Sjoerd ....

            Yes, this is my primary concern ....

            The files I'm currently dealing with
            are in the CD-Sized range ~ 600 MB ....
            > @Cousin: Please read the comments on
            > http://www.php.net/manual/en/function.readfile.php.

            Thanks for pointing me toward the on-line php docs ....

            I have local copies under Debian Linux
            but they don't include the additional
            and informative user feedback ....
            > They say that readfile is slower,
            > modifies the "last-modified" time
            > and reads the entire file in memory.

            Most all of the comments there regarding readfile
            seemed interesting ....
            > You are probably better off with a fread loop:

            I think so too, and more so
            after reading the on-line docs ....
            > $fp = fopen( "filename", "r" ) ;
            >
            > while ( $data = fread( $fp , 100000 ) )
            > {
            > echo $data ;
            > }
            >
            > fclose( $fp ) ;
            >
            > One more thing to think about: normally, PHP pages have a maximum
            > execution time of 30 seconds. Use set_time_limit to allow transfers
            > that take longer than 30 seconds.

            I am aware that there is a limit,
            and will look into learning how
            to deal with it ....
            > Another possibility is to have the PHP script do logging and what not,
            > and then redirect to the actual file:
            >
            > header( "Location: http://my.host.com/actual.file" ) ;

            When you redirect with a Location header in this manner,
            what actually performs the download ?

            In other words, would the redirection call readfile()
            to do the transfer ?

            If I pre-log before the redirection
            it seems that it might be difficult
            to know whether or not the download
            was fully completed and how much data
            was actually transferred in cases
            where the download was incomplete ....
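
            One way to get that information back is to do the transfer
            chunk-wise in PHP and check connection_aborted() after each
            chunk; the byte count survives a disconnect and can go into
            the log. A sketch (the function name and return shape are
            illustrative, not from the thread):

            ```php
            <?php
            // Sketch: count bytes as they go out and notice a client
            // disconnect, so the log can distinguish complete from
            // partial downloads.
            function send_and_log($path, $chunk_size = 8192)
            {
                ignore_user_abort(true);          // keep running after a disconnect so we can log
                $fp = fopen($path, 'rb');
                if ($fp === false) {
                    return false;
                }
                $sent = 0;
                while (!feof($fp)) {
                    $data = fread($fp, $chunk_size);
                    echo $data;
                    flush();
                    $sent += strlen($data);
                    if (connection_aborted()) {   // client went away: stop, keep the count
                        break;
                    }
                }
                fclose($fp);
                $complete = ($sent === filesize($path));
                return array($sent, $complete);   // both values go into the MySQL log
            }
            ```

            A redirect hands the transfer to the web server, so PHP never
            sees how many bytes went out; this approach trades that away
            for per-download accounting.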

            Thanks very much for the reply ....

            It's greatly appreciated ....


            --
            Stanley C. Kitching
            Human Being
            Phoenix, Arizona




            • Cousin Stanley

              #7
              Re: readfile_and_downloads

              > The behavior of readfile() is platform dependent.
              >
              > On operating systems that support memory-mapped files
              > ( e.g. Win32 ), the entire file is mapped into
              > the process's memory space and sent in one chunk
              > ( hence the complaints that the function reads in
              > the whole file ).
              >
              > On operating systems that do not support memory-mapped files,
              > the function dispatches the data in 8K chunks.

              Chung Leong ....

              Thanks for the reply ....

              The commercial hosting service that I'm now dealing with
              is running FreeBSD and I use Debian Linux on my local
              test machine ....

              However, I don't know about the memory-mapped file behavior
              on either of these systems ....

              I currently don't have to deal with
              any WinXX servers ....


              --
              Stanley C. Kitching
              Human Being
              Phoenix, Arizona



