your opinion on this

This topic is closed.
  • henribaeyens

    your opinion on this

    Hello,

    So, I did this website for a client; one part gives users the opportunity
    to download various documents (generally Word documents), but they have
    to pay for that. We use micropayments. Upon payment, a script looks up
    the file name in a database, builds a URL, and the download begins. I
    want to protect the directory the downloadable files reside in; obviously
    an htaccess directive would prevent all access and hence all downloads.
    So I thought of this: store the files in an htaccess-protected directory,
    and when one is requested, copy it to a public directory, give it a
    random name, and feed the URL to the browser. To prevent files from
    piling up in the download directory, I would have to set up a task (cron
    job?) to delete all files whose creation (or last-access) date is older
    than a given period. The idea here is also to prevent anyone from, say,
    jotting down the file's URL and accessing it at a later date. Granted,
    he/she/it did pay for the file, but he/she/it could just as well pass the
    URL on to someone else.
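    As a sketch, that cron-driven cleanup could be a small PHP CLI script
    (the directory path below is just a placeholder for the example):

    ```php
    <?php
    // cleanup.php -- delete files in the public download directory that
    // were last modified more than an hour ago. Run from cron, e.g.:
    //   0 * * * * php /path/to/cleanup.php
    // /var/www/html/downloads is a placeholder path.
    $dir = $argv[1] ?? '/var/www/html/downloads';
    foreach ((glob($dir . '/*') ?: []) as $file) {
        if (is_file($file) && time() - filemtime($file) > 3600) {
            unlink($file); // older than the allowed window: remove it
        }
    }
    ```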

    What do you think about that; both on the principle and on the
    methodology.

    Thanks

    I realize it might not be the correct group to ask such a question, but
    someone here has perhaps come across a similar issue.
  • Mark

    #2
    Re: your opinion on this

    Doesn't sound like an elegant solution. I'm no expert in this field,
    but .htaccess wouldn't prevent you from doing this, would it?

    <?php
    // downloading a file
    $filename = $_GET['path'];

    // fix for IE caching / PHP bug issue
    header("Pragma: public");
    header("Expires: 0"); // set expiration time
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    // browser must download the file from the server instead of its cache

    // force the download dialog
    header("Content-Type: application/force-download");
    header("Content-Type: application/octet-stream");
    header("Content-Type: application/download");

    // use the Content-Disposition header to supply a recommended filename
    // and force the browser to display the save dialog
    header("Content-Disposition: attachment; filename=" . basename($filename) . ";");

    /*
    The Content-Transfer-Encoding header should be binary, since the file
    will be read directly from disk and the raw bytes passed to the
    downloading computer. The Content-Length header is useful to set for
    downloads: with it the browser can show a progress meter as the file
    downloads. The length can be determined with the filesize() function,
    which returns the size of a file.
    */
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: " . filesize($filename));

    @readfile($filename);
    exit(0);
    ?>

    (yoinked from http://ca.php.net/manual/en/function.header.php by
    milin_mestry at yahoo dot com)


    That way you can keep the file in its current directory, as well as
    rename it on download, or whatever you please, and you never even have
    to reveal the full URL to the user.


    On Aug 13, 3:42 pm, henribaeyens <cont...@myname.com> wrote:
    > So, I did this website for a client; one part gives users the
    > opportunity to download various documents (generally Word documents)
    > but they have to pay for that. [...]
    >
    > What do you think about that; both on the principle and on the
    > methodology.


    • sheldonlg

      #3
      Re: your opinion on this

      henribaeyens wrote:
      > So, I did this website for a client; one part gives users the
      > opportunity to download various documents (generally Word documents)
      > but they have to pay for that. [...]

      ...and just what is to prevent someone who has purchased the file from
      simply emailing that file as an attachment, or putting it on a CD, or
      uploading it to his own website, or whatever, and giving it to someone
      else?

      I did a website like this and I pointed this out at the time, but they
      wanted it anyway, so I did it. After all, they were paying. I put a
      counter in a database which I decremented (they were allowed three
      downloads), and only provided the file if downloads were available
      (after a login, of course).
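      Sketched in PHP, the counter part might look like this (the table and
      column names are made up for the example; assumes PDO and an already
      logged-in user):

      ```php
      <?php
      // Sketch of the decrementing-counter idea. Hypothetical schema:
      //   purchases(user_id, file_id, downloads_left)
      // $pdo is a connected PDO instance; authentication is assumed to
      // have happened before this is called.
      function consumeDownloadCredit(PDO $pdo, int $userId, int $fileId): bool
      {
          // Atomically use up one download credit, if any remain.
          $stmt = $pdo->prepare(
              "UPDATE purchases SET downloads_left = downloads_left - 1
               WHERE user_id = ? AND file_id = ? AND downloads_left > 0"
          );
          $stmt->execute([$userId, $fileId]);
          return $stmt->rowCount() > 0; // false => no downloads left, refuse
      }
      ```

      Only when this returns true would the script go on to send the file.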


      • The Natural Philosopher

        #4
        Re: your opinion on this

        henribaeyens wrote:
        > So, I did this website for a client; one part gives users the
        > opportunity to download various documents (generally Word documents)
        > but they have to pay for that. [...]
        Far too complicated: use a database to store the files as BLOBs.
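        For what it's worth, the BLOB route might be sketched like this
        (SQLite in-memory purely for the example; a real setup would use,
        say, a MySQL LONGBLOB column, and the body would come from the
        actual uploaded file):

        ```php
        <?php
        // Sketch of storing and serving documents as BLOBs via PDO.
        $pdo = new PDO('sqlite::memory:');
        $pdo->exec('CREATE TABLE documents (id INTEGER PRIMARY KEY, name TEXT, body BLOB)');

        // Storing a document (body is a literal here just for the sketch):
        $stmt = $pdo->prepare('INSERT INTO documents (name, body) VALUES (?, ?)');
        $stmt->execute(['report.doc', 'example file bytes']);

        // Serving it later (after the payment check), no file on disk involved:
        $stmt = $pdo->prepare('SELECT name, body FROM documents WHERE id = ?');
        $stmt->execute([1]);
        $doc = $stmt->fetch(PDO::FETCH_ASSOC);
        header('Content-Disposition: attachment; filename="' . $doc['name'] . '"');
        echo $doc['body'];
        ```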


        • David Quinton

          #5
          Re: your opinion on this

          On 13 Aug 2008 22:42:24 GMT, henribaeyens <contact@myname.com> wrote:
          >Hello,
          >
          >So, I did this website for a client; one part gives users the opportunity
          >to download various documents
          Why reinvent the wheel?

          We use IPNMonitor:

          (but it uses PayPal IPN)
          --
          Locate your Mobile phone: <http://www.bizorg.co.uk/news.html>
          Great gifts: <http://www.ThisBritain.com/ASOS_popup.html>


          • Michael Fesser

            #6
            Re: your opinion on this

            ..oO(henribaeyens)
            >So, I did this website for a client; one part gives users the opportunity
            >to download various documents (generally Word documents) but they have to
            >pay for that. We use micropayments. Upon payment, a script looks up the
            >file name in a database, establishes a url, and the dl begins. I want to
            >protect the directory the downloadable files reside in; obviously an
            >htaccess directive would prevent all access and thence all downloads.
            The better way would be to store these files outside the document root,
            no .htaccess needed there.
            >So
            >I thought of this: store the files in an htaccess-protected directory,
            >and when it is requested, copy it to a public directory, give it a random
            >name, and feed the url to the browser.
            Ugly and insecure (security by obscurity).
            >To prevent files from piling up in
            >the download directory, I would have to set up a task (cron job?) to
            >delete all files whose date of creation (or last access) is more the a
            >given period of time.
            Even more ugly.
            >The idea here is also to prevent anyone from, say,
            >jotting down the file's url and access it at a date.
            If it's publicly available, it can be downloaded.
            >Granted, he/she/it
            >did pay for the file but he/she/it could as well pass the url on to
            >someone else.
            This can happen everywhere, you can't prevent that.
            >What do you think about that; both on the principle and on the
            >methodology.
            Bad idea. Store the files outside the docroot as already said, then
            use a script to deliver them to the clients. The script just has to
            check that the user is allowed to download the requested file.
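            A minimal sketch of that approach (the directory path is invented
            for the example; the client supplies only a file id, never a
            path, and the real name is looked up server-side):

            ```php
            <?php
            // Sketch: documents live outside the document root. Map a
            // database-supplied name onto the private directory, refusing
            // anything that does not resolve to an existing file there.
            // basename() guards against ../ tricks in the stored name.
            function resolvePrivatePath(string $dir, string $name): ?string
            {
                $path = $dir . '/' . basename($name);
                return is_file($path) ? $path : null;
            }

            // Typical use in the download script (payment check elided):
            //   $path = resolvePrivatePath('/var/www/private_files', $nameFromDb);
            //   if ($path === null) { header('HTTP/1.1 404 Not Found'); exit; }
            //   header('Content-Type: application/octet-stream');
            //   header('Content-Disposition: attachment; filename="' . basename($path) . '"');
            //   header('Content-Length: ' . filesize($path));
            //   readfile($path);
            ```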

            Micha


            • Michael Fesser

              #7
              Re: your opinion on this

              ..oO(Mark)
              >Doesn't sound like an elegant solution. I'm no expert in this field,
              >but .htaccess wouldn't prevent you from doing this, would it?
              >
              ><?php
              >// downloading a file
              >$filename = $_GET['path'];
              >
              >[...]
              >
              >@readfile($filename);
              Never(!) use any client data without proper validation! Do you know what
              the above would allow an attacker to do? To download every file on the
              entire server which is readable for the web server! This would include
              important system configuration files, your own scripts, probably your
              database credentials ...
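              If a script really must accept a file name from the client, one
              common guard (a sketch, not the only option) is to resolve the
              name with realpath() and require the result to stay inside the
              allowed directory:

              ```php
              <?php
              // Reject any request that escapes $allowedDir, such as
              // ?path=../../etc/passwd. realpath() resolves ".." and
              // symlinks before the prefix check is made.
              function isInsideDir(string $requested, string $allowedDir): bool
              {
                  $real = realpath($allowedDir . '/' . $requested);
                  $base = realpath($allowedDir);
                  return $real !== false
                      && $base !== false
                      && strncmp($real, $base . DIRECTORY_SEPARATOR, strlen($base) + 1) === 0;
              }
              ```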

              Micha


              • Mark

                #8
                Re: your opinion on this

                On Aug 14, 10:46 am, Michael Fesser <neti...@gmx.de> wrote:
                > Never(!) use any client data without proper validation! Do you
                > know what the above would allow an attacker to do? To download
                > every file on the entire server which is readable for the web
                > server! [...]
                Well... I'd like to say "that goes without saying", but I guess I
                can't. That was a copy and paste job :p
