Stream from FTP directly to MySQL while parsing CSV

  • Eric Anderson

    Stream from FTP directly to MySQL while parsing CSV

    I have some files that sit on an FTP server. These files contain data
    stored in a tab-separated format. I need to download these files and
    insert/update them in a MySQL database. My current basic strategy is to
    do the following:

    1) Log in to the FTP server using the FTP library in PHP.
    2) Create a variable that acts like a file handle using Stream_Var in PEAR.
    3) Use ftp_fget() to read a remote file into this variable (this is so I
    don't have to write it to disk).
    4) Parse the data now stored in memory using fgetcsv() (again treating
    that variable as a file handle using Stream_Var). This produces an array.
    5) Insert/update the data in the array using DB in PEAR.

    This all seems to work, and it means I don't have to write anything to
    disk. Everything is handled in memory, so no temp files are needed. The
    downside is that some of these files are very large, so the program can
    consume large amounts of memory. I want to see what I can do to reduce
    this memory usage.

    In a perfect world I wouldn't need to keep the entire file in memory. As
    soon as a single line is read via FTP, I should be able to pass that line
    off to the CSV parsing code, and the MySQL insert/update should take
    place as each line is parsed by the CSV library. I.e., I should never
    have more than a buffer's worth of data in memory at a time. The buffer
    would need to hold at least an entire line, but my memory requirements
    would drop significantly.
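
    For concreteness, the strategy above looks roughly like this. This is
    only a sketch: the host, credentials, filenames, table, and columns are
    hypothetical, and the Stream_Var `var://` path syntax is from memory and
    may differ in your PEAR version.

    ```php
    <?php
    // Sketch of the all-in-memory approach (PHP 4.3-era APIs).
    require_once 'Stream/Var.php';
    require_once 'DB.php';

    // Register PEAR's Stream_Var so a variable can be used as a stream.
    stream_wrapper_register('var', 'Stream_Var');

    $ftp = ftp_connect('ftp.example.com');       // hypothetical host
    ftp_login($ftp, 'user', 'pass');

    // Read the entire remote file into a variable-backed stream.
    $GLOBALS['csvdata'] = '';
    $fp = fopen('var://GLOBALS/csvdata', 'w');
    ftp_fget($ftp, $fp, 'data.txt', FTP_ASCII);  // whole file now in memory
    fclose($fp);

    // Re-open the same variable for reading and parse it line by line.
    $db =& DB::connect('mysql://user:pass@localhost/mydb');
    $fp = fopen('var://GLOBALS/csvdata', 'r');
    while ($row = fgetcsv($fp, 4096, "\t")) {    // tab-separated fields
        $db->query('REPLACE INTO mytable (a, b) VALUES (?, ?)', $row);
    }
    fclose($fp);
    ftp_close($ftp);
    ?>
    ```

    The memory problem is visible in the middle step: ftp_fget() completes
    before any parsing starts, so the whole file sits in $GLOBALS['csvdata'].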

    My problem is that I can't figure out how to do this with the current
    PHP libraries. It seems that most functions in PHP are not designed
    around the idea of piping streams of information together.

    The other restriction I have is that I am limited to PHP 4.3. Any ideas,
    or is holding the entire file in memory the best way (other than writing
    my own libraries)?

    Eric
  • Chung Leong

    #2
    Re: Stream from FTP directly to MySQL while parsing CSV

    Eric Anderson wrote:
    > I have some files that sit on a FTP server. These files contain data
    > stored in a tab-separated format. I need to download these files and
    > insert/update them in a MySQL database. My current basic strategy is to
    > do the following:
    >
    > 1) Login to the ftp server using the FTP library in PHP
    > 2) Create a variable that acts like a file handle using Stream_Var in PEAR.
    > 3) Use ftp_fget() to read a remote file into this variable (this is so I
    > don't have to write it to disk).
    > 4) Parse that data now stored in memory using fgetcsv() (again treating
    > that variable as a file handle using Stream_Var). This produces an array.
    > 4) Insert/Update the data in the array using DB in PEAR.

    Is there a reason why you can't use the built-in FTP stream wrapper?


    • NC

      #3
      Re: Stream from FTP directly to MySQL while parsing CSV

      Eric Anderson wrote:
      >
      > I have some files that sit on a FTP server. These files contain data
      > stored in a tab-separated format. I need to download these files and
      > insert/update them in a MySQL database. My current basic strategy
      > is to do the following:
      >
      > 1) Login to the ftp server using the FTP library in PHP
      > 2) Create a variable that acts like a file handle using Stream_Var in PEAR.
      > 3) Use ftp_fget() to read a remote file into this variable (this is so I
      > don't have to write it to disk).
      > 4) Parse that data now stored in memory using fgetcsv() (again treating
      > that variable as a file handle using Stream_Var). This produces an array.
      > 4) Insert/Update the data in the array using DB in PEAR.
      >
      > This all seems to work and it means I don't have to write anything to
      > disk.

      This is not necessarily a good thing. By keeping everything in memory
      to minimize disk usage, you are missing out on MySQL's ability to
      process large files very quickly. I would suggest an alternative
      approach:

      1. Copy the remote file to your local disk.
      2. Use LOAD DATA INFILE to load the data into MySQL.
      3. Delete the data file, if necessary.
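
      A minimal sketch of those three steps (host, credentials, filename,
      and table name are all hypothetical):

      ```php
      <?php
      // 1. Copy the remote file to local disk via the ftp:// wrapper.
      $local = '/tmp/import.txt';
      copy('ftp://user:pass@ftp.example.com/data.txt', $local);

      // 2. Let MySQL bulk-load it. LOAD DATA INFILE defaults to
      //    tab-separated fields, which matches the files described above.
      $link = mysql_connect('localhost', 'user', 'pass');
      mysql_select_db('mydb', $link);
      mysql_query("LOAD DATA INFILE '/tmp/import.txt' REPLACE INTO TABLE mytable", $link);

      // 3. Delete the data file.
      unlink($local);
      ?>
      ```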

      Cheers,
      NC


      • d

        #4
        Re: Stream from FTP directly to MySQL while parsing CSV

        "Eric Anderson" <eric@afaik.us> wrote in message
        news:43d972b3$0$3039$6d36acad@titian.nntpserver.com...
        >I have some files that sit on a FTP server. These files contain data stored
        >in a tab-separated format. I need to download these files and insert/update
        >them in a MySQL database. My current basic strategy is to do the following:
        >
        > 1) Login to the ftp server using the FTP library in PHP
        > 2) Create a variable that acts like a file handle using Stream_Var in
        > PEAR.
        > 3) Use ftp_fget() to read a remote file into this variable (this is so I
        > don't have to write it to disk).
        > 4) Parse that data now stored in memory using fgetcsv() (again treating
        > that variable as a file handle using Stream_Var). This produces an array.
        > 4) Insert/Update the data in the array using DB in PEAR.
        >
        > This all seems to work and it means I don't have to write anything to
        > disk. Everything is handled in memory so not temp files are needed. The
        > downside is that some of these files are very large so the program can
        > consume large amounts of memory. I want to see what I can do to reduce
        > this memory usage.
        >
        > In a perfect world I don't need to keep the entire file in memory. As soon
        > as a single line is read via FTP I should be able to pass that line off to
        > the CSV parsing code and the MySQL insert/update should be able to take
        > place as each line is parsed by the CSV library. I.E. I should have more
        > than a buffer worth of data in memory at a time. A buffer would need to be
        > able to store at least a entire line but my memory requirements would drop
        > significantly.
        >
        > My problem is that I can't seem to be able to figure out how to do this
        > with the current PHP libraries. It seems that most functions in PHP are
        > not designed around the idea of piping streams of information together.
        >
        > The other restriction I have is that I am limited to just PHP 4.3. Any
        > ideas or is holding the entire file in memory the best way (other than
        > writing my own libraries).
        >
        > Eric

        As Chung was hinting at, use the FTP wrapper by simply opening the file
        with fopen() as you would a local file. You can then use fgets() to read
        a single line at a time, process that line, and repeat until the file
        has been read in its entirety.
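
        Something along these lines; since the files are tab-separated,
        fgetcsv() can do both the reading and the parsing in one step
        (credentials and filename are hypothetical):

        ```php
        <?php
        // One line in memory at a time via the built-in ftp:// wrapper.
        $fp = fopen('ftp://user:pass@ftp.example.com/data.txt', 'r');
        if (!$fp) {
            die('Could not open remote file');
        }
        while (!feof($fp)) {
            // fgetcsv() reads and parses one line; "\t" for tab-separated.
            $row = fgetcsv($fp, 4096, "\t");
            if ($row === false) {
                continue;   // blank line or end of file
            }
            // ... insert/update $row in the database here ...
        }
        fclose($fp);
        ?>
        ```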

        dave



        • Eric Anderson

          #5
          Re: Stream from FTP directly to MySQL while parsing CSV

          d wrote:
          > As chung was hinting at, use the FTP wrapper by simply opening the file with
          > fopen() as you would a local file. You can then use fgets() to read a
          > single line at a time, process that line, and repeat until the file has been
          > read in its entirity.

          I thought about that, but I am also reading a large number of files
          from the FTP server into the database, and my assumption is that
          fopen() would log in to the FTP server for every file rather than
          just once, adding a good bit of overhead.

          Thanks for the suggestion though. I'll keep it in mind.

          Eric


          • Eric Anderson

            #6
            Re: Stream from FTP directly to MySQL while parsing CSV

            NC wrote:
            > This is not necessarily a good thing. Because you want minimize the
            > disk usage, you are missing out on MySQL's ability to process large
            > files very quickly. I would suggest an alternative approach:
            >
            > 1. Copy the remote file to your local disk.
            > 2. Use LOAD DATA INFILE to load the data into MySQL.
            > 3. Delete the data file, if necessary.

            Interesting approach. It would be fast and low on resources
            (although it would require use of the filesystem, but perhaps that
            isn't too big of a deal). The only downside is that it is
            MySQL-specific. Currently this application is database-independent,
            and it would be nice to keep it that way. I'll keep it in mind.

            Eric


            • Chung Leong

              #7
              Re: Stream from FTP directly to MySQL while parsing CSV


              Eric Anderson wrote:
              > d wrote:
              > > As chung was hinting at, use the FTP wrapper by simply opening the file with
              > > fopen() as you would a local file. You can then use fgets() to read a
              > > single line at a time, process that line, and repeat until the file has been
              > > read in its entirity.
              >
              > I thought about that but I am also reading a large number of file from
              > the FTP server into the database and my assumption is that if I use
              > fopen it will login to the FTP server everytime vs just once adding to
              > the overhead a good bit.
              >
              > Thanks for the suggestion though. I'll keep it in mind.
              >
              > Eric

              Good point. On the other hand, the file transfer would happen
              concurrently with the database inserts. If the file is large, the
              total time for the entire operation would be lower.


              • NC

                #8
                Re: Stream from FTP directly to MySQL while parsing CSV

                Eric Anderson wrote:
                > NC wrote:
                > > This is not necessarily a good thing. Because you want minimize the
                > > disk usage, you are missing out on MySQL's ability to process large
                > > files very quickly. I would suggest an alternative approach:
                > >
                > > 1. Copy the remote file to your local disk.
                > > 2. Use LOAD DATA INFILE to load the data into MySQL.
                > > 3. Delete the data file, if necessary.
                >
                > Interesting approach. It would be fast and low on resources (although it
                > would require usage of the filesystem but perhaps that isn't too big of
                > a deal). The only downside is that it is MySQL specific. Currently this
                > application is database independent and it would be nice to keep it that
                > way.

                If memory serves, all SQL databases support importing text
                files. The query syntax may differ, but the concept is clearly
                there...

                Cheers,
                NC


                • Eric Anderson

                  #9
                  Re: Stream from FTP directly to MySQL while parsing CSV

                  NC wrote:
                  > Eric Anderson wrote:
                  >> NC wrote:
                  >>> This is not necessarily a good thing. Because you want minimize the
                  >>> disk usage, you are missing out on MySQL's ability to process large
                  >>> files very quickly. I would suggest an alternative approach:
                  >>>
                  >>> 1. Copy the remote file to your local disk.
                  >>> 2. Use LOAD DATA INFILE to load the data into MySQL.
                  >>> 3. Delete the data file, if necessary.
                  >> Interesting approach. It would be fast and low on resources (although it
                  >> would require usage of the filesystem but perhaps that isn't too big of
                  >> a deal). The only downside is that it is MySQL specific. Currently this
                  >> application is database independent and it would be nice to keep it that
                  >> way.
                  >
                  > If memory serves, all SQL databases support import of text files. The
                  > query syntax may differ, but the concept is clearly there...

                  I went with a modification of this method. I got to thinking
                  that PHP variables are probably not designed to store large
                  amounts of data, so even though Stream_Var is convenient, it
                  is probably not efficient: as new data comes in from the FTP
                  server, PHP is probably reallocating memory continuously,
                  causing performance to drag. So instead I have ftp_fget()
                  write to a temporary file from tmpfile(), then use fgetcsv()
                  to read that back from disk and insert into the database as
                  before.
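
                  In outline, the tmpfile() variant looks something like this
                  (host, credentials, and filename are hypothetical):

                  ```php
                  <?php
                  // ftp_fget() streams the remote file to a temporary file on
                  // disk; fgetcsv() then reads it back one line at a time.
                  $ftp = ftp_connect('ftp.example.com');
                  ftp_login($ftp, 'user', 'pass');

                  $tmp = tmpfile();   // removed automatically on fclose()
                  ftp_fget($ftp, $tmp, 'data.txt', FTP_ASCII);
                  rewind($tmp);       // back to the start before reading

                  while ($row = fgetcsv($tmp, 4096, "\t")) {
                      // ... insert/update $row in the database as before ...
                  }
                  fclose($tmp);
                  ftp_close($ftp);
                  ?>
                  ```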

                  This change reduced the import time from about 3 hours to
                  about 10 minutes! It still seems like the FTP library should
                  offer a way to stream FTP data directly into a consuming
                  function such as fgetcsv() without reading the entire file
                  into memory at once, but the tmpfile() workaround performs
                  well, so I am happy.

                  Eric

