how does one trap for out-of-memory errors?

  • lawrence

    how does one trap for out-of-memory errors?

    I'm not sure how this is normally done on a large site, perhaps one
    running Phorum. Occasionally a thread will have hundreds of entries,
    perhaps a meg or two worth of data. You won't necessarily print all
    of that to the screen, but PHP has to hold it in memory. Run some
    operations on it, or, more likely, keep adding things to an array,
    and very soon you run into the 8 meg limit that is the default limit
    for PHP scripts.

    How do you trap for out-of-memory errors? I want to do something like:

    if (php is now using more than 8 megs) {
        print "Sorry, but this operation requires more memory than PHP is allowed.";
    } else {
        printOutEntries($allEntries);
    }

    PHP has a command that checks the current memory usage of PHP
    (memory_get_usage()), but PHP, near as I know, has no command to
    estimate how much the next operation is going to drive up memory
    usage.

    I'm sure this problem comes up for people using Phorum, or PostNuke,
    or phpSlash, or phpMyAdmin, or any of a lot of programs. How is it
    normally handled?
  • Nikolai Chuvakhin

    #2
    Re: how does one trap for out-of-memory errors?

    lkrubner@geocities.com (lawrence) wrote in message
    news:<da7e68e8.0310181002.6fbf830b@posting.google.com>...
    >
    > How do you trap for out-of-memory errors? I want to do something like:
    >
    > if (php is now using more than 8 megs) {
    >     print "Sorry, but this operation requires more memory than PHP is allowed.";
    > } else {
    >     printOutEntries($allEntries);
    > }

    Not going to work. PHP will display an out-of-memory error and stop
    running BEFORE it gets to executing your if() statement.
    > I'm sure this problem comes up for people using Phorum, or PostNuke,
    > or phpSlash, or phpMyAdmin or any of a lot of programs. How is it
    > normally handled?

    By preventing it from happening. More specifically, by avoiding
    sucking into memory things of unknown and potentially very large
    size. Even more specifically, by writing proper database queries,
    so that the resulting datasets can be processed and output strictly
    one record at a time, without accumulating them in arrays on the
    client side.

    Cheers,
    NC
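
    In code, Nikolai's record-at-a-time advice looks roughly like this (a
    sketch only; the connection details and the table and column names are
    made up for illustration):

```php
<?php
// Sketch: output rows one at a time instead of accumulating them in
// an array, so memory use stays flat no matter how many rows match.
// Hypothetical connection details and table/column names.
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('phorum', $link);
$result = mysql_query("SELECT subject, body FROM messages WHERE thread = 42", $link);
while ($row = mysql_fetch_row($result)) {
    // one record in memory at a time; the previous one is released
    echo htmlspecialchars($row[0]), "\n";
}
mysql_free_result($result);
```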


    • Perttu Pulkkinen

      #3
      Re: how does one trap for out-of-memory errors?


      "Nikolai Chuvakhin" <nc@iname.com> wrote in message
      > Not going to work. PHP will display an out-of-memory error and stop
      > running BEFORE it gets to executing your if() statement.

      Are you also saying that memory errors are OUTSIDE the reach of PHP's
      error-handling system (notices, warnings, errors, and fatal errors)?
      Normally PHP errors CAN be handled, and it is possible to execute code
      after an error. You just need to write error handlers and register them.

      perttu pulkkinen, Finland
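
      To make the handler idea concrete, here is a minimal sketch of
      registering a user error handler with set_error_handler(). One caveat
      that supports Nikolai's point: handlers registered this way receive
      notices and warnings, but fatal errors (E_ERROR), including memory
      exhaustion, never reach them; the engine halts first.

```php
<?php
// Minimal sketch: a user-defined error handler. It will run for
// notices/warnings, but NOT for fatal E_ERROR (e.g. out of memory).
function myHandler($errno, $errstr) {
    echo "caught: $errstr\n";
    return true; // tell PHP not to print its own message
}
set_error_handler('myHandler');

// A catchable, user-triggered warning -> the handler runs:
trigger_error("something went wrong", E_USER_WARNING);
```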



      • DvDmanDT

        #4
        Re: how does one trap for out-of-memory errors?

        You know, try to read a 100 MB file into memory, do some operations
        on it, then see what happens... It's strange really, but all that
        happens is slight slowness... There was some thread comparing PHP
        with ASP where someone wanted to do something with big XML files...
        ASP + 40 MB XML file: the script ran for 60 mins or so unless my
        memory fails me... Then it just said out of memory or something...

        PHP + 100 MB: 15 mins... He read the entire file into memory, then
        did some operations on it... PHP handled the file and the memory
        without problems...

        --
        // DvDmanDT
        MSN: dvdmandt@hotmail.com
        Mail: dvdmandt@telia.com



        • DvDmanDT

          #5
          Re: how does one trap for out-of-memory errors?

          To simply allocate way too much memory in a very idiotic way (to
          test what happens):

          <?php
          /**** Memory killer 1.0 ****
          * Will allocate some memory.
          * In case you wonder: no, I
          * don't have a life. I
          * suggest using this on the CGI
          * version, PHP 4 or 5. Apache
          * might crash otherwise.
          */
          set_time_limit(0); // This WILL take some time, up to 30 mins, so disable the time limit
          define("BYTES_TO_ALLOCATE", 20*1024*1024); // 20 MB
          ob_implicit_flush();
          header("Content-Type: text/plain");
          function pmem() {
              global $str;
              echo "Current length: ".strlen($str)."\r\n";
              // Uncomment the next line to show memory usage
              #echo "Memory usage: ".(memory_get_usage()/1024)." kb\r\n";
          }
          register_tick_function('pmem');
          declare(ticks=1024);
          $str = "";
          srand(time());
          for ($i = 0; $i < BYTES_TO_ALLOCATE; $i++) $str .= chr(rand(32,100)); // Add a random char
          ?>




          • Nikolai Chuvakhin

            #6
            Re: how does one trap for out-of-memory errors?

            "Perttu Pulkkinen" <Perttu.Pulkkinen@co.jyu.fi> wrote in message
            news:<1yrkb.51$BY1.45@read3.inet.fi>...
            >
            > "Nikolai Chuvakhin" <nc@iname.com> wrote in message
            > > PHP will display an out-of-memory error and stop running
            > > BEFORE it gets to executing your if() statement.
            >
            > Are you also saying that memory errors are OUTSIDE the reach
            > of PHP's error-handling system (notices, warnings, errors and
            > fatal errors)?

            No. All I'm saying is that what the original poster wanted cannot
            be done the way he wants it. In particular, if my understanding
            is correct, a memory allocation problem causes a fatal error
            (E_ERROR), but until the error is handled (possibly by a user-
            defined error handling function), it is impossible to know whether
            the error is caused by a memory allocation problem or something else.

            Cheers,
            NC


            • lawrence

              #7
              Re: how does one trap for out-of-memory errors?

              nc@iname.com (Nikolai Chuvakhin) wrote in message
              news:<32d7a63c.0310212041.6780b34e@posting.google.com>...
              > "Perttu Pulkkinen" <Perttu.Pulkkinen@co.jyu.fi> wrote in message
              > news:<1yrkb.51$BY1.45@read3.inet.fi>...
              > >
              > > "Nikolai Chuvakhin" <nc@iname.com> wrote in message
              > > > PHP will display an out-of-memory error and stop running
              > > > BEFORE it gets to executing your if() statement.
              > >
              > > Are you also saying that memory errors are OUTSIDE the reach
              > > of PHP's error-handling system (notices, warnings, errors and
              > > fatal errors)?
              >
              > No. All I'm saying is that what the original poster wanted cannot
              > be done the way he wants it. In particular, if my understanding
              > is correct, a memory allocation problem causes a fatal error
              > (E_ERROR), but until the error is handled (possibly by a user-
              > defined error handling function), it is impossible to know whether
              > the error is caused by a memory allocation problem or something else.

              Yes, thank you. You understood my intention. I was asking about
              trapping for errors.

              If you can open a 100 MB file, process it, and then write it to
              disk, then what does the 8 MB limit refer to?
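
              For what it's worth, the 8 MB figure is the memory_limit setting
              in php.ini; it caps what the script itself allocates through
              PHP's own allocator. A file can be far larger than memory_limit
              as long as the script streams it instead of loading it whole.
              A minimal sketch (the file names are made up):

```php
<?php
// Sketch: stream a file line by line so peak memory stays near the
// size of one line, regardless of total file size.
// The file names here are illustrative.
file_put_contents('big_input.txt', "hello\nworld\n"); // stand-in for a large file

$in  = fopen('big_input.txt', 'r');
$out = fopen('filtered_output.txt', 'w');
while (($line = fgets($in)) !== false) {
    fwrite($out, strtoupper($line)); // work on one line, then move on
}
fclose($in);
fclose($out);
```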


              • lawrence

                #8
                Re: how does one trap for out-of-memory errors?

                nc@iname.com (Nikolai Chuvakhin) wrote in message
                news:<32d7a63c.0310182055.442f5236@posting.google.com>...
                > > I'm sure this problem comes up for people using Phorum, or PostNuke,
                > > or phpSlash, or phpMyAdmin or any of a lot of programs. How is it
                > > normally handled?
                >
                > By preventing it from happening. More specifically, by avoiding
                > sucking into memory things of unknown and potentially very large
                > size. Even more specifically, by writing proper database queries,
                > so that the resulting datasets can be processed and output strictly
                > one record at a time, without accumulating them in arrays on the
                > client side.

                The right kind of queries? So something like this is bad:

                SELECT * FROM content WHERE type = 'weblogEntries'

                I get this back and then try to read it into an array. If there are
                enough entries, then I run into the 8 meg limit. I take it that your
                approach would avoid putting the whole return into an array?


                • Nikolai Chuvakhin

                  #9
                  Re: how does one trap for out-of-memory errors?

                  lkrubner@geocities.com (lawrence) wrote in message
                  news:<da7e68e8.0310231140.2867c947@posting.google.com>...
                  >
                  > > > How is it normally handled?
                  > >
                  > > By preventing it from happening. More specifically, by avoiding
                  > > sucking into memory things of unknown and potentially very large
                  > > size. Even more specifically, by writing proper database queries,
                  > > so that the resulting datasets can be processed and output strictly
                  > > one record at a time, without accumulating them in arrays on the
                  > > client side.
                  >
                  > The right kind of queries? So something like this is bad:
                  >
                  > SELECT * FROM content WHERE type = 'weblogEntries'

                  No (except for the fact that `type` is a text, and thus likely
                  unindexed, field), but your handling of it is definitely bad:

                  > I get this back and then try to read it into an array.

                  Why would you want to do this? With very few exceptions, any data
                  manipulation you are about to do with PHP is done faster and with
                  less overhead by MySQL. And if you are not going to manipulate
                  the data, why bother reading it into a huge array?

                  If you explain what exactly you are trying to achieve, I'll gladly
                  help you find a better way of getting it done.
                  > I take it that your approach would avoid putting the whole
                  > return into an array?

                  Of course. The usual

                  while ($record = mysql_fetch_row($result)) {}

                  only keeps in memory one record at a time. Remember, $result
                  is a resource (sort of like a file pointer), so it has the same
                  memory footprint regardless of the number of records in the
                  returned dataset.

                  Cheers,
                  NC
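
                  One related detail: with the ordinary mysql_query(), the
                  client library does buffer the entire result set in RAM
                  (outside memory_limit's accounting). If even that is a
                  concern, mysql_unbuffered_query() fetches rows from the
                  server as you iterate. A sketch (connection details are
                  illustrative):

```php
<?php
// Sketch: mysql_unbuffered_query() streams rows from the server
// instead of buffering the whole result set client-side.
// Hypothetical connection details.
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('weblog', $link);
$result = mysql_unbuffered_query(
    "SELECT * FROM content WHERE type = 'weblogEntries'", $link);
while ($record = mysql_fetch_row($result)) {
    echo $record[0], "\n"; // handle one record at a time
}
// Caveat: all rows must be fetched (or the result freed) before
// another query can run on this connection.
mysql_free_result($result);
```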


                  • Zurab Davitiani

                    #10
                    Re: how does one trap for out-of-memory errors?

                    Nikolai Chuvakhin wrote on Friday 24 October 2003 00:25:

                    > Of course. The usual
                    >
                    > while ($record = mysql_fetch_row($result)) {}
                    >
                    > only keeps in memory one record at a time.

                    I'm not so sure about this. My understanding is that

                    $my_var = "something";

                    $my_var = "something else";

                    does not free the original memory occupied by "something"; it
                    just allocates new memory for "something else". All memory is
                    only freed when script execution or the PHP process ends. I
                    agree though, maybe some garbage collection would be great in
                    PHP.

                    --
                    Business Web Solutions
                    ActiveLink, LLC
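
                    This is easy to check empirically: PHP's engine reference-
                    counts values, and reassigning a variable releases the old
                    string once nothing refers to it (at least for simple values
                    like strings). A quick probe with memory_get_usage():

```php
<?php
// Probe: does reassigning a variable free the old value's memory?
$s = str_repeat('a', 1000000);      // ~1 MB string
$afterFirst = memory_get_usage();
$s = str_repeat('b', 1000000);      // reassign; the 'a' string is released
$afterSecond = memory_get_usage();
// If reassignment leaked, usage would have grown by another ~1 MB.
echo ($afterSecond - $afterFirst < 500000) ? "freed\n" : "leaked\n";
```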


                    • Nikolai Chuvakhin

                      #11
                      Re: how does one trap for out-of-memory errors?

                      Zurab Davitiani <agt@mindless.com> wrote in message
                      news:<fW4mb.3044$dv3.1630@newssvr14.news.prodigy.com>...
                      >
                      > Nikolai Chuvakhin wrote on Friday 24 October 2003 00:25:
                      >
                      > > The usual
                      > > while ($record = mysql_fetch_row($result)) {}
                      > > only keeps in memory one record at a time.
                      >
                      > I'm not so sure about this. My understanding is that
                      >
                      > $my_var = "something";
                      > $my_var = "something else";
                      >
                      > does not free the original memory occupied by "something",
                      > it just allocates new memory for "something else". All memory
                      > is only freed when script execution or PHP process ends.

                      I honestly doubt it. If you were correct, one of my largest
                      applications couldn't possibly work, because one of its
                      functionalities is like this:

                      1. Retrieve about 10 fields per record from the entire
                      table (currently, ~200,000 records).
                      2. Based on those 10 fields and some extra inputs, compute
                      about 70 more fields for each record.
                      3. Write the results back into the table.

                      In other words, the

                      while ($record = mysql_fetch_row($result)) {}

                      cycle was repeated 200,000 times. Given that $record was an
                      array including a date and a collection of floating-point
                      numbers, I'd say its memory requirement was about 100 bytes.
                      This would mean that the script would trash about 20 megabytes
                      of memory only when handling this one variable. But what about
                      the 70 values that were computed for each of the 200,000 records?
                      They (being mostly integers, with a few floats) would trash
                      at least another 20 megabytes. The UPDATE query string formed
                      for writing the output into the database was about 100 characters
                      long, so it, too, would trash 20 megabytes over 200,000
                      reassignments. Yet the whole thing never ever hit the default
                      memory limit...

                      I am not saying you're wrong, but if you're right, I'd like to
                      understand how it is possible for you to be right while I see
                      what I see. No pun intended; I really would like to learn
                      something here...

                      Cheers,
                      NC


                      • lawrence

                        #12
                        Re: how does one trap for out-of-memory errors?

                        nc@iname.com (Nikolai Chuvakhin) wrote in message
                        > > I get this back and then try to read it into an array.
                        >
                        > Why would you want to do this? With very few exceptions, any data
                        > manipulation you are about to do with PHP is done faster and with
                        > less overhead by MySQL. And if you are not going to manipulate
                        > the data, why bother reading it into a huge array?
                        >
                        > If you explain what exactly you are trying to achieve, I'll gladly
                        > help you find a better way of getting it done.
                        >
                        > > I take it that your approach would avoid putting the whole
                        > > return into an array?
                        >
                        > Of course. The usual
                        >
                        > while ($record = mysql_fetch_row($result)) {}
                        >
                        > only keeps in memory one record at a time. Remember, $result
                        > is a resource (sort of like a file pointer), so it has the same
                        > memory footprint regardless of the number of records in the
                        > returned dataset.

                        I am putting database returns in PHP arrays to mask the client code
                        from the underlying database. That is, I don't want my code to know
                        what kind of database or datastore is being dealt with.

                        I am storing all actual SQL in objects that I am calling queryObjects.
                        But the only place the query objects are called is in what I call Get,
                        Update, Delete, and Insert objects. The Get, Update, Delete, and
                        Insert objects are meant to mask, from the rest of the code, what kind
                        of database or datastore is in use. In other words, if I want to get
                        all weblog entries out of a database, I might have a method called
                        getAllWeblogEntries, and this method would contain actual SQL and
                        return a PHP array. So I might have a query object for MySQL, a
                        query object for Postgres, and a query object for XML streams. Each of
                        these query objects would have a method called getAllWeblogEntries,
                        and each would return a PHP array.

                        The Get object knows what kind of database or datastore the website
                        uses (this is something set in the config file for that website). All
                        the other code simply makes a call to the Get object. Thus, the other
                        code, I mean the client code, never needs to know what kind of
                        database or datastore is in use.

                        I can't imagine how I can hide the database from the client code if I
                        use database return resources, as you suggest above. But I now realize
                        how limiting these arrays can be, in the presence of PHP's 8 meg
                        limit. So I'm open to any graceful ideas for otherwise masking the
                        database from the client code.


                        • Nikolai Chuvakhin

                          #13
                          Re: how does one trap for out-of-memory errors?

                          lkrubner@geocities.com (lawrence) wrote
                          in message <da7e68e8.0310241718.3fcdc191@posting.google.com>:
                          >
                          > I am putting database returns in PHP arrays to mask the client
                          > code from the underlying database. That is, I don't want my code
                          > to know what kind of database or datastore is being dealt with.

                          Haven't you heard that abstraction doesn't work? :) I am not
                          trying to start a flame war here, but you need to understand that
                          higher level of abstraction ALWAYS implies a performance penalty.
                          In your case, you are shooting for capital abstraction and what
                          you get for it is the performance equivalent of capital
                          punishment.
                          > I am storing all actual SQL in objects that I am calling
                          > queryObjects. But the only place the query objects are called
                          > is in what I call Get, Update, Delete, and Insert objects. The
                          > Get, Update, Delete, and Insert objects are meant to mask, from
                          > the rest of the code, what kind of database or datastore is in
                          > use. In other words, if I want to get all weblog entries out of
                          > a database, I might have a method called getAllWeblogEntries,
                          > and this function would contain actual SQL, and return a PHP
                          > array. So I might have a query object for MySQL and a query
                          > object for Postgres and a query object for XML streams. Each of
                          > these query objects would have a method called
                          > getAllWeblogEntries, and each would return a PHP array.

                          And that's the root of your problem. You want to return something
                          of potentially very large size, and you are hell-bent on storing
                          it in memory. But let me ask you: what will the application do
                          with a PHP array that contains all entries in the weblog? Output
                          them? Then the entries don't have to be in memory at all, they
                          can be output straight from the DB buffer. Sort them? Again,
                          they don't have to be in the memory for that, you should have
                          taken care of it on the database level. Search through them?
                          Same thing; too late to do it efficiently, had to be done on the
                          database level.

                          So my suggestion to you is not to have a getAllWeblogEntries
                          method at all. Rather, you should have a getWeblogEntry method
                          that returns a single weblog entry in an array and an entriesLeft
                          method that returns a boolean value. This way, you can retrieve
                          all entries one by one:

                          $qo = new queryObjectMySQL('SELECT * FROM entries');
                          // the constructor establishes the connection, retrieves data,
                          // and keeps the resulting resource for the object's internal
                          // use...
                          while ($qo->entriesLeft()) {
                              $entry = $qo->getWeblogEntry();
                              // now, do what you want with the $entry...
                          }
                          > I can't imagine how I can hide the database from the client code
                          > if I use database return resources, as you suggest above.

                          You make it sound like it's a good thing... :)

                          Cheers,
                          NC
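
                          For concreteness, one possible shape for such a class
                          (a sketch only: the class and method names follow
                          Nikolai's suggestion, the internals are illustrative,
                          and it uses the old mysql_* API as elsewhere in this
                          thread):

```php
<?php
// Illustrative sketch of the suggested query object: it keeps only
// the result resource plus a one-row lookahead buffer in memory,
// never the whole result set as a PHP array.
class queryObjectMySQL {
    var $result;   // MySQL result resource
    var $nextRow;  // one-row lookahead, or false when exhausted
    function queryObjectMySQL($sql) { // PHP 4-style constructor
        $this->result  = mysql_query($sql);
        $this->nextRow = mysql_fetch_assoc($this->result);
    }
    function entriesLeft() {
        return $this->nextRow !== false;
    }
    function getWeblogEntry() {
        $row = $this->nextRow;
        $this->nextRow = mysql_fetch_assoc($this->result);
        return $row;
    }
}
```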


                          • lawrence

                            #14
                            Re: how does one trap for out-of-memory errors?

                            nc@iname.com (Nikolai Chuvakhin) wrote in message
                            news:<32d7a63c.0310251605.258cfb2b@posting.google.com>...
                            > lkrubner@geocities.com (lawrence) wrote
                            > in message <da7e68e8.0310241718.3fcdc191@posting.google.com>:
                            > >
                            > > I am putting database returns in PHP arrays to mask the client
                            > > code from the underlying database. That is, I don't want my code
                            > > to know what kind of database or datastore is being dealt with.
                            >
                            > Haven't you heard that abstraction doesn't work? :) I am not
                            > trying to start a flame war here, but you need to understand that
                            > higher level of abstraction ALWAYS implies a performance penalty.
                            > In your case, you are shooting for capital abstraction and what
                            > you get for it is the performance equivalent of capital
                            > punishment.

                            This runs counter to most of what I've been taught. I'm not sure what
                            you mean when you say abstraction doesn't work. Much software work
                            involves abstraction. You could argue that PHP itself is an
                            abstraction, hiding the complexity of the underlying C code. Some
                            might say that anything other than machine code is an abstraction. I
                            can't think of a software project that doesn't involve abstraction.
                            As for OO design, to hide the datastore seems like an obvious move.
                            Hasn't the team working on PEAR done the same?




                            >
                            > > I am storing all actual SQL in objects that I am calling
                            > > queryObjects. But the only place the query objects are called
                            > > is in what I call Get, Update, Delete, and Insert objects. The
                            > > Get, Update, Delete, and Insert objects are meant to mask, from
                            > > the rest of the code, what kind of database or datastore is in
                            > > use. In other words, if I want to get all weblog entries out of
                            > > a database, I might have a method called getAllWeblogEntries,
                            > > and this function would contain actual SQL, and return a PHP
                            > > array. So I might have a query object for MySQL and a query
                            > > object for Postgres and a query object for XML streams. Each of
                            > > these query objects would have a method called
                            > > getAllWeblogEntries, and each would return a PHP array.
                            >
                            > And that's the root of your problem. You want to return something
                            > of potentially very large size, and you are hell-bent on storing
                            > it in memory.

                            No, I am not hell-bent. I asked for alternatives.

                            > But let me ask you: what will the application do
                            > with a PHP array that contains all entries in the weblog? Output
                            > them? Then the entries don't have to be in memory at all, they
                            > can be output straight from the DB buffer. Sort them? Again,
                            > they don't have to be in the memory for that, you should have
                            > taken care of it on the database level. Search through them?
                            > Same thing; too late to do it efficiently, had to be done on the
                            > database level.
                            >
                            > So my suggestion to you is not to have a getAllWeblogEnt ries
                            > method at all. Rather, you should have a getWeblogEntry method
                            > that returns a single weblog entry in an array and an entriesLeft
                            > method that returns a boolean value. This way, you can retrieve
                            > all entries one by one:
                            >
                            > $qo = new queryObjectMySQL ('SELECT * FROM entries');
                            > // the constructor establishes the connection, retrieves data,
                            > // and keeps the resulting resource for the object's internal
                            > // use...
                            > while ($qo->entriesLeft ()) {
                            >     $entry = $qo->getWeblogEntry ();
                            >     // now, do what you want with the $entry...
                            > }[/color]

                            I think your suggestion is a good one. Somehow I must get just one
                            entry at a time. The query object also needs to remember which entry
                            it got last, so it can get the next one. The SELECT object would have
                            to wrap that integer and pass it along to the final code, especially
                            if the final code has a loop that needs to know how long it is
                            supposed to keep running.

                            Offhand, it sounds terribly complicated, and the whole of the code
                            would have to be rewritten. Every loop would have to be rewritten. But
                            it would eventually get the code free of working with dangerously
                            large arrays. It is a project that, I think, will take a few months,
                            but I think the direction you suggest is a good one.
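
                            The rewrite described above can be prototyped before touching the
                            database layer. Below is a minimal sketch of the entriesLeft() /
                            getWeblogEntry() interface Nikolai proposed, backed by an in-memory
                            array instead of a live MySQL result so it runs stand-alone. The
                            class name queryObjectMock and the fixture data are assumptions,
                            and a modern __construct is used where PHP 4 code of the period
                            would name the constructor after the class.

```php
<?php
// Sketch only: mimics the queryObject interface from Nikolai's post,
// but reads from an in-memory array rather than a DB resource so the
// fetch-one-row-at-a-time loop can be tried without a server.
class queryObjectMock
{
    var $rows;     // stand-in for the MySQL result resource
    var $pos = 0;  // internal pointer, like a result set's cursor

    function __construct($rows)
    {
        $this->rows = $rows;
    }

    function entriesLeft()
    {
        return $this->pos < count($this->rows);
    }

    function getWeblogEntry()
    {
        // Hand back one entry and advance the pointer, so the caller
        // never accumulates the whole result in a single array.
        return $this->rows[$this->pos++];
    }
}

$qo = new queryObjectMock(array(
    array('id' => 1, 'title' => 'First entry'),
    array('id' => 2, 'title' => 'Second entry'),
));
while ($qo->entriesLeft()) {
    $entry = $qo->getWeblogEntry();
    echo $entry['title'], "\n";
}
```

                            Swapping the array for a real database resource leaves the calling
                            loop unchanged, which is the point of the exercise.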






                            [color=blue][color=green]
                            > > I can't imagine how I can hide the database from the client code
                            > > if I use database return resources, as you suggest above.[/color]
                            >
                            > You make it sound like it's a good thing... :)[/color]

                            Well, this code is being written first to work with MySQL, but I'm
                            supposed to write it in such a way that would make it easy to switch
                            to PostgreSQL. So I've written it the way I have to save myself some
                            future work. And saving oneself future work is the aim of most OO
                            abstraction.


                            • Nikolai Chuvakhin

                              #15
                              Re: how does one trap for out-of-memory errors?

                              lkrubner@geocities.com (lawrence) wrote in message
                              news:<da7e68e8.0310261251.1d71189b@posting.google.com>...[color=blue]
                              > nc@iname.com (Nikolai Chuvakhin) wrote in message
                              > news:<32d7a63c.0310251605.258cfb2b@posting.google.com>...
                              >[color=green]
                              > > In your case, you are shooting for capital abstraction and what
                              > > you get for it is the performance equivalent of capital
                              > > punishment.[/color]
                              >
                              > This runs counter to most of what I've been taught. I'm not sure
                              > what you mean when you say abstraction doesn't work.[/color]

                              My apologies for vague language. I should have said "database
                              abstraction layers" rather than just "abstraction".
                              [color=blue]
                              > As for OO design, to hide the datastore seems like an obvious move.[/color]

                              Which is exactly the reason I generally object to using OO design
                              in the first place.
                              [color=blue]
                              > Hasn't the team working on PEAR done the same?[/color]

                              Yes, they have. And, according to phplens, this resulted in a
                              150-170% increase in the time it takes to complete a series of
                              SELECT queries.


                              [color=blue]
                              > I am not hell bent. I asked for alternatives.[/color]

                              I think looking for an alternative in your situation was the right
                              thing to do. Note also that you didn't have a compelling reason
                              to do it "your" way in the first place; at least, my question to
                              that effect:
                              [color=blue][color=green]
                              > > But let me ask you: what will the application do
                              > > with a PHP array that contains all entries in the weblog? Output
                              > > them? Then the entries don't have to be in memory at all, they
                              > > can be output straight from the DB buffer. Sort them? Again,
                              > > they don't have to be in the memory for that, you should have
                              > > taken care of it on the database level. Search through them?
                              > > Same thing; too late to do it efficiently, had to be done on the
                              > > database level.[/color][/color]

                              went unanswered...
                              [color=blue]
                              > Somehow I must get just one entry at a time.[/color]

                              I'd say this is the preferred way of dealing with the problem.
                              [color=blue]
                              > The query object also needs to remember which entry it got last,
                              > so it can get the next one.[/color]

                              It already does, since it has a resource with an internal pointer
                              to the next available entry...
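
                              For what it's worth, the era's mysql_* functions behave exactly
                              this way: each call to mysql_fetch_assoc() returns one row and
                              advances the result resource's internal pointer. The sketch below
                              assumes a reachable server and an `entries` table; the host,
                              credentials, and database name are placeholders, not part of the
                              original discussion. (The mysql_* extension was current when this
                              thread was written; it was removed in PHP 7.)

```php
<?php
// Illustrative only: requires a live MySQL server; connection details
// below are placeholders.
$link = mysql_connect('localhost', 'user', 'password')
    or die('Could not connect: ' . mysql_error());
mysql_select_db('weblog', $link);

$result = mysql_query('SELECT id, title FROM entries', $link);

// mysql_fetch_assoc() returns the next row as an associative array
// and moves the resource's internal pointer forward; it returns FALSE
// after the last row, so no separate position counter is needed.
while ($row = mysql_fetch_assoc($result)) {
    echo $row['title'], "\n";
}

mysql_free_result($result);
mysql_close($link);
```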
                              [color=blue]
                              > Well, this code is being written first to work with MySQL, but I'm
                              > supposed to write it in such a way that would make it easy to switch
                              > to PostgreSQL.[/color]

                              This is yet another reason why database abstraction layers are
                              such a bad idea... PostgreSQL has a whole slew of features
                              that are not available in MySQL: views, rules, triggers, stored
                              procedures, etc. Database abstraction forces you to stick to
                              the lowest common denominator, which robs you of opportunities
                              to use the more advanced, higher-performance features...

                              Cheers,
                              NC
