VBA crashes on large table

  • gershwyn
    New Member
    • Feb 2010
    • 122

    VBA crashes on large table

    I have a large table (~600,000 records) that needs to have a unique record number assigned to each record. The table itself is too large to add an AutoNumber field to directly, so I was trying to get around that through VBA.

    I have a function that will go through the table's recordset and assign a number to each record. The problem is that it crashes at around record 180,000 (and the crash irrecoverably damaged the VBA module - it wouldn't even let me delete it).

    In troubleshooting the problem I noticed that as the function runs, the size of the database increases greatly, and the crash is caused by Access hitting the 2 GB database limit. I'm wondering if the increased size is due to Access trying to keep everything in one transaction, and whether I could somehow tell it to forget about remembering everything and just go ahead and make the changes. Something like a 'commit immediate' mode.

    This is the entirety of the VBA module; I call the function directly from the Immediate window since it only needs to happen once:

    Code:
    Function NumberTables(TableName)
      Dim rs As Recordset
      Dim Count As Long
      On Error Resume Next
      DoCmd.SetWarnings False
      ' Add the numbering column; error 3380 means it already exists.
      DoCmd.RunSQL "ALTER TABLE [" & TableName & "] ADD COLUMN [Record_ID] INTEGER;"
      If Err = 3380 Then Exit Function
      On Error GoTo 0
      ' Walk the whole table and give each record the next number.
      Set rs = CurrentDb.TableDefs(TableName).OpenRecordset
      rs.MoveFirst
      Do While Not rs.EOF
        Count = Count + 1
        rs.Edit
        rs.Fields("Record_ID") = Count
        rs.Update
        rs.MoveNext
      Loop
      rs.Close
    End Function
    Any idea how I can keep this from bloating horribly out of control?
    Last edited by gershwyn; Apr 27 '10, 09:50 PM. Reason: clarify wording
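
    For what it's worth, DAO does expose explicit transactions on the workspace, so the "commit every so many records" idea can at least be written down. A rough, untested sketch follows; it assumes the [Record_ID] column has already been added, the batch size is arbitrary, and whether this actually curbs the file growth is an open question.

    Code:
    Function NumberTableInBatches(TableName As String)
      Dim ws As DAO.Workspace
      Dim rs As DAO.Recordset
      Dim Count As Long

      Set ws = DBEngine.Workspaces(0)
      Set rs = CurrentDb.OpenRecordset(TableName)

      ws.BeginTrans
      Do While Not rs.EOF
        Count = Count + 1
        rs.Edit
        rs.Fields("Record_ID") = Count
        rs.Update
        If Count Mod 10000 = 0 Then
          ws.CommitTrans   ' flush this batch of edits
          ws.BeginTrans    ' and start the next one
        End If
        rs.MoveNext
      Loop
      ws.CommitTrans       ' commit whatever is left
      rs.Close
    End Function
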
  • TheSmileyCoder
    Recognized Expert Moderator Top Contributor
    • Dec 2009
    • 2322

    #2
    I don't know about any kind of 'commit immediate', but a few different approaches and questions spring to mind:
    Q1) If you do Compact and repair, how large is the database?

    Q2) Could you copy the structure into a new table, add the autonumber field, and then append the records using a standard append query?

    Q3) When the count reaches 100,000, exit the loop, compact and repair, then continue the loop with those records that have not yet been assigned an ID.

    Q4) I thought I had a 4...but either I forgot it, or I never had a 4....

    Q5) Make a module with a public variable, intCount, and a public function fintCounter() that increments intCount by one and returns it. Then run a normal Update query where you update the field to fintCounter() (a sketch follows below). That will keep you out of VBA recordsets, and may cause less bloating.
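
    A minimal sketch of the module Q5 describes, using the names from the post; the table and field in the commented query are placeholders, and whether Access re-evaluates the function for every row turns out to matter (see the later posts):

    Code:
    Option Explicit
    Public intCount As Long

    Public Function fintCounter() As Long
      ' Hand back the next value of the shared counter.
      intCount = intCount + 1
      fintCounter = intCount
    End Function

    ' The update query would then be along the lines of:
    ' UPDATE [MyTable] SET [Record_ID] = fintCounter();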


    • gershwyn
      New Member
      • Feb 2010
      • 122

      #3
      1- After a compact and repair, the database is 1.32 GB.

      2- I hadn't thought of this, actually. It seems like a simple solution, but I hesitate to copy the data over because I need the data exactly the way it is, and I worry that some trivial detail would be missed and only noticed later.

      3- I was thinking of this as a possible solution before I noticed the file size issue. I could rewrite the function so that if the Record_ID field already exists but contains null values, it picks up where it left off and keeps numbering (a rough sketch follows below). And then, as you say, terminate the function every 100,000 records or so, and compact.

      5- I had tried this first, but I must have written the function incorrectly - it updated every record to the same value. It was after this I went to the recordset option.

      Thank you for the suggestions. I will see if I can't get this to work after all. (Though if anyone does know about the filesize thing, I am curious.)
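
      For point 3, a rough, untested sketch of a restartable version: it seeds the counter from the highest number already assigned, only walks the rows that are still Null, and stops after an (arbitrary) 100,000 so the file can be compacted before the next run. The table and field names are the ones from the original post.

      Code:
      Function NumberRemaining(TableName As String)
        Dim rs As DAO.Recordset
        Dim Count As Long

        ' Resume from the highest Record_ID assigned so far (0 if none yet).
        Count = Nz(DMax("Record_ID", TableName), 0)

        ' Only touch the rows that have not been numbered yet.
        Set rs = CurrentDb.OpenRecordset( _
            "SELECT * FROM [" & TableName & "] WHERE [Record_ID] Is Null")

        Do While Not rs.EOF
          Count = Count + 1
          rs.Edit
          rs.Fields("Record_ID") = Count
          rs.Update
          rs.MoveNext
          ' Stop after each 100,000 so the database can be compacted,
          ' then call the function again to carry on.
          If Count Mod 100000 = 0 Then Exit Do
        Loop
        rs.Close
      End Function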


      • Jim Doherty
        Recognized Expert Contributor
        • Aug 2007
        • 897

        #4
        Originally posted by gershwyn
        ...
        Keeping 600,000 records in one table and appending to another table in the same database? It will equal 1,200,000 rows between them, and bang goes the file size limit.

        A) Why not just export the data to a delimited text file? (Test it out in a new database, with the text file attached as a linked table, to view the file contents.)

        B) Copy the structure of the existing table to a new table, as has already been suggested. Delete your existing table only when you are content that your now 'external' data can be imported successfully, using a test database.

        C) Add the AutoNumber to the new table and import your data (do NOT index anything, or the import will be slow; you can always add any indexes once the table has been populated).
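
        A minimal, untested sketch of steps A-C using DoCmd.TransferText; the file path and table names are placeholders, and it assumes the new table already has its AutoNumber column and no other indexes yet.

        Code:
        Sub RebuildWithAutonumber()
          ' A: export the existing table to a delimited text file.
          DoCmd.TransferText acExportDelim, , "tblOriginal", "C:\temp\tblOriginal.txt", True

          ' B/C: once the structure has been copied to tblNew and the
          ' AutoNumber column added, append the data from the text file.
          DoCmd.TransferText acImportDelim, , "tblNew", "C:\temp\tblOriginal.txt", True
        End Sub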

        IMHO, using recordsets and loops to 'update' or 'batch update' very large datasets with BeginTrans/CommitTrans is slow. I favour raw SQL wherever possible.

        As pointed out, a different approach will bear fruit here, I think.

        Regards

        Jim


        • ADezii
          Recognized Expert Expert
          • Apr 2006
          • 8834

          #5
          Originally posted by gershwyn
          ...
          1. I'm surprised that the code runs at all since [Record_ID] is Defined as an INTEGER Data Type whose Maximum Value can only be 32,767. It should be defined as a LONG INTEGER.
          2. Manually add the [Record_ID] Field to your Table, create an Update Query that generates Unique Values in the [Record_ID] Field, and set the Use Transaction Property of the Query to NO, which will force the Query NOT to run as a Single Transaction. This is only Theory and has not actually been tested.
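
          For suggestion 2, the property can also be set from code once the Update Query has been saved. A hedged sketch; the query name is a placeholder, and the DAO property name "UseTransaction" is an assumption based on the "Use Transaction" entry in the query property sheet.

          Code:
          Sub DisableQueryTransaction()
            Dim qdf As DAO.QueryDef
            Dim prp As DAO.Property

            Set qdf = CurrentDb.QueryDefs("qryNumberRecords")

            ' "UseTransaction" is assumed from the property sheet name; the
            ' property does not exist until it has been set once, so create
            ' and append it if the assignment fails.
            On Error Resume Next
            qdf.Properties("UseTransaction") = False
            If Err.Number <> 0 Then
              Err.Clear
              Set prp = qdf.CreateProperty("UseTransaction", dbBoolean, False)
              qdf.Properties.Append prp
            End If
            On Error GoTo 0
          End Sub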


          • gershwyn
            New Member
            • Feb 2010
            • 122

            #6
            Originally posted by ADezii
            ...

            I can't believe I didn't notice that. The code "runs" fine, but is throwing error after error, though my SetWarnings command is suppressing them. (I also just noticed I never turned them back on again - oops.)

            Defining the field as a Long Int solved the problem. I suspect the file size issue was Access compiling a list of the thousands of errors that were occurring. I don't know how that works with updating individual records in a recordset, but if this were a single update query it would give me the option of continuing or canceling, which means it must save the old values somewhere.

            This whole thing was meant to be quick and dirty and I only succeeded at the dirty part.

            Your second suggestion probably would have saved a lot of time if I could get it to work. The following function results in updating everything to 1. Running it again changes everything to 2. What do I need to do to make it increment correctly?

            Code:
            Option Explicit
            Public Counter As Long
            
            Function ReturnCount()
              Counter = Counter + 1
              ReturnCount = Counter
            End Function
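
            One likely culprit, offered as a guess rather than a tested answer: when the expression in an update query takes no field arguments, Access can treat it as a constant and evaluate it once for the whole query, which would produce exactly this everything-becomes-1 behaviour (and 2 on the next run, because the public variable keeps its value between runs). A common workaround is to pass a field from the table as a dummy argument so the function is re-evaluated for each row; the field and table names below are placeholders.

            Code:
            Option Explicit
            Public Counter As Long

            ' The unused argument forces Access to call the function once
            ' per row instead of once per query.
            Function ReturnCount(AnyField As Variant) As Long
              Counter = Counter + 1
              ReturnCount = Counter
            End Function

            ' The update query would then be along the lines of:
            ' UPDATE [MyTable] SET [Record_ID] = ReturnCount([SomeExistingField]);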


            • ADezii
              Recognized Expert Expert
              • Apr 2006
              • 8834

              #7
              Originally posted by gershwyn
              ...
              Does the Unique ID have to be Numeric, and if it does, does it have to be sequential?

