How to have a batch transaction?

  • Sri

    How to have a batch transaction?

    I am inserting records into a table - around 1 million records. I want
    to do the insert 10000 records at a time using TRAN. My insert
    statement is very simple:
    BEGIN TRAN T1
    INSERT INTO TableA
    (ColA, ColB, ColC)
    SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
    FROM tbltable B
    WHERE blah blah...etc.

    COMMIT TRAN T1


    Any help or link on the syntax will be helpful!
    Thanks
    Sri

  • Gary

    #2
    Re: How to have a batch transaction?

    Why would you do this? Your insert..select statement is already atomic
    and doesn't need a transaction. Besides, you're not checking for
    errors and rolling back, so it's useless.

    • Sri

      #3
      Re: How to have a batch transaction?

      I have to do this in batches because my log ran out of space in tempdb.

      • Phil

        #4
        Re: How to have a batch transaction?

        You can't do it inside a single transaction, because that won't free up
        any space in the tran log until the whole process has completed.

        As Gary says, if you're splitting the insert into multiple batches,
        there's no point in placing each one in its own transaction, because
        each statement is already atomic.

        If you can delete the records from TableB after you've inserted into
        TableA, then life is comparatively simple:


        declare @rows_affected int

        set rowcount 10000   -- every statement below now affects at most 10000 rows per pass

        INSERT INTO TableA
        (ColA, ColB, ColC)
        SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
        FROM tbltable B
        WHERE blah blah...etc

        select @rows_affected = @@ROWCOUNT

        DELETE FROM tbltable
        WHERE blah blah...etc

        while @rows_affected > 0
        begin

            INSERT INTO TableA
            (ColA, ColB, ColC)
            SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
            FROM tbltable B
            WHERE blah blah...etc

            select @rows_affected = @@ROWCOUNT

            DELETE FROM tbltable
            WHERE blah blah...etc

        end

        set rowcount 0   -- back to unlimited


        This will repeatedly insert 10000 records into TableA, then delete the
        same from tbltable, until there are no more records left that match the
        WHERE clause. The final iteration of the loop will probably affect
        fewer than 10000 records.

        If you can't delete from tbltable, you can use the same principle, but
        you'll have to compare the PK values in the source table against the
        maximum value already inserted into TableA, something like this
        (assuming ColA is the PK):

        declare @rows_affected int
        declare @max_A int

        set rowcount 10000

        INSERT INTO TableA
        (ColA, ColB, ColC)
        SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
        FROM tbltable B
        WHERE blah blah...etc

        select @rows_affected = @@ROWCOUNT

        while @rows_affected > 0
        begin

            -- highest key copied so far; only rows above it still need copying
            select @max_A = max(ColA) from TableA

            INSERT INTO TableA
            (ColA, ColB, ColC)
            SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
            FROM tbltable B
            WHERE blah blah...etc
            and B.TBLBCOLA > @max_A

            select @rows_affected = @@ROWCOUNT

        end

        set rowcount 0

        • Gary

          #5
          Re: How to have a batch transaction?

          Doing it in batches is fine; that's not what I was complaining about.
          It's the BEGIN TRAN..COMMIT TRAN that you don't need, unless you're
          running multiple commands and actually checking for errors.
          Transactions should be implicit with atomic statements.
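
          For what it's worth, here is a sketch of the kind of explicit transaction
          Gary means - several statements that must succeed or fail together, with
          an error check driving the rollback. The names are only placeholders, and
          the "copied" flag column is hypothetical, not something from Sri's table:

          declare @err int

          BEGIN TRAN T1

          INSERT INTO TableA (ColA, ColB, ColC)
          SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
          FROM tbltable B
          WHERE blah blah...etc

          SET @err = @@ERROR
          IF @err <> 0
          BEGIN
              ROLLBACK TRAN T1
              RETURN   -- abandon the batch; nothing is left half-done
          END

          -- a second statement that has to stand or fall with the insert,
          -- e.g. flagging the rows that were copied (hypothetical column)
          UPDATE tbltable SET copied = 1
          WHERE blah blah...etc

          SET @err = @@ERROR
          IF @err <> 0
          BEGIN
              ROLLBACK TRAN T1
              RETURN
          END

          COMMIT TRAN T1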

          • Sri

            #6
            Re: How to have a batch transaction?

            Rather... I thought a batch process might help, since the "log space
            ran out" error only came up about 2 hrs after I fired my query.

            • Hugo Kornelis

              #7
              Re: How to have a batch transaction?

              On 8 Feb 2005 09:34:02 -0800, Sri wrote:
              >I am inserting records into a table - around 1 million records. I want
              >to do the insert 10000 records at a time using TRAN. My insert
              >statement is very simple:
              >BEGIN TRAN T1
              >INSERT INTO TableA
              >(ColA, ColB, ColC)
              >SELECT B.TBLBCOLA, B.TBLBCOLB, B.TBLBCOLC
              >FROM tbltable B
              >WHERE blah blah...etc.
              >
              >COMMIT TRAN T1
              >
              >
              >Any help or link on the syntax will be helpful!

              Hi Sri,

              You'll need something like this generic format:

              SET ROWCOUNT 10000
              WHILE 1 = 1
              BEGIN
                  INSERT INTO TableA (ColA, ColB, ColC)
                  SELECT b.ColA, b.ColB, b.ColC
                  FROM TableB AS b
                  LEFT JOIN TableA AS a
                        ON a.ColA = b.ColA
                       AND a.ColB = b.ColB
                  -- Replace the join columns above with whichever column (or
                  -- combination of columns) uniquely identifies one row in the data.
                  WHERE a.ColA IS NULL
                  -- Use any column from TableA that will never contain NULL.
                  -- The left join with a test for NULL in a non-NULLable column
                  -- is a trick; it has the same effect as a NOT EXISTS subquery,
                  -- but is often quicker.
                  IF @@ROWCOUNT = 0 BREAK
                  BACKUP LOG MyDatabase TO DISK = '...'   -- or WITH NO_LOG; see below
              END
              SET ROWCOUNT 0
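
              For comparison, the NOT EXISTS form that the comment above refers to
              would look roughly like this (same placeholder names):

              INSERT INTO TableA (ColA, ColB, ColC)
              SELECT b.ColA, b.ColB, b.ColC
              FROM TableB AS b
              WHERE NOT EXISTS (SELECT *
                                FROM TableA AS a
                                WHERE a.ColA = b.ColA
                                  AND a.ColB = b.ColB)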

              The "backup log" is necessary - else, the logged events will still be kept
              in the log. If you don't need recoverability during this operation (i.e.
              it suffices to restore an old full or incremental backup and repeat the
              mega insert process), you can also use BACKUP LOG MyDatabase WITH NO_LOG.
              If your database is in the simple recovery model, you can omit this step.

              If you used BACKUP LOG MyDatabase WITH NO_LOG, then you should take a full
              or incremental database backup directly after this process is finished.
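
              For reference, a minimal sketch of those two follow-up steps (the
              database name MyDatabase is carried over from above, and the backup
              path is only a placeholder):

              -- Which recovery model is the database using? (FULL, BULK_LOGGED or SIMPLE)
              SELECT DATABASEPROPERTYEX('MyDatabase', 'Recovery')

              -- If BACKUP LOG ... WITH NO_LOG was used during the load, take a
              -- fresh full backup as soon as the insert loop has finished.
              BACKUP DATABASE MyDatabase
              TO DISK = 'D:\Backups\MyDatabase_after_load.bak'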

              Best, Hugo
