Improving the Performance (Millions of Records!)

  • ssrirao
    New Member
    • Nov 2006
    • 17

    Improving the Performance (Millions of Records!)

    Table1
    --------------------------------------------------------
    a(varchar) | b(varchar) | c(varchar) | d(float)
    ---------------------------------------------------------
    has value | has value | null | has value

    ---------------------------------------------------------

    There are around 13 million records in this table.
    The combination of a and b is unique.

    Table2
    ------------------------------------------------
    a(varchar) | b(varchar) | c(float)
    ------------------------------------------------
    has value | has value | has value

    -------------------------------------------------

    There are around 13 million records in this table.
    The combination of a and b is unique.


    I have a simple procedure to copy the c value of Table2 into the c column of Table1.
    The procedure looks like this:
    ------------------------------------------------------------

    create procedure my_procedure as
    begin
        declare @a nvarchar(255), @b nvarchar(255)
        declare @c float        -- table2.c is a float, not nvarchar
        declare @err int

        declare c1 cursor dynamic
        for select a, b, c from table2

        open c1
        fetch next from c1 into @a, @b, @c
        while (@@fetch_status = 0)
        begin
            update table1
            set c = @c
            where a = @a and b = @b

            set @err = @@error  -- capture right away: @@error resets after every statement
            if @err <> 0
                print 'exception 1 ' + cast(@err as varchar(10))

            fetch next from c1 into @a, @b, @c
        end
        close c1
        deallocate c1
    end
    -----------------------------------------------------------------------------------------------------

    I have created a clustered index, so the performance of my procedure should improve, but I am still afraid to run it: it took 17 hours to update just 2000 matching records. Let me explain the problem clearly.

    There are 2 tables, as described above.

    I have created a clustered index on column a of both tables (note: column a has many repeating values).

    I have to transfer the value in column c of Table2 to column c of Table1.
  • almaz
    Recognized Expert New Member
    • Dec 2006
    • 168

    #2
    If you want to write efficient code, then:

    first recommendation: do not use cursors unless it is absolutely necessary;
    Second recommendation: do not use cursors...;
    ...
    n-th recommendation: do not use cursors...

    Your query can be rewritten as a single set-based UPDATE statement:

    Code:
    create procedure my_procedure as
      update table1
      set c = table2.c
      from table2
      where table1.a = table2.a and table1.b = table2.b
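
    With ~13 million rows on each side, one giant UPDATE can also hold long locks and bloat the transaction log. A common variation is to run it in batches; a minimal sketch, assuming SQL Server 2005 or later and that table1.c starts out null (as in your table layout), with a batch size of 50,000 chosen only for illustration:

    ```sql
    -- Batched version of the set-based update (a sketch, not tested against your schema).
    declare @rows int
    set @rows = 1

    while @rows > 0
    begin
        update top (50000) t1
        set c = t2.c
        from table1 t1
        inner join table2 t2
            on t1.a = t2.a and t1.b = t2.b
        where t1.c is null      -- only rows not yet filled, so each batch makes progress
        set @rows = @@rowcount  -- stop once no matching rows remain
    end
    ```

    Each pass touches at most 50,000 rows, so the log stays small and other sessions get a chance to run in between batches.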

    Comment

    • ssrirao
      New Member
      • Nov 2006
      • 17

      #3
      thanks...

      there is one case in which I have to use a cursor:
      I have to get each unique value y from table x and insert 10 rows for each value of y into table a.
      So the procedure would have a cursor to read the values from table x:

      cursor1 = select y, z from x
      while @@fetch_status = 0
      {
          insert into a (b, c) values (value from cursor1, c)
          c++
          if c = 10 set c = 1
          fetch from cursor1
      }

      something similar to this!!!

      Comment

      • almaz
        Recognized Expert New Member
        • Dec 2006
        • 168

        #4
        Originally posted by ssrirao
        thanks...

        there is one case in which i have to use cursor
        ...
        I didn't quite get the whole problem, but as I see it, it can still be done without cursors:

        Code:
        -- table variable holding the numbers 1..10
        declare @IDs table(ID int identity (1,1))

        declare @i int set @i = 0
        while(@i<10)
        begin
        	insert @IDs default values  -- identity column generates 1, 2, ..., 10
        	set @i = @i + 1
        end

        -- cross join produces 10 rows (one per number) for every row of x
        insert a (b, c)
        select x.y, IDs.ID
        from x cross join @IDs IDs
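
        The WHILE loop above only exists to fill the numbers table. On newer versions of SQL Server (2008 and later) the ten numbers can be built inline with a VALUES row constructor instead; a sketch using the same names (x, y, a, b, c) as above:

        ```sql
        -- Same result without the loop: a row constructor supplies the numbers 1..10.
        insert a (b, c)
        select x.y, n.ID
        from x
        cross join (values (1),(2),(3),(4),(5),(6),(7),(8),(9),(10)) as n(ID)
        ```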

        Comment

        • ssrirao
          New Member
          • Nov 2006
          • 17

          #5
          thanks... this worked faster

          Comment
