Fastest way of updating a row

  • Asger Jensen

    Fastest way of updating a row

    In relation to my last post, I have a question for the SQL-gurus.

    I need to update 70k records, and mark all those updated in a special
    column for further processing by another system.

    So, if the record was

    Key1, foo, foo, ""

    it needs to become

    Key1, fap, fap, "U"

    if and only if the data values are actually different (as above, foo
    becomes fap),

    otherwise it must become

    Key1, foo,foo, ""


    Is it quicker to:
    1) get the row from the destination table, inspect all values
    programmatically, and determine if an update query is needed

    OR

    2) just do an update on all rows, adding
    and (field1 <> value1 or field2 <> value2) to the update query

    that is:
    update myTable
    set field1 = 'foo',
        markField = 'U'
    where key = 'mykey' and (field1 <> 'foo')



    The first version will not generate an update query when the record
    has not changed, at the cost of doing a select first, whereas the
    second version always runs an update, but some of the updates will not
    affect any rows.
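    A minimal sketch of option 2, using Python's sqlite3 as a stand-in
    for the real server (table and column names are made up for
    illustration):

```python
import sqlite3

# In-memory table standing in for the destination table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE myTable (key TEXT PRIMARY KEY, "
    "field1 TEXT, field2 TEXT, markField TEXT)"
)
conn.executemany(
    "INSERT INTO myTable VALUES (?, ?, ?, ?)",
    [("Key1", "foo", "foo", ""), ("Key2", "fap", "fap", "")],
)

def update_and_mark(conn, key, v1, v2):
    # One conditional UPDATE per incoming record: the WHERE clause skips
    # rows whose values already match, so unchanged rows are left alone.
    cur = conn.execute(
        "UPDATE myTable SET field1 = ?, field2 = ?, markField = 'U' "
        "WHERE key = ? AND (field1 <> ? OR field2 <> ?)",
        (v1, v2, key, v1, v2),
    )
    return cur.rowcount  # 1 if the row actually changed, 0 otherwise

print(update_and_mark(conn, "Key1", "fap", "fap"))  # values differ -> 1
print(update_and_mark(conn, "Key2", "fap", "fap"))  # already equal -> 0
```

    The rowcount tells you whether the row was really touched, which is
    the same information the select in option 1 would have given you.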

    Will I need a full index for the second version?

    Thanks in advance,
    Asger Henriksen
  • Erland Sommarskog

    #2
    Re: Fastest way of updating a row

    [posted and mailed, please reply in the newsgroup]

    Asger Jensen (akj@tmnet.dk) writes:
    > Is it quicker to:
    > 1) get the row from the destination table, inspect all values
    > programmatically, and determine if an update query is needed
    >
    > OR
    >
    > 2) just do an update on all rows, adding
    > and (field1 <> value1 or field2 <> value2) to the update query
    >
    > that is:
    > update myTable
    > set field1 = 'foo',
    >     markField = 'U'
    > where key = 'mykey' and (field1 <> 'foo')

    I'm not sure that I follow, but it sounds to me that in the first
    approach you would retrieve rows one by one.

    In any case, the second approach leaves all the job to the computer,
    and there is a reason why we have computers, isn't there? :-)

    The only catch is that with too many rows in the table there can be
    a strain on the transaction log. But with only 70000 rows, this is
    not worth worrying about.
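    If the volume ever did grow to the point where the log became a
    problem, the usual mitigation is to commit in batches rather than in
    one big transaction. A minimal sketch, again with sqlite3 standing in
    and an arbitrary batch size:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE myTable (key TEXT PRIMARY KEY, field1 TEXT, markField TEXT)"
)
conn.executemany("INSERT INTO myTable VALUES (?, 'foo', '')",
                 [(f"Key{i}",) for i in range(70)])
conn.commit()

# Commit every BATCH updates so no single transaction (and hence no
# single chunk of log) grows without bound.
keys = [f"Key{i}" for i in range(70)]
BATCH = 25
for start in range(0, len(keys), BATCH):
    for k in keys[start:start + BATCH]:
        conn.execute("UPDATE myTable SET field1 = 'fap', markField = 'U' "
                     "WHERE key = ? AND field1 <> 'fap'", (k,))
    conn.commit()

print(conn.execute(
    "SELECT COUNT(*) FROM myTable WHERE markField = 'U'"
).fetchone()[0])  # 70
```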

    Obviously the query will run faster if there is a clustered index
    on the column "key".

    --
    Erland Sommarskog, SQL Server MVP, sommar@algonet.se

    Books Online for SQL Server SP3 at


    • Asger Jensen

      #3
      Re: Fastest way of updating a row

      > I'm not sure that I follow, but it sounds to me that in the first
      > approach you would retrieve rows one by one.

      Yes, that's what it was. Nasty.
      >
      > In any case, the second approach leaves all the job to the computer,
      > and there is a reason why we have computers, isn't there? :-)
      >
      :-), yes, only in my scenario it took soooo long.
      > The only catch is that with too many rows in the table there can be
      > a strain on the transaction log. But with only 70000 rows, this is
      > not worth worrying about.
      >

      OK.

      > Obviously the query will run faster if there is a clustered index
      > on the column "key".
      There was, actually, but over the 70000 records the run became
      slower and slower. I solved it by adding a dynamically generated
      index on ALL the fields, not just "key"; this sped things up
      considerably, taking the run from an hour down to ten minutes.

      I guess this is because the server needs to do a lookup per record
      on all the value fields to see whether they have changed, and that
      can be done more efficiently with an index.
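      What Asger describes is essentially a covering index: when the key
      plus every compared column sits in one index, the engine can
      evaluate the WHERE clause from the index alone, without touching
      the base row. A sketch of the effect, with sqlite3 standing in and
      illustrative names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE myTable (key TEXT, field1 TEXT, field2 TEXT, markField TEXT)"
)
conn.executemany("INSERT INTO myTable VALUES (?, 'foo', 'foo', '')",
                 [(f"Key{i}",) for i in range(1000)])

# Index on the key plus every compared column: the comparisons in the
# WHERE clause can then be answered from the index alone.
conn.execute("CREATE INDEX ix_all ON myTable (key, field1, field2)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT key FROM myTable "
    "WHERE key = 'Key1' AND (field1 <> 'fap' OR field2 <> 'fap')"
).fetchall()
print(plan)  # SQLite reports a search using a COVERING INDEX
```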

      Regards
      Asger
