MaxLocksPerFile

  • Trevor Fairchild

    MaxLocksPerFile

    I'm working with an ADO recordset in VB6 - connecting to a database
    and updating every record for a new index number based on a filter set
    ordered by the user. I get the "File sharing lock count exceeded.
    Increase MaxLocksPerFile registry entry" error.

    I've already checked around for solutions, and have already seen the
    DAO.DBEngine.SetOption dbMaxLocksPerFile, ##### suggestion - this does
    not work for whatever reason, possibly because I'm using an ADO
    connection and SetOption applies to DAO?

    If I go into my registry and manually change my maxlocks value, my
    program will work, but I cannot get it to simulate this change
    temporarily during run-time. This program will be installed on
    multiple machines, so it is not feasible to have the registry modified
    every time.

    I am using adOpenDynamic, adLockOptimistic to open the recordset.
    I've tried some other variations with no luck.

    Any ideas why this is? For some strange reason, if I hover over the
    dbMaxLocksPerFile name, my IntelliSense always provides the number 62.
  • Steve Gerrard

    #2
    Re: MaxLocksPerFile


    "Trevor Fairchild" <MRTrevorF@e-crime.on.ca> wrote in message
    news:f52d3238.0 402041201.11e6d ef@posting.goog le.com...[color=blue]
    > I'm working with an ado recordset in VB6 - connecting to a database
    > and updating every record for a new index number based on a filter set
    > ordered by the user. I get the "File Sharing Lock Count Esceeded.
    > Increas MaxLocksPerFile registry entry"
    >[/color]

    I would not recommend trying to increase MaxLocksPerFile. I cannot
    imagine a scenario where updating every record in a single table with
    adLockOptimistic would require this. Something else must be wrong with
    the way you are doing the update.



    • Trevor Fairchild

      #3
      Re: MaxLocksPerFile

      It's not a very complicated bit of code:
      It opens a recordset and cycles through until rs.EOF
      Every record it hits, it changes a value in a specific field, and then
      moves on.
      rs.MoveNext initiates an update command on its own, but I have also
      forced an rs.Update every time I change the value.

      It will stop before rs.EOF with the error message.

      My first set of test data was 5000 records, and everything worked
      fine. My next set of test data is 11,000 records, and I get this
      error. My next set will actually be 350,000 records...

      My solution was to count the number of updates already executed, and
      when it reaches 750 it closes the rs, reopens it, uses rs.Find to
      get back to where it left off, and resumes for the next 750.

      This workaround does the job, too, although it is a bit slow.
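
      For anyone finding this thread later, a minimal sketch of that
      batch-and-reopen idea. The table and field names (TestData,
      SortIndex, ID) and the open connection oConn are placeholders, and
      it assumes the table has a sortable unique ID column to Find on:

      ```vb
      ' Sketch of the close/reopen workaround: commit in batches so Jet
      ' releases its accumulated page locks, then reposition with Find
      ' and continue. Names here are placeholders, not from the thread.
      Const BatchSize = 750

      Dim rs As ADODB.Recordset
      Dim nDone As Long, lastID As Long

      Set rs = New ADODB.Recordset
      rs.Open "SELECT * FROM TestData ORDER BY ID", oConn, _
              adOpenDynamic, adLockOptimistic, adCmdText

      Do Until rs.EOF
          rs.Fields("SortIndex") = nDone + 1
          rs.Update
          lastID = rs.Fields("ID")
          nDone = nDone + 1

          If nDone Mod BatchSize = 0 Then
              ' Closing the recordset releases the locks held so far
              rs.Close
              rs.Open "SELECT * FROM TestData ORDER BY ID", oConn, _
                      adOpenDynamic, adLockOptimistic, adCmdText
              rs.Find "ID > " & lastID  ' resume just past the last one done
          Else
              rs.MoveNext
          End If
      Loop

      rs.Close
      ```

      If no record matches the Find after a reopen, the recordset lands on
      EOF and the loop exits cleanly.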


      • Steve Gerrard

        #4
        Re: MaxLocksPerFile


        "Trevor Fairchild" <MRTrevorF@e-crime.on.ca> wrote in message
        news:f52d3238.0 402050625.4c08f 4d7@posting.goo gle.com...[color=blue]
        > It's not a very complicated bit of code:
        > It opens a recordset and cycles through until rs.EOF
        > Every record it hits, it changes a value in a specific field, and then
        > moves on.
        > rs.MoveNext initiates an update command on its own, but I have also
        > forced an rs.Update every time I change the value
        >
        > It will stop before rs.eof with the error message.
        >
        > My first set of test data was 5000 records, and everything worked
        > fine. My next set of test data is 11,000 records, and I get this
        > error. My next set will actually be 350,000 records...
        >
        > My solution was to count the number of updates already executed, and
        > when it reaches 750 then it closes the rs, and reopens it, rs.Find to
        > get back to where it left, and resumes for the next 750.
        >
        > This workaround does the job, too, although it is a bit slow.[/color]

        I just ran the code below. This is VB6 (SP4) on WinXP, with ADO 2.5
        referenced. The database is an Access 97 database, and the SortIndex
        field is a Long field. The TestData table has 150,000 records. It
        updated them all just fine.
        - - - - -
        Const ConnStr = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\Documents and Settings\Steve\My Documents\Long List.mdb;" & _
            "Persist Security Info=False"
        Const SQLStr = "SELECT * FROM TestData"

        Private Sub Command1_Click()
            Dim oConn As ADODB.Connection
            Dim oRS As ADODB.Recordset
            Dim n As Long

            Set oConn = New ADODB.Connection
            Call oConn.Open(ConnStr)

            Set oRS = New ADODB.Recordset

            With oRS
                Call .Open(SQLStr, oConn, adOpenStatic, adLockOptimistic, adCmdText)

                Do Until .EOF
                    n = n + 1
                    .Fields("SortIndex") = n * 2
                    .MoveNext
                Loop
            End With

            MsgBox "Did " & n & " records."
        End Sub



        • Trevor Fairchild

          #5
          Re: MaxLocksPerFile

          could the difference be that my connection string is to an Access
          database and yours is SQL? I set a counter to keep track of how many
          updates have occurred, and the approximate maximum I can get is 9000
          records - I tried 9500, which is the default, but it didn't
          work - I imagine it is using some locks elsewhere at the same time.
          Anyway, I just have the recordset close at 9000 updates and then
          re-open.

          Problem is solved for me, although it's just a workaround.


          • Steve Gerrard

            #6
            Re: MaxLocksPerFile


            "Trevor Fairchild" <MRTrevorF@e-crime.on.ca> wrote in message
            news:f52d3238.0 402061244.705cb 2aa@posting.goo gle.com...[color=blue]
            > could the difference be that my connection string is to an access
            > database and yours is sql? I set a counter to keep track of how many
            > updates have occurred, and the approximate maximum I can get is 9000
            > records - I tried 9500 which is the windows default, but it didn't
            > work - I imagine it is using some locks elswhere at the same time.
            > Anyway, I just have the recordset close at 9000 updates and then
            > re-open.
            >
            > Problem is solved for me, althought it's just a workaround.[/color]

            My test was with an Access 97 database, which I think is what you said
            you had also. (The connect string is just the ADO gibberish.)

            I do remember now seeing a file locking issue years ago, using Access
            2.0 on a Novell network. My solution then was the same idea as yours, do
            batches of 10,000 records or so. I haven't seen this happen since then,
            which is why I think it is odd that you are seeing it.

            Maybe it's because your Access database is on a network, and my little
            test was off my C: drive?



            • Trevor Fairchild

              #7
              Re: MaxLocksPerFile

              no, I'm working off a local mdb file - using Access 2000 actually.

              I'm not all that sure what the problem is, but it is directly
              connected to the update command for mass records - my program parses a
              csv file and will add hundreds of thousands of records easily - it's
              updating existing ones that seems to be the problem.

              My workaround is a solution, so I'm not going to concern myself with
              this anymore - it's probably something screwy on my computer if it's
              such an unusual error to get.


              • Steve Gerrard

                #8
                Re: MaxLocksPerFile


                "Trevor Fairchild" <MRTrevorF@e-crime.on.ca> wrote in message
                news:f52d3238.0 402070646.3b3ba 876@posting.goo gle.com...[color=blue]
                > no, I'm working off a local mdb file - using Access 2000 actually.
                >
                > I'm not all that sure what the problem is, but it is directly
                > connected to the update command for mass records - my program parses a
                > csv file and will add hundreds of thousands of records easily - it's
                > updating existing ones that seem to be the problem.
                >
                > My workaround is a solution, so I'm not going to concern myself with
                > this anymore - it's probably something screwy on my computer if it's
                > such an unusual error to get.[/color]

                I know the horse is dead, but this got me curious, and I did some
                searching in MS knowledge base articles. The MaxLocksPerFile setting is
                available for both Access and the Jet database engine; to change for
                running under ADO, you would need to change the Jet setting, not the
                Access setting.

                According to the documentation, if Access or Jet exceeds this number of
                locks, the transaction is automatically broken up into sections, and
                should proceed on its own (except for the occasional message in Access
                itself). The limit is present because Novell networks can crash, and
                Windows networks can slow down, if too many locks are active at once.

                So the odd things are 1) that you get the message at all, since Jet and
                ADO are supposed to take care of it, and 2) that it doesn't occur when
                adding thousands of records, but it does when updating thousands.

                It is possible that your particular machine has some other limit on file
                locks that is a lower number than the 9500 default for Access and Jet,
                and that is what you are bumping into. I'm not sure that your solution
                is really any slower than the "automatic" one in Jet, so you are
                probably fine. Still, it would be nice to know what is actually going
                on...
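
                If you do want to try changing the Jet setting from ADO at
                run time, something like the sketch below might work. I
                have not verified the exact property name - "Jet OLEDB:Max
                Locks Per File" is my reading of the provider-specific
                dynamic properties the Jet 4.0 OLE DB provider exposes -
                so check it against the provider documentation before
                relying on it:

                ```vb
                ' Hedged sketch: raise Jet's lock limit on one ADO
                ' connection, without touching the registry. The property
                ' name is an assumption; verify it for your provider.
                Dim oConn As ADODB.Connection
                Set oConn = New ADODB.Connection
                oConn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
                           "Data Source=C:\Data\Test.mdb"
                oConn.Properties("Jet OLEDB:Max Locks Per File") = 200000
                ' ... open recordsets and do the mass update as usual ...
                oConn.Close
                ```

                Being a per-connection setting, this would sidestep the
                "modify the registry on every machine" problem entirely.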



                • J French

                  #9
                  Re: MaxLocksPerFile

                  On 7 Feb 2004 06:46:11 -0800, MRTrevorF@e-crime.on.ca (Trevor
                  Fairchild) wrote:

                  I don't use MDBs, but when dealing with large files, I have often
                  found that it is significantly faster to create a new file (database)
                  and merge in the data.
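
                  A sketch of that idea in Jet SQL, which supports
                  SELECT ... INTO with an IN clause naming an external
                  database. The table name (TestData) and paths are made
                  up, and the target .mdb generally needs to exist first
                  (created by hand or with ADOX):

                  ```vb
                  ' Sketch: copy the data into a fresh database in one
                  ' SQL statement instead of updating rows in place.
                  ' Names and paths here are placeholders.
                  Dim oConn As ADODB.Connection
                  Set oConn = New ADODB.Connection
                  oConn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
                             "Data Source=C:\Data\Old.mdb"

                  ' Jet creates the NewData table inside New.mdb
                  oConn.Execute "SELECT * INTO NewData " & _
                                "IN 'C:\Data\New.mdb' FROM TestData"
                  oConn.Close
                  ```

                  One bulk statement like this holds far fewer row locks
                  than a record-by-record recordset loop.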
