Streamlining frontend/backend interaction.

  • Whizzo
    New Member
    • Feb 2009
    • 44

    Hi folks,
    I'm just starting work on quite a large Access project. I've been called in to try to chase a few gremlins out of a database that's become pretty big: 25,000 records, a bunch of queries (some multi-layered) and lots of macros. The front end is installed on 12 local machines and is often used by all 12 users concurrently.

    The problem is that the back end is stored on a Windows shared drive in Copenhagen and the network is a little strained; bandwidth is thin on the ground, latency is high, and some of these queries take up to five minutes to run. These problems, combined with the multi-user nature of the DB, mean that records are getting lost and corrupted, seemingly at random but at a rate of 5 or 6 records per user per day. Too much! Maybe delays in the record-locking calls between the front end and back end are to blame? Anybody have any ideas?

    Thanks!

    Hen
  • Stewart Ross
    Recognized Expert Moderator Specialist
    • Feb 2008
    • 2545

    #2
    Hi Whizzo. Your current network problems are not good news for using Access. Access is not client-server based and is very fragile when faced with connection problems that interrupt links between a front-end DB and the back-end tables. The back-end database often ends up in a corrupt and unusable state when faced with such interruptions.

    I would suggest that to protect your data you may need to consider migrating the back-end tables onto a true client-server platform such as SQL Server. Such a platform implements true transaction processing, which in general should help ensure that records are not 'lost' or improperly updated the way you are currently seeing them.

    You can still use an Access front-end to link onto the SQL Server tables.

    You should also give careful consideration to the design of the back-end tables to ensure that fields you subsequently use in query joins and where clauses are appropriately indexed. The query optimisation methods used by Access are not ones we see or can have much influence over, but what is for certain is that we can speed up queries by ensuring that relationships between tables are set correctly, and indexes are used for non-key fields which are frequently joined or searched. Other tips include not using domain functions such as DLookup in queries that return many records, and avoiding IIF or Switch functions if at all possible (as the results from these are unpredictable from the Access database engine's point of view and prevent optimisation).
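    The indexing advice above can be shown concretely. The sketch below uses SQLite via Python purely as a stand-in for the Jet/ACE engine, and the table and index names are invented for illustration; the principle - index the non-key fields you join or search on - carries over to Access back-end tables, where you would set the index in table design view or via a `CREATE INDEX` DDL query.

```python
import sqlite3

# Hypothetical two-table schema standing in for the Access back end.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Orders    (OrderID INTEGER PRIMARY KEY, CustomerRef TEXT);
    CREATE TABLE Customers (CustomerRef TEXT, Region TEXT);
""")

join_sql = ("SELECT * FROM Orders o "
            "JOIN Customers c ON o.CustomerRef = c.CustomerRef")

# Before indexing: no permanent index exists on the joined field.
plan_before = con.execute("EXPLAIN QUERY PLAN " + join_sql).fetchall()

# Index the non-key field used in the join, as recommended above.
con.execute("CREATE INDEX idx_cust_ref ON Customers (CustomerRef)")

# After indexing: the plan searches Customers via the index instead of
# scanning the table (or building a throwaway index) on every execution.
plan_after = con.execute("EXPLAIN QUERY PLAN " + join_sql).fetchall()

print(plan_before)
print(plan_after)
```

    The same query against the same data gets a cheaper plan once the join field is indexed; over a strained WAN link, anything that reduces the rows the engine has to touch matters even more.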

    -Stewart

    • NeoPa
      Recognized Expert Moderator MVP
      • Oct 2006
      • 32661

      #3
      Although I would have provided a less full answer, I would certainly have replied along very similar lines to Stewart. Access over any type of slow network is highly suspect. The solution proposed is a good one, and increasingly straightforward to implement; it doesn't take a great expert to set up a SQL Server nowadays (and we do have a forum to help too if required).

      • Whizzo
        New Member
        • Feb 2009
        • 44

        #4
        Thanks folks! I'm initially here for a three-day period to assess possible ways ahead for solving this problem, and SQL Server is certainly something I'm looking into. I've not had any experience setting up a SQL Server instance, and I'm concerned that the machine it'll be living on is over in Denmark and I can't get my hands on it. Is it something that can be done from a client, or does it need plenty of input from the techs in Copenhagen?

        I'm also worried that the database was built in Access without any heed for SQL syntax and naming conventions. Queries run on the thing all the time, to populate forms as they're opened and so on. Will this be trouble?

        The measures I was initially looking to propose mostly involve cleaning up the forms on the front end and stopping them from requesting data they don't expressly need from the back end. Also removing the sharing-violation delay on the server, applying pessimistic record locking, putting the back end in an 8.3 naming environment, restructuring some of the tables, replacing macros and unnecessary queries with VBA, and waving a magic wand. That kind of thing. I'm going to try to set up a dummy SQL Server on my machine, but in the meantime does anybody know any handy tricks for making the bond between here and Scandinavia work more efficiently for me?

        Cheers!

        • Stewart Ross
          Recognized Expert Moderator Specialist
          • Feb 2008
          • 2545

          #5
          Originally posted by Whizzo
          ... does anybody know any handy tricks for making the bond between here and Scandinavia work more efficiently for me?
           In my opinion, if you are working on a poor connection you have no choice but to limit the traffic across the network - which is what I was getting at in my previous reply. If you find a way to replace your currently unstable connection to Copenhagen with one that is fast and utterly reliable then you might be able to avoid facing this problem; otherwise no amount of tweaking record locking and other parameters can resolve the inherent instability you face. Access is simply not a product designed for use across a world-wide network connection, and is not the right solution in such an environment without implementing a client-server back end.

           Not knowing your application it is difficult to advise you on the other steps you mention. Be cautious about replacing Access queries with VBA code - Access queries are usually very efficient in operation (assuming that the use of IIFs etc is avoided, as previously mentioned). I would be very surprised if the Access queries have much bearing on the performance issues you are experiencing, assuming that the tables and queries are reasonably well constructed to begin with. Behind the scenes, all Access queries are implemented as SQL statements that, although not fully compliant with ANSI standards, are not miles off.
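           To illustrate the earlier point about domain functions, the sketch below contrasts a per-row lookup (the query-level equivalent of calling DLookup once for every record) with a plain join. SQLite via Python is used as a stand-in for the Access engine and all table and field names are invented; Access SQL differs in detail, but the shape of the rewrite is the same.

```python
import sqlite3

# Hypothetical data: a parts table referencing a suppliers table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Parts     (PartID INTEGER PRIMARY KEY, SupplierID INTEGER);
    CREATE TABLE Suppliers (SupplierID INTEGER PRIMARY KEY, Name TEXT);
    INSERT INTO Suppliers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO Parts VALUES (10, 1), (11, 2), (12, 1);
""")

# DLookup-style: one correlated subquery evaluated per output row,
# which the optimiser largely cannot plan around.
slow = con.execute("""
    SELECT PartID,
           (SELECT Name FROM Suppliers s
             WHERE s.SupplierID = p.SupplierID) AS Name
      FROM Parts p ORDER BY PartID
""").fetchall()

# Join form: the engine can optimise this as a single set operation,
# using an index on the joined field if one exists.
fast = con.execute("""
    SELECT p.PartID, s.Name
      FROM Parts p JOIN Suppliers s ON s.SupplierID = p.SupplierID
     ORDER BY p.PartID
""").fetchall()

assert slow == fast  # identical results; only the execution strategy differs
print(fast)
```

           Both forms return the same rows; the difference is that the join gives the engine a whole-set operation to optimise, whereas the per-row lookup forces one probe per record returned.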

          By the way, 25,000 records is small beer in Access (which can easily deal with several hundred thousand records), and very small beer for client-server DBs (which can routinely deal with millions). Attention to the choice of back-end is essential, as the current losses of just a few records a day may well climb considerably as record loads increase.

          -Stewart

          • Whizzo
            New Member
            • Feb 2009
            • 44

            #6
            I've moved the DB to a local hard disk within the building now, so things should work much better. Does anybody have any idea of a hypothetical maximum user load over a LAN? We're only talking 6 or so users, so it shouldn't break the bank...

            • Stewart Ross
              Recognized Expert Moderator Specialist
              • Feb 2008
              • 2545

              #7
              The maximum number of concurrent users Access can handle is 255. With the number you mention - 6 - you are unlikely to have any performance issues as such.

              -Stewart

              • Whizzo
                New Member
                • Feb 2009
                • 44

                #8
                Excellent, cheers all. Mission accomplished!
