Hello All:
I have been brought onboard to help on a project that had some
performance problems last year. I have taken some steps to address
the issues in question, but a huge question mark remains.
Last year, all of the tables and reports were stored in Access. The
database was put online so that end users could access the reports
online using Snapshot Viewer. The reports were aggregated on the fly,
and the selection criteria available to the end users were passed to the
queries/reports as text strings rather than as indexed numeric values.
(Example: passing a Year_Desc of 200-2003 instead of a Year_ID of 3.)
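To make the text-vs-numeric distinction concrete, here is a minimal sketch using SQLite as a stand-in engine (the table and column names mirror the example above but are otherwise hypothetical, not the project's actual schema). The point is that an indexed integer key lets the engine seek directly to the rows, while an unindexed text description forces string comparisons across the table:

```python
# Sketch only: hypothetical schema, SQLite standing in for the real engine.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE report_data (
    Year_ID   INTEGER,
    Year_Desc TEXT,
    Amount    REAL
)""")
cur.executemany(
    "INSERT INTO report_data VALUES (?, ?, ?)",
    [(1, "2000-2001", 10.0), (2, "2001-2002", 20.0), (3, "2002-2003", 30.0)],
)
# Index the numeric key so lookups can seek instead of scanning.
cur.execute("CREATE INDEX idx_year ON report_data (Year_ID)")

# Old approach: filter on the text description (unindexed string compare).
slow = cur.execute(
    "SELECT SUM(Amount) FROM report_data WHERE Year_Desc = ?", ("2002-2003",)
).fetchone()[0]

# New approach: filter on the indexed integer key.
fast = cur.execute(
    "SELECT SUM(Amount) FROM report_data WHERE Year_ID = ?", (3,)
).fetchone()[0]
```

Both queries return the same result; the difference is how the engine finds the rows, which matters as the table grows.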
Hundreds of users were accessing this system. Fortunately not all at
the same time, but apparently enough at once that the system came down
a few times while people were pulling their reports.
This year, I have preaggregated all reports into a reporting table, so
that no aggregation is happening on the fly when people are pulling
their reports. The only work that will be done to pull up the reports
is to pass the selection criteria to the query that the report is
based on. ALL TABLES, including the aggregated reporting tables, have
been moved to SQL Server. No tables live in Access anymore. The
selection criteria in the SQL reporting tables all have indexed
numeric fields that will be used to choose the appropriate recordset.
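The pre-aggregation approach described above can be sketched as follows (again a hypothetical schema in SQLite, not the project's real tables): the expensive GROUP BY runs once, before any user asks for a report, and report requests then reduce to a pure indexed lookup against the reporting table:

```python
# Sketch only: hypothetical detail table, SQLite standing in for SQL Server.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE detail (Year_ID INTEGER, Dept_ID INTEGER, Amount REAL)")
cur.executemany(
    "INSERT INTO detail VALUES (?, ?, ?)",
    [(3, 1, 5.0), (3, 1, 7.0), (3, 2, 4.0)],
)

# One-time aggregation step, run before users request reports.
cur.execute("""CREATE TABLE report AS
               SELECT Year_ID, Dept_ID, SUM(Amount) AS Total
               FROM detail
               GROUP BY Year_ID, Dept_ID""")
# Index the selection-criteria columns so report pulls are simple seeks.
cur.execute("CREATE INDEX idx_report ON report (Year_ID, Dept_ID)")

# At report time: no aggregation, just pass the criteria to a lookup.
row = cur.execute(
    "SELECT Total FROM report WHERE Year_ID = ? AND Dept_ID = ?", (3, 1)
).fetchone()
```

The trade-off is that the reporting table is only as fresh as the last aggregation run, which is usually acceptable for periodic reports.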
I think these items represent a big improvement for this process,
making us less likely to come down this year. The theory is that
Access is known not to handle multiple users and heavy load on its
tables well. I focused on the fact that everything was in Access and, since
it is not a shining star for this type of multiple user/heavy load
scenario, we would be better off in SQL.
The argument has been put forward, however, that the bottleneck is the
opening and closing of actual Access instances needed to produce the
Snapshot output for each report. It is argued that the move from
Access to SQL will have little impact, because that was not the
bottleneck. The bottleneck, it is said, was the number of actual
instances that Access must "Open" in order for people to view the
reports.
Since I have little experience with Snapshots, the argument may in
fact be valid. I stand by the move to SQL, and I still believe that
everything living in Access contributed to the poor performance and
crashes. I cannot speak to the Access Snapshot piece and am
thus still concerned for the success of this project.
My concern is that we may still be looking at a crash scenario despite
the SQL move and preaggregated data. Does anyone out there have
experience with snapshots who can verify or refute the performance
issues mentioned above and the approach we are taking?
Any input is greatly appreciated. Thanks.
BlueDolphin