Hi,
I am currently using MS Access + VBA to build reporting applications and to produce ad hoc reports for the company. However, from 2008 we are planning to change the way the source database is organised, and the information in it will be broken down in more detail. This will increase the size of the database rather dramatically, and it is quite likely that I will have a few tables with 5M+ rows. I am worried that Access will not be able to cope with this. It is not really the size of the database itself, which is still well below the 2 GB limit, but I am afraid that calculations over these tables may become unreasonably slow.
I wonder what your opinion on this is and what other options you would suggest. I was thinking about using FoxPro to manipulate the data before analysing it in Access. I assume FoxPro would be faster and more flexible for this, but we cannot migrate to FoxPro or any other software, as many people use the same databases and they are not trained on anything except MS Office.
So basically I am looking for a way to build an automated application that would manipulate/consolidate/split the large database into smaller tables, which can later be used for analysis in Access. It is important to understand that getting a separate server for this task is very unlikely at this stage, so all the workload would fall on desktop computers, with the data stored on central file servers.
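To give you an idea of the kind of pre-aggregation step I have in mind, here is a minimal VBA sketch. The table and field names (tblTransactions, Region, TradeDate, Amount) are made up for illustration; the real structure will depend on the 2008 reorganisation.

    ' Sketch of the pre-aggregation idea: roll a large detail table
    ' up into a small local summary table that the ad hoc reports
    ' can then query instead of touching the 5M+ row source.
    ' All table and field names here are placeholders.
    Public Sub BuildMonthlySummary()
        Dim db As DAO.Database
        Set db = CurrentDb

        ' Drop the old summary table if it exists; ignore the
        ' error raised when it does not.
        On Error Resume Next
        db.Execute "DROP TABLE tblMonthlySummary"
        On Error GoTo 0

        ' Make-table query: collapse the detail rows into one row
        ' per region per month before any reporting runs.
        db.Execute _
            "SELECT Region, Format(TradeDate, 'yyyy-mm') AS YearMonth, " & _
            "       Sum(Amount) AS TotalAmount, Count(*) AS TxnCount " & _
            "INTO tblMonthlySummary " & _
            "FROM tblTransactions " & _
            "GROUP BY Region, Format(TradeDate, 'yyyy-mm')", dbFailOnError

        db.TableDefs.Refresh
        Set db = Nothing
    End Sub

Something like this, scheduled to run overnight on a desktop machine against the file-server data, is roughly what I mean by an "automated application" above.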
Hope this explains the situation. Any ideas would be greatly appreciated.
Thanks!
JZ