I need to import data programmatically from spreadsheet files into database tables. There are 30 such tables, and the user can upload multiple files at a time. For a table that has no referential-integrity relationships, the algorithm is as follows:
For each file, do the following operations:
a) Read the Excel file
b) Begin a transaction
c) Delete all records from the table
d) INSERT all records from the Excel file into the table
e) If there is an error, roll back the transaction for that table and repeat the above steps for the next table
f) If there is no error, commit the transaction and repeat the above steps for the next table
g) Show a message specifying the number of successfully uploaded tables and the names of the unsuccessful files
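For reference, here is a simplified sketch of the per-file logic. The real application may use a different language, Excel reader, and database driver; pandas, sqlite3, and all names below are only placeholders to show the flow:

```python
import pandas as pd   # assumed Excel reader; the real code may use something else
import sqlite3        # placeholder driver; the actual database is not specified


def upload_file(conn, excel_path, table_name):
    """Replace the contents of table_name with the rows from excel_path."""
    df = pd.read_excel(excel_path)                # a) read the Excel file
    cur = conn.cursor()
    try:
        # b) the driver opens a transaction implicitly before the first DML statement
        cur.execute(f"DELETE FROM {table_name}")  # c) delete existing records
        placeholders = ",".join("?" * len(df.columns))
        cur.executemany(                          # d) insert all records from the file
            f"INSERT INTO {table_name} VALUES ({placeholders})",
            df.itertuples(index=False, name=None),
        )
        conn.commit()                             # f) commit on success
        return True
    except Exception:
        conn.rollback()                           # e) roll back this table on any error
        return False


def upload_all(conn, files_and_tables):
    succeeded, failed = 0, []
    for path, table in files_and_tables:          # the user may upload multiple files
        if upload_file(conn, path, table):
            succeeded += 1
        else:
            failed.append(path)
    # g) report how many tables uploaded and which files failed
    print(f"{succeeded} table(s) uploaded successfully; failed files: {failed}")
```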
All files are getting uploaded except one (let's call it ABC). The ABC file has 50 thousand records; all the other tables have fewer than 5 thousand records. So I guess the problem is the large amount of data in the ABC table. The bigger problem is that the behaviour is inconsistent every time I try to upload. It takes 50 minutes to complete the upload of ABC, and after that it sometimes uploads only 12 thousand records, sometimes 45 thousand, and sometimes 30 thousand. This is making me more worried. If there were a problem in the logic of the algorithm, or some technical limitation, then at least it should give the same output every time I try to upload.
Any pointers on why a program can give different output on different runs?
On a lighter note, if this problem is not solved I would submit it as a random number generator. I understand that there is no perfect random number generator :-))