I have a table in one of my projects called "Days" that contains a record for every day over the next five years (1,826 records). It has the fields ChargeDate, Total, Count, and Notes. I use the table to subtotal individual charges from another table, "Charges", into their respective dates so that I have a record of charges grouped by day. It is the same concept as the totals query below, except I couldn't include my Notes field because it is a memo field and I got an error:
I need to preserve the data in the Notes field, so I wrote a procedure that totals the charges for me while concatenating the note text for each date. When it runs, the procedure does a lot of other operations, but the part of interest here is where I step through each record in my Days table and clear out the values to get them ready for the next round of calculations:
I just learned how to use the Progress Meter, but I don't think that has anything to do with my problem. Other than that, there is nothing complicated about it; I've used similar code plenty of other times with no bloating issues.
I had to run this part of the routine several dozen times while I was experimenting with different data and debugging the rest of the procedure. Before I opened the file last night it was at about 80 MB, then at some point several hours later I checked again and my file was over 360 MB.
The first thing I tried was a compact and repair, but that only cut it down by a few MB. I ended up importing all of my other tables, forms, and queries into a new file, re-creating the Days table, and re-establishing my relationships. After that I'm below 4 MB total, but now I'm afraid it's going to happen again because I need to run this procedure repeatedly. Does anyone know why this would happen? I'm nearly positive it's because I'm rewriting the values in those three fields of the Days table, since that was the only table with a problem and that is the only operation I performed on it (*edit: I also rewrote the field values later in the procedure). I don't understand why that would happen, or why compacting wouldn't fix it.
Code:
SELECT ChargeDate, Sum(ChargeAmount) AS DailyTotal FROM Charges GROUP BY ChargeDate;
Code:
Dim db As DAO.Database
Dim rstDays As DAO.Recordset
Dim DaysRC As Long       ' RecordCount returns a Long, so don't use Integer
Dim DaysProgress As Long

Set db = CurrentDb
Set rstDays = db.OpenRecordset("Days", dbOpenDynaset)

With rstDays
    ' Move to the end first so RecordCount is accurate.
    .MoveLast
    DaysRC = .RecordCount
    .MoveFirst
    SysCmd acSysCmdInitMeter, "Step 1 of 2...", DaysRC
    For DaysProgress = 1 To DaysRC
        SysCmd acSysCmdUpdateMeter, DaysProgress
        .Edit
        !Total = 0
        !Count = 0
        !Notes = ""
        .Update
        .MoveNext
    Next DaysProgress
    SysCmd acSysCmdRemoveMeter
End With

rstDays.Close
Set rstDays = Nothing
' Don't call db.Close on the CurrentDb handle; just release the reference.
Set db = Nothing
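For context, the totaling/concatenation pass mentioned above isn't shown; conceptually it does something like this. This is a simplified sketch, not my exact code, and the Charges field names ChargeAmount and ChargeNote are placeholders:

```vba
' Sketch of the totaling/concatenation pass (placeholder field names).
Dim rstCharges As DAO.Recordset
Set rstCharges = CurrentDb.OpenRecordset( _
    "SELECT ChargeDate, ChargeAmount, ChargeNote FROM Charges", dbOpenSnapshot)

Do Until rstCharges.EOF
    ' Find the matching day record and accumulate into it.
    rstDays.FindFirst "ChargeDate = #" & _
        Format(rstCharges!ChargeDate, "mm\/dd\/yyyy") & "#"
    If Not rstDays.NoMatch Then
        rstDays.Edit
        rstDays!Total = rstDays!Total + rstCharges!ChargeAmount
        rstDays!Count = rstDays!Count + 1
        rstDays!Notes = rstDays!Notes & rstCharges!ChargeNote & vbCrLf
        rstDays.Update
    End If
    rstCharges.MoveNext
Loop

rstCharges.Close
Set rstCharges = Nothing
```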
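Incidentally, I know the reset itself could be collapsed into a single UPDATE query instead of the record-by-record loop. This is just a sketch using the same table and field names; setting Notes to Null rather than a zero-length string is a guess on my part, and I haven't tested whether either form makes any difference to the bloat:

```sql
UPDATE Days
SET Total = 0, [Count] = 0, Notes = Null;
```

(Count needs the brackets because it's a reserved word; I'd run this with CurrentDb.Execute and dbFailOnError.)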