I am trying to process a CSV file but am running into my host's
maximum execution time of 30 seconds.
This is how the script works at the moment.
User uploads their CSV file
The script reads the upload with file() and writes it out as smaller chunk files.
The script then processes the smaller files one at a time, populating the
database, deleting each processed file, and refreshing itself, thus starting
again.
This system works for files up to 15000 rows, but I need to be able to
process larger files.
The bottleneck is with the initial splitting, since I use the file()
function to read the entire uploaded file.
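Since file() slurps the whole upload into an array at once, a streaming approach with fgets() might avoid that memory and time cost. A minimal sketch of the splitting step, assuming a line-per-chunk limit; the function name, $chunkSize, and output naming are illustrative, not from the original script:

```php
<?php
// Stream the upload line by line with fgets() instead of loading it
// all at once with file(); only one line is held in memory at a time.
function splitCsv(string $source, string $outDir, int $chunkSize = 1000): int
{
    $in = fopen($source, 'r');
    $chunk = 0;
    $row = 0;
    $out = null;
    while (($line = fgets($in)) !== false) {
        // Start a new chunk file every $chunkSize rows.
        if ($row % $chunkSize === 0) {
            if ($out !== null) {
                fclose($out);
            }
            $chunk++;
            $out = fopen($outDir . '/chunk_' . $chunk . '.csv', 'w');
        }
        fwrite($out, $line);
        $row++;
    }
    if ($out !== null) {
        fclose($out);
    }
    fclose($in);
    return $chunk; // number of chunk files written
}
```

The existing per-chunk processing loop could then pick up the chunk_N.csv files exactly as before.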
Does anyone know a quicker way to split a file into smaller chunks?
TIA
RG