Has anyone tried to run several copies of the same Python code in the background? Let me explain. What I need to do is open 50 or so different folders and do operations on the stuff inside each folder. It works fine in serial (open folder1, do work, open folder2, do work, and so on), but this takes a long time, on the order of 3-4 hours, so I was trying to get it to run concurrently.
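Roughly, the serial version looks something like this (a simplified sketch, not my actual code; the work function is a placeholder and the folder path matches the output further down):

Code:
import os

def do_work(folder):
    """Placeholder for the real operations on a folder's contents."""
    for name in os.listdir(folder):
        pass  # process each file in turn

# Serial version: visit each of the ~50 folders one after another.
for i in range(50):
    do_work("/tmp/shared/sqb/folder%d" % i)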
The way I was planning to do this was by calling each instance of the Python code from a bash script. Here it is for four folders:
Code:
#!/bin/sh
for i in $(seq 0 3); do
    python calculate.py $i &
done
wait
exit 0

The Python script takes the number as an argument and uses that number to open the appropriate folder.
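Simplified, the argument handling in calculate.py amounts to something like this (a sketch, not the real code; the path and snapshot name are taken from the output below):

Code:
# calculate.py (simplified sketch, not the real code)
import sys
import os

i = int(sys.argv[1])                             # folder number passed in by the bash script
folder = "/tmp/shared/sqb/folder%d" % i          # e.g. /tmp/shared/sqb/folder0
snapshot = os.path.join(folder, "snapshot_000")  # the file the program reads first

with open(snapshot, "rb") as f:
    data = f.read()
# ... the real calculations on the folder's contents go here ...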
Now here is the problem: when I run this script, folder0 is worked on correctly, but the other three folders are not. Here is some output.
Code:
-bash-3.1$ ./test.sh
Engage the System
Engage the System
Engage the System
Engage the System
reading `/tmp/shared/sqb/folder0/snapshot_000' ...
Error: can't open file.
allocating memory...done
Error: can't open file.
Error: can't open file.
reordering...done.
space for particle ID freed
DONE
DONE
DONE
.
.
folder0 is being worked on...
.
.
DONE
I assume that the "Error: can't open file" message is a Python error. Has anyone tried to do something like this before?

Thanks for all your help