I'm having a problem getting Python to find the installed libraries when I run it on a computer cluster.
When I try, e.g., to load numpy in this script:
#file: /home/foo/test.py
import numpy
print numpy.__version__
on the server, I get this:
foo@abax:~$ python test.py
1.4.1
but when I try to run the same thing on a node via a remote shell, I get an error:
foo@abax:~$ rsh -l foo ab01 "python test.py"
Traceback (most recent call last):
File "test.py", line 2, in <module>
import numpy
ImportError: No module named numpy
Is there a way to tell Python to load the libraries that are installed on the central node of the cluster?
First things to check:

- `PYTHONPATH` on both the frontal server and the cluster nodes, to make sure there is no inconsistency;
- `numpy.__file__` on the frontal server, to see where it finds numpy. Then explore the filesystem of the cluster nodes a bit to see whether numpy can be found in the same place (if not, run a search to see if you can find it, then update your `PYTHONPATH` accordingly).

It may just be that numpy is installed locally on the frontal server but not on the cluster nodes. In that case you will need to install numpy yourself on a filesystem that the cluster nodes can access (note that on a scientific cluster, it would be better to ask the cluster admins to install numpy on the nodes, to make it available to everyone).
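To compare the two environments, a small diagnostic script like the one below (just a sketch; run it directly on the frontal server and via `rsh` on a node, as in the question) prints where Python is looking for modules and where, if anywhere, it finds numpy:

```python
import os
import sys

# Print the module search path and PYTHONPATH so the output can be
# compared between the frontal server and a cluster node.
print(sys.path)
print(os.environ.get("PYTHONPATH", "(PYTHONPATH not set)"))

# If numpy imports successfully, show where it was found.
try:
    import numpy
    print(numpy.__file__)
except ImportError:
    print("numpy not found on this machine")
```

Any difference between the two outputs (a missing entry in `sys.path`, or numpy resolving to a local-only directory such as one under `/usr/local`) points at the culprit.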
If the platforms are similar enough, copying the numpy folder from the frontal server to somewhere in the shared filesystem (e.g. a subfolder of your home dir, which you would add to your `PYTHONPATH`) might work, but a clean install is to be preferred.
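As a sketch of the `PYTHONPATH` route (the path below is hypothetical; substitute wherever you install or copy numpy on the shared filesystem), the directory can also be added at runtime with the standard `site.addsitedir`, which is equivalent to extending `PYTHONPATH` for that run:

```python
import site
import sys

# Hypothetical directory on the shared filesystem where numpy was
# installed or copied; adjust to your cluster's layout.
shared_site = "/home/foo/local/lib/python/site-packages"

# addsitedir appends the directory to sys.path (and processes any
# .pth files found in it), so imports will also search there.
site.addsitedir(shared_site)
print(shared_site in sys.path)
```

Setting `PYTHONPATH` in your shell startup file is usually cleaner, since it applies to every script without modification.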