I have been running Nutch crawl commands for the past 3 weeks, and now I get the error below when I try to run any nutch command:
Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file: /tmp/hsperfdata_user/27050 Try using the -Djava.io.tmpdir= option to select an alternate temp location.
Error: Could not find or load main class ___.tmp.hsperfdata_user.27055
How do I solve this issue?
I think the temporary location that was being used has filled up. Try using some other location. Also check the number of free inodes on each partition and clear up some space; a partition can run out of inodes even when it still has free bytes.
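To confirm this, you can check both free space and free inodes on the partition holding /tmp, and see which entries are taking up the room (paths here are just the defaults; adjust if your temp dir differs):

```shell
# Free bytes on the partition that holds /tmp
df -h /tmp

# Free inodes on the same partition (IFree column) --
# JVM perf files like /tmp/hsperfdata_* are tiny but each costs an inode
df -i /tmp

# Largest entries under /tmp, to decide what to clear
du -sh /tmp/* 2>/dev/null | sort -rh | head
```

If IFree is at or near 0, deleting many small files (rather than a few big ones) is what frees the partition.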
EDIT: There is no need to change /tmp at the OS level. We want Nutch and Hadoop to use some other location for storing temp files. See this question for how to do that: What should be hadoop.tmp.dir ?
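As a sketch of that approach: you can point Hadoop's temp files at a roomier partition via hadoop.tmp.dir in conf/core-site.xml, and point the JVM's own temp files (including the hsperfdata files from the error) at the same place with -Djava.io.tmpdir. The /data/tmp path below is just a placeholder for whatever partition on your machine has free space:

```xml
<!-- conf/core-site.xml: Hadoop/Nutch temp files go here instead of /tmp -->
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/data/tmp/hadoop-${user.name}</value>
  </property>
</configuration>
```

```shell
# Redirect JVM temp files too, per the hint in the original warning message.
# NUTCH_OPTS is picked up by bin/nutch; the directory must already exist.
mkdir -p /data/tmp
export NUTCH_OPTS="-Djava.io.tmpdir=/data/tmp"
```

Both settings are needed if the partition is truly full: hadoop.tmp.dir covers MapReduce intermediate data, while java.io.tmpdir covers the JVM's shared-memory performance files mentioned in the error.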