I have a long script which I run on a remote server and I want to log all outputs as well as error messages to a file.
# in script:
import sys
sys.stdout = open('./myfile.txt', 'w')
# ...
sys.stdout.close()
# in terminal:
python myscript.py > ./myfile.txt
This writes all print() outputs to a file, which I want. But it does not write error messages to the file in case it fails.
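One in-script option (a sketch, extending the snippet above; the filename myfile.txt is just the example name): point sys.stderr at the same file object as sys.stdout. The interpreter prints the traceback of an uncaught exception to sys.stderr, so it lands in the file as well.

```python
import sys

# Send both streams to the same file; tracebacks of uncaught
# exceptions are written to sys.stderr, so they are captured too.
log = open('./myfile.txt', 'w')
sys.stdout = log
sys.stderr = log

print("normal output")                      # goes to the file
print("an error message", file=sys.stderr)  # goes to the file too
```

Note that this only works if sys.stderr stays redirected for the whole run; restoring it before the script fails would send the traceback back to the terminal.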
import logging
try:
    1 / 0  # some operation that fails
except ZeroDivisionError as e:
    logging.error(e)  # ERROR:root:division by zero
The problem with all solutions based on the logging module is that I need to know where the error will occur in order to wrap it in logging.error(). But I don't always know where my errors will occur.
=> How do I write (1) all output and (2) the error that makes my script fail to a file?
Found the answer myself: just run this in the terminal; no changes in the script are needed, and the shell creates ./log.txt if it does not exist:

python script.py &> ./log.txt

Note that `&>` is a bash shorthand; the portable equivalent is `python script.py > ./log.txt 2>&1`.
Inspired by this question: Redirect stdout and stderr to same file using Python