Is there a way to find out which imports are taking the longest in Python? Looking at the output of python -m cProfile <script>, it doesn't seem to include import statements (understandably, given potentially huge dependency trees). Initially I thought it did, because I saw a row for __import__() calls, but that seems to come from code explicitly calling it: toy programs containing only import statements don't show such a row.
Right now I'm just wrapping each import like this:

import time

start = time.time()
import <module>
print('<module> / time: %f' % (time.time() - start))
but this doesn't recurse, so I can't see which import within an import is inflating the time.
This is a totally legitimate question. For instance, it makes sense to try to reduce the cold-start time of CLI applications. Python 3.7 added an option to print import times:
You can either run:
python -X importtime myscript.py
or:
PYTHONPROFILEIMPORTTIME=1 python myscript.py
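The report goes to stderr, one line per imported module, showing the module's own ("self") time and its cumulative time in microseconds, with nesting indicated by indentation. It looks roughly like this (the numbers here are illustrative and vary by machine):

```
import time: self [us] | cumulative | imported package
import time:       152 |        152 |   zipimport
import time:       621 |        621 |   _frozen_importlib_external
```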
EDIT: to visualize those results, I recommend tuna.
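A typical workflow (a sketch, assuming tuna has been installed with pip install tuna) is to capture the importtime report, which is written to stderr, and then open the file in tuna:

```shell
# importtime writes its report to stderr, so redirect stderr to a file;
# "import json" here is just a stand-in for your own script
python -X importtime -c "import json" 2> import.log

# then visualize it interactively (opens a browser tab):
# tuna import.log
```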