Local / IDE-based development allows use of the most powerful IDEs [PyCharm in particular]. How can the notebooks, presently stored as JSON in our git repo, be converted to Jupyter/ipynb and/or Python in the local environment? We have many notebooks, so exporting them manually from Synapse is too tedious and time-consuming, and it is also not repeatable.
Is there any mechanism or command-line tool to perform the conversions?
In some cases, just renaming notebook.json to notebook.ipynb works. However, Azure Synapse files do not work like that, because Azure packs some platform-specific junk into the files. Just taking everything out of the properties object should almost always work.
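A minimal sketch of that approach (assuming the Synapse export keeps the notebook body under a top-level properties key, as mine do):

import json

with open("notebook.json") as f:
    synapse = json.load(f)

# Write out just the nested notebook body; Jupyter can often open the result as-is.
with open("notebook.ipynb", "w", encoding="utf-8") as f:
    json.dump(synapse["properties"], f, ensure_ascii=False, indent=4)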
But I ran into cases where metadata is missing from the cell objects, which seems to be a violation of the IPython/Jupyter notebook spec. So the following is a crude script I wrote to resolve this issue for me.
import json

# Load the Synapse notebook JSON; the actual notebook content lives under "properties".
with open("notebook.json") as f:
    in_json = json.load(f)

# Build a minimal nbformat 4 skeleton.
out_json = {
    "nbformat": 4,
    "nbformat_minor": 2,
    "metadata": {"language_info": {"name": "python"}},
}

# Copy the cells over, supplying the per-cell metadata field that is sometimes missing.
out_json["cells"] = [
    {"cell_type": cell["cell_type"], "metadata": {}, "source": cell["source"]}
    for cell in in_json["properties"]["cells"]
]

with open("notebook.ipynb", "w", encoding="utf-8") as f:
    json.dump(out_json, f, ensure_ascii=False, indent=4)
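To make this repeatable across many notebooks, the same logic can be wrapped in a small loop; the directory name below is just an example, point it at wherever your repo keeps the Synapse notebook JSON files.

import json
from pathlib import Path


def convert(src: Path, dst: Path) -> None:
    with open(src) as f:
        in_json = json.load(f)
    out_json = {
        "nbformat": 4,
        "nbformat_minor": 2,
        "metadata": {"language_info": {"name": "python"}},
        "cells": [
            {"cell_type": cell["cell_type"], "metadata": {}, "source": cell["source"]}
            for cell in in_json["properties"]["cells"]
        ],
    }
    with open(dst, "w", encoding="utf-8") as f:
        json.dump(out_json, f, ensure_ascii=False, indent=4)


# "notebook" is an example folder name; adjust to your repo layout.
for src in Path("notebook").glob("*.json"):
    convert(src, src.with_suffix(".ipynb"))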