deeplearning4j

Reduce DeepLearning4j dependency size of exported jar


In my application, I would like to use Deeplearning4j. Deeplearning4j has over 120 MB of dependencies, which is a lot considering my own code is only 0.5 MB.


Is it possible to reduce the dependencies required? Would loading an already-trained network allow me to ship my application with a smaller file size?


Solution

  • There are many ways to reduce the size of your jar, depending on your use case. This is covered in more detail in the recent docs, but I'll summarize some things to try here:

    1. DL4J is heavily based on JavaCPP. You can add -Djavacpp.platform=$YOUR_PLATFORM (linux-x86_64, windows-x86_64, ...) to your build to reduce the number of native binaries that end up in the jar. (See the first Maven sketch after this list.)

    2. If you are using deeplearning4j-core, that pulls in a lot of extra dependencies you may not need. In that case, deeplearning4j-nn may be all you need for the network configuration (see the second sketch below). Likewise, if you are only using SameDiff, you do not need the DL4J APIs at all. I don't know enough about your use case to confirm what you do and don't need, though.

    3. If you are deploying on an embedded platform, there is now also the ability to reduce the number of supported operations and data types. This feature is mainly for advanced users right now (it involves building from source), but if you think it could be applicable on top of the first two points, please confirm and I can try to clarify that a bit.
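
For point 1, assuming a Maven build (the question doesn't say, so treat this as a sketch), the flag is normally passed on the command line, e.g. `mvn -Djavacpp.platform=linux-x86_64 clean package`. An alternative with roughly the same effect is to depend on the plain `nd4j-native` backend plus an explicit platform classifier instead of the `-platform` artifact; the version below is a placeholder:

```xml
<!-- Sketch (assumption: Maven build, placeholder version): pin the ND4J CPU backend
     to a single platform instead of bundling native binaries for every OS/architecture. -->
<dependencies>
  <!-- Java API of the native backend -->
  <dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native</artifactId>
    <version>1.0.0-M2.1</version>
  </dependency>
  <!-- Native binaries for one platform only -->
  <dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native</artifactId>
    <version>1.0.0-M2.1</version>
    <classifier>linux-x86_64</classifier>
  </dependency>
</dependencies>
```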
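For point 2, a minimal sketch of swapping `deeplearning4j-core` for `deeplearning4j-nn` (same assumptions: Maven, placeholder version; use whatever DL4J version your project is on):

```xml
<!-- Sketch: deeplearning4j-nn contains the network configuration and model classes;
     deeplearning4j-core pulls in additional data-pipeline dependencies you may not
     need just to load and run an already-trained network. -->
<dependency>
  <groupId>org.deeplearning4j</groupId>
  <artifactId>deeplearning4j-nn</artifactId>
  <version>1.0.0-M2.1</version>
</dependency>
```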