I've reorganized my Python project under a single umbrella name. The project can now be seen as multiple subsystems that can depend on each other, which means every submodule can now be distributed on its own so that only the required dependencies need to be installed.
The old structure:
```
/
├─ myproj/
│  ├─ __init__.py
│  ├─ mod1.py
│  ├─ subpackage1/
│  └─ subpackage2/
└─ setup.py
```
The new structure:
```
/
├─ myproj/
│  ├─ common/
│  │  └─ mod1.py
│  ├─ subpackage1/
│  └─ subpackage2/
└─ setup.py
```
As you can see, not much has changed except that `myproj` is now a namespace package and that the sub-packages `common`, `subpackage1` and `subpackage2` can now be distributed independently.
Is it possible, while still keeping a single `setup.py` file, to create 3 independent packages?
`myproj.common`
`myproj.subpackage1`
`myproj.subpackage2`
I'd also like to specify that installing `myproj.subpackage1` requires `myproj.common`, and that `myproj.subpackage2` requires both `myproj.common` and `myproj.subpackage1`.
As Martijn Pieters stated, it is just Python code, so yes, you can do this. I don't think it would be all that difficult either.
Basically you just want to inspect and manipulate the command-line arguments in `setup.py`:
```python
import sys

if len(sys.argv) > 1 and sys.argv[1] == "subpackage1":
    # Remove the extra argument so the setup function works normally.
    sys.argv.pop(1)
    # Run setup code for subpackage1, or use a separate
    # setup file and call "import setup_subpackage1".
    ...
elif len(sys.argv) > 1 and sys.argv[1] == "subpackage2":
    # Remove the extra argument so the setup function works normally.
    sys.argv.pop(1)
    # Run setup code for subpackage2, or use a separate
    # setup file and call "import setup_subpackage2".
    ...
else:
    # Check whether "common" was given as an argument or left blank.
    if len(sys.argv) > 1 and sys.argv[1] == "common":
        # Remove the extra argument so the setup function works normally.
        sys.argv.pop(1)
    # Run setup code for common (the default when no name is given).
    ...
```
Once again, though, as Martijn Pieters stated, it is probably not worth the effort. Python's main philosophy is that simple is better than complex. If your sub-packages are completely different, then maybe they should be different projects.
Example: Scipy
I tried to think of an example of why not to do this, but apparently `scipy` does exactly this, so I may be wrong in trying to dissuade you. It is still probably not worth the effort, though, because most people just `pip install scipy`.
It's interesting: scipy's structure is very well thought out. Every sub-package is a regular Python package (a directory with an `__init__.py` file), and inside every such package is its own `setup.py` file. They also use `numpy.distutils.misc_util.Configuration` to add sub-packages.
If you look through their source code, scipy's main `setup.py` file looks like this:
```python
from __future__ import division, print_function, absolute_import

import sys


def configuration(parent_package='', top_path=None):
    from numpy.distutils.misc_util import Configuration
    config = Configuration('scipy', parent_package, top_path)
    config.add_subpackage('cluster')
    config.add_subpackage('constants')
    config.add_subpackage('fftpack')
    config.add_subpackage('integrate')
    config.add_subpackage('interpolate')
    config.add_subpackage('io')
    config.add_subpackage('linalg')
    config.add_data_files('*.pxd')
    config.add_subpackage('misc')
    config.add_subpackage('odr')
    config.add_subpackage('optimize')
    config.add_subpackage('signal')
    config.add_subpackage('sparse')
    config.add_subpackage('spatial')
    config.add_subpackage('special')
    config.add_subpackage('stats')
    config.add_subpackage('ndimage')
    config.add_subpackage('_build_utils')
    config.add_subpackage('_lib')
    config.make_config_py()
    return config


if __name__ == '__main__':
    from numpy.distutils.core import setup
    setup(**configuration(top_path='').todict())
```
So it looks like a good solution has already been found for you.
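As a rough illustration of the aggregation idea behind `add_subpackage`, the same merging can be sketched in plain Python without `numpy.distutils`: each sub-package contributes a small config dict and the top-level script combines them into one `setup()` call. The dict layout and every name here are hypothetical, not scipy's actual mechanism:

```python
# Hypothetical per-sub-package descriptions; in a real project each
# sub-package directory could provide its own.
SUBPACKAGES = {
    "common": {"packages": ["myproj.common"],
               "install_requires": []},
    "subpackage1": {"packages": ["myproj.subpackage1"],
                    "install_requires": ["myproj.common"]},
    "subpackage2": {"packages": ["myproj.subpackage2"],
                    "install_requires": ["myproj.common", "myproj.subpackage1"]},
}


def merged_config(subpackages):
    """Combine per-sub-package configs into kwargs for a single setup() call."""
    packages, requires = [], []
    for cfg in subpackages.values():
        packages.extend(cfg["packages"])
        for req in cfg["install_requires"]:
            if req not in requires:
                requires.append(req)
    # Requirements shipped inside the merged distribution are already satisfied.
    requires = [r for r in requires if r not in packages]
    return {"packages": packages, "install_requires": requires}
```

This mirrors why `pip install scipy` works as a single distribution: once everything is merged, the internal dependencies between the sub-packages disappear from `install_requires`.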