Say I have a Python project that looks like this:
project/
├── package/
│   ├── __init__.py
│   └── module1.py
└── main.py
I would like package/ to be self-contained, even though it is called from main.py.
Here are the contents of package/module1.py:
def some_fn():
    print("hi")
Now say I want to call some_fn inside __init__.py. I have the following options:
1:
import module1
- Benefit: imports when executing inside package/
- Drawback: fails to import when executing outside of package/
2:
import package.module1
- Benefit: imports when executing outside of package/
- Drawback: fails to import when executing inside of package/
3:
import os
import sys
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
import module1
- Benefit: covers both cases
- Drawback: ugly
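For reference, the failure mode of option 1 can be reproduced with a small, self-contained sketch. It rebuilds the layout above in a temporary directory (file names mirror the question; nothing here is assumed beyond that layout):

```python
# Recreates the layout in a temp dir with `import module1` in __init__.py
# (option 1) and shows it resolves from inside package/ but breaks when
# the package is imported from the project root.
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    pkg = os.path.join(tmp, "package")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "module1.py"), "w") as f:
        f.write("def some_fn():\n    print('hi')\n")
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write("import module1\n")

    # Inside package/, the bare `import module1` resolves via the cwd:
    inside = subprocess.run(
        [sys.executable, "-c", "import module1; module1.some_fn()"],
        cwd=pkg, capture_output=True, text=True)

    # From the project root, importing the package runs __init__.py,
    # whose bare `import module1` cannot be found on sys.path:
    outside = subprocess.run(
        [sys.executable, "-c", "import package"],
        cwd=tmp, capture_output=True, text=True)

    print(inside.stdout.strip())                    # hi
    print("ModuleNotFoundError" in outside.stderr)  # True
```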
Is there a preferred or idiomatic way of doing this?
1 Answer
That relative imports work differently depending on whether you are in a package context or not is one of the 'features' of Python that annoys me, and I have never really understood the need or reason for it (anyone who knows, please comment). It seems like a workaround for some sort of 'painting oneself into a corner' mistake.
In any event, that's how it works in Python. Here's a question on SO with a bunch of different answers and approaches, so I don't see a need to reiterate what is already there.
My personal approach is to accept that once I have decided to go the package route, I commit to it and use package-style imports for all use cases. One situation where that becomes a challenge is if you have multiple modules that can be run as standalone scripts, e.g., with an if __name__ == "__main__": check, and you want to make a package from those modules. In that scenario, I've taken to changing the if statement to a function definition and then using the excellent typer library to create a CLI that calls those functions. The effort required to create a CLI with typer is so low that I don't even bother with other approaches.
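A minimal sketch of that refactor, using only the question's some_fn (the typer registration is shown in comments rather than executed, since typer is a third-party dependency):

```python
# The body of the old `if __name__ == "__main__":` block becomes a named
# function, which a CLI framework (the answer uses typer) can then expose
# as a command.
def some_fn():
    print("hi")


def main():
    # This code used to live directly under `if __name__ == "__main__":`.
    some_fn()


# With typer installed, the registration would be roughly:
#   import typer
#   app = typer.Typer()
#   app.command()(main)
#   if __name__ == "__main__":
#       app()
if __name__ == "__main__":
    main()
```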
Comments:
- Consider a __main__ module, and don't resort to sys.path manipulation; that gets really difficult to debug. Instead, define a shared/global module namespace hierarchy. You can define multiple command-line entrypoints, and can re-export objects under different names. You can also use relative imports like from . import module1, so that a module doesn't have to know its place in the module hierarchy, which makes it easier to move a package around. The drawback of scripts/entrypoints is that your project must be installed, ideally in a venv. Modern tools like poetry or uv make this a lot easier than pip.
- I see how __main__.py will enable you to run python -m my_module, but does it help with relative imports? The problem with from . import module1 is that I cannot use the module independently; I would get the error: ImportError: attempted relative import with no known parent package
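Both sides of that trade-off can be demonstrated in one self-contained sketch: the relative import works when the package is imported as a package, and raises the quoted ImportError when __init__.py is run as a plain script (the layout is rebuilt in a temporary directory; names mirror the question):

```python
# Shows that `from . import module1` works via a package import but fails
# when the module is executed directly, with no known parent package.
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    pkg = os.path.join(tmp, "package")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "module1.py"), "w") as f:
        f.write("def some_fn():\n    print('hi')\n")
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write("from . import module1\n")

    # Importing the package from the project root works; the relative
    # import inside __init__.py resolves against the package:
    ok = subprocess.run(
        [sys.executable, "-c", "import package; package.module1.some_fn()"],
        cwd=tmp, capture_output=True, text=True)

    # Running __init__.py as a top-level script gives it no parent
    # package, so the relative import fails:
    bad = subprocess.run(
        [sys.executable, os.path.join(pkg, "__init__.py")],
        capture_output=True, text=True)

    print(ok.stdout.strip())             # hi
    print("ImportError" in bad.stderr)   # True
```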