I'm refactoring the directory structure of a large Python repo that has a few dozen C and C++ files littered throughout. Is there a suggested architecture for a Python package that contains C/C++ code? What I have at the moment for my repo foo is:
foo/
├── LICENSE
├── Makefile
├── README.md
├── docs/
├── examples/
├── foo/
│   ├── __init__.py
│   ├── README.md
│   ├── core/
│   │   ├── __init__.py
│   │   ├── foo_tricks_slow.py
│   │   ├── foo_tricks_fast.cpp
│   │   └── foo_tricks_fast.mk
│   ├── experiments/
│   └── utils/
├── requirements.txt
├── setup.py
└── tests/          # Run with py.test
    ├── __init__.py
    └── test_foo.py
Each subdirectory (core/, experiments/, utils/, and so on) contains mostly Python files, plus some C/C++ files.
Some things I've heard mixed opinions on that I'd like to reach a conclusion on include:
- Putting tests outside the Python package
- Putting source in a src/ dir
- Keeping one directory for Python and one directory for C/C++
Some sources of info I've looked at include:
A. Filesystem structure of a Python project by Jean-Paul Calderone
B. cookiecutter
D. This SO question is close, but I don't have a self-contained C++ project; rather, I have miscellaneous files, each with their own .mk file that gets accumulated when running the main Makefile.
E. Example repositories like TensorFlow, Keras, NuPIC, Django
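For what it's worth, the accumulation scheme described in D can be sketched as a top-level Makefile that discovers and includes every per-file .mk fragment. The variable and fragment contents below are assumptions about how such a setup typically looks, not my repo's actual rules:

```make
# Sketch of the .mk accumulation pattern from D (names are assumptions).
# Each fragment, e.g. foo/core/foo_tricks_fast.mk, appends its own
# object to a shared list:
#
#   OBJS += foo/core/foo_tricks_fast.o
#
# and the top-level Makefile pulls every fragment in:

MK_FRAGMENTS := $(shell find foo -name '*.mk')
include $(MK_FRAGMENTS)

all: $(OBJS)

# Generic pattern rule compiling each accumulated C++ source.
%.o: %.cpp
	$(CXX) -O3 -fPIC -c $< -o $@
```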
Related: I would like the repo to be installed and built with pip install foo. I'm unsure how this would also build the C/C++ files, though; pip install tensorflow accomplishes this, for example. Perhaps that's for a follow-up SO question.