I'm using Python in a Cygwin environment to develop data processing scripts and the Python packages those scripts depend on. I'd like to keep actively using the scripts while also updating the packages. My question: what is the best practice, or recommended approach, for managing the module search path so I can isolate and test my development changes without affecting a working production script?
Python searches for modules in the following order (see M. Lutz, Learning Python):

- The home directory of the program.
- `PYTHONPATH` directories.
- Standard library directories.
- The contents of any `*.pth` files.
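For reference, the effective search path that results from all of these sources can be inspected at runtime through `sys.path`; a quick sketch:

```python
import sys

# Print each entry of the module search path, in the order Python
# consults it. Entries contributed by PYTHONPATH and *.pth files
# show up here alongside the standard library directories.
for entry in sys.path:
    print(entry)
```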
My current solution is to install my packages into a local site-packages directory (not under /usr/lib/python2.x/) and add a `*.pth` file to the global site-packages directory so these are loaded by default. In the development directory I then simply modify `PYTHONPATH` to load the packages I'm actively working on, with my local changes.
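For illustration, a minimal sketch of that setup; the paths (`~/lib/python/site-packages`, the `mypkgs.pth` file name, and the `~/devel` working directory) are assumptions, not my actual values:

```sh
# Install packages into a local site-packages directory (hypothetical path).
mkdir -p ~/lib/python/site-packages
python setup.py install --install-lib="$HOME/lib/python/site-packages"

# Register that directory globally via a *.pth file (name is arbitrary):
# each line of a .pth file is appended to sys.path at interpreter startup.
echo "$HOME/lib/python/site-packages" > /usr/lib/python2.x/site-packages/mypkgs.pth

# In the development directory, prepend the working copies so they
# shadow the installed versions for this shell session only.
export PYTHONPATH="$HOME/devel/mypackage:$PYTHONPATH"
```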
Is there a more standard way of handling this situation, such as setting up a virtualenv or some other way of manipulating the module load path?
1 Answer
This is just my opinion, but I would probably use a combination of virtualenvs and Makefiles/scripts in this case. I haven't done it for your specific use case, but I often set up multiple virtualenvs for a project, each with a different Python version, and then use Makefiles to run my code or tests in one or all of them. It wouldn't be too hard to set up a Makefile that lets you type `make devel` to run in the development environment and `make production` for the production environment.
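A minimal sketch of such a Makefile, assuming virtualenvs already created at `venv-devel` and `venv-prod` and an entry-point script `process_data.py` (all hypothetical names):

```make
# Hypothetical virtualenv locations and entry-point script.
DEVEL_ENV = venv-devel
PROD_ENV  = venv-prod
SCRIPT    = process_data.py

# Run with the development environment's interpreter, prepending the
# working tree so in-progress package changes are picked up first.
devel:
	PYTHONPATH=$(CURDIR) $(DEVEL_ENV)/bin/python $(SCRIPT)

# Run with the production environment's interpreter and only the
# packages installed into that virtualenv.
production:
	$(PROD_ENV)/bin/python $(SCRIPT)

.PHONY: devel production
```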
Alternatively, you could use git branches for this. Keep your production scripts on master, and use feature branches to isolate and test changes while keeping your production scripts just a `git checkout master` away.
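The workflow would look roughly like this (the feature branch name is arbitrary):

```sh
# Start an isolated line of development off the production branch.
git checkout master
git checkout -b fix-parser     # hypothetical feature branch name

# ... edit, test, and commit on the feature branch ...

# Switch back to the known-good production scripts at any time.
git checkout master

# Once the changes are validated, fold them into production.
git merge fix-parser
```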