I've been using Python for years, yet creating new modules for distribution still confuses and eludes me, and it is REALLY frustrating. I need an expert to explain this.
I recently discovered that setting a global variable in a sub-module behaves differently depending on how the module is imported in the __init__.py file.
I have something like this set up:
prime.py
import sub
sub.somevar = False
sub.show()
sub.py
somevar = True
def show():
    global somevar
    print("somevar is {}".format(somevar))
This code produces the expected output:
somevar is False
when prime.py and sub.py are in the same directory. When I make sub a separate installable package with setuptools, it stops working.
The directory structure of my installable sub package looks like this:
# ls
setup.py sub README.md
# ls sub
__init__.py sub.py
Where __init__.py contains the text:
from .sub import *
This setup appears to work great for installing and distributing the package, for both Python 2 and Python 3.
The problem is that setting the global variable is now broken when I import it from prime.py: assigning somevar from prime.py no longer changes the value that the functions in sub.py see. I get the output:
somevar is True
What the hell is going on here? Why is Python module importing so finicky?
2 Answers
It is not related to your project structure, but only to how you import it.
I think it is easy to understand once you look at what from .sub import * actually does: it is effectively from .sub import somevar, which binds a new name somevar in the package's namespace, pointing at the same object as the one in sub.py.
Because that is a separate binding, sub.somevar = False in prime.py only rebinds the package-level name; the somevar inside your submodule is untouched, and that is the one show() reads.
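To see the two bindings side by side, here is a minimal sketch, assuming the installed package layout from the question (the extra import sub.sub is only there so the submodule can be reached by name from prime.py):
import sub       # the package; its __init__.py has already run `from .sub import *`
import sub.sub   # the submodule that actually defines somevar and show()

sub.somevar = False       # rebinds the name on the package namespace only
print(sub.somevar)        # False -- the package-level binding
print(sub.sub.somevar)    # True  -- the submodule's binding is untouched
sub.show()                # prints "somevar is True", because show() reads the
                          # global from sub/sub.py, not from the package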
But if somevar is a mutable object such as a list:
# In sub.py
somevar = [True]
and you change it in place from the importing side:
# In prime.py
import sub
sub.somevar[0] = False
then it will work, because both names still refer to the same list object, so the in-place change is visible through either one.
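For completeness, a sketch of how the two files from the question would look with this approach (the only assumed change on the sub.py side is indexing the list inside show()):
# sub.py
somevar = [True]

def show():
    print("somevar is {}".format(somevar[0]))

# prime.py
import sub

sub.somevar[0] = False   # mutate the shared list; no name is rebound
sub.show()               # prints "somevar is False", even through the package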
For a quick fix, you can simply use setter/getter functions:
prime.py
import sub
sub.set_somevar(False)
myFalseVar = sub.get_somevar()
sub.py
somevar = True
def get_somevar():
    # Reading the module-level name does not need a global statement
    return somevar

def set_somevar(var):
    # global is required here because we rebind the module-level name
    global somevar
    somevar = var
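Because set_somevar runs inside sub.py, the assignment rebinds the global in the module that actually owns it, so (assuming show() is still defined in sub.py as in the question) a subsequent sub.show() will print "somevar is False" as you originally expected.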