I have a simple project that tries to illustrate how the Python path works. Illustrated below is my current project structure. main.py looks like this:
import pathlib
import sys
cwd = pathlib.Path(__file__).parent.resolve()
source_directory = cwd / 'depth_1' / 'depth_2' / 'depth_3'
sys.path.append(str(source_directory))
Each row_x_file.py simply contains one function,
def row_x_print():
    print("Inside row_x_file.py")
(With the x substituted for the correct number.) Each __init__.py is simply from . import *
Now, because I have added the path to depth_3 to sys.path, I can successfully run import row_1 without an error. However, I can never access the function exported from the __init__.py; i.e., I cannot run row_1_print() after import row_1, even though import row_1 itself succeeds. from row_1 import row_1_print does not seem to succeed either.
How do I make it so after successfully typing import row_1 I can run the function inside of row_1_file.py?
In that example, you do not need to append to the Python path at all.
To recreate the example, use these files:
d1/d2/d3/row_1/
    __init__.py
    test.py
main.py
and code:
main.py
#!python3
from d1.d2.d3.row_1 import test
if __name__ == "__main__":
    print("text")
    test.foo()
    test.bar()
__init__.py
X="module"
test.py
from . import X
def foo():
    print("yay it works")

def bar():
    print(f"value X: {X}")
You would get the output:
text
yay it works
value X: module
Now that we have a working example, to demonstrate how the path behaves, edit main.py to:
#!python3
import pathlib, sys
cwd = pathlib.Path(__file__).parent.resolve()
source_directory = cwd / 'd1' / 'd2' / 'd3'
sys.path.append(str(source_directory))
from row_1 import test
if __name__ == "__main__":
    print("text")
    test.foo()
    test.bar()
This example still works, but because the path has changed, the d1.d2.d3 prefix is no longer required when importing. Be careful not to shadow existing libraries' names, as it can be hard to pinpoint why seemingly fine code doesn't work.
Questions like What is __init__.py for? and the Python modules documentation might be useful for further research.
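To address the original symptom directly: `from . import *` in an `__init__.py` only copies names that are already bound on the package; it does not import submodules, which is why `row_1.row_1_print` never appears. A minimal sketch of the explicit re-export fix, using the file names from the question (the package is built in a temp directory here only so the snippet is self-contained):

```python
import sys
import tempfile
from pathlib import Path

depth_3 = Path(tempfile.mkdtemp()) / "depth_3"
pkg = depth_3 / "row_1"
pkg.mkdir(parents=True)

(pkg / "row_1_file.py").write_text(
    "def row_1_print():\n"
    "    print('Inside row_1_file.py')\n"
)
# Explicit re-export instead of `from . import *`; the star form only
# copies names already bound on the package, it does not import submodules.
(pkg / "__init__.py").write_text("from .row_1_file import row_1_print\n")

sys.path.append(str(depth_3))

import row_1
row_1.row_1_print()            # prints: Inside row_1_file.py
from row_1 import row_1_print  # now works too
```

With the explicit import in `__init__.py`, both access styles from the question succeed.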
Let's say I have a Python script file that uses sys.argv variables, and I want to test it with different sys.argv values in multiple tests. The issue is that I have to patch and import it multiple times, but I think it holds the sys.argv values only from the first import, so both test cases print the same values ['test1', 'Test1']. Is my approach wrong?
example.py
import sys
ex1 = sys.argv[0]
ex2 = sys.argv[1]
print(ex1)
print(ex2)
test_example.py
import unittest
import mock
import sys
class TestExample(unittest.TestCase):
    @mock.patch.object(sys, 'argv', ['test1', 'Test1'])
    def test_example1(self):
        import example

    @mock.patch.object(sys, 'argv', ['test2', 'Test2'])
    def test_example2(self):
        import example
The problem is that Python will not (unless you work hard at it) reimport the same file more than once.
What you should do is have the import only define stuff, and define a main() function in your module that gets called when it's time to run stuff. You call this main() function directly in the script when the script is called by itself (see the last two lines of example.py below), but then you can also call that function by name during unit testing.
For what it's worth, this is best practice in Python: your scripts and modules should mostly define functions, classes and whatnot on import, but not run much. Then you call your main() function inside a __name__ guard. See What does if __name__ == "__main__": do? for more details about that.
Here's your code modified to work, testing your module with two different sets of sys.argv values:
example.py:
import sys
def main():
    ex1 = sys.argv[0]
    ex2 = sys.argv[1]
    print(ex1)
    print(ex2)

if __name__ == "__main__":
    main()
test_example.py:
import unittest
import mock
import sys
import example
class TestExample(unittest.TestCase):
    @mock.patch.object(sys, 'argv', ['test1', 'Test1'])
    def test_example1(self):
        example.main()

    @mock.patch.object(sys, 'argv', ['test2', 'Test2'])
    def test_example2(self):
        example.main()
Results from python -m unittest:
..
----------------------------------------------------------------------
Ran 2 tests in 0.000s
OK
test1
Test1
test2
Test2
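Incidentally, the third-party `mock` package was merged into the standard library as `unittest.mock` in Python 3.3, so the same tests can be written without the extra dependency. A sketch (the example module is written to a temp directory here only so the snippet runs standalone; in a real project you would just `import example` and run `python -m unittest`):

```python
import sys
import tempfile
import unittest
from pathlib import Path
from unittest import mock  # stdlib replacement for the third-party `mock`

# Write the example module to a temp dir so this snippet runs standalone.
tmp = Path(tempfile.mkdtemp())
(tmp / "example.py").write_text(
    "import sys\n"
    "def main():\n"
    "    print(sys.argv[0])\n"
    "    print(sys.argv[1])\n"
    "if __name__ == '__main__':\n"
    "    main()\n"
)
sys.path.insert(0, str(tmp))
import example

class TestExample(unittest.TestCase):
    @mock.patch.object(sys, 'argv', ['test1', 'Test1'])
    def test_example1(self):
        example.main()

    @mock.patch.object(sys, 'argv', ['test2', 'Test2'])
    def test_example2(self):
        example.main()
```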
I'm refactoring some code and have moved around some files. But for backwards compatibility, I would like to make all of my modules keep their old import paths.
my file structure is as follows
--| calcs/
----| __init__.py
----| new_dir
------| new_file1.py
------| new_file2.py
What do I need to do to ensure that I can use an import like
import calcs.new_file1.foo
# OR
from calcs.new_file1 import foo
I have tried a few methods of adding the imports to the top-level __init__.py file, as is recommended here.
But while this seems to allow an import such as import calcs.new_file1, an import such as import calcs.new_file1.foo raises ModuleNotFoundError: No module named 'calcs.new_file1'.
I expect that I need Python to recognize calcs.new_file1 as a module. At the moment it seems to just be importing it as a class or some other object.
The only way I know how to do it is by creating a custom import hook.
Here is the PEP for more information.
If you need some help on how to implement one, I'll suggest you take a look at the six module,
here
and here
Basically your calcs/__init__.py will become like this:
''' calcs/__init__.py '''
from .new_dir import new_file1, new_file2
import sys

__path__ = []
__package__ = __name__

class CalcsImporter:
    def __init__(self, exported_mods):
        self.exported_mods = {
            f'{__name__}.{key}': value for key, value in exported_mods.items()
        }

    def find_module(self, fullname, path=None):
        if fullname in self.exported_mods:
            return self
        return None

    def load_module(self, fullname):
        try:
            return sys.modules[fullname]
        except KeyError:
            pass
        try:
            mod = self.exported_mods[fullname]
        except KeyError:
            raise ImportError('Unable to load %r' % fullname)
        mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

_importer = CalcsImporter({
    'new_file1': new_file1,
    'new_file2': new_file2,
})

sys.meta_path.append(_importer)
and you should be able to do from calcs.new_file1 import foo
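A lighter-weight alternative worth knowing (a sketch, not part of the answer above): since the modules already exist on disk, you can often skip the meta-path hook and simply register aliases in `sys.modules` from the package `__init__.py`. The layout below mirrors the question; it is created in a temp directory only so the snippet runs standalone.

```python
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
new_dir = root / "calcs" / "new_dir"
new_dir.mkdir(parents=True)
(new_dir / "__init__.py").write_text("")
(new_dir / "new_file1.py").write_text(
    "def foo():\n"
    "    return 'foo from new_file1'\n"
)

# calcs/__init__.py: import the relocated module, then register it in
# sys.modules under its old dotted name so the old import keeps resolving.
(root / "calcs" / "__init__.py").write_text(
    "import sys\n"
    "from .new_dir import new_file1\n"
    "sys.modules[__name__ + '.new_file1'] = new_file1\n"
)

sys.path.insert(0, str(root))

from calcs.new_file1 import foo  # the old import path still works
print(foo())                     # prints: foo from new_file1
```

The import system consults `sys.modules` before running any finders, so once the alias is registered by importing `calcs`, both `import calcs.new_file1` and `from calcs.new_file1 import foo` resolve without a custom hook.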
I am trying to run the following Python script:
#!/usr/bin/env python
#Author Jared Adolf-Bryfogle
#Python Imports
import os
import sys
from pathlib import Path
from typing import Union
#Append Python Path
p = os.path.split(os.path.abspath(__file__))[0]+"/src"
sys.path.append(p) #Allows all modules to use all other modules, without needing to update pythonpath
#Project Imports
from pic2.modules.chains.ProposedIgChain import ProposedIgChain
from pic2.modules.chains.IgChainSet import IgChainSet
from pic2.modules.chains.IgChainFactory import IgChainFactory
from pic2.modules.chains.AbChainFactory import AbChainFactory
from pic2.tools.fasta import *
class IgClassifyFASTA:
    """
    Identify CDRs from a Fasta File
    """

    def __init__(self, fasta_path: Union[Path, str]):
        self.fasta_path = str(fasta_path)
        self.outdir = os.getcwd()
        self.outname = "classified"
        self.fasta_paths = split_fasta_from_fasta(os.path.abspath(self.fasta_path), "user_fasta_split_" + self.outname, self.outdir)

    def __exit__(self):
        for path in self.fasta_paths:
            if os.path.exists(path):
                os.remove(path)

    def set_output_path(self, outdir: Union[Path, str]):
        self.outdir = str(outdir)

    def set_output_name(self, outname: Union[Path, str]):
        self.outname = str(outname)
My Python version is 3.8, and pic2 is a conda env. I get the following error:
File "IgClassifyFASTA.py", line 29
def __init__(self, fasta_path:Union[Path, str]):
SyntaxError: invalid syntax
I cannot figure out what's wrong with this line. Could you kindly give me a hint about what is wrong here? I would appreciate any help.
Best regards!
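A hint worth checking before anything else: an annotation like `fasta_path: Union[Path, str]` is Python 3-only syntax, and a SyntaxError pointing at it is the classic symptom of the file being executed by a Python 2 interpreter (for example, the `#!/usr/bin/env python` shebang resolving to a system 2.x rather than the conda env's 3.8). A quick diagnostic sketch:

```python
# Print which interpreter is actually running this file. Annotations such
# as `fasta_path: Union[Path, str]` are valid only on Python 3; under
# Python 2 they fail with exactly "SyntaxError: invalid syntax".
import sys

print(sys.version_info)  # the major version should be 3
print(sys.executable)    # path of the interpreter (conda env vs system)

assert sys.version_info[0] >= 3, "script is being run by Python 2"
```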
I have a project like
.
|-application
|---foo.py
|---bar.py
|-tests
|---test_foo.py
Contents are:
bar.py
def myfunc():
    return "hello from bar"
foo.py
from bar import myfunc
def do_work():
    return myfunc()

if __name__ == "__main__":
    print(do_work())
test_foo.py
from application import foo
class TestFoo:
    def test_foo_do_work(self):
        assert foo.do_work() == "hello from bar"
Running python application/foo.py works great, but running python -m pytest tests/ returns the error ModuleNotFoundError: No module named 'bar'.
In foo.py, one way to fix this is to change the import line from from bar import myfunc to from application.bar import myfunc. This makes pytest pass, but running python application/foo.py then gives ModuleNotFoundError: No module named 'application'.
I don't know what I'm doing wrong. I'm using Python 3.7, so I heard __init__.py is unnecessary.
After a little more googling, I have an answer. I created an __init__.py file inside the tests dir with the contents:
import os
import sys
project_path = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
app_path = os.path.join(project_path, 'application')
sys.path.append(app_path)
From Python nested subpackage doesn't import when running tests
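An alternative often used with pytest (a sketch, not from the linked answer): put the path setup in a conftest.py at the project root, which pytest imports before collecting any tests, so the edits apply to every test file. The demo below recreates the question's layout in a temp directory just so it runs standalone; the two sys.path lines are what the conftest.py would contain.

```python
import os
import sys
import tempfile

# Recreate the question's layout in a temp dir (standalone demo only).
project = tempfile.mkdtemp()
app = os.path.join(project, "application")
os.makedirs(app)

with open(os.path.join(app, "bar.py"), "w") as f:
    f.write("def myfunc():\n    return 'hello from bar'\n")
with open(os.path.join(app, "foo.py"), "w") as f:
    f.write("from bar import myfunc\n\ndef do_work():\n    return myfunc()\n")

# These two lines are all a root-level conftest.py would need:
sys.path.append(project)  # lets `from application import foo` resolve
sys.path.append(app)      # lets foo.py's `from bar import myfunc` resolve

from application import foo
print(foo.do_work())  # prints: hello from bar
```

Because `application/` has no `__init__.py`, it is imported here as a Python 3 namespace package, which matches the "I heard __init__.py is unnecessary" setup in the question.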
I have a module that uses enums defined in the same package. I want to run it locally for self-testing and also access it from other packages, but I can't get the import for the enum to handle both situations.
I have tried a blank __init__.py in both the sub and the proj directories, with no apparent change.
I have tried variations of python -m without success.
Here is the minimal code that shows my problems:
/my/proj
|----sub
| |---e1.py
| |---one.py
| |---two.py
|-p1.py
|-p2.py
----
$ cat /my/proj/sub/e1.py
from enum import Enum
class UsefulEnums(Enum):
    ZERO = 0
----
$ cat /my/proj/sub/one.py
from e1 import UsefulEnums as E
def profound():
    print('The value of {} is {}'.format(E.ZERO.name, E.ZERO.value))

if __name__ == '__main__':
    profound()
/my/proj/sub$ python one.py
The value of ZERO is 0
----
$ cat /my/proj/sub/two.py
# note the dot before the module name. No other change from one
from .e1 import UsefulEnums as E
def profound():
    print('The value of {} is {}'.format(E.ZERO.name, E.ZERO.value))

if __name__ == '__main__':
    profound()
/proj/sub$ python two.py
Traceback (most recent call last):
File "two.py", line 1, in <module>
from .e1 import UsefulEnums as E
ModuleNotFoundError: No module named '__main__.e1'; '__main__' is not a package
----
$ cd /my/proj
/my/proj$ cat p1.py
import sub.one as a
if __name__ == '__main__':
    a.profound()
/my/proj$ python p1.py
Traceback (most recent call last):
File "p1.py", line 1, in <module>
import sub.one as a
File "/home/esmipau/delete_me/proj/sub/one.py", line 1, in <module>
from e1 import UsefulEnums as E
ModuleNotFoundError: No module named 'e1'
----
/my/proj$ cat p2.py
import sub.two as a
if __name__ == '__main__':
    a.profound()
/my/proj$ python p2.py
The value of ZERO is 0
When run from the 'sub' directory, one.py works as expected, two.py fails with a 'ModuleNotFoundError' error as detailed above.
When imported in parent code and run from the parent directory, two.py now works, and one.py fails with a different 'ModuleNotFoundError' error.
I would like a three.py in the 'sub' directory that uses the enums defined in e1.py, and which can be run locally for self testing etc, and can be included from external module not in the same directory.
--- edit for close as duplicate suggestions ---
This is not the same question as others that have been suggested such as Correct import and package structure now that __init__.py is optional as I require a way for one module to import another in the same directory regardless of whether the module is being executed locally or imported from another module.
I had a similar issue and this solved it:
Try creating the __init__.py file like this:
import my.proj.sub.e1
and then, in the file where you want to use it, add:
from my.proj.sub import UsefulEnums as E
Given the structure :
/proj
|---/pkg
| |---> usefulEnums.py
| |---> usefulCode.py
|--> myProj.py
I want to be able to run usefulCode.py in self-test mode, and I want to import and use it in myProj.
$ cat proj/pkg/usefulEnums.py
from enum import Enum
class UsefulEnums(Enum):
    ZERO = 0
$ cat proj/pkg/usefulCode.py
if __package__ is None:
    # works with python <path/>tcp.py
    from usefulEnums import UsefulEnums as E
else:
    # works with python -m pkg.tcp
    # or when imported by another module
    from pkg.usefulEnums import UsefulEnums as E

# This works as well and appears more portable and pythonic,
# but the logic that makes it work is obtuse and fragile,
# and it doesn't work if __package__ is None (even though it should!)
# from .usefulEnums import UsefulEnums as E

def profound():
    print('The value of {} is {}'.format(E.ZERO.name, E.ZERO.value))

if __name__ == '__main__':
    profound()
The main project code is as expected:
$ cat proj/myProj.py
from pkg.usefulEnums import UsefulEnums as E
import pkg.usefulCode as u
if __name__ == '__main__':
    u.profound()
The above is the workaround I am accepting as the answer to my stated question, but the reasoning eludes me.
I do not understand why the 'dot' import does not behave as expected when __package__ is not defined. Surely it always means 'import relative to the current package', and if no package is explicitly defined, why doesn't it mean 'import from the same directory as the current module', which could be argued to be the default package?
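The reasoning can be made visible (a sketch): a relative import is resolved against __package__, and when a file is run directly as a script the interpreter records no parent package at all ('' in Python 3, None in some older versions), so there is nothing to resolve the leading dot against. Running the same file with python -m makes the import system set __package__, giving relative imports their anchor. A throwaway package created in a temp directory demonstrates the difference:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
pkg = root / "pkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "show.py").write_text("print('package:', repr(__package__))\n")

# Run as a plain script: no package context, so `from .x import y`
# would have no anchor and would fail here.
r1 = subprocess.run([sys.executable, str(pkg / "show.py")],
                    capture_output=True, text=True)
print(r1.stdout, end="")  # typically: package: ''

# Run with -m: the import system sets __package__, so relative imports
# inside show.py would resolve against 'pkg'.
r2 = subprocess.run([sys.executable, "-m", "pkg.show"],
                    capture_output=True, text=True, cwd=root)
print(r2.stdout, end="")  # prints: package: 'pkg'
```

This is also why the `if __package__ is None:` guard in usefulCode.py is fragile: on current Python 3 a directly-run script sees `''` rather than `None`, so a robust guard usually checks for both falsy values.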