I've searched for a solution here and on other sites, but it feels like all import problems I come across are subtly different.
I have a project with the following setup:
/
    __init__.py
    package1/
        __init__.py
        a.py
        b.py
        tests/
            test_a.py
            test_b.py
    package2/
    package3/
In b.py:
from .a import Foo
In the tests:
import a, b
package1, package2, and package3 are essentially smaller packages that are bundled together in the same project/super-package as utilities. The purpose of this project is to be nested inside another package (say, package4) and to have these packages/modules imported by package4. Hence, relative imports to other files in the package are required, if I don't want to modify the path.
As an example, package4:
/
    main.py
    src/
        external/
            project_from_above/
                package1/
                package2/
                package3/
I'm omitting the __init__.py's in the hierarchy above. In main.py, I might do:
import src.external.project_from_above.package1.a
My problem: this structure works fine, except for unit testing. I am in the habit of running python3 -m unittest discover tests from each package (package1, package2, package3). This works fine when there are no relative imports. However, running with relative imports yields the following error: "SystemError: Parent module '' not loaded, cannot perform relative import"
I desire:
A way of running the unit tests in package1/tests from the package1 directory, with no imports changing (or at least, maintaining the ability to use this entire project inside the aforementioned package4 as a sub-package). I'd like to avoid any manipulation of the path, but if we can restrict it to a run_tests.py file in package1, then that is okay.
Here's one solution: add a shell script called run_tests.sh to package1. In it, do the following:
cd ..
python -m unittest discover package1/tests
This requires you to use absolute imports in your tests (e.g., import package1.a)
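Alternatively, if you'd rather keep everything in a single run_tests.py inside package1 (as allowed above), a minimal sketch could look like the following; the path manipulation stays confined to this one file, and the tests keep using absolute imports such as import package1.a:

```python
# run_tests.py -- sketch of a self-contained runner placed in package1/
import os
import sys
import unittest

def main():
    # Add the directory *above* package1 to sys.path so the tests can
    # use absolute imports such as "import package1.a".
    here = os.path.dirname(os.path.abspath(__file__))
    parent = os.path.dirname(here)
    if parent not in sys.path:
        sys.path.insert(0, parent)
    # Discover and run everything in package1/tests.
    suite = unittest.TestLoader().discover(os.path.join(here, "tests"))
    unittest.TextTestRunner().run(suite)

if __name__ == "__main__":
    main()
```

This keeps the invocation down to python3 run_tests.py from inside package1, while the rest of the project stays untouched.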
I'm trying to clean up a Python project on PyPI to make it installable. It has the following structure (very simplified):
project_x/
    project_x/
        pack_a/
            __init__.py
            mod_a1.py
            mod_a2.py
        pack_b/
            __init__.py
            mod_b.py
    setup.py
    README
    ...
In mod_a1.py there is something like from pack_a.mod_a2 import something.
So I added the line package_dir={"": "project_x"} to the setup.py for this to work.
However, mod_b.py uses relative imports: from ..pack_a.mod_a import ....
This fails with
ImportError: attempted relative import beyond top-level package
What is the correct way to handle this? Avoid using relative imports? Writing full paths, such as from project_x.pack_a import ..., everywhere? Or something smart in the setup.py?
The problem seems to be that pack_a (and its modules) assumes itself to be a top-level package, while pack_b (and its modules) assumes itself to be a sub-package of project_x.
You cannot have both. Instead, you will need to reconcile their implementation.
If you use package_dir={"": "project_x"}, then the two packages should use absolute imports when referring to each other (internally, each can use either relative or absolute imports). Please note that this option also means that project_x will not exist as an importable item, i.e. users will NOT be able to execute the following statements:
import project_x
from project_x import pack_a, pack_b
If you remove package_dir, then any statement in the form of:
from pack_a import ... # => it should be from project_x.pack_a ...
from pack_b import ... # => it should be from project_x.pack_b ...
will be wrong because pack_a and pack_b are not top-level packages.
Instead you could use relative imports, or fix the absolute imports to start with project_x.
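One way to realize that second option (drop package_dir and treat project_x itself as the top-level package) could be sketched in setup.py as below. This is a sketch, and it assumes project_x/project_x/ contains an __init__.py, which is not shown in the listing above:

```python
# setup.py -- sketch: without package_dir, project_x itself is the
# top-level package, and pack_a / pack_b are sub-packages of it.
# Assumes project_x/project_x/__init__.py exists.
from setuptools import setup, find_packages

setup(
    name="project_x",       # placeholder metadata
    version="0.1.0",
    packages=find_packages(),  # finds project_x, project_x.pack_a, project_x.pack_b
)
```

With this layout, mod_a1.py would import its sibling as from project_x.pack_a.mod_a2 import something (or from .mod_a2 import something), and the existing relative imports in pack_b keep working unchanged.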
I have the following file structure:
parentfolder/
    utils.py
    myProgram/
        main.py
        other.py
I will be running main.py, which uses other.py, which in turn needs to use everything in utils.py (NOT just import one method from utils.py at a time; there are global variables and functions that call other functions within that file).
I have tried lots of different examples online involving sys.path, etc., along with adding __init__.py to none, some, and all directories. None of them worked for me.
How do I go about importing utils.py within other.py?
If I need to create __init__.py files, could you also specify where they need to be created and whether anything needs to be placed in them? Do I need to run them once before running main.py the first time?
Thank you so much for any help in advance
Yes, you should add __init__.py files, as in:
parentfolder/
    __init__.py
    utils.py
    myProgram/
        __init__.py
        main.py
        other.py
They can be empty (or, better, contain a docstring describing the package contents), but you do not need to run them.
The correct way is to run your script from the parent directory of parentfolder, using the module path:
$ cd parentfolder/..
$ python -m parentfolder.myProgram.main
This way, a from parentfolder import utils statement in other.py will work without the sys.path hack, which can lead to subtle bugs.
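A self-contained demonstration of this layout, built in a temporary directory; GREETING and message are hypothetical names used only for the demo:

```python
# Build the parentfolder/myProgram layout in a temp dir, then run it
# exactly as recommended: python -m parentfolder.myProgram.main
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as root:
    pkg = os.path.join(root, "parentfolder")
    sub = os.path.join(pkg, "myProgram")
    os.makedirs(sub)
    # Both directories become packages via empty __init__.py files.
    for d in (pkg, sub):
        open(os.path.join(d, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "utils.py"), "w") as f:
        f.write("GREETING = 'hello from utils'\n")
    with open(os.path.join(sub, "other.py"), "w") as f:
        f.write("from parentfolder import utils\n"
                "def message():\n"
                "    return utils.GREETING\n")
    with open(os.path.join(sub, "main.py"), "w") as f:
        f.write("from parentfolder.myProgram import other\n"
                "print(other.message())\n")
    # Run from the directory *above* parentfolder, with the -m switch.
    out = subprocess.run(
        [sys.executable, "-m", "parentfolder.myProgram.main"],
        cwd=root, capture_output=True, text=True)
    print(out.stdout.strip())  # -> hello from utils
```

The key point is the working directory: because python -m is invoked from the directory above parentfolder, the absolute import from parentfolder import utils resolves without any sys.path manipulation.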
I have the following directory structure:
py_test
├── __init__.py
├── dir1
│   ├── __init__.py
│   └── script1.py
└── dir2
    ├── __init__.py
    └── script2.py
In script2 I want to "import ..\script1".
What I tried in script2:
Does not work
from ..dir1 import script1
ImportError: attempted relative import with no known parent package
Works
import sys, os

path2add = os.path.normpath(os.path.abspath(
    os.path.join(os.path.dirname(__file__), os.path.pardir, 'dir1')))
if path2add not in sys.path:
    sys.path.append(path2add)
If I want to go with option 1, what is the simplest (i.e., with the least files) file/dir structure that makes it work?
I am aware of this, but I wonder if creating that directory structure can be avoided while still using a type-1 import.
I am currently using this workaround, which uses type-2 import.
Related:
How to import a Python class that is in a directory above?
Import a module from a directory (package) one level up
Getting "ImportError: attempted relative import with no known parent package" when running from Python Interpreter
Using importlib to dynamically import module(s) containing relative imports
How to import variables in a different python file
As mentioned in the comments, attempting to import modules a directory up will not work if script2.py is your entry point.
As mentioned in this link you included:
If the module's __name__ does not contain any package information (e.g., it is set to __main__), then relative imports are resolved as if the module were a top-level module, regardless of where the module is actually located on the file system.
The module's __name__ is set to __main__ if it is the entry point, or the one that you pass to the interpreter with something like python script2.py.
Any Python module run this way no longer has the package information needed to import files from higher directories.
Therefore you have two options:
Option 1: Keep using the workaround with sys.path.append
This will work how you want it to, but it is rather cumbersome.
Option 2: Make your entry point at the top level
Assuming your package has more than one script that needs to be run, you could create a new file that imports both script1 and script2 and then calls the functionality you want based on a command line argument. Then you will be able to keep your current directory structure and have your relative imports work just fine, without any kind of fiddling with sys.path.
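Option 2 can be demonstrated end to end, building the py_test layout from the question in a temporary directory; VALUE and get_value are hypothetical names used only for the demo:

```python
# With a top-level entry point next to dir1/ and dir2/, the relative
# import inside script2.py resolves without touching sys.path.
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    root = os.path.join(tmp, "py_test")
    for d in ("dir1", "dir2"):
        os.makedirs(os.path.join(root, d))
        open(os.path.join(root, d, "__init__.py"), "w").close()
    open(os.path.join(root, "__init__.py"), "w").close()
    with open(os.path.join(root, "dir1", "script1.py"), "w") as f:
        f.write("VALUE = 42\n")
    with open(os.path.join(root, "dir2", "script2.py"), "w") as f:
        # The exact relative import from the question now has a
        # known parent package (py_test), so it works.
        f.write("from ..dir1 import script1\n"
                "def get_value():\n"
                "    return script1.VALUE\n")
    # The new entry point lives *above* the package and uses
    # absolute imports into it.
    with open(os.path.join(tmp, "main.py"), "w") as f:
        f.write("from py_test.dir2 import script2\n"
                "print(script2.get_value())\n")
    result = subprocess.run([sys.executable, "main.py"],
                            cwd=tmp, capture_output=True, text=True)
    print(result.stdout.strip())  # -> 42
```

Because main.py, not script2.py, is the entry point, script2's __name__ contains full package information and the relative import from ..dir1 import script1 succeeds.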
The problem
I have a directory structure for my project which follows the standard for Python packages, as it was created with this cookiecutter template:
https://github.com/audreyr/cookiecutter-pypackage#quickstart
The directory structure is
project_name
├── project_name
│ ├── __init__.py
│ └── module1.py
└── tests
└── test_module1.py
The first code line of test_module1.py is:
from project_name import module1
But I get a ModuleNotFoundError: No module named 'project_name'.
To my understanding, this should work, since the folder called project_name is a package, which is ensured by the presence of the __init__.py file.
I have always had trouble understanding how imports like this work. For my projects, I have always just settled for having my tests in the same folder as the modules to test. I know this is bad practice, but it was the only way I could get the modules to actually import.
What I already tried
I have tried renaming the folder with the __init__.py file to something else and then import, as I thought it could have something to do with the parent folder and the child folder both having the name project_name. This did not work, same error.
I also tried making the tests folder into a package by creating an __init__.py file inside it, even though the Cookiecutter template does not have that.
I read in many places that making the test folder into a package is discouraged, but some suggest that structure. That did not work either.
I have searched thoroughly for solutions to this seemingly very standard problem, some of the links are here:
Python, importing modules for testing
https://gist.github.com/tasdikrahman/2bdb3fb31136a3768fac
Importing modules from parent folder
https://alex.dzyoba.com/blog/python-import/
Sibling package imports
Python imports for tests using nose - what is best practice for imports of modules above current package
My last try was to start a fresh project with Cookiecutter, so everything would be set up properly from the beginning. However, I still get the ModuleNotFoundError.
What I don't want
I don't want to modify sys.path as many answers seem to suggest. There must be a cleaner way for such a common problem.
What am I doing wrong?
Edit for some additional info (see question from @Nicholas):
The contents of __init__.py are
# -*- coding: utf-8 -*-
"""Top-level package for project_name."""
__author__ = """my_name"""
__email__ = 'my_email'
__version__ = '0.1.0'
Which was generated by the Cookiecutter template.
Inside test_module1, I added the following before the line where the ModuleNotFoundError occurs:
import sys
import os
print(sys.path)
print(os.getcwd())
sys.path prints a list, where the first element is the tests directory.
['c:\\Users\\...\\project_name\\tests',
'C:\\Users\\...\\Miniconda3\\python37.zip',
'C:\\Users\\...\\Miniconda3\\DLLs', 'C:\\Users\\...\\Miniconda3\\lib',
'C:\\Users\\...\\Miniconda3',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\win32',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\win32\\lib',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\Pythonwin']
I don't know if the lowercase 'c' in the first element matters.
os.getcwd() prints the root directory 'c:\Users\....\project_name'. Also with a lowercase 'c'.
You should create a virtual environment and install the project in order for the test modules to correctly resolve import statements.
In the project root, i.e. the directory project_name which contains a subdirectory project_name and a subdirectory tests, create a setup.py (or pyproject.toml) file for the package metadata. See here for details about that part.
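For reference, the installer can be very small. A minimal setup.py could be sketched like this (metadata values are placeholders; a pyproject.toml equivalent works just as well):

```python
# setup.py -- minimal sketch; name and version are placeholders.
from setuptools import setup, find_packages

setup(
    name="project_name",
    version="0.1.0",
    # Pick up the project_name package but leave tests out of the install.
    packages=find_packages(exclude=("tests",)),
)
```

Once this file exists, pip install --editable . makes project_name importable from anywhere in the virtual environment, so from project_name import module1 resolves inside the tests.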
From this same project root directory which is now containing the installer (setup.py), create and activate a venv and install your project:
python3 -m venv .venv
source .venv/bin/activate     # linux/macOS
# .venv\Scripts\activate.bat  # windows
pip install --editable .
pip install pytest
pytest
If for some reason you don't want to create an installer for your project, you may run pytest like this, from the project directory:
python3 -m pytest
Unlike the bare pytest command, this will add the current working directory to sys.path allowing import statements to be resolved in tests.
Okay, the scenario is very simple. I have this file structure:
.
├── interface.py
└── pkg
    ├── __init__.py
    ├── mod1.py
    └── mod2.py
Now, these are my conditions:
mod2 needs to import mod1.
both interface.py and mod2.py need to be runnable independently as main scripts. If you want, think of interface as the actual program and mod2 as an internal tester of the package.
So, in Python 2 I would simply do import mod1 inside mod2.py and both python2 mod2.py and python2 interface.py would work as expected.
However, and this is the part I understand least: using Python 3.5.2, if I do import mod1, then I can run python3 mod2.py, but python3 interface.py throws: ImportError: No module named 'mod1' :(
So, apparently, Python 3 proposes using import pkg.mod1 to avoid collisions with built-in modules. OK, if I use that, I can run python3 interface.py; but then I can't run python3 mod2.py, because: ImportError: No module named 'pkg'
Similarly, if I use the relative import
from . import mod1 then python3 interface.py works; but mod2.py says SystemError: Parent module '' not loaded, cannot perform relative import :( :(
The only "solution", I've found is to go up one folder and do python -m pkg.mod2 and then it works. But do we have to be adding the package prefix pkg to every import to other modules within that package? Even more, to run any scripts inside the package, do I have to remember to go one folder up and use the -m switch? That's the only way to go??
I'm confused. This scenario was pretty straightforward with python 2, but looks awkward in python 3.
UPDATE: I have upload those files with the (referred as "solution" above) working source code here: https://gitlab.com/Akronix/test_python3_packages. Note that I still don't like it, and looks much uglier than the python2 solution.
Related SO questions I've already read:
Python -- import the package in a module that is inside the same package
How to do relative imports in Python?
Absolute import module in same package
Related links:
https://docs.python.org/3.5/tutorial/modules.html
https://www.python.org/dev/peps/pep-0328/
https://www.python.org/dev/peps/pep-0366/
TLDR:
Run your code with python -m pkg.mod2.
Import your code with from . import mod1.
The only "solution", I've found is to go up one folder and do python -m pkg.mod2 and then it works.
Using the -m switch is indeed the "only" solution - it was already the only solution before. The old behaviour simply only ever worked out of sheer luck; it could be broken without even modifying your code.
Going "one folder up" merely adds your package to the search path. Installing your package or modifying the search path works as well. See below for details.
But do we have to be adding the package prefix pkg to every import to other modules within that package?
You must have a reference to your package - otherwise it is ambiguous which module you want. The package reference can be either absolute or relative.
A relative import is usually what you want. It saves writing pkg explicitly, making it easier to refactor and move modules.
# inside mod1.py
# import mod2 - this is wrong! It can pull in an arbitrary mod2 module
# these are correct, they uniquely identify the module
import pkg.mod2
from pkg import mod2
from . import mod2
from .mod2 import foo # if pkg.mod2.foo exists
Note that you can always use <import> as <name> to bind your import to a different name. For example, import pkg.mod2 as mod2 lets you work with just the module name.
Even more, to run any scripts inside the package, do I have to remember to go one folder up and use the -m switch? That's the only way to go??
If your package is properly installed, you can use the -m switch from anywhere. For example, you can always use python3 -m json.tool.
echo '{"json":"obj"}' | python -m json.tool
If your package is not installed (yet), you can set PYTHONPATH to its base directory. This includes your package in the search path, and allows the -m switch to find it properly.
If you are in the executable's directory, you can execute export PYTHONPATH="$(pwd)/.." to quickly mount the package for import.
I'm confused. This scenario was pretty straightforward with python 2, but looks awkward in python 3.
This scenario was basically broken in Python 2. While it was straightforward in many cases, it was difficult or outright impossible to fix in other cases.
The new behaviour is more awkward in the straightforward case, but robust and reliable in any case.
I had a similar problem.
I solved it by adding
import sys
sys.path.insert(0, "./package_name")
to the __init__.py file in the package folder.