How to import the module to test into the test module - python-3.x

The problem
I have a directory structure for my project which follows the standard for Python packages, as it was created with this cookiecutter template:
https://github.com/audreyr/cookiecutter-pypackage#quickstart
The directory structure is
project_name
├── project_name
│   ├── __init__.py
│   └── module1.py
└── tests
    └── test_module1.py
The first code line of test_module1.py is:
from project_name import module1
But I get a ModuleNotFoundError: No module named 'project_name'.
To my understanding, this should work since the folder called project_name is a package, which is ensured by the presence of the __init__.py file.
I have always had trouble understanding how imports like this work. For my projects I have always just settled for having my tests in the same folder as the modules to test. I know this is bad practice, but it was the only way I could get the modules to actually import.
What I already tried
I have tried renaming the folder with the __init__.py file to something else and then import, as I thought it could have something to do with the parent folder and the child folder both having the name project_name. This did not work, same error.
I also tried making the test folder into a package by creating an __init__.py file inside it, even though the Cookiecutter template does not have that.
I read in many places that making the test folder into a package is discouraged, but some suggest that structure. That did not work either.
I have searched thoroughly for solutions to this seemingly very standard problem, some of the links are here:
Python, importing modules for testing
https://gist.github.com/tasdikrahman/2bdb3fb31136a3768fac
Importing modules from parent folder
https://alex.dzyoba.com/blog/python-import/
Sibling package imports
Python imports for tests using nose - what is best practice for imports of modules above current package
My last try was to start a project with Cookiecutter, so everything would be set up properly from the beginning. However, I still get the ModuleNotFoundError.
What I don't want
I don't want to modify sys.path as many answers seem to suggest. There must be a cleaner way for such a common problem.
What am I doing wrong?
Edit for some additional info (see question from @Nicholas):
The contents of __init__.py are
# -*- coding: utf-8 -*-
"""Top-level package for project_name."""
__author__ = """my_name"""
__email__ = 'my_email'
__version__ = '0.1.0'
Which was generated by the Cookiecutter template.
Inside test_module1.py, I added the following before the line where the ModuleNotFoundError occurs:
import sys
import os
print(sys.path)
print(os.getcwd())
sys.path prints a list, where the first element is the tests directory.
['c:\\Users\\...\\project_name\\tests',
'C:\\Users\\...\\Miniconda3\\python37.zip',
'C:\\Users\\...\\Miniconda3\\DLLs', 'C:\\Users\\...\\Miniconda3\\lib',
'C:\\Users\\...\\Miniconda3',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\win32',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\win32\\lib',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\Pythonwin']
I don't know if the lowercase 'c' in the first element matters.
os.getcwd() prints the root directory 'c:\Users\....\project_name'. Also with a lowercase 'c'.

You should create a virtual environment and install the project in order for the test modules to correctly resolve import statements.
In the project root, i.e. the directory project_name which contains a subdirectory project_name and a subdirectory tests, create a setup.py (or pyproject.toml) file for the package metadata. See here for details about that part.
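If you go the setup.py route, a minimal sketch could look like the following; the name, version, and exclude list are placeholders and not taken from the cookiecutter output:
from setuptools import setup, find_packages

setup(
    name='project_name',                          # placeholder distribution name
    version='0.1.0',
    packages=find_packages(exclude=('tests',)),   # picks up the inner project_name package
)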
From this same project root directory, which now contains the installer (setup.py), create and activate a venv and install your project:
python3 -m venv .venv
source .venv/bin/activate  # linux/macOS
# .venv\Scripts\activate.bat # windows
pip install --editable .
pip install pytest
pytest
If for some reason you don't want to create an installer for your project, you may run pytest like this, from the project directory:
python3 -m pytest
Unlike the bare pytest command, this will add the current working directory to sys.path, allowing import statements to be resolved in tests.

Related

Python3 importing files from parent directory / relative importing

I have the following file structure:
parentfolder/
    utils.py
    myProgram/
        main.py
        other.py
I will be running main.py, which uses other.py, which in turn needs to use everything in utils.py (NOT just import one method from utils.py at a time; there are global variables and functions that call other functions within that file).
I have tried lots of different examples online using sys.path etc., along with adding __init__.py to none, some, and all of the directories. None of them worked for me.
How do I go about this importing of utils.py within other.py?
If I need to create __init__.py files, could you also specify where they need to be created and if anything needs to be placed in them? Do I need to run them once before running main.py the first time?
Thank you so much for any help in advance
Yes, you should add __init__.py files as in:
parentfolder/
    __init__.py
    utils.py
    myProgram/
        __init__.py
        main.py
        other.py
Those can be empty (or, better, contain a docstring describing the package contents), but you do not need to run them.
The correct way is to run your script from the parent folder of parentfolder, using the module path:
$ cd parentfolder/..
$ python -m parentfolder.myProgram.main
This way the import (written as from parentfolder import utils, or from .. import utils inside the package) will resolve without the sys.path hack, which can lead to subtle bugs.
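For illustration, a minimal other.py under that layout might look like this; the helper name is made up, standing in for whatever utils actually provides:
# other.py - sketch assuming the layout above, run via python -m parentfolder.myProgram.main
from parentfolder import utils   # absolute import, resolved from parentfolder's parent directory

def do_work():
    # some_helper is a hypothetical name, not part of the original question
    return utils.some_helper()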

Sublime Text 3 + Build (ctrl + b) [duplicate]

I am running Python 2.5.
This is my folder tree:
ptdraft/
    nib.py
    simulations/
        life/
            life.py
(I also have __init__.py in each folder, omitted here for readability)
How do I import the nib module from inside the life module? I am hoping it is possible to do without tinkering with sys.path.
Note: The main module being run is in the ptdraft folder.
You could use relative imports (python >= 2.5):
from ... import nib
(What’s New in Python 2.5) PEP 328: Absolute and Relative Imports
EDIT: added another dot '.' to go up two packages
I posted a similar answer also to the question regarding imports from sibling packages. You can see it here.
Solution without sys.path hacks
Summary
Wrap the code into one folder (e.g. packaged_stuff)
Create a setup.py script where you use setuptools.setup().
Pip install the package in editable state with pip install -e <myproject_folder>
Import using from packaged_stuff.modulename import function_name
Setup
I assume the same folder structure as in the question
.
└── ptdraft
    ├── __init__.py
    ├── nib.py
    └── simulations
        ├── __init__.py
        └── life
            ├── __init__.py
            └── life.py
I call the . the root folder, and in my case it is located in C:\tmp\test_imports.
Steps
Add a setup.py to the root folder
The contents of the setup.py can be simply
from setuptools import setup, find_packages
setup(name='myproject', version='1.0', packages=find_packages())
Basically "any" setup.py would work. This is just a minimal working example.
Use a virtual environment
If you are familiar with virtual environments, activate one, and skip to the next step. Usage of virtual environments is not absolutely required, but they will really help you out in the long run (when you have more than one project ongoing). The most basic steps are (run in the root folder)
Create virtual env
python -m venv venv
Activate virtual env
. venv/bin/activate (Linux) or ./venv/Scripts/activate (Win)
Deactivate virtual env
deactivate (Linux)
To learn more about this, just Google out "python virtualenv tutorial" or similar. You probably never need any other commands than creating, activating and deactivating.
Once you have made and activated a virtual environment, your console should give the name of the virtual environment in parentheses
PS C:\tmp\test_imports> python -m venv venv
PS C:\tmp\test_imports> .\venv\Scripts\activate
(venv) PS C:\tmp\test_imports>
pip install your project in editable state
Install your top level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all the edits made to the .py files will be automatically included in the installed package.
In the root directory, run
pip install -e . (note the dot, it stands for "current directory")
You can also see that it is installed by using pip freeze
(venv) PS C:\tmp\test_imports> pip install -e .
Obtaining file:///C:/tmp/test_imports
Installing collected packages: myproject
Running setup.py develop for myproject
Successfully installed myproject
(venv) PS C:\tmp\test_imports> pip freeze
myproject==1.0
Import by prepending mainfolder to every import
In this example, the mainfolder would be ptdraft. This has the advantage that you will not run into name collisions with other module names (from python standard library or 3rd party modules).
Example Usage
nib.py
def function_from_nib():
    print('I am the return value from function_from_nib!')
life.py
from ptdraft.nib import function_from_nib

if __name__ == '__main__':
    function_from_nib()
Running life.py
(venv) PS C:\tmp\test_imports> python .\ptdraft\simulations\life\life.py
I am the return value from function_from_nib!
Relative imports (as in from .. import mymodule) only work in a package.
To import 'mymodule' that is in the parent directory of your current module:
import os
import sys
import inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parentdir = os.path.dirname(currentdir)
sys.path.insert(0, parentdir)
import mymodule
edit: the __file__ attribute is not always given. Instead of using os.path.abspath(__file__), I now suggest using the inspect module to retrieve the filename (and path) of the current file
It seems that the problem is not related to the module being in a parent directory or anything like that.
You need to add the directory that contains ptdraft to PYTHONPATH.
You said that import nib worked for you; that probably means that you added ptdraft itself (not its parent) to PYTHONPATH.
You can add OS-dependent paths to the "module search path", which is listed in sys.path.
So you can easily add the parent directory like the following:
import sys
sys.path.insert(0,'..')
If you want to add the parent's parent directory:
sys.path.insert(0,'../..')
This works both in python 2 and 3.
I don't know much about Python 2.
In Python 3, the parent folder can be added as follows:
import sys
sys.path.append('..')
...and then one is able to import modules from it
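Put together, that might look like this in the importing module; mymodule is a placeholder for whatever lives one directory up:
import sys
sys.path.append('..')   # note: '..' is resolved relative to the current working directory

import mymodule   # a module that lives in the parent directory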
If adding your module folder to the PYTHONPATH didn't work, you can modify the sys.path list in your program, where the Python interpreter searches for the modules to import; the Python documentation says:
When a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:
the directory containing the input script (or the current directory).
PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).
the installation-dependent default.
After initialization, Python programs can modify sys.path. The directory containing the script being run is placed at the beginning of the search path, ahead of the standard library path. This means that scripts in that directory will be loaded instead of modules of the same name in the library directory. This is an error unless the replacement is intended.
Knowing this, you can do the following in your program:
import sys
# Add the folder that contains the ptdraft package to the sys.path list
sys.path.append('/path/to/')  # i.e. the parent directory of ptdraft
# Now you can import your module
from ptdraft import nib
# Or just
import ptdraft
Here is an answer that's simple so you can see how it works, small and cross-platform. It only uses built-in modules (os, sys and inspect), so it should work on any operating system (OS) because Python is designed for that.
Shorter code for answer - fewer lines and variables
from inspect import getsourcefile
import os.path as path, sys
current_dir = path.dirname(path.abspath(getsourcefile(lambda:0)))
sys.path.insert(0, current_dir[:current_dir.rfind(path.sep)])
import my_module # Replace "my_module" here with the module name.
sys.path.pop(0)
For fewer lines than this, replace the second line with import os.path as path, sys, inspect, add inspect. at the start of getsourcefile (line 3) and remove the first line - however, this imports all of the module, so it could need more time, memory and resources.
The code for my answer (longer version)
from inspect import getsourcefile
import os.path
import sys
current_path = os.path.abspath(getsourcefile(lambda:0))
current_dir = os.path.dirname(current_path)
parent_dir = current_dir[:current_dir.rfind(os.path.sep)]
sys.path.insert(0, parent_dir)
import my_module # Replace "my_module" here with the module name.
It uses an example from a Stack Overflow answer, How do I get the path of the current executed file in Python?, to find the source (filename) of running code with a built-in tool.
from inspect import getsourcefile
from os.path import abspath
Next, wherever you want to find the source file from, you just use:
abspath(getsourcefile(lambda:0))
My code adds a file path to sys.path, the Python path list, because this allows Python to import modules from that folder. After importing a module in the code, it's a good idea to run sys.path.pop(0) on a new line when that added folder has a module with the same name as another module that is imported later in the program. You need to remove the list item added before the import, not other paths. If your program doesn't import other modules, it's safe to not delete the file path, because after a program ends (or the Python shell is restarted), any edits made to sys.path disappear.
Notes about a filename variable
My answer doesn't use the __file__ variable to get the file path/filename of running code, because users here have often described it as unreliable. You shouldn't use it for importing modules from a parent folder in programs used by other people.
Some examples where it doesn't work (quote from this Stack Overflow question):
• it can't be found on some platforms
• it sometimes isn't the full file path
• py2exe doesn't have a __file__ attribute, but there is a workaround
• when you run from IDLE with execute() there is no __file__ attribute
• OS X 10.6, where I get NameError: global name '__file__' is not defined
Here is a more generic solution that includes the parent directory in sys.path (works for me):
import os.path, sys
sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir))
The pathlib library (included with Python >= 3.4) makes it very concise and intuitive to append the path of the parent directory to the Python path (sys.path):
import sys
from pathlib import Path
sys.path.append(str(Path('.').absolute().parent))
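Note that Path('.') is resolved against the current working directory. If the anchor should be the file's own location instead, a common variant (an assumption here, not part of the original answer) is:
import sys
from pathlib import Path

# parent of the directory that contains this file
sys.path.append(str(Path(__file__).resolve().parent.parent))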
In a Jupyter Notebook (opened with Jupyter LAB or Jupyter Notebook)
As long as you're working in a Jupyter Notebook, this short solution might be useful:
%cd ..
import nib
It works even without an __init__.py file.
I tested it with Anaconda3 on Linux and Windows 7.
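If you would rather not change the notebook's working directory, a sys.path-based cell should also work; this is a sketch assuming nib.py lives one directory above the notebook:
import os
import sys

# adjust the relative path so it points at the directory that contains nib.py
sys.path.append(os.path.abspath('..'))
import nib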
I found that the following way works for importing a package from the script's parent directory. In the example, I would like to import functions from the app.db package into env.py.
.
└── my_application
    ├── alembic
    │   └── env.py
    └── app
        ├── __init__.py
        └── db
import os
import sys
currentdir = os.path.dirname(os.path.realpath(__file__))
parentdir = os.path.dirname(currentdir)
sys.path.append(parentdir)
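With parentdir on sys.path, env.py can then use an absolute import; the exact name below is illustrative, since the answer doesn't show it:
from app import db   # or, e.g., from app.db import some_module - names are placeholders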
The above-mentioned solutions are also fine. Another solution to this problem is:
If you want to import anything from the top-level directory, then:
from ...module_name import *
Also, if you want to import any module from the parent directory, then:
from ..module_name import *
Also, if you want to import a particular module (or anything inside it) from the top-level directory, then:
from ...module_name.another_module import *
This way you can import any particular method if you want to.
Two line simplest solution
import os, sys
sys.path.insert(0, os.getcwd())
This works if the parent is your working directory and you want to call child modules from child scripts.
You can then import any child module from the parent directory in any script and execute it as
python child_module1/child_script.py
For me, the shortest and my favorite one-liner for accessing the parent directory is:
sys.path.append(os.path.dirname(os.getcwd()))
or:
sys.path.insert(1, os.path.dirname(os.getcwd()))
os.getcwd() returns the current working directory, and os.path.dirname(path) returns the directory name of the path passed to it.
Actually, in my opinion, a Python project's architecture should be designed so that no module in a child directory uses any module from the parent directory. If something like this happens, it is worth rethinking the project tree.
Another way is to add the parent directory to the PYTHONPATH system environment variable.
Though the original author is probably no longer looking for a solution, for completeness, there is one simple solution: run life.py as a module, like this:
cd ptdraft
python -m simulations.life.life
This way you can import anything from nib.py, as the ptdraft directory is on the path.
I think you can try this in that specific example (in Python 3.6.3):
import sys
sys.path.append('../')
same sort of style as the past answer - but in fewer lines :P
import os, sys
# parent of the directory that contains this file
parentdir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, parentdir)
__file__ is the path of the file you are working in.
In a Linux system, you can create a soft link from the "life" folder to the nib.py file. Then, you can simply import it like:
import nib
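For example, run from inside the life folder (the relative path follows the ptdraft tree above and is otherwise an assumption):
$ ln -s ../../nib.py nib.py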
I have a solution specifically for git-repositories.
First I used sys.path.append('..') and similar solutions. This causes problems especially if you are importing files which themselves import files with sys.path.append('..').
I then decided to always append the root directory of the git repository. In one line it would look like this:
sys.path.append(git.Repo('.', search_parent_directories=True).working_tree_dir)
Or in more detail, like this:
import os
import sys
import git  # provided by the GitPython package

def get_main_git_root(path):
    main_repo_root_dir = git.Repo(path, search_parent_directories=True).working_tree_dir
    return main_repo_root_dir

main_repo_root_dir = get_main_git_root('.')
sys.path.append(main_repo_root_dir)
For the original question: depending on what the root directory of the repository is, the import would be
import ptdraft.nib
or
import nib
Our folder structure:
/myproject
    project_using_ptdraft/
        main.py
    ptdraft/
        __init__.py
        nib.py
        simulations/
            __init__.py
            life/
                __init__.py
                life.py
The way I understand this is to have a package-centric view.
The package root is ptdraft, since it's the topmost level that contains an __init__.py.
All the files within the package can use absolute paths (relative to the package root) for imports, for example
in life.py, we have simply:
import ptdraft.nib
However, to run life.py for package dev/testing purposes, instead of python life.py, we need to use:
cd /myproject
python -m ptdraft.simulations.life.life
Note that we didn't need to fiddle with any path at all at this point.
Further confusion arises when we complete the ptdraft package and want to use it in a driver script, which is necessarily outside of the ptdraft package folder, i.e. project_using_ptdraft/main.py. There we would need to fiddle with paths:
import sys
sys.path.append("/myproject") # folder that contains ptdraft
import ptdraft
import ptdraft.simulations
and use python main.py to run the script without problem.
Helpful links:
https://tenthousandmeters.com/blog/python-behind-the-scenes-11-how-the-python-import-system-works/ (see how __init__.py can be used)
https://chrisyeh96.github.io/2017/08/08/definitive-guide-python-imports.html#running-package-initialization-code
https://stackoverflow.com/a/50392363/2202107
https://stackoverflow.com/a/27876800/2202107
Work with libraries.
Make a library called nib, install it using setup.py, let it reside in site-packages and your problems are solved.
You don't have to stuff everything you make into a single package. Break it up into pieces.
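A minimal sketch of such a setup.py, assuming nib.py sits next to it (name and version are placeholders):
from setuptools import setup

setup(
    name='nib',           # placeholder distribution name
    version='0.1.0',
    py_modules=['nib'],   # ship nib.py as a single top-level module
)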
I had a problem where I had to import a Flask application that had an import that in turn needed to import files in separate folders. This partially uses Remi's answer, but suppose we had a repository that looks like this:
.
└── service
    ├── misc
    │   └── categories.csv
    ├── test
    │   └── app_test.py
    ├── app.py
    └── pipeline.py
Then before importing the app object from the app.py file, we change the directory one level up, so when we import the app (which imports the pipeline.py), we can also read in miscellaneous files like a csv file.
import os,sys,inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parentdir = os.path.dirname(currentdir)
sys.path.insert(0,parentdir)
os.chdir('../')
from app import app
After having imported the Flask app, you can use os.chdir('./test') to change back, so that your working directory ends up unchanged.
It seems to me that you don't really need to import the parent module. Let's imagine that in nib.py you have func1() and data1, which you need to use in life.py.
nib.py
import simulations.life.life as life

def func1():
    pass

data1 = {}

life.share(func1, data1)
life.py
func1 = data1 = None

def share(*args):
    global func1, data1
    func1, data1 = args
And now you have access to func1 and data1 in life.py. Of course, you have to be careful to populate them before you try to use them.
I made this library to do this.
https://github.com/fx-kirin/add_parent_path
# Just add the parent path
add_parent_path(1)

# Append to sys.path and remove it when the with statement exits.
with add_parent_path(1):
    # Import modules in the parent path
    pass
This is the simplest solution that works for me:
from ptdraft import nib
After removing some sys.path hacks, I thought it might be valuable to add my preferred solution.
Note: this is a frame challenge - it's not necessary to do this in code.
Assuming a tree,
project
└── pkg
    └── test.py
Where test.py contains
import sys, json; print(json.dumps(sys.path, indent=2))
Executing using the path only includes the package directory
python pkg/test.py
[
"/project/pkg",
...
]
But using the module argument includes the project directory
python -m pkg.test
[
"/project",
...
]
Now, all imports can be absolute, from the project directory. No further skullduggery required.
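For example, if pkg contained another module next to test.py, say a hypothetical helper.py, test.py could import it absolutely:
# inside pkg/test.py, run as: python -m pkg.test
from pkg import helper   # "helper" is a hypothetical module at project/pkg/helper.py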
Although it is against all rules, I still want to mention this possibility:
You can first copy the file from the parent directory to the child directory. Next import it and subsequently remove the copied file:
for example in life.py:
import os
import shutil
shutil.copy('../nib.py', '.')
import nib
os.remove('nib.py')
# now you can use it just fine:
nib.foo()
Of course, there might arise several problems when nib tries to import/read other files with relative imports/paths.
This works for me to import things from a higher folder.
import os
os.chdir('..')

Can't access top level python package from sub packages

I have a directory structure like below:
chatbot/
    __init__.py
    utils/
        __init__.py
        parser.py
    nlu/
        __init__.py
        training/
            __init__.py
            module.py
I want to access parser.py from module.py.
I tried using this line from module.py:
from chatbot.utils import parser
And I got this error:
ModuleNotFoundError: No module named 'chatbot'
Any pointers to what I am doing wrong?
I am using python3 and trying to run the script as python3 nlu/training/module.py.
Thanks in advance!
I believe the right way of solving such problems is:
figure out what your top-level packages and modules are and in what directory they are
change to that directory
make all your imports absolute, i.e. always start with a top-level module or package (instead of things like from . import blah)
use the executable module (python -m) way of running your code, for example path/to/pythonX.Y -m top_level_package.executable_module (instead of path/to/pythonX.Y top_level_package/executable_module.py)
In case the top-level modules or packages are not all in the same directory:
Either work on packaging your libraries or applications correctly and install them in the site packages (possibly as editable, also known as develop mode).
Or, as a last resort, collect the other directories containing the top-level packages and modules in the PYTHONPATH environment variable like the following:
PYTHONPATH=path/to/alpha:path/to/bravo path/to/pythonX.Y -m top_level_package.executable_module
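Applied to the layout in the question, that would look roughly like this (the /home/my_project location is an assumption):
$ cd /home/my_project               # the directory that contains the chatbot package
$ python3 -m chatbot.nlu.training.module
module.py can then keep its from chatbot.utils import parser line unchanged.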
I was able to solve the issue by adding the parent directory of the chatbot package to sys.path.
Say the chatbot Python package is in /home/my_project/chatbot/; I added a statement like
sys.path.append('/home/my_project') in module.py
This made all top-level Python packages visible to all lower-level Python packages.
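Spelled out, the top of module.py would then look something like this (the path is the same example as above):
import sys
sys.path.append('/home/my_project')   # directory that contains the chatbot package

from chatbot.utils import parser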

How import module in PyCharm

I have two projects. In the first one, I can import my module importme.py like:
import importme
And now I can use my function hello() in the importme module without any problem. In the second one, I receive:
ImportError: No module named 'importme'
But I can import it via:
from . import importme
Why can't I import my module the same way in both projects? Should I configure some paths?
EDIT1:
Directory structure of first project:
testproject/
├── importme.py
└── start.py
Directory structure of second project:
spiders/
├── spider.py
├── download_page.py
├── importme.py
└── __init__.py
The file __init__.py is empty.
My favorite method of dealing with PYTHONPATH is installing the package in editable mode in a virtual environment.
Creating virtual environment
# create
$ python -m venv ~/virtualenvs/myenv
# then activate it
$ source ~/virtualenvs/myenv/bin/activate
# you can check whether it got activated
$ which python
/home/user/virtualenvs/myenv/bin/python
Writing setup.py for your project. For this refer to the official distributing packages tutorial.
Installing the package in editable mode.
If you install the package with the -e flag, pip will install it in editable mode, meaning all changes you make in the code will be present in your environment's package:
$ cd mypkg
$ pip install -e .
Finally, you need to set your virtual environment to be used in whatever IDE or editor you are using.
This is a great workflow because it's clean and reliable - you are using exactly what you'd be using in production/finished package environment.

Python intra-package imports and unit tests

I've searched for a solution here and on other sites, but it feels like all import problems I come across are subtly different.
I have a project with the following setup:
/
    __init__.py
    package1
        __init__.py
        a.py
        b.py
        tests/
            test_a.py
            test_b.py
    package2
    package3
In b.py:
from .a import Foo
In the tests:
import a, b
package1, package2, and package3 are essentially smaller packages that are bundled together in the same project/super-package as utilities. The purpose of this project is to be nested inside another package (say, package4) and to have these packages/modules imported by package4. Hence, relative imports to other files in the package are required, if I don't want to modify the path.
As an example, package4:
/
    main.py
    src/
        external/
            project_from_above
                package1
                package2
                package3
I'm omitting the __init__.py's in the hierarchy above. In main.py, I might do:
import src.external.project_from_above.package1.a
My problem: this structure works fine, except for unit testing. I am in the habit of running python3 -m unittest discover tests from each package (package1, package2, package3). This works fine when there are no relative imports. However, running with relative imports will yield the following error: "SystemError: Parent module '' not loaded, cannot perform relative import"
I desire:
A way of running the unit tests in package1/tests from the package1 directory, with no imports changing (or at least, maintaining the ability to use this entire project inside the aforementioned package4 as a sub-package). I'd like to avoid any manipulation of the path, but if we can restrict it to a run_tests.py file in package1, then that is okay.
Here's one solution: add a file called run_tests in package1. In it, do the following:
cd ..
python -m unittest discover package1/tests
This requires you to use absolute imports in your tests (e.g., import package1.a)
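If a Python file is preferred over a shell script, a rough equivalent run_tests.py (a sketch, not part of the original answer) placed in package1 could look like:
import os
import sys
import unittest

# Change to the directory that contains package1 so absolute imports like
# "import package1.a" resolve while the tests run.
os.chdir(os.path.join(os.path.dirname(os.path.abspath(__file__)), os.pardir))
sys.path.insert(0, os.getcwd())

# Discover and run everything under package1/tests.
suite = unittest.defaultTestLoader.discover("package1/tests")
result = unittest.TextTestRunner().run(suite)
sys.exit(0 if result.wasSuccessful() else 1)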

Resources