Import py file in another directory in Jupyter notebook - python-3.x

My question is related to this. I am using Python 3.6 in Jupyter Notebook. My project directory is /user/project. In this directory I'm building a number of models and each has its own folder. However, there is a common functions.py file with functions that I want to use across all models. So I want to keep the functions.py file in /user/project but be able to call it from an .ipynb file in /user/project/model1, /user/project/model2, etc... How can I do this?

There is no simple way to import Python files from another directory.
This is unrelated to Jupyter Notebook.
Here are three solutions to your problem:
You can add the directory containing the file you want to import to your path and then import the file like this (a sketch adapted to the question's layout appears after this list):
import sys
sys.path.insert(0, '/path/to/application/app/folder')
import file
You can create a local module by having an empty __init__.py file in the folder you want to import from. There are some weird rules regarding the folder hierarchy that you have to take into consideration.
You can create a module for the file you wish to import and install it globally.
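Applied to the layout in the question, the first option could be a notebook cell like the following sketch (the path comes from the question itself):
import sys
# /user/project is the folder that holds the shared functions.py
sys.path.insert(0, '/user/project')
import functions  # now importable from model1/, model2/, ...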

Assuming you have a folder named Jupyter and you wish to import a module (employee) from another folder named nn_webserver, do this:
import sys
import os
module_path = os.path.abspath(os.path.join('..'))
if module_path not in sys.path:
    sys.path.append(module_path + "\\nn_webserver")
from employee import motivation_to_work
See additional information here from @metakermit.

I've been thinking about this problem because I don't like the sys.path.append() answers. A solution I propose uses the built-in Jupyter magic command to change the current working directory. Assuming you have this file structure:
project
├── model1
| └── notebook1.ipynb
├── model2
| └── notebook2.ipynb
└── functions.py
Whether you wanted to import functions from notebook1.ipynb or notebook2.ipynb, you could simply add a cell with the following line before the cell that has your package imports:
%cd ..
This changes the current working directory to the parent directory of the notebook, which then adds the path of the functions module to the default locations that Python will check for packages. To import functions:
import functions
This would work similarly if you had multiple modules in the same package directory that you wanted to import:
project
├── model1
| └── notebook1.ipynb
├── model2
| └── notebook2.ipynb
└── package
├── functions1.py
└── functions2.py
You can import both modules functions1 and functions2 from package like this:
from package import functions1, functions2
EDIT: As pointed out below, the local imports will no longer work if the cell containing the magic command is run more than once (the current working directory moves up another level on each rerun). To prevent this, the %cd .. command should be in its own cell (not in the same cell as the imports) at the top of the notebook, before the imports, so it won't be run multiple times. Restarting the kernel and running all cells will reset the current working directory but will still produce the desired imports/results.
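If you want something that stays safe when the cell is re-run, a hedged alternative (not part of the original answer) is to change directory only when you have not yet reached the project root; 'project' is the folder name from the example tree above:
import os
# Sketch: go up only if we are still inside a model folder
if os.path.basename(os.getcwd()) != 'project':
    os.chdir('..')
import functions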

I've solved this problem in the past by creating a symbolic link in the directory where the Jupyter notebook is located to the library it wants to load, so that python behaves as if the module is in the correct path. So for the example above, you would run the following command once per directory inside a Jupyter cell:
!ln -s /user/project/functions.py functions.py
and then you could import with
import functions
Note: I've only tried this on Linux and macOS, so I can't vouch for Windows.
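For reference, the same link can also be created from Python instead of a shell command; a sketch using the paths from the question (Linux/macOS only, and it assumes /user/project/functions.py exists):
import os
# Create the link once; skip if something with that name is already there
if not os.path.islink('functions.py') and not os.path.exists('functions.py'):
    os.symlink('/user/project/functions.py', 'functions.py')
import functions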

I would suggest installing functions.py as a package in your virtual environment. There are some benefits to this:
You can access the functions.py file from any IPython notebook located anywhere, as long as it uses the given environment (kernel).
Once you change any function in the functions.py file, you don't need to reload your IPython notebook again and again; every change is picked up automatically.
This is the way how it can be done:
Create a setup.py file (https://docs.python.org/2/distutils/setupscript.html) in your project folder (a minimal sketch is shown after these steps).
Activate your virtual environment, go to your project location, and use the command pip install -e .
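A minimal setup.py for this purpose might look like the following sketch ('yourproject' is a placeholder package name; adapt it to your own layout):
# setup.py - minimal sketch; assumes a package folder yourproject/ containing
# an __init__.py and functions.py
from setuptools import setup, find_packages

setup(
    name='yourproject',
    version='0.1.0',
    packages=find_packages(),
)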
Then, in your iPython notebook:
%load_ext autoreload
%autoreload 1
%aimport yourproject.functions
from yourproject.functions import *
That's it!

In addition to the answer from adhg, I recommend using pathlib, for compatibility between Linux/Windows/WSL path formats.
Assuming the following folder structure:
.
├── work
│   ├── notebook.ipynb
│   └── my_python_file.py
├── py
│   ├── modules
│   │   ├── __init__.py  # empty
│   │   └── preparations.py
│   ├── __init__.py  # empty
│   └── tools.py
├── .git
└── README.md
To load tools.py or preparations.py in my_python_file.py (or in notebook notebook.ipynb):
import sys
from pathlib import Path
# in jupyter (lab / notebook), based on notebook path
module_path = str(Path.cwd().parents[0] / "py")
# in a standard Python script, based on this file's own path
module_path = str(Path(__file__).resolve().parents[1] / "py")
if module_path not in sys.path:
    sys.path.append(module_path)
from modules import preparations
import tools
...

I found myself in the exact same situation as the OP: I was going to create several notebooks, hence the wish to organise them in different subfolders.
I tried this, which seems to do what I need and seems cleaner to me:
import os
os.chdir(os.path.dirname(os.path.dirname(os.getcwd())))
My function is two levels above, so I nested two os.path.dirname calls (with a different folder structure it could be only one, or more).
I just implemented it and it works fine. By the way, I'm using JupyterLab, started two levels above where the function resides.

Related

Python3 importing files from parent directory / relative importing

I have the following file structure:
parentfolder/
    utils.py
    myProgram/
        main.py
        other.py
I will be running the main.py which utilizes other.py which needs to utilize everything in utils.py (NOT just import one method from utils.py at a time - there are global variables and functions that call other functions within this file.)
I have tried lots of different examples online using sys.path, etc., along with adding __init__.py to none, some, and all directories. None of them worked for me.
How do I go about this importing of utils.py within other.py?
If I need to create __init__.py files, could you also specify where they need to be created and if anything needs to be placed in them? Do I need to run them once before running the main.py the first time?
Thank you so much for any help in advance.
Yes, you should add __init__.py files, as in:
parentfolder/
    __init__.py
    utils.py
    myProgram/
        __init__.py
        main.py
        other.py
They can be empty, or better, contain a docstring describing the package contents, but you do not need to run them or anything.
The correct way is to run your script from the parent folder of parentfolder, using the module path:
$ cd parentfolder/..
$ python -m parentfolder.myProgram.main
This way, other.py can import utils with an absolute import (from parentfolder import utils) or a relative one (from .. import utils), without the sys.path hack, which can lead to subtle bugs.
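For illustration, a sketch of what the import inside other.py could look like under this layout (the helper name is made up):
# other.py - imported as part of the parentfolder package when you run
#   python -m parentfolder.myProgram.main
from parentfolder import utils   # absolute import, resolved from the launch directory

def do_something():
    return utils.some_helper()   # hypothetical function defined in utils.py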

python3 import modules from different projects [duplicate]

I have the following directory structure:
py_test
├── __init__.py
├── dir1
│   ├── __init__.py
│   └── script1.py
└── dir2
    ├── __init__.py
    └── script2.py
In script2 I want to "import ..\script1".
What I tried in script2:
Does not work
from ..dir1 import script1
ImportError: attempted relative import with no known parent package
Works
import sys, os
path2add = os.path.normpath(os.path.abspath(os.path.join(os.path.dirname(__file__), os.path.pardir, 'dir1')))
if path2add not in sys.path:
    sys.path.append(path2add)
If I want to go with option 1, what is the simplest (i.e., with the least files) file/dir structure that makes it work?
I am aware of this, but I wonder if creating that directory structure can be avoided, and still use type-1 import.
I am currently using this workaround, which uses type-2 import.
Related:
How to import a Python class that is in a directory above?
Import a module from a directory (package) one level up
Getting "ImportError: attempted relative import with no known parent package" when running from Python Interpreter
Using importlib to dynamically import module(s) containing relative imports
How to import variables in a different python file
As mentioned in the comments, attempting to import modules a directory up will not work if script2.py is your entry point.
As mentioned in this link you included:
If the module's __name__ does not contain any package information (e.g., it is set to __main__), then relative imports are resolved as if the module were a top-level module, regardless of where the module is actually located on the file system.
The module's __name__ is set to __main__ if it is the entry point, or the one that you pass to the interpreter with something like python script2.py.
Any python module run as such no longer has the information needed to import files from higher directories.
Therefore you have two options:
Option 1: Keep using the workaround with sys.path.append
This will work how you want it to, but it is rather cumbersome.
Option 2: Make your entry point at the top level
Assuming your package has more than one script that needs to be run, you could create a new file that imports both script1 and script2 and then calls the functionality you want based on a command line argument. Then you will be able to keep your current directory structure and have your relative imports work just fine, without any kind of fiddling with sys.path.
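One possible shape for that entry point, placed in the directory that contains py_test (the file name and function names are hypothetical):
# run.py - hypothetical entry point, located next to the py_test/ folder
import sys

from py_test.dir1 import script1
from py_test.dir2 import script2   # script2's relative import now resolves

if __name__ == '__main__':
    # dispatch on a simple command-line argument
    if len(sys.argv) > 1 and sys.argv[1] == 'script1':
        script1.main()   # assumes script1 defines a main() function
    else:
        script2.main()   # assumes script2 defines a main() function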

Sublime Text 3 + Build (ctrl + b) [duplicate]

I am running Python 2.5.
This is my folder tree:
ptdraft/
    nib.py
    simulations/
        life/
            life.py
(I also have __init__.py in each folder, omitted here for readability)
How do I import the nib module from inside the life module? I am hoping it is possible to do without tinkering with sys.path.
Note: The main module being run is in the ptdraft folder.
You could use relative imports (python >= 2.5):
from ... import nib
(What’s New in Python 2.5) PEP 328: Absolute and Relative Imports
EDIT: added another dot '.' to go up two packages
I posted a similar answer also to the question regarding imports from sibling packages. You can see it here.
Solution without sys.path hacks
Summary
Wrap the code into one folder (e.g. packaged_stuff)
Create a setup.py script where you use setuptools.setup().
Pip install the package in editable state with pip install -e <myproject_folder>
Import using from packaged_stuff.modulename import function_name
Setup
I assume the same folder structure as in the question
.
└── ptdraft
    ├── __init__.py
    ├── nib.py
    └── simulations
        ├── __init__.py
        └── life
            ├── __init__.py
            └── life.py
I call the . the root folder, and in my case it is located in C:\tmp\test_imports.
Steps
Add a setup.py to the root folder
The contents of the setup.py can be simply
from setuptools import setup, find_packages
setup(name='myproject', version='1.0', packages=find_packages())
Basically "any" setup.py would work. This is just a minimal working example.
Use a virtual environment
If you are familiar with virtual environments, activate one and skip to the next step. Using a virtual environment is not absolutely required, but it will really help you out in the long run (when you have more than one project ongoing). The most basic steps are (run in the root folder):
Create virtual env
python -m venv venv
Activate virtual env
. venv/bin/activate (Linux) or ./venv/Scripts/activate (Win)
Deactivate virtual env
deactivate (Linux)
To learn more about this, just Google out "python virtualenv tutorial" or similar. You probably never need any other commands than creating, activating and deactivating.
Once you have made and activated a virtual environment, your console should show the name of the virtual environment in parentheses:
PS C:\tmp\test_imports> python -m venv venv
PS C:\tmp\test_imports> .\venv\Scripts\activate
(venv) PS C:\tmp\test_imports>
pip install your project in editable state
Install your top level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all the edits made to the .py files will be automatically included in the installed package.
In the root directory, run
pip install -e . (note the dot, it stands for "current directory")
You can also see that it is installed by using pip freeze
(venv) PS C:\tmp\test_imports> pip install -e .
Obtaining file:///C:/tmp/test_imports
Installing collected packages: myproject
Running setup.py develop for myproject
Successfully installed myproject
(venv) PS C:\tmp\test_imports> pip freeze
myproject==1.0
Import by prepending mainfolder to every import
In this example, the mainfolder would be ptdraft. This has the advantage that you will not run into name collisions with other module names (from python standard library or 3rd party modules).
Example Usage
nib.py
def function_from_nib():
    print('I am the return value from function_from_nib!')
life.py
from ptdraft.nib import function_from_nib
if __name__ == '__main__':
    function_from_nib()
Running life.py
(venv) PS C:\tmp\test_imports> python .\ptdraft\simulations\life\life.py
I am the return value from function_from_nib!
Relative imports (as in from .. import mymodule) only work in a package.
To import 'mymodule' that is in the parent directory of your current module:
import os
import sys
import inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parentdir = os.path.dirname(currentdir)
sys.path.insert(0, parentdir)
import mymodule
Edit: the __file__ attribute is not always available. Instead of using os.path.abspath(__file__), I now suggest using the inspect module to retrieve the filename (and path) of the current file.
It seems that the problem is not related to the module being in a parent directory or anything like that.
You need to add the directory that contains ptdraft to PYTHONPATH.
You said that import nib worked for you; that probably means that you added ptdraft itself (not its parent) to PYTHONPATH.
You can use an OS-dependent path in the "module search path", which is listed in sys.path.
So you can easily add the parent directory like the following:
import sys
sys.path.insert(0,'..')
If you want to add the parent of the parent directory:
sys.path.insert(0,'../..')
This works in both Python 2 and 3.
I don't know much about Python 2.
In Python 3, the parent folder can be added as follows:
import sys
sys.path.append('..')
...and then one is able to import modules from it
If adding your module folder to the PYTHONPATH didn't work, you can modify the sys.path list in your program, which is where the Python interpreter searches for the modules to import. The Python documentation says:
When a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:
the directory containing the input script (or the current directory).
PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).
the installation-dependent default.
After initialization, Python programs can modify sys.path. The directory containing the script being run is placed at the beginning of the search path, ahead of the standard library path. This means that scripts in that directory will be loaded instead of modules of the same name in the library directory. This is an error unless the replacement is intended.
Knowing this, you can do the following in your program:
import sys
# Add the ptdraft folder path to the sys.path list
sys.path.append('/path/to/ptdraft/')
# Now you can import your module
from ptdraft import nib
# Or just
import ptdraft
Here is an answer that's simple, small and cross-platform, so you can see how it works.
It only uses built-in modules (os, sys and inspect), so it should work on any operating system (OS), because Python is designed for that.
Shorter code for answer - fewer lines and variables
from inspect import getsourcefile
import os.path as path, sys
current_dir = path.dirname(path.abspath(getsourcefile(lambda:0)))
sys.path.insert(0, current_dir[:current_dir.rfind(path.sep)])
import my_module # Replace "my_module" here with the module name.
sys.path.pop(0)
For fewer lines than this, replace the second line with import os.path as path, sys, inspect, add inspect. at the start of getsourcefile (line 3), and remove the first line. However, this imports the whole inspect module, so it could need more time, memory and resources.
The code for my answer (longer version)
from inspect import getsourcefile
import os.path
import sys
current_path = os.path.abspath(getsourcefile(lambda:0))
current_dir = os.path.dirname(current_path)
parent_dir = current_dir[:current_dir.rfind(os.path.sep)]
sys.path.insert(0, parent_dir)
import my_module # Replace "my_module" here with the module name.
It uses an example from a Stack Overflow answer How do I get the path of the current
executed file in Python? to find the source (filename) of running code with a built-in tool.
from inspect import getsourcefile
from os.path import abspath
Next, wherever you want to find the source file from you just use:
abspath(getsourcefile(lambda:0))
My code adds a file path to sys.path, the Python path list, because this allows Python to import modules from that folder. After importing a module in the code, it's a good idea to run sys.path.pop(0) on a new line when that added folder has a module with the same name as another module that is imported later in the program. You need to remove the list item added before the import, not other paths. If your program doesn't import other modules, it's safe not to delete the file path, because after a program ends (or the Python shell is restarted), any edits made to sys.path disappear.
Notes about a filename variable
My answer doesn't use the __file__ variable to get the file path/filename of running code, because users here have often described it as unreliable. You shouldn't use it for importing modules from a parent folder in programs used by other people.
Some examples where it doesn't work (quoted from this Stack Overflow question):
• it can't be found on some platforms
• it sometimes isn't the full file path
py2exe doesn't have a __file__ attribute, but there is a workaround
When you run from IDLE with execute() there is no __file__ attribute
OS X 10.6 where I get NameError: global name '__file__' is not defined
Here is a more generic solution that includes the parent directory in sys.path (works for me):
import os.path, sys
sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir))
The pathlib library (included with Python >= 3.4) makes it very concise and intuitive to append the path of the parent directory to the PYTHONPATH:
import sys
from pathlib import Path
sys.path.append(str(Path('.').absolute().parent))
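The same idea works from a regular script (rather than an interactive session) by starting from __file__ instead of the current working directory; a short sketch:
import sys
from pathlib import Path

# parent of the directory that contains this file
sys.path.append(str(Path(__file__).resolve().parent.parent))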
In a Jupyter Notebook (opened with Jupyter LAB or Jupyter Notebook)
As long as you're working in a Jupyter Notebook, this short solution might be useful:
%cd ..
import nib
It works even without an __init__.py file.
I tested it with Anaconda3 on Linux and Windows 7.
I found the following way works for importing a package from the script's parent directory. In the example, I would like to import functions in env.py from app.db package.
.
└── my_application
    ├── alembic
    │   └── env.py
    └── app
        ├── __init__.py
        └── db
import os
import sys
currentdir = os.path.dirname(os.path.realpath(__file__))
parentdir = os.path.dirname(currentdir)
sys.path.append(parentdir)
The above-mentioned solutions are also fine. Another solution to this problem is:
If you want to import anything from the top-level directory:
from ...module_name import *
Also, if you want to import any module from the parent directory:
from ..module_name import *
Also, if you want to import a particular module from the top-level directory:
from ...module_name.another_module import *
This way you can import any particular method you want to.
Two line simplest solution
import os, sys
sys.path.insert(0, os.getcwd())
This works if the parent is your working directory and you want to call child modules from child scripts.
You can then import any child module from the parent directory in any script and execute it as:
python child_module1/child_script.py
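A sketch of how such a child script could then reach a sibling module (the module and function names are made up):
# child_module1/child_script.py - assumes it is launched from the parent directory,
# e.g.  python child_module1/child_script.py
import os, sys
sys.path.insert(0, os.getcwd())           # the parent (working) directory becomes importable

from child_module2 import child_helper    # hypothetical sibling module
child_helper.do_work()                    # hypothetical function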
For me, the shortest and my favorite one-liner for accessing the parent directory is:
sys.path.append(os.path.dirname(os.getcwd()))
or:
sys.path.insert(1, os.path.dirname(os.getcwd()))
os.getcwd() returns the path of the current working directory, and os.path.dirname(directory_name) returns the parent directory of the path passed to it.
Actually, in my opinion, a Python project should be structured so that no module in a child directory uses any module from a parent directory. If something like this happens, it is worth rethinking the project tree.
Another way is to add parent directory to PYTHONPATH system environment variable.
Though the original author is probably no longer looking for a solution, for completeness, there is one more simple solution: run life.py as a module, like this:
cd ptdraft
python -m simulations.life.life
This way you can import anything from nib.py as ptdraft directory is in the path.
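With that invocation, the import inside life.py can be a plain one; a sketch (the function name is made up):
# ptdraft/simulations/life/life.py - run from inside ptdraft with:
#   python -m simulations.life.life
import nib                 # found because ptdraft/ (the current directory) is on sys.path

if __name__ == '__main__':
    nib.do_stuff()         # hypothetical function defined in nib.py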
I think you can try this in that specific example (I tested it with Python 3.6.3):
import sys
sys.path.append('../')
Same sort of style as the previous answer, but in fewer lines :P
import os,sys
parentdir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, parentdir)
__file__ is the location of the file you are working in, so os.path.dirname() is applied twice here to reach the parent directory.
On a Linux system, you can create a soft link in the "life" folder pointing to the nib.py file. Then, you can simply import it like:
import nib
I have a solution specifically for git-repositories.
First I used sys.path.append('..') and similar solutions. This causes problems, especially if you are importing files which are themselves importing files with sys.path.append('..').
I then decided to always append the root directory of the git repository. In one line it would look like this:
sys.path.append(git.Repo('.', search_parent_directories=True).working_tree_dir)
Or in more details like this:
import os
import sys
import git
def get_main_git_root(path):
    main_repo_root_dir = git.Repo(path, search_parent_directories=True).working_tree_dir
    return main_repo_root_dir
main_repo_root_dir = get_main_git_root('.')
sys.path.append(main_repo_root_dir)
For the original question: depending on what the root directory of the repository is, the import would be
import ptdraft.nib
or
import nib
Our folder structure:
/myproject
    project_using_ptdraft/
        main.py
    ptdraft/
        __init__.py
        nib.py
        simulations/
            __init__.py
            life/
                __init__.py
                life.py
The way I understand this is to have a package-centric view.
The package root is ptdraft, since it's the topmost level that contains __init__.py.
All the files within the package can use absolute paths (that are relative to package root) for imports, for example
in life.py, we have simply:
import ptdraft.nib
However, to run life.py for package dev/testing purposes, instead of python life.py, we need to use:
cd /myproject
python -m ptdraft.simulations.life.life
Note that we didn't need to fiddle with any path at all at this point.
Further confusion arises when we complete the ptdraft package and want to use it in a driver script, which is necessarily outside of the ptdraft package folder, e.g. project_using_ptdraft/main.py; there we would need to fiddle with paths:
import sys
sys.path.append("/myproject") # folder that contains ptdraft
import ptdraft
import ptdraft.simulations
and use python main.py to run the script without problem.
Helpful links:
https://tenthousandmeters.com/blog/python-behind-the-scenes-11-how-the-python-import-system-works/ (see how __init__.py can be used)
https://chrisyeh96.github.io/2017/08/08/definitive-guide-python-imports.html#running-package-initialization-code
https://stackoverflow.com/a/50392363/2202107
https://stackoverflow.com/a/27876800/2202107
Work with libraries.
Make a library called nib, install it using setup.py, let it reside in site-packages and your problems are solved.
You don't have to stuff everything you make into a single package. Break it up into pieces.
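A minimal sketch of what such a setup.py could look like for a single-file nib library (the version and metadata are placeholders):
# setup.py - minimal sketch for packaging nib.py on its own
from setuptools import setup

setup(
    name='nib',
    version='0.1.0',
    py_modules=['nib'],   # nib.py sits next to this setup.py
)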
I had a problem where I had to import a Flask application that itself had an import that also needed to import files in separate folders. This partially uses Remi's answer, but suppose we had a repository that looks like this:
.
└── service
    ├── misc
    │   └── categories.csv
    ├── test
    │   └── app_test.py
    ├── app.py
    └── pipeline.py
Then before importing the app object from the app.py file, we change the directory one level up, so when we import the app (which imports the pipeline.py), we can also read in miscellaneous files like a csv file.
import os,sys,inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parentdir = os.path.dirname(currentdir)
sys.path.insert(0,parentdir)
os.chdir('../')
from app import app
After having imported the Flask app, you can use os.chdir('./test') to change back, so that your working directory is not permanently changed.
It seems to me that you don't really need to import the parent module. Let's imagine that in nib.py you have func1() and data1, which you need to use in life.py:
nib.py
import simulations.life.life as life
def func1():
    pass
data1 = {}
life.share(func1, data1)
life.py
func1 = data1 = None
def share(*args):
    global func1, data1
    func1, data1 = args
And now you have access to func1 and data1 in life.py. Of course, you have to be careful that they are populated before you try to use them.
I made this library to do this.
https://github.com/fx-kirin/add_parent_path
# Just add the parent path
add_parent_path(1)

# Or append to sys.path and remove it again when the with statement exits
with add_parent_path(1):
    # Import modules in the parent path
    pass
This is the simplest solution that works for me:
from ptdraft import nib
After removing some sys.path hacks, I thought it might be valuable to add my preferred solution.
Note: this is a frame challenge - it's not necessary to do this in code.
Assuming a tree,
project
└── pkg
└── test.py
Where test.py contains
import sys, json; print(json.dumps(sys.path, indent=2))
Executing using the path only includes the package directory
python pkg/test.py
[
"/project/pkg",
...
]
But using the module argument includes the project directory
python -m pkg.test
[
"/project",
...
]
Now, all imports can be absolute, from the project directory. No further skullduggery required.
Although it is against all rules, I still want to mention this possibility:
You can first copy the file from the parent directory to the child directory. Next import it and subsequently remove the copied file:
for example in life.py:
import os
import shutil
shutil.copy('../nib.py', '.')
import nib
os.remove('nib.py')
# now you can use it just fine:
nib.foo()
Of course, several problems might arise when nib tries to import/read other files with relative imports/paths.
This works for me to import things from a higher folder.
import os
os.chdir('..')

How to import the module to test into the test module

The problem
I have a directory structure for my project which follows the standard for Python packages, as it was created with this cookiecutter template:
https://github.com/audreyr/cookiecutter-pypackage#quickstart
The directory structure is
project_name
├── project_name
│   ├── __init__.py
│   └── module1.py
└── tests
    └── test_module1.py
The first code line of test_module1.py is:
from project_name import module1
But I get a ModuleNotFoundError: No module named 'project_name'.
To my understanding, this should work since the folder called project_name is a package, which is ensured by the presence of the __init__.py file.
I have always had trouble understanding how imports like this work. For my projects I have always just settled with having my tests in the same folder as the modules to test. I know this is bad practice, but it was the only way I could get the modules to actually import.
What I already tried
I have tried renaming the folder with the __init__.py file to something else and then import, as I thought it could have something to do with the parent folder and the child folder both having the name project_name. This did not work, same error.
I also tried making the test folder into a package by creating an __init__.py file inside it, even though the Cookiecutter template does not have that.
I read in many places that making the test folder into a package is discouraged, but some suggest that structure. That did not work either.
I have searched thoroughly for solutions to this seemingly very standard problem, some of the links are here:
Python, importing modules for testing
https://gist.github.com/tasdikrahman/2bdb3fb31136a3768fac
Importing modules from parent folder
https://alex.dzyoba.com/blog/python-import/
Sibling package imports
Python imports for tests using nose - what is best practice for imports of modules above current package
My last try was to start a project with Cookiecutter, so everything would be set up properly from the beginning. However, I still get the ModuleNotFoundError.
What I don't want
I don't want to modify sys.path as many answers seem to suggest. There must be a cleaner way for such a common problem.
What am I doing wrong?
Edit for some additional info (see the question from @Nicholas):
The contents of __init__.py is
# -*- coding: utf-8 -*-
"""Top-level package for project_name."""
__author__ = """my_name"""
__email__ = 'my_email'
__version__ = '0.1.0'
Which was generated by the Cookiecutter template.
Inside test_module1, I added the following before the line where the ModuleNotFoundError occurs:
import sys
import os
print(sys.path)
print(os.getcwd())
sys.path prints a list, where the first element is the tests directory.
['c:\\Users\\...\\project_name\\tests',
'C:\\Users\\...\\Miniconda3\\python37.zip',
'C:\\Users\\...\\Miniconda3\\DLLs', 'C:\\Users\\...\\Miniconda3\\lib',
'C:\\Users\\...\\Miniconda3',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\win32',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\win32\\lib',
'C:\\Users\\...\\Miniconda3\\lib\\site-packages\\Pythonwin']
I don't know if the lowercase 'c' in the first element matters.
os.getcwd() prints the root directory 'c:\Users\....\project_name'. Also with a lowercase 'c'.
You should create a virtual environment and install the project in order for the test modules to correctly resolve import statements.
In the project root, i.e. the directory project_name which contains a subdirectory project_name and a subdirectory tests, create a setup.py (or pyproject.toml) file for the package metadata. See here for details about that part.
From this same project root directory which is now containing the installer (setup.py), create and activate a venv and install your project:
python3 -m venv .venv
source .venv/bin/activate     # linux/macOS
# .venv\Scripts\activate.bat  # windows
pip install --editable .
pip install pytest
pytest
If for some reason you don't want to create an installer for your project, you may run pytest like this, from the project directory:
python3 -m pytest
Unlike the bare pytest command, this will add the current working directory to sys.path allowing import statements to be resolved in tests.

ModuleNotFoundError: cannot import local file

I have a module with multiple files structured like this:
/bettermod/
├── __init__.py
├── api.py
├── bettermod.py
├── errors.py
└── loggers.py
From bettermod.py, I'm trying to import two things:
a class called API from api.py
the whole errors.py file
For the first thing, it is quite easy, I just have to do this:
from .api import API
However, for importing the whole errors.py file, I'm encountering a problem; I'm trying to do it like this:
from . import errors
which should work, according to this python documentation, but it's raising the following error:
File "/path/to/bettermod/bettermod.py", line 10, in <module>
from . import errors
ModuleNotFoundError: No module named 'bettermod'
Edit: when debugging, I found that __name__ was equal to bettermod.bettermod
From docs:
Note that relative imports are based on the name of the current module.
I cannot tell you what is wrong with certainty, but there is a smell: the bettermod package has a bettermod module. Do you want to do from bettermod.bettermod import MyBetterClass? I doubt it. In Python, files ARE namespaces, so choosing your file and directory names is also API design. Just keep that in mind.
I suspect the problem is masked by the name collision. Try this: if you run python in the bettermod directory and say import bettermod, you are importing bettermod\bettermod.py, and . is relative to the bettermod.py module. Now try to run python in the directory above the bettermod package directory. It will work, because now . resolves to the bettermod package.
Try:
import mymodule
mymodule.__file__
This will tell you what mymodule is. For packages, it will show the path to its __init__.py. This will help you orient yourself. Also look up PYTHONPATH and how you can use it to make sure you are importing from the right path.
