Python fails importing submodules from built wheel - python-3.x

I am trying to build a Python (3.7) module that has a structure similar to this:
module_dir/
├─ setup.py
├─ my_module/
│ ├─ utils/
│ │ ├─ common.py
│ │ ├─ __init__.py
│ ├─ __init__.py
│ ├─ dao/
│ │ ├─ base_dao.py
│ │ ├─ dwh/
│ │ │ ├─ dwh_dao.py
│ │ │ ├─ __init__.py
│ │ ├─ __init__.py
But I am having some issues: when I import from this module in other projects, I get an ImportError.
In particular, my dwh_dao.py file contains the following import:
from dao.base_dao import BaseDAO
This seems to be what makes the import fail; if I replace it with a relative import, it works:
from ..dao.base_dao import BaseDAO
So far so good, but from dwh_dao.py I also need to reach utils.common, and as far as I know I cannot do that with a relative import.
Therefore my questions are:
Is there any way to go up more than one level in a module using relative imports?
Why does my library fail to resolve absolute imports, forcing me to replace them with relative ones?
EDIT1:
Including my setup.py as well
import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

with open('requirements.txt') as f:
    requirements = f.read().splitlines()

setuptools.setup(
    name="my_module",
    version="1",
    author="",
    author_email="",
    description="ACMECorp Internal Shared Library",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
    ],
    install_requires=requirements
)
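For reference, a quick way to check which packages find_packages() will pick up (and therefore which absolute import roots exist after installation) is to run it from module_dir. This is only a diagnostic sketch; the exact output depends on which folders contain an __init__.py:
import setuptools

# With the layout above (every folder has an __init__.py), this should print
# something like ['my_module', 'my_module.dao', 'my_module.dao.dwh', 'my_module.utils'].
print(setuptools.find_packages())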
EDIT2:
If I use absolute imports also in my library's dwh_dao.py file, like
from my_module.dao.base_dao import BaseDAO
Then it works, but why would it have been fine in a regular project to use
from dao.base_dao import BaseDAO
while in a library project I also have to include the module name in the import itself?

why would it have been fine in a regular project to use from dao.base_dao import BaseDAO, while in a library project I also have to include the module name in the import itself?
Python has a package hierarchy. When you package a project in the standard way (this is why I wanted to see your setup.py), the main package is my_module, and the pip installer makes sure it can be found on Python's sys.path. This is why importing my_module or my_module.dao works.
Now, when you run your code directly from the command line or an IDE, things might be different. This mostly depends on your working directory and PYTHONPATH.
If you want to simulate exactly what will happen after pip install, do the following:
Open a Python interpreter in the project folder (module_dir in your case).
Make sure your PYTHONPATH is empty (it should be).
Try the import from that interpreter.
You will see that import dao does not work.
However, if you do the same from the my_module folder (which is probably what you did), import dao will work. This happens because the current folder is implicitly on sys.path.
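A small diagnostic sketch of that check: run it once from module_dir (or after pip install) and once from inside my_module, and compare which names resolve. The list of names is only illustrative and assumes the layout from the question.
import importlib.util

for name in ("dao", "my_module", "my_module.dao", "my_module.utils.common"):
    try:
        found = importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        found = False
    # From module_dir: the my_module.* names resolve, bare `dao` does not.
    # From inside my_module: bare `dao` resolves instead.
    print(f"{name}: {'importable' if found else 'not importable'}")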

Related

Unable to import flask app folder in unittest folder of an repo

I am facing issues writing unit tests for my Flask app. The exact issue is that the test files in the unit-test directory are not able to import files from the app folder.
My directory structure:
.
├── Dockerfile
├── README.md
├── app
│ ├── __init__.py
│ ├── api.py
│ └── wsgi.py
├── docker
│ ├── docker-compose.yml
│ └── start.sh
├── requirements.txt
└── unit-test
├── __init__.py
└── test_api.py
Code in unit-test/test_api.py:
import unittest
from app import api
Absolute import throws this error:
from app import api
ModuleNotFoundError: No module named 'app'
I have tried the following after checking a few resources on absolute and relative imports; none of them worked.
from .. import app
error:
from .. import app
ImportError: attempted relative import with no known parent package
I checked a few questions on SO and someone recommended having an __init__.py file in the unit-test folder as well, but I already have a blank __init__.py file there.
I have reviewed numerous blogs and YouTube tutorials, and absolute imports work for them easily.
Please advise how to fix this error and run unit tests.
import sys
sys.path.append('/path/to/the/required/folder')
import name_of_file
If a package is not installed, Python only searches the directories already on sys.path (which, when you run a script directly, includes that script's own directory), so we have to add the required folder's path ourselves; once it is added, the import works.
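Applied to the layout in the question, a minimal sketch of unit-test/test_api.py could look like the following. The path computation assumes the test file sits one level below the repository root, and the sample test is only a placeholder:
# unit-test/test_api.py (sketch)
import os
import sys
import unittest

# Put the repository root (the folder that contains `app/`) on sys.path,
# one level above this test file, so `from app import api` resolves.
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from app import api  # noqa: E402


class TestApi(unittest.TestCase):
    def test_api_module_is_importable(self):
        self.assertTrue(hasattr(api, "__name__"))


if __name__ == "__main__":
    unittest.main()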

ModuleNotFoundError when trying to import from a sub package in python

I have the following layout:
└── folder_1
└── __init__.py
└── level_1.py
└── folder_2
└── __init__.py
└── level_2.py
└── test
└── __init__.py
└── test_in.py
└── test_out.py
└── setup.py
And in my setup.py I specify packages=["folder_1", "test"]
In the test_out.py file I can import from level_1.py and level_2.py with no problem, but for some reason in the test_in.py file I can only import from level_1.py. If I try to import from level_2.py I get the error
ModuleNotFoundError: No module named 'folder_1.folder_2'
I also get errors if I try to do the import in the __init__.py of folder_1.
I'm running this in JupyterLab and can't find a way to make this work. Is there a way to fix this without using PYTHONPATH or things like that?
EDIT:
Just found out that if I add folder_1.folder_2 to the packages list in setup.py it works, but I'm not sure whether this is the correct way to fix it.
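Listing every sub-package by hand does work, but letting setuptools discover them is less error-prone. A minimal sketch of the relevant part of setup.py, assuming every folder that should be importable contains an __init__.py (the project name and version here are placeholders):
import setuptools

setuptools.setup(
    name="folder_1",   # placeholder project name
    version="0.1",     # placeholder version
    # find_packages() walks the tree and, for the layout above, returns
    # something like ["folder_1", "folder_1.folder_2", "test"].
    packages=setuptools.find_packages(),
)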

Why is pylance unable to resolve the local relative path import (VSCode)?

Environment:
Python 3.6.9
VSCode
Pylance
The structure of my workspace in VSCode is:
.../sync
├── README.rm
├── script-program-1
│ ├── lib
│ │ ├── __init__.py
│ │ └── queries.py
│ └── sync.py
├── lib
│ ├── __init__.py
│ ├── config.yaml
│ ├── logging.yaml
│ └── sync_config.py
└── requirements.txt
I've added "python.analysis.extraPaths": ["./lib"] to settings.json located in sync\.vscode.
The issue is that within the sync.py script I'm having trouble importing lib.queries.
If I try from .lib.queries import *, Pylance doesn't complain, but at runtime it fails with attempted relative import with no known parent package.
If I instead use from lib.queries import *, python3 works as intended, but Pylance complains with Import "lib.queries" could not be resolved Pylance(reportMissingImports).
What can I change so that the import works AND pylance doesn't have an issue with it?
Update
If I add ./script-program-1 to python.analysis.extraPaths in settings.json, the problem goes away. The issue I have with this is that this project could have hundreds of "script-program-X" folders, and all of them may have their own local import files. Updating this extraPaths variable for every one of them, on every VSCode machine I work with, is not ideal.
I've tried using a variable like "./${relativeFileDirname}" as well as wildcards like "./**" and "./*", but none of these work.
I know that the issue is caused because the workspace root is the sync folder and not script-program-1 but this is how I want it set up. Anyone have a solution to this?

Unable to import a module from a sub-package into another sub-package

I am unable to import a module from a different package.
The module connection.py is in a package instance_connector, and the module record_parameters.py is in a different package called instance_parameters.
Both of these packages are sub-packages of a package called snowflake.
Here is the tree diagram of directory structure.
snowflake
├── __init__.py
├── instance_connector
│ ├── __init__.py
│ └── connection.py
└── instance_parameters
├── __init__.py
├── load_parameters.py
├── modals.py
└── record_parameters.py
I am trying to import the module connection.py into record_parameters.py like this:
record_parameters.py
from snowflake.instance_connector.connection import SnowflakeConnector
When I run this file from the terminal using the command python record_parameters.py, it returns the error below:
Traceback (most recent call last):
File "record_parameters.py", line 3, in <module>
from snowflake.instance_connector.connection import SnowflakeConnector
ModuleNotFoundError: No module named 'snowflake.instance_connector'
Where am I going wrong?
Have you tried appending the path of the folder that contains the snowflake package to sys.path in record_parameters.py?
import sys
sys.path.append('xyz')  # 'xyz' is the folder that contains the `snowflake` package
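A sketch of how this could look at the top of record_parameters.py, following the tree in the question; the three dirname calls climb from the file up to the folder that contains the snowflake package:
# snowflake/instance_parameters/record_parameters.py (sketch)
import os
import sys

# The folder that contains the `snowflake` package is two levels above
# this file's directory, so climb three dirname calls from __file__.
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))

from snowflake.instance_connector.connection import SnowflakeConnector  # now resolves
Alternatively, running the file as a module from the folder above snowflake (python -m snowflake.instance_parameters.record_parameters) avoids touching sys.path at all.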

Jupyter ImportError: cannot import name

I have the following project structure:
path_to_dir/
│
├── a_notebook.ipynb
└── testCases_v2.py
In the .py file I have defined several functions.
I'm trying to import them in the notebook like this:
from testCases_v2 import layer_sizes_test_case
However, I am getting:
ImportError: cannot import name 'layer_sizes_test_case' from 'testCases_v2' (/path_to_dir/testCases_v2.py)
I even tried adding the directory to the system path:
import os
import sys
module_path = os.path.abspath(os.path.join('path_to_dir'))
if module_path not in sys.path:
sys.path.append(module_path)
But the problem persists.
How can I solve this? (And yes, I checked that the name of the function I am importing is spelled correctly.)
Let's say you have this:
path_to_dir/
│
├── a_notebook.ipynb
└── tests
|
└── testCases_v2.py
And in testCases_v2 you have a class with the same name.
You have to manually add the paths of the modules you want Jupyter to take into account to sys.path. In this case, the module is 'tests' and the path you have to include is 'path_to_dir':
import sys
sys.path.insert(0, 'path/to/dir')
In the folder where your .py files are, add an empty __init__.py:
path_to_dir/
│
├── a_notebook.ipynb
└── tests
|
├──testCases_v2.py
└── __init__.py
You should be able to import it now by doing:
from tests.testCases_v2 import testCases_v2
The Python documentation about modules might come in handy.
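Putting the pieces together, a single notebook cell along these lines should then work; the path string is a placeholder for the real location of path_to_dir, and layer_sizes_test_case is the function from the question:
# a_notebook.ipynb (sketch of one cell)
import sys

# Make the folder that contains the `tests` package importable.
sys.path.insert(0, "path/to/dir")  # placeholder path

from tests.testCases_v2 import layer_sizes_test_case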
