Python3 import - running local vs from pip discrepancy - python-3.x

I have a small pip-package (let's call it my_package) that I wrote in python3 with the following directory structure. I am confused as to a discrepancy I'm seeing in running my_package.py locally vs when I'm testing it by downloading it from PyPI, importing it into some other code, and then running it.
.
| README.md
| LICENSE
| setup.py
| build
| dist
| my_package
| -- __init__.py
| -- my_package.py
| -- helpers
| ---- __init__.py
| ---- helper1.py
| ---- helper2.py
| ---- helper3.py
| ---- helper4.py
In my_package.py I have the following imports:
from helpers import helper1
from helpers import helper2
from helpers import helper3
from helpers import helper4
Obviously these are just filler names, but the point remains that I am trying to import some code from the helpers directory from the my_package.py script.
If I run my_package.py locally, my code executes without any issue - I believe this is the expected behavior for python3. However, if I upload this to PyPI and then import the package, I receive the following error:
Traceback (most recent call last):
File "test.py", line 1, in <module>
import my_package
File "/Users/fakeUser/.virtualenvs/pip-testing/lib/python3.7/site-packages/my_package/__init__.py", line 1, in <module>
from . my_package import main_function
File "/Users/fakeUser/.virtualenvs/pip-testing/lib/python3.7/site-packages/my_package/my_package.py", line 6, in <module>
from helpers import helper1
ModuleNotFoundError: No module named 'helpers'
To resolve this issue, I modified the imports in my_package.py to look like this:
from .helpers import helper1
from .helpers import helper2
from .helpers import helper3
from .helpers import helper4
As far as I understand, python3 uses the . to resolve relative imports. That made sense to try, because if I am running my_package.py, adding the . should make it clear that the helpers dir is in the same directory as my_package.py. Making this modification does in fact resolve the issue for the package downloaded from pip, but it introduces the following issue when I run the code locally:
Traceback (most recent call last):
File "my_package.py", line 6, in <module>
from .helpers import helper1
ModuleNotFoundError: No module named '__main__.helpers'; '__main__' is not a package
I am trying to understand what's going on here. In particular if someone could explain the following:
Why does adding the . make the code incompatible for local use?
Why does removing the . make the code incompatible for use from pip?
I really want to understand why these imports aren't working to avoid similar issues in the future.

To begin with, read up on modules.
Then start by using the following pattern:
my_package
| README.md
| LICENSE
| setup.py
| build
| dist
| src
| -- my_package
| ---- __init__.py
| ---- helpers
| ------ __init__.py
| ------ helper1.py
| ------ helper2.py
| ------ helper3.py
| ------ helper4.py
You can leave the top-level __init__.py empty for now and fill in the inner __init__.py according to what your helperX.py files contain. Then, once you install the module, you can import helper1 accordingly, e.g.
from my_package.helpers import helper1
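As a runnable sketch of that pattern (the file contents below are placeholders, not the asker's real code, and putting the directory on sys.path stands in for an actual pip install):

```python
import os
import sys
import tempfile

# Build the recommended layout in a temporary directory.
root = tempfile.mkdtemp()
helpers = os.path.join(root, "my_package", "helpers")
os.makedirs(helpers)
open(os.path.join(root, "my_package", "__init__.py"), "w").close()  # top level: empty
with open(os.path.join(helpers, "__init__.py"), "w") as f:
    f.write("from . import helper1\n")           # inner: re-export submodules
with open(os.path.join(helpers, "helper1.py"), "w") as f:
    f.write("def greet():\n    return 'helper1'\n")

sys.path.insert(0, root)      # stands in for "pip install"
from my_package.helpers import helper1           # the package-qualified import
print(helper1.greet())        # helper1
```

Because the import is always qualified with the package name, it resolves the same way whether the package sits in site-packages or anywhere else on sys.path.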

Original poster here:
The issue was caused by the fact that pip installs the package at the level of my_package, so the imports must be written either as .helpers or my_package.helpers. Running the script my_package.py directly does not involve the installed package, so the imports need to be written differently.
I will mark this as the correct answer ASAP (I believe that will be tomorrow).
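That difference can be reproduced in a few lines. The sketch below (placeholder file contents, not the real package) rebuilds the layout in a temporary directory and runs my_package.py both as a module and as a plain script:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as root:
    helpers = os.path.join(root, "my_package", "helpers")
    os.makedirs(helpers)
    open(os.path.join(root, "my_package", "__init__.py"), "w").close()
    open(os.path.join(helpers, "__init__.py"), "w").close()
    with open(os.path.join(helpers, "helper1.py"), "w") as f:
        f.write("VALUE = 1\n")
    with open(os.path.join(root, "my_package", "my_package.py"), "w") as f:
        f.write("from .helpers import helper1\nprint(helper1.VALUE)\n")

    # With -m the interpreter knows the parent package, so "." resolves.
    as_module = subprocess.run(
        [sys.executable, "-m", "my_package.my_package"],
        cwd=root, capture_output=True, text=True)
    # Run directly, the file is just a script: no parent package, "." fails.
    as_script = subprocess.run(
        [sys.executable, os.path.join(root, "my_package", "my_package.py")],
        cwd=root, capture_output=True, text=True)

print(as_module.returncode)   # 0 (stdout is "1")
print(as_script.returncode)   # non-zero: attempted relative import fails
```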

Related

Can't import relative module

The file structure for a module I'm creating is as follows:
PIV
| __init__.py
| base.py
| core.py
| exceptions.py
.gitignore
LICENSE
requirements.txt
But whenever I run a file like core.py, I get the following error:
Traceback (most recent call last):
File "c:/Users/ghub4/OneDrive/Desktop/Python-Image-and-Video-tools/PIV/core.py", line 33, in <module>
from . import base
ImportError: attempted relative import with no known parent package
The same thing happens when I run the __init__.py file. I'm not sure on what went wrong because all of the python files are in the same folder. Can someone clarify what's the problem and explain how I should fix it?
Import code for core.py file:
from __future__ import absolute_import
import sys
import os
from PIL import Image
import io
from . import base
from . import exceptions
(The __init__.py file has the same relative imports as core.py, but also includes from . import core.)
Based on the two links given below, here is what was needed to solve the problem:
Relative Imports error rectified
No module named base
You need to import from the package like this:
from mymodule import some_useful_method
Sometimes we still get a "no module" error; in that case, we can qualify the import further:
from module_name.classname import some_useful_method

can't import a script from the same module

This situation may be related to python's configuration. My OS is OSX 10.14.6
Here is my directory tree:
code
|--- main.py
|--- module
|--- __init__.py
|--- core.py
|--- util.py
In main.py
from module import core
In core.py
import util
This works (python2):
python main.py
And this does not:
python3 main.py
Error:
Traceback (most recent call last):
File "main.py", line 1, in <module>
from module import core
File "/code/module/core.py", line 1, in <module>
import util
ModuleNotFoundError: No module named 'util'
The solution I can give you is replace
import util
with
from . import util
I think this is because Python 3 made all bare imports absolute (PEP 328): import util only searches sys.path, so it no longer finds a sibling module implicitly the way Python 2 did. The sibling has to be named explicitly with from . import util.
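A minimal reproduction of both variants (placeholder file contents, built in a temporary directory):

```python
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "module")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "util.py"), "w") as f:
    f.write("NAME = 'util'\n")
with open(os.path.join(pkg, "core.py"), "w") as f:
    f.write("from . import util\n")       # the fixed, explicit relative import
with open(os.path.join(pkg, "core2.py"), "w") as f:
    f.write("import util\n")              # the Python-2-style bare import

sys.path.insert(0, root)
from module import core                   # "." resolves inside the package
print(core.util.NAME)                     # util

err = None
try:
    from module import core2              # bare import searches sys.path only
except ModuleNotFoundError as e:
    err = e
print(err)                                # No module named 'util'
```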

Cannot import another python file in directory

Trying to import another python file into a program. This is my directory structure:
+root
|-- train.py
|--+src
|  |--layers.py
|  |--mccnn.py
Currently my train.py file uses this
import src.mccnn as mccnn
import src.layers as L
And the mccnn.py file uses
import layers as L
When I run train.py, I run into the error "No module named 'layers'", and in the traceback I can see it's raised in the mccnn.py file.
I'm running the python file in root folder. I can't figure out why this is throwing an error. None of the answers I could find helped solve the issue.
PS: the original code was written for Python 2.7. Not sure if it's relevant information.
Follow this structure:
+root
|-- src
|---- <package_name>
|------ __init__.py
|------ layers.py
|------ mccnn.py
|------ train.py
Then, in mccnn.py, you can just say
import layers as L
and in train.py you can say
import mccnn
import layers as L
More details can be seen here https://packaging.python.org/tutorials/packaging-projects/
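A stripped-down sketch of why this works (hypothetical file contents, no setup.py): when train.py is executed directly, its own directory goes on sys.path, so the bare imports between the side-by-side files resolve.

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "layers.py"), "w") as f:
        f.write("NAME = 'layers'\n")
    with open(os.path.join(d, "mccnn.py"), "w") as f:
        f.write("import layers as L\nTAG = L.NAME\n")
    with open(os.path.join(d, "train.py"), "w") as f:
        f.write("import mccnn\nprint(mccnn.TAG)\n")

    # Running the script puts its directory on sys.path, so the bare
    # "import layers" inside mccnn.py finds the sibling file.
    out = subprocess.run([sys.executable, "train.py"], cwd=d,
                         capture_output=True, text=True)

print(out.stdout.strip())   # layers
```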

Python3 Relative Import Is not Working

I'm new to python3. I'm trying to run the lark examples (http://github.com/lark-parser/lark) in development mode, but I'm blocked on a relative-import problem.
lark
| examples
| | conf_lalr.py
| lark
| | lark.py
| tools
| | common.py
In conf_lalr.py, there's a line:
from lark import Lark
Since I want to use relative imports, I tried updating it with the methods below:
1. from ..lark.lark import Lark
Traceback (most recent call last):
File "conf_lalr.py", line 16, in <module>
from ..lark.lark import Lark
ValueError: attempted relative import beyond top-level package
2. from .lark.lark import Lark
Traceback (most recent call last):
File "conf_lalr.py", line 16, in <module>
from .lark.lark import Lark
ModuleNotFoundError: No module named '__main__.lark'; '__main__' is not a package
I searched through lots of answers on the internet, including Stack Overflow. However, none of them work.
Can someone tell me why?
You are missing __init__.py files, which make the folders Python packages.
Also, for the first part, see this or this.
To run the examples, you should do the following:
~$ cd lark
~/lark$ python -m examples.conf_lalr
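A small probe (hypothetical names pkg/mod.py) shows why -m matters: the interpreter only sets __package__, which relative imports resolve against, when the file is run as a module.

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as root:
    pkg = os.path.join(root, "pkg")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "mod.py"), "w") as f:
        f.write("print(repr(__package__))\n")

    def run(cmd):
        return subprocess.run(cmd, cwd=root, capture_output=True,
                              text=True).stdout.strip()

    as_module = run([sys.executable, "-m", "pkg.mod"])
    as_script = run([sys.executable, os.path.join(pkg, "mod.py")])

print(as_module)   # 'pkg'  -> "." has a parent package to resolve against
print(as_script)   # None (or '') -> no parent package, so "." fails
```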

Python module not recognizing files in the same folder

I've made (or at least tried to make) a package using setuptools and attempted to use it from another Python file. However, the modules within the package don't seem to recognize each other.
Tree
pkg
|-- pkg
| |-- __init__.py
| \-- module.py
\-- setup.py
__init__.py:
import module
#code
pyfile.py
import pkg
#code
When I attempt to run pyfile.py, I get
Traceback (most recent call last):
File "/.../py/pyfile.py", line 1, in <module>
import pkg
File "/.../pkg/pkg/__init__.py", line 1, in <module>
import module
ModuleNotFoundError: No module named 'module'
It works fine if I write import pkg.module, but I don't see why such self-referential code would be practical.
Change the import in your __init__ to
from . import module
You can read more about intra-package references in the python documentation.
(BTW, as far as I can tell, setuptools is not involved here.)
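A compact sketch of the fix (placeholder file contents): with from . import module in __init__.py, the sibling resolves no matter where the package lives on sys.path.

```python
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "pkg")
os.makedirs(pkg)
with open(os.path.join(pkg, "module.py"), "w") as f:
    f.write("def ping():\n    return 'pong'\n")
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import module\n")   # intra-package reference

sys.path.insert(0, root)
import pkg                 # __init__.py runs the relative import
print(pkg.module.ping())   # pong
```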
