I'm trying to import another Python file into a program. This is my directory structure:
+root
|-- train.py
|--+src
|  |--layers.py
|  |--mccnn.py
Currently my train.py file uses this
import src.mccnn as mccnn
import src.layers as L
And the mccnn.py file uses
import layers as L
When I run train.py, I get the error "No module named 'layers'", and in the traceback I can see it's raised in the mccnn.py file.
I'm running the Python file from the root folder. I can't figure out why this throws an error. None of the answers I could find helped solve the issue.
PS: the original code was written for Python 2.7. Not sure if that's relevant.
Follow a structure like this:
+root
|--src
|  |--<package_name>
|     |--__init__.py
|     |--layers.py
|     |--mccnn.py
|     |--train.py
Then, in mccnn.py, you can just say
import layers as L
and in train.py you can say
import mccnn as mccnn
import layers as L
More details can be seen here https://packaging.python.org/tutorials/packaging-projects/
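Alternatively, here is a minimal sketch (my addition, not part of the answer above) that keeps the question's original root/src layout: add an empty src/__init__.py and make mccnn.py import its sibling through the src package, then run train.py from the root folder.
# src/__init__.py  (empty; marks src as a package)

# src/mccnn.py -- import the sibling module through the package so it
# resolves when train.py is executed from the root directory
from src import layers as L

# root/train.py -- unchanged from the question
import src.mccnn as mccnn
import src.layers as L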
The file structure for a module I'm creating goes as follows:
PIV
| __init__.py
| base.py
| core.py
| exceptions.py
.gitignore
LICENSE
requirements.txt
But whenever I run a file like core.py, I get the following error:
Traceback (most recent call last):
File "c:/Users/ghub4/OneDrive/Desktop/Python-Image-and-Video-tools/PIV/core.py", line 33, in <module>
from . import base
ImportError: attempted relative import with no known parent package
The same thing happens when I run the __init__.py file. I'm not sure what went wrong, because all of the Python files are in the same folder. Can someone clarify what the problem is and explain how I should fix it?
Import code for core.py file:
from __future__ import absolute_import
import sys
import os
from PIL import Image
import io
from . import base
from . import exceptions
(The __init__.py file has the same relative imports as the core file, but it also includes from . import core.)
Based upon the two links given below, here is what's needed to solve the problem:
Relative Imports error rectified
No module named base
You need to import from the package like this:
from mymodule import some_useful_method
Sometimes we still get the "no module" error; in this case, we can import like this:
from module_name.classname import some_useful_method
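As an additional concrete sketch for the PIV layout in the question (my addition, assuming you launch Python from the directory that contains PIV): keep the relative imports and run core.py as a module of the package rather than as a standalone script.
# PIV/core.py -- relative imports only work when core.py runs as part of
# the PIV package, e.g.:  python -m PIV.core  (from PIV's parent directory)
from . import base
from . import exceptions

# equivalent absolute imports, which also need PIV's parent directory on sys.path:
# from PIV import base
# from PIV import exceptions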
I have a small pip-package (let's call it my_package) that I wrote in python3 with the following directory structure. I am confused as to a discrepancy I'm seeing in running my_package.py locally vs when I'm testing it by downloading it from PyPI, importing it into some other code, and then running it.
.
| README.md
| LICENSE
| setup.py
| build
| dist
| my_package
| -- __init__.py
| -- my_package.py
| -- helpers
| ---- __init__.py
| ---- helper1.py
| ---- helper2.py
| ---- helper3.py
| ---- helper4.py
In my_package.py I have the following imports:
from helpers import helper1
from helpers import helper2
from helpers import helper3
from helpers import helper4
Obviously these are just filler names, but the point remains that I am trying to import some code from the helpers directory from the my_package.py script.
If I were to run my_package.py locally my code executes without any issue - I think this is the expected behavior for python3. However, if I upload this to PyPI and then import the package, I receive the following error:
Traceback (most recent call last):
File "test.py", line 1, in <module>
import my_package
File "/Users/fakeUser/.virtualenvs/pip-testing/lib/python3.7/site-packages/my_package/__init__.py", line 1, in <module>
from . my_package import main_function
File "/Users/fakeUser/.virtualenvs/pip-testing/lib/python3.7/site-packages/my_package/my_package.py", line 6, in <module>
from helpers import helper1
ModuleNotFoundError: No module named 'helpers'
To resolve this issue, I modified the imports in my_package.py to look like this:
from .helpers import helper1
from .helpers import helper2
from .helpers import helper3
from .helpers import helper4
As far as I understand, Python 3 uses the . to help resolve relative imports. That made sense to me to try, because if I am running my_package.py, adding the . should make it clear that the helpers dir is in the same directory as my_package.py. Making this modification does in fact resolve the issue for the package downloaded from pip, but it now introduces the following issue when I run the code locally:
Traceback (most recent call last):
File "my_package.py", line 6, in <module>
from .helpers import helper1
ModuleNotFoundError: No module named '__main__.helpers'; '__main__' is not a package
I am trying to understand what's going on here. In particular, could someone explain the following:
Why does adding the . make the code incompatible for local use?
Why does removing the . make the code incompatible for use from pip?
I really want to understand why these imports aren't working to avoid similar issues in the future.
To begin with, read up on modules.
Then start by using the following pattern:
my_package
| README.md
| LICENSE
| setup.py
| build
| dist
| src
| -- my_package
| ---- __init__.py
| ---- helpers
| ------ __init__.py
| ------ helper1.py
| ------ helper2.py
| ------ helper3.py
| ------ helper4.py
You can leave the top-level __init__.py empty for now, and fill in the inner __init__.py according to what your helperX.py files look like. Then, once you install the module, you can import helper1 accordingly, e.g.
from my_package.helpers import helper1
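A minimal setup.py sketch for that src/ layout (my addition; the name and version below are placeholders):
# setup.py -- minimal sketch; adjust name, version and metadata for your project
from setuptools import setup, find_packages

setup(
    name="my_package",
    version="0.1.0",
    package_dir={"": "src"},              # packages live under src/
    packages=find_packages(where="src"),
)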
Original poster here:
The issue was caused by the fact that pip installs the package at the my_package level, so the imports have to be set up as either .helpers or my_package.helpers, whereas running the script my_package.py directly does not install the package, so the imports need to be written differently.
I will mark this as the correct answer ASAP (I believe that will be tomorrow).
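For anyone hitting the same discrepancy: one way to keep the package-relative imports and still run the file locally (a sketch, assuming the flat layout from the question) is to execute it as a module of the package instead of as a script.
# my_package/my_package.py -- keep the package-relative imports
from .helpers import helper1
from .helpers import helper2

# then run it locally from the project root as a module, not as a file:
#   python -m my_package.my_package
# so Python knows the parent package and the relative imports resolve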
I am trying to import a module from another directory and run the script using python-mode. I am encountering a "module not found" error, but the module is present at that location and my sys.path shows the module path has been added successfully. I am having a hard time troubleshooting/fixing this. Could someone shed some light on it?
import numpy as np
import sys
sys.path.append('./extnsn/')
from extnsn import FX
The error stack is:
Feat/mFeat/feat_Xt_v2.py|7 error| in from extnsn import FX
ImportError: No module named 'extnsn'
My directory structure is:
Feat
|
|--mFeat
|
|--feat_Xt_v2.py
|
|--extnsn
|
|--__init__.py
|--FX.py
The extnsn directory has an __init__.py with the following:
from extnsn import FX
FX.py is the module name, for information.
sys.path contains the appended path ./extnsn/ as the last entry in the list.
What makes me conclude this is not a path issue is that the program runs fine if executed from Atom with the script plugin.
Any help is much appreciated.
EDIT:
This doesn't seem to be an issue with just python-mode, but rather with the way vim invokes the Python interpreter and executes the buffer. I tried the following command without python-mode and the issue is the same.
To import a module or a package, you have to add its parent directory to sys.path. In your case, since you've added ./extnsn/ to sys.path, you cannot import extnsn (it cannot be found on sys.path), but you can import FX directly:
import FX
But since FX seems to be a module in the package extnsn, you'd better add the parent directory of extnsn, i.e. Feat, to sys.path:
sys.path.append('../Feat')
from extnsn import FX
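A slightly more robust sketch (my addition, assuming the Feat layout shown above) builds the path from the script's own location instead of the current working directory, so it works no matter where vim or the shell was started from.
# mFeat/feat_Xt_v2.py -- add Feat (the parent of extnsn) to sys.path,
# computed relative to this file rather than the working directory
import os
import sys

sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))

from extnsn import FX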
I'm trying to run multiple crawlers as described in the docs. The script works when run with scrapy crawl crawler.
Now, when calling it via python crawler.py, I get the following error:
from crawler.items import LinkItem
ModuleNotFoundError: No module named 'crawler.items'; 'crawler' is not a package
import scrapy
from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor
from scrapy.crawler import CrawlerProcess
from crawler.items import LinkItem
from crawler.settings import DB_CREDS
Does anyone know how to handle this?
My project is named crawler. Could this be a problem?
Hierarchy:
Crawler
|-crawler
| |-__pycache__
| | |-...
| |-spiders
| | |-__pycache__
| | |-__init__.py
| | |-crawler.py
| |-__init__.py
| |-items.py
| |-middlewares.py
| |-pipelines.py
| |-settings.py
|-scrapy.cfg
I think this is a Python path problem. You run your script with the Python interpreter instead of Scrapy, so you have to make sure Python knows where your package is. If you run it as a Scrapy command, then Scrapy takes care of this via scrapy.cfg.
The easiest way to solve it is to append your project path to PYTHONPATH. For example export PYTHONPATH=YOUR_PROJECT_PATH:$PYTHONPATH. You may need to add this to your .bashrc file.
There are also other solutions, e.g., packaging your project for distribution and installing it into Python's site-packages.
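As a sketch of the same idea done inside the script (my addition, assuming the Crawler/crawler hierarchy above): because the spider file shares its name with the crawler package, the project root has to come before the script's own directory on sys.path.
# crawler/spiders/crawler.py -- make the project root importable when the
# spider is started with `python crawler.py` instead of `scrapy crawl`
import os
import sys

# Crawler/ (the folder holding scrapy.cfg) is two levels above this file;
# insert it at position 0 so the crawler *package* wins over this crawler.py module
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..')))

from crawler.items import LinkItem
from crawler.settings import DB_CREDS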
Thanks.
I run into problems when using a project structure as suggested here: What is the best project structure for a Python application?
Imagine a project layout like this:
Project/
|-- bin/
| |-- project.py
|
|-- project/
| |-- __init__.py
| |-- foo.py
In bin/project.py I want to import from package project.
#project.py
from project import foo
Since sys.path[0] is always Project/bin when running bin/project.py, it tries to import the module bin/project.py (itself), resulting in an attribute error. Is there a way to use this project layout without playing around with sys.path in the module bin/project.py? I basically would need an "importpackage" statement that ignores modules with the same name.
Since this project structure is widely suggested, I'm wondering why no one else seems to have this kind of problem...
You could try:
import imp  # note: the imp module is deprecated (removed in Python 3.12)

# load_source needs a file path, so point it at the package's __init__.py
module_name = imp.load_source('project', '../project/__init__.py')
module_name would be the package.
EDIT:
For Python 3.3+:
import importlib.machinery

# SourceFileLoader also needs a file path; point it at the package's __init__.py
loader = importlib.machinery.SourceFileLoader("project", "../project/__init__.py")
project = loader.load_module("project")
Source
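For completeness, here is a sketch of the same idea with importlib.util (Python 3.5+, my addition) that also lets submodules such as project.foo resolve; the relative path is an assumption based on the layout in the question.
# bin/project.py -- load the sibling "project" package from an explicit path
import importlib.util
import os
import sys

pkg_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'project'))
spec = importlib.util.spec_from_file_location(
    'project',
    os.path.join(pkg_dir, '__init__.py'),
    submodule_search_locations=[pkg_dir],
)
project = importlib.util.module_from_spec(spec)
sys.modules['project'] = project      # register before executing so submodule imports work
spec.loader.exec_module(project)

from project import foo               # now resolves against the registered package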