PyCharm unresolved reference highlighting - python-3.x

I'm having difficulty understanding how PyCharm resolves references.
I have a project structure like this:
ProjectFolder/
    main.py
    libraries/
        my_library.py
        sublibraries/
            my_sublibrary.py
Basically, libraries is a folder that contains some files (libraries) and some subfolders for better structuring of the libraries.
main.py imports from libraries and sublibraries without any problem.
my_library.py imports some classes from sublibraries:
# inside my_library.py
from sublibraries.my_sublibrary import MyClass
However, PyCharm highlights sublibraries in red with the error
Unresolved reference 'sublibraries'
This doesn't actually affect execution; the program runs fine, but the red highlighting is annoying.
I understand that PyCharm treats ProjectFolder as the current folder, so it cannot resolve sublibraries, since there's a folder (libraries) in between.
I tried a couple of things:
I can solve this with Mark Directory as Sources Root, but I'm not sure what that actually does, and since I'm not committing .idea files, whoever pulls the repository gets the same highlighting problem and has to do the same thing manually.
I also tried doing a relative import
# inside my_library.py
from .sublibraries.my_sublibrary import MyClass
This solves the highlighting problem, but the program then gives an error:
ModuleNotFoundError: No module named '__main__.sublibraries'; '__main__' is not a package
I'm not sure what the best approach is to make such a simple file structure work, be distributable via git, and have correct highlighting.
Should I commit .idea files so the others can import pycharm settings and use the Mark as Sources Root option?

You can create empty __init__.py files in the libraries/ and sublibraries/ folders, which will turn these folders into (sub)packages. This will make the relative import work (and remove the highlighting).
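With those files added, the layout looks like this, and the relative import from .sublibraries.my_sublibrary import MyClass inside my_library.py resolves:

```
ProjectFolder/
    main.py
    libraries/
        __init__.py
        my_library.py
        sublibraries/
            __init__.py
            my_sublibrary.py
```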

Related

Relative imports within a git repo

I want to create a git repo that can be used like this:
git clone $PROJECT_URL my_project
cd my_project
python some_dir/some_script.py
And I want some_dir/some_script.py to import from another_dir/some_module.py.
How can I accomplish this?
Some desired requirements, in order of decreasing importance to me:
No sys.path modifications from within any of the .py files. This leads to fragility when doing IDE-powered automated refactoring.
No directory structure changes. The repo has been thoughtfully structured.
No changes to my environment. I don't want to add a hard-coded path to my $PYTHONPATH for instance, as that can result in unexpected behavior when I cd to other directories and launch unrelated python commands.
Minimal changes to the sequence of 3 commands above. I don't want a complicated workflow, I want to use tab-completion for some_dir/some_script.py, and I don't want to spend keystrokes on extra python cmdline flags.
I see four solutions to my general problem described here, but none of them meet all of the above requirements.
If no solution is possible, then why are things this way? This seems like such a natural want, and the requirements I list seem perfectly reasonable. I'm aware of a religious argument in a 2007 email from Guido:
I'm -1 on this and on any other proposed twiddlings of the __main__ machinery. The only use case seems to be running scripts that happen to be living inside a module's directory, which I've always seen as an antipattern. To make me change my mind you'd have to convince me that it isn't.
But not sure if things have changed since then.
Opinions haven't changed on this topic since Guido's 2007 comment. If anything, we're moving even further in the opposite direction, with the addition of the PYTHONSAFEPATH variable and the corresponding -P option in 3.11:
https://docs.python.org/3/using/cmdline.html#envvar-PYTHONSAFEPATH
https://docs.python.org/3/using/cmdline.html#cmdoption-P
These options remove the script's own directory from sys.path, so even direct sibling-module imports will require sys.path to be configured explicitly!
So, scripts still can't easily do relative imports, and executable scripts living within a package structure are still considered an anti-pattern. What to do instead?! The widely accepted alternative is to use the packaging feature of entry points. One type of entry-point group in packaging metadata is the "console_scripts" group, used to point at arbitrary callables defined within your package code. If you add entries to this group in your package metadata, script wrappers for those callables will be auto-generated and put somewhere on $PATH at pip install time. No hacking of sys.path necessary.
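As a sketch (the wrapper name some-script and the main function are hypothetical; the module path assumes the another_dir layout from the example in this answer), a console_scripts entry point is declared in pyproject.toml like this:

```toml
[project.scripts]
some-script = "another_dir.some_module:main"
```

After pip install, an executable named some-script appears on $PATH and invokes another_dir.some_module.main() with the import machinery already configured.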
That being said, it's still possible to run .py files directly as scripts, provided you've configured the underlying Python environment for them to resolve their dependencies (imports) correctly. To do that, you'll want to define a package structure and "install" the package so that your source code is visible on sys.path.
Here's a minimum example:
my_project
├── another_dir
│   ├── __init__.py    <-- __init__ file required for package dirs (it can be empty)
│   └── some_module.py
├── pyproject.toml     <-- packaging metadata lives here
└── some_dir           <-- no __init__ file necessary for non-packaged subdirs
    └── some_script.py
Minimal contents of the packaging definition in pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my_proj"
version = "0.1"

[tool.setuptools.packages.find]
namespaces = false
An additional once-off step is required to create/configure an environment in between the git clone and the script execution:
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
This makes sure that another_dir is importable from the environment's site-packages directory, which is already one of the locations on sys.path (check with python -m site). That's what's required for any/all of these import statements to work from within the script file(s):
from another_dir import some_module
import another_dir.some_module
from another_dir.some_module import something
Note that this does not necessarily put the parent of another_dir onto sys.path directly. For an editable install, it will set up some scaffolding that makes your package appear to be "installed" in the site, which is sufficient for those imports to succeed. For a non-editable install (pip install without the -e flag), it will just copy your package directly into the site, compile the .pyc files, and the code will then be found by the normal SourceFileLoader.

import so file from different folder

I am using Ubuntu 20.04 and Python 3. I want to import the shared-object file "Ext.so" like this:
import Ext
from another piece of code. But the .so file is in a different folder. What is the right way to do it?
Your project should be structured like so:
head/
    sub1/
        Ext.so
    sub2/
        caller.py
You should have the folder containing head on your Python path somehow (by installing the module using distutils, by having that folder as your working directory, by adding it to PYTHONPATH in .bashrc, or by appending it to sys.path in your script), and you should use
from head.sub1 import Ext
Granted that your .so file is a Python extension and not some other sort of DLL, anyone installing your project should be able to run your code without any problems.
However, there is definitely nothing stopping you from adding sub1 itself to your Python path and just doing import Ext.
Edit: sorry, I originally said to put head itself on the path; it's the folder containing head that needs to be there, my bad.
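A runnable sketch of the sys.path approach (it builds a throwaway stand-in layout, with a .py file in place of the real Ext.so, since a compiled extension can't be shown in text):

```python
import pathlib
import sys
import tempfile

# Build a stand-in layout: head/sub1/Ext.py plays the role of Ext.so.
root = pathlib.Path(tempfile.mkdtemp())
(root / "head" / "sub1").mkdir(parents=True)
(root / "head" / "__init__.py").write_text("")
(root / "head" / "sub1" / "__init__.py").write_text("")
(root / "head" / "sub1" / "Ext.py").write_text("VALUE = 42\n")

# Put the folder *containing* head on sys.path, then import through the package.
sys.path.insert(0, str(root))
from head.sub1 import Ext

print(Ext.VALUE)  # 42
```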

How do I recursively generate documentation of an entire project with pydoc?

I have a python project inside a specific folder named "Project 1". I want to extract all the docstrings of all the python files inside this project.
In this project all the modules are imported dynamically through __init__.py and, for that reason, when I run pydoc it fails on the imports.
python -m pydoc -w module_folder/ will work for some scenarios, but not all. For example, if you want to document modules and submodules of an installed package, it won't work, you'd need to pivot to a different tool.
Using your favorite language you will need to:
Iterate through files in your target folder
Call pydoc once per (sub)module
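Those two steps can be sketched like this (module_name and document_tree are hypothetical helpers; pydoc -w writes each module's HTML page into the current directory):

```python
import pathlib
import subprocess
import sys

def module_name(py_file: pathlib.PurePath) -> str:
    """Turn a path like pkg/sub/mod.py into the dotted name pkg.sub.mod."""
    return ".".join(py_file.with_suffix("").parts)

def document_tree(folder: str) -> None:
    """Invoke "pydoc -w" once per .py file found under folder."""
    for py_file in sorted(pathlib.Path(folder).rglob("*.py")):
        if py_file.name == "__init__.py":
            continue  # skip package markers, document the real modules
        subprocess.run(
            [sys.executable, "-m", "pydoc", "-w", module_name(py_file)],
            check=False,  # keep going even if one module fails to import
        )
```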
Here is one of many examples on Github.
Pdoc and pydoctor both handle walking folders automatically; my fork of pydoc walks the module dependency tree by default.

Packaging Multiple Python Files

I am currently using this guide to package up my project wasp. However, currently everything lives inside the single wasp file.
That's not ideal. I would rather have all the classes in separate files so they can be managed more effectively. I have the series of files needed in the debian directory, but I'm not sure how to configure the packaging to include multiple files.
Is there a way to change my packaging to package more than just the one script file?
I'm not a Debian packaging or Python expert, but one way would be to copy the various source files to another location (outside of /usr/bin) and then have /usr/bin/wasp call out to them.
Say you put all of your python code in src/ in the root of your repo. In the debian/install file, you'd have:
wasp usr/bin
src/* usr/lib/wasp/
You'd then just need /usr/bin/wasp to call some entry point in src. For example,
#!/usr/bin/python3
import sys
sys.path.append('/usr/lib/wasp/')
import wasp # or whatever you expose in src
# ...
Again, I don't know the best practices here (either in directory or python usage) but I think this would at least work!

Multiple locations within a folder hierarchy to run SCons from

So far, I've only seen examples of running SCons in the same folder as the single SConstruct file resides. Let's say my project structure is like:
src/*.(cpp|h)
tools/mytool/*.(cpp|h)
What I'd like is to be able to run 'scons' at the root and also inside tools/mytool. The latter compiles only mytool. Is this possible with SCons?
I assume it involves creating another SConstruct file. I've made another one: tools/mytool/SConstruct
I made it contain only:
SConscript('../../SConstruct')
and I was thinking of doing Import('env mytoolTarget') and calling Default(mytoolTarget). But running just the above runs in the current directory instead of from the root, so the include paths are broken.
What's the correct way to do this?
You can use the -u option to do this. From any subdirectory, scons -u will search upwards in the directory tree for an SConstruct file.
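For reference, -u pairs naturally with a single root SConstruct that dispatches to per-directory SConscript files. A hedged sketch (file names and the target are hypothetical; SConscript files are Python, but they only run under scons):

```python
# SConstruct (at the project root)
env = Environment(CPPPATH=["#src"])  # "#" means relative to the root SConstruct
Export("env")
SConscript("tools/mytool/SConscript")

# tools/mytool/SConscript
Import("env")
env.Program("mytool", Glob("*.cpp"))
```

Running scons -u from tools/mytool then searches upward for the SConstruct and builds only the targets at or below the current directory, with include paths still resolved relative to the root.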
