How to import recursively all classes in a module? - python-3.x

Is it possible to use __all__ recursively?
This imports modules in __all__ ...
from mypkg import *
but not recursively.
Trying to put everything into __all__ (inside __init__.py) doesn't seem to be useful either:
from pathlib import Path
# List all python (.py) files in the current folder and put them as __all__
fs = [f for f in Path('mypkg/').rglob('*.py') if not f.name.endswith('__init__.py')]
__all__ = [str(f).replace('/','.')[:-3][5:] for f in fs]
Since...
from mypkg import *
Results in AttributeError: module 'mypkg' has no attribute 'module1.file1' for the first folder in the package.

Solved by adding this to __init__.py. May not be the best approach for most situations, but it is good enough for my use case.
from pathlib import Path
# Include all classes when 'from mypkg import *' is called.
fs = [f for f in Path('mypkg').rglob('*.py') if not f.name.startswith('_')]
for f in [str(f).replace('/', '.')[:-3] for f in fs]:
    statement = f'from {f} import *'
    exec(statement)
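If you would rather not build import statements as strings for exec, roughly the same effect can be had with importlib. This is only a sketch under the same assumptions as the snippet above (it lives in mypkg/__init__.py and the interpreter runs from the directory containing mypkg):
from pathlib import Path
from importlib import import_module

# Walk the package recursively and re-export what each submodule defines.
for f in Path('mypkg').rglob('*.py'):
    if f.name.startswith('_'):
        continue
    module = import_module('.'.join(f.with_suffix('').parts))
    # Honour the submodule's __all__ if it defines one, otherwise take its public names.
    names = getattr(module, '__all__',
                    [n for n in dir(module) if not n.startswith('_')])
    globals().update({name: getattr(module, name) for name in names})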

Related

Import __all__ from Python module given by variable

I want to import all the functions and classes from a low-level Python module/file into a high-level file, passing only a variable that contains the low-level file name.
I have an application with several modules like:
__all__ = ['MyClass1', 'my_function1']
class MyClass1():
    pass

def my_function1():
    pass
which previously were imported in the high-level file as:
from sub_module1 import *
from sub_module2 import *
...
# Direct use of objects from the different subfiles:
obj1 = MyClass1()
obj2 = MyClass2()
The application has become plugin based, and I have to dynamically import all modules in a folder and provide direct access to all objects defined in the __all__ of those submodules.
The code below imports the submodules fine, but it does not give me direct access to the names defined in the __all__ of those files.
from os import path
from pkgutil import iter_modules
import importlib.util

directory_name = ...  # define the plugins dir
for importer, package_name, _ in iter_modules([directory_name]):
    module_specification = importlib.util.spec_from_file_location(
        package_name, path.join(directory_name, package_name + '.py'))
    module_loader = importlib.util.module_from_spec(module_specification)
    module_specification.loader.exec_module(module_loader)
How do I put the objects defined in the __all__ of those submodules into locals() of the high-level module?
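One way to do that, sketched below on the assumption that the plugins are plain .py files in a single directory, is to copy the names listed in each plugin's __all__ out of the loaded module object and into globals():
from os import path
from pkgutil import iter_modules
import importlib.util

directory_name = 'plugins'  # hypothetical plugins directory

for _, package_name, _ in iter_modules([directory_name]):
    spec = importlib.util.spec_from_file_location(
        package_name, path.join(directory_name, package_name + '.py'))
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    # Rebind everything the plugin exports via __all__ in this module's namespace.
    for name in getattr(module, '__all__', []):
        globals()[name] = getattr(module, name)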

Is there a way to auto-import all the models in my folder when loading __init__.py?

In my Python 3.9, Django 3.2 project, I have this general folder structure
- manage.py
+ cbapp
  + models
    - __init__.py
    - vendor.py
    - user_preferences.py
In my __init__.py file, I have all my model entries listed out ...
from .vendor import Vendor
from .user_preferences import UserPreferences
...
Each model class, e.g. Vendor, has this general structure
from django.db import models
class Vendor(models.Model):
    ...
Every time I add a new model I have to add a line to my __init__.py file. Is there any way I can write my __init__.py file so that it will just auto-import new files I add to my models directory?
What you're looking for is some fancy dynamic imports, such as the following.
If your model names always follow the same pattern as your module names, this code in __init__.py will probably work:
import os, glob
path = os.path.dirname(__file__)
modules = [os.path.basename(f)[:-3] for f in glob.glob(path + "/*.py")
           if not os.path.basename(f).startswith('_')]
stripped_path = os.path.relpath(path).replace('/', '.')
imports = {}
for module in modules:
    model_name = module.title().replace("_", "")
    imports[model_name] = getattr(__import__(stripped_path + "." + module,
                                              fromlist=[model_name]), model_name)
print(imports)
globals().update(imports)
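For the layout in the question, this maps vendor.py to the name Vendor and user_preferences.py to UserPreferences ('user_preferences'.title().replace('_', '') gives 'UserPreferences'), so globals().update(imports) ends up having the same effect as the hand-written from .vendor import Vendor lines.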

Importing a daily changing variable name in python [duplicate]

I'm writing a Python application that takes a command as an argument, for example:
$ python myapp.py command1
I want the application to be extensible, that is, to be able to add new modules that implement new commands without having to change the main application source. The tree looks something like:
myapp/
__init__.py
commands/
__init__.py
command1.py
command2.py
foo.py
bar.py
So I want the application to find the available command modules at runtime and execute the appropriate one.
Python defines an __import__() function, which takes a string for a module name:
__import__(name, globals=None, locals=None, fromlist=(), level=0)
The function imports the module name, potentially using the given globals and locals to determine how to interpret the name in a package context. The fromlist gives the names of objects or submodules that should be imported from the module given by name.
Source: https://docs.python.org/3/library/functions.html#__import__
So currently I have something like:
import sys

command = sys.argv[1]
try:
    command_module = __import__("myapp.commands.%s" % command, fromlist=["myapp.commands"])
except ImportError:
    # Display error message
    raise
command_module.run()
This works just fine; I'm just wondering whether there is a more idiomatic way to accomplish what this code does.
Note that I specifically don't want to get in to using eggs or extension points. This is not an open-source project and I don't expect there to be "plugins". The point is to simplify the main application code and remove the need to modify it each time a new command module is added.
See also: How do I import a module given the full path?
With Python older than 2.7/3.1, that's pretty much how you do it.
For newer versions, see importlib.import_module for Python 2 and Python 3.
Or using __import__ you can import a list of modules by doing this:
>>> moduleNames = ['sys', 'os', 're', 'unittest']
>>> moduleNames
['sys', 'os', 're', 'unittest']
>>> modules = map(__import__, moduleNames)
Ripped straight from Dive Into Python.
The recommended way for Python 2.7 and 3.1 and later is to use the importlib module:
importlib.import_module(name, package=None)
Import a module. The name argument specifies what module to import in absolute or relative terms (e.g. either pkg.mod or ..mod). If the name is specified in relative terms, then the package argument must be set to the name of the package which is to act as the anchor for resolving the package name (e.g. import_module('..mod', 'pkg.subpkg') will import pkg.mod).
e.g.
my_module = importlib.import_module('os.path')
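Applied to the command layout from the question, a minimal sketch could look like this:
import importlib
import sys

command = sys.argv[1]
try:
    command_module = importlib.import_module('myapp.commands.%s' % command)
except ImportError:
    sys.exit('Unknown command: %s' % command)
command_module.run()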
Note: imp is deprecated since Python 3.4 in favor of importlib
As mentioned, the imp module provides loading functions:
imp.load_source(name, path)
imp.load_compiled(name, path)
I've used these before to perform something similar.
In my case I defined a specific class with defined methods that were required.
Once I loaded the module I would check if the class was in the module, and then create an instance of that class, something like this:
import imp
import os
def load_from_file(filepath):
    class_inst = None
    expected_class = 'MyClass'
    mod_name, file_ext = os.path.splitext(os.path.split(filepath)[-1])
    if file_ext.lower() == '.py':
        py_mod = imp.load_source(mod_name, filepath)
    elif file_ext.lower() == '.pyc':
        py_mod = imp.load_compiled(mod_name, filepath)
    if hasattr(py_mod, expected_class):
        class_inst = getattr(py_mod, expected_class)()
    return class_inst
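Since imp has since been removed (in Python 3.12), here is a rough sketch of the same idea with importlib.util, keeping the hypothetical MyClass convention and handling .py files only:
import importlib.util
import os

def load_from_file(filepath, expected_class='MyClass'):
    class_inst = None
    mod_name, file_ext = os.path.splitext(os.path.split(filepath)[-1])
    if file_ext.lower() == '.py':
        spec = importlib.util.spec_from_file_location(mod_name, filepath)
        py_mod = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(py_mod)
        if hasattr(py_mod, expected_class):
            class_inst = getattr(py_mod, expected_class)()
    return class_inst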
Using importlib
Importing a source file
Here is a slightly adapted example from the documentation:
import sys
import importlib.util
file_path = 'pluginX.py'
module_name = 'pluginX'
spec = importlib.util.spec_from_file_location(module_name, file_path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
# Verify contents of the module:
print(dir(module))
From here, module will be a module object representing the pluginX module (the same thing that would be assigned to pluginX by doing import pluginX). Thus, to call e.g. a hello function (with no parameters) defined in pluginX, use module.hello().
To instead get the effect of "importing" functionality from the module, store it in the in-memory cache of loaded modules, and then do the corresponding from import:
sys.modules[module_name] = module
from pluginX import hello
hello()
Importing a package
To import a package instead, calling import_module is sufficient. Suppose there is a package folder pluginX in the current working directory; then just do
import importlib
pkg = importlib.import_module('pluginX')
# check if it's all there..
print(dir(pkg))
Use the imp module, or the more direct __import__() function.
You can use exec:
exec("import myapp.commands.%s" % command)
If you want it in your locals:
>>> mod = 'sys'
>>> locals()['my_module'] = __import__(mod)
>>> my_module.version
'2.6.6 (r266:84297, Aug 24 2010, 18:46:32) [MSC v.1500 32 bit (Intel)]'
The same would work with globals().
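For example, at module level (a minimal sketch):
mod_name = 'json'
globals()[mod_name] = __import__(mod_name)
# The module is now addressable by its own name.
print(json.dumps({'ok': True}))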
Similar to #monkut's solution, but reusable and error-tolerant, as described at http://stamat.wordpress.com/dynamic-module-import-in-python/:
import os
import imp
def importFromURI(uri, absl):
    mod = None
    if not absl:
        uri = os.path.normpath(os.path.join(os.path.dirname(__file__), uri))
    path, fname = os.path.split(uri)
    mname, ext = os.path.splitext(fname)
    if os.path.exists(os.path.join(path, mname) + '.pyc'):
        try:
            return imp.load_compiled(mname, uri)
        except:
            pass
    if os.path.exists(os.path.join(path, mname) + '.py'):
        try:
            return imp.load_source(mname, uri)
        except:
            pass
    return mod
The below piece worked for me:
>>> import imp
>>> fp, pathname, description = imp.find_module("/home/test_module")
>>> test_module = imp.load_module("test_module", fp, pathname, description)
>>> print test_module.print_hello()
If you want to import it in a shell script:
python -c '<above entire code in one line>'
The following worked for me:
import sys, glob
sys.path.append('/home/marc/python/importtest/modus')
fl = glob.glob('modus/*.py')
modulist = []
adapters = []
for i in range(len(fl)):
    fl[i] = fl[i].split('/')[1]
    fl[i] = fl[i][0:(len(fl[i])-3)]
    modulist.append(getattr(__import__(fl[i]), fl[i]))
    adapters.append(modulist[i]())
It loads modules from the folder 'modus'. The modules have a single class with the same name as the module name. E.g. the file modus/modu1.py contains:
class modu1():
    def __init__(self):
        self.x = 1
        print(self.x)
The result is a list of dynamically loaded classes "adapters".
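The same loader can be written with importlib and pathlib instead of manual string slicing. A sketch, assuming modus/ is importable as a package from the current directory and keeps the class-name-equals-module-name convention:
import importlib
from pathlib import Path

adapters = []
for py_file in Path('modus').glob('*.py'):
    mod_name = py_file.stem
    if mod_name.startswith('_'):
        continue
    module = importlib.import_module('modus.%s' % mod_name)
    # Instantiate the class that shares its name with the module.
    adapters.append(getattr(module, mod_name)())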

python packaging results in import error

I have a Boost.Python build that generates a shared object, to be used with Python, at /home/user/service/org/boost_py.so (this folder does not contain an __init__.py),
and /home/user/service is part of sys.path, so when I need to use it I just do
import org.boost_py #works
Now I have added a pure python module in a different directory.
/home/user/service/pure_python/org/
    __init__.py
    tester.py
__init__.py contains
__import__('pkg_resources').declare_namespace(__name__)
Now when sys.path is
['/home/user/service/','/home/user/service/pure_python']
and I
import org.boost_py #ImportError: No module named 'org.boost_py'
but I can import org.tester. How can I import both org.tester and org.boost_py? (I cannot change the location of either.)
Update:
I found a .pth file with the following contents in the pure_python directory:
import sys, types, os;
p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('org',));
ie = os.path.exists(os.path.join(p,'__init__.py'));
m = not ie and sys.modules.setdefault('org', types.ModuleType('org'));
mp = (m or []) and m.__dict__.setdefault('__path__',[]);
(p not in mp) and mp.append(p)
Apparently there is no file or directory named boost_py in pure_python/org, and that is what produces the error.
If it does exist, please edit the question!
Alternatively, the path of the Boost module may not be on sys.path, in which case Python simply won't recognize the file.
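If editing the __init__.py in pure_python/org is an option, one commonly suggested alternative (an untested sketch for this exact layout) is to make it a pkgutil-style namespace package, so that every directory named org found on sys.path is merged into the package's __path__:
# pure_python/org/__init__.py (hypothetical replacement for the declare_namespace line)
from pkgutil import extend_path

# Extend this package's __path__ with every directory named 'org' on sys.path,
# including the one that holds boost_py.so.
__path__ = extend_path(__path__, __name__)
With both portions on sys.path, org.tester and org.boost_py should then both resolve.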

Which form of relative import to prefer inside a package

I'm writing a library named Foo as an example.
The __init__.py file:
from .foo_exceptions import *
from .foo_loop import FooLoop
main_loop = FooLoop()
from .foo_functions import *
__all__ = ['main_loop'] + foo_exceptions.__all__ + foo_functions.__all__
When installed, it can be used like this:
# example A
from Foo import foo_create, main_loop
foo_obj = foo_create()
main_loop().register(foo_obj)
or like this:
# example B
import Foo
foo_obj = Foo.foo_create()
Foo.main_loop().register(foo_obj)
I clearly prefer the example B approach. There are no name conflicts, and the source of each external object is explicitly stated.
So much for the introduction; now my question. Inside this library I need to import something from a different file. Again, I have several ways to do it, and the question is which style to prefer - C, D or E? Read below.
# example C
from . import foo_exceptions
raise foo_exceptions.FooError("fail")
or
# example D
from .foo_exceptions import FooError
raise FooError("fail")
or
# example E
from . import FooError
raise FooError("fail")
Approach C has the disadvantage that importing a whole module, instead of just the few required objects, increases the chance of a circular import problem. Also consider this line:
from . import foo_exceptions, main_loop
It looks like an import of 2 symbols from one source, but it isn't. The former (foo_exceptions) is a module (.py file) in the current directory and the latter is an object defined in __init__.py.
That's why I'm not using style C and the question in its final form is: D or E (and why)?
(Thank you for reading this long question. All code fragments are examples only and may contain typos)
After the answer from alexanderlukanin:
EDIT1: corrected errors in __init__.py
NOTE1: foo_ prefixes are only to emphasize the relationship between objects
EDIT2: When importing an object which is not part of the library interface, style E is not usable. I think we have a winner: It's the from .module import symbol form.
Don't use old-style relative imports:
# Import from foo/foo_loop.py
# This DOES NOT WORK in Python 3
# and MAY NOT WORK AS EXPECTED in Python 2
from foo_loop import FooLoop
# This is reliable and unambiguous
from .foo_loop import FooLoop
Don't use asterisk import unless you really have to.
# Namespace pollution! Name clashes!
from .submodule import *
Don't use prefixes - you've got namespaces exactly for that purpose.
# Unpythonic
from foo import foo_something_create
foo_something_create()
# Pythonic
import foo.something
foo.something.create()
Your package's API must be well-defined. Your implementation must not be too tangled. The rest is a matter of taste.
# [C] This is good.
# Import order: __init__.py, exceptions.py
from . import exceptions
raise exceptions.FooError
# [D] This is also fine.
# Import order is the same as above,
# only name binding inside the current module is different.
from .exceptions import FooError
raise FooError
# [E] This is not as good because it adds one unnecessary level of indirection
# submodule.py -> __init__.py -> exceptions.py
from . import FooError
raise FooError
See also: Circular (or cyclic) imports in Python
