Loading a class of unknown name in a dynamic location - python-3.x

Currently I am extracting files to the operating system's temp directory. One of the files is a Python file containing a class I need to get a handle on. The path of the Python file is known, but the name of the class inside it is unknown. However, it is safe to assume that there is only a single class, and that this class is a subclass of another.
I tried to work with importlib, but I am not able to get a handle on the class.
So far I tried:
from importlib.util import spec_from_file_location, module_from_spec
import inspect

# Assume
# module_name contains the name of the class -> "MyClass"
# path_module contains the path to the python file -> "../Module.py"
spec = spec_from_file_location(module_name, path_module)
module = module_from_spec(spec)
for pair in inspect.getmembers(module):
    print(f"{pair[1]} is class: {inspect.isclass(pair[1])}")
When I iterate over the members of the module, none of them get printed as a class.
My class in this case is called BasicModel, and the output on the console looks like this:
BasicModel is class: False
What is the correct approach to this?
Edit:
As the content of the file was requested, here you go:
# Imports assumed from the surrounding file (not shown in the question); Keras via TensorFlow:
from tensorflow import nn
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Flatten, Input

class BasicModel(Sequential):
    def __init__(self, class_count: int, input_shape: tuple):
        Sequential.__init__(self)
        self.add(Input(shape=input_shape))
        self.add(Flatten())
        self.add(Dense(128, activation=nn.relu))
        self.add(Dense(128, activation=nn.relu))
        self.add(Dense(class_count, activation=nn.softmax))

Use dir() to get the attributes of the module and inspect to check whether each attribute is a class. If so, you can create an object.
Assuming that your file's path is /tmp/mysterious.py, you can do this:
import importlib
import inspect
from pathlib import Path
import sys

path_pyfile = Path('/tmp/mysterious.py')
sys.path.append(str(path_pyfile.parent))
mysterious = importlib.import_module(path_pyfile.stem)

for name_local in dir(mysterious):
    if inspect.isclass(getattr(mysterious, name_local)):
        print(f'{name_local} is a class')
        MysteriousClass = getattr(mysterious, name_local)
        mysterious_object = MysteriousClass()
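For completeness: the spec-based attempt in the question was only missing one step. module_from_spec creates the module object but never executes it, so its namespace stays empty and getmembers finds no classes. A minimal sketch of that route, reusing the question's module_name and path_module variables (the __module__ filter is one way to skip classes imported from elsewhere, such as the Sequential base class):
import inspect
from importlib.util import spec_from_file_location, module_from_spec

spec = spec_from_file_location(module_name, path_module)
module = module_from_spec(spec)
spec.loader.exec_module(module)  # execute the module; without this its namespace stays empty

# Pick the single class defined in the file itself (skip imported base classes)
for name, obj in inspect.getmembers(module, inspect.isclass):
    if obj.__module__ == module.__name__:
        MyClass = obj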

Related

Custom importer to fetch files from web before execution

I'm looking at the documentation here to try and manipulate the way the import statement works. My code uses imports in all of these forms:
import <module>
import <package.module>
from <package> import <module>
from <package.module> import <function>
from <package.module> import *
My goal is: for a certain folder, let's call it myfolder, any import of any module within myfolder (however deep in the structure) should get some preprocessing, no matter how it's imported. Preprocessing in this case means downloading the python file from an internal CMS and using that instead of the one on disk.
I understand the meta_path and path_hooks part, and I think I need to work with path_hooks to return a FileFinder object to the built-in PathFinder. Here's what I have so far:
import os, sys

class PathhookOverride():
    def __init__(self, path) -> None:
        """
        This will be called when PathFinder() iterates through sys.path_hooks
        """
        relative_path = os.path.relpath(path, os.getcwd())
        if not relative_path.startswith('myfolder'):
            ## We want to override only imports that have myfolder as the first part of the relative path
            raise ImportError
        if os.path.isdir(path):
            ## We know that this is a directory, we don't want to handle this
            print(f'PathhookOverride: {path} is a directory')
            raise ImportError
        dot_separated_path = ".".join(relative_path.split(os.path.sep))
        print(dot_separated_path)
        ## Pull file here later
        cache = sys.path_importer_cache
        raise ImportError  ## Go to next path_hook

def change_importer():
    """Inserts the finder into the import machinery"""
    sys.path_hooks.insert(0, PathhookOverride)

from myfolder.package.module import function
Expected output:
When I import my module or function using any of the above formats, I should get the path of the file being imported.
That is, in the code snippet above, it should print the dot_separated_path:
myfolder.package.module
Actual output:
PathhookOverride: c:\test1\myfolder is a directory
PathhookOverride: c:\test1\myfolder\package is a directory
The override only catches the directories. The path of the files are never sent to the override hook.
What am I missing? Thanks.

Importing a daily changing variable name in python [duplicate]

I'm writing a Python application that takes a command as an argument, for example:
$ python myapp.py command1
I want the application to be extensible, that is, to be able to add new modules that implement new commands without having to change the main application source. The tree looks something like:
myapp/
    __init__.py
    commands/
        __init__.py
        command1.py
        command2.py
        foo.py
        bar.py
So I want the application to find the available command modules at runtime and execute the appropriate one.
Python defines an __import__() function, which takes a string for a module name:
__import__(name, globals=None, locals=None, fromlist=(), level=0)
The function imports the module name, potentially using the given globals and locals to determine how to interpret the name in a package context. The fromlist gives the names of objects or submodules that should be imported from the module given by name.
Source: https://docs.python.org/3/library/functions.html#__import__
So currently I have something like:
command = sys.argv[1]
try:
    command_module = __import__("myapp.commands.%s" % command, fromlist=["myapp.commands"])
except ImportError:
    # Display error message
    raise
command_module.run()
This works just fine; I'm just wondering whether there is a more idiomatic way to accomplish what this code does.
Note that I specifically don't want to get in to using eggs or extension points. This is not an open-source project and I don't expect there to be "plugins". The point is to simplify the main application code and remove the need to modify it each time a new command module is added.
See also: How do I import a module given the full path?
With Python older than 2.7/3.1, that's pretty much how you do it.
For newer versions, see importlib.import_module for Python 2 and Python 3.
Or using __import__ you can import a list of modules by doing this:
>>> moduleNames = ['sys', 'os', 're', 'unittest']
>>> moduleNames
['sys', 'os', 're', 'unittest']
>>> modules = map(__import__, moduleNames)
Ripped straight from Dive Into Python.
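Note that in Python 3, map is lazy, so the modules in this snippet are not actually imported until the map object is consumed; wrapping it in list() forces the imports, e.g. modules = list(map(__import__, moduleNames)).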
The recommended way for Python 2.7 and 3.1 and later is to use the importlib module:
importlib.import_module(name, package=None)
Import a module. The name argument specifies what module to import in absolute or relative terms (e.g. either pkg.mod or ..mod). If the name is specified in relative terms, then the package argument must be set to the name of the package which is to act as the anchor for resolving the package name (e.g. import_module('..mod', 'pkg.subpkg') will import pkg.mod).
e.g.
my_module = importlib.import_module('os.path')
Note: imp is deprecated since Python 3.4 in favor of importlib
As mentioned, the imp module provides these loading functions:
imp.load_source(name, path)
imp.load_compiled(name, path)
I've used these before to perform something similar.
In my case I defined a specific class with defined methods that were required.
Once I loaded the module I would check if the class was in the module, and then create an instance of that class, something like this:
import imp
import os

def load_from_file(filepath):
    class_inst = None
    py_mod = None
    expected_class = 'MyClass'
    mod_name, file_ext = os.path.splitext(os.path.split(filepath)[-1])
    if file_ext.lower() == '.py':
        py_mod = imp.load_source(mod_name, filepath)
    elif file_ext.lower() == '.pyc':
        py_mod = imp.load_compiled(mod_name, filepath)
    if py_mod is not None and hasattr(py_mod, expected_class):
        class_inst = getattr(py_mod, expected_class)()
    return class_inst
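Since imp is deprecated, a rough importlib-based equivalent of the load_from_file helper above might look like this (expected_class is the same placeholder name as in the original):
import importlib.util
import os

def load_from_file(filepath):
    expected_class = 'MyClass'
    mod_name = os.path.splitext(os.path.basename(filepath))[0]
    spec = importlib.util.spec_from_file_location(mod_name, filepath)
    if spec is None:  # unsupported file type
        return None
    py_mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(py_mod)  # run the module so its classes are defined
    if hasattr(py_mod, expected_class):
        return getattr(py_mod, expected_class)()
    return None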
Using importlib
Importing a source file
Here is a slightly adapted example from the documentation:
import sys
import importlib.util
file_path = 'pluginX.py'
module_name = 'pluginX'
spec = importlib.util.spec_from_file_location(module_name, file_path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
# Verify contents of the module:
print(dir(module))
From here, module will be a module object representing the pluginX module (the same thing that would be assigned to pluginX by doing import pluginX). Thus, to call e.g. a hello function (with no parameters) defined in pluginX, use module.hello().
To get the effect of "importing" functionality from the module instead, store it in the in-memory cache of loaded modules, and then do the corresponding from import:
sys.modules[module_name] = module
from pluginX import hello
hello()
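One detail worth noting: the recipe in the importlib documentation puts the sys.modules[module_name] = module assignment before the spec.loader.exec_module(module) call, which matters if the module being loaded (directly or indirectly) imports itself; for a simple plugin file, the order shown above also works.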
Importing a package
To import a package instead, calling import_module is sufficient. Suppose there is a package folder pluginX in the current working directory; then just do
import importlib
pkg = importlib.import_module('pluginX')
# check if it's all there..
print(dir(pkg))
Use the imp module, or the more direct __import__() function.
You can use exec:
exec("import myapp.commands.%s" % command)
If you want it in your locals:
>>> mod = 'sys'
>>> locals()['my_module'] = __import__(mod)
>>> my_module.version
'2.6.6 (r266:84297, Aug 24 2010, 18:46:32) [MSC v.1500 32 bit (Intel)]'
The same would work with globals().
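One caveat with the locals() trick: it only behaves as shown at module scope, as in the REPL session above. Inside a function, writing to the dictionary returned by locals() is not guaranteed to create or update a local variable, so prefer globals() or an explicit dict there.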
Similar to @monkut's solution, but reusable and error-tolerant, as described at http://stamat.wordpress.com/dynamic-module-import-in-python/:
import os
import imp

def importFromURI(uri, absl):
    mod = None
    if not absl:
        uri = os.path.normpath(os.path.join(os.path.dirname(__file__), uri))
    path, fname = os.path.split(uri)
    mname, ext = os.path.splitext(fname)

    if os.path.exists(os.path.join(path, mname) + '.pyc'):
        try:
            return imp.load_compiled(mname, os.path.join(path, mname) + '.pyc')
        except Exception:
            pass
    if os.path.exists(os.path.join(path, mname) + '.py'):
        try:
            return imp.load_source(mname, os.path.join(path, mname) + '.py')
        except Exception:
            pass

    return mod
The piece below worked for me (note that imp.find_module takes a module name and a list of directories to search, not a path):
>>> import imp
>>> fp, pathname, description = imp.find_module("test_module", ["/home"])
>>> test_module = imp.load_module("test_module", fp, pathname, description)
>>> print(test_module.print_hello())
If you want to import it in a shell script:
python -c '<above entire code in one line>'
The following worked for me:
import sys, glob

sys.path.append('/home/marc/python/importtest/modus')
fl = glob.glob('modus/*.py')
modulist = []
adapters = []
for i in range(len(fl)):
    fl[i] = fl[i].split('/')[1]         # drop the 'modus/' prefix
    fl[i] = fl[i][0:(len(fl[i]) - 3)]   # drop the '.py' extension
    modulist.append(getattr(__import__(fl[i]), fl[i]))
    adapters.append(modulist[i]())
It loads modules from the folder 'modus'. The modules have a single class with the same name as the module name. E.g. the file modus/modu1.py contains:
class modu1():
    def __init__(self):
        self.x = 1
        print(self.x)
The result is adapters, a list of instances of the dynamically loaded classes.
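For what it's worth, a more compact sketch of the same idea using pathlib and importlib (same assumption: each file in modus defines exactly one class named after the module):
import importlib
import sys
from pathlib import Path

sys.path.append('/home/marc/python/importtest/modus')
adapters = [getattr(importlib.import_module(p.stem), p.stem)()
            for p in Path('modus').glob('*.py')]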

The best way to share a class between processes

First of all, I'm pretty new to multiprocessing and I'm here to learn from all of you. I have several files doing something similar to this:
SharedClass.py:
class simpleClass():
    a = 0
    b = ""
    .....
MyProcess.py:
import multiprocessing
import SharedClass

class FirstProcess(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)

    def modifySharedClass():
        # Here I want to modify the object shared with main.py defined in SharedClass.py
Main.py:
from MyProcess import FirstProcess
import sharedClass

if __name__ == '__main__':
    pr = FirstProcess()
    pr.start()
    # Here I want to print the initial value of the shared class
    pr.modifySharedClass()
    # Here I want to print the modified value of the shared class
I want to define a shared class (in SharedClass.py), in a kind of shared memory that can be read and written from both files, Main.py and MyProcess.py.
I have tried to use the multiprocessing Manager and multiprocessing.Array, but I'm not having good results: the changes made in one file are not reflected in the other file (maybe I'm doing this the wrong way).
Any ideas? Thank you.
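For reference, a minimal sketch of the Manager pattern the question alludes to. Note that the proxy is created in the parent and passed to the child process; module-level class attributes like those in SharedClass.py are not shared across processes, since each process gets its own copy:
import multiprocessing

class FirstProcess(multiprocessing.Process):
    def __init__(self, shared):
        multiprocessing.Process.__init__(self)
        self.shared = shared  # a Manager proxy; safe to hand to the child

    def run(self):
        self.shared['a'] = 1  # visible to the parent after the update

if __name__ == '__main__':
    with multiprocessing.Manager() as manager:
        shared = manager.dict(a=0, b="")
        pr = FirstProcess(shared)
        print(shared['a'])  # initial value: 0
        pr.start()
        pr.join()
        print(shared['a'])  # modified value: 1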

Django move models classmethod to another file

I have a model:
class Order(models.Model):
    name = models.CharField()

    @classmethod
    def do_something(cls):
        print('do something')
What I want to do is to move the do_something method from my model to another file. I want to do this because the model has several other big methods and I want to structure the code; I don't like the length of this file, which is getting big, more than 700 lines of code.
So I want to move the method to another file and import it, so that it can still be used like a classmethod on the model,
like this:
Order.do_something()
Any ideas?
Use inheritance -- (wiki)
# some_package/some_module.py
class MyFooKlass:
    @classmethod
    def do_something(cls):
        # do something
        return 'foo'

# my_app/models.py
from some_package.some_module import MyFooKlass

class Order(models.Model, MyFooKlass):
    name = models.CharField()

__post_init__ of python 3.x dataclasses is not called when loaded from yaml

Please note that I have already referred to the StackOverflow question here. I post this question to investigate whether calling __post_init__ is safe or not. Please check the question till the end.
Check the code below. In step 3 we load dataclass A from the yaml string. Note that this does not call the __post_init__ method.
import dataclasses
import yaml

@dataclasses.dataclass
class A:
    a: int = 55

    def __post_init__(self):
        print("__post_init__ got called", self)

print("\n>>>>>>>>>>>> 1: create dataclass object")
a = A(33)
print(a)  # print dataclass
print(dataclasses.fields(a))

print("\n>>>>>>>>>>>> 2: dump to yaml")
s = yaml.dump(a)
print(s)  # print yaml repr

print("\n>>>>>>>>>>>> 3: create class from str")
a_ = yaml.load(s)
print(a_)  # print dataclass loaded from yaml str
print(dataclasses.fields(a_))
The solution that I see for now is calling __post_init__ on my own at the end, like in the code snippet below:
a_.__post_init__()
I am not sure if this is a safe recreation of a yaml-serialized dataclass. Also, it will pose a problem when __post_init__ takes arguments, as it does when dataclass fields are of dataclasses.InitVar type.
This behavior is working as intended. You are dumping an existing object, so when you load it pyyaml intentionally avoids initializing the object again. The direct attributes of the dumped object will be saved even if they are created in __post_init__ because that function runs prior to being dumped. When you want the side effects that come from __post_init__, like the print statement in your example, you will need to ensure that initialization occurs.
There are a few ways to accomplish this. You can use either the metaclass or the adding-constructors/representers approaches described in pyyaml's documentation. You could also manually alter the dumped string in your example to use the tag !!python/object/new: instead of !!python/object:. If your eventual goal is to have the yaml file generated in a different manner, then this might be a solution.
See below for an update to your code that uses the metaclass approach and calls __post_init__ when loading from the dumped class object. The call to cls(**fields) in from_yaml ensures that the object is initialized. yaml.load uses cls.__new__ to create objects tagged with !!python/object: and then loads the saved attributes into the object manually.
import dataclasses
import yaml

@dataclasses.dataclass
class A(yaml.YAMLObject):
    a: int = 55

    def __post_init__(self):
        print("__post_init__ got called", self)

    yaml_tag = '!A'
    yaml_loader = yaml.SafeLoader

    @classmethod
    def from_yaml(cls, loader, node):
        fields = loader.construct_mapping(node, deep=True)
        return cls(**fields)

print("\n>>>>>>>>>>>> 1: create dataclass object")
a = A(33)
print(a)  # print dataclass
print(dataclasses.fields(a))

print("\n>>>>>>>>>>>> 2: dump to yaml")
s = yaml.dump(a)
print(s)  # print yaml repr

print("\n>>>>>>>>>>>> 3: create class from str")
a_ = yaml.load(s, Loader=A.yaml_loader)
print(a_)  # print dataclass loaded from yaml str
print(dataclasses.fields(a_))
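With this change, step 3 constructs the object through from_yaml and cls(**fields) rather than cls.__new__, so the __post_init__ message is printed when loading from the yaml string as well.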
