I have a main config.py file and then specific client config files, e.g. client1_config.py.
What I'd like to do is import all variables within my client1_config.py file into my config.py file. The catch is I want to do this flexibly at runtime according to an environment variable. It would look something like this:
import os
import importlib
client = os.environ['CLIENT']
client_config = importlib.import_module(
    '{client}_config'.format(client=client))
from client_config import *
This code snippet returns the following error: ModuleNotFoundError: No module named 'client_config'
Is it possible to achieve what I'm trying to do (and how), or does Python not support this kind of importing at all?
The call to import_module already imports the client configuration. from client_config import * assumes that client_config is the name of the module you are trying to import, just as import os will import the module os even if you create a variable os beforehand:
os = "sys"
import os # still imports the os module, not the sys module
In the following, assume that we have a client1_config.py which just contains one variable:
dummy = True
To add its elements to the main namespace of config.py so that you can access them directly, you can do the following:
import importlib
client = "client1"
# Import the client's configuration
client_config = importlib.import_module(f"{client}_config")
print(client_config.dummy) # True
# Add all elements from client_config
# to the main namespace:
globals().update({v: getattr(client_config, v)
                  for v in client_config.__dict__
                  if not v.startswith("_")})
print(dummy) # True
However, I would suggest accessing the client's configuration as config.client for clarity, and to avoid the client's configuration file overwriting values in the main configuration file.
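A minimal sketch of that attribute-based alternative (module and variable names follow the example above):
# config.py
import importlib
import os

client = os.environ['CLIENT']
# Keep the client settings namespaced instead of merging
# them into this module's globals:
client_config = importlib.import_module(f"{client}_config")

# Consumers then go through the attribute explicitly, e.g.:
#   import config
#   print(config.client_config.dummy)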
Related
I have the following file structure
home/user/app.py
home/user/content/resource.py
home/user/content/call1.py
home/user/content/call2.py
I have imported resource.py in app.py as below:
import content.resource
Also, I have imported call1 and call2 in resource.py
import call1
import call2
The requirement is to run two tests individually.
run app.py
run resource.py
When I run app.py, it says it cannot find call1 and call2.
When I run resource.py, the file runs without any issues. How can I run app.py so that it can use the imports in resource.py as well as call1.py and call2.py?
All four files have a main function under if __name__ == '__main__':.
In your __init__.py files, just create a list like this for each package. For your user package's __init__.py: __all__ = ["app", "content"]
And for your content package's __init__.py: __all__ = ["resource", "call1", "call2"]
First try: export PYTHONPATH=/home/user (make sure this is the correct absolute path).
If that doesn't solve the issue, try adding content to the path as well.
try: export PYTHONPATH=/home/user/:/home/user/content/
This should definitely work.
You will then import like so:
import user.app
import user.content.resource
NOTE
Whatever you want to use, you must import it in every file that uses it. Don't bother importing in __init__.py; just list the modules each package includes with __all__ = [...].
You have to import call1 and call2 in app.py if you want to call them there.
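As a sketch of that fix (assuming content/ has an __init__.py and everything is run from home/user), import the siblings through the package in resource.py so that app.py can resolve them:
# home/user/content/resource.py
import content.call1
import content.call2

# home/user/app.py
import content.resource
Note that with this layout resource.py itself should also be run from home/user, e.g. as python -m content.resource, so that the content package is importable.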
My goal is to define a function that gets a relative path and returns an absolute path.
The issue is that I want this function to live in one specific module while other modules import and use it.
If the function were located in each module (rather than imported), it would look something like this:
import os
def get_absolute(relative_path):
    dir_path = os.path.dirname(os.path.realpath(__file__))
    return os.path.abspath(os.path.join(dir_path, relative_path))
absolute = get_absolute('../file.txt')
but if I want to define this function once (in one module) and import it from other modules, I will need to add another argument for the directory path of the importing module, like so:
helper.py
import os
def get_absolute(relative_path, dir_path):
    return os.path.abspath(os.path.join(dir_path, relative_path))
importer.py
import os
from helper import get_absolute
DIR_PATH = os.path.dirname(os.path.realpath(__file__))
absolute = get_absolute('../file.txt', DIR_PATH)
I saw Get path of importing module but in my case the module is unknown (multiple modules can import this function).
Is there a way for the helper module to know the directory path of the importing module without passing it as an argument?
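One possible approach, offered here only as a sketch (it is not from the linked question), is to have the helper inspect its caller's stack frame, so no extra argument is needed:
# helper.py
import inspect
import os

def get_absolute(relative_path):
    # stack()[0] is this function; stack()[1] is whoever called it
    caller_file = inspect.stack()[1].filename
    dir_path = os.path.dirname(os.path.realpath(caller_file))
    return os.path.abspath(os.path.join(dir_path, relative_path))
Note that this resolves paths relative to the module that calls get_absolute, not necessarily the one that imported helper, and stack inspection adds overhead on every call.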
I'm setting up new functionality for my gcloud buckets that lets me upload or download files using a Python library called "boto", but this error appears.
I am using Linux, Visual Studio Code, Python 3.7, and gsutil and boto in their latest versions.
import os
import boto
import gcs_oauth2_boto_plugin
import shutil
import io
import tempfile
import time
import sys
# Activate virtual environment
activate_this = os.path.join(VENV, 'bin/activate_this.py')
exec(open(activate_this).read(), dict(__file__=activate_this))
# Check arguments
if len(sys.argv) < 2:
    print("Usage: " + sys.argv[0] + ' FILENAME')
    quit()
filename = sys.argv[1]
# URI scheme for Cloud Storage.
GOOGLE_STORAGE = "gs"
# URI scheme for accessing local files.
LOCAL_FILE = "file"
header_values = {"x-goog-project-id": PROJECT_ID}
# Open local file
with open(filename, 'r') as localfile:
    dst_uri = boto.storage_uri(BUCKET + '/' + filename, GOOGLE_STORAGE)
    # The key-related functions are a consequence of boto's
    # interoperability with Amazon S3 (which employs the
    # concept of a key mapping to localfile).
    dst_uri.new_key().set_contents_from_file(localfile)
    print('Successfully created "%s/%s"' % (dst_uri.bucket_name, dst_uri.object_name))
Traceback (most recent call last):
File "./upload2gcs.py", line 10, in
import boto
ImportError: No module named boto
The directory containing the boto module probably isn't findable from any of the paths where Python looks for modules to be imported.
From within your script, check the sys.path list and see if the expected directory is present:
import pprint
import sys
pprint.pprint(sys.path)
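If the directory is missing from the list, you can prepend it yourself before importing; the path below is a placeholder, not the real location on your system:
import sys

sys.path.insert(0, '/path/to/parent/of/boto')  # placeholder path
import boto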
As an example, gsutil is packaged with its own fork of Boto; it performs some additional steps at runtime to make sure the Boto module's parent directory is added to sys.path, which allows subsequent import boto statements to work:
https://github.com/GoogleCloudPlatform/gsutil/blob/c74a5964980b4f49ab2c4cb4d5139b35fbafe8ac/gslib/__init__.py#L102
How do I load a Python module that is not built in? I'm trying to create a plugin system for a small project I'm working on. How do I load those "plugins" into Python and, instead of calling "import module", use a string to reference the module?
Have a look at importlib
Option 1: Import an arbitrary file in an arbitrary path
Assume there's a module at /path/to/my/custom/module.py containing the following contents:
# /path/to/my/custom/module.py
test_var = 'hello'
def test_func():
    print(test_var)
We can import this module using the following code:
import importlib.util

myfile = '/path/to/my/custom/module.py'
# SourceFileLoader.load_module() is deprecated; the spec-based API
# is the current way to import a file by path:
spec = importlib.util.spec_from_file_location('mymod', myfile)
mymod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mymod)
The module is imported and assigned to the variable mymod. We can then access the module's contents as:
mymod.test_var
# prints 'hello' to the console
mymod.test_func()
# also prints 'hello' to the console
Option 2: Import a module from a package
Use importlib.import_module
For example, if you want to import settings from a settings.py file in your application root folder, you could use
_settings = importlib.import_module('settings')
The popular task queue package Celery uses this a lot; rather than giving you code examples here, please check out their git repository.
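As a sketch of how this applies to the plugin question above (the plugins package and the module names here are made up):
import importlib

# Hypothetical plugin names, e.g. read from a config file:
plugin_names = ['plugins.foo', 'plugins.bar']

loaded = {name: importlib.import_module(name) for name in plugin_names}

# Each entry is a normal module object, e.g.:
# loaded['plugins.foo'].run()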
Is there a way to tell scons to use a particular file to setup the default environment? I am using TI DSPs and the compiler is something different than cc; I'd like to have one "environment file" that defines where the compiler is, and what the default flags are, and then be able to use this for several projects.
Any suggestions?
You can use the normal Python utilities to read a file or process XML and then import it into your env. If you don't have some external file that you need to import into SCons, you can simply encode the environment in the SCons file. If, for some reason, your environment is defined in a Perl dictionary (as in my case...), you can either try to use PyPerl or convert the Perl dictionary into YAML and then read the YAML into Python. (I was able to do the latter, but not the former.)
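As an illustration of encoding the environment directly in the SCons file, here is a minimal sketch for a TI-style toolchain; the compiler name cl6x, the flags, and the path are assumptions you would replace with your DSP's actual toolchain:
# SConstruct
env = Environment(
    CC='cl6x',                    # assumed TI compiler executable
    CCFLAGS=['-mv6400', '-O2'],   # assumed target/optimization flags
    ENV={'PATH': '/opt/ti/bin'},  # assumed toolchain location
)
env.Program('app', ['main.c'])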
Let's say you simply have a file that you need to read which has environment variables in the form:
ENV_VAR1 ENV_VAL1
ENV_VAR2 ENV_VAL2
...
You could import this into your SConstruct.py file like:
import os
import re

env_file = open('PATH_TO_ENV_FILE', 'r')
lines = env_file.readlines()
split_regex = re.compile(r'^(?P<env_var>[\w_]+) *(?P<env_val>.*)')
for line in lines:
    regex_search = split_regex.search(line)
    if regex_search:
        env_var = regex_search.group('env_var')
        env_val = regex_search.group('env_val').strip()
        os.environ[env_var] = env_val
base_env = Environment(ENV=os.environ)
# even though the below lines seem redundant, it was necessary in my build
# flow...
for key in os.environ.keys():
    base_env[key] = os.environ[key]
If you want to stick this ugliness inside a different file and then import it from your main SConstruct.py file, you can add the following to enable access to the 'Environment' class from your other file:
from SCons.Environment import *
Then in your main SConstruct.py file, import the env file like:
from env_loader import *
SInclusion file:
...
myenv = Environment(...)
...
SConstruct file:
...
exec(open('SInclusion').read())  # execfile() existed only in Python 2
...
myenv.Object(...)
...