Python import from parent directory for a dockerized structure

I have a project with two applications. They both use the same mongoengine database model file. They have to run in separate Docker containers, but use the same Mongo database in a third container. Right now my app structure looks like this:
app_root/
    app1/
        database/
            models.py
        main.py
    app2/
        database/
            models.py
        main.py
And it works fine, BUT I have to maintain two identical copies of database/models.py. I don't want to do this, so I made the following structure:
app_root/
    shared/
        database/
            models.py
    app1/
        main.py
    app2/
        main.py
Unfortunately, it doesn't work: when I try this in my main.py:
from ..shared.database.models import *
I get
Exception has occurred: ImportError
attempted relative import with no known parent package
And when I try
from app_root.shared.database.models import *
I get
Exception has occurred: ModuleNotFoundError: No module named 'app_root'
What am I doing wrong?

In the file where you perform the import, try adding this:
import os
import sys
sys.path.append(os.path.abspath('../../..'))
from app_root.shared.database.models import *
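Note that os.path.abspath('../../..') is resolved against the current working directory, not this file's location, so the snippet only works when the app is launched from the right place. A CWD-independent sketch, assuming main.py sits in app_root/app1/:
import os
import sys

# app_root/app1/main.py -> two dirname() calls up is app_root itself
APP_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Put app_root's parent on sys.path so 'app_root' is importable as a package
sys.path.append(os.path.dirname(APP_ROOT))

from app_root.shared.database.models import *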

Related

How to get the correct path for a django script

Here is my directory tree.
V1:
project/
---AppUser/
------models.py, views.py, etc.
---project/
------settings.py, manage.py, etc.
myscript.py
Here my script works perfectly:
import sys
import os
import django
sys.path.append("../../../project")
os.environ["DJANGO_SETTINGS_MODULE"] = "project.settings"
django.setup()
from AppUser.models import Subscription
maps = Subscription.objects.get(uuid="1234565")
print(maps)
It works fine when I launch it from the root of the project. But then I want to put my script in a script folder:
V2:
project/
---AppUser/
------models.py, views.py, etc.
---project/
------settings.py, manage.py, etc.
---script/
------myscript.py
Here is my script:
import sys
import os
import django
sys.path.append("../../../../project")
os.environ["DJANGO_SETTINGS_MODULE"] = "project.settings"
django.setup()
from AppUser.models import Subscription
maps = Subscription.objects.get(uuid="123")
print(maps)
And when I am in script/ and run python3 myscript.py, I get this error:
Traceback (most recent call last):
  File "myscript.py", line 12, in <module>
    from AppUser.models import Subscription
ModuleNotFoundError: No module named 'AppUser'
How can I run it from script/ without getting this error? The django.setup() seems to work fine, but something after it seems to have a problem.
To run the script, you don't have to be in the script folder, because the path is updated inside the script itself. Note, though, that a relative entry like
sys.path.append("../../../../project")
is resolved against the current working directory, not the script's location, so it only resolves correctly from one particular launch directory. If you want to run the script from inside script/, point the entry at the project root from there instead:
sys.path.append("..")

Python import files from 3 layers

I have the following file structure
home/user/app.py
home/user/content/resource.py
home/user/content/call1.py
home/user/content/call2.py
I have imported resource.py in app.py as below:
import content.resource
Also, I have imported call1 and call2 in resource.py:
import call1
import call2
The requirement is to run two tests individually:
run app.py
run resource.py
When I run app.py, it says it cannot find call1 and call2. When I run resource.py, the file runs without any issues. How can I run app.py so that it picks up the imports in resource.py as well as call1.py and call2.py? All four files have a main function (if __name__ == '__main__').
In your __init__.py files, just create a list like this for each one. For your user __init__.py:
__all__ = ["app", "content"]
And for your content __init__.py:
__all__ = ["resource", "call1", "call2"]
First try: export PYTHONPATH=/home/user  <-- Make sure this is the correct absolute path.
If that doesn't solve the issue, try adding content to the path as well.
try: export PYTHONPATH=/home/user/:/home/user/content/
This should definitely work.
You will then import like so:
import user.app
import user.content.resource
NOTE
Whatever you want to use, you must import in every file. Don't bother importing things in __init__.py; just list whatever modules the package includes by setting __all__ = [...].
You have to import call1 and call2 in app.py if you want to call them there.
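As an alternative to setting PYTHONPATH, here is a sketch (assuming you launch Python from /home/user): have resource.py use package-qualified imports, so the same file works both when app.py imports it and when it is run on its own.
# content/resource.py -- package-qualified imports instead of the bare
# 'import call1', which only resolves when content/ itself is on sys.path
from content import call1
from content import call2
Then python app.py works from /home/user (the script's own directory is placed on sys.path automatically), and resource.py can still be run individually with python -m content.resource from that same directory.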

How does one import python files from the same sub-directory?

I am trying to create a project with the Panda3D game engine. I have the files stored in C:/%users%/Documents/Python/Starbound. Here is the directory of the project:
Starbound
|--.git
| |--all the git system stuff
|--__pycache__
| |--__init__.cpython-37.pyc
| |--main.cpython-37.pyc
| |--run.cpython-37.pyc
|--__init__.py
|--main.py
|--run.py
I would like to use run.py as an easy command-line quick-run for the project. It is a script designed to call the main application as a library. This allows me to change the order of the setup without accidentally messing up the main program. When I call run.py from Windows CMD (in the Starbound directory), I get a traceback to line 9 of run.py:
'loadWorld' missing 1 required positional argument: 'self'
When I import run.py from the Python interpreter, I get a different traceback to line 5 of run.py:
ModuleNotFoundError: No module named 'main'
run.py:
#run.py
#File to call "main.py"
#from Starbound import main
import main
print("Import of main file successful.")
App = main.Application
print("Declaration of application class successful.")
App.loadWorld()
print("Loading of world successful.")
main.py:
#main.py
#File which contains the application control. Designed to be called from "run.py".
from direct.showbase.ShowBase import ShowBase
from direct.task import Task
from direct.actor.Actor import Actor
from direct.interval.IntervalGlobal import Sequence
from panda3d.core import Point3
class Application(ShowBase):
    #variables
    def __init__(self):
        ShowBase.__init__(self)

    def loadMainMenu(self):
        print("Main menu is not currently available.")

    def loadWorld(self):
        self.scene = self.loader.loadModel('models/environment')
        self.scene.reparentTo(self.render)
__init__.py:
import sys
sys.path.insert(1, '/Starbound')
How do I call main.py from __init__.py and other files?
Visual Studio 2019 installation of Python 3.7.5, Windows 10 Home.
If you want to be able to call a Python package, create a __main__.py file inside it. It can then be run with python -m mymodule (which executes mymodule/__main__.py).
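As for the first traceback: run.py assigns the class itself (App = main.Application), so App.loadWorld() is called on the class without an instance, which is what produces the missing 'self' error. A minimal sketch of the fix:
#run.py
import main

app = main.Application()  # note the parentheses: create an instance
app.loadWorld()           # bound method call, so 'self' is supplied
app.run()                 # enter ShowBase's main loop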

Why are my custom operators not being imported into my DAG (Airflow)?

I am creating an ETL pipeline using Apache Airflow, and I am trying to create generalized custom operators. There seems to be no problem with the operators themselves, but they are not being imported into my DAG Python file.
This is my directory structure.
my_project\
    .env
    Pipfile
    Pipfile.lock
    .gitignore
    .venv\
    airflow\
        dags\
        logs\
        plugins\
            __init__.py
            helpers\
            operators\
                __init__.py
                data_quality.py
                load_fact.py
                load_dimension.py
                stage_redshift
This is what is present in the __init__.py file under the plugins folder:
from __future__ import division, absolute_import, print_function
from airflow.plugins_manager import AirflowPlugin
import airflow.plugins.operators as operators
import airflow.plugins.helpers as helpers
# Defining the plugin class
class SparkifyPlugin(AirflowPlugin):
    name = "sparkify_plugin"
    operators = [
        operators.StageToRedshiftOperator,
        operators.LoadFactOperator,
        operators.LoadDimensionOperator,
        operators.DataQualityOperator
    ]
    helpers = [
        helpers.SqlQueries
    ]
I'm importing these operators into my DAG file as follows:
from airflow.operators.sparkify_plugin import (StageToRedshiftOperator,
                                               LoadFactOperator,
                                               LoadDimensionOperator,
                                               DataQualityOperator)
I am getting an error as follows:
ERROR - Failed to import plugin /Users/user_name/Documents/My_Mac/Projects/sparkify_etl_sql_to_sql/airflow/plugins/operators/stage_redshift.py
Can you help me understand why this is happening?
I figured out how to register my custom operators with Airflow without dedicating a Python file to the AirflowPlugin class.
I achieved this by declaring them in my __init__.py file under plugins directory.
This is how I did it.
My project folder structure is as follows:
my_project\
    .env
    Pipfile
    Pipfile.lock
    .gitignore
    .venv\
    airflow\
        dags\
        logs\
        plugins\
            __init__.py
            helpers\
            operators\
                __init__.py
                data_quality.py
                load_fact.py
                load_dimension.py
                stage_redshift
My code in plugins/__init__.py
from airflow.plugins_manager import AirflowPlugin
import operators
import helpers
# Defining the plugin class
class SparkifyPlugin(AirflowPlugin):
    name = "sparkify_plugin"
    operators = [
        operators.StageToRedshiftOperator,
        operators.LoadFactOperator,
        operators.LoadDimensionOperator,
        operators.DataQualityOperator
    ]
    helpers = [
        helpers.SqlQueries
    ]
My code in plugins/operators/__init__.py
from operators.stage_redshift import StageToRedshiftOperator
from operators.load_fact import LoadFactOperator
from operators.load_dimension import LoadDimensionOperator
from operators.data_quality import DataQualityOperator
__all__ = [
    'StageToRedshiftOperator',
    'LoadFactOperator',
    'LoadDimensionOperator',
    'DataQualityOperator'
]
I am importing these custom operators in my DAG file (dags/etl.py) as:
from airflow.operators.sparkify_plugin import LoadDimensionOperator
sparkify_plugin is what the name attribute in the SparkifyPlugin class (stored in plugins/__init__.py) holds.
Airflow automatically registers these custom operators.
Hope it helps someone else in the future.
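For reference, a minimal DAG using one of the registered operators might look like the sketch below; any constructor arguments beyond task_id are omitted because the operator signatures aren't shown above:
# dags/etl.py -- illustrative sketch only
from datetime import datetime
from airflow import DAG
from airflow.operators.sparkify_plugin import LoadDimensionOperator

dag = DAG('etl', start_date=datetime(2020, 1, 1), schedule_interval=None)

load_dimension = LoadDimensionOperator(
    task_id='load_dimension',
    dag=dag
)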
In case you are having some import errors, try running python __init__.py for each module, as described by @absolutelydevastated. Make sure that the one in the plugins directory runs without throwing errors.
I used PyCharm, and it did throw me a few errors when running the __init__.py files in the plugins/operators directory. Fixing the one in the plugins directory and ignoring the errors thrown by plugins/operators/__init__.py fixed my issue.
If you check out Writing and importing custom plugins in Airflow: the person there was having a similar problem with their plugin, which they fixed by including a file under airflow/plugins named after their plugin, rather than defining everything in the __init__.py file.
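In that layout, the plugin class moves out of plugins/__init__.py into a dedicated module, e.g. plugins/sparkify_plugin.py (a sketch reusing the class shown above):
# plugins/sparkify_plugin.py -- the same plugin definition as above,
# moved into a module named after the plugin
from airflow.plugins_manager import AirflowPlugin
import operators
import helpers

class SparkifyPlugin(AirflowPlugin):
    name = "sparkify_plugin"
    operators = [
        operators.StageToRedshiftOperator,
        operators.LoadFactOperator,
        operators.LoadDimensionOperator,
        operators.DataQualityOperator
    ]
    helpers = [helpers.SqlQueries]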

Unable to run celery task directly but still possible via Python console

I'd like to run a simple test (run a task) first via RabbitMQ and, once this is set up correctly, encapsulate it in Docker and run it from there.
My structure looks like so:
-rabbitmq_docker
- test_celery
- __init__.py
- celeryapp.py
- celeryconfig.py
- runtasks.py
- tasks.py
- docker-compose.yml
- dockerfile
- requirements.txt
celeryconfig.py
## List of modules to import when celery starts
CELERY_IMPORTS = ['test_celery.tasks',] # Required to import module containing tasks
## Message Broker (RabbitMQ) settings
CELERY_BROKER_URL = "amqp://guest@localhost//"
CELERY_BROKER_PORT = 5672
CELERY_RESULT_BACKEND = 'rpc://'
celeryapp.py
from celery import Celery
app = Celery('test_celery')
app.config_from_object('test_celery.celeryconfig', namespace='CELERY')
__init__.py
from .celeryapp import app as celery_app
run_tasks.py
from tasks import reverse
from celery.utils.log import get_task_logger
LOGGER = get_task_logger(__name__)
if __name__ == '__main__':
    async_result = reverse.delay("rabbitmq")
    LOGGER.info(async_result.get())
tasks.py
from test_celery.celeryapp import app
@app.task(name='tasks.reverse')
def reverse(string):
    return string[::-1]
I run celery -A test_celery worker --loglevel=info from the rabbitmq_docker directory. Then in a separate window I trigger reverse.delay("rabbitmq") in the Python console, after importing the required module. This works. But when I try to trigger the reverse function via run_tasks.py, i.e. python test_celery/run_tasks.py, I get:
Traceback (most recent call last):
  File "test_celery/run_tasks.py", line 1, in <module>
    from tasks import reverse
  File "/Users/my_mbp/Software/rabbitmq_docker/test_celery/tasks.py", line 1, in <module>
    from test_celery.celeryapp import app
ModuleNotFoundError: No module named 'test_celery'
What I don't get is why this traceback isn't thrown when the function is called directly from the Python console. Could anyone help me out here? I'd eventually like to start up Docker and just run the tests automatically (without going into the Python console).
The problem is simply that your module is not on the Python path.
Any of these should help:
Specify PYTHONPATH to point to the directory containing your test_celery package, as shown below.
Always run your Python code from the directory where your test_celery package is located.
Or, alternatively, reorganise your imports...
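Concretely, with the layout above, either of these should work (a sketch; the absolute path comes from the traceback):
# Option 1: point PYTHONPATH at the directory containing test_celery/
export PYTHONPATH=/Users/my_mbp/Software/rabbitmq_docker
python test_celery/run_tasks.py

# Option 2: run it as a module from that directory, after changing the
# first line of run_tasks.py to 'from test_celery.tasks import reverse'
cd /Users/my_mbp/Software/rabbitmq_docker
python -m test_celery.run_tasks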
