mod_wsgi django not working with Apache24

I am using an MS Windows 11 machine running Python 3.11 with the virtualenv package installed.
I am using the Apache24 httpd.exe web server as the production server for my Django app.
My app is called mysite and it is fully functional inside a virtual environment (called venv) with the following packages (requirements.txt):
django
django-extensions
django-iprestrict
mod-wsgi
Pillow
pyOpenSSL
odfpy
werkzeug
whitenoise
pandas
plotly
matplotlib
I can fully run the server in DEBUG mode with the virtual environment activated:
(venv) python.exe manage.py runserver
Separately, I was able to make the Apache web server serve a test website without problems.
The issue arises when I edit the httpd.conf file to integrate my Django app through mod_wsgi:
# httpd.conf
(...)
LoadFile "C:/Users/myuser/AppData/Local/Programs/Python/Python311/python311.dll"
LoadModule wsgi_module "C:/Users/myuser/mysite/mysite_venv/venv/Lib/site-packages/mod_wsgi.cp311-win_amd64.pyd"
WSGIScriptAlias / C:\Users\myuser\mysite\mysite\wsgi.py
WSGIPythonHome C:\Users\myuser\mysite\mysite_venv\venv
WSGIPythonPath C:\Users\myuser\mysite
<Directory />
<Files wsgi.py>
Require all granted
</Files>
</Directory>
# media files hosting
Alias /media "C:/Users/myuser/mysite/media/"
<Directory "C:/Users/myuser/mysite/media/">
Require all granted
</Directory>
(...)
My directory tree is:
.
├── mainapp
│ ├── admin.py
│ ├── apps.py
│ ├── forms.py
│ ├── __init__.py
│ ├── migrations
│ ├── models.py
│ ├── static
│ ├── templates
│ ├── tests.py
│ ├── urls.py
│ └── views.py
├── mainapp.sqlite3
├── manage.py
├── media
├── mysite
│ ├── asgi.py
│ ├── __init__.py
│ ├── settings.py
│ ├── urls.py
│ └── wsgi.py
├── mysite_venv
│ ├── requirements.txt
│ ├── venv
└── staticfiles
├── admin
├── css
├── django_extensions
├── font
├── icon
├── javascript
├── js
└── uploads
The issue is that when I run httpd.exe, the web page loads forever because the server never responds to the client.
I opened the error.txt log file from Apache to find out what is going on, but there was no error message at all.
I took care to:
set the PYTHONPATH environment variable to the DLLs directory as described here
natively compile mod_wsgi and place the .pyd file in the correct path.
Can someone help me figure out what is going on? Thanks in advance!

There is some conflict between the plotly module and mod_wsgi that stalls HTTP/S requests when Apache24 loads plotly through WSGI.
The same thing happens with the pandas module.
I could work around the problem by commenting out all views and URLs that load these modules.
There is no problem with the virtualenv having these modules installed; the issue is that they cannot be loaded by mod_wsgi.
The big question is that the development server works with them through python manage.py runserver; the issue appears exactly when you put the server into production mode.
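The behaviour described above (requests hanging as soon as pandas or plotly is imported) matches a well-documented mod_wsgi limitation: many C-extension packages do not support Python sub-interpreters and can deadlock on import. A hedged suggestion, taken from the mod_wsgi documentation rather than from this thread, is to force the application into the main interpreter by adding one directive to the httpd.conf shown above:

```apache
# Force the WSGI application to run in the main (first) Python interpreter.
# numpy/pandas (and libraries built on them, such as plotly) are known to
# deadlock when imported inside a mod_wsgi sub-interpreter.
WSGIApplicationGroup %{GLOBAL}
```

If this makes the requests complete, it confirms the sub-interpreter hypothesis without having to comment out the affected views.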

Related

How to import modules in same directory between different packages

I am not able to import certain files/classes
My directory structure is below:
Main Project Directory
├── config
│   ├── config.py
│   ├── constants.py
│   └── __init__.py
└── utils
├── __init__.py
├── logger_config.py
├── sql_helper.py
├── update_data_on_request.py
└── utility_functions.py
I want to import config.py from config into utility_functions.
Currently __init__.py is blank.
The error I faced was ModuleNotFoundError: No module named 'config'.
Try using from config.config import * in utility_functions.py.
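That absolute import only succeeds if the project root is on sys.path, which is not the case when utility_functions.py is run directly from inside utils. A self-contained sketch (the question's layout is rebuilt in a temp directory; the VALUE constant is invented for illustration) showing a sys.path fix combined with the suggested import:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Recreate the question's layout: config/ and utils/ under one project root.
root = Path(tempfile.mkdtemp())
(root / "config").mkdir()
(root / "config" / "__init__.py").write_text("")
(root / "config" / "config.py").write_text("VALUE = 42\n")
(root / "utils").mkdir()
(root / "utils" / "__init__.py").write_text("")
(root / "utils" / "utility_functions.py").write_text(
    "import sys\n"
    "from pathlib import Path\n"
    # Prepend the project root so 'config' becomes importable.
    "sys.path.insert(0, str(Path(__file__).resolve().parent.parent))\n"
    "from config.config import *\n"
    "print(VALUE)\n"
)

# Run utility_functions.py directly, as the asker does.
proc = subprocess.run(
    [sys.executable, str(root / "utils" / "utility_functions.py")],
    capture_output=True, text=True,
)
print(proc.stdout.strip())  # 42
```

Without the sys.path.insert line, the same run reproduces the ModuleNotFoundError.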

Get Data files from Python Package to local working directory

I am making a package that uses Flask. I will be using a single command to create the project. For the Flask application to run properly, I want the following directories in the user's local directory:
├── main.py
├── requirements.txt
├── static
│   └── style.css
├── templates
│   └── index.html
How do I package the code so that I am able to store these files in the user's local directory?
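No answer is recorded in this thread, but one common approach is to ship static/ and templates/ as package data and copy them into the user's current directory when the project-creation command runs. A generic sketch of the copy step (the function name copy_tree is invented); in a real package the source directory would come from importlib.resources.files() on your installed package, which for wheel installs extracted on disk behaves like a plain Path:

```python
import shutil
from pathlib import Path

def copy_tree(src: Path, dest: Path) -> list[str]:
    """Copy every regular file under src into dest, creating dest if needed."""
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for item in sorted(src.iterdir()):
        if item.is_file():
            shutil.copy(item, dest / item.name)
            copied.append(item.name)
    return copied
```

The project-creation command would call copy_tree once per directory (static, templates), after declaring the files as package data (e.g. include_package_data=True plus a MANIFEST.in, or the package-data table in pyproject.toml).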

google.cloud namespace import error in __init__.py

I have read through at least a dozen different stackoverflow questions that all present the same basic problem and have the same basic answer: either the module isn't installed correctly or the OP is doing the import wrong.
In this case, I am trying to do from google.cloud import secretmanager_v1beta1.
It works in my airflow container when I run airflow dags or if I run pytest tests/dags/test_my_dag.py. However, if I run cd dags; python -m my_dag or cd dags; python my_dag.py I get this error:
from google.cloud import secretmanager as secretmanager
ImportError: cannot import name 'secretmanager' from 'google.cloud' (unknown location)
I can add from google.cloud import bigquery in the line right above this line and that works OK. It appears to literally just be a problem with this particular package.
Why does it matter if pytest and airflow commands succeed? Because, I have another environment where I am trying to run dataflow jobs from the command-line and I get this same error. And unfortunately I don't think I can bypass this error in that environment for several reasons.
UPDATE 6
I have narrowed down the error to an issue with the google.cloud namespace and the secretmanager package within that namespace in the __init__.py file.
If I add from google.cloud import secretmanager to airflow/dags/__init__.py and then try to run python -m dags.my_dag.py, I receive this error but with a slightly different stacktrace:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/runpy.py", line 183, in _run_module_as_main
mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
File "/usr/local/lib/python3.7/runpy.py", line 109, in _get_module_details
__import__(pkg_name)
File "/workspace/airflow/dags/__init__.py", line 3, in <module>
from google.cloud import secretmanager
ImportError: cannot import name 'secretmanager' from 'google.cloud' (unknown location)
OLD INFORMATION
I am 95% sure that it's still a path problem and that pytest and airflow are fixing something I'm not aware of that isn't handled when I try to manually run the python script.
Things I have tried:
cd /airflow; python setup.py develop --user
cd /airflow; pip install -e . --user
cd /airflow/dags; pip install -r requirements.txt --user
UPDATE
As per requests in the comments, here are the contents of requirements.txt:
boto3>=1.7.84
google-auth==1.11.2
google-cloud-bigtable==1.2.1
google-cloud-bigquery==1.24.0
google-cloud-spanner==1.14.0
google-cloud-storage==1.26.0
google-cloud-logging==1.14.0
google-cloud-secret-manager>=0.2.0
pycloudsqlproxy>=0.0.15
pyconfighelper>=0.0.7
pymysql==0.9.3
setuptools==45.2.0
six==1.14.0
And I accidentally omitted the --user flags from the pip and python installation command examples above. In my container environment everything is installed into the user's home directory using --user and NOT in the global site-packages directory.
UPDATE 2
I've added the following code to the file that is generating the error:
print('***********************************************************************************')
import sys
print(sys.path)
from google.cloud import secretmanager_v1beta1 as secretmanager
print('secretmanager.__file__: {}'.format(secretmanager.__file__))
From airflow list_dags:
['/home/app/.local/bin', '/usr/local/lib/python37.zip', '/usr/local/lib/python3.7', '/usr/local/lib/python3.7/lib-dynload', '/home/app/.local/lib/python3.7/site-packages', '/home/app/.local/lib/python3.7/site-packages/Jeeves-0.0.1-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/google_cloud_secret_manager-0.2.0-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/pyconfighelper-0.0.7-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/click-7.1.1-py3.7.egg', '/workspace/airflow', '/usr/local/lib/python3.7/site-packages', '/workspace/airflow/dags', '/workspace/airflow/config', '/workspace/airflow/plugins']
secretmanager.__file__: /home/app/.local/lib/python3.7/site-packages/google_cloud_secret_manager-0.2.0-py3.7.egg/google/cloud/secretmanager_v1beta1/__init__.py
From python my_dag.py:
['/workspace/airflow/dags', '/usr/local/lib/python37.zip', '/usr/local/lib/python3.7', '/usr/local/lib/python3.7/lib-dynload', '/home/app/.local/lib/python3.7/site-packages', '/home/app/.local/lib/python3.7/site-packages/Jeeves-0.0.1-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/google_cloud_secret_manager-0.2.0-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/pyconfighelper-0.0.7-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/click-7.1.1-py3.7.egg', '/home/app/.local/lib/python3.7/site-packages/icentris_ml_airflow-0.0.0-py3.7.egg', '/usr/local/lib/python3.7/site-packages']
UPDATE 3
tree airflow/dags
airflow/dags
├── __init__.py
├── __pycache__
│   ├── __init__.cpython-37.pyc
│   ├── bq_to_cs.cpython-37.pyc
│   ├── bq_to_wrench.cpython-37.pyc
│   ├── fetch_cloudsql_tables-bluesun.cpython-37.pyc
│   ├── fetch_cloudsql_tables.cpython-37.pyc
│   ├── fetch_app_tables-bluesun.cpython-37.pyc
│   ├── fetch_app_tables.cpython-37.pyc
│   ├── gcs_to_cloudsql.cpython-37.pyc
│   ├── gcs_to_s3.cpython-37.pyc
│   ├── lake_to_staging.cpython-37.pyc
│   ├── schedule_dfs_sql_to_bq-bluesun.cpython-37.pyc
│   ├── schedule_dfs_sql_to_bq.cpython-37.pyc
│   ├── app_to_bq_initial_load-bluesun.cpython-37.pyc
│   ├── app_to_lake-bluesun.cpython-37.pyc
│   └── app_to_lake.cpython-37.pyc
├── bq_to_wrench.py
├── composer_variables.json
├── my_ml_airflow.egg-info
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   ├── dependency_links.txt
│   └── top_level.txt
├── lake_to_staging.py
├── libs
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── __init__.cpython-37.pyc
│   │   ├── checkpoint.cpython-37.pyc
│   │   └── utils.cpython-37.pyc
│   ├── checkpoint.py
│   ├── io
│   │   ├── __init__.py
│   │   ├── __pycache__
│   │   │   └── __init__.cpython-37.pyc
│   │   └── gcp
│   │   ├── __init__.py
│   │   ├── __pycache__
│   │   │   ├── __init__.cpython-37.pyc
│   │   │   └── storage.cpython-37.pyc
│   │   └── storage.py
│   ├── shared -> /workspace/shared/
│   └── utils.py
├── requirements.txt
├── table_lists
│   └── table-list.json
└── templates
└── sql
├── lake_to_staging.contacts.sql
├── lake_to_staging.orders.sql
└── lake_to_staging.users.sql
11 directories, 41 files
UPDATE 4
I tried fixing it so that sys.path looked the same when running python dags/my_dag.py as it does when running airflow list_dags or pytest test_my_dag.py.
Still get the same error.
Looking at a more recent version of documentation, I noticed that you should be able to just do from google.cloud import secretmanager. Which gave me the same result (works with airflow and pytest, not when trying to run directly).
At this point, my best guess is that it has something to do with namespace magic, but I'm not sure?
It has to be installed via the terminal: pip install google-cloud-secret-manager
because the package name is not secretmanager but google-cloud-secret-manager.
Similar to Noah's answer, this fixed the issue for me without having to import an unneeded module:
import google.cloud.secretmanager as secretmanager
After much trial and error, the issue is that currently one cannot import secretmanager from the google.cloud namespace if one has not previously imported another package from google.cloud.
Ex.
mod.py
from google.cloud import secretmanager # Fails with error
mod2.py
from google.cloud import bigquery
from google.cloud import secretmanager # Works because the first import initialized the namespace
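The initialization quirk described above stems from google.cloud being a namespace package: several separately installed distributions all contribute subpackages to one shared top-level package. A minimal, self-contained sketch of how PEP 420 native namespace packages are supposed to behave (the names ns, dist_a and dist_b are invented for illustration; the ordering problem in the answer above is commonly associated with the older pkg_resources/egg style of namespace handling):

```python
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
for dist, sub in [("dist_a", "bigquery"), ("dist_b", "secretmanager")]:
    # Note: no ns/__init__.py anywhere, which is what makes "ns" a
    # PEP 420 namespace package rather than a regular package.
    pkg = root / dist / "ns" / sub
    pkg.mkdir(parents=True)
    (pkg / "__init__.py").write_text(f"NAME = {sub!r}\n")
    sys.path.insert(0, str(root / dist))

# Each subpackage resolves independently because the namespace package's
# __path__ spans both sys.path entries; no "priming" import is required.
from ns import secretmanager
from ns import bigquery

print(secretmanager.NAME, bigquery.NAME)  # secretmanager bigquery
```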
I'm using Python 3.8.2 and google-cloud-secret-manager 2.7.1. The following line fixed the issue for me:
from google.cloud import secretmanager_v1beta1 as secretmanager

AWS Lambda layer and local structure

I'm facing a problem with configuring my local environment to reproduce the behaviour of the prod env.
A simplified view of my prod env :
├── λf A
│ └── layer L
│
├── λf B
│ └── layer L
│
└── λf C
A, B & C are classic Node.js Lambda functions. A & B share some common dependencies (say, lodash.js) that I want to group under a Lambda layer: L.
For my dev env, I'm using Lerna (but that's not mandatory) to work as a mono repo with this structure :
.
├── packages
│ ├── A
│ │ ├── node_modules
│ │ └── package.json
│ ├── B
│ │ ├── node_modules
│ │ └── package.json
│ ├── C
│ │ ├── node_modules
│ │ └── package.json
│ └── L
│ ├── node_modules
│ └── package.json
├── package.json
└── lerna.json
To ship L, all I have to do is install its dependencies and copy the content of the node_modules folder into the nodejs/node_modules path of my layer.
But what I'm currently unable to do is make modules A & B resolve their dependencies both in the standard node_modules paths and in the L node_modules folder.
Some solutions that are not acceptable :
Duplicating the dependencies
Adding the L dependencies globally in the lerna package.json (because in my real environment I have multiple layers)
Technically, this is the complex case when you build an environment on Lambda.
Lambda has its own separate runtime environment, so it is difficult to treat it the same as a traditional environment.
But there is an easier path that AWS provides.
How about creating your Lambda function from a container image?
Basically, a Lambda layer is a "hidden" file system layer in the Lambda virtual machine, so it works very similarly to a containerized, layered package.
If you build your serverless function as a container package, you can make your development environment the same as the production environment.
Reference :
https://docs.aws.amazon.com/lambda/latest/dg/images-create.html
You can follow this out of the box example to start :
https://github.com/jinspark-lab/lambdakit
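The container-image approach sketched above can be illustrated with a minimal Dockerfile for one of the Node.js functions; the base image tag, file names, and handler name (index.handler) are placeholders, not taken from the question:

```dockerfile
# AWS-provided base image for Node.js Lambda functions
FROM public.ecr.aws/lambda/nodejs:18

# Install the shared dependencies (e.g. lodash) directly into the image,
# replacing what layer L provided in the zip-based deployment.
COPY package.json package-lock.json ${LAMBDA_TASK_ROOT}/
RUN npm ci --omit=dev

# Copy the function code and point Lambda at its handler
COPY index.js ${LAMBDA_TASK_ROOT}/
CMD ["index.handler"]
```

Because the image is built locally with the same Dockerfile that production uses, A and B resolve their dependencies identically in dev and prod, at the cost of duplicating the shared dependencies in each image.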

Relative vs. Absolute imports for Flask and nose

I have a Flask app that has the following directory structure:
├── README.md
├── __init__.py
├── constants.py
├── businesspackage
│   ├── README.md
│   ├── __init__.py
│   ├── __pycache__
│   ├── detection
│   ├── flagging_spec.txt
│   └── tests
├── requirements.txt
├── run.py
└── tests
├── __init__.py
├── __pycache__
└── test_api.py
Within detection's __init__.py, I have imported the necessary classes, so that I can import them from that top-level module rather than needing to give the full path to each of the .py files inside it.
I am attempting to import some classes from detection inside run.py, but I come across the following error when I try to run my Flask application from the top-level directory using python3 run.py:
Traceback (most recent call last):
File "run.py", line 9, in <module>
from .businesspackage.detection import AdsDetection
SystemError: Parent module '' not loaded, cannot perform relative import
After reading some other questions on here, the suggestion was to change from a relative import to an absolute import. However, if I try the following import in run.py:
from businesspackage.detection import AdsDetection
Then I can run my Flask server without the import error; however, my imports break for the nose test runner. I am running the tests using the nosetests command with nose 1.3.7. How can I define my imports so that they work for both the Flask server and for nosetests?
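The error itself is easy to reproduce and is independent of Flask: a relative import is only legal inside a package, never in a script executed directly (on current Python 3 the message reads "attempted relative import with no known parent package" rather than "Parent module '' not loaded"). A self-contained sketch:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
# run.py uses a relative import at the top level of a directly-run script.
(root / "run.py").write_text("from .pkg import x\n")
pkg = root / "pkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("x = 1\n")

# Running run.py directly gives it __name__ == "__main__" and no parent
# package, so the relative import fails regardless of the directory layout.
proc = subprocess.run([sys.executable, "run.py"], cwd=root,
                      capture_output=True, text=True)
print(proc.returncode != 0)  # True
```

Rewriting the same line as an absolute import (from pkg import x) makes the direct run succeed, which is why the absolute-import advice below works for python3 run.py.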
Edit:
businesspackage.__init__.py is the following:
from .business_detection import BusinessDetector
So, for some weird reason, I managed to get the absolute imports to work after deleting the __init__.py file in the base-level directory; i.e. my directory structure looks as follows:
├── README.md
├── __init__.py
├── constants.py
├── businesspackage
│ ├── README.md
│ ├── __init__.py
│ ├── __pycache__
│ ├── detection
│ ├── flagging_spec.txt
│ └── tests
├── requirements.txt
├── run.py
└── tests
├── __init__.py
├── __pycache__
└── test_api.py
I figured I should give it a try after seeing one of the answers here: Python imports for tests using nose - what is best practice for imports of modules above current package. So now all of my package uses absolute imports.
