How to package Python 3 modules for Google App Engine - python-3.x

I'm trying to figure out how to add an internal package to a Google App Engine deployment with Python 3 (standard).
For Python 2, the way to package modules was to use a local lib/ folder and appengine_config.py. This no longer seems to work for Python 3? At least my app cannot find modules that are in the lib/ folder.
For Python 3 it's possible to just pip3 install -t . the package, but this gets really messy: all the packages end up installed in the app root and will also be added to our app's git repository.
We cannot use requirements.txt as the module is internal and will not be available on PyPI.
Is there another way to package modules for Google App Engine using Python 3?

The Python 3.7 Standard runtime is an "idiomatic" Python runtime, which means it doesn't automatically add the special lib directory to the search path for modules.
You should continue "vendoring" your private modules into a lib directory, but you'll need to make a change to how you import them.
If your private package is foobar, and you've done pip install -t lib foobar, then in your project, instead of:
import foobar
you do:
import lib.foobar
(You'll also need to add an empty __init__.py file to your lib directory, to make it a package.)
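For example, with foobar vendored into lib, the relevant part of the project would look roughly like this (a minimal sketch; foobar and its greet function are placeholder names, not part of the original answer):
app.yaml
main.py
lib/
  __init__.py
  foobar/
# main.py -- note the lib. prefix on the vendored package
import lib.foobar as foobar  # instead of: import foobar

foobar.greet()  # hypothetical function, for illustration only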

Related

How to create a Python wheel, or determine what modules / libraries are within a Python wheel

I am trying to create a Python wheel for Great_Expectations. The .whl provided by Great_Expectations exists here: https://pypi.org/project/great-expectations/#files - great-expectations 0.13.25. Unfortunately, it appears that this .whl doesn't contain all the libraries I need in order to work with Great_Expectations in an Azure Synapse Apache Spark pool.
Therefore, it looks like I will either have to create my own Great_Expectations .whl (a Python package with all of its dependencies, for offline install), or at the very least establish what libraries are contained within the existing package great-expectations 0.13.25.
So, can someone let me know how to create a Python wheel (i.e. a Python package with all of its dependencies) for Great_Expectations? Alternatively, can someone let me know how to determine what modules/dependencies are contained within a package?
Thanks
To add new dependencies, update requirements.txt. (You would normally update install_requires in setup.py, but this project reads the requirements file to fetch its requirements.)
You will need to clone the git repo in order to update that list.
Then to create a new wheel out of that source, just run:
python setup.py bdist_wheel
(You may need to run pip install wheel first if wheel isn't installed.)
Docs: wheel
To the second question, what modules / libraries are within a Python wheel: just the package itself; the dependencies are resolved and installed separately when you install the package.
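If you just want to see what ships inside a wheel, note that a .whl file is an ordinary zip archive, so the standard library can list its contents (a small sketch; the filename assumes the wheel downloaded from PyPI):
import zipfile

# a .whl is a zip archive; list every file it contains
with zipfile.ZipFile("great_expectations-0.13.25-py3-none-any.whl") as whl:
    for name in whl.namelist():
        print(name)
The declared dependencies are listed as Requires-Dist lines in the *.dist-info/METADATA file inside the archive; they are not bundled as code. To collect the wheel together with all of its dependencies for an offline install, pip download great-expectations==0.13.25 -d wheels/ is also worth knowing.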
Consider using conda-pack. It was created for exactly this use case: making Python/Conda environments easily portable.
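As a sketch, assuming conda-pack is installed and my_env is the name of your Conda environment (both assumptions, not from the original answer), the whole environment, dependencies included, can be archived like this:
import conda_pack

# bundle the environment into a relocatable archive for offline transfer
conda_pack.pack(name="my_env", output="my_env.tar.gz")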

How can I set up my imports in order to run my python application without installing it, and still be able to run tox using poetry?

I have a Python 3.6 code-base which needs to be installed in the environment's site-packages directory in order to be called and used. After moving it to Docker, I decided to set up a shared volume between the Docker container and the host machine, to avoid copying and installing the code on the container and having to rebuild every time I made a change to the code and wanted to run it. In order to achieve this, I had to change a lot of the import statements from relative to absolute. Here is the structure of the application:
-root
  -src
    -app
    -test
In order to run the application from the root directory without installing it, I had to change a lot of the import statements from
from app import something
to:
import src.app.something
The problem is that I use poetry to build the app on an azure build agent, and tox to run the tests. The relevant part of my pyproject.toml file looks like this:
[tool.poetry]
name = "app"
version = "0.1.0"
packages = [{include = 'app', from='src'}]
The relevant part of my tox.ini file looks like this:
[tox]
envlist = py36, bandit, black, flake8, safety
isolated_build = True

[testenv:py36]
deps =
    pytest
    pytest-cov
    pytest-env
    pytest-mock
    fakeredis
commands =
    pytest {posargs} -m "not external_service_required" --junitxml=junit_coverage.xml --cov-report=html --cov-report=xml:coverage.xml
I'm not an expert in tox or poetry, but from what I could tell, the problem was that the src directory wasn't being included in the build artifact, only the inner app directory was, so I added a parent directory and changed the directory structure to this:
-root
  -app
    -src
      -app
      -test
And then changed the poetry configuration to the following in order to include the src directory:
[tool.poetry]
name = "app"
version = "0.1.0"
packages = [{include = 'src', from='app'}]
Now when I change the imports in the tests from this:
from app import something
to this:
from app.src.app import something
The import is recognized in PyCharm, but when I try to run tox -r, I get the following error:
E ModuleNotFoundError: No module named 'app'
I don't understand how tox installs the application, and what kind of package structure I need to specify in order to be able to call the code both from the code-base directory and from site-packages. I looked at some example projects and noticed that they don't use the isolated_build flag but rather skipsdist, yet somehow they also install the application into site-packages before running their tests.
Any help would be much appreciated.
Specs:
poetry version: 1.1.6
python version: 3.6.9
tox version: 3.7
environment: Azure Windows build agent
You have to change the imports back to from app import something; the src part is, as far as the code as a deliverable is concerned, completely transient. The same goes for adding in another app directory: your initial project structure was fine.
You were right about going from relative imports to absolute ones, though, so all that is necessary thereafter is telling the Python runtime within the container that root/src should be part of the PYTHONPATH:
export PYTHONPATH="${PYTHONPATH}:/path/to/root/src"
Alternatively, you can also update the path within your python code right before importing your package:
import sys
sys.path.append("/path/to/root/src")
import app # can be found now
Just to state the obvious: meddling with the interpreter's path in this way is a bit hacky, but as far as I'm aware it should work without any issues.

Overriding package libraries in Google App Engine project

I am writing a Google App Engine Django REST Framework project that uses external libraries through requirements.txt.
In one of the files of a module installed via requirements.txt, I am manually editing some code. How do I get GAE to use this modified version instead of the original one?
The way I am doing this is installing the packages in a folder called lib, modifying the package inside it and then creating a file called appengine_config.py which contains this:
from google.appengine.ext import vendor
vendor.add('lib')
But when I deploy it, it still uses the original package from requirements.txt. Any idea how to make this work?
GAE will use requirements.txt and install those libraries in the lib folder when you deploy. That is just how it works.
Nothing prevents you from deploying code outside the lib folder. You can structure your project like this:
GAE_folder:
-- app.yaml
-- requirements.txt
-- lib
-- my_app
-- my_custom_lib
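With that layout, anything deployed next to app.yaml is importable, so your application code can use the modified copy directly (a sketch; the module names below are hypothetical, not from the original answer):
# main.py, at the app root -- hypothetical names, for illustration
from my_custom_lib import patched_module  # the locally modified copy
from my_app import views                  # your application code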

How to package npm and python dependencies in a deb?

I am developing a small desktop application with a web GUI (ReactJS) and a Python backend (Flask). I need to pack this application into a deb package for simple distribution.
For now the scheme is pretty simple: I have a standard setup.py file where the Python dependencies are described, and a debian/rules which uses dh-python to parse that file and extract the dependencies into debian/control (if I understand correctly):
#!/usr/bin/make -f
export DH_VERBOSE = 1
export PYBUILD_NAME=myapp
%:
	dh $@ --with python3 --buildsystem=pybuild
All this works fine, but the problem is that I need to manage the npm dependencies as well (for the GUI part). I can't add something like npm run build as a custom build step in my setup.py, since pybuild sets up a proxy server to prevent downloading any outside packages (only deb dependencies are allowed). There are no deb packages for my npm dependencies, and I don't want to create them myself.
So the only way I found is to commit the npm dependencies (files like bootstrap.min.js, etc.) or the built bundle.js to the git repository, which seems bad. Is there any other way I can solve the problem?

How can I include the parent folder structure on a library distribution in Python 3.6 using setuptools?

I am using setuptools to distribute a Python library. I have the following directory structure:
/src
  /production
    setup.py
    /prod-library
      /package1
      /package2
The folder structure has to stay like this because there will be multiple libraries living under src in the future, each needing its own setup.py file. So the traditional answer of having one parent folder and moving setup.py out to the root folder will not work in this case.
I am using the following in the setup.py of the library to export the library (which is working)
package_dir={'': '.'},
packages=find_packages()
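In context, the relevant part of setup.py would look roughly like this (a sketch; the name and version are placeholders, not from the original question):
from setuptools import setup, find_packages

setup(
    name="prod-library",
    version="0.1.0",
    package_dir={'': '.'},    # packages are found relative to setup.py's directory
    packages=find_packages(),
)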
Inside the project tar.gz it looks like this:
/prod-library
  /package1
  /package2
But inside the prod-library package's Python files, imports referencing other modules need to be structured as follows:
import src.production.prod-library.package1
import src.production.prod-library.package2
The problem:
After importing one of those libraries into a different project, errors are raised as follows:
ModuleNotFoundError: No module named 'src.production'
Since the built distribution only contains /prod-library, the project importing the code fails due to the missing folder structure (src/production).
What I need to do is include the src/production folder in the distribution build so the resulting tar.gz file looks like this:
/src
  /production
    /prod-library
      /package1
      /package2
I am not sure how to include those folders in the build structure since they are above the setup.py location. How can that be accomplished?
If it can’t, then I am open to suggestions about fixing the imports if that can be a solution.
I found the solution to the problem. It has to do with how the package_dir was configured:
package_dir={'': '.'}
Although the above package_dir built the files and included all subfolders as expected, the egg-info file's SOURCES.txt was incorrect, showing the following:
./prod-library/__init__.py
./prod-library/package1/__init__.py
etc...
When the package was imported into another API, the imports could not be found when attempting import prod-library.package1.file
After changing the package_dir as follows, I was able to use the library normally:
package_dir={'.': ''}
The above effectively removed the ./ prefix in the SOURCES.txt file which was breaking the imports. Now the egg-info's SOURCES.txt looks correct:
prod-library/__init__.py
prod-library/package1/__init__.py
etc...
