I have a device with Python 3.7 pre-installed, but without pip or any additional packages.
I wrote the program on my local machine with some packages in my venv (I have a requirements.txt file) and it works perfectly.
My problem is that I now want to create a directory with my programs and upload it to my device. This doesn't work because the additional packages aren't installed on the device.
My question: is there a way to export the installed packages to a directory inside my program files and import them from there, rather than from the venv?
Copy all the venv modules to some directory and modify the PYTHONPATH variable when running your program, appending your modules' directory path to it.
man python3
PYTHONPATH
Augments the default search path for module files. The format is the same as the shell's $PATH: one or more directory pathnames separated by colons. Non-existent directories are silently ignored. The default search path is installation dependent, but generally begins with ${prefix}/lib/python<version> (see PYTHONHOME above). The default search path is always appended to $PYTHONPATH. If a script argument is given, the directory containing the script is inserted in the path in front of $PYTHONPATH. The search path can be manipulated from within a Python program as the variable sys.path.
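In practice that means you can either set the variable when launching the program, or extend sys.path at startup. A minimal sketch of the second approach, assuming you copied the venv's modules into a "vendor" directory next to your script (the directory name is an assumption, use whatever layout you ship):

```python
import os
import sys

# Hypothetical layout: the copied venv modules live in a "vendor"
# folder next to this script
vendor_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "vendor")

# Prepend it so these modules win over any system-installed versions;
# equivalent to launching with: PYTHONPATH=/path/to/vendor python3 myprog.py
if vendor_dir not in sys.path:
    sys.path.insert(0, vendor_dir)
```

Prepending (rather than appending) matters if the device's system Python ships older versions of the same packages.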
In general, you have the following options to run a Python script on a device other than the one you developed it on:
Generate an executable (for example with the package pyinstaller). With that solution, Python does not even need to be installed on the device, as everything is embedded in the executable.
If you have Python installed on the device (as in your case), you can just run the script there. However, if you have dependencies (from PyPI or conda), you must also install them on the device:
If you have internet access and your requirements.txt file, you can just run pip install -r requirements.txt
If you don't have internet access, you can either:
download the wheel for each package and then ship it to your device, or
ship the contents of the lib and lib64 folders of your local machine's virtual environment folder .venv (I hope you are using one: python -m venv .venv) into the virtual environment on your device
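The wheel-download route can be sketched like this, assuming pip is available on the build machine and the two machines share the same OS, architecture, and Python version (wheels can be platform-specific):

```shell
# On the machine with internet access: fetch wheels for everything
# listed in requirements.txt into a local ./wheels directory
python3 -m pip download -r requirements.txt -d ./wheels

# Ship the ./wheels directory to the device, then install offline there:
python3 -m pip install --no-index --find-links ./wheels -r requirements.txt
```

--no-index stops pip from contacting PyPI, and --find-links tells it to resolve everything from the local directory instead.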
Related
I want to create a venv environment (not virtualenv) using the following commands:
sudo apt-get install python3.8-venv
python3.8 -m venv venv_name
source venv_name/bin/activate
But it seems that the environment contains dependencies on the system where it was created, which causes problems whenever I want to make it portable. That is, when I copy this folder along with my project and run it on another machine, I want it to work without any changes.
But I am unable to activate the environment (it gets activated, but the interpreter still uses the system's python and pip).
Therefore, I tried making another venv on the second computer and copied the lib and lib64 folders from the old venv into this newer venv (without replacing existing files), but this time I get the following error:
File "/usr/local/lib/python3.8/ctypes/__init__.py", line 7, in <module>
    from _ctypes import Union, Structure, Array
ModuleNotFoundError: No module named '_ctypes'
But the interesting thing is, if you notice, even the newly created venv on the new machine is searching for the missing module in the system Python installation (/usr/local/lib/python3.8) and not in the venv.
How do I make the venv portable along with all its dependencies, so that I can reliably deploy it on another device just by activating it?
Disclaimer: None of this is my work; I just found this blog post and will briefly summarize it: https://aarongorka.com/blog/portable-virtualenv/ (archived)
Caveat: This only works (semi-reliably) among Linux machines. Don't use in production!
The first step is to get copies of your python-executables in the venv/bin folder, so be sure to specify --copies when creating the virtual environment:
python3 -m venv --copies venv
All that's left is changing the hardcoded absolute paths into relative paths, using your tool of choice. In the blog post, they recompute the path with pwd (after changing to the venv parent directory) whenever venv/bin/activate is run:
sed -i '43s/.*/VIRTUAL_ENV="$(cd "$(dirname "$(dirname "${BASH_SOURCE[0]}" )")" && pwd)"/' venv/bin/activate
Then, similarly, all pip scripts need to be adapted to run with the venv-local python by rewriting their shebang line (note the | delimiter, since the replacement itself contains slashes):
sed -i '1s|^.*$|#!/usr/bin/env python|' venv/bin/pip
BUT, the real problems start when installing new modules. I would expect most modules to behave nicely, but there will be those that hardcode expected path structures or otherwise thwart any attempt to remove path dependencies.
However: I find this trick very useful for sharing a single folder among developers when hunting elusive bugs.
By mistake, I forgot to set the WORKON_HOME variable before creating my virtual environments, and they were created in the /root/.virtualenvs directory. They worked fine, and I did some testing by activating a given environment and then running (env)$ pip freeze to see which modules were installed there.
So, when I discovered the workon home path error, I needed to change the host directory to /usr/local/pythonenv. I created it, moved all the contents of /root/.virtualenvs to /usr/local/pythonenv, and changed the value of the WORKON_HOME variable. Now, activating an environment with the workon command seems to work fine (i.e., the prompt changes to (env)$); however, if I run (env)$ pip freeze, I get a much longer list of modules than before, and it does not include the ones installed in that particular env before the move.
I guess that just moving the files and pointing WORKON_HOME at another directory was not enough. Is there some config where I should specify the new location of the host directory, or some config file for each particular environment?
Virtualenvs are not relocatable by default. You can run virtualenv --relocatable <virtualenv> to turn an existing virtualenv into a relocatable one and see if that works, but that option is experimental and not really recommended for use.
The most reliable way is to create new virtualenvs: run pip freeze -l > requirements.txt in the old ones to get a list of installed packages, create the new virtualenv, and run pip install -r requirements.txt to install the packages into it.
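A sketch of that workflow, using the directories from the question ("myenv" is a placeholder name, and this uses the stdlib venv module rather than virtualenvwrapper's mkvirtualenv):

```shell
# In the old environment, record what was installed
source /root/.virtualenvs/myenv/bin/activate
pip freeze -l > /tmp/myenv-requirements.txt
deactivate

# Recreate the environment under the new WORKON_HOME and reinstall
python3 -m venv /usr/local/pythonenv/myenv
source /usr/local/pythonenv/myenv/bin/activate
pip install -r /tmp/myenv-requirements.txt
```

Repeat per environment; the requirements file is also worth keeping under version control so the environment can be rebuilt anywhere.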
I used the virtualenv --relocatable feature. It seemed to work, but then it failed because the destination host had a different Python version installed:
$ . VirtualEnvs/moslog/bin/activate
(moslog)$ ~/VirtualEnvs/moslog/bin/mosloganalisys.py
python: error while loading shared libraries: libpython2.7.so.1.0: cannot open shared object file: No such file or directory
Remember to recreate the same virtualenv tree on the destination host.
I'm trying to install python package in editable mode with:
pip3 install -e ./
setup.py file contains:
data_files=[
    (os.path.expanduser("~") + "/.xxx", ["xxx/yyy.data"])
],
After installation the yyy.data file is not copied to .xxx folder.
Is there an option to create data files outside of the package folder when working in editable mode?
The truth is that data_files has caveats. See the "No single, complete solution for packaging data" issue on the list of Problems in Python Packaging, the note in the data_files section of the Packaging and Distributing Projects tutorial from the Python Packaging User Guide, pip's bug "All packages that contain non-package data are now likely installed in a broken way since 7.0.0", and wheel's bug "bdist_wheel makes absolute data_files relative to site-packages".
According to the information gathered from the above sources, your data was installed into the site-packages directory instead of your home directory as you were expecting.
Once a program is installed on Linux, I sometimes find out that it would be easier to have it in a different location. In general, what is the significance of the location of an installed program's files on Linux?
Often the advice on the internet is to add the (wrong or inconvenient) paths to environment variables. I'd much rather move the files to locations where commands and programs find them automatically.
One recent example is Python's site-packages: my Python did not appear to check the PYTHONPATH variable, but moving the libraries into the Python2.7/ directory worked well.
Now I am facing the same issue with OpenCV.
I also wonder why Linux installations do not prompt (like Windows) for the desired installation directory, and why, so often, things wind up in places where they don't work.
In general, programs are installed in /usr/bin (for binaries) and /usr/lib, or in a path specific to that Linux distro, so that any program that depends on a specific library/program will look for it there. If you install a program in a different path, say /home/user/program, it is installed locally, and other programs won't be able to find it by default.
You can install any program wherever you want. However, it is good practice to use the repositories and install into the standard paths.
I don't know how you install programs, but I use apt-get and dpkg on Ubuntu. You can also install some Python modules this way.
Generally you are supposed to use the package system provided by your distro (IMHO).
If you do not use packages then you are on your own.
About PYTHONPATH: did you add it to your .bashrc and make sure that it was set in the terminal you are using?
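For example, a sketch of that setup (~/mylibs is a placeholder for wherever your modules live):

```shell
# Add this line to ~/.bashrc so every new terminal picks it up;
# it prepends $HOME/mylibs while keeping any existing PYTHONPATH
export PYTHONPATH="$HOME/mylibs${PYTHONPATH:+:$PYTHONPATH}"

# Verify that Python actually sees it
python3 -c 'import sys; print(sys.path)'
```

If the directory does not appear in sys.path, the variable is not set in that shell (remember that .bashrc is only read by new interactive shells, or after running source ~/.bashrc).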
Also please see:
http://en.wikipedia.org/wiki/Filesystem_Hierarchy_Standard
I am trying to create a Python module, but a .pyc file is not created. I am working on Windows. I execute two commands in a command prompt (which I run as administrator), as follows:
c:\Python33\python.exe setup.py sdist
After this statement is executed, in the same window I execute:
c:\Python33\python.exe setup.py install
I don't know what I am missing.
Python 3.x doesn't create .pyc files next to the source in the same directory. As part of Python 3's __pycache__ scheme, the .pyc files are stored in a __pycache__ subdirectory, tagged with the interpreter version they were built for.
This was mentioned in Brett Cannon's talk at PyCon 2013.
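A quick way to see this behaviour, sketched with the stdlib py_compile module (the module name mymod is made up):

```python
import os
import py_compile
import tempfile

# Create a throwaway module in a temporary directory
src_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "mymod.py")
with open(src, "w") as f:
    f.write("x = 1\n")

# py_compile.compile returns the path of the generated .pyc, which
# lands in a version-tagged __pycache__ directory next to the source
pyc = py_compile.compile(src)
print(pyc)
```

The printed path ends in something like mymod.cpython-<version>.pyc inside a __pycache__ folder, rather than mymod.pyc next to mymod.py.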