how to activate linux virtualenv in windows 10 [duplicate] - python-3.x

By mistake, I forgot to specify the WORKON_HOME variable before creating my virtual environments, and they were created in the /root/.virtualenvs directory. They worked fine, and I did some testing by activating a certain environment and then doing (env)$ pip freeze to see which specific modules are installed there.
So, when I discovered the workon home path error, I needed to change the host directory to /usr/local/pythonenv. I created it, moved all the contents of the /root/.virtualenvs directory to /usr/local/pythonenv, and changed the value of the WORKON_HOME variable. Now, activating an environment using the workon command seems to work fine (i.e., the prompt changes to (env)$); however, if I do (env)$ pip freeze, I get a much longer list of modules than before, and it does not include the ones installed in that particular env before the move.
I guess that just moving the files and specifying another directory for the WORKON_HOME variable was not enough. Is there some config where I should specify the new location of the host directory, or some config files for the particular environment?

Virtualenvs are not by default relocatable. You can use virtualenv --relocatable <virtualenv> to turn an existing virtualenv into a relocatable one, and see if that works. But that option is experimental and not really recommended for use.
The most reliable way is to create new virtualenvs. Use pip freeze -l > requirements.txt in the old ones to get a list of installed packages, create the new virtualenv, and use pip install -r requirements.txt to install the packages in the new one.
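For example, a minimal version of that migration, using the paths from the question (the environment name env is illustrative):
$ . /root/.virtualenvs/env/bin/activate
(env)$ pip freeze -l > requirements.txt
(env)$ deactivate
$ virtualenv /usr/local/pythonenv/env
$ . /usr/local/pythonenv/env/bin/activate
(env)$ pip install -r requirements.txt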

I used the virtualenv --relocatable feature. It seemed to work, but then I found a different python version installed:
$ . VirtualEnvs/moslog/bin/activate
(moslog)$ ~/VirtualEnvs/moslog/bin/mosloganalisys.py
python: error while loading shared libraries: libpython2.7.so.1.0: cannot open shared object file: No such file or directory
Remember to recreate the same virtualenv tree on the destination host.

Related

Python pip packages on local files, not in venv

I have a device with Python 3.7 pre-installed, without any pip package.
I created the program on my local machine with some packages in my venv (I have a requirements.txt file) and it works perfectly.
My problem is that now I want to create a directory with my programs and upload it to my device. This doesn't work because the additional packages aren't installed on the device.
My question: is there a way to export the installed packages to a directory in my program files and import them locally, not from the venv?
Copy all the venv modules to some directory and modify the PYTHONPATH variable when running your program, appending your modules' directory path to it (a sketch follows the man excerpt below).
man python3
PYTHONPATH
Augments the default search path for module files. The format is the same as the shell's $PATH: one or more directory pathnames separated by colons. Non-existent directories are silently ignored. The default search path is installation dependent, but generally begins with ${prefix}/lib/python<version> (see PYTHONHOME above). The default search path is always appended to $PYTHONPATH. If a script argument is given, the directory containing the script is inserted in the path in front of $PYTHONPATH. The search path can be manipulated from within a Python program as the variable sys.path.
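A minimal sketch of that approach, assuming the copied modules go to /opt/myapp/deps and the program is my_program.py (both names are mine, not from the question):
$ cp -r venv/lib/python3.7/site-packages /opt/myapp/deps
$ PYTHONPATH=/opt/myapp/deps:$PYTHONPATH python3 my_program.py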
In general, you have the following options to run a python script on a device other than the one you developed it on:
Generate an executable (for example with the package pyinstaller). With that solution, it is not required to have python installed on your device, as everything is embedded in the executable
If you have python installed on the device (as in your case), you can just run it there. However, if you have dependencies (from PyPI or Conda), you must also install them on your device:
If you have access to the internet and have your requirements.txt file, you can just run pip install -r requirements.txt
If you don't have access to the internet, you can either:
download the wheel for each package and then ship it to your device (a sketch follows this list)
just ship to your device the contents of the lib and lib64 folders of the virtual environment folder .venv on your local machine (I hope you are using one: python -m venv .venv) into the virtual environment of your device
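For the wheel route above, a rough sketch (the ./wheels directory name is illustrative; run the download step on a machine with internet access, then ship the directory to the device):
$ pip download -r requirements.txt -d ./wheels
$ pip install --no-index --find-links ./wheels -r requirements.txt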

How to make venv completely portable?

I want to create a venv environment (not virtualenv) using the following commands:
sudo apt-get install python3.8-venv
python3.8 -m venv venv_name
source venv_name/bin/activate
But it seems that it contains dependencies on the system where it is created, and that creates problems whenever I want to make it portable. That is, when I copy this folder along with my project and run it on another machine, it should work without making any changes.
But I am unable to activate the environment (it gets activated, but the interpreter still uses the system's python and pip).
Therefore, I tried making another venv on the second computer and copied the lib and lib64 folders from the older venv into this newer venv (without replacing existing files), but got the following error this time:
File "/usr/local/lib/python3.8/ctypes/__init__.py" line 7, in <module>
from _ctypes import Union, Structure, Array
ModuleNotFoundError: No module named '_ctypes'
But the interesting thing is, if you notice, the newly created venv on the new machine is also searching for the missing package in its local directory and not in the venv.
How do I make the venv portable along with all its dependencies, so it can be reliably deployed on another device just by activating it?
Disclaimer: none of this is my work; I just found this blog post and will briefly summarize it: https://aarongorka.com/blog/portable-virtualenv/ (archived)
Caveat: This only works (semi-reliably) among Linux machines. Don't use in production!
The first step is to get copies of your python-executables in the venv/bin folder, so be sure to specify --copies when creating the virtual environment:
python3 -m venv --copies venv
All that's left seems to be changing the hardcoded absolute paths into relative paths, using your tool of choice. In the blog post, they derive VIRTUAL_ENV by cd-ing to the venv's parent directory and running pwd whenever venv/bin/activate is run:
sed -i '43s/.*/VIRTUAL_ENV="$(cd "$(dirname "$(dirname "${BASH_SOURCE[0]}" )")" && pwd)"/' venv/bin/activate
Then, similarly, all pip scripts need to be adapted to execute with the local python:
sed -i '1s|.*|#!/usr/bin/env python|' venv/bin/pip
BUT, the real problem starts when installing new modules. I would expect most modules to behave nicely, but there will be those that hardcode expected path-structures or similarly thwart any work towards replacing path dependencies.
However, I find this trick very useful for sharing a single folder among developers to track down elusive bugs.
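As a sketch, the same shebang rewrite can be extended to every script in venv/bin, not just pip (this loop is my own addition, not from the blog post; the guard skips binaries that don't start with a shebang):
for f in venv/bin/*; do
  # only rewrite text scripts whose first line is a shebang
  head -c2 "$f" | grep -q '#!' && sed -i '1s|.*|#!/usr/bin/env python|' "$f"
done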

Project directory accidentally in sys.path - how to remove it?

I don't know how it happened, but my sys.path now apparently contains the path to my local Python project directory; let's call that /home/me/my_project (Ubuntu).
echo $PATH does not contain that path and echo $PYTHONPATH is empty.
I am currently preparing distribution of the package and playing with setup.py, trying to always work in a virtualenv. Perhaps I messed something up while not having a virtualenv active, though trying to re-install using python3 setup.py --record (in case I did an accidental install) fails with insufficient privileges, so I probably didn't accidentally install it into the system python.
Does anyone have an idea how to track down how my module path got into sys.path, and how to remove it?
I had the same problem. I don't fully understand my solution, but here it is nonetheless.
My solution
Remove my package from site-packages/easy-install.pth
(An attempt at) explanation
The first hurdle is to understand that PYTHONPATH only gets added to sys.path, but is not necessarily equal to it. We are thus after whatever adds the package into sys.path.
The variable sys.path is defined by site.py.
One of the things site.py does is automatically add packages from site-packages into sys.path.
In my case, I incorrectly installed my package as a site-package, causing it to get added to easy-install.pth in site-packages and thus its path into sys.path.
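To track down which .pth file is responsible, a quick check of the site directories can help (a sketch; the exact site-packages paths vary by installation):
$ python3 -c 'import site; print(site.getsitepackages()); print(site.getusersitepackages())'
$ grep -l my_project /usr/lib/python3/dist-packages/*.pth ~/.local/lib/python3.*/site-packages/*.pth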

Learn python 3 the hard way Ex46 instructions not clear

In exercise 46, it says:
When you are done setting all of this up, your directory should look like mine here:
skeleton/
    NAME/
        __init__.py
    bin/
    docs/
    setup.py
    tests/
        NAME_tests.py
        __init__.py
...but he never said how to save these files inside the virtual directory.
How do I make a .py file and save it inside this venv directory?
I can't find the files and folder structures in windows explorer so I have no clue as to where to save.
I am stuck.
Thanks a lot for your help.
It sounds like you're confusing the use of venv with the layout of your code in a project structure. You shouldn't be putting your Python code and modules in a venv-generated directory. You didn't mention what OS you're using, but here is the general workflow I use on OSX.
I put all of my venv environments in $HOME/.venv. So I'd generate a venv environment like python -m venv ~/.venv/skeleton, or you might have to use python3 -m venv ~/.venv/skeleton depending on your OS.
You would then activate the venv environment with source ~/.venv/skeleton/bin/activate
You'd then create your project, as LPTHW says, in a directory like $HOME/projects/skeleton
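Putting it together, a sketch of both locations (names follow LPTHW's skeleton; adjust to taste):
$ python3 -m venv ~/.venv/skeleton
$ source ~/.venv/skeleton/bin/activate
$ mkdir -p ~/projects/skeleton/NAME ~/projects/skeleton/bin ~/projects/skeleton/docs ~/projects/skeleton/tests
$ touch ~/projects/skeleton/NAME/__init__.py ~/projects/skeleton/setup.py
$ touch ~/projects/skeleton/tests/__init__.py ~/projects/skeleton/tests/NAME_tests.py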

Process for building a package to be managed by an offline conda/puppet environment

I'm trying to build a package to be managed by an offline conda environment in Linux. I'm doing a dry run with py4j.
On my online build server:
I download the py4j recipe
And download the source distribution (py4j-0.8.2.1.tar.gz)
Copy the recipe and the source distribution to the offline puppet server
On my offline puppet server:
tweak the recipe to point to my copy of the source distribution
$ conda build py4j
$ conda install --use-local py4j
$ conda index linux-64
conda index linux-64 writes the py4j configuration to repodata.json. I can see py4j is in repodata.json, and there's also a py4j-0.8.2.1-py27_0.json created under /opt/anaconda/conda-meta/
We have a custom channel mapped to /srv/www/yum/anaconda_pkgs/
$ cat .condarc
channels:
- http://10.1.20.10/yum/anaconda_pkgs/
I can see that py4j configuration is added to the following files:
./envs/_test/conda-meta/py4j-0.8.2.1-py27_0.json
./pkgs/cache/ef2e2e6cbda49e8aeeea0ae0164dfc71.json
./pkgs/py4j-0.8.2.1-py27_0/info/recipe.json
./pkgs/py4j-0.8.2.1-py27_0/info/index.json
./conda-bld/linux-64/repodata.json
./conda-bld/linux-64/.index.json
./conda-meta/py4j-0.8.2.1-py27_0.json
Can someone explain what each of these json files is supposed to do?
I can also see that there are a repodata.json and .index.json in /srv/www/yum/anaconda_pkgs/linux-64 that were updated but don't have a configuration for py4j.
I manually copied my py4j-0.8.2.1.tar.gz into my custom repo (channel) in /srv/www/yum/anaconda_pkgs/linux-64.
I still can't do conda install --use-local py4j from host machines or puppet agent -t. I get the following:
err: /Stage[main]/Anaconda::Packages/Anaconda::Install_pkg[py4j]/Package[py4j]/ensure: change from absent to present failed: Execution of '/opt/anaconda/bin/conda install --yes --quiet py4j' returned 1: Fetching package metadata: ..
Error: No packages found in current linux-64 channels matching: py4j
You can search for this package on Binstar with
binstar search -t conda py4j
--use-local only searches the conda-bld/linux-64 channel. If you move the package to another local channel, you will need to add it to your ~/.condarc channels as a file:// url.
Whenever you add a package to a local repo, you need to run conda index on that directory. This will regenerate the repodata.json file.
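As a sketch, using the channel directory from the question (the built package filename py4j-0.8.2.1-py27_0.tar.bz2 is inferred from the metadata above; check your conda-bld output for the exact name):
$ cp conda-bld/linux-64/py4j-0.8.2.1-py27_0.tar.bz2 /srv/www/yum/anaconda_pkgs/linux-64/
$ conda index /srv/www/yum/anaconda_pkgs/linux-64
$ cat ~/.condarc
channels:
  - file:///srv/www/yum/anaconda_pkgs
$ conda install py4j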
I'll answer your question about the various json files, but note that you really don't need to care about any of these.
./envs/_test/conda-meta/py4j-0.8.2.1-py27_0.json
This is a remnant from the build process. Once the package is built, it is installed into a _test environment so that the actions in the test section of your meta.yaml can be run. Each environment has a conda-meta directory that contains the metadata for each package installed in that environment.
./pkgs/cache/ef2e2e6cbda49e8aeeea0ae0164dfc71.json
Everything in the pkgs directory is a cache. This is a local cache of the channel repodata, so that conda doesn't have to redownload it when it is "fetching package metadata" if it hasn't changed.
./pkgs/py4j-0.8.2.1-py27_0/info/recipe.json
Again, this is a cache. When the py4j package is installed anywhere, it is extracted into the pkgs directory. Inside the package, in the info directory, is all the metadata for the package. This file is the metadata from the recipe that was used to create the package. Conda doesn't use this metadata anywhere; it is just included for convenience.
./pkgs/py4j-0.8.2.1-py27_0/info/index.json
This is the metadata of the package included in the package itself. It's what conda index will use to create the repodata.json.
./conda-bld/linux-64/repodata.json
This is the repo metadata for the special channel of packages you have built (the channel used with --use-local, and used by conda build automatically).
./conda-bld/linux-64/.index.json
This is a special cache file used internally by conda index.
./conda-meta/py4j-0.8.2.1-py27_0.json
This is similar to the first one. It's the environment metadata for the package that you installed into your root environment.
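All of these files are plain JSON, so if you're curious you can pretty-print any of them, e.g. (path from the question's /opt/anaconda install):
$ python -m json.tool /opt/anaconda/conda-meta/py4j-0.8.2.1-py27_0.json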
