Project directory accidentally in sys.path - how to remove it? - python-3.x

I don't know how it happened, but my sys.path now apparently contains the path to my local Python project directory, let's call that /home/me/my_project. (Ubuntu).
echo $PATH does not contain that path and echo $PYTHONPATH is empty.
I am currently preparing the package for distribution and experimenting with setup.py, trying to always work in a virtualenv. Perhaps I messed something up while no virtualenv was active. However, trying to re-install using python3 setup.py --record (in case I did an accidental install) fails with insufficient privileges, so I probably didn't accidentally install it into the system Python.
Does anyone have an idea how to track down how my module path got to the sys.path and how to remove that?

I had the same problem. I don't fully understand my solution, but here it is nonetheless.
My solution
Remove my package from site-packages/easy-install.pth
(An attempt at) explanation
The first hurdle is understanding that PYTHONPATH is only added to sys.path; the two are not equal. So we need to find out what else adds the package's path to sys.path.
The variable sys.path is initialized by the site module (site.py).
One of the things site.py does is automatically add packages from site-packages to sys.path, including any paths listed in .pth files there.
In my case, I had incorrectly installed my package as a site package, which added it to easy-install.pth in site-packages and thus put its path onto sys.path.
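To track down where a stray entry comes from, you can inspect sys.path and then search the .pth files in site-packages. A sketch (the grep target /home/me/my_project is the example path from the question; adjust to yours):

```shell
# Print every sys.path entry, one per line
python3 -c 'import sys; print("\n".join(sys.path))'

# Show where site-packages lives for this interpreter
python3 -c 'import site; print(site.getsitepackages())'

# Search the .pth files in site-packages for the stray path
grep -H "/home/me/my_project" \
  "$(python3 -c 'import site; print(site.getsitepackages()[0])')"/*.pth
```

Whichever .pth file the grep hits (typically easy-install.pth) is the one to edit.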

Related

Python pip packages on local files, not in venv

I have a device with Python 3.7 pre-installed, without any pip package.
I created the program on my local machine with some packages in my venv (I have a requirements.txt file) and it works perfectly.
My problem is that now I want to create a directory with my programs and upload it to my device. This doesn't work because the additional packages are not installed there.
My question: Is there a way to export the installed package to a directory in my program files and import it locally and not from venv?
Copy all the venv modules to some directory, and modify the PYTHONPATH variable when running your program, appending your modules directory path to it.
man python3 says about PYTHONPATH:
PYTHONPATH
    Augments the default search path for module files. The format is the same as the shell's $PATH: one or more directory pathnames separated by colons. Non-existent directories are silently ignored. The default search path is installation dependent, but generally begins with ${prefix}/lib/python<version> (see PYTHONHOME above). The default search path is always appended to $PYTHONPATH. If a script argument is given, the directory containing the script is inserted in the path in front of $PYTHONPATH. The search path can be manipulated from within a Python program as the variable sys.path.
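A minimal sketch of that approach (the vendor/ directory name and the program paths are placeholders, not from the question; python3.7 matches the device's pre-installed Python):

```shell
# Copy the venv's installed modules into a directory shipped with the program
cp -r .venv/lib/python3.7/site-packages /path/to/program/vendor

# Prepend that directory via PYTHONPATH when launching the program
PYTHONPATH="/path/to/program/vendor:$PYTHONPATH" python3 /path/to/program/main.py
```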
In general, you have the following options to run a Python script on a device other than the one you developed it on:
Generate an executable (for example with the package pyinstaller). With that solution, Python does not need to be installed on the device, as everything is embedded in the executable
If Python is installed on the device (as in your case), you can just run the script there. However, if you have dependencies (from PyPI or Conda), you must also install them on the device
If you have internet access and your requirements.txt file, you can just run pip install -r requirements.txt
If you don't have internet access, you can either:
download the wheel for each package and then ship it to your device, or
ship the contents of the lib and lib64 folders of the virtual environment folder .venv on your local machine (I hope you are using one: python -m venv .venv) into the virtual environment on your device
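The wheel route can be sketched like this (requirements.txt is from the question; the wheels/ directory name is a placeholder):

```shell
# On a machine with internet access: download wheels for all dependencies
pip download -r requirements.txt -d wheels/

# Copy wheels/ to the device, then install without touching the network
pip install --no-index --find-links wheels/ -r requirements.txt
```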

How to make venv completely portable?

I want to create a venv environment (not virtualenv) using the following commands:
sudo apt-get install python3.8-venv
python3.8 -m venv venv_name
source venv_name/bin/activate
But it seems that the venv contains dependencies on the system where it was created, which causes problems whenever I want to make it portable. That is, when I copy this folder along with my project and run it on another machine, I want it to work without any changes.
But I am unable to activate the environment properly (it gets activated, but the interpreter still uses the system's python and pip).
Therefore, I tried making another venv on the second computer and copied the lib and lib64 folders from the old venv into this newer venv (without replacing existing files), but I am getting the following error this time:
File "/usr/local/lib/python3.8/ctypes/__init__.py", line 7, in <module>
from _ctypes import Union, Structure, Array
ModuleNotFoundError: No module named '_ctypes'
But the interesting thing is, if you notice, the newly created venv on the new machine is also searching for the missing package in the system's directory and not in the venv.
How do I make the venv portable along with all its dependencies and reliably deploy in another device just by activating it?
Disclaimer: None of this is my work; I just found this blog post and will briefly summarize it: https://aarongorka.com/blog/portable-virtualenv/ (archived)
Caveat: This only works (semi-reliably) among Linux machines. Don't use in production!
The first step is to get copies of your python-executables in the venv/bin folder, so be sure to specify --copies when creating the virtual environment:
python3 -m venv --copies venv
All that's left seems to be changing the hardcoded absolute paths into relative ones, using your tool of choice. In the blog post, they recompute the path with pwd after changing to the venv parent directory whenever venv/bin/activate is run:
sed -i '43s/.*/VIRTUAL_ENV="$(cd "$(dirname "$(dirname "${BASH_SOURCE[0]}" )")" && pwd)"/' venv/bin/activate
Then, similarly, all pip scripts need their shebang adapted so they run with the local python (note the | delimiter, since the replacement itself contains slashes):
sed -i '1s|.*|#!/usr/bin/env python|' venv/bin/pip
BUT, the real problem starts when installing new modules. I would expect most modules to behave nicely, but there will be some that hardcode expected path structures or otherwise thwart any effort to remove path dependencies.
However: I find this trick very useful for sharing a single folder among developers to hunt down elusive bugs.
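The shebang rewrite above can be applied to every pip script in the venv in one pass. A sketch, assuming GNU sed and the venv/ directory name used above:

```shell
# Rewrite the first line (the shebang) of every pip script in venv/bin
# so it resolves python via the environment instead of an absolute path
for f in venv/bin/pip*; do
  sed -i '1s|^#!.*$|#!/usr/bin/env python|' "$f"
done
```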

how to activate linux virtualenv in windows 10 [duplicate]

By mistake, I forgot to set the WORKON_HOME variable before creating my virtual environments, and they were created in the /root/.virtualenvs directory. They worked fine, and I did some testing by activating a certain environment and then running (env)$ pip freeze to see what specific modules are installed there.
So, when I discovered the workon home path error, I needed to change the host directory to /usr/local/pythonenv. I created it, moved all the contents of the /root/.virtualenvs directory to /usr/local/pythonenv, and changed the value of the WORKON_HOME variable. Now, activating an environment using the workon command seems to work fine (i.e., the prompt changes to (env)$); however, if I run (env)$ pip freeze, I get a much longer list of modules than before, and it does not include the ones installed in that particular env before the move.
I guess that just moving the files and pointing the WORKON_HOME variable at another directory was not enough. Is there some config where I should specify the new location of the host directory, or some config files for the particular environment?
Virtualenvs are not relocatable by default. You can use virtualenv --relocatable <virtualenv> to turn an existing virtualenv into a relocatable one and see if that works, but that option is experimental and not really recommended.
The most reliable way is to create new virtualenvs: run pip freeze -l > requirements.txt in the old ones to get a list of installed packages, create the new virtualenv, and use pip install -r requirements.txt to install the packages into it.
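Put together, the recreate-instead-of-move workflow looks roughly like this (/usr/local/pythonenv is the new WORKON_HOME from the question; the env name is a placeholder, and the stdlib venv module is used here for illustration, whereas with virtualenvwrapper you would use mkvirtualenv instead):

```shell
# Inside the old environment: record its installed packages
pip freeze -l > requirements.txt

# Create a fresh environment at the new location and reinstall into it
python3 -m venv /usr/local/pythonenv/env
. /usr/local/pythonenv/env/bin/activate
pip install -r requirements.txt
```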
I used the virtualenv --relocatable feature. It seemed to work, but then it failed looking for a different Python version's shared library:
$ . VirtualEnvs/moslog/bin/activate
(moslog)$ ~/VirtualEnvs/moslog/bin/mosloganalisys.py
python: error while loading shared libraries: libpython2.7.so.1.0: cannot open shared object file: No such file or directory
Remember to recreate the same virtualenv tree on the destination host.

My PYTHONPATH was deleted by a package. How can I recover the PYTHONPATH?

My PYTHONPATH environment variable was accidentally deleted by a package during some testing. I don't have a backup of the variable, and now I can't import any third-party packages into Python because I'm missing all the paths that were previously there.
Is there a way to recover just the standard PYTHONPATH variables?
I'm using Anaconda distribution with Python 3.6.
Windows 10.
I tried System Restore, but it failed for my only restore points; that's another issue I have to sort out.

installed ghc from PPA 'no such file or directory'

I added GHC-7.10.1 from this ppa:hvr/ghc # https://launchpad.net/~hvr/+archive/ubuntu/ghc
However, I can't seem to find ghc from the command line. Any suggestions?
me@ubuntu:~/Documents/haskell$ ghc
bash: /usr/bin/ghc: No such file or directory
As the linked page describes
The packages install into /opt/ghc/$VER/ so in order to use them, the easiest way is to bring a particular GHC version into scope by placing the respective /opt/ghc/$VER/bin folder early into the PATH environment variable.
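For the GHC 7.10.1 from the question, that means something like the following (the version directory follows the /opt/ghc/$VER/bin pattern quoted above):

```shell
# Put the PPA's GHC bin directory early on PATH for this shell session
export PATH="/opt/ghc/7.10.1/bin:$PATH"
ghc --version

# To make the change permanent, add the export line to ~/.bashrc
```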
