EDIT:
In addition to the behaviour outlined below, the Python 3.10 based environment seems to be ignoring packages installed with pip's -e option (development mode).
Specifically, a package installed in development mode is not listed by pip freeze (whereas it is in the Python 3.9 based virtual environment) and a simple import <package_name> fails. If the package is installed normally (i.e. not in development mode), everything works as expected.
I am using virtualenv to create a virtual environment based on Python 3.10. While virtualenv finishes without errors and does seem to activate, the environment still fails to pick up its own Python unless the PYTHONPATH environment variable has been set manually.
I am not sure whether the situation I am faced with is due to Ubuntu's way of packaging Python or the way virtualenv sets up the environment so that it picks up a local interpreter. Here is what I have gathered thus far:
My base system is Ubuntu 21.04. It has its own Python 3 installation (Python 3.9.5) which I have not touched at all, except for installing the python3-virtualenv package using apt.
For Python 3.10, I installed the python3.10-dev package and proceeded to create a virtual environment in the usual way:
> virtualenv -p python3.10 the_env/
> source the_env/bin/activate
Although this looks OK so far, this environment does not have any information about its own site-packages directory, not even the one that virtualenv is supposed to create, which includes pip. In this environment, if you try > pip --version you simply get an error that the pip package does not exist (the pip "executable" location is picked up correctly, but because the interpreter does not know anything about its site-packages, pip fails to start properly).
Long story short, I created two environments, one based on Python3.9 (which works perfectly) and one based on Python3.10 (which does not work) and did a very simple test in each environment:
> python -m site
In the Python 3.9 environment, sys.path includes a path that leads all the way to this particular environment's site-packages.
In the Python 3.10 environment, sys.path does not include that particular path. It still includes the typical paths you would expect to find (e.g. those pointing to the interpreter itself and the top-level environment directory), but not the specific path that points to the site-packages location.
Following this, I defined PYTHONPATH manually, before activating the environment, pointing exactly to the site-packages of that particular environment, and everything worked as expected.
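For example, the workaround looked roughly like this (the site-packages path is whatever your particular environment uses; the one shown here is a placeholder):
> export PYTHONPATH=/path/to/the_env/lib/python3.10/site-packages
> source the_env/bin/activate
> python -m site    # the site-packages path now shows up in sys.path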
I suspect that this might have something to do with the fact that my system's Python is 3.9, which means that the USER_SITE location is valid for it, while in the case of Python 3.10 it is not (I have no use for a user site anyway; this is just a virtual environment I am creating). So, I suspect that this might be throwing off the way the site module determines where things are.
As I am not sure, I would like to ask the following:
Could this have something to do with the way Ubuntu handles the Python installation, which might be creating this small problem with virtual environments?
Could the problem be with virtualenv, which does not explicitly set a PYTHONPATH?
Could this behaviour be a corner case of the site module?
What worked for me was an installation from source. After unpacking the source code, in summary:
$ ./configure --enable-optimizations --with-ensurepip=install --prefix=/path/to/install/to/
$ make -j
$ make test
$ make install
$ /path/to/install/to/bin/python3.10 -m venv /path/to/test
$ source /path/to/test/bin/activate
$ pip list
Package Version
---------- -------
pip 21.2.4
setuptools 58.1.0
WARNING: You are using pip version 21.2.4; however, version 21.3.1 is available.
You should consider upgrading via the '/path/to/test/bin/python3.10 -m pip install --upgrade pip' command.
$ python -m pip install --upgrade pip
[...]
$ pip list
Package Version
---------- -------
pip 21.3.1
setuptools 58.1.0
Edit
As mentioned below, I believe the reason the procedure above just works is that when Python is built and installed from source, the installation ends up correct, which I suspect the 3.10 installation in Ubuntu 21.04 is not.
I made my Python installation from source somewhere under my home directory, in order not to risk messing things up, and I also did not permanently modify the PATH.
Setting PATH to $PATH:/path/to/install/to/bin should be fine, however, either permanently or just when running mkvirtualenv.
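For example (the install path is the same placeholder as above, and mkvirtualenv comes from virtualenvwrapper, which I happen to have installed):
$ export PATH="$PATH:/path/to/install/to/bin"
$ mkvirtualenv -p python3.10 my310env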
Doing that, my new installation even seems to integrate seamlessly with my system's virtualenvwrapper.
I don't really need to be on the Python leading edge myself, but if I did, I would definitely make myself independent of Ubuntu's latest development by using the procedure described above.
By the way, if I update pip directly in the Python installation built from source, I will no longer get the message about the older pip (see above) when I install new virtual environments.
Edit 2
Bug reported to Ubuntu:
https://bugs.launchpad.net/ubuntu/+source/python3.10/+bug/1955742
Edit 3
Also, since Ubuntu 21.04 reaches end of life in a month or so, upgrading to 21.10 could really make sense. I have just tried that, and Python 3.10 virtual environments seem to work just fine in that release.
With the advent of Ubuntu 22.04 (which caused a few minor issues with some specific Python virtualenv setups I had), and having already spent some time figuring this out, I ended up switching to pyenv.
Pyenv made it very easy to install any Python version and any number of virtual environments within it, each of which can be further customised in complete isolation.
The only thing to be careful of is installing all the necessary Python build prerequisites before installing a specific version, to avoid missing functionality in some packages (e.g. not including the lzma library will generate an ominous warning from pandas: "your python installation is incomplete..."). This can be ignored if you are not using that functionality, or otherwise easily fixed; a rough sketch of the workflow follows.
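As a sketch, on an Ubuntu-like system it would look something like this (the package list covers the usual pyenv build dependencies, the Python version and environment name are just examples, and pyenv virtualenv assumes the pyenv-virtualenv plugin is installed):
$ sudo apt install build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev libffi-dev liblzma-dev tk-dev
$ pyenv install 3.10.12
$ pyenv virtualenv 3.10.12 my-project
$ pyenv local my-project    # auto-activates inside the project directory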
I think that this is a better option overall if you have to manage different versions and specific configurations for python, even across distros.
This solution is very close to the one suggested before ("install from source"), so I will accept that one and leave my contribution as additional information about this problem.
Related
For a project I made a virtual environment (venv) using Python 3. I installed all the necessary dependencies using a simple bash script (see picture below) after I activated my venv. (I verified the installed packages using pip3 list and concluded that every dependency was installed successfully.)
My project uses snakemake, so I ran this snakemake command:
snakemake --snakefile Snakefile.py all
I get this error:
I know it has something to do with the venv, because without the venv snakemake runs perfectly. I have read the Snakemake installation documents, and they say I have to install conda and create & activate a conda environment. But I do not have the sudo privileges to download and install conda (I work on a protected server).
What is happening and does someone know a fix?
One possible reason could be a difference in Python versions. Which Python version does your pip3 prepare the environment for?
As far as I can see from the picture provided, the invalid syntax may be because that version of Python doesn't support f-strings.
Imagine the following scenario: when you run Snakemake manually, you use a recent Python 3 (e.g. 3.9), but pip3 is configured for an older version (e.g. 3.5), so it ends up preparing an environment for Python 3.5, which doesn't support f-strings.
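A quick way to check this, assuming snakemake is an ordinary script on your PATH (adjust to your setup):
python3 --version
pip3 --version                       # shows which interpreter pip3 installs packages for
head -1 "$(command -v snakemake)"    # the shebang shows which interpreter snakemake actually runs under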
I tried to install a certain Python module that requires Python 3.6 minimum to work properly, so I checked my version using python --version, which gave me Python 2.7.17, and then python3 --version, which gave me Python 3.6.9. Now, I know for a fact that I have Python 3.8 installed, because I ran apt install python3.8 just before checking the version.
If someone wants to know what my system is running; I am currently running Elementary OS 5.1.7 Hera.
EDIT:
(I am not sure what term to use; I just want to say that I have gone through the answers and liked none of them.)
After a while of racking my brain, I decided not to uninstall the 3.6 version, as it may have version-specific modules which, if removed, could cause other installed programs to break.
Since I just use Linux for my college work, it won't matter if more than one version is installed anyway.
Sorry for any mistakes I may have made; I was never good at this kind of thing.
Use conda to install and manage different versions of Python (or lots of other software, for that matter). Install different Python versions in separate conda environments. For example:
Find what python versions are available:
conda search python
Output (as of the time of writing; in the future, the latest available Python versions will be higher):
Loading channels: done
# Name Version Build Channel
python 1.0.1 0 conda-forge
python 1.2 0 conda-forge
...
python 3.9.0 h88f2d9e_1 pkgs/main
python 3.9.0 ha017127_4_cpython conda-forge
Install the Python versions you need in separate environments. Here, install Python 3.9.0 in environment named python39:
conda create --name python39 python=3.9.0
Switching environments is easy:
conda activate python39
# python is now 3.9.0
conda deactivate
# back to the default python
or (deprecated now):
source activate python39
source deactivate
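You can also install packages into, or run commands in, a specific environment without activating it first, for example (the environment and package names are just illustrative):
conda install -n python39 numpy
conda run -n python39 python --version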
SEE ALSO:
conda docs: Managing environments
This question is more appropriate for Unix & Linux.
Python installations (more generally, versioned installations of software) co-exist on linux using version numbers. You can likely run Python 3.8 using the python3.8 command (or else, locate where you installed it and run from there / add the location to the PATH environment variable to run directly).
Likewise, for each python version you can install its own package manager (e.g. install pip3.8 by python3.8 -m pip install pip) and use it to install packages for that python version. (As different projects require different sets of packages, the general practice is to create a "virtual environment", i.e. a separate copy of the needed version of python, pip and their package installation folders for each project, and install the required packages there - for more information see e.g. this excellent answer).
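For example, roughly (the version, path and package name are placeholders):
python3.8 -m pip install --user --upgrade pip    # pip for this specific interpreter
python3.8 -m venv ~/envs/myproject               # a per-project virtual environment (may need the python3.8-venv package on Debian/Ubuntu)
source ~/envs/myproject/bin/activate
pip install some_package                         # installs only into the environment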
Regarding the command python3 (usually /usr/bin/python3): it is just a symbolic link, so you can point it at the version you like (as long as it remains compatible with what the system expects, i.e. a python3 version no lower than your built-in python3/python3-minimal, otherwise you will probably break something). For example, assuming which python3 gives you /usr/bin/python3, you can
sudo rm /usr/bin/python3                        # remove the existing link
sudo ln -s /usr/bin/python3.8 /usr/bin/python3  # create a new symbolic link to the version of your choice
(although a better alternative could be to alias it instead, alias python3='/usr/bin/python3.8', and add this to ~/.bashrc).
Whatever you do, do not uninstall python3-minimal as it is - at least, on Ubuntu - required by the system and uninstalling or changing it will likely break the system.
I debated which Stackoverflow site this best fit but couldn't decide.
I'd like to contribute to an open-source project on Github, but I can't figure out how to prevent the stable version already installed on my machine and the development version I'd like to make a patch for from clashing on import.
The repository only suggests installing with pip's editable option (pip install -e).
What I've done so far is clone the repository locally and then try to import it in a Jupyter Notebook from the directory above. However, the Jupyter Notebook references the stable version installed earlier with pip. I tried appending the child directory holding the package to sys.path, but I still have the same issue. I can't seem to get relative imports working either. Do I need to uninstall the stable version?
Any tips are appreciated!
You'd use virtualenv for this. It will allow you to create an environment that is isolated from your system Python, and you can install the dev version of the library into it.
Basic usage (for Unix-like systems) is:
$ pip install virtualenv
$ virtualenv MY_ENV
$ cd MY_ENV
$ source bin/activate # activates the local python for this shell only
(MY_ENV)$ pip install <some-module> # installs to a local and isolated python
(MY_ENV)$ python ... # runs python in the local environment
(MY_ENV)$ deactivate # disable the isolated python
$
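Inside the activated environment you can then install your local clone in editable mode, so that imports resolve to your development copy. The path and the_package name below are placeholders for your actual clone location and the project's import name:
(MY_ENV)$ pip install -e /path/to/your/local/clone
(MY_ENV)$ python -c "import the_package; print(the_package.__file__)"  # should point into the clone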
Some things you should know before I ask my question:
I am utterly new to both Linux & Python, and have a hard time understanding official documentation and technical answers (but have a burning desire to deeply understand both)
I am running elementary OS 0.4.1 Loki
My Python 3 version is 3.5.2. When I search the online documentation for the venv module for Python 3.5.2, I get the documentation for version 3.5.6. I don't understand why there is no documentation for the .2 version.
So, here's my problem. I was trying to create a virtual environment using venv and proceeded thusly:
According to Python's 3.5.6 venv module documentation, a virtual environment is created using the command pyvenv /path/to/new/virtual/environment. I tried that command and got:
The program 'pyvenv' is currently not installed. You can install it by typing: sudo apt install python3-venv
I then searched documentation for newer Python versions and tried the new venv command python3 -m venv /path/to/new/virtual/environment and got the following result:
The virtual environment was not created successfully because ensurepip is not available. On Debian/Ubuntu systems, you need to install the python3-venv package using the following command. apt-get install python3-venv
In both cases, the solution seems to be to install python3-venv. My question is: what exactly am I installing by installing python3-venv? Isn't venv already part of the standard library? Furthermore, why do I have to install it via apt-get if it is a Python module? It is my understanding that standard library modules are imported, not installed, and that modules external to the standard library are installed via pip. Related to this, why is ensurepip not available?
Second part of my question: if installing python3-venv is the way to go, what is the proper way to create a virtual environment using venv in Python 3.5.2: pyvenv my_virtual_environment or python3 -m venv my_virtual_environment?
Don't worry about the documentation not matching the micro version number – increments in that place are only for bugfixes, so the documentation stays the same.
Your question is interesting, since venv is indeed not an optional module. My guess is that the Python version shipped with your OS (or that you installed yourself) comes with a stripped-down standard library, or none at all. For instance, the python3.5-minimal package doesn't appear to include it. Does your Python have the other modules of the standard library?
Edit: See also this question.
Installation can be described as "putting files onto your computer, in the right place". Importing a module, however, means telling Python to make some functionality available. For a module to be importable, it must be installed (e.g. in /usr/lib/python3.5 for Python 3 on my computer), and one method for installing additional modules is via apt.
The python3 -m venv my_virtual_environment method should work in 3.5 as well and is the future-proof version, so you should probably go with that.
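On Debian/Ubuntu-based systems, the whole sequence would then look roughly like this (the environment name is just an example):
$ sudo apt install python3-venv
$ python3 -m venv my_virtual_environment
$ source my_virtual_environment/bin/activate
(my_virtual_environment)$ pip install some_package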
I am having an issue when it comes to installing python3.3 modules. The current modules are installing to /usr/lib64/python3.3/site-packages but they need to be installed to /usr/lib/python3.3/site-packages. Is there a way to point the installation to put the modules in the appropriate location?
I am using CentOS 6.6
While you could tell pip to install into a specific directory using the --target option:
pip install --target=/usr/lib/python3.3/site-packages/ my_module
I would instead heavily recommend learning about and using virtual environments to give you greater control of your Python programming environment.
The creation and activation of a virtual environment looks like this:
pyvenv-3.4 my_new_environment
source my_new_environment/bin/activate
pip install my_module
which will make my_module install reliably inside of my_new_environment.
Whenever you want to "activate" this environment, you simply source the activate script to tell the Python interpreter where to find its libraries!
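If you want to confirm where packages will land inside the activated environment, a quick check looks something like this:
python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"   # prints the environment's site-packages directory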