does pip reinstall libraries every time when creating a virtual environment? - python-3.x

I know it might sound stupid, but I genuinely tried my best to understand this: when I create a venv, does pip install packages from the internet every single time, or does it just reuse the packages that are already installed globally?
What exactly is the difference between pip install and pip download?
What do the following messages mean?
Collecting package <package_name>...
Using cached <package_name>...
and
Downloading <package_name>
Can someone help me out...

pip download replaces the --download option to pip install, which is now deprecated and was removed in pip 10.
pip download does the same resolution and downloading as pip install, but instead of installing the dependencies, it collects the downloaded distributions into the directory provided (defaulting to the current directory). This directory can later be passed as the value to pip install --find-links to facilitate offline or locked down package installation.
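For example, a minimal sketch of that workflow (the package name requests and the ./wheels directory are only placeholders, not taken from the question):
# on a machine with internet access: resolve and download distributions into ./wheels
pip download --dest ./wheels requests
# later, install from that directory without contacting PyPI
pip install --no-index --find-links ./wheels requests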
The idea behind the pip cache is simple: when you install a Python package using pip for the first time, it gets saved in the cache. If you try to download or install the same version of the package a second time, pip will just use the local cached copy instead of retrieving it from the remote index.
If you plan to use the same version of the package in another project, then using the cached copy is much faster.
But if pip installs the cached version of the package and you want to upgrade to the newest version, you can simply upgrade with: pip install <package_name> --upgrade
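If you want to see what pip has actually cached, newer pip versions (20.1 and later, as far as I recall) ship a pip cache subcommand; a quick sketch (the numpy filter is just an example):
# show where pip keeps its cache
pip cache dir
# list cached wheels, optionally filtered by a name pattern
pip cache list
pip cache list numpy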

Related

Installing or updating a package using pip in Python

If I accidentally run any of the following commands to install or update a package using pip in Python 3.x twice, will it install or update that package twice on the machine?
pip install <package_name>
pip install --upgrade <package_name>
After updating a package twice, it says that:
Requirement already satisfied: appnope in ./.pyenv/versions/3.11.0/lib/python3.11/site-packages (from ipykernel) (0.1.3)
Does this mean I already updated or installed the package?
Yes, it means you have already installed or upgraded.
The first command installs the package. Because you have not specified a package version with something like pip install package-name==1.0.0, pip will install the package with the latest version.
The second command attempts to upgrade the same package. Because it is installed with the latest version, there is nothing to upgrade. pip will not reinstall packages unless you ask it to.
pip install --upgrade --force-reinstall <package-name>
pip will also attempt to install any dependencies required by the package you requested, which is where messages like the following come from:
Requirement already satisfied: appnope in ./.pyenv/versions/3.11.0/lib/python3.11/site-packages (from ipykernel) (0.1.3)
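To double-check what is actually installed before or after an upgrade, something like this works (appnope and the version are taken from the message above; the pin is only illustrative):
# show the installed version and location of the package
pip show appnope
# or pin an explicit version if you need a specific one
pip install "appnope==0.1.3"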

Pip: force installation of package version from a git repo

I am facing a problem installing a package from GitHub based on a specific commit hash.
This works great if the used venv does not already contain the installed package:
pip install --upgrade git+https://github.com/user/pyckagexyz.git@1234567890032ab36c732dc32d9c257d401e71b0
This installs pyckagexyz and its dependencies if it does not yet exist in the used venv. If it already exists, this command does nothing. I also tried the following, without success:
pip install --upgrade --no-cache-dir git+https://github.com/user/pyckagexyz.git@1234567890032ab36c732dc32d9c257d401e71b0
=> No effect
pip install --upgrade --force git+https://github.com/user/pyckagexyz.git@1234567890032ab36c732dc32d9c257d401e71b0
=> Installation fails because one of the dependencies can't be installed.
The only workaround I have found so far is to uninstall the package before re-installing it, or to first install the package with --no-deps --force (skipping dependencies) and then run the command a second time without those flags to make sure all dependencies are present.
Is there no other way to tell pip to install the selected version of a package and overwrite an already installed version?
Have you tried the --force-reinstall option with pip install?
pip install --force-reinstall git+https://github.com/user/pyckagexyz.git@1234567890032ab36c732dc32d9c257d401e71b0
From the pip docs:
--force-reinstall
Reinstall all packages even if they are already up-to-date.
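If the dependency that fails to build is the sticking point, one possible variation (just a sketch, reusing the same URL as above) is to force-reinstall only the package itself and leave its already installed dependencies untouched:
pip install --force-reinstall --no-deps git+https://github.com/user/pyckagexyz.git@1234567890032ab36c732dc32d9c257d401e71b0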

How do I force pip to install a package directly from the Internet not local cache?

I was installing the package ibm_db using the command:
pip3 install ibm_db
However, there was a power outage and the installation stopped midway.
Now, the package is available in my system:
pip3 list --local | grep ibm
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
ibm-db (3.0.1)
But when I try to import it, it doesn't work.
>>> import ibm_db
ModuleNotFoundError: No module named 'ibm_db'
I suspect that something went wrong with the package installation, but every time I try to reinstall it (uninstalling, then installing it again), pip uses the locally cached version, and the problem continues.
I would like to try reinstalling the package straight from the internet, without having to use my local cache, but I don't want to clear the entire local cache.
https://pip.readthedocs.io/en/stable/reference/pip_install/#caching
pip install --no-cache-dir …
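Applied to this question, a full reinstall that bypasses the cache might look like this (ibm_db taken from the question above):
# remove the broken installation first
pip3 uninstall ibm_db
# then reinstall, forcing pip to ignore its local cache
pip3 install --no-cache-dir ibm_db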
PS. But I doubt the problem is the cache. After uninstalling and reinstalling the package, everything should be fine. If it's not, the problem is somewhere else.

Python package installation from a local directory

There is a need to install Python packages on a machine without an internet connection.
I used pip download to download the packages and their dependencies.
I copied all the downloaded files to the offline machine.
I ran pip from the local package directory using:
pip install *
The packages try to access the internet to download their dependencies, even though those dependencies are located in the same directory.
I would like to avoid a requirements.txt file and just install all the packages from the local directory together with their dependencies.
Is there any way to do so?
It's possible to download the wheels directly for each package, and once you have them on the machine you can run pip install name-of-wheel.whl and it will install them without reaching out to PyPI.
On the online machine, you can use:
pip download -r requirements.txt
to download the packages without installing them.
Then, on the offline machine:
pip install --no-index --find-links /path/to/download/dir/ -r requirements.txt
Source: Python Packages Offline Installation
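One additional caveat, not from the linked source: if the offline machine runs a different OS or Python version than the online one, pip download may also need explicit target options, roughly like this (the platform tag, Python version, and ./packages directory are only examples; pip requires --only-binary=:all: when these options are used):
pip download -r requirements.txt -d ./packages --platform manylinux2014_x86_64 --python-version 3.11 --only-binary=:all: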

Why isn't pip v7.1.0 caching wheels?

I'm running pip v7.1.0 (latest as of this writing) and running into an issue where it's not caching at all.
Here is how I'm installing Django -
pip install --cache-dir=d:\pipcache django
The package installs successfully, but there is nothing cached. I've read the latest documentation and checked my AppData/Local path and it's empty. What I'm looking to do is have everything I install through pip cached, so all subsequent virtual environment creations are quick.
EDIT
Turns out that pip won't cache packages that are already distributed as wheel files. I tried forcing pip to build from source with --no-binary=django, to no avail.
Having said that, how can I force pip to cache my requirements whether the maintainers have provided wheels or not?
Based on my understanding of pip, this should be the new default. Not sure why it isn't working, though.
Alternatively, I have to do this -
pip wheel --wheel-dir="D:/" -r reqs
pip install --no-index --find-links="D:/" -r reqs
Is this within a venv? If so, you may have to explicitly install wheel into the venv using pip install wheel. After that, pip should start automatically building/caching your wheels.
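To test that suggestion in practice, a rough sequence inside the venv (reusing the cache directory and package from the question; --force-reinstall is only there so pip fetches and builds Django again instead of reporting it as already satisfied):
# make the wheel package available so pip can build and cache wheels
pip install wheel
# reinstall django; a built wheel should now appear under the cache directory
pip install --cache-dir=d:\pipcache --force-reinstall django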

Resources