setuptools, use package on local system instead of getting it from PyPI - python-3.x

There is an open-source python package that I want to work on (toga-android). To test the code I write, I have to be able to build my own project that has said open-source package as a dependency. My project has to be built with setuptools, so I need setuptools to fulfill the dependency using my version of the package, and not get the package from PyPI.
The problem is that setuptools always gets the package from PyPI.
Whenever I build with setuptools I see:
Collecting toga-android==0.3.0.dev8
Downloading https://files.pythonhosted.org/packages/92/fe/348a39e2e0bbcac2d3ed511dd2b62943b488e7dcb8097c437416caf1c179/toga_android-0.3.0.dev8-py3-none-any.whl
or
Collecting toga-android==0.3.0.dev8
Using cached https://files.pythonhosted.org/packages/92/fe/348a39e2e0bbcac2d3ed511dd2b62943b488e7dcb8097c437416caf1c179/toga_android-0.3.0.dev8-py3-none-any.whl
Clearly it is getting the package from PyPI or using a cached version from it.
I have installed my version using pip install -e ., and that has no effect. I have also tried including the package's source in my project's directory alongside setup.py. Setuptools apparently includes this code, because syntax errors there make the build fail, but it does not recognize that it can satisfy the dependency. It still gets the package from PyPI, and any modules imported from the package are the PyPI versions.
How can I use a custom version of a package that is also on PyPI as a setuptools dependency?
Steps to reproduce:
pip install briefcase (using or not using virtualenv does not matter)
git clone https://github.com/pybee/toga.git
cd ~/toga/src/core; sudo pip install -e .
cd ~/toga/src/android/; sudo pip install -e .
cd ~/toga/examples/tutorial0
python setup.py android
The output will show that an older version of toga-android is downloaded even though it was already installed with pip.
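A workaround sketch, using only standard pip options and assuming the local checkout lives in ~/toga as in the steps above: build wheels from the local source (plus their dependencies) into a local directory, then have pip install from that directory only.
pip wheel -w ~/local-wheels ~/toga/src/core ~/toga/src/android
pip install --no-index --find-links ~/local-wheels toga-android
Because pip reads most of its options from PIP_* environment variables, exporting PIP_FIND_LINKS=$HOME/local-wheels may also make that directory visible to any pip invocation the build tool performs internally; whether that is enough here depends on how briefcase calls pip.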

Related

import distutils.command.bdist_wininst as orig

Everything went well while I was trying to create a ROS 2 package on Ubuntu 22.04 by following the ROS 2 documentation. However, when I got to the colcon build step, it failed for Python packages even though it works for CMake packages.
colcon build --packages-select mypkgpython
...
import distutils.command.bdist_wininst as orig
ModuleNotFoundError: No module named 'distutils.command.bdist_wininst'
knowing that my python version is:
python3 --version
Python 3.10.6
I have tried:
sudo apt-get install python3-distutils:
python3-distutils is already the newest version (3.10.6-1~22.04).
sudo apt-get install python3-apt:
python3-apt is already the newest version (2.4.0).
sudo apt install python3-colcon-common-extensions
python3-colcon-common-extensions is already the newest version (0.3.0-1).
How can I solve this problem?
I expected that when I build my Python package using
colcon build package
it would build successfully, so that I could continue working with it.
The bdist_wininst command was deprecated in Python 3.8, and you are using Python 3.10; it is no longer shipped in the python3-distutils package.
Use bdist_wheel (wheel packages) instead, or, if you want to run your code as it is, downgrade your Python to a version earlier than 3.8.
The bdist_wininst format was deprecated in Python 3.8, and the documentation for this format has been removed in Python 3.9. The recommended way to distribute Python packages now is the Wheel format.
You can overcome this error by just updating setuptools:
pip install --upgrade setuptools
Note: setuptools 58.2.0 is the last version that works with ROS 2 Python packages without any warnings, because it is the last version that supports the old installation method, "python setup.py install". That method has been deprecated and replaced by newer, standards-based tools such as pip and ament.
pip install setuptools==58.2.0
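To confirm which setuptools version the environment actually picks up before re-running colcon, a quick sanity check (assuming the same python3 interpreter that colcon uses) is:
python3 -c "import setuptools; print(setuptools.__version__)"
colcon build --packages-select mypkgpython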

Does pip reinstall libraries every time when creating a virtual environment?

I know it might sound stupid, but I genuinely tried my best to understand: does pip install packages from the internet every single time, or does it just copy and reuse the already globally installed packages when I am creating a venv?
What exactly is the difference between pip install and pip download?
What do the following messages mean:
Collecting package <package_name>...
Using cached <package_name>...
and
Downloading <package_name>
Can someone help me out...
pip download replaces the --download option to pip install, which is now deprecated and was removed in pip 10.
pip download does the same resolution and downloading as pip install, but instead of installing the dependencies, it collects the downloaded distributions into the directory provided (defaulting to the current directory). This directory can later be passed as the value to pip install --find-links to facilitate offline or locked down package installation.
The idea behind the pip cache is simple: when you install a Python package using pip for the first time, it gets saved in the cache. If you try to download/install the same version of the package a second time, pip will just use the local cached copy instead of retrieving it from the remote registry.
If you plan to use the same version of the package in another project then using cached packages is much faster.
But if pip installs the cached version of the package and you want to upgrade to the newest version, you can simply upgrade with: pip install <package_name> --upgrade
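As a concrete sketch of the workflow described above (the directory name is arbitrary), pip download collects the distributions into a folder that pip install can later consume offline, and the pip cache subcommands show what pip has cached locally:
pip download -d ./wheelhouse requests
pip install --no-index --find-links ./wheelhouse requests
pip cache dir   # where pip keeps its cache (pip 20.1+)
pip cache list  # wheels currently in the cache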

How to fix 'Could not find a version that satisfies the requirement' for install_requires list when pip installing in custom package?

I am trying to build my own Python package (installable by pip) using the twine package. This is all going well right up until the point where I try to pip install my actual package (so after uploading to PyPI).
So I first run:
python3 setup.py sdist bdist_wheel
In which my setup.py install_requires list looks like this:
install_requires=[
'jupyter_kernel_gateway==2.4.0',
'pandas==1.0.2',
'numpy==1.18.1',
'azure-storage-blob==2.0.1',
'azure-datalake-store==0.0.48',
'psycopg2-binary==2.8.4',
'xlsxwriter==1.2.6',
'SQLAlchemy==1.3.12',
'geoalchemy2==0.6.3',
'tabulate==0.8.2',
'pyproj==1.9.6',
'geopandas==0.4.0',
'contextily==0.99.0',
'matplotlib==3.0.2',
'humanize==0.5.1',
'ujson==1.35',
'singleton-decorator==1.0.0',
'dataclasses==0.6',
'xlrd==1.2.0'],
In my understanding, these install_requires entries would be installed by pip when installing my own package.
After this I run
python3 -m twine upload --repository testpypi dist/*
To actually upload my package to PyPI. However, when pip installing my package, I get errors saying there are no versions that satisfy the requirements for a lot of the listed requirements, e.g.: ERROR: Could not find a version that satisfies the requirement psycopg2-binary==2.8.4
When I manually install these packages (e.g. pip install psycopg2-binary==2.8.4), they do get installed.
Is there any way to make the pip install of my package actually install the install_requires requirements list successfully?
You didn't show how you're pip install-ing your package, but I'm guessing you're using something like:
pip install your_project --index-url https://test.pypi.org/simple
The issue is that TestPyPI doesn't contain copies of your dependencies that exist on PyPI. For example:
Exists: https://pypi.org/project/psycopg2-binary/2.8.4/
Does not exist: https://test.pypi.org/project/psycopg2-binary/2.8.4/
You can instead add TestPyPI as an extra index, so pip resolves your package from TestPyPI while still finding the dependencies on PyPI, by specifying --extra-index-url:
pip install your_project --extra-index-url https://test.pypi.org/simple
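A commonly used variant of the same idea is to make TestPyPI the primary index and let pip pull the dependencies from PyPI as the extra index (your_project is the placeholder from above):
pip install your_project --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/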

python : request dependency idna failed "ImportError: No module named idna"

I am using the requests module in Python, which has idna as a dependency.
I am keeping the idna module inside the requests module, but it is still not able to detect the idna module.
"/mnt/yarn/usercache/root/appcache/application_1522067995292_0020/container_1522067995292_0020_01_000001/slackclient.zip/slackclient/requests/packages.py", line 7, in
ImportError: No module named idna
I had the same issue and, strangely, installing idna worked for me:
$ pip install idna
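A quick way to confirm that the dependency is importable by the same interpreter that runs your code (requests.__version__ is just used here as a visible success marker):
$ python -c "import idna, requests; print(requests.__version__)"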
Use requirements.txt to avoid these dependency mix-ups.
Firstly, when your code is working all good, do this
$ pip freeze > requirements.txt
This stores all the installed packages into the text file.
Now use the requirements.txt file to install all the modules wherever the code runs from here on:
$ pip install -r requirements.txt
If needed, you can upgrade your modules, check that everything still works, and then update requirements.txt again.
When versions are not pinned, the latest versions of the packages are usually fetched and installed, and an update to a dependency's dependency (inception) might break things.
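A pinned requirements.txt produced by pip freeze looks roughly like this (the packages and version numbers below are only illustrative):
certifi==2023.7.22
charset-normalizer==3.2.0
idna==3.4
requests==2.31.0
urllib3==2.0.4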

How to install a library

I am a little bit confused....
I installed anaconda on my computer (I have windows 10).
Normally, when I want to install a package I simply do "pip install package_name" or "conda install package_name" and it is done.
First question: what is the difference between pip and conda?
Now I tried to install xgboost, and it was really complicated: I tried a lot of things and nothing worked until I installed something called Miniconda.
There it works, but now, when I do "conda install package_name", it installs it in miniconda3/lib/site-packages and I have to copy/paste it into Anaconda3/lib/site-packages if I want it to work.
Second question: how can I tell the computer that "conda install package_name" should install it directly in anaconda3 and not miniconda3?
Finally I tried to install the package "surprise" for recommender systems. Both "pip install" and "conda install" failed.
I went to GitHub and got the "surprise" source from https://github.com/NicolasHug/Surprise
I tried to copy it into Anaconda3/lib/site-packages, but it doesn't work.
When I do from surprise import Reader I no longer get the error "no module named surprise", but I get "cannot import name 'Reader'".
Last question: how can I make it work? I think I have to build it, but I do not know how...
Thank you in advance for anyone that can explain all this for me :-)
Similarly to you, I had issues installing the surprise package.
I tried both pip install surprise and conda install surprise, unsuccessfully, before coming across these:
conda install -c conda-forge scikit-surprise
conda install -c conda-forge/label/gcc7 scikit-surprise
conda install -c conda-forge/label/cf201901 scikit-surprise
I found those on the Anaconda website, and the first one worked for me.
Hopefully this helps you as well.
pip vs conda
pip is a package manager that facilitates installation, upgrade, and uninstallation of Python packages. It also works with virtual Python environments.
conda is a package manager for any software (installation, upgrade and uninstallation). It also works with virtual system environments.
Conda is a packaging tool and installer that aims to do more than pip does: it handles library dependencies outside of the Python packages as well as the Python packages themselves. Conda also creates virtual environments, like virtualenv does.
For more see here
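A small example of how the two are typically combined in practice, using standard conda and pip commands (the environment name and the placeholder package are hypothetical):
conda create -n myenv python=3.10
conda activate myenv
conda install numpy                  # resolved from conda's channels
pip install some-pypi-only-package   # hypothetical package that only exists on PyPI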
Anaconda vs Miniconda
The open-source version of Anaconda is an easy-to-install, high-performance Python and R distribution with a package manager, an environment manager, and a collection of 720+ open-source packages. It also comes with the option to install RStudio.
Miniconda is the "lite" version of Anaconda, without the collection of 720+ packages. The downside is that you need to type command-line commands such as
"conda install PACKAGENAME"
And last, to install xgboost with conda, run:
conda install -c anaconda py-xgboost=0.60
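After the install finishes, a quick check from within the Anaconda environment shows both the version and the path it was imported from, which tells you whether it came from anaconda3 or miniconda3:
python -c "import xgboost; print(xgboost.__version__, xgboost.__file__)"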
Update for surprise
The easiest way is to use pip (you'll need numpy):
$ pip install numpy
$ pip install scikit-surprise
Or you can clone the repo and build the source (you'll need Cython and numpy):
$ git clone https://github.com/NicolasHug/surprise.git
$ python setup.py install
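Either way, the import that originally failed is an easy way to verify the install:
$ python -c "from surprise import Reader, Dataset; print('ok')"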
