How can I install Image class in Python 3 Ubuntu 14.04? - python-3.x

I am trying to use PIL in Python 3, but I cannot make it work. I tried several links, but none of them really helped.
I am using Ubuntu 14.04 and have installed IDLE.
Can anyone here please help me?
Thanks in advance,

sudo apt-get install python3-pil python3-pil.imagetk
You may need to manually remove any leftovers of unsuccessful installs attempted with pip or by compiling with setup.py - these show up as conflicting files listed in the error message.
The rule of thumb is: if you want a general Python tool for interactive use, one that provides scripts you will run directly, or libraries for small personal scripts, install the package built for your system (the .deb) with apt-get, aptitude or whatever you prefer - unless that Python module is not packaged at all.
For all other uses, you should create a Python virtualenv and install the desired modules inside that virtualenv with pip install <name>, including the cases where Ubuntu does not package the desired Python module.
Regarding PIL there is still another interesting bit: the original PIL became unmaintained and bit-rotted over the years. One of the most prominent bugs arising from that is exactly its inability to be cleanly installed using pip or easy_install. Another is that it was never (at least properly) ported to Python 3. Because of that, a fork named "Pillow" was created. It is a drop-in replacement for the original PIL - pip and easy_install should install "Pillow" and not "PIL".
(NB: I do not know which - PIL or Pillow - is packaged by Ubuntu under the "python-pil" names. I suppose and hope it is the actively maintained Pillow project.)
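To check what actually got installed (and that the ImageTk bindings are present), something along these lines should work; the version attribute name differs between Pillow releases, so treat this as a rough probe rather than a definitive test:
$ python3 -c "from PIL import Image, ImageTk; print('PIL/Pillow imports OK')"
$ # Pillow exposes __version__ (newer releases) or PILLOW_VERSION (older ones); the original PIL has neither
$ python3 -c "import PIL; print(getattr(PIL, '__version__', getattr(PIL, 'PILLOW_VERSION', 'no version attribute - possibly the original PIL')))"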

Related

Ubuntu 21.04, Virtualenv and its configuration of Python

EDIT:
In addition to the behaviour outlined below, the Python 3.10-based environment seems to ignore packages installed with the pip -e option (development mode).
Specifically, a package installed in development mode is not listed in pip freeze (whereas it is in the Python 3.9-based virtual environment) and a simple import <package_name> fails. If the package is installed normally (i.e. not in development mode) everything works as expected.
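For completeness, the editable-install check that fails in the 3.10 environment boils down to something like this (the package path and name are hypothetical):
$ pip install -e /path/to/mypackage     # editable / development-mode install
$ pip freeze | grep -i mypackage        # listed in the 3.9 environment, missing in the 3.10 one
$ python -c "import mypackage"          # fails in the 3.10 environment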
I am using virtualenv to create a virtual environment based on Python 3.10. While virtualenv finishes without errors and the environment does seem to activate, it still fails to pick up its own site-packages unless the PYTHONPATH environment variable has been set manually.
I am not sure if the situation I am faced with is due to Ubuntu's way of incorporating Python or the way virtualenv is setting up the environment so that it picks up a local interpreter. Here is what I have gathered this far:
My base system is an Ubuntu 21.04. It has its own Python 3 (Python3.9.5) installation which I have not touched at all except installing the python3-virtualenv package using apt.
With Python 3.10, I installed the python3.10-dev package and proceeded to create a virtual environment in the usual way:
> virtualenv -p python3.10 the_env/
> source the_env/bin/activate
Although this looks OK so far, the environment does not have any information about its own site-packages directory - not even the one that virtualenv is supposed to create, which includes pip. In this installation, if you try > pip --version you simply get an error that
the pip package does not exist (the pip "executable" location is picked up correctly, but because the interpreter does not know anything about its site-packages it fails to start pip properly).
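In other words, inside the activated 3.10 environment the behaviour is roughly this (the exact error wording may differ):
$ which pip                 # resolves into the_env/bin, as expected
$ pip --version             # fails with something like "No module named pip"
$ python -c "import pip"    # fails the same way, because site-packages is not on sys.path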
Long story short, I created two environments, one based on Python3.9 (which works perfectly) and one based on Python3.10 (which does not work) and did a very simple test in each environment:
> python -m site
On the Python 3.9 environment, sys.path includes a path that leads all the way to this particular environment's site-packages.
On the Python 3.10 environment, sys.path does not include that path. It still includes the typical paths you would expect (e.g. those pointing to the interpreter itself and the top environment directory), but not the specific path that points to the site-packages location.
Following this, I defined PYTHONPATH manually before activating the environment, pointing it exactly at that environment's site-packages, and everything worked as expected.
I suspect this might have something to do with the fact that my system's Python is 3.9, which means the USER_SITE variable is valid for it, while for Python 3.10 it is not (I have no use for it; this is just a virtual environment I am creating). So, I suspect this might be throwing off the way the site module determines where things are.
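As a sketch, the manual workaround looks roughly like this (the paths are placeholders for wherever the environment was created):
$ export PYTHONPATH=/path/to/the_env/lib/python3.10/site-packages
$ source the_env/bin/activate
$ python -m site    # the environment's site-packages now appears in sys.path
$ pip --version     # and pip starts correctly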
As I am not sure, I would like to ask the following:
Could this be something to do with the way Ubuntu handles the Python installation that might just be creating this small problem with virtual environments?
Could the problem be with virtualenv that does not explicitly specify a PYTHONPATH?
Could this behaviour be something of a corner case of the site module?
What worked for me was an installation from source. After unpacking the source code, in summary:
$ ./configure --enable-optimizations --with-ensurepip=install --prefix=/path/to/install/to/
$ make -j
$ make test
$ make install
$ /path/to/install/to/bin/python3.10 -m venv /path/to/test
$ source /path/to/test/bin/activate
$ pip list
Package Version
---------- -------
pip 21.2.4
setuptools 58.1.0
WARNING: You are using pip version 21.2.4; however, version 21.3.1 is available.
You should consider upgrading via the '/path/to/test/bin/python3.10 -m pip install --upgrade pip' command.
$ python -m pip install --upgrade pip
[...]
$ pip list
Package Version
---------- -------
pip 21.3.1
setuptools 58.1.0
Edit
As mentioned below, I believe the reason the procedure above just works is simply that building and installing Python from source produces a correct installation, which I suspect the 3.10 installation in Ubuntu 21.04 is not.
I made my Python installation from source somewhere under my home directory, in order to not risk messing things up, and I also did not permanently modify the PATH.
Setting PATH to $PATH:/path/to/install/to/bin should be fine, however, either permanently or just when running mkvirtualenv.
Doing that, my new installation even seems to integrate seamlessly with my system's virtualenvwrapper.
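As a sketch of that workflow (the paths are placeholders, and mkvirtualenv/workon come from virtualenvwrapper):
$ export PATH="$PATH:/path/to/install/to/bin"   # or add this to ~/.bashrc to make it permanent
$ mkvirtualenv -p python3.10 my_new_env         # tell virtualenvwrapper which interpreter to use
$ workon my_new_env
$ python --version                              # should report the interpreter built from source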
I don't really need to be on the Python leading edge myself, but if I did, I would definitely make myself independent of Ubuntu's latest development by using the procedure described above.
By the way, if I update pip directly in the Python installation built from source, I no longer get the message about the older pip (see above) when I create new virtual environments.
Edit 2
Bug reported to Ubuntu:
https://bugs.launchpad.net/ubuntu/+source/python3.10/+bug/1955742
Edit 3
Also, since Ubuntu 21.04 reaches end of life in a month or so, upgrading to 21.10 could really make sense. I have just tried that, and it seems that Python 3.10 virtual environments work just fine in that release.
With the advent of Ubuntu 22.04 (which caused a few minor issues with some specific Python virtualenv setups I had), and having already spent some time figuring this out, I ended up switching to pyenv.
Pyenv made it very easy to install any Python version, with any number of virtual environments within it, each of which can be further customised in complete isolation.
The only thing to be careful of is installing all the necessary build prerequisites before installing a specific Python version, to avoid missing functionality in some packages (e.g. building without the lzma library produces an ominous warning from pandas: "your python installation is incomplete..."). This can be ignored if you are not using that functionality, or otherwise easily fixed; see the sketch below.
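Roughly, the pyenv workflow looks like this (the version number and environment name are just examples, pyenv virtualenv requires the pyenv-virtualenv plugin, and the full list of build dependencies is longer and documented in the pyenv wiki):
$ sudo apt-get install build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev libffi-dev liblzma-dev tk-dev
$ pyenv install 3.10.4                     # build and install a specific interpreter
$ pyenv virtualenv 3.10.4 my_project_env   # create an isolated environment on top of it
$ pyenv local my_project_env               # auto-activate it inside the current project directory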
I think that this is a better option overall if you have to manage different versions and specific configurations for python, even across distros.
This solution is very close to the one suggested before ("install from source") so I will be accepting that one and leave my contribution as additional information about this problem.

Packages in python with pip-installed but new projects don't detect package

Just as a background, I am an R user but learning python:
I have installed the Python libraries I want to get familiar with (pandas, numpy, PyTorch, tensorflow...), but when I start a new project it does not detect the packages when I try to import them, and asks me to install them, even though I definitely did at one point. I am obviously making mistakes when installing them, but I don't know what.
I use pip as a package manager, if that is relevant.
Have you attempted installing pip3 and using that in place of pip? I know I had some issues a while back where pip would install my packages for python 2.X and not 3.X.
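One quick way to see what is going on is to check which interpreter each pip front-end belongs to, and to install through the interpreter you actually run (the version strings below are only illustrative):
$ pip --version                     # e.g. "pip 20.0.2 from ... (python 2.7)"
$ pip3 --version                    # e.g. "pip 20.0.2 from ... (python 3.8)"
$ python3 -m pip install pandas     # installs into the interpreter you will actually import from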

Rationale for having to install the python3-venv package

Some things you should know before I ask my question:
I am utterly new to both Linux & Python, and have a hard time understanding official documentation and technical answers (but have a burning desire to deeply understand both)
I am running elementary OS 0.4.1 Loki
My Python 3 version is 3.5.2. When I search the online documentation on the venv module for python 3.5.2, I get the documentation for the 3.5.6 version. I don't understand why there is no documentation for the .2 version.
So, here's my problem. I was trying to create a virtual environment using venv and proceeded thusly:
According to Python's 3.5.6 venv module documentation, a virtual environment is created using the command pyvenv /path/to/new/virtual/environment. I tried that command and got:
The program 'pyvenv' is currently not installed. You can install it by typing: sudo apt install python3-venv
I then searched documentation for newer Python versions and tried the new venv command python3 -m venv /path/to/new/virtual/environment and got the following result:
The virtual environment was not created successfully because ensurepip is not available. On Debian/Ubuntu systems, you need to install the python3-venv package using the following command. apt-get install python3-venv
In both cases, the solution seems to be to install python3-venv. My question is: what exactly am I installing by installing python3-venv? Isn't venv already part of the standard library? Furthermore, why do I have to install it via apt-get if it is a Python module? It is my understanding that standard library modules are imported, not installed, and that modules external to the standard library are installed via pip. Related to this, why is ensurepip not available?
Second part of my question: if installing python3-venv is the way to go, what is the proper way to create a virtual environment using venv in Python 3.5.2: pyvenv my_virtual_environment or python3 -m venv my_virtual_environment?
Don't worry about the documentation not matching the micro version number – increments in that place are only for bugfixes, so the documentation stays the same.
Your question is interesting since venv is indeed not an optional module. My guess is that the Python version shipped with your OS (or that you installed yourself) comes with a stripped-down standard library, or none at all. For instance, the python3.5-minimal package doesn't appear to include it. Does your Python have the other modules in the standard library?
Edit: See also this question.
Installation can be described as "putting files onto your computer, in the right place". Importing a module, however, means that you tell Python to make available some functionality. To import a module, it must be installed (e.g. in /usr/lib/python3.5 for Python 3 on my computer), and one method for installing additional modules is via apt.
The python3 -m venv my_virtual_environment method should work in 3.5 as well and is the future-proof version, so you should probably go with that.
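In practice, on a Debian/Ubuntu-style system the whole sequence looks something like this (the environment name and the package installed afterwards are just examples):
$ sudo apt-get install python3-venv          # provides the ensurepip support the error complains about
$ python3 -m venv my_virtual_environment     # the version-agnostic, future-proof invocation
$ source my_virtual_environment/bin/activate
$ pip install requests                       # packages now land inside the environment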

iPython on Win8 fails to install modules

Please bear with me; I am a total novice with Python 3 and not a native English speaker.
I'm using Windows 8 in the IPython notebook environment, and I have problems installing modules such as Jieba. If you go to its homepage, the English version is at the bottom (however, it is not as up to date as the Chinese version). It says it supports Python 3 as well, so I tried using git as it suggested, but it gave me this (I successfully cloned it before).
Some other things I tried as well
Using cmd to run setup.py
Using pip3 (it's another module I failed to install)
Installing easy_install (same, couldn't install it)
Do I have to use other approaches to install modules because I'm using the IPython notebook? I'm so frustrated. I'm trying to do text mining on some Chinese texts, but I am struggling so much with this hurdle already.
You've already cloned Jieba into your home directory (~), which is why your second attempt at cloning failed. Enter the jieba directory, and run git pull to sync any changes from the master repo. You can then run python setup.py install in that directory to install the module.
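As a sketch, from a command prompt (assuming the existing clone lives in your home directory and python is on your PATH):
$ cd jieba
$ git pull                    # sync the clone with the upstream repository
$ python setup.py install     # install the module for the interpreter that "python" points to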
To install pip on Windows, follow the instructions in this question.

Install python package (e.g. lxml) to specific python version (e.g. 3.1) when another python version (e.g. 2.6) is default

These simple things bug me:
I notice that my installed lxml can't be found from my python3.1 shell.
There is no problem in the python2.6 shell.
So my question now is - how can I install lxml to python3.1? sudo pip install lxml just tells me that lxml is already installed.
I know I could probably use virtualenv to do this - but quite honestly I am just trying to learn Python in the simplest manner possible. I therefore don't think I need the "package lockdown" that I (mis)understand virtualenv provides. Anything that can keep the confusion at bay and let me focus on the programming :-)
I could of course just uninstall Python 2.6, but I guess all hell would break loose on my Debian box then.
Thanks in advance
Each version of Python is independent of the others. This goes for everything, including pip commands. You need to install pip once for each version, and you then get one pip command per version.
For a longer explanation, see my blog post on the topic.
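As a rough illustration of what that means in practice (assuming pip has already been installed for each interpreter; very old pip releases also shipped version-suffixed scripts such as pip-3.1):
$ python2.6 -m pip install lxml    # installs into Python 2.6's site-packages
$ python3.1 -m pip install lxml    # installs into Python 3.1's site-packages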
