Docker installed the wrong version of Python despite specifying the version - python-3.x

This is the part of my Dockerfile that installs Python and my code's dependencies.
FROM ubuntu:18.04
RUN apt-get update && \
apt-get install -y software-properties-common && \
add-apt-repository ppa:deadsnakes/ppa && apt-get update && apt-get install -y \
python3.8 \
python3-pip \
&& rm -rf /var/lib/apt/lists/*
RUN ln -s /usr/bin/python3 /usr/bin/python
RUN ln -s /usr/bin/pip3 /usr/bin/pip
# Update Python with the required packages
RUN pip install --upgrade pip
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
The image builds successfully, but when I ran the code I got this error back:
q9zp213vt4-algo-1-cqgxl | /usr/local/lib/python3.6/dist-packages/paramiko/transport.py:33: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography and will be removed in a future release.
This message alerted me to the use of Python 3.6, and when I checked my image's Python version from the CLI I could indeed see it was the default Python version, 3.6.9.
Apologies for this basic question, but I'm not familiar with working with Docker and I'm not sure where I'm going wrong. The Ubuntu base image cannot be changed.

You need to point the link at the specific Python 3 version. RUN ln -s /usr/bin/python3 /usr/bin/python only tells Ubuntu to use the default Python 3 instead of Python 2, not which Python 3. On my machine, python3 is linked to python3.10. You can forcibly replace the link with RUN ln -fs /usr/bin/python3.8 /usr/bin/python3
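As a minimal sketch (assuming Python 3.8 from the deadsnakes PPA is already installed, as in the question's Dockerfile), the symlink lines could be replaced with something like this, plus a check so the build fails early if the wrong interpreter is still picked up:
# point both python3 and python at the deadsnakes interpreter
RUN ln -fs /usr/bin/python3.8 /usr/bin/python3 && \
    ln -fs /usr/bin/python3.8 /usr/bin/python
# verify the version during the build
RUN python --version && python3 --version
Note that the pip3 wrapper from the python3-pip package may still run under the distribution's default interpreter; invoking it as python -m pip instead should keep installs tied to Python 3.8, provided the pip module is importable from that interpreter.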

Related

Install specific version of python in docker

I'm trying to run a container from the image nvcr.io/nvidia/tensorflow:22.08-tf2-py3, but I have a problem.
The built Docker image contains Python 3.8, but I don't understand why I have this version of Python in my image; 3.8 is not explicitly specified in the Dockerfile. I need Python >= 3.10 for the libraries I use to work correctly. When I try to install another version:
RUN apt-get update && apt-get install -y software-properties-common && add-apt-repository ppa:deadsnakes/ppa && apt-get install -y python3.11
RUN python3.11 -m pip install --upgrade --no-cache -r requirements.txt
I get the error /usr/bin/python3.11: No module named pip while building the image.
How can I correctly install a specific version of Python in my Docker image using a Dockerfile?
You got Python 3.11 alright, but you are missing the pip module. Add python-pip or python3-pip to the list of packages you install with apt-get.
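A minimal sketch of that change, assuming the deadsnakes PPA is added as in the question (and, if python3.11 still cannot import pip afterwards, installing python3.11-venv and running python3.11 -m ensurepip is a common fallback):
RUN apt-get update && \
    apt-get install -y software-properties-common && \
    add-apt-repository ppa:deadsnakes/ppa && \
    apt-get update && \
    apt-get install -y python3.11 python3-pip
RUN python3.11 -m pip install --upgrade --no-cache -r requirements.txt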
This is regarding your problem with pip for python3.11, not the TensorFlow version support.
Looking at the official python:3.11 Dockerfile here, you should be able to use the code below.
# if this is called "PIP_VERSION", pip explodes with "ValueError: invalid truth value '<VERSION>'"
ENV PYTHON_PIP_VERSION 22.3
# https://github.com/docker-library/python/issues/365
ENV PYTHON_SETUPTOOLS_VERSION 65.5.0
# https://github.com/pypa/get-pip
ENV PYTHON_GET_PIP_URL https://github.com/pypa/get-pip/raw/66030fa03382b4914d4c4d0896961a0bdeeeb274/public/get-pip.py
ENV PYTHON_GET_PIP_SHA256 1e501cf004eac1b7eb1f97266d28f995ae835d30250bec7f8850562703067dc6
RUN set -eux; \
\
wget -O get-pip.py "$PYTHON_GET_PIP_URL"; \
echo "$PYTHON_GET_PIP_SHA256 *get-pip.py" | sha256sum -c -; \
\
export PYTHONDONTWRITEBYTECODE=1; \
\
python get-pip.py \
--disable-pip-version-check \
--no-cache-dir \
--no-compile \
"pip==$PYTHON_PIP_VERSION" \
"setuptools==$PYTHON_SETUPTOOLS_VERSION" \
; \
rm -f get-pip.py; \
\
pip --version
I was able to install pip for my python3.11 on Fedora (outside Docker) using python3 -m ensurepip (taken from here).

How to install pip for certain version of python3 in docker

I'm trying to build this Docker image:
FROM nvcr.io/nvidia/l4t-base:r32.6.1 as base
COPY requirements.txt /tmp/r.txt
RUN apt update && apt install -y python3.7
RUN update-alternatives --install /usr/local/bin/python python /usr/bin/python3.6 20 && \
update-alternatives --install /usr/local/bin/python python /usr/bin/python3.7 40
RUN apt-get -yqq update && \
apt install -y python3-pip gcc
The image nvcr.io/nvidia/l4t-base:r32.6.1 has Ubuntu 18.04 and Python 3.6 installed. I want to make an image with Python 3.7 as the default.
But when I build a container from this image, pip3 comes from Python 3.6, and when I run pip install it installs all libraries for Python 3.6.
root@localhost:/# pip3 --version
pip 9.0.1 from /usr/lib/python3/dist-packages (python 3.6)
But I want those libraries to be for Python 3.7.
I tried to delete Python 3.6; the Dockerfile is:
FROM nvcr.io/nvidia/l4t-base:r32.6.1 as base
RUN apt-get -y purge python3.6 && \
apt-get -y autoremove
RUN apt-get -yqq update && \
apt install -y python3.7 python3-pip gcc
But again pip3 is for python 3.6.
Any thoughts?
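One possible approach, shown only as a minimal sketch rather than a verified answer, is to bootstrap pip directly for the 3.7 interpreter and then always invoke it as python3.7 -m pip. It assumes the image can reach Ubuntu 18.04's universe repository for python3.7 and python3.7-distutils, and uses the version-specific get-pip.py script:
RUN apt-get update && \
    apt-get install -y python3.7 python3.7-distutils wget && \
    wget -O get-pip.py https://bootstrap.pypa.io/pip/3.7/get-pip.py && \
    python3.7 get-pip.py && \
    rm -f get-pip.py
RUN python3.7 -m pip install -r /tmp/r.txt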

docker ERROR: Could not find a version that satisfies the requirement apturl==0.5.2

I am using Windows 10. I want to build a Linux-based container so I can replicate code and dependencies developed on Ubuntu. When I try to build it, it outputs the error message above.
From my understanding, Docker Desktop runs a Linux kernel under the hood, which is what allows Windows users to run Linux-based containers, so I'm not sure why it is outputting this error.
My dockerfile looks like this:
FROM ubuntu:18.04
ENV PATH="/root/miniconda3/bin:${PATH}"
ARG PATH="/root/miniconda3/bin:${PATH}"
RUN apt update \
&& apt install -y htop python3-dev wget
RUN wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh \
&& mkdir root/.conda \
&& sh Miniconda3-latest-Linux-x86_64.sh -b \
&& rm -f Miniconda3-latest-Linux-x86_64.sh
RUN conda create -y -n ml python=3.7
COPY . src/
RUN /bin/bash -c "cd src \
&& source activate ml \
&& pip install -r requirements.txt"
requirements.txt contains:
apturl==0.5.2
asn1crypto==0.24.0
bleach==2.1.2
Brlapi==0.6.6
certifi==2020.11.8
chardet==3.0.4
click==7.1.2
command-not-found==0.3
configparser==5.0.1
cryptography==2.1.4
cupshelpers==1.0
dataclasses==0.7
When I run the docker build command it outputs:
1.649 ERROR: Could not find a version that satisfies the requirement apturl==0.5.2
1.649 ERROR: No matching distribution found for apturl==0.5.2
Deleting that line and rerunning led to another error. All of the errors seem to be associated with Ubuntu packages.
Am I not running an Ubuntu container? Why am I not allowed to install Ubuntu packages?
Thanks!
You are trying to install Ubuntu packages with pip (which is for Python packages).
Try apt install -y apturl instead.
If you want to install Python packages, write pip install package_name
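A minimal sketch of how the split might look in the question's Dockerfile (the exact division of requirements.txt is an assumption; only packages published on PyPI belong in the pip step):
# Ubuntu system packages go through apt
RUN apt update && apt install -y apturl htop python3-dev wget
# Python packages from PyPI go through pip inside the conda environment
RUN /bin/bash -c "source activate ml && pip install certifi==2020.11.8 chardet==3.0.4 click==7.1.2"
A requirements.txt produced by running pip freeze against a desktop Ubuntu system interpreter will contain apt-managed packages such as apturl, command-not-found and cupshelpers; those entries need to be removed before pip install -r requirements.txt can succeed.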

lsb_release: command not found in latest Ubuntu Docker container

I just wanted to test something out real quick. So I ran a docker container and I wanted to check which version I was running:
$ docker run -it ubuntu
root@471bdb08b11a:/# lsb_release -a
bash: lsb_release: command not found
root@471bdb08b11a:/#
So I tried installing it (as suggested here):
root@471bdb08b11a:/# apt install lsb_release
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package lsb_release
root@471bdb08b11a:/#
Anybody any idea why this isn't working?
It seems lsb_release is not installed.
You can install it via:
apt-get update && apt-get install -y lsb-release && apt-get clean all
This error can also happen after uninstalling or upgrading the default python3 version in Ubuntu 16.04.
The way to correct this is by reinstalling the original python3 version which comes with Ubuntu and relinking it (in Ubuntu 16.04 the default python3 version is Python 3.5):
sudo rm /usr/bin/python3
sudo ln -s /usr/bin/python3.5 /usr/bin/python3
Just use cat /etc/os-release and that should display the OS details.
The same command works on Debian, Ubuntu, and Fedora (screenshots of the output omitted).
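For illustration, the output inside an ubuntu:18.04 container looks roughly like this (the values are representative, not copied from the screenshots):
$ cat /etc/os-release
NAME="Ubuntu"
VERSION="18.04.6 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.6 LTS"
VERSION_ID="18.04"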
lsb_release.py lives in /usr/share/pyshared, which doesn't look like a path that Python 3.6 and above reference.
I found that the following will create a link back from a later Python install to this /usr/share script:
sudo ln -s /usr/share/pyshared/lsb_release.py /usr/lib/python3.9/site-packages/lsb_release.py
In case one is trying to deal with lsb_release: command not found on Fedora or Red Hat, the package to install is redhat-lsb-core, so: sudo dnf install redhat-lsb-core
While writing a Dockerfile, we can add the lsb-release package like this:
RUN apt-get update -y \
&& apt-get upgrade -y \
&& apt-get install lsb-release -y \
&& apt-get clean all
This assumes the OS is Ubuntu.

How can I Dockerize a Python script which contains Spark dependencies?

I have a Python file in which I try to import Spark libraries.
When I built it with the Dockerfile, it gave me the error 'JAVA_HOME' is not set.
I tried to install Java through the Dockerfile, but that gives an error as well.
Below is the Dockerfile I tried to execute.
FROM python:3.6.4
RUN apt-get update && \
apt-get upgrade -y && \
apt-get install -y software-properties-common && \
add-apt-repository ppa:webupd8team/java -y && \
apt-get update && \
echo oracle-java7-installer shared/accepted-oracle-license-v1-1 select true | /usr/bin/debconf-set-selections && \
apt-get install -y oracle-java8-installer && \
apt-get clean
ENV JAVA_HOME /usr/lib/jvm/java-8-oracle
ADD Samplespark.py /
COPY Samplespark.py /opt/ml/Samplespark.py
RUN pip install pandas
RUN pip install numpy
RUN pip install pyspark
RUN pip install sklearn
RUN pip install sagemaker_pyspark
RUN pip install sagemaker
CMD [ "python", "./Samplespark.py" ]
ENTRYPOINT ["python","/opt/ml/Samplespark.py"]
Please help me to install the Java dependencies for PySpark in Docker.
You have a Debian OS, not Ubuntu. These PPAs are for Ubuntu. According to this article, Oracle Java 8 is not available in Debian due to licensing issues.
You have the following options:
1. Use an Ubuntu Docker image which comes with Oracle Java 8 preinstalled, like this one
2. Follow this tutorial on how to install Oracle Java 8 on Debian Jessie
3. Install OpenJDK instead: sudo apt-get install openjdk-8-jre
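A minimal sketch of option 3 applied to the question's Dockerfile, assuming the python:3.6.4 image is Debian-based, that openjdk-8-jre-headless is available from its repositories, and that the JVM lands in the usual /usr/lib/jvm/java-8-openjdk-amd64 path:
FROM python:3.6.4
# install OpenJDK 8 instead of the Oracle installer from an Ubuntu PPA
RUN apt-get update && \
    apt-get install -y openjdk-8-jre-headless && \
    apt-get clean
ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64
COPY Samplespark.py /opt/ml/Samplespark.py
RUN pip install pandas numpy pyspark sklearn sagemaker_pyspark sagemaker
ENTRYPOINT ["python", "/opt/ml/Samplespark.py"]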
