Python 3.6 in TensorFlow GPU Docker images - python-3.x

How can I get Python 3.6 in the TensorFlow Docker images?
All the images I tried (latest, nightly) are using Python 3.5, and I don't want to modify all my scripts.

The TensorFlow images are based on Ubuntu 16.04, as you can see from the Dockerfile. This release ships with Python 3.5 as standard.
So you'll have to rebuild the image, and the Dockerfile will need editing, even though you need to do the actual build with the parameterized_docker_build.sh script.
This answer on Ask Ubuntu covers how to get Python 3.6 on Ubuntu 16.04.
The simplest way would probably be just to change the FROM line in the Dockerfile to FROM ubuntu:16.10, and python to python3.6 in the initial apt-get install line.
Of course, this may break some other Ubuntu version-specific thing, so an alternative would be to keep Ubuntu 16.04 and install one of the alternative PPAs also listed in the linked answer:
RUN add-apt-repository ppa:deadsnakes/ppa && \
    apt-get update && \
    apt-get install -y python3.6
Note that you'll need this after the initial apt-get install, because that installs software-properties-common, which you need in order to add the PPA.
Note also, as in the comments to the linked answer, that you will need to symlink to Python 3.6.
Finally, note that I haven't tried any of this. There may be gotchas, and you may need to make another change to ensure that the correct version of Python is used by the running container.
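To illustrate the PPA route, here is a minimal, untested sketch; the base image, package list and symlink target are assumptions, and the real TensorFlow Dockerfile contains many more build steps:
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y software-properties-common
# add the deadsnakes PPA and install Python 3.6
RUN add-apt-repository ppa:deadsnakes/ppa && \
    apt-get update && \
    apt-get install -y python3.6 python3.6-dev
# put python3.6 first on the PATH without touching /usr/bin/python3 (apt tools rely on it)
RUN ln -s /usr/bin/python3.6 /usr/local/bin/python3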

You can use stable images which are supplied by third parties, like ufoym/deepo.
One that fits TensorFlow, Python 3.6 and CUDA 10 can be found here, or you can pull it directly using the command docker pull ufoym/deepo:py36-cu100.
I use their images all the time and have never had problems.
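For example, a typical way to start a container from that image might look like the lines below; the GPU flag depends on your Docker/nvidia-docker setup, and the exact interpreter name inside the image may vary:
docker pull ufoym/deepo:py36-cu100
# --runtime=nvidia assumes nvidia-docker2; newer Docker versions use --gpus all instead
docker run --runtime=nvidia -it -v $(pwd):/workspace ufoym/deepo:py36-cu100 python --version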

With this answer, I just want to describe how I solved this problem (SiHa's previous answer helped me a lot, but I had to add a few steps so that it worked completely).
Context:
I'm using a package (segmentation models for UNet++) that requires tensorflow==1.4.0 and keras==2.2.2.
I tried to use the Docker image for TensorFlow 1.4.0; however, the default Python version of this image is 3.5, which is not compatible with my package.
I managed to install Python 3.6 on the Docker image thanks to the following files:
My Dockerfile contains the following lines:
Dockerfile:
FROM tensorflow/tensorflow:1.4.0-gpu-py3
RUN mkdir /AI_PLATFORM
WORKDIR /AI_PLATFORM
COPY ./install.sh ./install.sh
COPY ./requirements.txt ./requirements.txt
COPY ./computer_vision ./computer_vision
COPY ./config.ini ./config.ini
RUN bash install.sh
install.sh:
#!/usr/bin/env bash
pip install --upgrade pip
apt-get update
apt-get install -y python3-pip
add-apt-repository ppa:deadsnakes/ppa &&
apt-get update &&
apt-get install python3.6 --assume-yes
apt-get install -y libpython3.6
python3.6 -m pip install --upgrade pip
python3.6 -m pip install -r requirements.txt
Three things are important:
use python3.6 -m pip instead of pip, otherwise the packages are installed for Python 3.5, the default version on Ubuntu 16.04
use docker run <image> python3.6 <command> to run your containers with Python 3.6 (see the sketch after the requirements list below)
in the requirements.txt file, I had to pin the following versions:
h5py==2.10.0
tensorflow-gpu==1.4.1
keras==2.2.2
keras-applications==1.0.4
keras-preprocessing==1.0.2
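As a sketch, building and running might look like this; the image tag, the script path and the --runtime=nvidia flag are hypothetical and depend on your setup:
docker build -t ai_platform .
# run a script with Python 3.6 inside the container (script path is a placeholder)
docker run --runtime=nvidia -it ai_platform python3.6 computer_vision/main.py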
I hope that this answer will be useful.

Maybe the image I created will help you. It is based on the cuda-10.0-devel image and has TensorFlow 2.0a GPU installed.
You can use it as a base image for your own implementation. The image itself doesn't do anything. I put the image on Docker Hub: https://cloud.docker.com/repository/docker/patientzero/tensorflow2.0a-gpu-py3.6
The GitHub repo is located here: https://github.com/patientzero/tensorflow2.0-python3.6-Docker
Pulling it won't do much, but for completeness:
$ docker pull patientzero/tensorflow2.0-gpu-py3.6
Edit: changed to a general TensorFlow 2.0.x image.
Also, as mentioned here, the official image for the 2.0 beta release now comes with Python 3.6 support.
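If you want to build on top of it as a base image, a minimal sketch could look like this; the requirements.txt, the main.py entry point and the python3.6 interpreter name inside the image are assumptions:
FROM patientzero/tensorflow2.0-gpu-py3.6
WORKDIR /app
# assumed project files; adjust to your own layout
COPY requirements.txt .
RUN python3.6 -m pip install -r requirements.txt
COPY . /app
CMD ["python3.6", "main.py"]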

Related

How to install google-cloud-bigquery on python-alpine based docker?

I'm trying to build a Docker image with Python 3 and google-cloud-bigquery using the following Dockerfile:
FROM python:3.10-alpine
RUN pip3 install google-cloud-bigquery
WORKDIR /home
COPY *.py /home/
ENTRYPOINT ["python3", "-u", "myscript.py"]
But I'm getting errors on pip3 install google-cloud-bigquery (too long to paste here).
What's missing to install this on python-alpine?
Looks like an incompatibility issue with the latest version of google-cloud-bigquery (>3) and numpy:
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
Try specifying a previous version, this works for me:
RUN pip3 install google-cloud-bigquery==2.34.4
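Applied to the Dockerfile from the question, that gives roughly the following; everything except the pinned version is taken from the question itself:
FROM python:3.10-alpine
# pin google-cloud-bigquery below 3.x to avoid the numpy/pyarrow build issue
RUN pip3 install google-cloud-bigquery==2.34.4
WORKDIR /home
COPY *.py /home/
ENTRYPOINT ["python3", "-u", "myscript.py"]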
Actually, it seems it is not a problem with numpy, which builds smoothly with all the dependency libs installed, but rather with pyarrow, which does not support an Alpine + pip build. I've found a workaround by using the Alpine pre-built version of pyarrow. It is much easier than building pyarrow from source. This build works for me just fine:
FROM python:3.10.6-alpine3.16
RUN apk add --no-cache build-base linux-headers \
py3-apache-arrow=8.0.0-r0
# Copy pyarrow to the site-packages of the actual Python path. The Alpine Python
# path and the Python Docker Hub image path are different.
RUN mv /usr/lib/python3.10/site-packages/* \
/usr/local/lib/python3.10/site-packages/
RUN rm -rf /usr/lib/python3.10
RUN --mount=type=cache,target=/root/.cache/pip \
pip install google-cloud-bigquery==3.3.2
Update the Python version, Alpine version and py3-apache-arrow version to install later versions; these are the latest ones at the time of writing.
Also make sure to remove the build dependencies (build-base, linux-headers) from your release image. I prefer multi-stage builds for this.
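A rough, untested multi-stage sketch of that idea; the versions are copied from above, everything else is an assumption:
# build stage: install build deps and the pre-built pyarrow workaround
FROM python:3.10.6-alpine3.16 AS build
RUN apk add --no-cache build-base linux-headers py3-apache-arrow=8.0.0-r0
RUN mv /usr/lib/python3.10/site-packages/* /usr/local/lib/python3.10/site-packages/ && \
    rm -rf /usr/lib/python3.10
RUN pip install google-cloud-bigquery==3.3.2

# final stage: no build-base or linux-headers here
FROM python:3.10.6-alpine3.16
COPY --from=build /usr/local/lib/python3.10/site-packages /usr/local/lib/python3.10/site-packages
# note: native libraries pulled in by apk for pyarrow live outside site-packages,
# so you may still need to apk add those runtime dependencies in this stage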

Setting up mysql-connector-python in Docker file

I am trying to set up a MySQL connection that will work with SQLAlchemy in Python 3.6.5. I have the following in my Dockerfile:
RUN pip3 install -r /event_git/requirements.txt
I also have, in requirements.txt:
mysql-connector-python==8.0.15
However, I am not able to connect to the DB. Is there anything else that I need to do to set this up?
Update:
I got 8.0.5 working but not 8.0.15. Apparently, a protobuf dependency was added; does anyone know how to handle that?
The Dockerfile is:
RUN apt-get -y update && apt-get install -y python3 python3-pip fontconfig wget nodejs nodejs-legacy npm
RUN pip3 install --upgrade pip
# Copy contents of this directory (i.e. full source) to image
COPY . /my_project
# Install Python dependencies
RUN pip3 install -r /event_git/requirements.txt
# Set event_git folder as working directory
WORKDIR /my_project
ENV LANG C.UTF-8
I am running it via
docker build -t event_git .;docker run -t -i event_git /bin/bash
and then executing a script; the db is on my local machine. This is working on mysql-connector-python==8.0.5 but not 8.0.15, so the setup is ok; I think I just need to fulfill the protobuf dependency that was added (see https://github.com/pypa/warehouse/issues/5537 for mention of the protobuf dependency).
mysql-connector-python has Python Protobuf as an installation requirement, which means that protobuf will be installed along with mysql-connector-python.
If this doesn't work, try adding protobuf==3.6.1 to your requirements.txt.
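For example, a requirements.txt pinning both, using the versions mentioned in this thread:
mysql-connector-python==8.0.15
protobuf==3.6.1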
Figured out the issue. The key is that import mysql.connector needs to be at the top of the file where create_engine is called. Still not sure of the exact reason, but at the very least that seems to define _CONNECTION_POOLS = {}. If anyone knows why, please do share your thoughts.

How do I get Pip to work with Python3 on Ubuntu 12.04

I'm running Ubuntu 12.04 and do not have the option of upgrading / using something else. It has Python 3.2 on it. I have found out that pip doesn't ship automatically with Python versions earlier than 3.4. How can I get pip to work with Python 3 on this machine?
I tried downloading a copy of get-pip.py and running it with python3.2, but I kept getting an error message saying that support for versions earlier than 3.4 was dropped.
I spent a little while getting this all to work, so I wanted to make a detailed post.
First I found out I could get Python3.4 onto my Ubuntu 12.04 machine. To do this run:
sudo add-apt-repository ppa:fkrull/deadsnakes
sudo apt-get update
sudo apt-get install python3.4
Source: Is There An Easy Way To Install Python 34 On
Next, I removed Python3.2 (I only wanted to have a single Python3.x version on my machine) by running:
sudo apt-get remove python3.2
sudo apt-get remove python3.2-minimal
Then, to get pip I ran:
sudo curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py"
sudo python3.4 get-pip.py
Source: How To Install Pip On Ubuntu 12 04 LTS
At this point Python 3.4 and pip can both be run. Example:
python3.4 main.py
pip3 install requests
At this point I wanted to be able to just use python3 to run python3.4 - trying just python3 kept telling me I had to install python-minimal. Doing this reinstalled Python 3.2 for me, which I did not want. Instead, I created an alias by doing the following:
vim ~/.bashrc # Open the file for editing
Add: alias python3=python3.4 to the file.
. ~/.bashrc # Make the changes apply to the current session
Source: How Do I Create A Permanent Bash Alias
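Equivalently, as a non-interactive sketch that appends the alias instead of opening vim:
echo "alias python3=python3.4" >> ~/.bashrc  # add the alias permanently
. ~/.bashrc                                  # make it apply to the current session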

Step by step: Installing Python 3.3, Lighttpd & Pymongo on Ubuntu 12.04

I'm currently migrating to a new computer and I need to reinstall the software I am using, which is:
Python 3.3,
Lighttpd (newest version),
Pymongo (newest version),
Ubuntu 12.04 Desktop (The System I'm using)
I started to install Python 3.3 by downloading it from its official website (as a tar.bz2 file) and by following this tutorial. Afterwards I installed Lighttpd and changed the lighttpd.conf for Python by following this tutorial, too.
I tried several paths for my cgi.assign; none of them worked. In particular, /opt/python3.3/bin/python3.3 should be working, but it shows a 500 Internal Server Error all the time with a "hello world" test script.
Now, regardless of this problem, I have no clue how to install Pymongo. If I try to install pip OR easy_install for python3.3, I have to manually download it and execute the setup.py with my python3.3 executable, right? Because this always fails with an error:
`Error missing zlib on a bundle called distribute-0.7.3 (is this even the right tool I need, because it seems to be a legacy wrapper !?) or unknown url type: https for pymongo2.6.2 itself.`
I'm going crazy with this setup. Why is this so difficult to handle? Other programs are just a few clicks to install, even on a system like Ubuntu, but these particular development tools seem to be really difficult to install. If anybody has an idea on how to install all three together, or has information on a better solution, please help me out.
The system is used to program Python scripts in Eclipse and to try them out directly on the system (lighttpd). The database used is MongoDB. Python and MongoDB communicate over the Pymongo driver. I am planning to use the system on a server distribution on release, and it has to scale nicely to a high number of executions.
Thanks for your time,
It's easiest to use the Ubuntu repositories:
sudo apt-get update
sudo apt-get install python3 python3-pip lighttpd python-pymongo
Or if that only installs the python2.x pymongo, use pip, which you've just installed:
sudo pip-3.3 install pymongo
Or better yet, use a virtualenv with the help of virtualenvwrapper (docs)
sudo pip install virtualenvwrapper
... # follow instructions for installing virtualenvwrapper
mkvirtualenv --python=/usr/bin/python3 -i pymongo mongoppd
workon mongoppd
... which will segregate the environment I've called 'mongoppd' from the rest of your system, so you can't cause any trouble. Then you don't need sudo to pip-3.3 install things: just workon mongoppd, then pip-3.3 install [...]. Or put it after the -i flag when you create the virtualenv to get it installed straight away.
In general, on Ubuntu, you should hardly ever have to install something manually. Your first attempt should be using sudo apt-get install (use tab-complete to see what's available or just google "ubuntu 12.04 packages [...]" and you'll find the list of packages). Then for python use pip install or pip-3.3 install as appropriate. You'll only need to run python setup.py install if you need to install a development version of a package or something obscure that's not on pip. I don't think there's a good reason to ever use easy_install these days.
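For example, to check what's available in the repositories before falling back to pip (the search terms are just what you would look for in this case):
apt-cache search pymongo          # list candidate packages for the pymongo driver
apt-cache search python3 | grep -i mongo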

ubuntu scipy works for python2.7 but not for 3.2

I have tried many ways to get scipy to play nice with python3.2 but no joy yet.
I have tried:
sudo apt-get build-dep scipy
no joy
and
sudo apt-get install python-numpy python-scipy python-matplotlib ipython ipython-notebook python-pandas python-sympy python-nose
and still no joy
The goal is to get scipy to play nice with ipython running python3.2.
Here is the terminal output.
http://pastebin.com/LkPZUSAX
Help / assistance is appreciated.
try:
sudo apt-get install python32-numpy
If you have multiple versions of Python installed on your system, then you have to specify the version for which you want to install the library.
You can also run
python --version
to check the default Python for your system.
Try running the command
sudo apt-get install python3.2-numpy
instead.
Running the command
sudo apt-get install python-numpy
installs for python2.7 by default in my case.
So one must specify the Python version in the apt-get command.
