Dockerfile Build Fails - Pipenv and Pyenv Multiple Versions of Python Found - python-3.x

Question: How can I fix the Dockerfile to properly freeze requirements.txt and successfully build?
I am working through deploying a Python script with a Dockerfile, while using Pyenv and Pipenv for local development.
On the build step where Pipfile.lock is frozen to requirements.txt, I receive the following error:
Error: Invalid value for "--python": Expected Python at path
/Users/jz/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3 does not exist
My Dockerfile is:
FROM python:3.7
RUN pip install pipenv
COPY Pipfile* /tmp/
RUN cd /tmp && pipenv --python /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3 lock --requirements > requirements.txt
ENV RDKAFKA_INSTALL=system
RUN pip install -r /tmp/requirements.txt
COPY . /tmp/app/
RUN pip install /tmp/app/
CMD ["python", "./tmp/app/main.py"]
Creation of the local Pipenv environment provided this information about the interpreter (which was used in the Dockerfile):
Using /usr/local/opt/pyenv/versions/3.8.0/bin/python3 (3.8.0) to create virtualenv…
⠙ Creating virtual environment...
Using base prefix '/usr/local/opt/pyenv/versions/3.8.0'
New python executable in /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3
Also creating executable in /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python
Pyenv is using 3.8.0 locally:
pyenv versions
system
3.7.5
* 3.8.0 (set by /Users/x/Projects/quanter/.python-version)
Any help getting this working would be greatly appreciated! Thank you.

The original error comes from pipenv not being able to find a Python executable at the given path. The path is hardcoded, but on every fresh build pipenv creates an environment with a different name, so a virtualenv path copied from your host machine will not exist inside the image. Take a look at this interesting discussion: How does pipenv know the virtualenv for the current project? It seems the name could be predicted, but there is a better solution.
Firstly, you do not need to specify a path to the Python executable at all, because the python Docker image ships with a single system Python installation, available by default as python (of course, you can still create environments with other Python versions by hand).
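With that in mind, a minimal fix for the failing line, assuming you keep the requirements.txt flow, is simply to drop the hardcoded interpreter path:
RUN cd /tmp && pipenv lock --requirements > requirements.txt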
Secondly, you can handle pipenv inside Docker in a cleaner, preferable way instead of converting the Pipenv flow into a pip-style flow. Take a look at the example below:
FROM python:3.7
COPY Pipfile Pipfile.lock /
# --system: install into the system Python instead of a virtualenv
# --deploy: abort the build if Pipfile.lock is out of date
# --ignore-pipfile: install strictly from Pipfile.lock
RUN pip3 install pipenv \
 && pipenv install --system --deploy --ignore-pipfile
Several notes worth considering:
Commit Pipfile.lock to VCS. That way you can be sure that every environment has exactly the same versions of packages and their dependencies.
You may be interested in the --system option. It tells pipenv to install dependencies into the system Python instead of a virtualenv, which is what you want inside a container. According to this answer (https://stackoverflow.com/a/55610857/9926721) it is not officially recommended, but the quite popular project wemake-django-template uses it as well.
Consider splitting up environments. The --dev flag is useful for this: no more juggling several requirements files, and dev-only dependencies are handled more efficiently, as sketched below.
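For instance, both environments can be driven from the same Pipfile (standard Pipenv commands):
# in the Docker image: production dependencies only, pinned by the lock file
pipenv install --system --deploy --ignore-pipfile
# on a developer machine: everything, including dev-only packages
pipenv install --dev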

Related

Install the latest version of my package from working directory into my local environment using Python's poetry

It's extremely useful for the development workflow to be able to build and install the latest version of a package into a local environment. You can then interactively validate and debug by importing this latest version into a Python shell or a Jupyter notebook. The problem is that I've recently adopted Poetry and cannot figure out how to do this anymore. So...
How do I install the latest version of my package from the current working directory into my local environment using Poetry?
Moving on from setuptools
Back in the day, I used to always use setuptools and it worked great. I'd put a setup.py file in the root of my repository, create a virtual environment (let's say using conda) for the project, and do...
pip install -e .
From here, I could fire up a python shell, or even configure a jupyter kernel to use this virtual environment, and I'd always have the latest version of my package to interact with.
Now setuptools has its limitations, and we've since moved on to Poetry to more tightly control dependencies and handle more sophisticated build needs and such.
The problem with poetry
If you look up the pip install -e . equivalent in Poetry, you will find this issue. It looks like the creator of Poetry considers installing directly from source like this a hack and has no interest in supporting it. (BTW: I've tried poetry build and then pulling out the setup.py file as he suggested, and it does not work.)
Linking directly to source is not necessary; I'm willing to run an install command to get the latest version of the package. And when I do this with Poetry, it appears to work:
cd root/of/my/project
poetry install
Installing dependencies from lock file
No dependencies to install or update
Installing the current project: my-project (0.4.8) <-- this is the latest version according to the source code in the working directory
The problem is that if I open a Python shell and try to import my package, it is linked to the last version that was installed from a remote artifact repository (via pip install my-package), not what's in my working directory:
python
...
>>> import my_package
>>> my_package.__version__
'0.4.7'
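A quick way to check which copy Python actually resolves (my_package / my-package are the placeholder names used above):
python -c "import my_package; print(my_package.__version__, my_package.__file__)"
pip show my-package   # the version and location pip believes is installed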
Now, even though I'm using Poetry, I'm using a conda environment to specify the Python version for my project, and then installing and using Poetry from inside that:
source activate my-package
(my-package) ... $ poetry update
I also know that Poetry (not very transparently) can create and manage its own virtual environment on your behalf. I thought maybe this wasn't working because I needed to be inside that environment (whereas I was only inside my conda environment, while Poetry installed the 0.4.8 version of my package into the virtualenv it manages).
I tried both poetry shell and poetry run to test this out. I get the same result:
poetry shell
Virtual environment already activated: /Users/U6020643/.conda/envs/my-package
Python 3.8.5
...
>>> import my_package
>>> my_package.__version__
'0.4.7'
What gives?
The way I fixed this: I stopped using conda to manage the Python version for projects that involve poetry and instead use pyenv.
Below is how I made that work. This was very helpful!
1. Removing conda as the default environment manager.
This involved removing the conda initialization block (the part that activates the base environment) from ~/.bash_profile.
Now open a new shell and verify that there's no conda environment prefix, e.g. (base) ... $.
2. Installing pyenv using Homebrew
It had been a while, and an OS upgrade or two, since I had interacted with Homebrew, so some housekeeping was needed first.
brew cleanup # This made it so that brew update didn't take forever
brew update
brew upgrade
brew cleanup
Then...
brew install pyenv
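Depending on your shell, you may also need pyenv's initialization in your profile so that its shims take precedence over the system Python (this follows pyenv's README; adjust the file for your shell):
# in ~/.bash_profile or ~/.zshrc
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"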
3. Install Python version(s)
Let's say you need/want Python 3.
pyenv install 3.8.5
4. Set the local Python version for your project
cd your/project/root/
pyenv local 3.8.5
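A quick sanity check that the shim now resolves as expected (output shapes assumed):
pyenv version      # should report: 3.8.5 (set by .../your/project/root/.python-version)
python --version   # should report: Python 3.8.5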
5. Install poetry
See here.
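For reference, one common method is Poetry's official installer script (check the Poetry docs for the current command):
curl -sSL https://install.python-poetry.org | python3 -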
6. Use it
Now run poetry install and poetry shell, then check the version: hey, it works!
cd your/project/root
poetry install
poetry shell
my_package --version # Package has a CLI.

Dependency file in python

I am new to Python. I am creating an API in Python using flask-restful. I have created APIs in Java; in Java we have a pom.xml file for dependencies. Is there anything similar for Python and flask-restful?
Yes. In Python we generally create a requirements.txt file, so anyone who wants to install all the requirements can simply run the command:
pip install -r requirements.txt
So if you are using virtualenv, you can simply do:
pip freeze > requirements.txt
Otherwise you need to add all the dependencies manually, and the requirements.txt file will look like:
decorator==4.3.0
defusedxml==0.5.0
entrypoints==0.2.3
Flask==1.0.2
google==2.0.1
Note: these are just example entries.
I would recommend using pipenv.
In Java, you need to know where your library dependencies are located, and they are usually downloaded once per project as you need them. In other words, each project has its own set of dependencies. The same goes for non-global NPM packages (package.json), Ruby gems (Gemfile), and so on.
But in Python, everything you install with pip is global by default. Anything you install with pip will make your system Python installation messy at best, and at worst it will not be portable between developer machines. We get around this with the concept of a virtual environment, which is more or less a self-contained copy of whatever Python version you're using, local to one project.
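For example, here is a minimal sketch of that pattern using the standard-library venv module (directory names are arbitrary):
python3 -m venv .venv            # create an isolated environment in ./.venv
source .venv/bin/activate        # use it for this shell session
pip install flask-restful       # installs into .venv only, not globally
pip freeze > requirements.txt    # record the project's dependencies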
pipenv works pretty similarly to npm.
You initialise with pipenv --three and use pipenv install Flask (for example) to install packages and track them in a Pipfile and its lock file (Pipfile.lock). You can then clone the project on another computer and run pipenv install to install all dependencies.
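Concretely, that flow looks like this (Flask is just an example package):
pipenv --three          # initialise an environment using Python 3
pipenv install Flask    # install and record in Pipfile / Pipfile.lock
# on another machine, after cloning the project:
pipenv install          # install everything recorded in the lock file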
If this tool does not work for you, you may also want to try pyenv and virtualenv and use a requirements.txt file as suggested by Rahul.
Hope that helps!

Python 3 - How do you re-create your Pipfile?

I am a major Python noob, and I made the mistake of manually deleting my Pipfile and Pipfile.lock, thinking that they would eventually regenerate.
How can I re-create these files?
There is a simple fix to this:
First you need to install pipenv if you haven't already:
pip install pipenv
Then change directory to the folder containing your Python project and initialise Pipenv (replace my_project with the name of your project folder):
cd my_project
pipenv install
This will create two new files, Pipfile and Pipfile.lock, in your project directory, and a new virtual environment for your project if it doesn’t exist already.
For regular pip:
pip freeze > requirements.txt
For pipenv:
pipenv run pip freeze > requirements.txt
And to install from it:
pip install -r requirements.txt
Situation: you have deleted Pipfile and Pipfile.lock, but your pipenv environment is still there. By this I mean that pipenv shell puts you into an environment whose python still has all your packages.
# Create a new Pipfile
pipenv install
# Get the list of your packages + versions (the would-be "requirements.txt"),
# then install them one by one.
pipenv run pip freeze | sed 's/^/pipenv install /' | source /dev/stdin
Note that this is not a perfect solution: it pins the exact versions from your current environment, just as pip freeze would in a requirements.txt, rather than the looser constraints a hand-written Pipfile might contain. However, this has saved me more than once, so I hope it helps you.

Correctly patching Python open source package without package clashing

I debated which Stack Exchange site this best fits but couldn't decide.
I'd like to contribute to an open-source project on Github, but I can't figure out how to prevent the stable version already installed on my machine and the development version I'd like to make a patch for from clashing on import.
The repository only suggests installing with pip's editable flag (pip install -e .).
What I've done so far: cloned the repository locally, then tried to import it in a Jupyter Notebook from the directory above. However, the notebook picks up the stable version installed earlier with pip. I tried appending the child directory holding the package to sys.path, but I hit the same issue. I can't seem to get relative imports working either. Do I need to uninstall the stable version?
Any tips are appreciated!
You'd use virtualenv for this. It lets you create an environment that is isolated from your system Python, in which you can install the dev version of the library.
Basic usage (for Unix-like systems) is:
$ pip install virtualenv
$ virtualenv MY_ENV
$ cd MY_ENV
$ source bin/activate # activates the local python for this shell only
(MY_ENV)$ pip install <some-module> # installs to a local and isolated python
(MY_ENV)$ python ... # runs python in the local environment
(MY_ENV)$ deactivate # disable the isolated python
$
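From inside the environment you can then install your local clone in editable mode, so the development version is the one that gets imported (the path and package name are placeholders):
(MY_ENV)$ pip install -e /path/to/your/clone
(MY_ENV)$ python -c "import the_package; print(the_package.__file__)"   # should point into the clone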

Can I create a virtualenv after making the mistake of installing all project packages globally?

So I am a Python newb, like first-project-ever newb. I jumped right in, disregarded virtualenv, and installed everything globally. Now I need to be able to share my project with other team members.
Can I create a virtualenv after making the mistake of installing all project packages globally?
I am using python 3. I've read these links:
pip installing in global site-packages instead of virtualenv
How to import a globally installed package to virtualenv folder
But I don't think that's what I'm looking for. I think I want to go the requirements.txt route.
Any guidance or suggestions?
Yes, you can create a virtual env.
You can create a requirements.txt file for the packages you installed globally.
pip3 freeze > requirements.txt
and then you can use this requirements file to install all the packages in the virtual env, which will be isolated from your global environment.
First you need to install virtualenv:
pip3 install virtualenv
Create a new virtual env using the following command:
virtualenv -p python3 envname
You can activate the virtualenv by:
source /path/to/new/virtual/environment/bin/activate
To deactivate the environment and return to your local environment just run:
deactivate
Install the requirements from the file:
pip3 install -r requirements.txt
# or, to install one at a time and keep going past individual failures:
cat requirements.txt | xargs -n 1 pip3 install
This should install all your packages in the virtual environment.
To check which Python you are using, run which python; to check installed packages, use pip3 list.
I hope this will clear your doubt.
