Python 3 - How do you re-create your Pipfile?

I am a major Python noob, and I made the mistake of manually deleting my Pipfile and Pipfile.lock, thinking that they would eventually regenerate.
How can I re-create these files?

There is a simple fix to this:
First you need to install pipenv if you haven't already:
pip install pipenv
Then change directory to the folder containing your Python project and initiate Pipenv (replace my_project with the name of your project folder):
cd my_project
pipenv install
This will create two new files, Pipfile and Pipfile.lock, in your project directory, and a new virtual environment for your project if it doesn’t exist already.
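For reference, the freshly generated Pipfile is a small TOML file along these lines (the exact source block and python_version depend on your setup):
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]

[dev-packages]

[requires]
python_version = "3.8"
As you pipenv install packages, they are added under [packages], and Pipfile.lock records the exact pinned versions and hashes.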

For regular pip:
pip freeze > requirements.txt
For pipenv:
pipenv run pip freeze > requirements.txt
And to install from that file:
pip install -r requirements.txt
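If your goal is to rebuild a Pipfile rather than just reinstall packages, pipenv can also import an existing requirements file into a new Pipfile:
pipenv install -r requirements.txt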

Situation: you have deleted Pipfile and Pipfile.lock, but your pipenv environment is still there. By this I mean pipenv shell puts you in your environment with a python that has all your packages right there.
# Create a new Pipfile
pipenv install
# Get a list of your packages + versions from the would-be "requirements.txt", then install them one by one.
pipenv run pip freeze | sed 's/^/pipenv install /' | source /dev/stdin
Note that this is not a perfect solution: it pins the exact versions currently installed in that environment, just as pip freeze / requirements.txt would, rather than the looser constraints a hand-written Pipfile might declare. However, this has saved me more than once, so I hope it helps you.
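If the source /dev/stdin trick feels opaque, a roughly equivalent one-package-at-a-time form uses xargs (assuming plain name==version lines; it will be slow, since pipenv re-locks after each install):
pipenv run pip freeze | xargs -n 1 pipenv install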

Related

Pip gets removed from virtual env after installing a package

I have created a virtual environment:
Python -m venv test
I then checked that pip was indeed in the environment. This is the content of the Scripts folder:
activate deactivate.bat pip.exe* python.exe*
activate.bat easy_install.exe* pip3.9.exe* pythonw.exe*
Activate.ps1 easy_install-3.9.exe* pip3.exe*
I then ran which pip to confirm that it would be the env pip, which it was.
I wanted to install django:
pip install django
After I installed it, I ran which pip again, but this time it was pointing to the system pip. The Scripts folder in my environment no longer had the pip files.
How do I keep things separate?
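As a general aside (not a diagnosis of the problem above), a common way to make sure you always use the pip that belongs to the active interpreter, regardless of what which pip resolves to, is to invoke pip as a module:
python -m pip install django
This guarantees the package lands in whichever environment python itself points at.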

Pipenv not activating in correct directory for virtual environment

Newbie to pipenv,
TL;DR: Trying to integrate pipenv with a project that was using venv from PyCharm, and getting unexpected directory behaviour.
Background: I was using PyCharm to make virtual environments for a Python project. I was recommended to move to pipenv so the project could be run on different machines more easily.
I followed the procedure to install pipenv, to my knowledge correctly:
pip3 install pipenv
Pipenv also seems to be working correctly; checking sys.executable gives:
/Users/swapneel/.local/share/virtualenvs/Desktop-zQLGYB_4/bin/python
Problem: When I run pipenv shell in my project directory, everything (i.e. Pipfile and Pipfile.lock) is written to my Desktop.
I also tried to use my requirements.txt with:
pipenv install -r requirements.text
Only to get the message:
Requirements file provided! Importing into Pipfile…
Requirements file doesn't appear to exist. Please ensure the file exists in your project directory or you provided the correct path.
Also, this is what displays when I try to activate pipenv in my desired directory:
(Desktop) bash-3.2$
Here is my directory layout. For clarity I have included the undesired Pipfiles on the Desktop:
Desktop
|-Pipfile
|-Pipfile.lock
|--projects/
| |--desired project/
| |------------------__init__.py
| |------------------main.py
| |------------------requirements.txt
| |------------------modules
Any help would be greatly appreciated
EDIT:
Worth mentioning: I was initially getting the error 'Pipenv: Command Not Found' after 'pip install pipenv', which, after some research, the following seemed to fix:
sudo -H pip install -U pipenv

Dockerfile Build Fails - Pipenv and Pyenv Multiple Versions of Python Found

Question: How can I fix the Dockerfile to properly freeze requirements.txt and successfully build?
I am working through deploying a Dockerfile of a Python script utilizing Pyenv and Pipenv for local development.
On the build step where the Pipfile.lock is frozen to requirements.txt, I receive the following error:
Error: Invalid value for "--python": Expected Python at path
/Users/jz/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3 does
not exist
My Dockerfile is:
FROM python:3.7
RUN pip install pipenv
COPY Pipfile* /tmp/
RUN cd /tmp && pipenv --python /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3 lock --requirements > requirements.txt
ENV RDKAFKA_INSTALL=system
RUN pip install -r /tmp/requirements.txt
COPY . /tmp/app/
RUN pip install /tmp/app/
CMD ["python", "./tmp/app/main.py"]
Creation of the local Pipenv environment provided this information about the interpreter (which was used in the Dockerfile):
Using /usr/local/opt/pyenv/versions/3.8.0/bin/python3 (3.8.0) to create virtualenv…
⠙ Creating virtual environment...
Using base prefix '/usr/local/opt/pyenv/versions/3.8.0'
New python executable in /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3
Also creating executable in /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python
Pyenv is using 3.8.0 locally:
pyenv versions
system
3.7.5
* 3.8.0 (set by /Users/x/Projects/quanter/.python-version)
Any help getting this working would be greatly appreciated! Thank you.
The original error comes from the fact that pipenv cannot find a Python executable at the given path. The path is hardcoded, but on every fresh build pipenv creates an environment with a different name. Take a look at this interesting discussion: How does pipenv know the virtualenv for the current project? The name could in principle be predicted, but there is a better solution.
Firstly, you do not need to specify a path to the Python executable at all, because the python image comes with only one system Python installation, which is available by default as python (of course you can create additional environments with other Python versions by hand).
Secondly, you can handle pipenv inside Docker in a nicer, more idiomatic way, instead of converting the pipenv setup into a pip-style flow. Take a look at the example below:
FROM python:3.7
COPY Pipfile /
COPY Pipfile.lock /
RUN pip3 install pipenv \
&& pipenv install --system --deploy --ignore-pipfile
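For completeness, the remaining pieces of the original Dockerfile (the RDKAFKA_INSTALL variable, copying the application, and the CMD line) can sit around that; a possible combined sketch, keeping the questioner's paths, might look like:
FROM python:3.7
ENV RDKAFKA_INSTALL=system
COPY Pipfile Pipfile.lock /
RUN pip3 install pipenv \
    && pipenv install --system --deploy --ignore-pipfile
COPY . /tmp/app/
RUN pip install /tmp/app/
CMD ["python", "/tmp/app/main.py"]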
Several notes worth considering:
Commit Pipfile.lock into VCS. That way you can be sure that every environment gets exactly the same versions of packages and their dependencies.
You may be interested in the --system option. It tells pipenv to install dependencies into the system Python instead of a virtualenv (which is what you want inside a container). According to this answer, https://stackoverflow.com/a/55610857/9926721, it is not officially recommended, but the quite popular project wemake-django-template uses it as well.
Consider splitting up environments. There is a useful --dev flag for this, so you no longer need several requirements files; see the sketch after this list.
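As a minimal sketch of that last point (package names are only examples): runtime dependencies go into [packages], development-only tools into [dev-packages], and the --dev flag pulls in both.
pipenv install requests          # added to [packages]
pipenv install --dev pytest      # added to [dev-packages]
pipenv install --dev             # on another machine: install runtime and dev dependencies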

Can I create a virtualenv after making the mistake of installing all project packages globally?

So I am a Python newb, like first-project-ever newb. I jumped right in, disregarding virtualenv, and installed everything globally. Now I need to be able to share my project with other team members.
Can I create a virtualenv after making the mistake of installing all project packages globally?
I am using python 3. I've read these links:
pip installing in global site-packages instead of virtualenv
How to import a globally installed package to virtualenv folder
But I don't think that's what I'm looking for. I want to go the requirements.txt route, I think.
Any guidance or suggestions?
Yes, you can create a virtual env.
You can create a requirements.txt file for the packages you installed globally.
pip3 freeze > requirements.txt
and then you can use this requirements file for installing all the packages in the virtual env which will be isolated from your global environment.
First you need to install virtualenv:
pip3 install virtualenv
Create a new virtual env using the following command:
virtualenv -p python3 envname
You can activate the virtualenv by:
source /path/to/new/virtual/environment/bin/activate
To deactivate the environment and return to your local environment just run:
deactivate
Install the requirements from the file.
cat requirements.txt | xargs -n 1 pip3 install
This should install all your packages in the virtual environment.
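If you prefer a single step and don't need the one-package-at-a-time behaviour (which lets the rest continue if one pin fails to install), the more common form is:
pip3 install -r requirements.txt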
To check which Python you are using, run which python; to check the installed packages, run pip3 list.
I hope this will clear your doubt.

Save all currently installed packages in anaconda to a file

I want to make a .txt file with all the Python packages my environment is using, and include it in the git repo, so that anyone who needs to run the code can just make an environment, install all the packages from the list, and use it.
I have two questions. First, how can I create that .txt file with all the installed packages? Second, how can someone with the .txt file install everything from it (using pip3?) in their fresh anaconda environment?
After activating your environment, you can do this:
pip freeze > requirements.txt
And to install all these packages in a fresh environment:
pip install -r requirements.txt
Hope this helps!
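Since the question mentions an Anaconda environment, there is also a conda-native route worth knowing, if you want to capture conda-installed packages as well (the file name is just the conventional one):
conda env export > environment.yml
conda env create -f environment.yml
The pip freeze / pip install -r pair only covers packages that pip itself manages inside the environment.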
