Dependency file in python - python-3.x

I am new to Python. I am creating an API in Python using flask-restful. I have created APIs in Java. In Java we have the pom.xml file for dependencies... is there anything similar for Python and flask-restful?

Yes. In Python we generally create a requirements.txt file, so anyone who wants to install all the requirements can simply run the command
pip install -r requirements.txt
If you are using virtualenv, you can generate it with
pip freeze > requirements.txt
Otherwise you need to add all the dependencies manually, and the requirements.txt file will look like:
decorator==4.3.0
defusedxml==0.5.0
entrypoints==0.2.3
Flask==1.0.2
google==2.0.1
Note: this is just an example.
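As a minimal sketch of the virtualenv-based flow (the directory name venv is arbitrary):

```shell
# Create and activate an isolated environment so that "pip freeze"
# records only this project's packages, not everything on the machine
python3 -m venv venv
. venv/bin/activate
pip freeze > requirements.txt   # will be empty in a brand-new environment
deactivate
```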

I would recommend using pipenv.
In Java, you need to know where your library dependencies are located; they are usually downloaded once per project as you need them. In other words, each project has its own set of dependencies. The same is true for non-global NPM packages (package.json), Ruby gems (Gemfile), and so on.
But in Python, everything you install with pip is global by default. Anything you install with pip will make your system Python installation messy at best, and at worst will not be portable between developer machines. We get around this problem with the concept of a virtual environment, which is more or less a self-contained copy of whatever Python version you're using, dedicated to one project.
pipenv works pretty similarly to npm.
You initialise with pipenv --three and use pipenv install Flask (for example) to install packages and keep track of them in a Pipfile and a lock file (Pipfile.lock). You can then clone the project on another computer and run pipenv install to install all dependencies.
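For reference, the file pipenv maintains is simply named Pipfile and uses TOML syntax; a minimal one might look like this (the listed packages are illustrative):

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
flask = "*"

[dev-packages]
pytest = "*"

[requires]
python_version = "3.7"
```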
If this tool does not work for you, you may also want to try pyenv and virtualenv and use a requirements.txt file as suggested by Rahul.
Hope that helps!

Related

Will libraries in requirements.txt install its own dependencies?

I'm trying to create a requirements.txt for my project.
I got the file with pip freeze > requirements.txt but I have some questions about it.
Basically I have a ton of libraries on my PC, so I only want the ones for my project in requirements.txt.
If for example, I remove all libraries from the file and only leave awscli, will awscli dependencies install when the user runs pip install -r requirements.txt? Or will I need to keep every dependency of every library inside the file?
Yes, they will. However, based on your question, I assume you need to familiarize yourself with a concept called virtual environments.
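Because pip resolves transitive dependencies itself, a requirements.txt listing only your direct dependency is enough. A sketch (the packages named in the comment are examples of what pip would pull in on its own):

```text
# requirements.txt — list only direct dependencies;
# "pip install -r requirements.txt" will also fetch awscli's own
# dependencies (botocore, s3transfer, ...) automatically
awscli
```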

Dockerfile Build Fails - Pipenv and Pyenv Multiple Versions of Python Found

Question: How can I fix the Dockerfile to properly freeze requirements.txt and successfully build?
I am working through deploying a Dockerfile of a Python script utilizing Pyenv and Pipenv for local development.
On the build step where the Piplock file is frozen to requirements.txt, I receive the following error:
Error: Invalid value for "--python": Expected Python at path /Users/jz/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3 does not exist
My Dockerfile is:
FROM python:3.7
RUN pip install pipenv
COPY Pipfile* /tmp/
RUN cd /tmp && pipenv --python /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3 lock --requirements > requirements.txt
ENV RDKAFKA_INSTALL=system
RUN pip install -r /tmp/requirements.txt
COPY . /tmp/app/
RUN pip install /tmp/app/
CMD ["python", "./tmp/app/main.py"]
Creation of the local Pipenv environment provided this information about the interpreter, (which was used in the Dockerfile):
Using /usr/local/opt/pyenv/versions/3.8.0/bin/python3 (3.8.0) to create virtualenv...
Creating virtual environment... Using base prefix '/usr/local/opt/pyenv/versions/3.8.0'
New python executable in /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python3
Also creating executable in /Users/x/.local/share/virtualenvs/quanter-TP0oWHoL/bin/python
Pyenv is using 3.8.0 locally:
pyenv versions
system
3.7.5
* 3.8.0 (set by /Users/x/Projects/quanter/.python-version)
Any help getting this working would be greatly appreciated! Thank you.
The original error occurs because pipenv cannot find a Python executable at the given path. The path is hardcoded, but on every fresh build pipenv creates an environment with a different name. Take a look at this interesting discussion: How does pipenv know the virtualenv for the current project? It seems like the name could be predicted, but there is a better solution.
Firstly, you do not need to specify a path to the Python executable at all, because the python Docker image comes with only one system Python installation, i.e. it is available by default as python (of course, you can create additional environments with different Python versions by hand).
Secondly, you can handle pipenv in a Docker environment in a much nicer way, instead of converting the pipenv flow into a pip-style flow. Take a look at the example below:
FROM python:3.7
COPY Pipfile /
COPY Pipfile.lock /
RUN pip3 install pipenv \
&& pipenv install --system --deploy --ignore-pipfile
Several notes worth considering:
Commit Pipfile.lock into VCS. That way, you can be sure that every environment has exactly the same versions of packages and their dependencies.
You may be interested in the --system option. It tells pipenv to install dependencies into the system Python instead of a virtualenv, which is what you want inside a container. According to this answer: https://stackoverflow.com/a/55610857/9926721 it is not officially recommended, but the quite popular project wemake-django-template uses it as well.
Consider splitting up environments. The --dev flag is useful here: no more juggling several requirements files, and development-only dependencies are handled more cleanly.
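As a sketch of that split, assuming test/lint tools are declared under [dev-packages] in the Pipfile, a production image installs runtime packages only while a CI image adds --dev:

```dockerfile
FROM python:3.7
COPY Pipfile Pipfile.lock /
RUN pip3 install pipenv \
    # production image: runtime dependencies only
    && pipenv install --system --deploy --ignore-pipfile
# for a CI/test image, use instead:
# RUN pipenv install --system --deploy --ignore-pipfile --dev
```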

Correctly patching Python open source package without package clashing

I debated which Stackoverflow site this best fit but couldn't decide.
I'd like to contribute to an open-source project on Github, but I can't figure out how to prevent the stable version already installed on my machine and the development version I'd like to make a patch for from clashing on import.
The repository only suggests pip installing with the editable flag.
What I've done so far is clone the repository locally and then try to import it in a Jupyter Notebook from the directory above. However, the Jupyter Notebook references the stable version installed earlier with pip. I tried appending the child directory holding the package to sys.path, but I still have the same issue. I can't seem to get relative imports working either. Do I need to uninstall the stable version?
Any tips are appreciated!
You'd use virtualenv for this. It allows you to create an environment that is isolated from your system Python, and you can install the dev version of the library into it.
Basic usage (for Unix-like systems) is:
$ pip install virtualenv
$ virtualenv MY_ENV
$ cd MY_ENV
$ source bin/activate # activates the local python for this shell only
(MY_ENV)$ pip install <some-module> # installs to a local and isolated python
(MY_ENV)$ python ... # runs python in the local environment
(MY_ENV)$ deactivate # disable the isolated python
$
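Once the isolated environment is active and you've installed the cloned repo into it (pip install -e . in the repo directory makes imports resolve to your working copy), you can confirm which copy Python is actually importing by checking __file__. A quick probe, using a stdlib module as a stand-in for the package you're debugging:

```python
# Quick check: which copy of a module is Python actually importing?
import json  # stand-in for the package you're debugging

# The path shows whether the import came from the virtualenv's
# site-packages, the system installation, or a local checkout
print(json.__file__)
```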

Save all currently installed packages in anaconda to a file

I want to make a .txt file with all the python packages my environment is using, and include it in the git repo, so that anyone who needs to run the code, can just make an environment, install all the packages from the list and use it.
I have two questions, first, how can I create that .txt files with all the installed packages? Second, how can someone with the .txt file install everything from it (using pip3?) in their fresh anaconda environment?
After activating your environment, you can do this:
pip freeze > requirements.txt
And to install all these packages in a fresh environment:
pip install -r requirements.txt
Hope this helps!
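Since the environment here is an anaconda one, a conda-native alternative is conda env export > environment.yml on your machine and conda env create -f environment.yml on theirs; the exported file looks roughly like this (names and versions are illustrative):

```yaml
name: myproject          # hypothetical environment name
dependencies:
  - python=3.8
  - numpy=1.18.1
  - pip
  - pip:
      - flask==1.0.2     # pip-installed packages appear in this subsection
```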

Add package to Conda Environment from site-packages

This might be a fundamental misunderstanding of Python packages, but I could use some help or directions to the right resources.
I have an egg file in my Python 3.6 site-packages directory; I'll call it package.egg. When I run python from the command line I can use modules from that package. However, when I created a new PyCharm project and a corresponding Conda environment, I could no longer use that package (for obvious reasons). Simply copying the package.egg file into the project environment's site-packages directory doesn't seem to work, though.
Is there another process, like unzipping that I have to perform before I can call those modules?
I tried running both pip install ./package.egg and conda install ./package.egg
Thank you.
I was able to resolve this by reinstalling the package in the context of the new environment. In this case, through the PyCharm terminal, I went back to the package's setup.py file and ran it. That installed the package into the correct location in my conda environment.
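Before re-running setup.py, it's worth confirming that the terminal is actually using the new environment's interpreter, since an install goes into whichever environment the running Python belongs to:

```python
import sys

# Installs land in the environment this interpreter belongs to,
# so check these before installing anything
print(sys.executable)  # path of the current Python interpreter
print(sys.prefix)      # root directory of the active environment
```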
