I'm trying to install a Python package in editable mode with:
pip3 install -e ./
setup.py file contains:
data_files=[
(os.path.expanduser("~") + "/.xxx", ["xxx/yyy.data"])
],
After installation, the yyy.data file is not copied to the .xxx folder.
Is there an option to create data files outside of the package folder when working in editable mode?
The truth is that data_files has caveats. See: the "No single, complete solution for packaging data" issue on the list of Problems in Python Packaging; the note in the data_files section of the "Packaging and Distributing Projects" tutorial from the Python Packaging User Guide; pip's bug "All packages that contain non-package data are now likely installed in a broken way since 7.0.0"; and wheel's bug "bdist_wheel makes absolute data_files relative to site-packages".
According to the information gathered from the above sources, your data was installed into the site-packages directory instead of your home directory, as you were expecting.
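Since data_files cannot be relied on to place files outside site-packages (and editable installs handle it differently again), a common workaround is to ship the file inside the package and copy it into the home directory at first use. A minimal sketch, assuming the packaged file is reachable as a path on disk (the helper name ensure_user_data is made up for illustration; locating the in-package copy is typically done with importlib.resources):

```python
import pathlib
import shutil

def ensure_user_data(src_file, dirname=".xxx"):
    """Copy a packaged data file into ~/<dirname> on first use (sketch)."""
    target_dir = pathlib.Path.home() / dirname
    target_dir.mkdir(exist_ok=True)
    target = target_dir / pathlib.Path(src_file).name
    if not target.exists():
        shutil.copy(src_file, target)
    return target
```

Called at program startup, this creates ~/.xxx/yyy.data on the first run regardless of how the package was installed.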
Related
I have a device with Python 3.7 pre-installed, without pip or any packages.
I created the program on my local machine with some packages in my venv (I have a requirements.txt file) and it works perfectly.
My problem is that now I want to create a directory with my programs and upload it to my device. This doesn't work because I don't have the additional packages installed there.
My question: is there a way to export the installed packages into a directory alongside my program files and import them from there rather than from the venv?
Copy all the venv modules to some directory and modify the PYTHONPATH variable when running your program, appending your modules directory's path to it.
man python3
PYTHONPATH
Augments the default search path for module files. The format is the same as the shell's $PATH: one or more directory pathnames separated by colons. Non-existent directories are silently ignored. The default search path is installation dependent, but generally begins with ${prefix}/lib/python<version> (see PYTHONHOME above). The default search path is always appended to $PYTHONPATH. If a script argument is given, the directory containing the script is inserted in the path in front of $PYTHONPATH. The search path can be manipulated from within a Python program as the variable sys.path.
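A minimal sketch of that approach (the /tmp/vendor_demo path and the module name vendored_mod are made up for illustration; use whatever directory you copied the venv modules into):

```shell
# Create a directory of "vendored" modules and point PYTHONPATH at it.
mkdir -p /tmp/vendor_demo
echo "VALUE = 42" > /tmp/vendor_demo/vendored_mod.py

# The module is now importable without any installation step:
PYTHONPATH="/tmp/vendor_demo:$PYTHONPATH" python3 -c "import vendored_mod; print(vendored_mod.VALUE)"
# prints 42
```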
In general, you have the following options to run a Python script on a device other than the one you developed it on:
Generate an executable (for example with the package pyinstaller). With that solution, it is not required to have Python installed on your device, as everything is embedded in the executable.
If you have Python installed on the device (as in your case), you can just run it there. However, if you have dependencies (from PyPI or Conda), you must also install them on your device:
If you have internet access and your requirements.txt file, you can just run pip install -r requirements.txt
If you don't have internet access, you can either:
download the wheel for each package and then ship it to your device, or
just ship to your device the contents of the lib and lib64 folders of the virtual environment folder .venv of your local machine (I hope you are using one: python -m venv .venv) into the virtual environment on your device
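For the wheel route, a sketch of the two-step workflow (assuming pip is available on both machines and requirements.txt is the file from the question):

```shell
# On the machine WITH internet access: download wheels for every
# requirement into a local "wheelhouse" directory.
pip download -r requirements.txt -d ./wheelhouse

# After copying ./wheelhouse and requirements.txt to the offline device:
# install purely from the local directory, never touching the network.
pip install --no-index --find-links ./wheelhouse -r requirements.txt
```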
I have created a Python package and would like to distribute it on PyPI (https://pypi.org/project/catapi.py/). My initial v0.1.1 upload worked without issue. I decided to add a subdirectory to store abstract classes because there was a lot of code reuse. Upon uploading this to PyPI and installing, I get the message that the abc module does not exist.
I did some research and found that I must include the subdirectory in the MANIFEST.in file, so I did. Upon uploading and attempting an install again, I get the same error. I downloaded the package directly and extracted the files to find the abc directory does indeed exist. Next I checked the site-packages version of catapi only to find it does not have the abc module.
Has anyone encountered this and know how to fix this? Here's a script to show the issue
# make a temp dir to hold this in
mkdir catapi
cd catapi
# Prepare python venv
python -m venv env-catapi
source env-catapi/bin/activate
pip install catapi.py==0.3.4
# Download file for comparison
wget https://files.pythonhosted.org/packages/ac/ee/044c1cc53e7c994fe4a7d57362651da8adff54eb34680c66f62a1d4fb57d/catapi.py-0.3.4.tar.gz
tar -xvf catapi.py-0.3.4.tar.gz
diff catapi.py-0.3.4/catapi env-catapi/lib/python3.8/site-packages/catapi
deactivate
cd ../
# Prints out
# Only in catapi: abc
# Only in env-catapi/lib/python3.8/site-packages/catapi: __pycache__
It's necessary to add the sub-directories to the
packages=['package1', 'package2', 'etc']
part of setup.py. In my case, I had to add the abc directory to have it placed in the catapi install:
packages=['catapi', 'catapi.abc'],
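Alternatively, setuptools.find_packages() discovers subpackages automatically, provided each directory contains an __init__.py. A quick sketch of what it finds for a layout like the question's:

```python
import os
import tempfile
from setuptools import find_packages

# Build a throwaway package tree mirroring the question: a 'catapi'
# package with an 'abc' subpackage, each containing an __init__.py.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "catapi", "abc"))
for pkg in ("catapi", os.path.join("catapi", "abc")):
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

print(sorted(find_packages(where=root)))  # ['catapi', 'catapi.abc']
```

If the __init__.py in the subdirectory is missing, find_packages() silently skips it, which produces exactly the "installed but missing subpackage" symptom described above.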
Condensed version of what I want to achieve:
Create .rpm and .deb packages from my source.py source code and make sure all dependencies get resolved when installing them on a deb/rpm based Linux distribution.
More details:
Let's assume I have created a piece of software which is located in a folder structure like this:
MyProgram Folder
    MyProgram Folder
        img Folder
            logo.ico File
        media Folder
            head.txt File
        __init__.py File
        source.py File
        a.py File
    LICENSE File
    README.md File
    setup.py File
The file setup.py contains the following:
import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

setuptools.setup(
    name="MyProgram",
    version="0.0.1",
    author="First Last",
    author_email="email@memore.com",
    description="A tool to create nice things",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://google.com",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    python_requires='>=3.7',
    data_files=[
        ('.../MyProgram/img/logo.ico'),
        ('.../MyProgram/media/head.txt'),
    ],
)
I now run
python setup.py sdist bdist_rpm
from a cmd line under '.../MyProgram'. Two folders, 'dist' and 'build', are created, as well as 'MyProgram.tar.gz' and two rpm's, 'MyProgram-noarch.rpm' and 'MyProgram-src.rpm'.
When I try to install 'noarch.rpm' under Fedora 31, the process ends successfully but no "shortcut" is created, and when I type MyProgram in a cmd line it is not found.
rpm -ql MyFilter
does find it and outputs a bunch of paths:
/usr/lib/python3.7/site-packages/MyProgram/...
/usr/lib/python3.7/site-packages/MyProgram/source.py
/usr/lib/python3.7/site-packages/MyProgram/a.py
....
Which tells me that my installation has at least copied the basic filesystem. But I also see that all the original .py files are still .py files.
My questions:
How can I 'make' the rpm so that all dependencies are contained inside the rpm, or at least get resolved by dnf/apt/yum when installing the rpm? In other words: is it possible to bundle all dependencies into an rpm/deb, like in an .exe for example?
How can I specify a path like '/usr/bin' or '/usr/share' as the installation target dir?
How can I add a launcher app bundled into the rpm/deb?
Is the above a good way of doing this at all?
If the solution to this is trivial and I just overlooked it, I am really sorry to bother you, but at the moment I just can't see it. Sites that have relevant information and that I have already reviewed:
https://docs.python.org/2.0/dist/creating-rpms.html
https://github.com/AppImage/AppImageKit/wiki/Bundling-Python-apps
Python 3.5 create .rpm with pyinstaller generated executable
https://github.com/junaruga/rpm-py-installer
https://www.pyinstaller.org/
https://packaging.python.org/overview/#python-source-distributions
https://packaging.python.org/overview/
https://pyinstaller.readthedocs.io/en/stable/usage.html
https://pyinstaller.readthedocs.io/en/stable/installation.html
https://python-packaging-tutorial.readthedocs.io/en/latest/setup_py.html
Just my two cents, rather than a complete answer. Will mostly touch on RPM packaging.
The bdist_rpm option seems easy, but you have little control over the logic of the .spec file it generates/uses and cannot do fancy stuff like scriptlets, etc.
That is, unless you take the approach of having it generate the .spec file and quit (instead of building final RPM). From the docs:
If you wish, you can separate these three steps. You can use the --spec-only option to make bdist_rpm just create the .spec file and exit; in this case, the .spec file will be written to the “distribution directory”—normally dist/, but customizable with the --dist-dir option. (Normally, the .spec file winds up deep in the “build tree,” in a temporary directory created by bdist_rpm.)
But as a matter of preference and consistency, I would advise on following distro-specific guidelines for packaging Python apps.
In that way, you will be more in line with the distro's you are building for.
It is not the easiest way though. You will have to shift through some docs. Basically, if you're building for anything CentOS/RHEL, Fedora guidelines for packaging should be observed.
You can find the extra reference here, with the example .spec file for building both Python 2 and 3 versions of the same app.
For this whole 'build like a distro' thing, you would definitely want to look into using mock for the job, to build your package in a chroot.
As for the "shortcut" issue, you have to have your setup.py declare some console scripts for it to create one when you install your package. E.g. from lastversion's setup.py:
entry_points={"console_scripts": ["lastversion = lastversion:main"]},
This entry will result in a "binary" lastversion created/installed (which runs the defined function) when you install your Python package.
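Applied to the MyProgram question above, a minimal sketch could look like the following (the launcher name myprogram and the target MyProgram.source:main are hypothetical, assuming source.py defines a main() function; script_args=["--name"] just makes the sketch runnable standalone instead of requiring a real "python setup.py install"):

```python
import setuptools

dist = setuptools.setup(
    name="MyProgram",
    version="0.0.1",
    entry_points={
        "console_scripts": [
            # on install, generates a 'myprogram' executable that calls main()
            "myprogram = MyProgram.source:main",
        ]
    },
    # query-only invocation so this sketch runs without side effects
    script_args=["--name"],
)
```

On a real install, the generated myprogram script lands in the scripts directory (e.g. /usr/bin), which is exactly the path you then list under %files.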
Subsequently, in the spec files, the macro %py2_install will make use of setup.py to create the same launcher program.
And you will then be able to ensure that the launcher is packaged by placing it in the %files section of the spec file:
%files -n python3-myapp
%license COPYING
%doc README.rst
%{python3_sitelib}/%{srcname}/
%{python3_sitelib}/%{srcname}-*.egg-info/
%{_bindir}/myapp
I'm trying to build a package to be managed by an offline conda environment in Linux. I'm doing a dry run with py4j.
On my online build server:
I download the py4j recipe
and download the source distribution (py4j-0.8.2.1.tar.gz)
then copy the recipe and the source distribution to the offline puppet server
On my offline puppet server:
tweak the recipe to point to my copy of the source distribution
$ conda build py4j
$ conda install --use-local py4j
$ conda index linux-64
conda index linux-64 writes the py4j configuration to repodata.json. I can see py4j is in repodata.json. And there's also a py4j-0.8.2.1-py27_0.json created under /opt/anaconda/conda-meta/
We have a custom channel mapped to /srv/www/yum/anaconda_pkgs/
$ cat .condarc
channels:
- http://10.1.20.10/yum/anaconda_pkgs/
I can see that py4j configuration is added to the following files:
./envs/_test/conda-meta/py4j-0.8.2.1-py27_0.json
./pkgs/cache/ef2e2e6cbda49e8aeeea0ae0164dfc71.json
./pkgs/py4j-0.8.2.1-py27_0/info/recipe.json
./pkgs/py4j-0.8.2.1-py27_0/info/index.json
./conda-bld/linux-64/repodata.json
./conda-bld/linux-64/.index.json
./conda-meta/py4j-0.8.2.1-py27_0.json
Can someone explain what each of these json files is supposed to do?
I can also see that there is a repodata.json and .index.json in /srv/www/yum/anaconda_pkgs/linux-64 that were updated but don't have a configuration for py4j.
I manually copied my py4j-0.8.2.1.tar.gz into my custom repo (channel) in /srv/www/yum/anaconda_pkgs/linux-64.
I still can't do conda install --use-local py4j from host machines or puppet agent -t. I get the following:
err: /Stage[main]/Anaconda::Packages/Anaconda::Install_pkg[py4j]/Package[py4j]/ensure: change from absent to present failed: Execution of '/opt/anaconda/bin/conda install --yes --quiet py4j' returned 1: Fetching package metadata: ..
Error: No packages found in current linux-64 channels matching: py4j
You can search for this package on Binstar with
binstar search -t conda py4j
--use-local only searches the conda-bld/linux-64 channel. If you move the package to another local channel, you will need to add it to your ~/.condarc channels as a file:// url.
Whenever you add a package to a local repo, you need to run conda index on that directory. This will regenerate the repodata.json file.
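For example, assuming the channel directory from the question, the local channel can be referenced in ~/.condarc with a file:// url:

```
channels:
  - file:///srv/www/yum/anaconda_pkgs
```

Then, after copying each new package into /srv/www/yum/anaconda_pkgs/linux-64, rerun conda index on that directory so repodata.json picks it up.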
I'll answer your question about the various json files, but note that you really don't need to care about any of these.
./envs/_test/conda-meta/py4j-0.8.2.1-py27_0.json
This is a remnant from the build process. Once the package is built, it is installed into a _test environment so that the actions in the test section of your meta.yaml can be run. Each environment has a conda-meta directory that contains the metadata for each package installed in that environment.
./pkgs/cache/ef2e2e6cbda49e8aeeea0ae0164dfc71.json
Everything in the pkgs directory is a cache. This is a local cache of the channel repodata, so that conda doesn't have to redownload it when it is "fetching package metadata" if it hasn't changed.
./pkgs/py4j-0.8.2.1-py27_0/info/recipe.json
Again, this is a cache. When the py4j package is installed anywhere, it is extracted into the pkgs directory. Inside the package, in the info directory, is all the metadata for the package. This file is the metadata from the recipe that was used to create the package. Conda doesn't use this metadata anywhere; it is just included for convenience.
./pkgs/py4j-0.8.2.1-py27_0/info/index.json
This is the metadata of the package included in the package itself. It's what conda index will use to create the repodata.json.
./conda-bld/linux-64/repodata.json
This is the repo metadata for the special channel of packages you have built (the channel used with --use-local, and used by conda build automatically).
./conda-bld/linux-64/.index.json
This is a special cache file used internally by conda index.
./conda-meta/py4j-0.8.2.1-py27_0.json
This is similar to the first one. It's the environment metadata for the package that you installed into your root environment.
I am trying to create a Python module but a .pyc file is not created. I am working on Windows. I execute 2 commands in a command prompt (which I run as administrator) as follows:
c:\Python33\python.exe setup.py sdist
After this statement is executed, in the same window I execute:
c:\Python33\python.exe setup.py install
I don't know what I am missing.
Python 3.x doesn't create .pyc files in the same directory. This is part of Python 3's __pycache__ scheme: your .pyc files are instead stored in a __pycache__ subdirectory, tagged with the version of Python that compiled them.
This was mentioned in Brett Cannon's talk at PyCon 2013.
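You can see the behaviour directly with the standard library's py_compile module; the compiled file lands in __pycache__ next to the source, not beside it (the temp path and module name here are arbitrary):

```python
import pathlib
import py_compile
import tempfile

# Compile a tiny module and show where Python 3 actually puts the .pyc:
# in a __pycache__ subdirectory, with a version-tagged file name.
src = pathlib.Path(tempfile.mkdtemp()) / "mymod.py"
src.write_text("X = 1\n")

pyc = py_compile.compile(str(src))
print(pyc)  # .../__pycache__/mymod.cpython-XY.pyc
```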