Process for building a package to be managed by an offline conda/puppet environment - puppet

I'm trying to build a package to be managed by an offline conda environment in Linux. I'm doing a dry run with py4j.
On my online build server:
I download the py4j recipe and the source distribution (py4j-0.8.2.1.tar.gz), then copy both to the offline puppet server.
On my offline puppet server:
I tweak the recipe to point to my copy of the source distribution, then build and install:
$ conda build py4j
$ conda install --use-local py4j
$ conda index linux-64
conda index linux-64 writes the py4j configuration to repodata.json; I can see py4j in repodata.json. There's also a py4j-0.8.2.1-py27_0.json created under /opt/anaconda/conda-meta/.
We have a custom channel mapped to /srv/www/yum/anaconda_pkgs/
$ cat .condarc
channels:
- http://10.1.20.10/yum/anaconda_pkgs/
I can see that the py4j configuration is added to the following files:
./envs/_test/conda-meta/py4j-0.8.2.1-py27_0.json
./pkgs/cache/ef2e2e6cbda49e8aeeea0ae0164dfc71.json
./pkgs/py4j-0.8.2.1-py27_0/info/recipe.json
./pkgs/py4j-0.8.2.1-py27_0/info/index.json
./conda-bld/linux-64/repodata.json
./conda-bld/linux-64/.index.json
./conda-meta/py4j-0.8.2.1-py27_0.json
Can someone explain what each of these json files is supposed to do?
I can also see that there is a repodata.json and a .index.json in /srv/www/yum/anaconda_pkgs/linux-64 that were updated, but they don't contain an entry for py4j.
I manually copied my py4j-0.8.2.1.tar.gz into my custom repo (channel) in /srv/www/yum/anaconda_pkgs/linux-64.
I still can't do conda install --use-local py4j from host machines, or via puppet agent -t. I get the following:
err: /Stage[main]/Anaconda::Packages/Anaconda::Install_pkg[py4j]/Package[py4j]/ensure: change from absent to present failed: Execution of '/opt/anaconda/bin/conda install --yes --quiet py4j' returned 1: Fetching package metadata: ..
Error: No packages found in current linux-64 channels matching: py4j
You can search for this package on Binstar with
binstar search -t conda py4j

--use-local only searches the conda-bld/linux-64 channel. If you move the package to another local channel, you will need to add it to your ~/.condarc channels as a file:// URL.
Whenever you add a package to a local repo, you need to run conda index on that directory. This will regenerate the repodata.json file.
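For example, a minimal sketch of both steps, using the paths from the question (the conda-bld location is an assumption; adjust to your install):
# copy the *built* package from conda-bld, not the source tarball
$ cp /opt/anaconda/conda-bld/linux-64/py4j-0.8.2.1-py27_0.tar.bz2 /srv/www/yum/anaconda_pkgs/linux-64/
# regenerate repodata.json for the channel directory
$ conda index /srv/www/yum/anaconda_pkgs/linux-64
# make the channel visible to conda as a file:// URL
$ conda config --add channels file:///srv/www/yum/anaconda_pkgs/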

I'll answer your question about the various json files, but note that you really don't need to care about any of these.
./envs/_test/conda-meta/py4j-0.8.2.1-py27_0.json
This is a remnant from the build process. Once the package is built, it is installed into a _test environment so that the actions in the test section of your meta.yaml can be run. Each environment has a conda-meta directory that contains the metadata for each package installed in that environment.
./pkgs/cache/ef2e2e6cbda49e8aeeea0ae0164dfc71.json
Everything in the pkgs directory is a cache. This is a local cache of the channel repodata, so that conda doesn't have to redownload it when it is "fetching package metadata" if it hasn't changed.
./pkgs/py4j-0.8.2.1-py27_0/info/recipe.json
Again, this is a cache. When the py4j package is installed anywhere, it is extracted into the pkgs directory. Inside the package, in the info directory, is all the metadata for the package. This file is the metadata from the recipe that was used to create the package. Conda doesn't use this metadata anywhere; it is just included for convenience.
./pkgs/py4j-0.8.2.1-py27_0/info/index.json
This is the metadata of the package included in the package itself. It's what conda index will use to create the repodata.json.
./conda-bld/linux-64/repodata.json
This is the repo metadata for the special channel of packages you have built (the channel used with --use-local, and used by conda build automatically).
./conda-bld/linux-64/.index.json
This is a special cache file used internally by conda index.
./conda-meta/py4j-0.8.2.1-py27_0.json
This is similar to the first one. It's the environment metadata for the package that you installed into your root environment.
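If you want to poke around yourself, all of these are plain JSON and can be pretty-printed, for example:
$ python -m json.tool ./pkgs/py4j-0.8.2.1-py27_0/info/index.json
$ python -m json.tool ./conda-bld/linux-64/repodata.json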

Related

How to uninstall RStudio (server) under Ubuntu from source build

I was trying to install RStudio-v2022.07.1-554 (server) under Ubuntu 22.04 LTS (arm64). Because no binary exists for arm64, I had to build RStudio from source. After downloading the source (tag 2022.07.1-554) and installing all dependencies, I was able to build and install the binary.
However, after make install, I found out that the default CMAKE_INSTALL_PREFIX was set to /usr/local, not /usr/local/lib/rstudio-server as the INSTALL file claimed! Now all 1560 library and binary files of RStudio Server are spread out under /usr/local (thank you, RStudio team!):
/usr/local/./README.md
/usr/local/./INSTALL
/usr/local/./COPYING
/usr/local/./NOTICE
/usr/local/./SOURCE
/usr/local/./VERSION
...
When I tried make uninstall, I found out the makefile doesn't define any uninstall action. Fortunately, there is an install_manifest.txt file which lists all the files installed under /usr/local. All I could think of to "uninstall" RStudio is to use a shell script to loop through the file list and remove the files one by one.
Is there any better way to uninstall an RStudio Server compiled from source, other than manually deleting the files one by one?
Thank you in advance for your attention and reply.
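For what it's worth, a minimal sketch of the install_manifest.txt approach described in the question (review the list before deleting anything):
$ cd <build-directory>        # wherever cmake generated install_manifest.txt
$ less install_manifest.txt   # sanity-check the list first
$ sudo xargs -d '\n' rm -v < install_manifest.txt
$ sudo find /usr/local -type d -empty -delete   # optionally prune now-empty directories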

Build a debian package with user settings

I'm packaging a PyQt application for Linux as a .deb package, following the Debian maintenance guide.
The manual does a good job describing how to build the python binaries with debuild -b, and install global data files in /usr/share/<package>/ through the debian/install file. However, I don't see any mention of installing user settings files - cache files or files for configuration changes that the current user running the program might want to save.
As far as I understand, other programs usually save these in a hidden directory on the user's home path, e.g. Atom's user data in /home/<username>/.atom/.
The manual does mention conffiles. However, these seem to be globally installed. I'm also not sure they're suitable for config files that change frequently as a result of user actions, since package updates will attempt to resolve conflicts between new and existing conffiles.
Some other documentation mentions postinst scripts, but this seems potentially too complicated for something that should be common to many Debian packages?
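For what it's worth, the usual convention is that the application itself, rather than the package, creates per-user files on first run. A hypothetical sketch of that first-run logic (all names are placeholders):
# first-run logic inside the application, not in the .deb
CONF_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/myapp"   # "myapp" is a placeholder name
mkdir -p "$CONF_DIR"
# seed per-user settings from the globally installed defaults if missing
[ -f "$CONF_DIR/settings.ini" ] || cp /usr/share/myapp/settings.ini "$CONF_DIR/settings.ini"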

How to activate Linux virtualenv in Windows 10 [duplicate]

By mistake, I forgot to specify the WORKON_HOME variable before creating my virtual environments, and they were created in the /root/.virtualenvs directory. They worked fine, and I did some testing by activating a given environment and then running (env)$ pip freeze to see which modules were installed there.
So, when I discovered the WORKON_HOME path error, I needed to change the host directory to /usr/local/pythonenv. I created it, moved all the contents of /root/.virtualenvs to /usr/local/pythonenv, and changed the value of the WORKON_HOME variable. Now activating an environment using the workon command seems to work fine (i.e., the prompt changes to (env)$); however, if I run (env)$ pip freeze, I get a much longer list of modules than before, and it does not include the ones installed in that particular env before the move.
I guess that just moving the files and specifying another dir for WORKON_HOME variable was not enough. Is there some config where I should specify the new location of the host directory, or some config files for the particular environment?
Virtualenvs are not relocatable by default. You can use virtualenv --relocatable <virtualenv> to turn an existing virtualenv into a relocatable one and see if that works, but that option is experimental and not really recommended for use.
The most reliable way is to create new virtualenvs. Use pip freeze -l > requirements.txt in the old ones to get a list of installed packages, create the new virtualenv, and use pip install -r requirements.txt to install the packages in the new one.
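A minimal sketch of that workflow, using the paths from the question (env stands for each environment's name):
$ source /root/.virtualenvs/env/bin/activate      # old location
(env)$ pip freeze -l > /tmp/requirements.txt
(env)$ deactivate
$ virtualenv /usr/local/pythonenv/env             # recreate under the new WORKON_HOME
$ source /usr/local/pythonenv/env/bin/activate
(env)$ pip install -r /tmp/requirements.txt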
I used the virtualenv --relocatable feature. It seemed to work, but then I found a different Python version installed:
$ . VirtualEnvs/moslog/bin/activate
(moslog)$ ~/VirtualEnvs/moslog/bin/mosloganalisys.py
python: error while loading shared libraries: libpython2.7.so.1.0: cannot open shared object file: No such file or directory
Remember to recreate the same virtualenv tree on the destination host.

How to properly build package from sources

I'm using Ubuntu 18.04.
I want to modify and build a project and install it as a package, for example gstreamer1.5.
So I clone the repo, modify the code, and run ./autogen.sh and make install in the project folder. Why don't I see it in apt list then? Also, there are no files in /usr/lib/x86_64-linux-gnu/gstreamer-1.5/.
The reason I want it to behave like the original package is that I want to build another project that uses it (Kurento Media Server). I just want to remove some plugins I don't need, because they pull in other packages as deps that I cannot use.
apt list comes from the Linux distribution; your custom-made things won't appear there magically.
If you make install from your custom tree, your libraries and plugins will land in /usr/local/lib/.. (note the local path). You may have some control over this by setting the prefix path; just be careful you don't break your system by overwriting working libraries with broken ones.
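For example, to have the files land in the distribution's paths instead of /usr/local, something like this (a sketch; autogen.sh normally forwards options to ./configure, and the libdir value is an assumption based on the path in the question):
$ ./autogen.sh --prefix=/usr --libdir=/usr/lib/x86_64-linux-gnu
$ make
$ sudo make install   # this overwrites distribution files; see the warning above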

Create data files in pip3 editable install mode

I'm trying to install python package in editable mode with:
pip3 install -e ./
setup.py file contains:
data_files=[
    (os.path.expanduser("~") + "/.xxx", ["xxx/yyy.data"])
],
After installation the yyy.data file is not copied to .xxx folder.
Is there an option to create data files outside of the package folder when working in editable mode?
The truth is data_files has caveats. See:
- the "No single, complete solution for packaging data" issue on the list of Problems in Python Packaging,
- the note in the data_files section of the "Packaging and Distributing Projects" tutorial from the Python Packaging User Guide,
- pip's bug "All packages that contain non-package data are now likely installed in a broken way since 7.0.0", and
- wheel's bug "bdist_wheel makes absolute data_files relative to site-packages".
According to the information gathered from the above sources, your data was installed into the site-packages directory instead of your home directory as you were expecting.
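One way to confirm where the file actually ended up is to search the site-packages directories (a sketch; the exact paths come from the first command):
$ python3 -m site                                        # prints the site-packages directories in use
$ find /usr/lib/python3/dist-packages -name 'yyy.data'   # example path; substitute one printed above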
