Dockerfile Updates for a Python Project - python-3.x

I currently have a Dockerfile which installs Python libraries in the container that I eventually use to execute code. For every release, I need to add or update the dependencies, which results in rebuilding the image.
The issue is that while rebuilding, many internal transitive libraries create version conflicts that affect my functionality. For example, some library can pull in a new numpy version which causes issues in the code.
How should I handle this problem? Should I create a new base image for every release and update it in the Dockerfile?
Edit: Caching does not help me, because the moment my requirements.txt file changes, a rebuild will happen.
Also, I can't specify versions for all libraries; the transitive dependencies are the challenge here.

This is not related to Docker. You can either pin the package versions in requirements.txt or use Poetry to manage the dependencies. Poetry uses a lock file, which makes sure the proper version is installed for every dependency, including transitive ones.
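One possible way to pin the transitive dependencies without hand-maintaining every version is pip-tools: keep only the direct dependencies in a requirements.in file and compile a fully pinned requirements.txt from it. A minimal sketch (the package names are just examples, not taken from the question):

# requirements.in -- direct dependencies only
pandas>=1.5
requests

pip install pip-tools
pip-compile requirements.in --output-file requirements.txt

The generated requirements.txt pins every package, including transitive ones such as numpy. If the Dockerfile installs from that pinned file (RUN pip install --no-cache-dir -r requirements.txt), every rebuild resolves to the same versions until you deliberately re-run pip-compile.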

Related

How to import and use a modified npm library package dynamically

I am using the sigma.js library for creating node-based graph visualisations. The library package had a few bugs, so I modified a few files in the library's source code and fixed them.
I have hosted my graphs on a Django server, and whenever I host it, the sigma package listed in package.json gets loaded dynamically each time. The static library files on my machine which I had modified and fixed don't get loaded, so I get the same old package and not the modified one.
How do I use the modified library package when I host the server?
My advice is to copy your fixed version of the library onto the server and install it from the local path instead of the remote npm registry, like this:
npm install --save /path/to/fixed/lib/dir/in/server
See this answer: npm local install
Pay attention: your fixed lib won't stay in sync with the official one.
I don't know how you modified the library, but I suggest forking the official repository and synchronizing your local copy with the remote one, as explained for example here: sync forked repo github.
This way you can keep in sync with the official repo while maintaining your fix, and you install your modified local copy. Also consider opening issues and a PR on the official sigma.js repo to apply your fix directly to the official library. If they are accepted, you can then install the official version directly.
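If copying files onto the server by hand is awkward, npm can also install a dependency directly from a git fork. A hedged sketch of the package.json entry (the fork URL and branch name are hypothetical placeholders):

"dependencies": {
    "sigma": "git+https://github.com/your-user/sigma.js.git#your-fix-branch"
}

npm install then clones that branch instead of fetching the package from the registry, so the fixed version is the one that gets loaded.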

How do I properly handle private python package dependencies?

FYI, I'm quite new to Python, and its packaging and dependency tools seem confusing.
I am going to be writing a series of Python packages that support DAGs running in Apache Airflow. As these packages share some common functionality, I want to extract the commonalities into separate supporting modules. In turn, these supporting modules will rely on at least two other supporting modules. All of the modules/packages in question will be published as source distributions on an internal repository.
Is there a way for me to install the main packages such that all of the direct and indirect dependencies are installed from the private repo?
I have made use of install_requires in setup.py to install modules available via PyPI, and it seems like I could do something similar to achieve my goal; however, this could get messy when I need to update, say, the version of an indirect dependency. Is there a better way to handle this? Would adding the dependencies to requirements.txt with an --extra-index-url argument be a valid approach?
The hierarchy of dependencies can be represented loosely as:
MainPackage
-> SupportingPackage
-> CommonUtilites
It is possible to use a git repository as a Python package source.
Just add git+{REPO_LINK}@{TAG_OR_SHA1} to requirements.txt and run pip install -r requirements.txt.
See How to add git source in requirements.txt.
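For the private-repository part of the question, requirements.txt can also point pip at an internal index in addition to PyPI. A minimal sketch, with the index URL and package names as placeholders:

--extra-index-url https://pypi.internal.example.com/simple/
MainPackage==1.2.0
# or pin a supporting package straight to a git tag on the internal server:
# git+https://git.internal.example.com/team/SupportingPackage.git@v0.3.1#egg=SupportingPackage

Running pip install -r requirements.txt then resolves MainPackage from the extra index as well as from PyPI, and if the internal packages declare their own dependencies in install_requires, the indirect ones (SupportingPackage, CommonUtilites) are pulled from the same index too.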

How to best automate deployment of NPM-dependent project?

I'm used to deploying code that depends on Composer (PHP's npm cousin), which uses .json and .lock files. The first one describes the package and your version constraints, and the second one lists exactly what was installed. Whenever there's a lock file and you run composer install, you're sure to receive the same set of packages; running composer update re-reads the json file, installs new versions, and updates the lock file.
That's awesome for production deployment, since you don't need to check your dependencies into your version control system and you're sure to have the exact same set of dependencies in production as you have in development.
My question is: how do I best automate deployment of npm-dependent code? Is it possible to achieve something similar to Composer? I've noticed that npm install only installs what's listed in the package.json file. After the first run, i.e. if you change a version constraint, you must manually npm update that package - and that would render automated deployment useless, as there's no way to record in version control "update this package here to a new version"...
npm shrinkwrap is the analog of the composer.lock file. It generates an npm-shrinkwrap.json that records all dependencies with their exact versions, so you can use it to deploy to a production environment. You can also try various libraries from npm that lock versions or check for available updates without changing package.json.
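A minimal sketch of that workflow (recent npm versions also write a package-lock.json automatically, which serves the same purpose):

npm install                  # resolve and install per package.json
npm shrinkwrap               # write npm-shrinkwrap.json with the exact installed versions
git add npm-shrinkwrap.json  # commit the lock file, not node_modules

# on the production machine:
npm install                  # reproduces the exact tree recorded in npm-shrinkwrap.json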

Adding dependencies from a single file, without composer.json

I am struggling with an incorrect usage of Composer, for sure.
I set up this repository: https://github.com/alle/assets-merger
I forked the project and was just trying to make it a Kohana module, including all the dependencies.
Since it needs the YUI Compressor JAR, I was trying to declare just that JAR file as a dependency, and I ended up declaring it in the composer.json file (please, look at this).
Once I need to add my new package to a project I add it in the require section as follows:
...
"alle/assets-merger": "dev-master",
...
But the (latest) composer update command says:
Loading composer repositories with package information
Updating dependencies (including require-dev)
Your requirements could not be resolved to an installable set of packages.
Problem 1
- Installation request for alle/assets-merger dev-develop -> satisfiable by alle/assets-merger[dev-develop].
- alle/assets-merger dev-develop requires yui/yuicompressor 2.4.8 -> no matching package found.
Potential causes:
- A typo in the package name
- The package is not available in a stable-enough version according to your minimum-stability setting see <https://groups.google.com/d/topic/composer-dev/_g3ASeIFlrc/discussion> for more details.
And my story ends here.
How should I configure my composer.json in the https://github.com/alle/assets-merger repository, in order to include it as a fully satisfied kohana-module in other projects?
Some things I notice in your composer.json.
There is a version of that CSS minifier available on Packagist which says it is just a copy of the original Google Code-hosted files, but with Composer support: natxet/cssmin. It is version 3.0.2, but I think that shouldn't make a difference.
mrclay/minify is included twice in the packages with the same version. It is also available on Packagist. You will probably already be using that (version 2.2.0 is registered, and because you didn't turn off Packagist access, it will be generally available for install unless a version requirement or conflict prevents it).
You are trying to download a JAR file (which is a Java executable, with no PHP in it), but you try to get PHP classmaps out of it. That will fail for sure.
You missed the big note in the Composer documentation saying that Composer cannot resolve repositories mentioned in sub-packages, only in the root package. That means that whatever repositories you mention in your alle/assets-merger package will not be used if you use that package anywhere else. You'd have to duplicate those repositories in every project, in addition to requiring the package name itself.
What this means is that you probably avoided missing mrclay/minify because it is available on Packagist, you might as well have added the cssmin by accident, but you definitely did not add YUICompressor.
But you shouldn't add it in the first place, because it is not PHP software. You can however add post-install commands to your projects. All your Composer integration does is download the JAR file. You can do that with a post-install or post-update command. See the documentation here.
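A hedged sketch of what such a hook could look like in the project's composer.json (the download URL and target path are illustrative placeholders, not taken from the question):

"scripts": {
    "post-install-cmd": [
        "php -r \"copy('https://example.com/yuicompressor-2.4.8.jar', 'vendor/yuicompressor-2.4.8.jar');\""
    ],
    "post-update-cmd": [
        "php -r \"copy('https://example.com/yuicompressor-2.4.8.jar', 'vendor/yuicompressor-2.4.8.jar');\""
    ]
}

Composer runs these scripts after composer install / composer update, so the JAR lands on disk without having to be declared as a (non-PHP) package.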

Distribution of a Node.js module

I followed the steps explained on this page: http://nodejs.org/api/addons.html and successfully created an addon.node file using the node-gyp tool. It works fine and it's a wrapper around a C++ static library.
Now I want to distribute this, I created the package.json file using:
npm init
and tested it using "npm install . -g", but it tries to recompile the module, which will be difficult for users because it requires the libraries that I'm embedding into the .node file. Is it possible to distribute the .node file that I already compiled on my system?
How can I include the compiled .node file in the npm package and upload it to the npm registry? I'm sure I'm just one step away from making it, but I don't know where to start.
I read about dependencies, but it seems that's suited for when your module depends on other modules, not on your own .node file.
Thanks for your help.
OK, I finally did what I wanted to achieve. Here are the options, in case someone else needs this:
1. To avoid the compilation, create a new folder and copy the package.json into it, along with the .node file. I didn't find this documented anywhere; I just tried it and it worked.
2. Provide the required libraries so the user can compile the module themselves.
Although the first one worked well, I would need to create a package for Windows, Linux, Mac, etc., and it looks very odd to say "if you are on Linux use: npm install xxx-linux", so I decided to adjust my library to let the user compile the module.
To do this I created a "client-dev" installer that ships the required precompiled libraries, along with the include headers they need; the node module is then compiled against those preinstalled libraries and headers. I will need to add a help page on my website to explain that, in order to install the module, the user must install those dependencies first using apt-get, a Windows installer, or a Mac pkg.
Although this works for me, I don't know whether it will be maintainable in the long run, but I didn't find a better way to do it. (The only link that finally clarified my goal was one saying: "if you're going to use node modules with precompiled libraries you will have nightmares"; anyway, I prefer that over doing a full implementation in Node.js from scratch and maintaining versions for Java, C#, Node.js, PHP, etc.)
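For reference, a minimal sketch of the first option: a per-platform package whose package.json simply points at the prebuilt binary and ships no binding.gyp, so npm has nothing to rebuild. The package and file names are illustrative:

{
  "name": "my-addon-linux-x64",
  "version": "1.0.0",
  "main": "addon.node"
}

require('my-addon-linux-x64') then loads the compiled addon directly. (Tools such as node-pre-gyp and prebuild later standardized this pattern of shipping prebuilt binaries per platform.)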
