Should I use Docker to deploy a library of functions? - azure

I see that Docker is intended to deploy applications, but what about libraries? For instance I have a library called RAILWAY that is a set of headers, binary code libraries, and command line tools.
I was thinking the output of the railway CI/CD pipeline could be a Docker image that is pushed to a registry. Any application that wants to use railway must then be built using Docker, and will just put FROM railway:latest and COPY --from=railway ... in its Dockerfile. The application can copy whatever it needs from the library image into its own image.
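A minimal sketch of that pattern (registry, image name, tag, and install paths are all hypothetical):

```dockerfile
# Consumer application's Dockerfile, pulling prebuilt railway artifacts
# from the library image published by the railway CI/CD pipeline.
FROM myregistry.azurecr.io/railway:1.4.0 AS railway

FROM debian:bullseye-slim AS build
# Copy only what this application actually needs from the library image.
COPY --from=railway /usr/local/include/railway /usr/local/include/railway
COPY --from=railway /usr/local/lib/librailway.a /usr/local/lib/
# ... build the application against the copied headers and libraries
```

Pinning a version tag instead of latest keeps application builds reproducible when the library image is rebuilt.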
Is this a normal use-case?
I could use a Debian package for railway, but Azure Artifacts does not support Debian packages (only NuGet and npm). And Docker is just so damn easy!

Most languages have their own systems for distributing and managing dependencies (like NuGet, which you mentioned), and you should use those instead.
The problem with your suggestion is that it's not as simple as "applications use libraries", it's rather "applications use libraries which use libraries which use libraries which use...".
E.g. if your app wants to use libraries A and B, but library A also uses library B itself, how do you handle that in your setup? Is there a binary for B in A's Docker image that gets copied over? Does it overwrite the binary for B that you copied earlier? What if they're different versions with different methods in them?

Related

Using JTR in Cloud Functions?

I am trying to use JTR (John the Ripper) to brute-force a PDF file.
The PDF's password is four letters followed by four digits, e.g. ABCD1234 or ZDSC1977.
I've downloaded the jumbo source code from GitHub and extracted the hash using pdf2john.pl.
But the documentation says I need to configure and install john, which is not going to work in my case.
Cloud Functions (and Firebase Functions) does not allow sudo apt-get installs, and that's the reason we can't use tools like poppler-utils, which includes the amazing pdftotext.
How can I use JTR in Cloud Functions properly, without needing an installation?
Is there any portable or prebuilt version of JTR for Ubuntu 18.04?
It is important to keep in mind that you can't arrange for packages to be installed on Cloud Functions instances. This is because your code doesn't run with root privileges.
If you need binaries to be available to the code you deploy to Cloud Functions, you will have to build them yourself for Debian and include the binaries in your functions directory, so they get deployed along with the rest of your code.
Even if you're able to do that, there's no guarantee it will work, because the Cloud Functions images may not include all the shared libraries required for the executables to run.
You can request that new packages be added to the runtime using the Public Issue Tracker.
Alternatively, you can use Cloud Run or Compute Engine.
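On Cloud Run, unlike Cloud Functions, you control the container image, so missing tools can simply be installed at image-build time. A hypothetical sketch (note that Debian's john package is the core build; the jumbo fork, which includes pdf2john, would need to be compiled from source):

```dockerfile
FROM node:18-slim
# Full root access at build time: install whatever the code needs.
RUN apt-get update && apt-get install -y --no-install-recommends john \
 && rm -rf /var/lib/apt/lists/*
COPY server.js /app/server.js
# server.js would be a small HTTP handler that shells out to john.
CMD ["node", "/app/server.js"]
```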

Packaging Software Ideas

We have a migration tool to migrate customers' data between different applications. I am looking for ideas to make this tool very easy for the customer to use. Right now they invoke shell scripts with some options and get the data dump, but I want to make this even easier for the end customer. The tool is written in Node.js.
pkg could be what you're looking for.
From the package description:

"This command line interface enables you to package your Node.js project into an executable that can be run even on devices without Node.js installed."

Use Cases:
- Make a commercial version of your application without sources
- Make a demo/evaluation/trial version of your app without sources
- Instantly make executables for other platforms (cross-compilation)
- Make some kind of self-extracting archive or installer
- No need to install Node.js and npm to run the packaged application
- No need to download hundreds of files via npm install to deploy your application. Deploy it as a single file
- Put your assets inside the executable to make it even more portable
- Test your app against new Node.js versions without installing it
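A minimal package.json sketch for such a tool (the project name, entry point, and targets are hypothetical; `--targets` and `--out-path` are pkg's flags for cross-compilation and output directory):

```json
{
  "name": "migrate-tool",
  "version": "1.0.0",
  "bin": "cli.js",
  "scripts": {
    "build": "pkg . --targets node18-linux-x64,node18-win-x64 --out-path dist"
  }
}
```

Running `npm run build` would then emit one standalone executable per target into dist/, which customers can run without installing Node.js.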

How to remove development dependencies in production docker images

When shipping a dockerized Node.js app to production, is it correct to ship an image that contains development dependencies?
I am not talking about the devDependencies listed in package.json; I mean gcc, python, node-gyp, and other *-dev packages containing a bunch of headers and static libraries.
All of them are needed to compile some node dependencies (like node-sass)
An idea could be a two stage build, one image with all *-dev dependencies, build stuff in there, and export the results to another new image with just the binaries.
Pros: The final "production" image is small
Cons: Not standard way to build images
In general, any Docker image in which I distribute compiled software should not contain the compilers, headers, and tools used to build the binaries.
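The two-stage idea can be sketched like this (base images and file names are illustrative):

```dockerfile
# Stage 1: full toolchain (gcc, make, python) to compile native addons.
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci            # native deps such as node-sass get compiled here
COPY . .

# Stage 2: slim runtime image with no compilers or -dev headers.
FROM node:18-slim
WORKDIR /app
COPY --from=build /app /app
CMD ["node", "server.js"]
```

Only the second stage is shipped; the toolchain layers of the build stage never appear in the final image.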
If you want something not to be included in your final image, all the related commands have to happen in a single layer (one RUN statement). For example:
RUN apt-get update && apt-get install -y build-essential python3 && npm ci && apt-get purge -y build-essential python3 && apt-get autoremove -y
Only one layer is created for the whole RUN statement, and it won't contain the dev dependencies. If you instead remove the dependencies in a later RUN, the image does not get smaller, because the earlier layers still contain them.
Try the new (experimental) --squash option of docker build, available since Docker 1.13.
The answer to the OP's question depends on how many images the OP/their company maintains for production needs.
A few strategies are possible:
1. If the number of maintained images is low to medium, and the system architecture is not very complex and does not use dozens of images at once, the simplest and easiest solution to maintain is the best one. You can approach it with a single build, or a two-step build if you want to use the compiled source as a base for containers that could carry different content (in that case the second stage could even run during docker-compose up, at system start-up).
2. You can remove dev-only dependencies (as other answers suggested) if it's necessary to keep the image slim, if a lot of running containers share the same image, or if the compiled files are huge. This lengthens the build process but results in a smaller image.
3. The third approach is totally different: if there is a compile step, use a CI pipeline that compiles the assets independently within a separate container (the CI runner) and produces a versioned artifact. Store it somewhere accessible to deployments (S3, a CDN, private storage) and fetch it during the production build, or serve the files from there directly (in the case of a CDN).

Visual Studio NodeJS Tools - How to Package Project? Create NPM?

I've got a VS2013 solution with a mix of NodeJS (using TypeScript) and C# class library projects (they're bound together by EdgeJS). Where the NodeJS projects are concerned, one can be considered a library (a RabbitMQ bus implementation), and two are applications meant to be hosted as part of a fourth project, with both using the bus.
So, one project (host) which will depend on three projects (bus, app1 and app2) (it starts the bus, and passes it to app1 and app2).
Of course, I could just lump all these projects together and be done with it - but that's a horrible idea.
How do I package these projects up for proper reuse and referencing (like assemblies in traditional .NET)?
Is that best done with NPM? If so, does VS provide anything in this area? If not, then what?
Note that, aside from the Bus project, I'm not looking to release these publicly - I'm not sure if that changes anything.
In general, if something can be bundled together as an independent library, then it's best to consider it a Node package and thus refactor that logic out into its own project. It sounds like you've already done this to some extent, separating out your bus, app1, and app2 projects. I would recommend they each have their own Git repositories if they are separate packages.
Here's some documentation to get you started with Node packages:
https://www.npmjs.org/doc/misc/npm-developers.html
The host project, if it's not something you would package but instead deploy, probably does not need to be bundled as a Node package. I would instead just consider this something that would be pulled down from Git and started on some server machine.
With all that said, your last line is important:
I'm not looking to release these publicly
GitHub does have private repositories, but as of now npmjs.org does not. There are options to create your own private registry (Sinopia and Kappa offer different ways of accomplishing this), but if you don't want this code available to everyone, do not publish it to npmjs.org. You can still package it up in the way I've outlined above, just not publish it yet.
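One common middle ground is depending on the package straight from its private Git repository, without any registry involved. A hypothetical dependencies entry (organization, repo name, and tag are placeholders):

```json
{
  "dependencies": {
    "bus": "git+ssh://git@github.com/yourorg/bus.git#v1.0.0"
  }
}
```

npm clones the repo over SSH during npm install, so access is controlled by the same keys that guard the GitHub repository, and the #v1.0.0 suffix pins a tag for reproducible builds.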

If I want to use imagemagick in node.js, do I have to install the imagemagick CLI tools?

I want to use the node-imagemagick module to do some image-processing work, but besides installing this module in my app, do I have to install ImageMagick itself on my computer?
The problem is that my application is deployed on a PaaS platform, and I don't think it provides the ImageMagick CLI, so is there some other method I could choose?
node-imagemagick actually calls the CLI tools (using child_process), hence the dependency.
As for an alternative, it really depends on what type of image manipulation you need (and on each module's dependencies as well, given that it has to run on a PaaS platform). There's a nice list of graphics modules on the Node wiki.
