Building an image from multiple Docker Hub images or a private repo (Docker, Linux)

I am able to create a Dockerfile that does what I need. I might have 10-15 apps running for now, with more to come.
My Dockerfile:
FROM ubuntu:16.04
RUN <install necessary software>
What I am trying now is to install the software via images as well. For example, for PHP 7.0:
FROM ubuntu:16.04
FROM php:7.0-cli
RUN <install necessary software>
So currently I am making a Dockerfile for each project, doing FROM source, RUN install this and that, and the same for the rest. Suppose I want to change the PHP version for all 10 servers: I have to open each file and edit it. Any good suggestion to overcome this problem?

Maybe you can use ENV variables? Like:
...
ENV PHP_VERSION=7.0
# On Ubuntu the PHP version is part of the package name (php7.0, php7.1, ...)
RUN apt-get update && apt-get install -y php${PHP_VERSION}
...
Or maybe use a templating language, which is offered by the tool Rocker.
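For example, a shared base image parameterized with a build argument would centralize the version choice (a minimal sketch; the image tag and package names are illustrative):

# base/Dockerfile (sketch)
FROM ubuntu:16.04
ARG PHP_VERSION=7.0
# On Ubuntu the PHP version is part of the package name
RUN apt-get update && apt-get install -y php${PHP_VERSION} \
 && rm -rf /var/lib/apt/lists/*

Build it once per version, then have every project start FROM it:

docker build --build-arg PHP_VERSION=7.0 -t mybase:php7.0 base/

# project Dockerfile
FROM mybase:php7.0
RUN <install project-specific software>

Changing the PHP version then means rebuilding the base image and bumping one tag, rather than editing 10 Dockerfiles.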

Related

Install old packages in a Dockerfile

We are using Docker containers as our build workspace. To compile our legacy code we need very old versions of some Linux RPMs. I tried giving the full name of the old RPM with its version, but it ends up with a "package not available" error.
Please help me develop my Dockerfile so it can install these old packages.
I have tried the Alpine as well as the CentOS image.
What is the correct way to do this, following good Dockerfile practice?
One possible way is to download the packages (with all their prerequisites) beforehand, then have the Dockerfile copy the packages (and any additional files, if applicable) into the image and install them with the rpm command.
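For example (a sketch; the base image and directory name are illustrative, assuming the RPMs and their dependencies were downloaded into rpms/ next to the Dockerfile):

FROM centos:7
COPY rpms/ /tmp/rpms/
# rpm does not resolve dependencies, so all prerequisite RPMs
# must be present in the directory
RUN rpm -ivh /tmp/rpms/*.rpm \
 && rm -rf /tmp/rpms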

Cache npm packages in Docker GitHub Actions

I have a GitHub Actions workflow that tests a Node.js project in a Docker image (node:16-alpine). My problem is that on each run, yarn install reinstalls all the packages from scratch. My question is: how can I cache these packages between runs?
I'm having trouble doing this since the execution runs inside the Docker image, and I could not find a way to cache the packages. Thank you for your help!
You can use the GitHub Actions cache (actions/cache) to cache things inside your job.
If you're using a Docker image separately from your job, you probably can't cache that directly. My suggestion: improve your workflow. If you create a job for a test and need the same environment, put it all in one job with different steps.
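If the yarn install happens during the docker build itself, one option (a different mechanism from the Actions cache mentioned above) is BuildKit's cache mount, which persists yarn's download cache between builds. A sketch, assuming a standard yarn project layout:

# syntax=docker/dockerfile:1
FROM node:16-alpine
WORKDIR /app
COPY package.json yarn.lock ./
# The cache mount survives between builds as long as the builder's cache does
RUN --mount=type=cache,target=/usr/local/share/.cache/yarn \
    yarn install --frozen-lockfile
COPY . .

Note that GitHub-hosted runners are ephemeral, so this only pays off if the build cache itself is persisted between runs, for example with docker/build-push-action and cache-from/cache-to set to type=gha.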

Laradock - add a custom npm package

It's not exactly standard practice, but it's something that works as a temporary solution.
I have Laradock installed on a system, along with a Laravel app.
Everything I use from Laradock is brought up with the command below:
docker-compose up -d nginx mysql php-worker workspace redis
I need a Node package (https://www.npmjs.com/package/tiktok-scraper) installed globally in my Docker setup, so I can get results by executing PHP code like this:
exec('tiktok-scraper user username -n 3 -t json');
This needs to be available at the php-fpm and php-worker level, as I need it both in jobs and for endpoints that should invoke the scraper.
I know I'm doing it wrong, but I have tried to install it within the workspace using:
docker-compose exec workspace bash
npm i -g tiktok-scraper
After this it's available in my workspace (I can run, for instance, tiktok-scraper --help and it shows me the different options).
But this doesn't solve the issue, as I get nothing from exec('tiktok-scraper user username -n 3 -t json'); in my Laravel app.
I'm not that familiar with Docker and not sure in which Dockerfile I should put something like:
RUN npm i -g tiktok-scraper
Any help will be appreciated.
Thanks
To execute the npm package from inside your php-worker, you need to install it in the php-worker container. Installing it in the workspace has no effect, because for the PHP exec() to see it, the workspace would have to be in the same container as your php-worker, and it isn't.
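Adding the install to the php-worker image might look like this (a sketch; stock Laradock's php-worker image is Alpine-based, so Node.js is added with apk -- adjust if your image differs):

# in laradock/php-worker/Dockerfile (sketch)
RUN apk add --no-cache nodejs npm \
 && npm i -g tiktok-scraper

Then rebuild and restart the container:

docker-compose build php-worker
docker-compose up -d php-worker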

How do I create a distributable Docker image?

I'd like to build a Node.js server packaged as an executable, which can then be installed and run on any Linux machine without any prerequisite dependencies. I was considering packaging it as a Docker image, but then the user would need Docker installed on their system. Is there a way to package a Docker image itself as an executable, so that all the user needs to do is run a single executable file?
With Docker: NO
The answer for an executable from Docker is no.
You can create a docker/docker-compose project which you can simply run
if you have Docker installed.
Without Docker: YES
You can still package it without using Docker, with the whole Node.js runtime included in the executable.
Look at pkg: https://www.npmjs.com/package/pkg
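Usage is roughly as follows (a sketch; the entry file, target, and output name are illustrative):

npm install -g pkg
pkg index.js --targets node16-linux-x64 --output myserver
./myserver

The resulting binary bundles the Node.js runtime, so the target machine needs neither Node nor Docker.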

Node.js private module and Docker containers

I've got a Node.js project that references a module I wrote, hosted in a private GitHub repo. The dependencies in package.json look something like this:
"dependencies": {
... other stuff ...
"my_module": "git+https://github.com/me/mymodule.git",
}
That's fine, but I'd like to create a Docker container for the application, and I don't want git inside the container. I know I can host the module in a private npm repo, but I'd love to find a way to have the build process pull the source (including that module) and then copy it into the container.
I'm fine with doing an npm install in the container, but it will choke on the git dependency. Alternatively, I don't want to do the npm install on the build machine, because I want the freedom to choose any container I want: I don't want the build machine to grab, say, Windows binaries for a Mongo module and copy those into my Debian container.
One option I considered was putting the "my_module" dependency in devDependencies, running "npm install --production" inside the Docker container, and then copying the one module over. But that's inconsistent with the intent of devDependencies.
Any better/recommended solutions? I'm open to not hosting the module on GitHub if there's a better way (though I use it in a few projects that only make sense for this client).
There's a pretty easy solution to this. Build the Node application on the host:
npm install etc.
Then, in your Dockerfile, use the COPY instruction, telling it where the Node project's install directory is and where you want it copied to.
Edit:
To address the issue brought up by @angelok, you should run npm rebuild once the code is copied into the Docker image, so that native dependencies are rebuilt for the OS of the Docker image rather than the OS on which the node packages were initially installed. See the npm docs for rebuild.
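A minimal sketch of that flow (the base image, paths, and entry file are illustrative):

# Dockerfile (sketch): the app, including node_modules, was already
# installed on the build machine and sits in the build context
FROM node:16-slim
WORKDIR /app
COPY . .
# Rebuild native addons against this image's OS and libraries
RUN npm rebuild
CMD ["node", "server.js"]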
