How to install dependencies in the base AWS Lambda Node.js Docker image

I am writing an AWS Lambda function using Node.js which is deployed via a container image.
I have used the base Node.js image for Lambda, provided at the link below, to configure my image. This works well: my image is deployed and my Lambda function runs.
https://docs.aws.amazon.com/lambda/latest/dg/images-create.html#images-create-from-base
Here is the Dockerfile:
FROM public.ecr.aws/lambda/nodejs:14
COPY index.js package.json cad/ ${LAMBDA_TASK_ROOT}/
# Here I would like to install libgl1-mesa-dev, libx11-dev and libglu1-mesa-dev
RUN npm install
CMD ["index.handler"]
However, I now need to install additional system dependencies in the image. Specifically, I need OpenGL to use PDFTron to convert CAD files to PDF, according to the PDFTron documentation. So I require libgl1-mesa-dev, libx11-dev and libglu1-mesa-dev.
The AWS documentation linked above states:
Install any dependencies under the ${LAMBDA_TASK_ROOT} directory alongside the function handler to ensure that the Lambda runtime can locate them when the function is invoked.
If this were an Ubuntu or Alpine image I could install them using apt-get or apk add, but neither is available in this base AWS Lambda Node image.
So my question is: how do I install libgl1-mesa-dev, libx11-dev and libglu1-mesa-dev on this image so that the Lambda runtime can locate them when the function is invoked?

I think the equivalent of those Ubuntu packages on Amazon Linux 2 (which Lambda uses) would be:
FROM public.ecr.aws/lambda/nodejs:14
COPY index.js package.json cad/ ${LAMBDA_TASK_ROOT}/
RUN yum install -y mesa-libGL-devel mesa-libGLU-devel libX11-devel
RUN npm install
CMD ["index.handler"]
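If any of those package names fail to resolve, you can check what the Amazon Linux 2 repositories actually provide by opening a shell in the base image (for example docker run -it --entrypoint /bin/sh public.ecr.aws/lambda/nodejs:14, then yum search mesa). As a hedged extra, this sketch adds a build-time check that the GL libraries really landed on the loader path:
# ldconfig -p lists every shared library the dynamic loader can resolve;
# if grep finds no libGL entry it exits non-zero and the build stops here.
RUN ldconfig -p | grep -i libgl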

Related

How to install Node in a Dockerfile?

I want to install Node v18 on AWS Linux.
Prerequisite
I have a system with a Django backend and a React frontend, so I need Node when building the frontend.
If I make the Dockerfile start FROM node:18 it works, but I want to use FROM python:3.9 so that Django works.
Is it not a good idea to put Django and React in the same container?
My Dockerfile currently looks like this:
FROM python:3.9
ENV PYTHONUNBUFFERED 1
RUN apt-get update
WORKDIR /usr/src/app
RUN apt-get install -y npm
RUN pip install pipenv
WORKDIR /usr/src/app/frontend_react
RUN npm install --force
RUN node -v  # version 12 is installed
RUN dnf module install nodejs:18/common
RUN node -v
RUN npm run build
However, there is no dnf in this image.
How can I do this?
If you can use prebuilt Docker Hub images then this is much easier. I would generally avoid trying to put components with different use cases, build systems, and runtimes into the same image if possible.
In the specific case of a Django application with a React frontend, you might be compiling the frontend to static files that you then serve directly via Django. In that setup you don't need Node to run the application: as long as the static files exist, Django can serve them. Docker's multi-stage build feature lets you build the frontend using a node image and then COPY the result into your application image. A typical example might look like:
FROM node:18 AS react
WORKDIR /app
COPY frontend_react/package*.json ./
RUN npm ci
COPY frontend_react/ ./
RUN npm run build
FROM python:3.9
ENV PYTHONUNBUFFERED 1
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY ./ ./
COPY --from=react /app/dist/ frontend_react/dist/
EXPOSE 8000
CMD ["./manage.py", "runserver", "0.0.0.0:8000"]
The first half should look like a normal React image build, except that it doesn't have a CMD. The second half should look like a normal Django image build, plus the COPY --from=react line to get the built application from the first build stage. We don't need node or npm in the final image, only the static files, and so we don't invoke a package manager to try to install them.
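As a follow-up sketch (not from the original answer), the Django side then just needs to know where those built assets live. The frontend_react/dist path below matches the COPY --from=react destination above; adjust it if your React build emits to a different directory:
# settings.py (sketch): point Django's static file machinery at the
# React build output copied in by the Dockerfile.
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

STATICFILES_DIRS = [
    BASE_DIR / "frontend_react" / "dist",
]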

npm install failing in alpine based docker image

I'm trying to run a Node server in an Alpine-based Docker image. However, it's failing on npm install. I would appreciate some help in figuring out what the issue is. Here is the Dockerfile.
Here is the error when 'npm install' tries to run
One of your project dependencies requires an X Window development package, libXext, which is not being installed by your apk add ... command.
Add the libxext-dev Alpine package, for instance.
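A hedged sketch of the relevant Dockerfile line (the exact package list depends on which headers your dependency needs):
# libxext-dev provides the libXext headers named in the error;
# --no-cache avoids leaving the apk index in the image layer.
RUN apk add --no-cache libxext-dev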

Unable to run (Linux container) or create image (Windows container) a Gatsby React site (win binaries error, matching manifest error) through Docker

I have my website wrapped up and wanted to containerize it for experience as I've never used Docker before. It's built on Gatsby. I did a fresh install of Docker and am running into two issues:
If I try to create an image in a Linux container, it seems to work, but I can't actually run it. I get the following error: "Error in "/app/node_modules/gatsby-transformer-sharp/gatsby-node.js": 'win32-x64' binaries cannot be used on the 'linuxmusl-x64' platform. Please remove the 'node_modules/sharp' directory and run 'npm install' on the 'linuxmusl-x64' platform."
I tried the above, uninstalling and reinstalling sharp in my project, to no avail. I'm not even using sharp, nor do I know what it is, though.
If I switch to Windows containers, I can't even create an image as I get the following:
"no matching manifest for windows/amd64 10.0.18363 in the manifest list entries"
My Dockerfile is as follows:
FROM node:13.12.0-alpine
# set working directory
WORKDIR /app
# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH
# install app dependencies
COPY package.json ./
COPY package-lock.json ./
RUN npm install --silent
RUN npm install react-scripts@3.4.1 -g --silent
# add app
COPY . ./
# start app
CMD ["npm", "start"]
and my .dockerignore contains
node_modules
build
Dockerfile
Dockerfile.prod
.git
Things I've tried:
This tutorial > https://mherman.org/blog/dockerizing-a-react-app/ (where I got the Dockerfile text)
This tutorial > https://www.robinwieruch.de/docker-create-react-app-development (and its Dockerfile at one point)
Changing the FROM node: tag to 14.4.0 or 14, with or without -alpine
Uninstalling and re-installing sharp
Uninstalling sharp entirely and trying to run it that way (I still get the sharp error for some reason)
Reading the documentation, which for whatever reason only tells you how to launch a default application (such as create-react-app) or one pulled from somewhere, but not how to do so for your own website
Thanks
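For what it's worth, the error text itself points in one direction (a sketch, not a confirmed fix from this thread): keep host-built binaries out of the image and let sharp compile for linuxmusl-x64 inside the container.
FROM node:13.12.0-alpine
WORKDIR /app
# node_modules must stay in .dockerignore so Windows-built sharp
# binaries are never copied into the image.
COPY package.json package-lock.json ./
# npm rebuild recompiles native addons (sharp, pulled in by
# gatsby-transformer-sharp) for the container's platform.
RUN npm install --silent && npm rebuild sharp
COPY . ./
CMD ["npm", "start"]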

Add Carbone to Docker image

I have installed Carbone on my local Linux machine using the following command and it is working properly.
npm install carbone
Now I need to add Carbone to my Docker image, but I don't know how. Should I add the npm install command to the Dockerfile, or add carbone to package.json?
I get the following error if carbone is not in the image:
Code : const carbone = require('carbone');
Error: Cannot find module 'carbone'
Carbone has to be used in a Node project. You can install it through npm:
npm install carbone --save
Then follow the getting-started documentation:
https://github.com/Ideolys/carbone/#getting-started
If you want to dockerize your application, you can start your container from the image ideolys/carbone-env-docker. It's a ready-to-go node:8 image with LibreOffice installed. Example Dockerfile:
FROM ideolys/carbone-env-docker
ENV DIR /app
WORKDIR ${DIR}
COPY . ${DIR}
RUN npm install
# index.js should call carbone functions to generate your report.
CMD [ "node", "index.js" ]
Finally, you can build and run the container!
If you need more help or encounter an issue, post an issue on the Carbone GitHub.
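For reference, a minimal index.js along the lines of the getting-started guide might look like this (a sketch; the template path and data are placeholders):
// Render an ODT template, injecting `data` into its {d.xxx} markers.
const fs = require('fs');
const carbone = require('carbone');

const data = { firstname: 'John', lastname: 'Doe' };

carbone.render('./template.odt', data, (err, result) => {
  if (err) {
    return console.error(err);
  }
  fs.writeFileSync('result.odt', result);
  process.exit(); // Carbone keeps LibreOffice workers alive otherwise
});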

Installing a software and setting up environment variable in Dockerfile

I have a jar file from which I need to create a Docker image. My jar file depends on an application called ImageMagick: ImageMagick must be installed, and the path to it added as an environment variable. I am new to Docker, and based on my understanding, a container can only access resources within the container.
So I created a Dockerfile like this:
FROM openjdk:8
ADD target/eureka-server-1.0.0-RELEASE.jar eureka-server-1.0.0-RELEASE.jar
EXPOSE 9991
RUN ["yum","install","ImageMagick"]
RUN ["export","imagemagix_home","whereis ImageMagick"]
(Here is where I am struggling: I need to set the env variable to the installation directory of ImageMagick, but currently I am getting null.)
ENTRYPOINT ["java","-jar","eureka-server-1.0.0-RELEASE.jar"]
Please let me know, whether the solution am trying is proper, or is there any better solution for my problem.
Update:
Since I am installing the application and setting the env variable at build time, passing an argument with -e at runtime is no use. I have updated my Dockerfile as below:
FROM openjdk:8
ADD target/eureka-server-1.0.0-RELEASE.jar eureka-server-1.0.0-RELEASE.jar
EXPOSE 9991
RUN ["yum","install","ImageMagick"]
ENV imagemagix_home = $(whereis ImageMagick)
RUN ["wget","https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-64bit-static.tar.xz"]
RUN ["tar","xvf","ffmpeg-git-*.tar.xz"]
RUN ["cd","./ffmpeg-git-*"]
RUN ["cp","ff*","qt-faststart","/usr/local/bin/"]
ENV ffmpeg_home = $(whereis ffmpeg)
ENTRYPOINT ["java","-jar","eureka-server-1.0.0-RELEASE.jar"]
And while building, I am getting this error:
OCI runtime create failed: container_linux.go: starting container process caused "exec": "yum": executable file not found in $PATH: unknown.
Update:
yum is not available in my base image, so I changed yum to apt-get as below:
RUN apt-get install build-essential checkinstall && apt-get build-dep imagemagick -y
Now I am getting an error that the packages build-essential and checkinstall were not found, and the command returned a non-zero code: 100.
Kindly let me know what's going wrong.
It seems build-essential or checkinstall is not available. Try installing them in separate commands, or searching for them.
You may also need to run apt-get update to refresh the repository cache before installing them.
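Putting those pieces together, here is a hedged sketch of a Dockerfile for the original question, assuming the Debian-based openjdk:8 image (so apt-get, not yum) and the Debian imagemagick package:
FROM openjdk:8
# openjdk:8 is Debian-based, so the package manager is apt-get, not yum.
RUN apt-get update && \
    apt-get install -y --no-install-recommends imagemagick && \
    rm -rf /var/lib/apt/lists/*
# ENV takes a literal value; it cannot run $(whereis ...) at build time.
# The Debian package installs the ImageMagick binaries under /usr/bin.
ENV IMAGEMAGICK_HOME=/usr/bin
ADD target/eureka-server-1.0.0-RELEASE.jar eureka-server-1.0.0-RELEASE.jar
EXPOSE 9991
ENTRYPOINT ["java","-jar","eureka-server-1.0.0-RELEASE.jar"]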
