Consider the following file structure of a yarn workspaces monorepo:
.
├── docker-compose.yaml
├── package.json
├── packages
│   └── pkg-1
│       ├── dist
│       ├── package.json
│       ├── src
│       └── tsconfig.json
├── services
│   ├── api-1
│   │   ├── dist
│   │   ├── Dockerfile
│   │   ├── package.json
│   │   ├── src
│   │   ├── tsconfig.json
│   │   └── yarn.lock
│   └── client-1
│       ├── package.json
│       ├── src
│       └── yarn.lock
├── tsconfig.json
└── yarn.lock
I have written a Dockerfile to create an image for api-1:
ARG APP_DIR=/usr/app
# Build stage
FROM node:16.2-alpine AS build
ARG APP_DIR
WORKDIR ${APP_DIR}
COPY package.json ./
COPY yarn.lock ./
COPY tsconfig.json ./
WORKDIR ${APP_DIR}/packages/pkg-1
COPY packages/pkg-1/package.json ./
RUN yarn --pure-lockfile --non-interactive
COPY packages/pkg-1/tsconfig.json ./
COPY packages/pkg-1/src/ ./src
RUN yarn build
WORKDIR ${APP_DIR}/services/api-1
COPY services/api-1/package.json ./
COPY services/api-1/yarn.lock ./
RUN yarn --pure-lockfile --non-interactive
COPY services/api-1/tsconfig.json ./
COPY services/api-1/src/ ./src
RUN yarn build
# Production stage
FROM node:16.2-alpine AS prod
ARG APP_DIR
WORKDIR ${APP_DIR}
COPY --from=build ${APP_DIR}/package.json ./
COPY --from=build ${APP_DIR}/yarn.lock ./
WORKDIR ${APP_DIR}/packages/pkg-1
COPY --from=build ${APP_DIR}/packages/pkg-1/package.json ./
RUN yarn --pure-lockfile --non-interactive --production
COPY --from=build ${APP_DIR}/packages/pkg-1/dist ./dist
WORKDIR ${APP_DIR}/services/api-1
COPY --from=build ${APP_DIR}/services/api-1/package.json ./
COPY --from=build ${APP_DIR}/services/api-1/yarn.lock ./
RUN yarn --pure-lockfile --non-interactive --production
COPY --from=build ${APP_DIR}/services/api-1/dist ./dist
CMD ["node", "dist"]
The build is run from the root docker-compose.yaml so that it has the proper context:
services:
  api-1:
    image: project/api-1
    container_name: api-1
    build:
      context: ./
      dockerfile: ./services/api-1/Dockerfile
      target: prod
    ports:
      - 3000:3000
It is working, but this way there will be a lot of repetition as the application grows. The problem is the way the packages are built.
A package can be, for example, a collection of normalized components used across client services, or a collection of normalized errors used across API services.
Whenever I build a service, I first need to build the packages it depends on, which is an unnecessarily repetitive task. Not to mention that the build steps of each package are defined over and over again in the Dockerfile of every single service that uses it.
So my question is: is there a way to create, for example, an image of a package that can be reused when building a service, so that the package's build steps don't have to be repeated in every service's Dockerfile?
A while ago I posted an answer detailing how I structured a monorepo with multiple services and packages.
The "trick" is to copy all the packages that your service depends on, as well as the project root package.json. Then running yarn --pure-lockfile --non-interactive --production once will install the dependencies for all the sub-packages, since they are part of the workspace.
The linked example isn't using TypeScript, but I believe this could easily be achieved with a postinstall script in every package.json that runs yarn build.
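For instance, pkg-1's package.json could look roughly like this (the script bodies are assumptions, not taken from the question):
{
  "name": "pkg-1",
  "scripts": {
    "build": "tsc",
    "postinstall": "yarn build"
  }
}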
It seems like you are looking for something that gives you the option to have a "parent" package.json, so you only have to invoke "build" on one package and with that build the whole dependency tree.
e.g.:
- package.json          // root package
|- a
|  |- package.json      // module a package
|- b
|  |- package.json      // module b package
You might want to look into the following:
npm workspaces
lerna
Both support structures like the one mentioned; Lerna just has a lot more features. To get a quick grasp of the differences, look here: Is Lerna needed anymore with NPM 7.0.0's workspaces?
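With npm workspaces, for example, the root package.json could look like this (a sketch; the module names come from the tree above, the script names are assumed, and each module needs its own "build" script). Running npm run build at the root then builds every module:
{
  "name": "package-name",
  "private": true,
  "workspaces": ["a", "b"],
  "scripts": {
    "build": "npm run build --workspaces"
  }
}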
I have the current structure for the project below:
project/
├── node_modules
├── services
│   ├── service_1
│   │   ├── src
│   │   │   └── lambdas
│   │   ├── tsconfig.json
│   │   └── serverless.yml
│   └── service_2
│       ├── src
│       │   └── lambdas
│       ├── tsconfig.json
│       └── serverless.yml
├── serverless.yml
├── tsconfig.json
└── package.json
Earlier, service_1 and service_2 each had their own package.json, but now my project has become bigger.
What I'm trying to do is have only one global node_modules for the whole project.
I have a pipeline that builds my project in a few stages.
E.g., the main build is the code below:
phases:
  install:
    runtime-versions:
      nodejs: 14
    commands:
      - npm install -g serverless@2.66.2
      - npm i
  build:
    commands:
      - sls deploy --force --stage $ENV_NAME
For the services the code is the same; only this needs to be added:
  build:
    commands:
      - cd services/service_1
      - sls deploy --force --stage $ENV_NAME
So every build has to run "npm i", and that's a problem. I want to avoid the duplicated "npm i" per build: install all packages once in the first build, and somehow share that node_modules with every function deployed in the following builds.
I've read that serverless has "layers", which can provide a shared folder for node_modules that each lambda simply references.
How is it possible to run "npm i" only in the first build and create a reference to it for each function inside service_1 and service_2?
I understand that inside each service's serverless.yml I need to add layers to each function. But I don't understand how to set up a global layer for all functions in the first build.
The code below is in each service's serverless.yml:
layers:
  NodeLayer:
    path: ../../layers
    name: node-layer-${self:provider.stage}

functions:
  myLambdaFunc:
    handler: src/lambdas.myLambdaFunc
    layers:
      - { Ref: NodeLayerLambdaLayer }
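One possible direction (a sketch, not verified against this setup): declare the layer once in the root serverless.yml and deploy it first, then have each service reference the layer's ARN instead of redefining it. The Serverless Framework exports a <LayerName>LambdaLayerQualifiedArn stack output for every layer, which other stacks can read via the ${cf:...} variable; the root stack name below is an assumption:
# root serverless.yml: node_modules packaged into ./layers, deployed first
layers:
  NodeLayer:
    path: ./layers
    name: node-layer-${self:provider.stage}

# services/service_1/serverless.yml: reference the already-deployed layer
functions:
  myLambdaFunc:
    handler: src/lambdas.myLambdaFunc
    layers:
      - ${cf:root-service-${self:provider.stage}.NodeLayerLambdaLayerQualifiedArn}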
I have a prediction application with the below folder structure:
Docker
├── dataset
│   └── fastText
│       └── crawl-300d-2M.vec
├── Dockerfile
├── encoder
│   └── sentencoder2.pkl
├── pyt_models
│   └── actit1.pt
├── requirements.txt
└── src
    ├── action_items_api.py
    ├── infer_predict.py
    ├── model.py
    ├── models.py
    └── sent_enc.py
Dockerfile:
FROM python:3.6
EXPOSE 80
# copy and install packages for flask
COPY /requirements.txt /tmp/
RUN cd /tmp && \
pip3 install --no-cache-dir -r ./requirements.txt
WORKDIR /Docker
COPY src src
CMD gunicorn -b 0.0.0.0:80 --chdir src action_items_api:app
In the Dockerfile I try to copy only the src folder, where all the Python files are placed. I want the fastText, encoder, and pyt_models directories to be accessible from outside the container.
When I tried:
docker run -p8080:80 -v /encoder/:/encoder/;/pyt_models/:/pyt_models/;/dataset/:/dataset/ -it actit_mount:latest
But by doing this my code gives me FileNotFoundError: No such file or directory: 'encoder/sentencoder2.pkl'.
Yet with the same folder structure, if I run from the Docker folder:
gunicorn --chdir src --bind 0.0.0.0:80 action_items_api:app
it works.
What is wrong with the Dockerfile or the docker run command?
Because you set WORKDIR /Docker, the gunicorn process has its working directory set to /Docker, which implies that relative file paths in your Python app are resolved from /Docker.
Give this a try:
docker run -p8080:80 \
-v $(pwd)/encoder/:/Docker/encoder/ \
-v $(pwd)/pyt_models/:/Docker/pyt_models/ \
-v $(pwd)/dataset/:/Docker/dataset/ \
-it actit_mount:latest
Note that relative host paths are rejected:
docker: Error response from daemon: create ./folder: "./folder" includes invalid characters for a local volume name, only "[a-zA-Z0-9][a-zA-Z0-9_.-]" are allowed. If you intended to pass a host directory, use absolute path.
I have a project that basically has a structure like this: (*)
my_project/
├── server/
│   ├── node_modules/
│   └── server.js
├── src/
├── node_modules/
├── Dockerfile
└── {multiple important config files for webpack and typescript etc}.json
I build the project with npm run build. This creates a dist/ folder from the src/ folder.
This is my package.json:
"scripts": {
"prebuild": "npm run install:client && npm run install:server",
"build": "webpack",
"install:client": "npm install",
"install:server": "cd server/ && npm install"
}
The final project only needs this: (**)
my_project/
├── server/
│   ├── node_modules/
│   └── server.js
└── dist/
    ├── webapp/
    └── assets/
Now I want to create a docker image out of this.
I have a Dockerfile that's working now. It looks like this:
FROM node:boron
WORKDIR /usr/src/app
COPY package.json .
COPY . .
RUN npm run build
EXPOSE 9090
CMD [ "node", "server/server.js" ]
But from my understanding, it copies everything I have in my directory, then creates the dist/ folder, and the final Docker image contains all of this: (***)
my_project/
├── server/
│   ├── node_modules/
│   └── server.js
├── src/
├── node_modules/
├── Dockerfile
├── {multiple important config files for webpack and typescript etc}.json
└── dist/
    ├── webapp/
    └── assets/
How can I configure the Docker image to contain only the things in (**)?
Running npm run build will create the dist folder, which is what you want. After that you can remove the stuff you don't need from the image by adding the following to the Dockerfile:
FROM node:boron
WORKDIR /usr/src/app
COPY package.json .
COPY . .
RUN npm run build && /bin/bash -c "find . -mindepth 1 -maxdepth 1 -not -name 'server' -not -name 'dist' -exec rm -rf {} +"
EXPOSE 9090
CMD [ "node", "server/server.js" ]
The find command removes every top-level entry except the server and dist folders; -mindepth 1 excludes the working directory itself, and -maxdepth 1 stops find from descending into (and deleting files inside) the folders being kept.
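As an alternative sketch (assuming the same layout and entrypoint as above), a multi-stage build avoids the cleanup step entirely by copying only the needed folders into a fresh image:
FROM node:boron AS build
WORKDIR /usr/src/app
COPY . .
RUN npm run build

FROM node:boron
WORKDIR /usr/src/app
# Take only the runtime artifacts from the build stage
COPY --from=build /usr/src/app/server ./server
COPY --from=build /usr/src/app/dist ./dist
EXPOSE 9090
CMD [ "node", "server/server.js" ]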
We want to start containerizing our applications, but we have stumbled upon some issues with local dependencies.
We have a single git repository in which we have numerous node packages under a "shared" folder, and applications that require these packages.
So let's say our folder structure is as follows:
src/
├── apps
│   └── my_app
└── shared
    └── shared_module
In my_app's package.json we have the following dependency:
{
  "dependencies": {
    "shared-module": "file:../../shared/shared_module"
  }
}
The issue here is that because we want to move "my_app" to run in a container, we need to npm install our local dependency.
Can this be done?
Yes, it's possible but a little bit ugly. The problem for you is that Docker is very restrictive when it comes to its build context. I'm not sure how familiar you are already with that concept, so here is the introduction from the documentation:
The docker build command builds an image from a Dockerfile and a context.
For example, docker build . uses . as its build context, and since it's not specified otherwise, ./Dockerfile as the Dockerfile. Files or paths outside the build context cannot be referenced in the Dockerfile (so no COPY ..).
The issue for you is that during a Docker build, the build context cannot be left. If you have multiple applications that you want to build, you would normally add a Dockerfile for each app.
src/
├── apps
│   ├── my_app
│   │   └── Dockerfile
│   └── my_other_app
│       └── Dockerfile
└── shared
    └── shared_module
Naturally, you would cd into my_app and use docker build . to build the application's Docker image. The issue with this is that you can't access ../../shared from the build, since it's outside of the context.
So you need to make sure both apps and shared are in the build context. One way would be to place all Dockerfiles in src like so:
src/
├── Dockerfile.my_app
├── Dockerfile.my_other
├── apps
│   ├── my_app
│   └── my_other_app
└── shared
    └── shared_module
You can then build the applications by explicitly specifying the context and the Dockerfile:
src$ docker build -f Dockerfile.my_app .
Alternatively, you can keep the Dockerfiles inside my_app and my_other_app, and point to them:
src$ docker build -f apps/my_app/Dockerfile .
That should also work. In both cases, the build is executed from within src, which means you need to pay a little attention to the paths in the Dockerfile. The working directory is still src:
COPY ./apps/my_app /src/apps/my_app
By mirroring the folder structure you have locally, you should be able to make your dependencies work without any changes:
RUN mkdir -p /src
COPY ./shared /src/shared
COPY ./apps/my_app /src/apps/my_app
RUN cd /src/apps/my_app && npm install
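Putting the pieces together, a hypothetical apps/my_app/Dockerfile (built from src with the -f flag shown above; the base image and entrypoint are assumptions) might look like:
FROM node:16.2-alpine
# Mirror the local folder structure so the file:../../shared/... dependency resolves
RUN mkdir -p /src
COPY ./shared /src/shared
COPY ./apps/my_app /src/apps/my_app
RUN cd /src/apps/my_app && npm install
WORKDIR /src/apps/my_app
CMD ["node", "index.js"]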
Hope that helps you get started.
TL;DR: How do I configure different "sub-modules" in a modular node.js project to refer to one another as simply as possible?
I'm trying to wrap my head around local packages for NPM, specifically as they relate to a modular project.
I'm building a web app with a front end and a back end API. These need to share a package which exports simple models. My project directory structure looks like this:
package
├── api
│   ├── dist
│   │   └── <compiled files>
│   ├── node_modules
│   │   └── ...
│   ├── package.json
│   └── src
│       └── <source files>
├── application
│   ├── dist
│   │   └── <compiled files>
│   ├── node_modules
│   │   └── ...
│   ├── package.json
│   └── src
│       └── <source files>
└── models
    ├── dist
    │   └── <compiled files>
    ├── node_modules
    │   └── ...
    ├── package.json
    └── src
        └── <source files>
Both the API and application projects are going to use models, so I abstracted that code to a separate sub-module within my project.
I've read the documentation for npm link and that seems to be the right approach because, as I understand it, it symlinks the package in the node_modules dir. This gives access to the code as it exists right now, instead of installing a copy in node_modules. Sounds like what I need, but there is a wrinkle: I'm working on this project from a couple of different places: my laptop, my office at work, and occasionally from home. In addition, others will be contributing to this project in the future.
I would like to make it as simple as possible for a new contributor to get up and running with development.
Currently, a new contributor goes through these steps:
clone the repository
cd into the models dir
run npm install
npm link
cd into the api dir
run npm install
npm link models
cd into the application dir
run npm install
npm link models
start working
What I would like to do (and I think npm should be capable of doing) is:
clone the repository
cd into the models dir
run npm install
cd into the api dir
run npm install
cd into the application dir
run npm install
start working
I could write a script to do this, but it seems like an obvious use case for npm and I suspect that it's probably capable of doing something like this. I think I may be overlooking something because I'm not finding it in the documentation.
There may be better ways to solve this, but...
One possibility is to create a package.json in the root of your project, which handles all of the initialization of your project.
package
├── package.json
This package.json has no dependencies, only scripts that link and install the dependencies for all of the submodules, so a single npm install from the package directory does everything.
If you put the following in a package.json in your package folder:
{
  "name": "package-name",
  "version": "1.0.0",
  "description": "Some description",
  "scripts": {
    "init-models": "cd ./models && npm install && npm link",
    "init-api": "cd ./api && npm install && npm link models",
    "init-app": "cd ./application && npm install && npm link models",
    "postinstall": "npm run init-models && npm run init-api && npm run init-app"
  }
}
You will then just need to run npm install to initialize the whole project.