Serverless shared layers (node_modules) for all lambdas in multiple services - node.js

I currently have the following structure for the project:
project/
├── node_modules
├── services
│   ├── service_1
│   │   ├── src
│   │   │   └── lambdas
│   │   ├── tsconfig.json
│   │   └── serverless.yml
│   └── service_2
│       ├── src
│       │   └── lambdas
│       ├── tsconfig.json
│       └── serverless.yml
├── serverless.yml
├── tsconfig.json
└── package.json
Earlier, service_1 and service_2 each had their own package.json, but now my project has become bigger.
What I'm trying to do is have only one global node_modules for the whole project.
I have a pipeline build for my project in a few stages.
For example, the main build uses the code below:
phases:
  install:
    runtime-versions:
      nodejs: 14
    commands:
      - npm install -g serverless@2.66.2
      - npm i
  build:
    commands:
      - sls deploy --force --stage $ENV_NAME
For the services the buildspec is the same; only the build phase changes:
  build:
    commands:
      - cd services/service_1
      - sls deploy --force --stage $ENV_NAME
So each build has to run "npm i", and that's the problem. I want to avoid the duplicated "npm i" per build: install all packages once in the first build and somehow share that node_modules with every function deployed in later builds.
I've read that Serverless has "layers", which can hold a shared folder of node_modules that each lambda then references.
How can I run "npm i" only in the first build and create a reference for every function inside service_1 and service_2?
I understand that inside each service's serverless.yml I need to add layers for each function, but I don't understand how to set up a global layer for all functions in the first build.
The code below is in each service's serverless.yml:
layers:
  NodeLayer:
    path: ../../layers
    name: node-layer-${self:provider.stage}

functions:
  myLambdaFunc:
    handler: src/lambdas.myLambdaFunc
    layers:
      - { Ref: NodeLayerLambdaLayer }
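
One way this could be wired up (only a sketch; the service name shared-deps, the layer key node, and the layers/nodejs path are assumptions, not taken from your setup) is to let the root serverless.yml own the layer and publish it once in the first build, and then have every service import the published ARN instead of declaring its own layers block. AWS expects Node.js layer content under nodejs/node_modules, so the first build would run npm i and copy node_modules into layers/nodejs/ before deploying the root stack:

# project/serverless.yml - deployed once by the first build
# (assumes node_modules was copied beforehand, e.g.
#  mkdir -p layers/nodejs && cp -R node_modules layers/nodejs/)
service: shared-deps
provider:
  name: aws
  runtime: nodejs14.x
  stage: ${opt:stage, 'dev'}

layers:
  node:
    path: layers                            # folder that contains nodejs/node_modules
    name: node-layer-${self:provider.stage}

Each service can then drop its local layers block and reference the ARN through the CloudFormation output the Serverless Framework generates for a layer (for a layer key node that output is NodeLambdaLayerQualifiedArn), resolved with the cf: variable:

# services/service_1/serverless.yml - no local layers block needed
functions:
  myLambdaFunc:
    handler: src/lambdas.myLambdaFunc
    layers:
      - ${cf:shared-deps-${self:provider.stage}.NodeLambdaLayerQualifiedArn}

With that split, only the first build stage runs npm i (plus the copy into layers/nodejs/), while the service builds just cd into their folder and run sls deploy. Keep in mind the service builds may still need dev dependencies if the TypeScript handlers are compiled during packaging.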

Related

Google Cloud Functions, NodeJS monorepo and local dependencies

So I have this NodeJS monorepo which is structured as follows:
monorepo/
├─ src/
│ ├─ libs/
│ │ ├─ types/
│ │ │ ├─ src/
│ │ │ ├─ package.json
│ ├─ some_app/
│ ├─ gcp_function/
│ │ ├─ src/
│ │ ├─ package.json
├─ package.json
Multiple projects use the same types library, so anytime a type changes, we can update all references at once. It has worked really great so far and I'm really happy with the structure.
Except now I needed to create a function on the Google Cloud Platform. In this function I also reference the types project.
The function's package.json is as follows:
{
  "devDependencies": {
    "@project_name/types": "*",
    "npm-watch": "^0.11.0",
    "typescript": "^4.9.4"
  }
}
The @project_name/types dependency refers to the src/libs/types project. This works with every project, because we have a centralised build tool.
This means the function works locally on my machine and I have no problem developing it, but as soon as I push it to Google Cloud (using the command listed below) I get the following error:
npm ERR! 404 '@project_name/types@*' is not in this registry.
I use this command to deploy from ./monorepo:
gcloud functions deploy gcp-function --gen2 --runtime nodejs16 --trigger-topic some_topic --source .\src\gcp_function
I think it's pretty clear why this happens:
Google Cloud only pushes the function, which then builds on Google Cloud Build. But because the rest of the monorepo doesn't get pushed, it doesn't work as it has no reference to @project_name/types.
I've been struggling with this issue for quite a while now.
Has anybody ever run into this issue and if so, how did you fix it?
Is there any other way to use local dependencies for Google Cloud functions? Maybe there is some way to package the entire project and send it to Google Cloud?
I have resolved this issue by using a smart trick in the CI pipeline.
In the CI I take the following steps:
Build the types library and use npm pack to package it.
Copy the compressed pack to the function folder. There, install it using npm i @project_name/types@file:./types.tgz. That way the dependency entry in the package.json gets overridden.
Zip the entire GCP function and push it to Google Cloud Storage.
Tell Google Cloud Functions to build and run that zip file.
In the end my CI looks a bit like this:
steps:
  - name: Build types library
    working-directory: ./src/libs/types
    run: npm i --ignore-scripts && npm run build && npm pack
  - name: Copy pack to gcp_function and install
    run: cp ../libs/types/*.tgz ./types.tgz && npm i @project_name/types@file:./types.tgz
  - name: Zip the current folder
    run: rm -rf node_modules && zip -r function.zip ./
  - name: Upload the function.zip to Google Cloud Storage
    run: gsutil cp function.zip gs://some-bucket/gcp_function/function.zip
  - name: Deploy to Google Cloud
    run: |
      gcloud functions deploy gcp_function \
        --source gs://some-bucket/gcp_function/function.zip
This has resolved the issue for me. Now I don't have to publish the package to NPM (I didn't want to do that, because it doesn't fit in the monorepo narrative). And I can still develop the types which get updated live in every other project while developing.

Best way to run multiple package.json build processes with a single Heroku dyno?

I have a Laravel application that is hosted on Heroku and serves multiple Javascript games from a single domain. Everything is contained in a single Git repo. The back-end application provides OAuth and other features that are shared by all of the games. Each game is a standalone React app with its own separate node_modules folder. The folder structure looks like this:
├── package.json
├── node_modules
├── games/
│   ├── checkers/
│   │   ├── package.json
│   │   └── node_modules
│   ├── chess/
│   │   ├── package.json
│   │   └── node_modules
│   └── mahjongg/
│       ├── package.json
│       └── node_modules
└── public/
    ├── app-compiled.json
    └── games/
        ├── checkers/
        │   └── app-compiled.js
        ├── chess/
        │   └── app-compiled.js
        └── mahjongg/
            └── app-compiled.js
Each time I push a new version of this up to Heroku, I need to run a separate build command to compile the main web application and each of the separate React games. Basically, this is what needs to happen:
npm install
npm run production
cd games/checkers
npm install
npm run production
cd ../chess
npm install
npm run production
cd ../mahjongg
npm install
npm run production
Those commands will compile the React games and place each one under public/games/{slug}/app-compiled.js where they can be served by the common web application.
I already tried adjusting the heroku-postbuild script in my main package.json file to look like this:
"heroku-postbuild": "npm run production && cd games/checkers && npm install && npm run production && cd ../chess && npm install && npm run production && cd ../mahjongg && npm install && npm run production"
That actually works, but I'm worried that it might break at some point in the future as the build process gets bigger and more complex. Is there a better, more supported way to accomplish what I'm trying to do here?
Note that I am not open to the idea of running a separate Heroku dyno for each React app. I am eventually going to have about 30 games, and it would be cost-prohibitive to run each on a separate dyno.

Yarn workspaces build with docker

Consider the following file structure of yarn workspaces:
.
├── docker-compose.yaml
├── package.json
├── packages
│   └── pkg-1
│       ├── dist
│       ├── package.json
│       ├── src
│       └── tsconfig.json
├── services
│   ├── api-1
│   │   ├── dist
│   │   ├── Dockerfile
│   │   ├── package.json
│   │   ├── src
│   │   ├── tsconfig.json
│   │   └── yarn.lock
│   └── client-1
│       ├── package.json
│       ├── src
│       └── yarn.lock
├── tsconfig.json
└── yarn.lock
I have written a Dockerfile to create an image for api-1:
ARG APP_DIR=/usr/app
# Build stage
FROM node:16.2-alpine AS build
ARG APP_DIR
WORKDIR ${APP_DIR}
COPY package.json ./
COPY yarn.lock ./
COPY tsconfig.json ./
WORKDIR ${APP_DIR}/packages/pkg-1
COPY packages/pkg-1/package.json ./
RUN yarn --pure-lockfile --non-interactive
COPY packages/pkg-1/tsconfig.json ./
COPY packages/pkg-1/src/ ./src
RUN yarn build
WORKDIR ${APP_DIR}/services/api-1
COPY services/api-1/package.json ./
COPY services/api-1/yarn.lock ./
RUN yarn --pure-lockfile --non-interactive
COPY services/api-1/tsconfig.json ./
COPY services/api-1/src/ ./src
RUN yarn build
# Production stage
FROM node:16.2-alpine AS prod
ARG APP_DIR
WORKDIR ${APP_DIR}
COPY --from=build ${APP_DIR}/package.json ./
COPY --from=build ${APP_DIR}/yarn.lock ./
WORKDIR ${APP_DIR}/packages/pkg-1
COPY --from=build ${APP_DIR}/packages/pkg-1/package.json ./
RUN yarn --pure-lockfile --non-interactive --production
COPY --from=build ${APP_DIR}/packages/pkg-1/dist ./dist
WORKDIR ${APP_DIR}/services/api-1
COPY --from=build ${APP_DIR}/services/api-1/package.json ./
COPY --from=build ${APP_DIR}/services/api-1/yarn.lock ./
RUN yarn --pure-lockfile --non-interactive --production
COPY --from=build ${APP_DIR}/services/api-1/dist ./dist
CMD ["node", "dist"]
The build is run from the root docker-compose.yaml to get the proper context:
services:
  api-1:
    image: project/api-1
    container_name: api-1
    build:
      context: ./
      dockerfile: ./services/api-1/Dockerfile
      target: prod
    ports:
      - 3000:3000
It is working, but this way there will be a lot of repetition as the application grows. The problem is the way the packages are built.
A package can be, for example, a collection of normalized components used among client services, or a collection of normalized errors used among API services.
Whenever I build a service, I first need to build the packages it depends on, which is an unnecessarily repetitive task. Not to mention that the build steps of each package are defined over and over again in the Dockerfile of every single service that uses it.
So my question is: is there a way to create, for example, an image of a package that can be used when building a service, to avoid defining the package's build steps in every service's Dockerfile?
A while ago I posted an answer detailing how I structured a monorepo with multiple services and packages.
The "trick" is to copy all the packages that your service depends on, as well as the project root package.json. Then running yarn --pure-lockfile --non-interactive --production once will install the dependencies for all the sub-packages, since they are part of the workspace.
The example linked isn't using TypeScript, but I believe this could be easily achieved with a postinstall script in every package.json that runs yarn build.
It seems like you are looking for something that gives you the option to have a "parent" package.json, so you only have to invoke "build" on one package and with that build the whole dependency tree.
e.g.:
- package.json          // root package
  - a
    - package.json      // module a package
  - b
    - package.json      // module b package
You might want to look into the following:
npm workspaces
lerna
Both support structures like the one mentioned, lerna has just a lot more features. To get a quick grasp on the differences, look here: Is Lerna needed anymore with NPM 7.0.0's workspaces?

Setting up docker nodejs application with local npm dependencies

We want to start containerizing our applications, but we have stumbled upon some issues with local dependencies.
We have a single git repository, in which we have numerous node packages under a "shared" folder, and applications that require these packages.
So let's say our folder structure is as follows:
src/
├── apps
│   └── my_app
└── shared
    └── shared_module
In my_app's package.json we have the following dependency:
{
  "dependencies": {
    "shared-module": "file:../../shared/shared_module"
  }
}
The issue here is that because we want to move "my_app" to run in a container, we need to npm install our local dependency.
Can this be done?
Yes, it's possible but a little bit ugly. The problem for you is that Docker is very restrictive when it comes to its build context. I'm not sure how familiar you are already with that concept, so here is the introduction from the documentation:
The docker build command builds an image from a Dockerfile and a context.
For example, docker build . uses . as its build context, and since it's not specified otherwise, ./Dockerfile as the Dockerfile. Files or paths outside the build context cannot be referenced in the Dockerfile (so no COPY ..).
The issue for you is that during a Docker build, the build context cannot be left. If you have multiple applications that you want to build, you would normally add a Dockerfile for each app.
src/
├── apps
│   ├── my_app
│   │   └── Dockerfile
│   └── my_other_app
│       └── Dockerfile
└── shared
    └── shared_module
Naturally, you would cd into my_app and use docker build . to build the application's Docker image. The issue with this is that you can't access ../../shared from the build, since it's outside of the context.
So you need to make sure both apps and shared are in the build context. One way would be to place all Dockerfiles in src like so:
src/
├── Dockerfile.my_app
├── Dockerfile.my_other
├── apps
│   ├── my_app
│   └── my_other_app
└── shared
    └── shared_module
You can then build the applications by explicitly specifying the context and the Dockerfile:
src$ docker build -f Dockerfile.my_app .
Alternatively, you can keep the Dockerfiles inside my_app and my_other_app, and point to them:
src$ docker build -f apps/my_app/Dockerfile .
That should also work. In both cases, the build is executed from within src, which means you need to pay a little attention to the paths in the Dockerfile. The working directory is still src:
COPY ./apps/my_app /src/apps/my_app
By mirroring the folder structure you have locally, you should be able to make your dependencies work without any changes:
RUN mkdir -p /src
COPY ./shared /src/shared
COPY ./apps/my_app /src/apps/my_app
RUN cd /src/apps/my_app && npm install
Hope that helps you get started.
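
If you end up driving these builds from docker-compose instead of calling docker build by hand, the same idea carries over: keep the context at src and point each service at its own Dockerfile. A minimal sketch (the service and image names below are placeholders based on the layout above, not part of the original setup):

# src/docker-compose.yaml - every app shares src as its build context
services:
  my_app:
    image: my_app                  # placeholder image name
    build:
      context: .
      dockerfile: apps/my_app/Dockerfile
  my_other_app:
    image: my_other_app            # placeholder image name
    build:
      context: .
      dockerfile: apps/my_other_app/Dockerfile

Running docker compose build from src then builds both images with shared/ inside the context, so the COPY ./shared /src/shared step keeps working for each of them.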

Node/NPM managing local packages

TL;DR How do I configure different "sub-modules" in a modular node.js project to refer to one another as simply as possible?
I'm trying to wrap my head around local packages for NPM, specifically as they relate to a modular project.
I'm building a web app with a front end and a back end API. These need to share a package which exports simple models. My project directory structure looks like this:
package
├── api
│   ├── dist
│   │   └── <compiled files>
│   ├── node_modules
│   │   └── ...
│   ├── package.json
│   └── src
│       └── <source files>
├── application
│   ├── dist
│   │   └── <compiled files>
│   ├── node_modules
│   │   └── ...
│   ├── package.json
│   └── src
│       └── <source files>
└── models
    ├── dist
    │   └── <compiled files>
    ├── node_modules
    │   └── ...
    ├── package.json
    └── src
        └── <source files>
Both the API and application projects are going to use models, so I abstracted that code to a separate sub-module within my project.
I've read the documentation for npm link and that seems to be the right approach because, as I understand it, it symlinks the package in the node_modules dir. This gives access to the code as it exists right now, instead of installing a copy in node_modules. Sounds like what I need, but there is a wrinkle: I'm working on this project from a couple of different places: my laptop, my office at work, and occasionally from home. In addition, others will be contributing to this project in the future.
I would like to make it as simple as possible for a new contributor to get up and running with development.
Currently, a new contributor goes through these steps:
clone the repository
cd into the models dir
run npm install
npm link
cd into the api dir
run npm install
npm link models
cd into the application dir
run npm install
npm link models
start working
What I would like to do (and I think npm should be capable of doing) is:
clone the repository
cd into the models dir
run npm install
cd into the api dir
run npm install
cd into the application dir
run npm install
start working
I could write a script to do this, but it seems like an obvious use case for npm and I suspect that it's probably capable of doing something like this. I think I may be overlooking something because I'm not finding it in the documentation.
There may be better ways to solve this, but..
One possibility is to create a package.json in the root of your project, which handles all of the initialization of your project.
package
├── package.json
And the contents of that package.json have no dependencies, only scripts to link and install the dependencies for all of the submodules, using just npm install from the package directory.
If you put the following in a package.json in your package folder:
{
  "name": "package-name",
  "version": "1.0.0",
  "description": "Some description",
  "scripts": {
    "init-models": "cd ./models && npm install && npm link",
    "init-api": "cd ./api && npm install && npm link models",
    "init-app": "cd ./application && npm install && npm link models",
    "postinstall": "npm run init-models && npm run init-api && npm run init-app"
  }
}
You will just need to npm install to initialize the whole project.
