I'm currently moving my backend from a polyrepo (multirepo) setup to a monorepo.
I've successfully moved the shared packages into the Lerna monorepo; now it's time for the microservices.
My project structure is as simple as this:
root
├── services
│   ├── service_written_in_node
│   │   ├── Dockerfile
│   │   └── package.json
│   └── service_written_in_something_else
│       └── Dockerfile
└── packages
    ├── shared_package_1
    │   └── package.json
    └── shared_package_2
        └── package.json
So basically everything in the packages directory should be published to npm, and everything in the services directory should be published to the Docker registry.
You can use the scripts functionality of package.json and add a postinstall script there.
e.g.:
"scripts": {
  "postinstall": "npm run docker-build-publish",
  "docker-build-publish": "docker build .... {ADD HERE SCRIPTS TO PUBLISH}"
}
Have this in each of your services that has a Dockerfile.
At the root of the repo you can use lerna bootstrap as your postinstall script to trigger an install in each service.
You can always check the other pre and post scripts that may fit your use case better: https://docs.npmjs.com/cli/v8/using-npm/scripts
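A sketch of what that wiring could look like at the repo root, assuming Lerna v6 or earlier (where lerna bootstrap still exists); the package name and version ranges are illustrative:

```json
{
  "name": "root",
  "private": true,
  "devDependencies": {
    "lerna": "^6.0.0"
  },
  "scripts": {
    "postinstall": "lerna bootstrap"
  }
}
```

Running npm install at the root then cascades into each service, whose own postinstall triggers its Docker build-and-publish script.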
Related
We're using the Storybook platform to build UI components and pages in isolation. According to the Storybook documentation, an npm package can be published with the npm publish command after it's been built. But first they say it needs to be built with this command:
{
"scripts": {
"build": "cross-env BABEL_ENV=production babel src -d dist"
}
}
This creates a dist folder with the compiled output.
My issue is that it doesn't include any of the .css or .json files that are part of the project, so I have to move them there manually.
Any idea how can I configure the project to include them automatically so I don't have to do it each time I run the build command?
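Assuming the Babel CLI build shown above, one commonly used option is --copy-files, which copies files Babel doesn't compile (such as .css and .json) into the output directory alongside the compiled code; this is a sketch, not a confirmed fix for this exact project:

```json
{
  "scripts": {
    "build": "cross-env BABEL_ENV=production babel src -d dist --copy-files"
  }
}
```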
I've created a monorepo with Node.js + TypeScript to publish a REST endpoint, using yarn workspaces. The server will run on Google Cloud Run and use Google Container Registry.
The structure is the following:
my-project/
├── common/
│   └── package.json
├── server/
│   ├── src/
│   ├── Dockerfile
│   └── package.json
└── client/
    └── (react client that consumes what the server produces)
common is a "shared" folder across the projects, so there is a single package for the types. I used the scoped name @my-project/common for the common package to avoid collisions.
Then I reference the common package inside my server's code.
In the server's package.json:
"dependencies": {
...
"@my-project/common": "1.0.0",
...
}
Then in the actual code:
import { MyType } from '@my-project/common/type'
If it's worth anything, here's also the Dockerfile:
FROM node:15-alpine
WORKDIR /usr/src/app
COPY package*.json ./
RUN yarn install
COPY ./dist ./
CMD [ "node", "server.js" ]
The build chain is:
1. cd to the server's folder
2. call the build script with yarn, which creates the dist/ folder with the compiled code
3. run gcloud builds submit --tag gcr.io/my-gcp-project-12345/my-gcp-project to push to GCR
Here's the problem: Cloud Build can't find the common package:
An unexpected error occurred: "https://registry.yarnpkg.com/@my-project%2fcommon: Not found".
Is it possible to find a solution to this, without pushing the common package to a private registry?
We have a similar case, but with Cloud Functions; maybe it can help you.
Before deploying the functions we have to run yarn pack on our @company/common lib and then yarn add file:packName.tgz in the functions package.
In our case, since we are still using package-lock.json on the deploy side, we then have to run npm i --package-lock-only just to update the integrity field in package-lock.json.
After deployment we have to revert the package.json back to our version of the lib.
Quite a "hack", but it works.
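A sketch of that pack-and-add dance, assuming yarn 1.x (for the --filename flag); the package and folder names are illustrative:

```shell
# In the shared lib: produce a tarball of @company/common
cd common
yarn pack --filename company-common.tgz

# In the functions package: temporarily depend on the tarball
cd ../functions
yarn add file:../common/company-common.tgz

# The deploy side still uses package-lock.json, so refresh its integrity hashes
npm i --package-lock-only

# ...deploy, then revert package.json to the workspace version of the lib
```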
I'm using Node.js 10 and npm 6.9.
I wanted to create two projects, one dependent on the other.
So I created these folders
-myapps
---parentproj
---childproj
Then I did these steps:
1. go into the parentproj folder
2. execute "npm create"
3. execute "npm install fs-extra" (to add a third-party reference)
4. go into the childproj folder
5. execute "npm create"
6. execute "npm install ..\parentproj"
Now the childproj folder contains both the package.json and package-lock.json files.
If I run "npm ci" I get this error:
"npm ERR! fs-extra not accessible from parentproj"
Moreover, if I run "npm ls" from the childproj folder I get this message:
`-- UNMET DEPENDENCY fs-extra@^7.0.1
Am I doing something wrong?
What is the correct way to work with local packages without publishing them?
Regards.
I have a Node.js + Express application. To deploy it to my server, the partner is asking me to "build" the app into a folder called "dist", where all the files that need to be deployed to the server will exist. How can I implement such a build?
Any hint or guidance would be appreciated.
You could create a script in your package.json which does this. You simply need to create the directory and copy everything required for running your application in production to it, and no more.
//package.json
{
//...
"scripts": {
"dist": "mkdir -p dist && cp -R node_modules src server.js ... dist"
}
//...
}
Note that the above is not cross-platform compatible. This is always the complex part of such build scripts. If this is an issue for you, I'd recommend looking at available tooling such as gulp.
You can also use an npm lifecycle hook to do this automatically as part of your install. Ensure you also run npm install --production rather than npm install to omit your dev dependencies.
I'm searching for a good way to build a multi-project application.
At the moment I have this structure; every app is a Node.js application:
- parent folder (git root)
|- app1
|-- app1-backend
|-- app1-frontend
|- app2
|- app3
At the moment I need to install every app by hand with the following steps:
1. install npm modules with npm install
2. install typings with typings install
3. compile the app with tsc
Every app folder (app1-backend, app1-frontend, app2, app3) contains the following: tsconfig.json, package.json, typings.json.
Should I automate that with Grunt? Should I use a separate Grunt file for each project?
greets
Since it's already 3 self-contained commands, you can probably get by with just adding a script in the package.json of each project that handles all its build commands, i.e.:
{
"name": "project-name",
...
"scripts": {
"build": "npm install && typings install && tsc"
}
}
This will allow you to just run npm run build to run all 3 commands for any given project.
Then you can just run
(cd /path/to/project1 && npm run build) & (cd /path/to/project2 && npm run build) & (cd /path/to/project3 && npm run build)
This will build all 3 simultaneously.
Note: I'm assuming npm will handle multiple concurrent processes, but you may have to run them sequentially.
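If the parallel runs do step on each other, a sequential variant of the same one-liner (paths are placeholders, as above) would be:

```shell
# Build each project one after another; stops subshells independently on cd failure
for dir in /path/to/project1 /path/to/project2 /path/to/project3; do
  (cd "$dir" && npm run build)
done
```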
It is possible to use Grunt to run arbitrary shell commands, for example with grunt-shell; however, for me personally it doesn't make sense to have a build process in one project that causes another project to build.