Netlify: don't deploy specific folders or files - netlify

I have a git repository set up with CD on Netlify. The site itself is 4 files, but I have some other files I'd like to add to the repository that I don't want deployed. Is there a way to deploy only certain files? Or only a specific folder?
My site only requires an HTTP server; there's no npm, Jekyll, or Hugo install. It's just a deployment of 4 files.

If you put the files in a specific folder, you can set the Base Directory in your Build & Deploy settings to that directory, and it will ignore the other files/folders outside that directory.
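For a plain static site with no build step, the same idea can also be expressed in a netlify.toml at the repository root by pointing the publish directory at that folder. A minimal sketch, assuming the four files live in a folder named site/ (the folder name is an assumption):
[build]
  # only the contents of site/ become the deploy; other files in the repo are left out
  publish = "site"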

According to the docs, this is a bit more involved than just setting the base directory:
For this next example, consider a monorepo project set up like this, where blog-1 is the base directory.
repository-root/
├─ package.json
└─ workspaces/
   ├─ blog-1/
   │  ├─ package.json
   │  └─ netlify.toml
   ├─ common/
   │  └─ package.json
   └─ blog-2/
      ├─ package.json
      └─ netlify.toml
The following ignore command example adapts the default behavior so that the build proceeds only if there are changes within the blog-1 or common directories.
[build]
ignore = "git diff --quiet $CACHED_COMMIT_REF $COMMIT_REF . ../common/"

Related

Google Cloud Functions, NodeJS monorepo and local dependencies

So I have this NodeJS monorepo which is structured as follows:
monorepo/
├─ src/
│ ├─ libs/
│ │ ├─ types/
│ │ │ ├─ src/
│ │ │ ├─ package.json
│ ├─ some_app/
│ ├─ gcp_function/
│ │ ├─ src/
│ │ ├─ package.json
├─ package.json
Multiple projects use the same types library, so anytime a type changes, we can update all references at once. It has worked really great so far and I'm really happy with the structure.
Except now I needed to create a function on the Google Cloud Platform. In this function I also reference the types project.
The function's package.json is as follows:
{
  "devDependencies": {
    "@project_name/types": "*",
    "npm-watch": "^0.11.0",
    "typescript": "^4.9.4"
  }
}
The @project_name/types dependency refers to the src/libs/types project. This works for every project, because we have a centralised build tool.
This means the function works locally on my machine and I have no problem developing it, but as soon as I push it to Google Cloud (using the command listed below) I get the following error:
npm ERR! 404 '@project_name/types@*' is not in this registry.
I use this command to deploy from ./monorepo:
gcloud functions deploy gcp-function --gen2 --runtime nodejs16 --trigger-topic some_topic --source .\src\gcp_function
I think it's pretty clear why this happens:
Google Cloud only pushes the function, which then builds on Google Cloud Build. But because the rest of the monorepo doesn't get pushed, it doesn't work, as there is no reference to @project_name/types.
I've been struggling with this issue for quite a while now.
Has anybody ever run into this issue and if so, how did you fix it?
Is there any other way to use local dependencies for Google Cloud Functions? Maybe there is some way to package the entire project and send it to Google Cloud?
I have resolved this issue by using a smart trick in the CI pipeline.
In the CI I take the following steps:
Build the types and use npm pack to package it.
Copy the compressed pack to the function folder, then install it there using npm i @project_name/types@file:./types.tgz. That way the registry reference in package.json gets overridden with the local tarball.
Zip the entire GCP Function and push it to Google Cloud Storage.
Tell Google Cloud Functions to build and run that zip file.
In the end my CI looks a bit like this:
steps:
  - name: Build types library
    working-directory: ./src/libs/types
    run: npm i --ignore-scripts && npm run build && npm pack
  - name: Copy pack to gcp_function and install
    run: cp ../libs/types/*.tgz ./types.tgz && npm i @project_name/types@file:./types.tgz
  - name: Zip the current folder
    run: rm -rf node_modules && zip -r function.zip ./
  - name: Upload the function.zip to Google Cloud Storage
    run: gsutil cp function.zip gs://some-bucket/gcp_function/function.zip
  - name: Deploy to Google Cloud
    run: |
      gcloud functions deploy gcp_function \
        --source gs://some-bucket/gcp_function/function.zip
This has resolved the issue for me. Now I don't have to publish the package to npm (I didn't want to do that, because it doesn't fit the monorepo approach), and I can still develop the types, which get updated live in every other project while developing.

Debug npx environment selection for nested directories?

I have nested directories with different nodejs environments (package.json and node_modules/), and need to run npx under different environments. Something like this:
a/
├─ node_modules/
├─ b/
│ ├─ c/
│ │ ├─ node_modules/
│ │ ├─ package.json
├─ package.json
Recently, I created directory c/, and found that when running a command with npx under c/, it shows a warning related to a/node_modules/. To avoid confusion: both environments have this command (package) installed.
Thus, I would like to check why npx is unexpectedly looking at the upper levels of the directory structure.
Is there a way to output the relevant information?
For example, how do I check which directory's environment npx is running commands in?
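A few built-in npm commands can help inspect which environment a given directory resolves to; this is a short sketch, where mycmd stands in for the command in question (a placeholder name):
cd a/b/c
npm prefix      # the closest directory containing a package.json, i.e. the project root npm resolves from here
npm root        # the node_modules directory that will be used from this location
npm ls mycmd    # whether (and where) mycmd is installed in this environment
Running the same commands from a/ and comparing the output shows which package.json and node_modules each location picks up.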

Does ESLint look up the directory tree in order to find an extended config?

Say you have the eslint-config-next package installed at the root of a monorepo, and you have a base ESLint config at the root which is extended by a child ESLint config in a sub-package:
# monorepo
├── .eslintrc.base.yml
├── apps
│   └── client
│       └── .eslintrc.yml
└── packages
    └── components
        └── .eslintrc.yml
Here's what the child .eslintrc.yml config in packages/components looks like:
extends:
  - "../../.eslintrc.base.yml"
  - "next"
But there is no node_modules directory in packages/components. Does ESLint look up the directory tree (in this case, go up two directories at the root of the monorepo) to locate the next config in the node_modules directory?
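One way to observe the resolution for yourself is to run ESLint with its --debug flag from the sub-package and watch which config files and modules it loads; this is just a sketch for inspecting the behaviour, and src/index.tsx is a placeholder file name:
cd packages/components
npx eslint --debug src/index.tsx
The debug log should show the path from which the "next" shareable config is loaded, revealing whether ESLint resolved it from a node_modules directory further up the tree.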

Run "npm install" as if package was not in workspace

I am working on an NPM workspace node project. To deploy one of the workspace's packages, I would like to run npm install and obtain a node_modules directory as a subdirectory of that package such that the package becomes self-contained.
Consider the directory structure below:
node_modules
packages
├ cloud-app
│ ├ src
│ └ package.json
├ helpers
│ ├ src
│ └ package.json
├ business-logic
│ ├ src
└ └ package.json
package.json
Just one deduplicated node_modules is excellent for development in a monorepo. But to deploy the cloud-app package, I need the structure to look like this:
packages
├ cloud-app
│ ├ node_modules
│ ├ src
│ └ package.json
├ helpers
│ ├ src
│ └ package.json
├ business-logic
│ ├ src
└ └ package.json
package.json
Then, I could upload the cloud-app directory as usual without exposing my NPM workspace to the vendor's (incompatible) CD pipeline.
Is this possible at all? What would be the correct command or procedure here?
This seems to work. Try the workspaces argument:
cd <my_project_root>
npm i --workspaces=false
https://docs.npmjs.com/cli/v9/commands/npm-install#workspaces
While I did not find a standard way to achieve this, there is a slightly hacky way that worked for me:
Copying the node_modules directory allows the package to act as a stand-alone module. However, there is one caveat: the node_modules directory contains a symlink for each package in the workspace, so copying it into one of those packages while following symlinks creates a loop. To prevent this, we first have to delete the symlink to our own package. Therefore, a deploy script could look something like this:
# remove the workspace's own symlink so the copy below doesn't recurse into itself
rm ./node_modules/cloud-app
# -L follows symlinks, so linked workspace packages are copied as real directories
cp -rL ./node_modules ./cloud-app/node_modules
# deploy cloud-app here
I thought of this while formulating the above question but would still be delighted to know whether there is any canonical, supported way to do this.
Something you can try is what @youzen suggested in this post.
Basically, in the root folder of your repository, you can run:
npm install --prefix <path/to/prefix_folder> -g
And so, the node_modules folder will be created in the specified folder.
You can follow the link mentioned above for other solutions. Hope this helps!

Setting up docker nodejs application with local npm dependencies

We want to start containerizing our applications, but we have stumbled upon some issues with local dependencies.
We have a single git repository containing numerous node packages under a "shared" folder, and applications that require these packages.
So let's say our folder structure is as follows:
src/
├── apps
│   └── my_app
└── shared
    └── shared_module
In my_app's package.json we have the following dependency:
{
  "dependencies": {
    "shared-module": "file:../../shared/shared_module"
  }
}
The issue here is that because we want to move "my_app" to run in a container, we need to npm install our local dependency.
Can this be done?
Yes, it's possible but a little bit ugly. The problem for you is that Docker is very restrictive when it comes to its build context. I'm not sure how familiar you are already with that concept, so here is the introduction from the documentation:
The docker build command builds an image from a Dockerfile and a context.
For example, docker build . uses . as its build context, and since it's not specified otherwise, ./Dockerfile as the Dockerfile. Files or paths outside the build context cannot be referenced in the Dockerfile (so no COPY ..).
The issue for you is that during a Docker build, the build context cannot be left. If you have multiple applications that you want to build, you would normally add a Dockerfile for each app.
src/
├── apps
│   ├── my_app
│   │   └── Dockerfile
│   └── my_other_app
│       └── Dockerfile
└── shared
    └── shared_module
Naturally, you would cd into my_app and use docker build . to build the application's Docker image. The issue with this is that you can't access ../../shared from the build, since it's outside of the context.
So you need to make sure both apps and shared are in the build context. One way would be to place all Dockerfiles in src like so:
src/
├── Dockerfile.my_app
├── Dockerfile.my_other
├── apps
│   ├── my_app
│   └── my_other_app
└── shared
    └── shared_module
You can then build the applications by explicitly specifying the context and the Dockerfile:
src$ docker build -f Dockerfile.my_app .
Alternatively, you can keep the Dockerfiles inside my_app and my_other_app, and point to them:
src$ docker build -f apps/my_app/Dockerfile .
That should also work. In both cases, the build is executed from within src, which means you need to pay a little attention to the paths in the Dockerfile. The working directory is still src:
COPY ./apps/my_app /src/apps/my_app
By mirroring the folder structure you have locally, you should be able to make your dependencies work without any changes:
# recreate the local folder layout inside the image
RUN mkdir -p /src
# copy the shared dependency first so the file:../../shared/shared_module reference resolves
COPY ./shared /src/shared
COPY ./apps/my_app /src/apps/my_app
# install from inside the app folder, where the relative path now points at /src/shared/shared_module
RUN cd /src/apps/my_app && npm install
Hope that helps you get started.
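Putting the pieces together, a complete src/Dockerfile.my_app might look roughly like the sketch below; the node:16 base image and the npm start command are assumptions, not part of the original answer:
FROM node:16
# mirror the local folder layout so file:../../shared/shared_module still resolves
RUN mkdir -p /src
COPY ./shared /src/shared
COPY ./apps/my_app /src/apps/my_app
# install from inside the app folder; the relative path now points at /src/shared/shared_module
RUN cd /src/apps/my_app && npm install
WORKDIR /src/apps/my_app
CMD ["npm", "start"]
From src, it would be built as shown above with docker build -f Dockerfile.my_app .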
