Can we run multiple yml files using serverless offline plugins - node.js

I have multiple yml files in different folders; how can I run all of them locally using the serverless-offline plugin?

If I'm understanding your question correctly, you have a structure something like this:
./
  serverless.yml
  /more-yml
    /functions
      lambda-x.yml
      lambda-y.yml
      lambda-z.yml
    /resources
      resource-a.yml
      resource-b.yml
You can write a script which parses all these files, runs any validations you may want on the items within, and returns a file for serverless.yml to use, so that your serverless.yml might look like this:
service: your-service

provider:
  ...

resources: ${file(./scripts/serverless/join-resources.js)}
functions: ${file(./scripts/serverless/join-lambda-functions.js)}
All this script (or these scripts) needs to do is loop over a given directory, load the yml, concatenate each file's yml into a temp file, then resolve with that temp file! For example:
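A minimal sketch of such a script, assuming the layout above and the js-yaml package, and merging in memory instead of writing a temp file:

// scripts/serverless/join-lambda-functions.js -- illustrative sketch
const fs = require('fs');
const path = require('path');
const yaml = require('js-yaml'); // assumed dependency: npm i -D js-yaml

// Folder holding the per-function yml files (an assumption based on the layout above)
const FUNCTIONS_DIR = path.join(__dirname, '..', '..', 'more-yml', 'functions');

// Serverless calls the exported function and uses the resolved value
module.exports = async () => {
  return fs
    .readdirSync(FUNCTIONS_DIR)
    .filter((name) => name.endsWith('.yml'))
    .reduce((merged, name) => {
      const doc = yaml.load(fs.readFileSync(path.join(FUNCTIONS_DIR, name), 'utf8'));
      return Object.assign(merged, doc); // run any validations here before merging
    }, {});
};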

Related

reading .env file from node - env file is not published

I am trying to read a .env file using the "dotenv" package, but process.env.DB_HOST returns undefined after publishing to Cloud Run. When I log all the files, I see every file in the root directory except the .env file, even though the .env file is in the root of my project. Not sure why it isn't getting pushed to gcloud, or is it? I do get a value for process.env.DB_HOST when I test locally.
I used this command to publish to google run.
gcloud builds submit --tag gcr.io/my-project/test-api:1.0.0 .
If you don't have a .gcloudignore file in your project, the gcloud CLI uses the .gitignore by default.
Create a .gcloudignore and list the files that you don't want to upload when you use gcloud CLI commands. And don't put the .env file in it!
EDIT 1
When you add a .gcloudignore, the gcloud CLI no longer reads the .gitignore file and uses the .gcloudignore instead.
Therefore, you can define 2 different logics:
.gitignore lists the files that you don't want to push to the repository. Put the .env file in it so it is NOT committed.
.gcloudignore lists the files that you don't want to send with the gcloud CLI. DON'T put the .env file in it, so it is included when you send your code with the gcloud CLI.
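An illustrative .gcloudignore, where .env is deliberately absent so it is uploaded with your code (the other entries depend on your project):

# .gcloudignore -- note: .env is intentionally NOT listed here
.gcloudignore
.git
.gitignore
node_modules/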

AWS Elastic beanstalk - My deployed app can't seem to write pdf's into this directory i've set up in my project folder

I am currently using Node.js deployed with Elastic Beanstalk on AWS. I have a function that writes a PDF and then emails it off, but it says the file path can't be found. I've verified the project folder appears to be /var/app/current/, but changing the file path reference doesn't remove the error. Any idea how to go about fixing this?
The /var/app/current/ folder does not exist initially. It's only created at the very last stage of your deployment.
The deployment happens in the /var/app/staging/ folder, and at the very end, once everything finishes, /var/app/staging/ is moved to /var/app/current/.
Thus, I would not recommend using absolute paths in your project or config files. It's better to use relative paths, or container_commands for config scripts:
The specified commands run as the root user, and are processed in alphabetical order by name. Container commands are run from the staging directory, where your source code is extracted prior to being deployed to the application server.
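For example, a hypothetical .ebextensions config that creates the PDF output folder relative to the app root during deployment (the file and folder names are assumptions):

# .ebextensions/01-pdf-dir.config -- illustrative
container_commands:
  01_make_pdf_dir:
    # runs from the staging folder, so the directory ends up in /var/app/current/
    command: "mkdir -p generated-pdfs"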

What is the best approach to handle env files?

I have .env file at my project root directory.
How should I handle .env file for dev, qa, stage and prod?
Should I include them in the git repo? If not, where should I put them? A different folder on an external drive, for example?
What is the correct extension? .env.qa or .qa.env?
If I build my bundle with webpack into the dist folder (server side), should I include the env file, or manually copy it to the dist folder?
You should not check your env files into any source control. Any of those secrets will be forever available to anyone with access to the repo until the history is rewritten to remove them.
If you use AWS services, for example, I would suggest using the Secrets Manager.
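A minimal sketch of reading a secret at runtime with the AWS SDK v3 (the secret name, region, and JSON shape are assumptions):

// assumes: npm i @aws-sdk/client-secrets-manager, and a JSON secret named "my-app/prod"
const { SecretsManagerClient, GetSecretValueCommand } = require('@aws-sdk/client-secrets-manager');

async function loadSecrets() {
  const client = new SecretsManagerClient({ region: 'us-east-1' });
  const { SecretString } = await client.send(
    new GetSecretValueCommand({ SecretId: 'my-app/prod' })
  );
  return JSON.parse(SecretString); // e.g. { DB_HOST: '...', DB_PASSWORD: '...' }
}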
Any environment variables introduced to Webpack should not be secrets but configuration values. Anyone who can view the source can read those values. If you need environment-specific configuration, the Webpack DefinePlugin will replace vars like MY_API_HOST with their values, given a config like the following:
const webpack = require('webpack');

const plugins = [
  new webpack.DefinePlugin({
    MY_API_HOST: JSON.stringify('https://my-domain.com/api/'),
    MY_API_VERSION: JSON.stringify('v2')
  })
];
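At build time the plugin substitutes those identifiers wherever they appear in your source, so a call like this:

// becomes fetch('https://my-domain.com/api/' + 'v2' + '/users') after the build
fetch(MY_API_HOST + MY_API_VERSION + '/users');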
The config module is an easy way to address different env-specific values. Read about the config module at https://www.npmjs.com/package/config. You keep a config folder in the repository with env-specific files, and I like this approach because the files are in the repository but very well separated. It provides a really easy way to set default values, override environment-specific values, etc. It is also very convenient to switch between the environment-specific files by setting the appropriate node environment variable (export NODE_ENV=development, acceptance, or production).
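A minimal sketch, assuming config/default.json and config/production.json exist in the repo:

// config/default.json:    { "db": { "host": "localhost" } }
// config/production.json: { "db": { "host": "prod-db.example.com" } }
const config = require('config'); // npm i config

// with NODE_ENV=production this reads production.json, otherwise default.json
const dbHost = config.get('db.host');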

Azure functions github deployment from subfolder

I'm using Azure functions with GitHub deployment. I would like to place the function in src/server/functionName/ within the repo, but the deployment works only if the function is placed directly in functionName/
How do I deploy functions that are placed in subdirectories?
The documentation states
your host.json file and function folders should be directly at the root of what you deploy.
but "should" doesn't mean "must", right?
What I tried:
Various combinations of locations of host.json and function.json
In host.json I set routePrefix but it seems to affect only the function's URL: "http": { "routePrefix": "src/server" },
There are a few ways you can customize the deployment process. One way is by adding a custom deployment script to your repository root. When a .deployment script exists Azure will run that script as part of the deployment process as detailed here. E.g. you can write a simple script that copies the files and directories from your sub directory \src\server to the root, e.g.:
@echo off
echo Deploying Functions ...
xcopy "%DEPLOYMENT_SOURCE%\src\server" %DEPLOYMENT_TARGET% /Y /S
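The .deployment file at the repository root then points Kudu at that script (the script name deploy.cmd is an assumption):

[config]
command = deploy.cmd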
If you don't want to commit a .deployment file to your repo and your requirements are relatively simple, you can also do this via app settings by adding a PROJECT app setting to your function app with the value being your sub directory, e.g. src\server.
Try setting the AZURE_FUNCTIONAPP_PACKAGE_PATH variable in .github/workflows/<something>.yml:
env:
  AZURE_FUNCTIONAPP_PACKAGE_PATH: 'azure/api'   # set this to the path of your web app project, defaults to the repository root
  DOTNET_VERSION: '3.1.301'                     # set this to the dotnet version to use

Deployment specific files in NodeJS

I am running my NodeJS project on DotCloud. Sadly, DotCloud's deployment is "project-intrusive"; that is, it requires a supervisord.conf file to reside in the app root. My deployment setup looks like this (using git repos):
project-deploy.git/prod/dotcloud.yml
project-deploy.git/prod/project -> project.git
(/prod/project uses project.git as a submodule to access the code)
Now, my thought is that I would eventually end up with different environments like this, e.g. dev, test, and stage. The dev environment wouldn't even have a dotcloud.yml file, since it is expected to run everything locally.
Well, this works pretty well. But the problem is the supervisord.conf file, which is just for deployment to DotCloud; it now resides in the project.git repo, but it doesn't belong there since it is only for deployment.
Are there any modules or NodeJS scripts that let you put deployment configuration files elsewhere, and maybe even specify what the target environment is, e.g. node deploy.js --production, or something like that?
There is a way to get rid of supervisord.conf. Assuming that you want to run e.g. node app.js, you can put the following in dotcloud.yml:
www:
  type: nodejs
  process: node app.js
Now, of course, it doesn't solve the problem of the dotcloud.yml file itself; but at least it reduces clutter a little bit by removing supervisord.conf from the app root.
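For the node deploy.js --production idea from the question, a small script can copy environment-specific files into place before deploying; a hypothetical sketch (the deploy/<env>/ layout is an assumption):

// deploy.js -- illustrative only
const fs = require('fs');

const env = process.argv.includes('--production') ? 'prod' : 'dev';

// copy the environment-specific files into the app root before pushing
for (const file of ['dotcloud.yml', 'supervisord.conf']) {
  const src = `deploy/${env}/${file}`;
  if (fs.existsSync(src)) {
    fs.copyFileSync(src, file);
    console.log(`Copied ${src} -> ${file}`);
  }
}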
