How to ignore npm scripts when deploying to PaaS/Modulus - node.js

When deploying to Modulus.io (this probably applies to other PaaS providers as well), they will install the required packages from the package.json file. As part of the install process, some npm scripts might be called as well, for example postinstall. However, these scripts might not be able to run (or should not run) in production, either because they rely on tools that are only available locally or because they simply make no sense in production.
How can I detect the environment and execute or skip certain npm scripts? Can I access the process.env object and handle the scripts appropriately, or is there a better way?

Unfortunately, you can't define a script in your package.json for a specific environment only.
Let's say you have a postinstall script declared like this in package.json:
{
  "scripts": {
    "postinstall": "node postInstall.js"
  }
}
The "easy" way would be to add your logic regarding the environment in this postInstall.js script:
if (process.env.NODE_ENV === 'production') {
  // Skip this script in production; exit cleanly so the install itself still succeeds
  process.exit(0);
}
If you're running in the production environment, this simply instructs Node.js to terminate the process immediately; exiting with code 0 means the postinstall hook is treated as successful, so the npm install itself does not fail.
If you run multiple scripts in the postinstall hook, you could also move them all into a single wrapper script that uses the same mechanism: exit early in certain environments, and otherwise execute all the other scripts (see the sketch below).
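A minimal sketch of such a wrapper, assuming the file is called postInstallAll.js and the individual step commands are placeholders for whatever you actually run locally:
// postInstallAll.js - hypothetical wrapper around several post-install steps
const { execSync } = require('child_process');

if (process.env.NODE_ENV === 'production') {
  // Skip everything in production, but exit with 0 so the install itself succeeds
  process.exit(0);
}

// Run each local-only step in order (the commands are placeholders)
['node postInstall.js', 'node generateFixtures.js'].forEach((cmd) => {
  execSync(cmd, { stdio: 'inherit' });
});
The postinstall entry in package.json would then simply be "node postInstallAll.js".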
Another approach if you're always running on Unix systems is to check directly the Node.js environment using a Bash condition:
{
  "scripts": {
    "postinstall": "[ \"$NODE_ENV\" = production ] || node postInstall.js"
  }
}
In this case, postInstall.js only runs when the Node environment is not production; when it is production, the test itself succeeds, so the hook still exits cleanly and the install does not fail. You can adjust the condition to other cases, for example running only in development.

Related

Set environment variable through Typescript script

I want to create a hash at build time and set it as an environment variable that is accessible from Node.
First I wrote a bash script, exported the environment variable in the script, and sourced it from package.json.
The problem is that node doesn't know the source command.
Now I rewrote the script in Typescript (due to the whole project using TS not JS).
In the script I set the variable as follows:
process.env.VARIABLE = hashFunction(path);
The function is called through a script in package.json
"hash": "ts-node path/to/script.ts"
The function works as it should, but the environment variable is not set. Can someone help me resolve this? Is it possible to return the string from the script and set the variable outside of it?
If possible, I'd like to avoid using an external package.
Thank you :)
Update:
I used a bash script, but a TypeScript script would work the same way; for bash, the console.log is simply replaced with echo.
script.ts
console.log("2301293232") // The hash created by the script
package.json
"scripts": {
"build": "yarn run hash react-scripts build", // omit &&
"hash": "ENV_VAR=$(ts-node script.ts)"
}
So what I did: the script writes the checksum to standard output, and I capture that output and set it as an environment variable directly in package.json. Note the deliberately omitted && between yarn run hash and react-scripts build: yarn appends the extra arguments to the hash script, so the whole line expands to ENV_VAR=$(ts-node script.ts) react-scripts build, and the build runs in the same shell that sets the variable. This only works because it is the same process that starts the build.
That is why neither
"scripts": {
"build": "yarn run hash && react-scripts build"
}
nor
"scripts": {
"build": "react-scripts build",
"prebuild": "ENV_VAR=$(ts-node script.ts)"
}
will work. In both examples a new process will be started and the environment variable will be lost.
Can't (easily) change environment variables for parent process
You can change/set the environment for the currently running process. That means that when ts-node runs your program, you are changing the environment variables for your script and for ts-node.
After your script is finished running, ts-node stops, and the environment changes are lost. They don't get passed back to the shell.
Changing another process's environment
Changing the environment variables of the parent process (the shell) is a much more complicated affair that depends on your OS and on having the correct permissions. For Linux, one such technique is listed here. On Windows, you can find some hints by looking at this question.
Other options
Your other option might be to just return a string that your shell understands, and run that.
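A rough sketch of that idea, assuming a POSIX shell; instead of mutating its own environment, script.ts prints a shell statement (the variable name MY_HASH is only a placeholder, while hashFunction and path come from the question):
// script.ts - print an export statement for the parent shell to evaluate
console.log(`export MY_HASH=${hashFunction(path)}`);
You would then run eval "$(ts-node path/to/script.ts)" in the shell, after which MY_HASH is available to every command started from that same shell.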

How to run a 'watch' script along with a 'start' script in my node.js project

I'm writing an app that is composed of microservices (I use micro).
I really like ES6, so I use Babel to make the development process easier. The problem I have is that I need a script that compiles my ES6 code and restarts the 'server', and I don't know how to achieve this.
Right now I have the following script in my package.json:
"scripts": {
"start": "yarn run build && micro",
"build": "./node_modules/.bin/babel src --out-dir lib"
},
When I run yarn start my es6 code compiles successfully and micro starts the server. However, if I make changes to my code, I'll have to manually stop the server and run yarn start again.
I've tried to change my build script
"build": "./node_modules/.bin/babel src --watch --out-dir lib"
But in this case the micro command does not get executed as the build script just watches for changes and blocks anything else from execution. My goal is to have a script that would watch for changes and restart the server if a change occurred (compiling the code beforehand) like in Meteor.
One option is to use the parallelshell module to run shell commands in parallel. You can find an example of how to use it here.
The simplest solution would be to yarn run build & micro (note the single & and not &&).
As mentioned by others, parallelshell is another good hack (probably more robust than &).
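A rough sketch of how that could look in package.json, assuming nodemon is installed as a dev dependency and the compiled output lands in lib as in the build script above; Babel keeps recompiling in watch mode, while nodemon restarts micro whenever a file in lib changes:
"scripts": {
  "start": "yarn run watch & nodemon --watch lib --exec micro",
  "watch": "./node_modules/.bin/babel src --watch --out-dir lib"
}
The single & only works in Unix-like shells; on Windows, parallelshell or a similar tool is the safer choice.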

What is the test command while creating package.json?

While creating package.json from the command line using npm init to create a module in Node.js, there is a test command field that I don't know about. There's no mention of it in the docs either, even when running npm help json in the CLI.
Please explain what it is about.
The test command is the command that is run whenever you call npm test.
This is important when integrating with continuous integration/continuous deployment tools (such as Jenkins, Codeship, TeamCity).
Example:
- say you deploy a project to AWS or some other cloud hosting provider,
- you can set up your infrastructure to automatically run npm test,
- and if any of those tests fail, your CI/CD pipeline can automatically stop or roll back the deployment.
To execute tests
You can use Karma, Jest, Selenium/Nightmare/PhantomJS, or just about any other test library/framework that lets you write and execute tests; set the required command in scripts.test and then run it with npm test.
Assuming you mean scripts.test:
"scripts" : {
"test" : "echo \"Error: no test specified\" && exit 1"
}
This field contains the program(/command line) that should run when you call npm test. Typically, that program is a test-runner like mocha, ava, jest, ...
The default value is a placeholder that prints an error message (try running npm test in the same directory as your package.json).
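As an illustration only, if the project used mocha as its runner (and had it installed as a devDependency), the placeholder could be swapped out like this, after which npm test runs the suite in the test directory:
"scripts": {
  "test": "mocha"
}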

Node Environmental variable on Windows

I noticed this strange behavior which is not a big deal, but bugging the heck out of me.
In my package.json file, under the "scripts" section, I have a "start" entry. It looks like this:
"scripts": {
"start": "APPLICATION_ENV=development nodemon app.js"
}
Typing npm start in a Mac terminal works fine, and nodemon runs the app with the correct APPLICATION_ENV value as expected. When I try the same in a Windows environment, I get the following error:
"'APPLICATION_ENV' is not recognized as an internal or external command, operable program or batch file."
I have tried the git-bash shell and the normal Win CMD prompt, same deal.
I find this odd, because typing the command directly into the terminal (not going through the package.json script via npm start) works fine.
Has anyone else seen this and found a solution? Thanks!!
For cross-platform usage of environment variables in your scripts, install and use cross-env:
"scripts": {
  "start": "cross-env APPLICATION_ENV=development nodemon app.js"
}
The issue is explained well at the link provided to cross-env. It reads:
Most Windows command prompts will choke when you set environment variables with NODE_ENV=production like that. (The exception is Bash on Windows, which uses native Bash.) Similarly, there's a difference in how windows and POSIX commands utilize environment variables. With POSIX, you use: $ENV_VAR and on windows you use %ENV_VAR%.
I ended up using the dotenv package based on the second answer here:
Node.js: Setting Environment Variables
I like this because it lets me set up environment variables without having to inject extra text into my npm script lines. Instead, they live in a .env file (which should be placed on each environment and omitted from version control).
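A minimal sketch of that setup, with an illustrative variable name: a .env file sits next to package.json, and the app loads it once before anything reads process.env.
.env (kept out of version control)
APPLICATION_ENV=development
app.js
// Load the .env file into process.env at startup
require('dotenv').config();

console.log(process.env.APPLICATION_ENV); // "development"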
You should use the set command to set environment variables on Windows:
"scripts": {
  "start": "set APPLICATION_ENV=development && nodemon app.js"
}
Something like this.
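One caveat worth knowing with cmd.exe: the space before && becomes part of the value, so APPLICATION_ENV ends up as "development " with a trailing space. Writing it without the space avoids that:
"scripts": {
  "start": "set APPLICATION_ENV=development&& nodemon app.js"
}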

Is it possible to consume environment variables inside of npm / package.json?

I'm attempting to build a package.json so that when running a NodeJS app on Heroku it will run the scripts.postinstall step using an environment variable. For example:
...
"scripts": {
  "postinstall": "command $ENV_VAR"
},
...
I've looked at the docs and wasn't able to find something saying I can.
Is this even possible? Is this even desirable and "I'm Doing It Wrong"™?
Ignore the nay-sayers. You can do this in a cross-platform manner using cross-var:
"scripts": {
"postinstall": "cross-var command $ENV_VAR"
}
Updated answer due to new packages having been written
You can use the cross-var package to do this in a clean way:
...
"scripts": {
  ...
  "postinstall": "cross-var command $ENV_VAR",
  ...
},
"dependencies": {
  ...
  "cross-var": "^1.1.0",
  ...
}
...
Original answer
To answer the last questions first, because they're the most important ones: yes, no, and absolutely, because you've just broken cross-platform compatibility. There is no guarantee that your environment-variable syntax works for all shells on all operating systems, so don't do this.
We already have a guaranteed cross-platform technology available to us: Node itself. So create a file called something like bootstrap.js, and make node bootstrap.js your postinstall script. Since the code inside bootstrap.js runs like any other Node script, it has access to process.env in a fully cross-platform way, and everyone is happy.
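A minimal sketch of what such a bootstrap.js could look like, reusing ENV_VAR from the question; everything else is illustrative:
// bootstrap.js - cross-platform replacement for "command $ENV_VAR"
const envVar = process.env.ENV_VAR;

if (!envVar) {
  console.log('ENV_VAR is not set, skipping the postinstall step');
  process.exit(0);
}

// Do here, in Node, whatever the shell command was supposed to do with the value
console.log(`Running postinstall with ENV_VAR=${envVar}`);
The scripts section would then contain "postinstall": "node bootstrap.js".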
And many, many, many things that rely on common Unix utilities have Node equivalents, so you can npm install them (locally rather than globally) and then call them in an npm script. For instance, mkdir -p is not cross-platform, but installing the mkdirp module is, and an npm script like "ensuredirs": "mkdirp dist/assets" works fine everywhere when run as npm run ensuredirs.
And for convenience, the most common Unix utilities have their own runner package, shx, which is fully cross-platform and makes devs' lives even easier; the "if you're writing code" equivalent is fs-extra.
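For instance, with shx installed as a devDependency, hypothetical clean-up and directory scripts stay cross-platform:
"scripts": {
  "clean": "shx rm -rf lib",
  "ensuredirs": "shx mkdir -p dist/assets"
}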
