npm - run scripts in directory other than cwd - node.js

I have a script which scaffolds an app. This involves making a new directory in the current directory inside which the app's scaffold files will be unpacked.
At the end of my script I run a series of npm/node commands as follows (appDir is a reference to the child dir created to host the app; it exists by the time these commands run):
const postInstCmds = [
'cd '+appDir,
'npm init -y',
'npm install add-npm-scripts --save-dev',
'npm install http-server --save-dev',
'add-npm-scripts server "http-server"',
'npm run server'
];
const execSync = require('child_process').execSync;
postInstCmds.forEach(cmd => execSync(cmd));
However it seems the first cd command is being ignored; the subsequent commands are being executed in the parent, not child, directory.
I've read this, which explicitly says that npm supports cd * operations, but perhaps I'm misunderstanding something (I'm no npm wizard). I've also tried using npm's --prefix argument to specify path, but no joy.
[UPDATE]
The linked dup suggests npm start <dir>, but this seems to work only if <dir> already contains a package.json. As per my sequence of commands, I want to move to the directory and then create the package.json (via npm init). Another answer there suggests npm run --prefix, but run again requires an extant package.json, since it reads from its scripts object.
I have also tried npm init --prefix <dir> but the arg doesn't seem to be supported for npm init.

Your code is running different child processes for each command. So the cd command runs in a subprocess and exits. Then the next command runs in a new subprocess that is in the original directory you started in and not in the directory that the cd command navigated to.
One fix would be to get rid of the cd command and instead pass the directory as the cwd option in each execSync() call.
postInstCmds.forEach(cmd => execSync(cmd, {cwd: appDir}));

Related

Run Laravel Mix without a global nodejs and npm installation

I have a Laravel project and I need to build styles and scripts with laravel-mix, but the testing server (Ubuntu 20.04.4) doesn't have a globally installed Node. Node and npm sit in non-standard folders on the system, so I run commands like this:
/path/to/node /path/to/npm install
/path/to/node /path/to/npm run dev
But when I run npm run dev (this command runs laravel-mix build), I see the error:
> mazer#2.0.0 dev
> mix
/usr/bin/env: ‘node’: No such file or directory
In the package.json it looks like this:
"scripts": {
"dev": "mix"
...
}
I checked the laravel-mix package (in node_modules) and found this shebang: #!/usr/bin/env node. The script locates node through env, but there is no node on the PATH.
I don't want to change the env binary itself, so how can I change this path or set a temporary environment variable? Is there any way to make node resolvable for the script?
I have one solution for this problem.
The issue is a naming mismatch (or a missing path symlink): the binary is installed as nodejs, but the shebang looks for node.
You need to create a symlink for nodejs with this command:
ln -s /usr/bin/nodejs /usr/bin/node
or
sudo ln -s /usr/bin/nodejs /usr/bin/node
I resolved my issue with Docker, so now I run this command on git push:
docker run --rm -v /path/to/project:/var/www/html node:16.16.0-alpine npm run dev --prefix /var/www/html
Perhaps it will be useful to someone.
UPD
I found another way to resolve it. I was using PATH incorrectly, and that is why it didn't work:
Wrong
I set paths to the node and npm binaries themselves and then added those file paths to PATH like this:
NODE_PATH="/path/to/node_folder/node"
NPM_PATH="/path/to/node_folder/npm"
PATH="${NODE_PATH}:${NPM_PATH}:$PATH"
And the system still can't find npm and node, because PATH entries must be directories, not paths to individual binaries.
The right way
Add the directory that contains both node and npm (/path/to/node_folder) to PATH:
NODE_DIR="/path/to/node_folder"
PATH="${NODE_DIR}:$PATH"
After that, I can run plain npm install and npm run dev without spelling out the full paths.

bat file - change directory and npm install in that directory, then switch again

So I have a large number of services I'm running locally for my application. I need a convenient way to, first, install all of their dependencies from one terminal without entering each folder individually, and second, keep them up to date more easily. I'm using node/npm and it's not working. Here is an example of how it looks:
start cd ./Service1 && npm install
start cd ./Service2 && npm install
start cd ./Service3 && npm install
and it keeps going and going. When I run the bat file, it opens a cmd prompt for each one as it should, and it changes directories just fine, but then it switches back to the directory all of the services are housed in and runs npm install there. At least from what I can tell, that is what's happening. How can I change to Service1 and run npm install in its own cmd prompt, then open another cmd prompt and do the same thing, and so on?
In your code, the START command launches a separate process, and the change of directory happens inside that process. That environment is its own, and it closes when the process ends, so the directory change never carries over to the npm install.
I think what you are attempting to do is check whether the folder exists and, if it does, run npm install there.
So a better option would be.
IF EXIST "Service1" START "" /D Service1 call npm install

How to make bin script available for npm package installed locally

I read on the npm documentation that you can't use bin scripts of locally installed packages.
So, how can gulp be launched as a bin command when installed locally?
What makes it available when locally installed? I reviewed gulp's package.json and its bin scripts, but I didn't find any answer.
From NPMJS documentation:
To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.
So, your locally installed package binaries will be executable like this:
./node_modules/.bin/the_binary
This is if you want to launch the binary directly. Or, as specified in the scripts part of the documentation:
In addition to the shell's pre-existing PATH, npm run adds node_modules/.bin to the PATH provided to scripts.
Thus, you can simply write a wrapper script like
"scripts": {
  "build": "the_binary"
}
and call your script like this
npm run build
Bonus
As of npm#2.0.0, you can use custom arguments when executing scripts. The special option -- is used by getopt to delimit the end of the options. npm will pass all the arguments after the -- directly to your script:
npm run test -- --grep="pattern"
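To see what your script actually receives, here's a minimal sketch (the script and npm script names are illustrative):

```javascript
// args.js -- npm appends everything after `--` to the script's command
// line, so it appears in process.argv after the node binary and
// script path.
const forwarded = process.argv.slice(2);
console.log(forwarded);
// With "test": "node args.js" in package.json, running
//   npm run test -- --grep="pattern"
// makes the forwarded array contain '--grep=pattern'.
```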
You can use lpx https://www.npmjs.com/package/lpx to
run a binary found in the local node_modules/.bin folder
run a binary found in the node_modules/.bin of a workspace root from anywhere in the workspace
lpx does not download any package if the binary is not found locally (i.e. not like npx).
Example : lpx tsc -b -w will run tsc -b -w with the local typescript package

How to automatically copy files from package to local directory via postinstall npm script?

I want to automatically copy certain files from an npm package to user's local directory after running
npm install my-package
I can get them installed by declaring "files" inside package.json. The problem is that the files are not put in the user's local directory, so I need to run a postinstall script.
But now I don't know where the package is installed (maybe higher up the directory tree), so how can I reliably access the files and copy them to the local directory via the script?
(By local directory I mean --- from where I run npm install my-package as user consuming the package.)
UPDATE: It seems the postinstall script runs as an npm-owned process whose working directory is node_modules/my-package, so I still don't know how to reach the user's directory other than with a naive ../../.
Since npm 5.4 you can use the $INIT_CWD environment variable:
https://blog.npmjs.org/post/164504728630/v540-2017-08-22
When running lifecycle scripts, INIT_CWD will now contain the original working directory that npm was executed from.
To fix your issue, add the following to the scripts section of your package.json:
"scripts": {
"postinstall": "cp fileYouWantToCopy $INIT_CWD",
},
After a lot of searching I found this works cross platform
"scripts": {
  "postinstall": "node ./post-install.js"
},
// post-install.js
/**
* Script to run after npm install
*
* Copy selected files to user's directory
*/
'use strict'
var gentlyCopy = require('gently-copy')
var filesToCopy = ['.my-env-file', 'demo']
// User's local directory
var userPath = process.env.INIT_CWD
// Moving files to user's local directory
gentlyCopy(filesToCopy, userPath)
var cwd = require('path').resolve();
Note: If the arguments to resolve have zero-length strings then the current working directory will be used instead of them.
from https://nodejs.org/api/path.html
I would use shellscript/bash
-package.json
"scripts": {
  "postinstall": "./postinstall.sh"
},
-postinstall.sh
#!/bin/bash
# go to YOUR_NEEDED_DIRECTORY, e.g. "dist" or $INIT_CWD/dist
cd YOUR_NEEDED_DIRECTORY || exit 1
# copy each file/dir to the user's home directory (~/);
# a glob handles names with spaces, unlike parsing `ls` output
for node in *
do
  cp -R "$node" ~/"$node"
done
Don't forget to make the script executable:
chmod +x postinstall.sh

Running gulp tasks in sibling folder

Let's say I need to run a gulp command to build some assets.
Assume I am at the root folder. I have a directory called ./node_modules/semantic-ui/ that requires running gulp build to produce the necessary assets.
I also set up an npm script:
"build:semantic": "gulp ./node_modules/semantic-ui"
but that does not work, and I am not willing to use a cd command inside my npm run command.
What can I do? Thanks.
Gulp allows you to pass a --cwd and if you do that it will run from within that directory. So in your case, you do this:
gulp --cwd './node_modules/semantic-ui'
Hope that helps. Let me know if it doesn't.
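Wired into package.json, that might look like this (assuming the semantic-ui task you need is named build):

```json
{
  "scripts": {
    "build:semantic": "gulp build --cwd ./node_modules/semantic-ui"
  }
}
```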
