I've found myself in a situation where I need to run a single command, e.g. node compile.js.
That .js file needs to run the following:
browserify -t jadeify client/app.js -o bundle.js
All the dependencies are installed, and running this command in the CLI works fine; I just need to figure out how to execute it from within a node script.
Our package.json also contains something similar to:
"scripts": { "compile": "browserify -t jadeify client/app.js -o bundle.js" }
This works perfectly when I execute cd /project && npm run compile over ssh, but not via exec.
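For reference, the kind of exec call I've been trying looks roughly like this (a sketch; the error handling and exact invocation are assumptions):
var exec = require('child_process').exec;
// run the npm script from within node, mirroring the ssh invocation
exec('npm run compile', { cwd: '/project' }, function (err, stdout, stderr) {
  if (err) {
    console.error('compile failed:', stderr);
    return;
  }
  console.log(stdout);
});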
Thanks
You should be able to use the browserify API example and extend it with the transform as suggested by the jadeify setup paragraph.
var browserify = require('browserify');
var fs = require('fs');
var b = browserify();
b.add('./client/app.js');
// from jadeify docs
b.transform(require("jadeify"));
// the simple demo outputs to stdout; this writes to a file, just like your command-line example
b.bundle().pipe(fs.createWriteStream(__dirname + '/bundle.js'));
You can access script arguments via process.argv.
An array containing the command line arguments. The first element will be 'node', the second element will be the name of the JavaScript file. The next elements will be any additional command line arguments.
You can then use the browserify api together with jadeify to get what you need.
var browserify = require('browserify')();
var fs = require('fs');
var lang = process.argv[2];
console.log('Doing something with the lang value: ', lang);
browserify.add('./client/app.js');
browserify.transform(require("jadeify"));
browserify.bundle().pipe(fs.createWriteStream(__dirname + '/bundle.js'));
Run it with $ node compile.js enGB
I am trying to run the ng lint --format json command using node's child_process spawn method. The code looks like this:
const spawn = require('child_process').spawn;
const fs = require('fs-extra');
const jsonLog = fs.createWriteStream('test.json', { flags: 'a' });
const linter = spawn('ng', ['lint', '--format', 'json']);
linter.stdout.pipe(jsonLog);
As you can guess, I am trying to output JSON to a file. The problem I am facing is that I am not getting the complete JSON output. It gets truncated at some point, making it invalid JSON. I think it has something to do with buffer size.
Now most of you will suggest this:
ng lint --format json > jsonFile.json
I am already aware of it. I want to do it programmatically because I am trying to build a utility for it. Let me know if you have any ideas or solution.
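One thing worth trying (a sketch, not a confirmed fix; it assumes the truncation comes from treating the file as complete before the child process and its stdout stream have finished) is to buffer the chunks and only write the file on the close event:
const spawn = require('child_process').spawn;
const fs = require('fs-extra');

const linter = spawn('ng', ['lint', '--format', 'json']);

let output = '';
linter.stdout.on('data', (chunk) => {
  output += chunk; // accumulate stdout instead of streaming straight to disk
});

linter.on('close', (code) => {
  // the child has exited and stdout has ended, so the JSON should be complete
  fs.writeFileSync('test.json', output);
  console.log(`ng lint exited with code ${code}`);
});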
I am building a CLI tool with node, and want to use the fs.promises API. However, when the app is launched, there's always an ExperimentalWarning, which is super annoying and messes up the interaction prompts. How can I disable this warning/all warnings?
I'm testing this with the latest node v10 lts release on Windows 10.
To use the CLI tool globally, I have added this to my package.json file:
{
//...
"preferGlobal": true,
"bin": { "myapp" : "./index.js" }
//...
}
And have run npm link to link the ./index.js script. Then I am able to run the app globally simply with myapp.
After some research I noticed that there are generally two ways to disable the warnings:
set the environment variable NODE_NO_WARNINGS=1
call the script with node --no-warnings ./index.js
Although I was able to disable the warnings with the two methods above, there seems to be no way to do that while directly running the myapp command.
The shebang I placed in the entrance script ./index.js is:
#!/usr/bin/env node
// my code...
I have also read other discussions on modifying the shebang, but haven't found a universal/cross-platform way to do this - to either pass arguments to node itself, or set the env variable.
If I publish this npm package, it would be great if there were a way to make sure the warnings of this single package are disabled in advance, instead of having each individual user tweak their environment themselves. Are there any hidden npm package.json configs that allow this?
Any help would be greatly appreciated!
I am now using a launcher script to spawn a child_process to work around this limitation. Ugly, but it works with npm link, global installs and whatnot.
#!/usr/bin/env node
const { spawnSync } = require("child_process");
const { resolve } = require("path");
// Say our original entrance script is `app.js`
const cmd = "node --no-warnings " + resolve(__dirname, "app.js");
spawnSync(cmd, { stdio: "inherit", shell: true });
As it's kind of a hack, I won't be using this method next time; instead I'll wrap the original APIs in a promise manually, stick to util.promisify, or use the blocking/sync versions of the APIs.
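For instance, a minimal sketch of the util.promisify route, which avoids touching the experimental fs.promises API entirely (the file path is just an example):
const fs = require('fs');
const { promisify } = require('util');

// wrap the callback-style API instead of importing the experimental fs.promises
const readFile = promisify(fs.readFile);

readFile('./package.json', 'utf8')
  .then((contents) => console.log(contents.length))
  .catch((err) => console.error(err));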
I configured my test script like this:
"scripts": {
"test": "tsc && cross-env NODE_OPTIONS=--experimental-vm-modules NODE_NO_WARNINGS=1 jest"
},
Notice the NODE_NO_WARNINGS=1 part. It disables the warnings I was getting from setting NODE_OPTIONS=--experimental-vm-modules.
Here's what I'm using to run node with a command line flag:
#!/bin/sh
_=0// "exec" "/usr/bin/env" "node" "--experimental-repl-await" "$0" "$@"
// Your normal JavaScript here
The first line tells the shell to use /bin/sh to run the script. The second line is a bit magical. To the shell it's a variable assignment _=0// followed by "exec" ....
Node sees it as a variable assignment followed by a comment, so it's almost a no-op apart from the side effect of assigning 0 to _.
The result is that when the shell reaches line 2 it will exec node (via env) with any command line options you need.
You can also catch emitted warnings in your script and choose which ones to prevent from being logged:
const originalEmit = process.emit;
process.emit = function (name, data, ...args) {
if (
name === `warning` &&
typeof data === `object` &&
data.name === `ExperimentalWarning`
//if you want to only stop certain messages, test for the message here:
//&& data.message.includes(`Fetch API`)
) {
return false;
}
return originalEmit.apply(process, arguments);
};
Inspired by this patch to yarn
I'm writing a bookmarklet. I need to prepend "javascript:" to the compiled, minified JavaScript. I'm looking for a way to accomplish this using an NPM package.json script.
{
"scripts": {
"oar:transpile-typescript": "tsc --target es6 --lib dom,es6 ./OarBookmarklet/Oar.ts",
"oar:minify-javascript": "jsmin -o ./OarBookmarklet/oar.min.js ./OarBookmarklet/oar.js",
"oar:prepend-javascript": "[??? prepend `javascript:` to minified JavaScript ???]",
"oar": "run-s oar:transpile-typescript oar:minify-javascript oar:prepend-javascript",
"build": "run-s oar"
}
}
For a cross-platform solution, utilize node.js and its builtin fs.readFileSync(...) and fs.writeFileSync(...). This way it doesn't matter which shell your npm script runs in (sh, cmd.exe, bash, bash.exe, pwsh, ...).
To achieve this, consider either of the following two solutions; they're essentially the same, just different methods of application.
Solution A. Using a separate node.js script
Create the following script; let's save it as prepend.js in the root of the project directory, i.e. at the same level as package.json.
prepend.js
const fs = require('fs');
const filepath = './OarBookmarklet/oar.min.js';
const data = fs.readFileSync(filepath);
fs.writeFileSync(filepath, 'javascript:' + data);
package.json
Define the oar:prepend-javascript npm script in package.json as follows:
"scripts": {
...
"oar:prepend-javascript": "node prepend",
...
},
Note: Here node.js invokes the script and performs the required task. If you choose to save prepend.js in a different directory than the aforementioned, then ensure you define the correct path to it, e.g. "oar:prepend-javascript": "node ./some/other/path/to/prepend.js"
Solution B. Inline the node.js script in package.json
Alternatively, you can inline the content of prepend.js in your npm script, avoiding the need for a separate .js file.
package.json
Define the oar:prepend-javascript script in package.json as follows:
"scripts": {
...
"oar:prepend-javascript": "node -e \"const fs = require('fs'); const fp = './OarBookmarklet/oar.min.js'; const d = fs.readFileSync(fp); fs.writeFileSync(fp, 'javascript:' + d);\""
...
},
Note: Here the nodejs command line option -e is utilized to evaluate the inline JavaScript.
If this is running on something Unix-like then:
(printf 'javascript:' ; cat ./OarBookmarklet/oar.min.js) > ./OarBookmarklet/oar.bm.min.js
should do the job.
Edit in response to OP's comment:
My execution environment is Windows, ...
In that case you should be able to use:
(set /p junk="javascript:" <nul & type ./OarBookmarklet/oar.min.js) > ./OarBookmarklet/oar.bm.min.js
The set /p ... <nul weirdness is a way to get some text sent to stdout without a newline being appended to it.
I am creating a script in npm package.json.
The script will run yeoman to scaffold my template, and then I want to run a gulp task to do some more stuff to a specific file (inject using gulp-inject).
The npm task looks like this:
"scaffolt": "scaffolt -g scaffolt/generators template && gulp inject"
Now, I need to be able to call the command from the command line, giving a name to my template.
The command I need to run is the following:
npm run scaffolt {templateName}
but if I do this, npm ends up trying to run a gulp task named after the typed {templateName}.
A quick example: if I run npm run scaffolt myTemplate, then the second part of the script will try to run gulp myTemplate, which fails.
Is there any way to pass the {myTemplate} name as an argument to the second part of the script so that it can be used in the gulptask?
The gulp task currently only logs process.argv.
You can pass arguments to the npm run-script. Here is the documentation.
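For example, a minimal sketch of the gulp side, assuming the extra flag is appended to the end of the script by running npm run scaffolt -- --templateName=myTemplate (the flag name itself is an assumption):
// gulpfile.js
const gulp = require('gulp');

gulp.task('inject', (done) => {
  // npm appends everything after `--` to the end of the script,
  // so `gulp inject --templateName=myTemplate` shows up in process.argv
  const flag = process.argv.find((arg) => arg.startsWith('--templateName='));
  const templateName = flag ? flag.split('=')[1] : undefined;
  console.log('templateName:', templateName);
  done();
});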
Make gulp tasks for these operations.
//gulpfile.js
const gulp = require('gulp');
const commandLineArgs = require('command-line-args');
const spawn = require('child_process').spawn;
gulp.task('inject', ['scaffolt'], () => {
console.log('scaffolt complete!');
});
gulp.task('scaffolt', (cb) => {
const options = commandLineArgs([{ name: 'templateName' }]);
//use scaffolt.cmd on Windows!
spawn('scaffolt', ['-g', 'scaffolt/generators', options.templateName])
.on('close', cb);
});
And in your package.json:
//package.json
"scripts": {
"scaffolt": "gulp inject "
}
And to run it: npm run scaffolt -- --templateName=something
Tip: npm run-script adds the node_modules/.bin directory to the PATH, so we can spawn executables just as if they were in the same folder!
You can use the magical $npm_config_<exampleVarName> in the script definition and then pass its value either via an environment variable matching exampleVarName, or on the command line by adding --exampleVarName=ValueHere.
In your case:
//package.json
"scripts": {
"scaffolt": "scaffolt -g scaffolt/generators $npm_config_templateName && gulp inject"
}
Then run it as:
npm run scaffolt --templateName=whatever
I just started using electron. I have a question about how to pass command-line arguments in electron when I'm using npm start to run it.
In Node.js I run the following at the command prompt: node server.js one two=three four
and read the arguments with:
var arguments = process.argv.slice(2);
arguments.forEach(function(val,index, array) {
console.log(index + ': ' + val);
});
This works in Node.js. I need to know how I can make this work in electron.
Can someone please give a solution for this?
The way of passing arguments is the same; the only thing you have to take care of is the path of electron. In package.json it's written that npm start will run electron main.js. So you will have to execute this command explicitly and pass the arguments with the proper path of electron, i.e. ./node_modules/.bin/electron. Then the command will be:
./node_modules/.bin/electron main.js argv1 argv2
and these arguments can be accessed via process.argv in main.js.
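For example, a minimal sketch of reading them in main.js (the indices assume an unpackaged app started as shown above):
// main.js
// process.argv[0] is the electron binary, process.argv[1] is main.js
const args = process.argv.slice(2); // ['argv1', 'argv2']
console.log(args);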
And if you wish to access these parameters in your app, then do the following:
1. In your main.js define a variable like:
global.sharedObject = {prop1: process.argv};
2. In your app just include remote and use this sharedObject:
const remote = require('electron').remote;
const arguments = remote.getGlobal('sharedObject').prop1;
console.log(arguments);
3. Output will be ["argv1", "argv2"]