NodeJS - How to handle "listen EADDRINUSE" when accessing external process

I'm using PhantomJS to print PDFs, via the phantomjs-node module. It works well, but when I try to create several files at once it throws an unhandled "listen EADDRINUSE" error.
I assume this is because the module drives PhantomJS, which is an external process, and it can't bind it to the same port several times?
In any case, I can't catch this error, and I'd like to at least avoid a server crash when this happens.
I thought of using a "global" variable as a lock, in order to block concurrent calls until the current one is finished.
Any idea how to implement that, or any other solution?
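For illustration, a minimal sketch of that locking idea; createSerialized and the waiting queue are hypothetical names, and the bare-callback form of phantom.create is assumed:
var phantom = require('phantom');

var busy = false;
var waiting = [];

// Run phantom.create calls one at a time; callers receive the instance
// plus a `done` callback that releases the lock for the next caller.
function createSerialized(onReady) {
  if (busy) { waiting.push(onReady); return; }
  busy = true;
  phantom.create(function (ph) {
    onReady(ph, function done() {
      busy = false;
      if (waiting.length) createSerialized(waiting.shift());
    });
  });
}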

The code from @AndyD is not correct, imho. See lines 45 - 54 in
https://github.com/sgentle/phantomjs-node/blob/master/phantom.coffee
So the example should be:
var portscanner = require('portscanner');
var phantom = require('phantom');

portscanner.findAPortNotInUse(40000, 60000, 'localhost', function(err, freeport) {
  phantom.create({'port': freeport}, function(ph) {
    // ...
  });
});

You should be able to pass in a port number every time you call create:
var phantom = require('phantom');

phantom.create(null, null, function(ph) {
  // ...
}, null, 11111);
You can then use a counter to ensure it's different every time you start phantomjs-node.
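For example, a minimal sketch of that counter idea, reusing the create signature shown above (basePort and createPhantom are illustrative names; on versions where create takes an options object, pass {port: ...} instead, as in the answer above):
var phantom = require('phantom');

var basePort = 11111; // arbitrary starting port
var instances = 0;

// Each call gets its own port, so concurrent PhantomJS processes
// never try to bind the same one.
function createPhantom(onReady) {
  phantom.create(null, null, onReady, null, basePort + instances++);
}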
If you are starting a new process every time and you can't share a counter then you can use portscanner to find a free port:
var portscanner = require('portscanner');
var phantom = require('phantom');

portscanner.findAPortNotInUse(40000, 60000, 'localhost', function(err, freeport) {
  phantom.create(null, null, function(ph) {
    // ...
  }, null, freeport);
});

Related

Modules not loading all the time

I'm trying to split out a "prebid" file so that I can have separate files for the "bidders", the analytics client, the bidder settings and some other bits. I've basically made my original file the main.js and have split some of the code out into different files, such as:
var pbjs = pbjs || {};
pbjs.que = pbjs.que || [];

pbjs.que.push(function() {
  pbjs.addAdUnits(adUnits);
  requirejs(['bidder_settings']);
  requirejs(['pbjs_config']);
  pbjs.requestBids({
    bidsBackHandler: initAdserver,
    timeout: PREBID_TIMEOUT
  });
});
I'm trying to call the files in from within the original file so that it pulls those bits in. It sometimes works, but other times it doesn't seem to load certain bits. Any clue what I'm doing wrong? Is there a way to make sure the file loads the modules/separate files in sequence down the page?
This sounds like an asynchronous race condition on your page, where requirejs has not loaded your modules by the time Prebid needs them to finish the auction. According to the [requirejs docs](https://requirejs.org/docs/api.html#jsfiles), you should be using a callback and then run the code that requires your loaded modules.
Example:
var pbjs = pbjs || {};
pbjs.que = pbjs.que || [];

requirejs(['bidder_settings', 'pbjs_config'],
  function (bidder_settings, pbjs_config) {
    pbjs.que.push(function () {
      pbjs.addAdUnits(adUnits);
      // the modules are already loaded here; do what you need with them
      pbjs.requestBids({
        bidsBackHandler: initAdserver, // make sure you utilize disableInitialLoad
        timeout: PREBID_TIMEOUT
      });
    });
  }
);

Query a remote server's operating system

I'm writing a microservice in Node.js that runs a particular command-line operation to get a specific piece of information. The service runs on multiple servers, some of them on Linux, some on Windows. I'm using ssh2-exec to connect to the servers and execute a command; however, I need a way of determining the server's OS in order to run the correct command.
let ssh2Connect = require('ssh2-connect');
let ssh2Exec = require('ssh2-exec');

ssh2Connect(config, function(error, connection) {
  let process = ssh2Exec({
    cmd: '<CHANGE THE COMMAND BASED ON OS>',
    ssh: connection
  });
  // using the results of process...
});
I have an idea for a solution: following this question, run some other command beforehand and determine the OS from the output of said command. However, I want to learn whether there's a more "formal" way of achieving this, specifically using the SSH2 library.
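For reference, a minimal sketch of that probing idea, built on the same ssh2-connect/ssh2-exec calls as the snippet above; the uname -s probe, the regex and the fallback commands are illustrative assumptions, as is the ChildProcess-like return value with a stdout stream:
let ssh2Connect = require('ssh2-connect');
let ssh2Exec = require('ssh2-exec');

ssh2Connect(config, function(error, connection) {
  // Probe first: Unix-likes answer `uname -s` with their kernel name,
  // while a Windows host errors out or prints nothing recognizable.
  let probe = ssh2Exec({ cmd: 'uname -s', ssh: connection });
  let output = '';
  probe.stdout.on('data', function(chunk) { output += chunk; });
  probe.on('close', function() {
    let isWindows = !/linux|darwin|bsd/i.test(output);
    let process = ssh2Exec({
      cmd: isWindows ? 'ver' : 'uname -a', // swap in the real commands here
      ssh: connection
    });
    // using the results of process...
  });
});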
Below is how I would think it would be done...
// Import the os module; this lets you read the OS type the app is running on
const os = require('os');
const ssh2Connect = require('ssh2-connect');
const ssh2Exec = require('ssh2-exec');

// Windows platform strings; there is only one, but for consistency's sake we
// keep it in an array: if it changes in the future it's easier to add to,
// and the remainder of the code doesn't need to change
const winRMOS = ['win32'];

// OS' that need to use the ssh protocol (see note above)
const sshOS = ['darwin', 'linux', 'freebsd'];

// ssh function: pick the command based on the platform
const connectOverSSH = (config) => {
  ssh2Connect(config, function(error, connection) {
    let cmd;
    if (os.platform() === 'darwin') {
      cmd = 'Some macOS command';
    } else if (os.platform() === 'linux') {
      cmd = 'Some linux command';
    }
    let process = ssh2Exec({
      cmd: cmd,
      ssh: connection
    });
    // using the results of process...
  });
};

// winrm function; there may be some other way to do this, but winrm is the
// way I know (winRMConnect/winRMExec stand in for a WinRM client here)
const connectOverWinRM = (config) => {
  winRMConnect(config, function(error, connection) {
    let process = winRMExec({
      cmd: 'Some Windows command',
      winRM: connection
    });
    // using the results of process...
  });
};

// decide which one to use based on what os.platform() returns
if (sshOS.includes(os.platform())) {
  connectOverSSH(config);
} else if (winRMOS.includes(os.platform())) {
  connectOverWinRM(config);
}

"Write after end": how to imitate gulp-watch with watchify?

The following Gulp task does almost what I want.
const gulp = require('gulp');
const browserify = require('browserify');
const vinylStream = require('vinyl-source-stream');
const vinylBuffer = require('vinyl-buffer');
const watchify = require('watchify');
const glob = require('glob');
const jasmineBrowser = require('gulp-jasmine-browser');

gulp.task('test', function() {
  let testBundler = browserify({
    entries: glob.sync('src/**/*-test.js'),
    cache: {},
    packageCache: {},
  }).plugin(watchify);

  function updateSpecs() {
    return testBundler.bundle()
      .pipe(vinylStream(jsBundleName))
      .pipe(vinylBuffer())
      .pipe(jasmineBrowser.specRunner({console: true}))
      .pipe(jasmineBrowser.headless({driver: 'phantomjs'}));
  }

  testBundler.on('update', updateSpecs);
  updateSpecs();
});
It bundles all my Jasmine specs using Browserify and has them tested through gulp-jasmine-browser. It also watches all specs and all modules that they depend on and re-runs the tests if any of these modules changes.
The only ugly bit, which I'd really like to see solved, is that a new PhantomJS instance and a new Jasmine server are created every time updateSpecs is run. I was hoping to avoid that with code like the following:
gulp.task('test', function() {
  let testBundler = browserify({
    entries: glob.sync('src/**/*-test.js'),
    cache: {},
    packageCache: {},
  }).plugin(watchify);

  // persist the Jasmine server and PhantomJS browser
  let testServer = jasmineBrowser.headless({driver: 'phantomjs'});

  function updateSpecs() {
    return testBundler.bundle()
      .pipe(vinylStream(jsBundleName))
      .pipe(vinylBuffer())
      .pipe(jasmineBrowser.specRunner({console: true}))
      .pipe(testServer);
  }

  testBundler.on('update', updateSpecs);
  updateSpecs();
});
Alas, this doesn't work. Right after starting the task, all tests run fine, but the next time updateSpecs is called, I get a "write after end" error and the task exits with status 1. This error originates from the readable-stream Node module.
As I understand it, the end event during the first run of updateSpecs leaves testServer in a state in which it doesn't accept any new inputs. Unfortunately, the Node.js streams documentation isn't very clear on how to remedy this.
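The behaviour can be reproduced with a bare Node stream, independent of gulp; a minimal sketch:
const { PassThrough } = require('stream');

const sink = new PassThrough();
sink.on('data', (chunk) => console.log(chunk.toString()));

sink.end('first run');    // finishes the writable side
sink.write('second run'); // -> Error: write after end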
I have tried breaking the pipe chain at a different place, but I got the same result, which seems to indicate this is universal behaviour for streams. I also tried stopping the end event from propagating by inserting a through-stream that didn't re-emit that event, but this prevented the tests from being run at all. Finally, I tried returning the testServer stream from the task; this stopped the error, but although the updateSpecs function gets called every time the sources change, the tests are only being run the first time the task starts. This time, the testServer simply seems to ignore the new input.
The gulp-jasmine-browser documentation suggests that the following code would work:
var watch = require('gulp-watch');

gulp.task('test', function() {
  var filesForTest = ['src/**/*.js', 'spec/**/*-test.js'];
  return gulp.src(filesForTest)
    .pipe(watch(filesForTest))
    .pipe(jasmineBrowser.specRunner())
    .pipe(jasmineBrowser.server());
});
And it goes on to suggest that you can also make this work with Browserify, but this isn't illustrated. Apparently, gulp-watch does something which causes the follow-up pipes to accept updated inputs later. How can I imitate this behaviour with watchify?
GitHub issue
As it turns out, it is a hard rule in Node.js that you cannot write after the end event. In addition, jasmineBrowser.specRunner(), .server() and .headless() must receive the end signal in order to actually test anything. This restriction is inherited from the official Jasmine test runner.
The example with gulp-watch from the README doesn't actually work, either, for the same reason. In order to make it work, one would have to do something similar to the working version of my watchify code in the question:
gulp.task('test', function() {
  var filesForTest = ['src/**/*.js', 'spec/**/*-test.js'];

  function runTests() {
    return gulp.src(filesForTest)
      .pipe(jasmineBrowser.specRunner())
      .pipe(jasmineBrowser.server());
  }

  watch(filesForTest).on('add change unlink', runTests);
});
(I didn't test it, but something very close to this should work.)
So whatever watching mechanism you're using, you'll always need to call .specRunner() and .server() again for every cycle. The good news is that apparently, the Jasmine server will be reused if you explicitly pass a port number:
.pipe(jasmineBrowser.server({port: 8080}));
This also applies to .headless().
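Putting the two together, the watchify setup from the question would recreate the pipeline on every update but pin the port so the server is reused; a sketch based on the code above:
function updateSpecs() {
  return testBundler.bundle()
    .pipe(vinylStream(jsBundleName))
    .pipe(vinylBuffer())
    .pipe(jasmineBrowser.specRunner({console: true}))
    // a fresh pipeline each cycle, but the fixed port lets the server be reused
    .pipe(jasmineBrowser.headless({driver: 'phantomjs', port: 8080}));
}

testBundler.on('update', updateSpecs);
updateSpecs();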

nodejs open chrome in Windows with arguments

Trying to write my own plugins for gulp. So I've written a gulp task that attempts to open the Chrome browser on Windows (I'll work on getting it working for Mac/Linux later).
It seems to work, except it's not passing in my arguments:
/*
 * Open
 */
var cp = require('child_process');

gulp.task('open', function (done) {
  CONFIG.PORT = 8080; // set the port before building the URI
  var uri = 'http://localhost:' + CONFIG.PORT,
      args = [
        uri,
        '--no-first-run',
        '--no-default-browser-check',
        '--disable-translate',
        '--disable-default-apps',
        '--disable-popup-blocking',
        '--disable-zero-browsers-open-for-tests',
        '--disable-web-security',
        '--new-window',
        '--user-data-dir="C:/temp-chrome-eng"'
      ];
  cp.spawn('C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe', args);
  done();
});
How do I get it to accept the arguments I passed in? Am I providing it the wrong arguments?
I would recommend using opener, a quite popular npm module, instead; it will solve both your issue with arguments and cross-platform support.
Instead of finding the browser executable like you are doing, you can simply write:
var opener = require('opener');
opener('http://google.com');
If, however, you want to go with your current method, try capturing the output by naming your process and then listening on its stdout and stderr:
var chrome = cp.spawn( /* ...same executable and args as above... */ );

chrome.stdout.on('data', function (data) {
  console.log(data.toString());
});

chrome.stderr.on('data', function (data) {
  console.error(data.toString());
});
It does work for me on Linux if I replace your Chrome path with Chromium's.
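For instance (the Chromium binary path varies by distribution, so treat this one as an assumption):
cp.spawn('/usr/bin/chromium-browser', args);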

NodeJS not spawning child process except in tests

I have the following NodeJS code:
var fs = require('fs');
var spawn = require('child_process').spawn;

var Unzipper = {
  unzip: function(src, dest, callback) {
    var self = this;
    if (!fs.existsSync(dest)) {
      fs.mkdirSync(dest); // synchronous, so the directory exists before unzip runs
    }
    var unzip = spawn('unzip', [ src, '-d', dest ]);
    unzip.stdout.on('data', function (data) {
      self.stdout(data);
    });
    unzip.stderr.on('data', function (data) {
      self.stderr(data);
      callback({message: "There was an error executing an unzip process"});
    });
    unzip.on('close', function() {
      callback();
    });
  }
};
I have a NodeUnit test that executes successfully. Debugging the test in PhpStorm shows that the unzip variable is assigned correctly.
However, if I run the same code as part of a web service, the spawn call doesn't return properly and the server crashes when trying to attach an on handler to the nonexistent stdout property of the unzip variable.
I've tried running the program outside of PhpStorm; however, it crashes on the command line as well, for the same reason. I suspect it's a permissions issue that the tests don't have to deal with. A web server spawning processes could cause chaos in a production environment, so some extra permissions might be needed, but I haven't been able to find (or I've missed) documentation to support my hypothesis.
I'm running v0.10.3 on OSX Snow Leopard (via MacPorts).
Why can't I spawn the child process correctly?
UPDATES
For @jonathan-wiepert
I'm using prototypal inheritance, so when I create an "instance" of Unzipper I set stdout and stderr, i.e.:
var unzipper = Unzipper.spawn({
  stdout: function(data) { util.puts(data); },
  stderr: function(data) { util.puts(data); }
});
This is similar to the concept of "constructor injection". As for your other points, thanks for the tips.
The error I'm getting is:
project/src/Unzipper.js:15
unzip.stdout.on('data', function (data) {
^
TypeError: Cannot call method 'on' of undefined
As per my debugging screenshots, the object that is returned from the spawn call is different under different circumstances. My test passes (it checks that a ZIP can be unzipped correctly) so the problem occurs when running this code as a web service.
The problem was that the spawn method created on the Object prototype (see this article on prototypal inheritance) was causing the child_process.spawn function to be replaced, so the wrong function was being called.
I saved child_process.spawn into a property on the Unzipper "class" before it got clobbered, and used that property instead.
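A minimal sketch of that fix; the property and handler names are illustrative, not the actual code:
var childProcess = require('child_process');

var Unzipper = {
  // keep a private reference before any `spawn` on the prototype
  // chain can shadow child_process.spawn
  _spawnProcess: childProcess.spawn,

  // prototypal "constructor": returns an instance with stdout/stderr handlers
  spawn: function(handlers) {
    var instance = Object.create(this);
    instance.stdout = handlers.stdout;
    instance.stderr = handlers.stderr;
    return instance;
  },

  unzip: function(src, dest, callback) {
    // call the saved reference, not the (shadowed) spawn above
    var unzip = this._spawnProcess('unzip', [ src, '-d', dest ]);
    // ...attach the stdout/stderr/close handlers as in the original code...
  }
};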
