How can you set up a nodejs web server to run asynchronously?

Okay, so I'm in the process of creating a Minecraft panel. I have a Windows service, and that will call my Express server. The problem is that when the computer tries to start the process, the start step never completes, because the server runs for eternity. I need to change this so that the service starts up the web server, makes sure it's running, then finishes the task and forgets the web server exists, but leaves it running. How would this be possible? Many thanks :)
Update: After a little bit of thought, I'm going to refine my question to this: how can I start a process and not wait for it to finish in C#/Node.js (either works)?
Edit: Lol, I'm refreshing this page like every two microseconds... It's 4 AM for me, so yeah :D brain broke

Answer: So I figured out how to fix it. I used pm2, which can start a process and not wait for it to finish. We have a start.js and a stop.js, which start and stop the server. If you want to build something like it, here is the code I used.
// Start.js
const pm2 = require('pm2');

pm2.connect((err) => {
  if (err) {
    console.error(err);
    process.exit(2);
  }
  // Start the webserver only once the connection to the pm2 daemon is ready
  pm2.start({
    name: 'webserver',
    script: 'webserver.js',
  }, (err) => {
    if (err) {
      console.error(err);
    } else {
      console.log('Webserver started');
    }
    pm2.disconnect();
    process.exit(err ? 1 : 0);
  });
});
// Stop.js
const pm2 = require('pm2');

pm2.connect((err) => {
  if (err) {
    console.error(err);
    process.exit(2);
  }
  // Stop the webserver process managed by pm2
  pm2.stop('webserver', (err) => {
    if (err) {
      console.error(err);
    } else {
      console.log('Webserver stopped');
    }
    pm2.disconnect();
    process.exit(err ? 1 : 0);
  });
});
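For reference, the same start-and-forget behaviour is possible without pm2, using only Node's built-in child_process module. Here is a minimal sketch, assuming webserver.js is the entry point as above; the log file name is made up for illustration. The detached option plus unref() lets the starter script exit while the server keeps running.

// start-detached.js - pm2-free sketch
const { spawn } = require('child_process');
const fs = require('fs');

// Send the child's output to a log file (hypothetical name) so no pipe ties it to the parent
const log = fs.openSync('webserver.log', 'a');

const child = spawn(process.execPath, ['webserver.js'], {
  detached: true, // run the child in its own process group
  stdio: ['ignore', log, log],
});

child.unref(); // let this script exit without waiting for the child
console.log('Webserver started with PID ' + child.pid);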

Related

Forked process in expressjs causes server to restart

I have an Express server that uses a fork inside a route to make sure the main loop isn't blocked (the work is somewhat compute-intensive). Every time this route is called, the server restarts. I've debugged the problem and found that the forking is what causes this behaviour, but I don't understand why. The route is defined as follows:
const { fork } = require('child_process');

module.exports = async function someComputingIntensiveFunction(req, res) {
  try {
    // some stuff
    const childProcess = fork('../path/to/file.js');
    childProcess.on('message', (data) => {
      res.status(201).json(data).end();
    });
  } catch (error) {
    res.status(500).end();
  }
};
Inside this file is
process.on('message', (data) => {
  // do some stuff with data and build a result
  // based on whatever the outcome is
  process.send(result);
  process.exit(result.status);
});
Am I forgetting a crucial part of forking which causes the expressjs server to restart? Thanks in advance for any help.
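No answer is recorded here, but one way to narrow it down is to test the fork/message round trip outside the server. Below is a minimal sketch with hypothetical files parent.js and child.js; if this works on its own, the fork mechanics are fine, and it's worth checking whether something else (for example a file watcher like nodemon observing files the child touches) is restarting the server.

// parent.js - hypothetical standalone test of the fork/message round trip
const { fork } = require('child_process');

const child = fork('./child.js');
child.on('message', (result) => {
  console.log('child replied:', result);
});
child.send({ n: 21 });

// child.js - hypothetical worker that echoes a computed result back
process.on('message', (data) => {
  process.send({ status: 0, doubled: data.n * 2 });
  process.exit(0);
});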

Is it possible to get the process metadata from PM2?

I was wondering if it was possible to get process metadata using pm2 inside my node application.
Yes, you can get any information from pm2 inside your app. The snippet below returns the list of all running processes. For further details, you can check pm2-api.
var pm2 = require('pm2');

app.use('/all_process_list', function (req, res) {
  pm2.connect(function (err) {
    if (err) {
      console.error(err);
      process.exit(2);
    }
    pm2.list(function (err, processDescriptionList) {
      console.log(processDescriptionList);
      res.json({ process_list: processDescriptionList });
      pm2.disconnect(); // Disconnects from PM2
    });
  });
});
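If you only need the metadata for a single process rather than the whole list, pm2 also exposes a describe call. A minimal sketch, assuming a process registered under the name 'webserver':

var pm2 = require('pm2');

pm2.connect(function (err) {
  if (err) {
    console.error(err);
    process.exit(2);
  }
  // Fetch the metadata (pid, memory, uptime, env, ...) for one named process
  pm2.describe('webserver', function (err, description) {
    if (err) {
      console.error(err);
    } else {
      console.log(description);
    }
    pm2.disconnect();
  });
});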

Need to run a NodeJs application from another NodeJs application

I have a Node.js application running in the following directory:
First application's path: '/users/user1/projects/sampleProject', running on port 3000.
Second application's path: '/users/user1/demoProjects/demo1', which should run on port 5000 when a router function in the first application is triggered.
The second Node.js application is not yet started (it will run on port 5000). It needs to run independently when a router function is hit in the first Node.js application running on port 3000, i.e. http://localhost:3000/server/startServer. I'm new to Node.js child processes, so kindly correct me if I'm wrong and suggest the right way to do it. Thanks.
I have tried it like below:
// First NodeJs application
import { exec } from "child_process";

router.get('/startServer', async (req, res, next) => {
  console.log("Initiated request");
  // Path for the second NodeJs application
  let startServerInstance = 'cd "/users/user1/demoProjects/demo1" && npm run dev';
  console.log("Server instance path => " + startServerInstance);
  try {
    // exec from child_process spawns a shell, then executes the command within that shell
    let child = exec(startServerInstance, function (err, stdout, stderr) {
      if (err) throw err;
      else {
        console.log("result");
        res.json({
          status: 'success'
        });
      }
    });
  } catch (error) {
    res.json({
      status: 'error',
      message: error
    });
  }
});
The above code executes the command and triggers the second application to run in the background, but it never returns anything, neither an error nor a success result.
You need to read stdout and stderr to see the other server's logs. Also note that the exec callback is only invoked once the command finishes, so for a long-running server it never fires; that is why you don't see the 'result' text in the console.
import { exec } from "child_process";

router.get('/startServer', async (req, res, next) => {
  console.log("Initiated request");
  // Path for the second NodeJs application
  let startServerInstance = 'cd "/users/user1/demoProjects/demo1" && npm run dev';
  console.log("Server instance path => " + startServerInstance);
  try {
    // exec from child_process spawns a shell, then executes the command within that shell
    let child = exec(startServerInstance, function (err) {
      if (err) throw err;
      console.log("Server started");
    });
    child.stdout.on('data', (data) => {
      // this is the new server's output
      console.log(data.toString());
    });
    child.stderr.on('data', (data) => {
      // this is the new server's error output
      console.log(data.toString());
    });
    res.json({
      status: 'success'
    });
  } catch (error) {
    res.json({
      status: 'error',
      message: error
    });
  }
});
The child process callback is only called once the process terminates. If the process keeps running, the callback is never triggered.
Explained here: https://nodejs.org/docs/latest-v10.x/api/child_process.html#child_process_child_process_exec_command_options_callback
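If you want to respond as soon as the server is actually up, rather than when the process exits, one option is to watch the child's stdout for a readiness message. A minimal sketch, assuming the second app prints something containing 'listening' when it boots (the marker string is an assumption; adjust it to whatever your server logs):

import { exec } from "child_process";

router.get('/startServer', (req, res) => {
  const child = exec('cd "/users/user1/demoProjects/demo1" && npm run dev');

  let responded = false;
  child.stdout.on('data', (data) => {
    const text = data.toString();
    console.log(text);
    // Respond once the server announces it is listening ('listening' is an assumed marker)
    if (!responded && text.includes('listening')) {
      responded = true;
      res.json({ status: 'success' });
    }
  });

  child.on('error', (err) => {
    if (!responded) {
      responded = true;
      res.json({ status: 'error', message: err.message });
    }
  });
});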

Avoid the task completion callback called too many times?

Given the following gulp tasks, I'm able to successfully start the gulp, webpack, and nodemon processes, but the webpack tasks are open-ended, so they continue to fire the completion handler every time their watch/compile cycle completes.
The server task depends on the client task output, so I need these operations to be synchronous, hence the done callback.
function onBuild(done) {
  return function (err, stats) {
    if (err) {
      gutil.log('Error', err);
      if (done) {
        done();
      }
    } else {
      Object.keys(stats.compilation.assets).forEach(function (key) {
        gutil.log('Webpack: output ', gutil.colors.green(key));
      });
      gutil.log('Webpack: ', gutil.colors.blue('finished ', stats.compilation.name));
      if (done) {
        done();
      }
    }
  };
}

// dev watch
gulp.task('webpack-client-watch', function (done) {
  webpack(devConfig[0]).watch(100, function (err, stats) {
    onBuild(done)(err, stats);
  });
});

gulp.task('webpack-server-watch', function (done) {
  webpack(devConfig[1]).watch(100, function (err, stats) {
    onBuild(done)(err, stats);
    nodemon.restart();
  });
});

gulp.task('webpack-watch', function (callback) {
  runSequence(
    'webpack-client-watch',
    'webpack-server-watch',
    callback
  );
});

gulp.task('nodemon', ['webpack-watch'], function () {
  nodemon({
    script: path.join('server/dist/index.js'),
    // ignore everything
    ignore: ['*'],
    watch: ['foo/'],
    ext: 'noop'
  }).on('restart', function () {
    gutil.log(gutil.colors.cyan('Restarted'));
  });
});
When I change a file, the watcher does its thing and gulp complains about the callback being called yet again.
[15:00:25] Error: task completion callback called too many times
I've looked at this, but I'm not sure if it's applicable:
Why might I be getting "task completion callback called too many times" in gulp?
Basically, I just want this to work synchronously and continuously without error.
This solved it for me: Just don't call the callback parameter in your webpack-watch task. Leave it out completely.
After that, the watcher works fine and fast without complaining.
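Alternatively, if you do want gulp to wait for the first build before starting dependent tasks, you can guard the callback so it fires exactly once and is skipped on every rebuild. A minimal sketch of that pattern, reusing the onBuild and devConfig from the question:

gulp.task('webpack-client-watch', function (done) {
  let calledDone = false;
  webpack(devConfig[0]).watch(100, function (err, stats) {
    onBuild()(err, stats); // log every rebuild, but pass no callback
    if (!calledDone) {
      calledDone = true; // signal task completion only after the first build
      done();
    }
  });
});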
If a public folder exists in your application, remove it and re-run; after that, you should see this issue resolved.

Why isn't the MongoClient in my Node.js script finishing?

I have a one-shot Node script that makes some changes to a MongoDB database on MongoLab. However, once it finishes, it never exits the event loop (I always have to Ctrl+C it), no matter how many db.close() and db.logout() calls I make.
What's strange is, if I start a local running instance of mongod and connect to that, the script finishes fine, but the remote connection just never ends.
Here is a short version of my script that still has the issue (taking the URL to the server on the command line). What's going on?
var mongodb = require("mongodb");

function onSuccess(cb) {
  return function (err) {
    if (err) {
      console.error(err);
    } else {
      cb.apply(this, Array.prototype.slice.call(arguments, 1));
    }
  };
}

console.log("Connecting to " + process.argv[2] + ' ...');
mongodb.MongoClient.connect(process.argv[2], onSuccess(function (db) {
  console.log("Connected.");
  db.logout(onSuccess(function (logoutResult) {
    db.close(onSuccess(function (closeResult) {
      console.log("All finished. Can has prompt return nao?");
    }));
  }));
}));
Just tried the code with driver versions 1.2.7/1.2.8 and the newest 1.2.9 against MongoLab, and it works correctly. So more likely it's a weird combination of driver/OS/Node version that's causing this. I suggest upgrading your Node and driver to the latest versions and trying again.
I suspect it has to do with the way you have defined your closures, but I cannot quite put my finger on it.
For what it's worth, below is the approach that I use, and it does close the connection as expected:
MongoClient.connect(dbUrl, function (err, db) {
  if (err) return callback(err);
  var collection = db.collection(dbCollection);
  collection.find().toArray(function (err, items) {
    db.close();
    if (err) return callback(err);
    callback(null, items);
  });
});
You can find a full example here: https://github.com/hectorcorrea/mongoDbSample
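If a remote connection still refuses to release the event loop after close(), a blunt but effective last resort for a one-shot script is to exit explicitly once the work is done. A sketch of that idea, reusing the onSuccess helper from the question; only do this after all writes have completed:

mongodb.MongoClient.connect(process.argv[2], onSuccess(function (db) {
  // ... do the one-shot work here ...
  db.close(onSuccess(function () {
    console.log("All finished.");
    process.exit(0); // stop the event loop even if a socket lingers
  }));
}));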
