Sorry if this is a basic question. I'm just starting my 3rd week of doing Node.js programming! I looked around and didn't see an answer to this, specifically. Maybe it's just assumed when answering questions about child_process.spawn/fork by those who know this stuff better than I do.
I have a Node/Express app where I want to take in an HTTP request, save a bit of data to Mongo, return success/error, but...at the same time kick off a process to take some of the data and do a lookup against a web API. I want to save that data back to Mongo, but there's no need to have that communicated back to the HTTP client. (I'll probably log the success/error of that call somewhere.)
How do I kick off that 2nd task to run independent of the main request and not cause the response to wait for it to complete?
The 2nd task will also be written in Node.js. I'd like it to just be another function in the same file, if possible.
Thanks in advance!
I don't see why you would need to spawn another process just for that. In Node, unlike in some other frameworks, you are not limited to the HTTP request lifecycle when running code. This should do it:
function yourHandler(req, res, next) {
  dataAccess.writeToMongo(someData, function(err, writeResult) {
    // note: don't name this callback's second argument "res",
    // or it will shadow the Express response object
    var status = err ? 500 : 200;

    // write back to the response already!
    res.status(status);
    res.end();

    // the request is finished, but this callback keeps running:
    // kick off the web api call
    apiClient.doSomething();
  });
}
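Since you mentioned wanting to log the success/error of the second call somewhere, here is a hedged sketch of the same handler with the background call's outcome logged; the callback signatures of apiClient.doSomething and the second writeToMongo call are assumptions:

function yourHandlerWithLogging(req, res, next) {
  dataAccess.writeToMongo(someData, function(err) {
    res.status(err ? 500 : 200).end();

    // fire-and-forget: the response is already sent, so errors here
    // can only be logged, never reported to the client
    apiClient.doSomething(function(apiErr, apiResult) {
      if (apiErr) return console.error('web API lookup failed:', apiErr);
      dataAccess.writeToMongo(apiResult, function(saveErr) {
        if (saveErr) console.error('saving API result failed:', saveErr);
      });
    });
  });
}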
I am using this (contentful-export) library in my express app like so
const express = require('express');
const app = express();
...
app.get('/export', (req, res, next) => {
  const contentfulExport = require('contentful-export');
  const options = {
    ...
  };
  contentfulExport(options).then((result) => {
    res.send(result);
  });
});
Now this does work, but the method takes a bit of time and prints status/progress messages to the node console. I would like to keep the user updated as well. Is there a way I can send those node console progress messages to the client?
This is my first time using node/express, so any help would be appreciated. I'm not sure if this already has an answer, since I'm not entirely sure what to call it.
Looking at the documentation for contentful-export, I don't think this is possible. The way this usually works in Node is that you have an object (contentfulExport in this case); you call a method on this object, and the same object is also an EventEmitter. This way you get a hook to react to fired events.
// pseudo code
someLibrary.on('someEvent', (event) => { /* do something */ })
someLibrary.doLongRunningTask()
.then(/* ... */)
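To make that pattern concrete, here is a minimal, self-contained sketch of how such a library is typically structured; the Exporter class below is made up, but it shows the combination of an EventEmitter (for progress events) and a promise (for completion):

const { EventEmitter } = require('events');

class Exporter extends EventEmitter {
  run() {
    return new Promise((resolve) => {
      let done = 0;
      const timer = setInterval(() => {
        done += 25;
        this.emit('progress', done); // consumers can subscribe to this
        if (done >= 100) {
          clearInterval(timer);
          resolve({ ok: true });
        }
      }, 100);
    });
  }
}

const exporter = new Exporter();
exporter.on('progress', (pct) => console.log(`progress: ${pct}%`));
exporter.run().then(() => console.log('finished'));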
This is not documented for contentful-export so I assume that there is no way to hook into the log messages that are sent to the console.
Your question has another tricky angle though. In the code you shared you include a single endpoint (/export). If you would like to display updates or show some progress you'd probably need a second endpoint giving information about the progress of your long running task (which you can not access with contentful-export though).
The way this is usually handled is that you kick off the long-running task via one HTTP endpoint and then use another endpoint that serves progress information via polling or a WebSocket connection.
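For a task that does expose progress, a minimal sketch of the polling variant could look like this; the jobs store, the runLongTask stub, and both endpoint paths are hypothetical:

const express = require('express');
const app = express();

const jobs = {}; // jobId -> { progress, result }

// hypothetical stand-in for the real long-running task
function runLongTask(onProgress) {
  return new Promise((resolve) => {
    let pct = 0;
    const timer = setInterval(() => {
      pct += 10;
      onProgress(pct);
      if (pct >= 100) {
        clearInterval(timer);
        resolve({ ok: true });
      }
    }, 500);
  });
}

// kick off the task and respond immediately with a handle
app.post('/export', (req, res) => {
  const jobId = Date.now().toString(36);
  jobs[jobId] = { progress: 0, result: null };
  runLongTask((pct) => { jobs[jobId].progress = pct; })
    .then((result) => { jobs[jobId].result = result; });
  res.json({ jobId });
});

// the client polls this endpoint for updates
app.get('/export/:jobId/status', (req, res) => {
  const job = jobs[req.params.jobId];
  if (!job) return res.status(404).end();
  res.json({ progress: job.progress, done: job.result !== null });
});

app.listen(3000);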
Sorry that I can't give a proper solution, but due to the limitations of contentful-export I don't think there is a clean/easy way to show progress of the exported data.
Hope that helps. :)
I want to be able to submit a new version of my app via the browser, then update the source, install/update all npm packages, and restart the server.
Right now I do it via a POST request. My app saves an archive with the new version to a local directory and then runs a bash script that actually stops the server and performs the update.
The problem is that the server stops before the client gets the response. I use forever to run my node app.
The question: is there any standard way to update the app? Is it possible to do it without a server restart?
hahahah wow omg this is just out there in so many ways. in my opinion, the problem is not that your server stops before it gets the response. it's that you aren't attacking the problem from the right angle. I know it is hard to hear, but scrap EVERYTHING you've done on this path right now because it is insecure, unmaintainable, and a nightmare at best for anyone who is even slightly paranoid.
Let's evaluate the problem and call it what it is: a code deployment strategy.
That said, this is a TERRIBLE deployment strategy. Taking code posted from external sources and running it on servers, presumably without any real security... are you for real?
Imagine a world where you could publish your code and it would automatically deploy onto the servers following that repository. Sounds sort of like what you want, right? Guess what!?! It exists already! AND without the middleman HTTP POST of code from who knows where. I'll be honest, it's an area I personally need to explore more, so I'll add more as I delve in. All that aside, since you described your process in such a vague way, I think an adequate answer should point you towards things like setting up a git repository, enabling git hooks, pushing updates to a code repository, etc. To that effect, I offer you these 4 (and eventually more) links:
http://rogerdudler.github.io/git-guide/
https://gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa
https://readwrite.com/2013/09/30/understanding-github-a-journey-for-beginners-part-1/
https://readwrite.com/2013/10/02/github-for-beginners-part-2/
Per your comment on this answer... ok. I still stand by what I've said though, so you've been warned! :) Now, to continue on your issue.
Yes, the running node process needs to be restarted, or it will still be using the old code already loaded into memory. Unfortunately, since you didn't leave any code or execution logic, I have only one guess to possibly solve your problem.
You're saying the server stops before you get the response. Try restarting your server only AFTER the response has actually been sent. Something like this, for ExpressJS as an example:
function postCallback(req, res, next) {
  // handle all your code deployment, npm install etc.
  res.json(true); // or whatever you want the response to contain

  // res.json() does not return a promise, so hook the response's
  // 'finish' event, which fires once the reply has gone out
  res.on('finish', () => restartServer());
}
You might need to watch out for res.end(): it only ends the response itself, it does not stop the rest of the function from executing. Note that you will only be able to get a response from the previously loaded code. Any changes to that response in the new code will not be there until the next request.
Wow.. how about something like the plain old exec?
const express = require('express'),
      { exec } = require('child_process'),
      bodyParser = require('body-parser');

const app = express();

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({
  extended: true
}));

app.post('/exec', function(req, res) {
  // run whatever shell command the client posts (see below for why not to)
  exec(req.body.cmd, (err, stdout, stderr) => {
    if (err) {
      return res.status(500).end();
    }
    console.log(`stdout: ${stdout}`);
    console.log(`stderr: ${stderr}`);
    res.send(stdout);
  });
});
(Obviously I'm joking.)
I understand that Node.js is a single-threaded process, but if I have to run a long database process, do I need to start a web worker to do that?
For example, in a Sails.js app I can call the database to create a record, but if the database call takes time to finish, it will block other users from accessing the database.
Below is sample code I tried:
var test = function(cb) {
  for (var i = 0; i < 10000; i++) {
    Company.create({ companyName: 'Walter Jr' + i }).exec(cb);
  }
};

test(function(err, result) {
});

console.log("return to client");
return res.view('cargo/view', {
  model: result
});
On the first request, I see the return almost instantly. But if I request it again, I have to wait for all the records to be inserted before it returns the view again.
What is the common practice for this kind of blocking issue?
Node.js has non-blocking, asynchronous IO.
Read the article below; it will help you restructure your code:
http://hueniverse.com/2011/06/29/the-style-of-non-blocking/
Also, start using Promises; they help you avoid writing blocking IO.
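As an illustration of the restructuring, here is a hedged sketch of the handler from the question: it responds immediately and runs the inserts in the background, chained one after another so the event loop stays free. This assumes Company.create(...) returns a promise when .exec is omitted, as in newer Waterline versions; adapt to your ORM.

function handler(req, res) {
  // respond right away; the inserts continue in the background
  res.view('cargo/view', { model: null });

  var names = [];
  for (var i = 0; i < 10000; i++) {
    names.push('Walter Jr' + i);
  }

  // chain the creates sequentially instead of firing 10000 at once
  names.reduce(function(chain, name) {
    return chain.then(function() {
      return Company.create({ companyName: name });
    });
  }, Promise.resolve()).then(function() {
    console.log('all records created');
  }, function(err) {
    console.error('insert failed:', err);
  });
}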
Full disclosure: I'm very new to the totally asynchronous model.
In my application there are a number of instances where information needs to be committed to the db, but the application can continue on without knowing the result. Is it acceptable to render a page before waiting for a db write to complete?
Yes. For example:
app.get('/', function(req, res, next) {
  res.jsonp({
    message: 'Hello World!'
  });

  var i = 0;
  while (true) {
    i++;
  }
});
When a user visits '/', he will see the result immediately. But if only one node instance is running, when another user visits '/', he won't receive any response, because the only instance is stuck in an infinite loop.
If you have a lot of heavy work to do (for example, CPU-bound work), it's much better to use a message queue such as MSMQ or AMQP instead of having all the work done in the node instance.
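For example, here is a rough sketch of handing work off to an AMQP queue with the amqplib package; the queue name, broker URL, and payload shape are all assumptions:

const amqp = require('amqplib');

// push a job onto a durable queue; a separate worker process consumes it
async function enqueue(payload) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('heavy-jobs', { durable: true });
  ch.sendToQueue('heavy-jobs', Buffer.from(JSON.stringify(payload)), { persistent: true });
  await ch.close();
  await conn.close();
}

// in the route handler: respond first, then queue the heavy work
// enqueue({ userId: 42 }).catch(console.error);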
Sure. But how would you notify the user of an error if something did go wrong? Unless you're doing sockets or ajax or something, requests are the standard way.
I'm writing a proxy in Node.js + Express 2. The proxy should:
decrypt POST payload and issue HTTP request to server based on result;
encrypt reply from server and send it back to client.
The encryption-related part works fine. The problem I'm facing is timeouts. The proxy should process requests in less than 15 seconds, and most of them are under 500 ms, actually.
The problem appears when I increase the number of parallel requests. Most requests complete OK, but some fail after 15 seconds plus a couple of milliseconds. ab -n5000 -c300 works fine, but with a concurrency of 500 it fails for some requests with a timeout.
I can only speculate, but it seems that the problem is the order of callback execution. Is it possible that the requests that come in first hang until ETIMEDOUT because node focuses on the latest ones, which are still being processed within 500 ms?
P.S.: There is no problem with the remote server. I'm using request for interactions with it.
upd
The way things work, with some code:
function queryRemote(req, res) {
  var options = {}; // built based on req object (URI, body, authorization, etc.)
  request(options, function(err, httpResponse, body) {
    return err ? send500(req, res)
               : res.end(encrypt(body));
  });
}
app.use(myBodyParser); // reads hex string in payload
                       // and calls next() on 'end' event

app.post('/', [checkHeaders,  // check Content-Type and Authorization headers
               authUser,      // query DB and call next()
               parseRequest], // decrypt payload, parse JSON, call next()
  function(req, res) {
    req.socket.setTimeout(TIMEOUT);
    queryRemote(req, res);
  });
My problem is the following: when ab issues, let's say, 20 POSTs to /, the express route handler gets called something like thousands of times. That's not always the case; sometimes 20 and only 20 requests are processed in a timely fashion.
Of course, ab is not the problem. I'm 100% sure that only 20 requests are sent by ab, but the route handler gets called multiple times.
I can't find a reason for this behaviour. Any advice?
Timeouts were caused by using http.globalAgent, which by default can process up to 5 concurrent requests to one host:port (which isn't enough in my case).
Thousands of requests (instead of tens) were sent by ab (a fact confirmed with Wireshark under OS X; I cannot reproduce this under Ubuntu inside Parallels).
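For reference, a small sketch of lifting that limit; the numbers are arbitrary, and the pool option is the request library's mechanism for supplying a custom agent per call:

// raise the global agent's limit (older Node defaulted maxSockets to 5)
var http = require('http');
http.globalAgent.maxSockets = 100; // or Infinity

// or give the request library its own pool with a higher limit
var request = require('request');
request({
  url: 'http://example.com', // placeholder URL
  pool: { maxSockets: 100 }
}, function(err, httpResponse, body) {
  // ...
});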
You can have a look at the node-http-proxy module and how it handles the connections. Make sure you don't buffer any data and everything works by streaming. You should also try to see where the time is spent for those long requests. Try instrumenting parts of your code with console.time and console.timeEnd and see what is taking the most time (see the sketch after this paragraph). If the time is mostly spent in JavaScript, you should try to profile it. Basically, you can use the v8 profiler by adding the --prof option to your node command, which produces a v8.log that can be processed via a v8 tool found in node-source-dir/deps/v8/tools. It only works if you have installed the d8 shell via scons (scons d8). You can have a look at this article to help you get this working.
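As a starting point, here is a minimal sketch of that instrumentation applied to the queryRemote function from the question; the label just needs to be unique per request:

function queryRemote(req, res) {
  var options = {}; // built from req, as before
  var label = 'remote ' + Date.now() + ' ' + Math.random(); // unique label
  console.time(label);
  request(options, function(err, httpResponse, body) {
    console.timeEnd(label); // prints e.g. "remote ...: 523ms"
    return err ? send500(req, res)
               : res.end(encrypt(body));
  });
}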
You can also use node-webkit-agent, which uses the WebKit developer tools to show the profiler results. You can also have a look at my fork, which adds a bit of sugar.
If that doesn't work, you can try profiling with dtrace (this only works on illumos-based systems like SmartOS).