I have a script file in Node.js with a function that recursively calls itself after a timeout.
This function sends a "message" to which the parent is listening.
The parent is nothing but a REST API with basic CRUD operations.
In this file, this is what the forking part looks like:
var myBgTask = require('child_process').fork('./server/api/thing/bgTask.js', [], { execArgv: ['--debug=5859'] });
myBgTask.on('message', function (data) {
    // DO SOMETHING
});
Now, when I make a request from my Angular code to update the database, the child process somehow gets interrupted and throws a "Channel closed" error at this line:
process.send({
    name: randomThing,
    readByUser: false
}, function (err) {
    console.log("error", err);
    if (!err)
        setTimeout(autoCreate, randomNumb * 1000);
});
and thus my server stops, and I am never able to make HTTP POST/PUT calls.
The strange part is that it throws the error only when I am making POST or PUT calls, never with GET calls.
I have been trying to debug this but have not been able to find out what the problem is. Can I get some help here on this?
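For completeness, the sending side in bgTask.js is essentially the recursive loop below (simplified; pickRandomThing and the delay calculation are placeholders for my actual logic):

function autoCreate() {
    var randomThing = pickRandomThing();            // placeholder for my real data
    var randomNumb = Math.ceil(Math.random() * 10); // placeholder delay in seconds
    process.send({
        name: randomThing,
        readByUser: false
    }, function (err) {
        if (!err)
            setTimeout(autoCreate, randomNumb * 1000); // re-arm after a random delay
    });
}

autoCreate();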
Consider a Node.js application with a few processes:
a single main process sitting in memory and working like a web server;
system user commands that can be run through the CLI and exit when they are done.
I want to implement something like IPC between the main and CLI processes, and the ZeroMQ bindings for Node.js seem like a good candidate for doing that. I've chosen the 6.0.0-beta.4 version:
Version 6.0.0 (in beta) features a brand new API that solves many fundamental issues and is recommended for new projects.
Using Request/Reply, I was able to achieve what I wanted: the CLI process notifies the main process about some event (and optionally receives some data as a response) and continues its execution. The problem I have right now is that my CLI process hangs if the main process is off (not available). The command should still execute and exit without notifying the main process if it is unable to establish a connection to the socket.
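For context, the main process serves replies with something like this (simplified; the actual event handling is omitted):

const { Reply } = require('zeromq');

async function listen() {
    const socket = new Reply();
    await socket.bind('tcp://127.0.0.1:33332');

    // each received request gets exactly one reply
    for await (const [msg] of socket) {
        const event = JSON.parse(msg.toString());
        // ...handle the notification...
        await socket.send(JSON.stringify({ status: 'success' }));
    }
}

listen();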
Here is a simplified snippet of my CLI code, running in an asynchronous function:
const { Request } = require('zeromq');

async function notify() {
    let parsedResponse;
    try {
        const message = { event: 'hello world' };
        const socket = new Request({ connectTimeout: 500 });
        socket.connect('tcp://127.0.0.1:33332');
        await socket.send(JSON.stringify(message));
        const response = await socket.receive();
        parsedResponse = JSON.parse(response.toString());
    }
    catch (e) {
        console.error(e);
    }
    return parsedResponse;
}

(async () => {
    const response = await notify();
    if (response) {
        console.log(response);
    }
    else {
        console.log('Nothing is received.');
    }
})();
I set the connectTimeout option but wonder how to use it. The docs state:
Sets how long to wait before timing-out a connect() system call. The connect() system call normally takes a long time before it returns a time out error. Setting this option allows the library to time out the call at an earlier interval.
Looking at connect, one sees that it's not asynchronous:
Connects to the socket at the given remote address and returns immediately. The connection will be made asynchronously in the background.
OK, so presumably the send method of the socket will wait for connection establishment and reject its promise on a connection timeout... but nothing happens there. The send method executes, and the code gets stuck resolving receive, waiting for a reply from the main process that will never come. So the main question is: "How do I use the connectTimeout option to handle a socket's connection timeout?" I found an answer to a similar question related to C++, but it doesn't actually answer the question (or I can't understand it). I can't believe that this option is useless and was added to the API so that nobody can use it.
I would also be happy with some kind of workaround, and I found the receiveTimeout option. Changing the socket creation to
const socket = new Request({ receiveTimeout: 500 });
leads to a rejection in the receive method and the following output:
{ [Error: Socket temporarily unavailable] errno: 11, code: 'EAGAIN' }
Nothing is received.
The code executes, but the process doesn't exit in this case. It seems that some resources are busy and not freed. When the main process is up, everything works fine: the process exits and I get the following reply in the output:
{ status: 'success' }
So another question is: "How do I exit the process gracefully when the receive method rejects due to receiveTimeout?" Calling process.exit() is not an option here!
P.S. My environment is:
Kubuntu 18.04.1;
Node 10.15.0;
ZeroMQ bindings are installed this way:
$ yarn add zeromq#6.0.0-beta.4 --zmq-shared
ZeroMQ decouples the socket connection mechanics from message delivery. As the documentation states, connectTimeout only influences the timeout of the connect() system call and does not affect the timeouts of sending/receiving messages.
For example:
const zmq = require("zeromq")

async function run() {
    const socket = new zmq.Dealer({ connectTimeout: 2000 })
    socket.events.on("connect:retry", event => {
        console.log(new Date(), event.type)
    })
    socket.connect("tcp://example.com:12345")
}

run()
The connect:retry event occurs every ~2 seconds:
> node test.js
2019-11-25T13:35:53.375Z connect:retry
2019-11-25T13:35:55.536Z connect:retry
2019-11-25T13:35:57.719Z connect:retry
If we change connectTimeout to 200, the event occurs much more frequently. The timeout is not the only thing influencing the delay between the events, but it should be clear that it happens much quicker.
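The only change is the value passed to the constructor:

const socket = new zmq.Dealer({ connectTimeout: 200 })

With that, the retry events arrive much closer together: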
> node test.js
2019-11-25T13:36:05.271Z connect:retry
2019-11-25T13:36:05.531Z connect:retry
2019-11-25T13:36:05.810Z connect:retry
Hope this clarifies the effect of connectTimeout.
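As for the second question, about the process not exiting: the open socket keeps the event loop alive until it is closed. A sketch of a workaround (assuming close() in 6.0.0-beta.4 behaves as documented) is to close the socket in a finally block:

const { Request } = require('zeromq');

async function notify() {
    const socket = new Request({ receiveTimeout: 500 });
    try {
        socket.connect('tcp://127.0.0.1:33332');
        await socket.send(JSON.stringify({ event: 'hello world' }));
        const [response] = await socket.receive();
        return JSON.parse(response.toString());
    } catch (e) {
        console.error(e);
    } finally {
        socket.close(); // release the socket so the event loop can drain and the process can exit
    }
}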
I have my MongoDB service stopped, so I know that my front end is not connected to my DB. I am using React and Express.
When my app starts, I want to somehow indicate to the user that the server is offline, so I figured that if my original GET call for users fails, then the server is offline.
I'm doing a simple call:
componentDidMount() {
    axios.get('/api/users')
        .then((res) => this.setState(
            { users: res.data }
        ))
        .catch((error) => {
            //console.error(error);
            console.log('error found : offline');
        });
}
But nothing happens in this situation; the catch never fires and I never see the console message. Am I going about this wrong? I'm new to the backend, so this is all a learning experience for me.
I was going to set a failed flag, render an error display for the user, and then retry the connection every 1500 ms or so (is that bad programming?).
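The retry I have in mind would look roughly like this (the offline flag, the fetchUsers name, and the 1500 ms interval are just placeholders):

fetchUsers = () => {
    axios.get('/api/users')
        .then((res) => {
            this.setState({ users: res.data, offline: false });
            clearInterval(this.retryTimer); // stop polling once the server answers
        })
        .catch(() => {
            this.setState({ offline: true }); // render the error display off this flag
        });
};

componentDidMount() {
    this.fetchUsers();
    this.retryTimer = setInterval(this.fetchUsers, 1500);
}

componentWillUnmount() {
    clearInterval(this.retryTimer); // avoid setState on an unmounted component
}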
I will share the code directly:
app.get('/ListBooks', function (req, res) {
    console.log("Function called");
    // internally calls another URL and sends its response to the browser
    request({
        url: 'someURLinRESTServer',
        method: 'POST',
        json: MyJsonData
    }, function (error, response, body) {
        if (error) {
            console.log("/Call Failed ->" + error);
            res.status(200).send('Failed');
        } else {
            console.log("/Call got Response");
            console.log(response.statusCode, body);
            res.send(body); // send() already ends the response
        }
    });
});
Now, when the browser makes a request to http://localhost/ListBooks, my Node console shows the first message, "Function called", and waits for the internal REST URL's response.
The real problem occurs only when the REST server is down.
If I then call http://localhost/ListBooks from another browser tab, the Node server console doesn't show any changes, and only after the previous REST call gets its response does it display the console message for the second call to app.get('/ListBooks').
I thought Node.js makes functions asynchronous, but here I don't want functions to wait like this for multiple simultaneous calls.
Or is it just a delay in printing the message, with each function call executing separately? Please clarify.
If this is only occurring when the REST server is down (as your comment indicates), then that's just a function of how long your calls to request() take to fail. And, each separate call to request() goes through its own cycle of trying to connect and then eventually timing out. If both are timing out, then you will issue request1, then request2, then some timeout amount of time will pass and request1 will fail and then request2 will fail shortly after it. This has nothing to do with how express handles multiple requests and everything to do with how the calls to your REST server behave.
You can set the timeout option for request() if you want to shorten how long it will wait for a response, but you do need to make sure you don't shorten it so much that a busy REST server that just takes a little while to actually respond gets timed out.
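For example, inside the handler above (the 5000 ms value is only illustrative):

request({
    url: 'someURLinRESTServer',
    method: 'POST',
    json: MyJsonData,
    timeout: 5000 // give up if no response arrives within 5 seconds
}, function (error, response, body) {
    if (error && (error.code === 'ETIMEDOUT' || error.code === 'ESOCKETTIMEDOUT')) {
        // the REST server did not respond in time
        res.status(504).send('REST server timed out');
        return;
    }
    // ...handle success/failure as before...
});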
Or is it just a delay in printing the message, with each function call executing separately?
Each call is acting completely separately. There is no serialization of these responses by node.js or by Express. The appearance of serialization is just because they both take the same amount of time to fail with a timeout so they will fail one after the other.
I have some proxy code like the snippet below. The problem is that whenever the target server is down, this code fails to capture the error, and the entire application crashes with Error: connect ECONNREFUSED.
For a proxy server, this is terrible; it needs to just return an error to the caller, not crash altogether the first time the target server is unreachable.
What is the right way around it these days?
Node version 6.
let targetUrl = "http://foo.com/bar"

app.options('/cors-proxy/bar', cors())
app.post('/cors-proxy/bar', function(req, res) {
    console.log(`received message with method ${req.method} and some body ${req.body}`)
    console.log(`relaying message to ${targetUrl}`)
    try {
        req.pipe(
            request({
                url: targetUrl,
                method: req.method,
                json: req.body
            })
        ).pipe(res);
    } catch (err) {
        res.status(502)
        res.render('error', {
            message: err.message,
            error: err
        });
    }
});
Thanks!
In general, you can't use try/catch to catch exceptions that may occur in asynchronous callbacks or asynchronous operations. That will only catch synchronous errors.
Instead, you have to read how each particular asynchronous operation reports errors and make sure you are plugged into that particular mechanism.
For example, streams report errors by emitting an error event, which you intercept with stream.on('error', ...). A request() call can report errors in several different ways depending upon which request() library you are actually using and how you are using it.
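Applied to the proxy code above, a sketch might look like this (the 502 handling mirrors your catch block):

app.post('/cors-proxy/bar', function(req, res) {
    const proxied = request({
        url: targetUrl,
        method: req.method,
        json: req.body
    });

    // errors such as ECONNREFUSED are emitted on the stream,
    // so they must be handled here rather than with try/catch
    proxied.on('error', function(err) {
        res.status(502);
        res.render('error', { message: err.message, error: err });
    });

    req.pipe(proxied).pipe(res);
});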
Some references:
Error handling with node.js streams
Stream Readable Error
How Error Events Affect Piped Streams in Node.js
I have very limited knowledge about Node and non-blocking I/O, so forgive me if my question is too naive.
In order to return the needed information in the response body, I need to:
Make a call to a 3rd-party API.
Wait for the response.
Apply some modifications and return a JSON response with the information I got from the API.
My question is: how can I wait for the response? Or is it possible to send the information to the client only once I have received the response from the API? (As far as I know, the connection would have to be bidirectional in that case, which means I couldn't do it over HTTP.)
And yet another question: if one request is waiting for a response from the API, does this mean that other users will be forced to wait too (since Node is single-threaded) until I increase the number of threads/processes from 1 to N?
You pass a callback to the function which calls the service. If the service is a database, for example:
db.connect(host, callback);
And somewhere else in the code:
var callback = function (err, dbObject) {
    // The connection was made, it's safe to handle the code here
    console.log(dbObject.status);
    res.json(jsonObject, 200);
};
Or you can use an anonymous function:
db.connect(host, function (err, dbObject) {
    // The connection was made, it's safe to handle the code here
    console.log(dbObject.status);
    res.json(jsonObject, 200);
});
Between the call and the callback, node handles other clients / connections freely, "non-blocking".
This type of situation is exactly what Node was designed to solve. Once you receive the request from your client, you can make an HTTP request, which takes a callback parameter. The callback function will be called when the request is done, but Node can do other work (including serving other clients) while you are waiting for the response. Once the request completes, you return the response to the client that is still waiting.
The amount of memory and CPU used by the node process will increase as additional clients connect to it, but only one process is needed to handle many simultaneous clients.
Node focuses on doing slow I/O asynchronously, so that the application code can start a task, and then have code start executing again after the I/O has completed.
A typical example might make it clear. We make a call to the FB API; when we get a response, we modify it and then send JSON to the user.
var express = require('express');
var fb = require('facebook-js');

var app = express(); // create the Express app

app.get('/user', function (req, res) {
    fb.apiCall('GET', '/me/', { access_token: access_token }, function (error, response, body) { // access FB API
        // when FB responds, this part of the code will execute
        if (error) {
            throw new Error('Error getting user information');
        }
        body.platform = 'Facebook'; // modify the Facebook response, available as JSON in body
        res.json(body); // send the response to the client
    });
});
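For comparison, the same flow can be written with a promise-based client and async/await (a sketch using axios; the Graph API URL is illustrative and access_token is assumed to be available as in the snippet above):

const express = require('express');
const axios = require('axios');

const app = express();

app.get('/user', async (req, res) => {
    try {
        // await suspends only this handler; Node keeps serving other clients meanwhile
        const { data } = await axios.get('https://graph.facebook.com/me', {
            params: { access_token: access_token }
        });
        data.platform = 'Facebook'; // modify the response
        res.json(data);             // send it to the waiting client
    } catch (error) {
        res.status(502).json({ error: 'Error getting user information' });
    }
});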