How to poll another server periodically from a node.js server? - node.js

I have a Node.js server A with MongoDB as the database.
There is another remote server B (it doesn't need to be Node-based) which exposes an HTTP GET API '/status' and returns either 'FREE' or 'BUSY' as the response.
When a user hits a particular API endpoint in server A (say POST /test), I wish to start polling server B's status API every minute until server B returns 'FREE' as the response. The user doesn't need to wait until server B returns a 'FREE' response (polling B is a background job in server A). Once server A gets a 'FREE' response from B, it shall send out an email to the user.
How can this be achieved in server A, keeping in mind that the number of concurrent users can grow large?

I suggest you use Agenda. https://www.npmjs.com/package/agenda
With Agenda you can create recurring schedules and schedule pretty much anything quite flexibly.
For making the HTTP GET/POST requests themselves, I suggest the request module.
https://www.npmjs.com/package/request
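
As a minimal sketch of that combination (the MongoDB connection string and the job name are placeholders, and the job body is left to you):

const Agenda = require('agenda');

// Connection string is a placeholder; point it at your own MongoDB.
const agenda = new Agenda({ db: { address: 'mongodb://127.0.0.1/agenda-jobs' } });

// 'poll server B' is an arbitrary job name; do the HTTP GET to B in here.
agenda.define('poll server B', async (job) => {
  // e.g. request('http://serverB/status', ...) and email the user on 'FREE'
});

(async () => {
  await agenda.start();
  await agenda.every('1 minute', 'poll server B');
})();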

Going from the example in the Node.js docs, I'd go with something like the code below. I tested it and it works. BTW, I'm assuming here that the API response is something like {"status":"BUSY"} or {"status":"FREE"}.
const http = require('http');

const poll = {
  pollB: function () {
    http.get('http://serverB/status', (res) => {
      const { statusCode } = res;

      let error;
      if (statusCode !== 200) {
        error = new Error(`Request Failed.\n` +
                          `Status Code: ${statusCode}`);
      }
      if (error) {
        console.error(error.message);
        res.resume();
      } else {
        res.setEncoding('utf8');
        let rawData = '';
        res.on('data', (chunk) => { rawData += chunk; });
        res.on('end', () => {
          try {
            const parsedData = JSON.parse(rawData);
            // The important logic comes here
            if (parsedData.status === 'BUSY') {
              setTimeout(poll.pollB, 10000); // request again in 10 secs
            } else {
              // Call the background process you need to
            }
          } catch (e) {
            console.error(e.message);
          }
        });
      }
    }).on('error', (e) => {
      console.error(`Got error: ${e.message}`);
    });
  }
};

poll.pollB();
You'll probably want to play with this script and strip out the parts you don't need, but that's homework ;)
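
To cover the email step from the question, here's a hedged sketch of what could go in the FREE branch above, using nodemailer (the transport settings and addresses are placeholders, not something from the question):

const nodemailer = require('nodemailer'); // npm install nodemailer

// Placeholder SMTP settings; substitute your own transport.
const transporter = nodemailer.createTransport({
  host: 'smtp.example.com',
  port: 587,
  auth: { user: 'username', pass: 'password' }
});

function notifyUser(email) {
  transporter.sendMail({
    from: 'noreply@example.com',
    to: email,
    subject: 'Server B is free',
    text: 'Server B returned FREE, you can proceed.'
  }, (err) => {
    if (err) console.error(err.message);
  });
}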
Update:
To cope with a lot of concurrency in Node.js I'd recommend implementing a cluster or using a framework. Here are some links to start researching the subject, with a minimal cluster sketch after them:
How to fully utilise server capacity for Node.js Web Apps
How to Create a Node.js Cluster for Speeding Up Your Apps
Node.js v7.10.0 Documentation :: cluster
ActionHero.js :: Fantastic node.js framework for implementing an API, background tasks, cluster using http, sockets, websockets
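
As a minimal sketch of the cluster approach (the port and response body are illustrative):

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, forking a replacement`);
    cluster.fork();
  });
} else {
  // Each worker runs its own copy of the server.
  http.createServer((req, res) => {
    res.end('handled by worker ' + process.pid);
  }).listen(8000);
}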

Use a library like request, superagent, or restify-clients to call server B. I would recommend you avoid polling and instead use a webhook when calling B (assuming you are also authoring B). If you can't change B, then setTimeout can be used to schedule the subsequent calls on a one-minute interval.
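
If you do control B, a hedged sketch of the webhook side in server A (the endpoint path and payload shape are made up for illustration):

const http = require('http');

// Server A: let B call us back when its status changes, instead of polling.
http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/b-status-changed') {
    let body = '';
    req.on('data', (chunk) => { body += chunk; });
    req.on('end', () => {
      const { status } = JSON.parse(body);
      if (status === 'FREE') {
        // send the email to the waiting user here
      }
      res.end('ok');
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(3000);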

Related

Progress bar for express / react communicating with backend

I want to make a progress bar that tells the user where in the process of fetching the API my backend is. But it seems like every time I send a response it stops the request. How can I avoid this, and what should I google to learn more? I didn't find anything online.
React:
const { data, error, isError, isLoading } = useQuery('posts', fetchPosts);

if (isLoading) {
  return <p>Loading...</p>;
}
return (data && <p>{data}</p>);
Express:
app.get("/api/v1/testData", async (req, res) => {
  try {
    const info = req.query.info;
    const sortByThis = req.query.sortBy;
    if (info) {
      let yourMessage = "Getting Data";
      res.status(200).send(yourMessage);
      const valueArray = await fetchData(info);
      yourMessage = "Data retrieved, now sorting";
      res.status(200).send(yourMessage);
      const sortedArray = valueArray.filter((item) => item.value === sortByThis);
      yourMessage = "Sorting done, now creating geojson";
      res.status(200).send(yourMessage);
      const geojson = createGeoJson(sortedArray);
      res.status(200).send(geojson);
    } else {
      res.status(400).end();
    }
  } catch (err) {
    console.log(err);
    res.status(500).send(err.message);
  }
});
You can only send one response to a request in HTTP.
In case you want to have status updates using HTTP, the client needs to poll the server i.e. request status updates from the server. Keep in mind though that every request needs to be processed on the server side and will take resources away which are then not available for other (more important) requests from other clients. So don't poll too frequently.
If you want to support long running operations using HTTP have a look at the following API design pattern.
Alternatively you could also use a WebSockets connection to push updates from the server to the client. I assume your computation on the backend will not take minutes and you want to update the client in real time, so WebSockets will probably be the best option for you. A WebSocket connection has, once established, considerably less overhead than sending complete HTTP requests/responses between client and server.
Have a look at this thread, which discusses the above-mentioned and other possibilities.
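
As a minimal sketch of the WebSocket option using the ws package (the port is illustrative; the stage names mirror the messages in your handler):

const WebSocket = require('ws'); // npm install ws

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (ws) => {
  // Push each stage as the work proceeds, instead of multiple HTTP responses.
  ws.send(JSON.stringify({ stage: 'Getting Data' }));
  // ...do the work, then:
  ws.send(JSON.stringify({ stage: 'Data retrieved, now sorting' }));
  ws.send(JSON.stringify({ stage: 'Sorting done, now creating geojson' }));
  ws.close();
});

On the client, new WebSocket('ws://localhost:8080') plus an onmessage handler can feed those stages straight into your progress bar state.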

Axios always times out on AWS Lambda for a particular API

Describe the issue
I'm not really sure if this is an Axios issue or not. The following code runs successfully on my local development machine but always times out when I run it from the cloud (e.g. AWS Lambda). The same thing happens when I run it on repl.it.
I can confirm that AWS Lambda has internet access and it works for any other API but this:
https://www.target.com.au/ws-api/v1/target/products/search?category=W95362
Example Code
https://repl.it/repls/AdeptFluidSpreadsheet
const axios = require('axios');

const handler = async () => {
  const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';
  const response = await axios.get(url, { timeout: 10000 });
  console.log(response.data.data.productDataList);
};

handler();
Environment
Axios Version: 0.19.2
Runtime: nodejs12.x
Update 1
I tried the native require('https') and it times out on both localhost and cloud server. Please find sample code here: https://repl.it/repls/TerribleViolentVolume
const https = require('https');

const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';

https.get(url, res => {
  var body = '';
  res.on('data', chunk => {
    body += chunk;
  });
  res.on('end', () => {
    var response = JSON.parse(body);
    console.log("Got a response: ", response);
  });
}).on('error', e => {
  console.log("Got an error: ", e);
});
Again, I can confirm that same code works on any other API.
Update 2
I suspect that this is something server-side, as it also behaves very weirdly with curl.
curl from local -> 403 access denied
curl from local with User-Agent header -> success
curl from cloud server -> 403 access denied
It must be server-side validation, something related to AkamaiGHost.
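
Given those curl results, a hedged workaround is to send a browser-like User-Agent from the cloud as well (the header value below is illustrative, and the host may still block cloud IP ranges):

const axios = require('axios');

const handler = async () => {
  const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';
  // User-Agent value is illustrative; any browser-like string may do.
  const response = await axios.get(url, {
    timeout: 10000,
    headers: { 'User-Agent': 'Mozilla/5.0' }
  });
  console.log(response.data.data.productDataList);
};

handler();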
You have probably placed your Lambda function in a VPC without internet access to the outside world. Check the VPC section in your Lambda configuration and set up an internet gateway accordingly.
You should try wrapping the axios call in a try/catch; maybe that will catch the issue.
const axios = require('axios');

const handler = async () => {
  try {
    const url = 'https://www.target.com.au/ws-api/v1/target/products/search?category=W95362';
    const response = await axios.get(url, { timeout: 10000 });
    console.log(typeof (response));
    console.log(response);
  } catch (e) {
    console.log(e, "error api call");
  }
};

handler();
As suggested by Akshay, you can use a try/catch block to get the error. Maybe it helps you out.
Have you configured Error Handling for Asynchronous Invocation?
To configure error handling follow the below steps:
Open the Lambda console Functions page.
Choose a function.
Under Asynchronous invocation, choose Edit.
Configure the following settings.
Maximum age of event – The maximum amount of time Lambda retains an event in the asynchronous event queue, up to 6 hours.
Retry attempts – The number of times Lambda retries when the function returns an error, between 0 and 2.
Choose Save.
Axios is just a Promise-based HTTP client for the browser and Node.js, and since you set timeout: 10000, I believe the timeout issue is not on its end.
Your API
https://www.target.com.au/ws-api/v1/target/products/search?category=W95362
is working fine in the browser and rendering JSON data.
Also, the Lambda function timeout can be raised to 15 minutes (the default is only 3 seconds), which I believe is enough for the response. There may be another issue.
Make sure you have set other configurations like permissions etc. as suggested in the documentation.
Here you can check the default limits for AWS lambda.

Terminate EventSource event listener?

I'm trying to work around a problem to do with REST streaming between the Nest API and a service (ST) that does not support streaming.
To get around this, I have built a service on Sails which takes a POST request from ST containing the Nest token, and then starts an EventSource event listener that sends the data back to ST.
It is heavily based on the Nest REST streaming example here:
https://github.com/nestlabs/rest-streaming and my code is as follows:
startStream: function(req, res) {
  var nestToken = req.body.nestToken,
      stToken = req.body.stToken,
      endpointURL = req.body.endpointURL,
      source = new EventSource(sails.config.nest.nest_api_url + '?auth=' + nestToken);

  source.addEventListener('put', function(e) {
    var d = JSON.parse(e.data);
    var data = { devices: d.data.devices, structures: d.data.structures },
        config = { headers: { 'Authorization': 'Bearer ' + stToken } };
    sendData(endpointURL, data, config);
  });

  source.addEventListener('open', function(e) {
    console.log("Connection opened");
  });

  source.addEventListener('auth_revoked', function(e) {
    console.log("Auth token revoked");
  });

  source.addEventListener('error', function(e) {
    if (e.readyState == EventSource.CLOSED) {
      console.error('Connection was closed! ', e);
    } else {
      console.error('An unknown error occurred: ', e);
    }
  }, false);
}
};
The problem I foresee, though, is that once a request is received by the Node server, it starts the event listener, and I cannot for the life of me figure out how to kill that event listener.
If I cannot figure out a way to stop it, then every EventListener will run indefinitely, which is obviously not suitable.
Has anyone got any suggestions on how to overcome the issue?
Each SSE client connection is a dedicated socket.
If a particular client doesn't want event streaming, don't make the connection. If they start event streaming but want to turn it off, call source.close(); source = null;
If from the server side you want to stop sending the messages, close the socket.
You didn't show the server-side code, but if it is running a dedicated process per SSE client then you just exit the process. If you are maintaining a list of sockets, one per connected client, close the socket. On node.js you might be running a function on setInterval; to close the connection you call clearInterval() and response.end().
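
Applied to the Sails code above, one hedged sketch is to keep each EventSource in a module-level map so a later request can close it (keying on stToken and the stopStream action are my assumptions, not part of the original):

// Keep a handle to each EventSource so it can be closed later.
var sources = {};

module.exports = {
  startStream: function(req, res) {
    var source = new EventSource(sails.config.nest.nest_api_url + '?auth=' + req.body.nestToken);
    sources[req.body.stToken] = source;
    // ...addEventListener calls as in the question...
    res.ok();
  },

  stopStream: function(req, res) {
    var source = sources[req.body.stToken];
    if (source) {
      source.close(); // tears down the SSE connection and its listeners
      delete sources[req.body.stToken];
    }
    res.ok();
  }
};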

Node.js and understanding how response works

I'm really new to node.js so please bear with me if I'm making an obvious mistake.
To understand node.js, I'm trying to create a webserver that basically:
1) updates the page by appending "hello world" every time the root URL (localhost:8000/) is hit.
2) lets a user go to another URL (localhost:8000/getChatData) that displays all the data built up from the root URL being hit.
Problem I'm experiencing:
1) I'm having an issue with displaying that data on the rendered page. I have a timer that should call get_data() every second and update the screen with the data variable that stores the appended output. Specifically, the line response.simpleText(200, data); below isn't working correctly.
The file
// Load the node-router library by creationix
var server = require('C:\\Personal\\ChatPrototype\\node\\node-router').getServer();

var data = null;

// Configure our HTTP server to append "hello world" and respond with the data on the root request
server.get("/", function (request, response) {
  if (data != null) {
    data = data + "hello world\n";
  } else {
    data = "hello world\n";
  }
  response.writeHead(200, {'Content-Type': 'text/plain'});
  console.log(data);
  response.simpleText(200, data);
  response.end();
});

// Configure our HTTP server to report the accumulated data on /getChatData
server.get("/getChatData", function (request, response) {
  setInterval(function () { get_data(response); }, 1000);
});

function get_data(response) {
  if (data != null) {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    response.simpleText(200, data);
    console.log("data: " + data);
    response.end();
  } else {
    console.log("no data");
  }
}

// Listen on port 8000 on localhost
server.listen(8000, "localhost");
If there is a better way to do this, please let me know. The goal is basically to have a way for the server to update a variable when a URL is hit, and to have another page report/display the updated data dynamically every second.
Thanks,
D
The client-server model works by the client sending a request to the server and the server sending a response in return. The server cannot send the client a response it hasn't asked for; the client initiates the request. Therefore you cannot have the server changing the response object on an interval.
The client will not get those changes. The way something like this is usually handled is through AJAX: the initial response from the server sends JavaScript code to the client, and that code then makes requests to the server on an interval.
setInterval accepts a function without parameters, which is expected, since that function is executed later in time; all values it needs must still be available at that point. In your case, the response object you are trying to use is a local instance whose scope is limited to server.get's callback (where you set the setInterval).
There are several ways you can resolve this issue. You can keep a copy of the response instance in the outer scope where get_data lives, or you can move get_data entirely inside the callback and remove the setInterval. The first solution is not recommended: if getChatData is called several times within one second, only the last copy will prevail.
But my suggestion would be to keep the data in a database and return it once getChatData is called.
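
As a hedged sketch of the AJAX polling described above (the element id and interval are illustrative):

// Client-side: poll /getChatData once a second and render the result.
setInterval(function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/getChatData');
  xhr.onload = function () {
    if (xhr.status === 200) {
      // 'chat' is an illustrative element id on the page
      document.getElementById('chat').textContent = xhr.responseText;
    }
  };
  xhr.send();
}, 1000);

On the server, /getChatData would then simply respond once per request with the current value of data.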

fetch data from multiple tables by sending only one request

I am using node.js for server-side development and backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to node.js, but I can't merge the results because of node.js's asynchronous execution. What I have now sends lots of GET requests to node.js to get the data from all the tables, and because of this my site's performance has become slower. Please help if you have any ideas.
I would create a method which aggregates the results from each of the requests and sends the response back. Basically, each of your three async db calls would pass its data to the same method. That method would check whether it had all of the data it needed to complete the request, and if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
  var results = {};

  db.getUsers(function(data) {
    aggregate('users', data);
  });

  db.getPosts(function(data) {
    aggregate('posts', data);
  });

  db.getComments(function(data) {
    aggregate('comments', data);
  });

  function aggregate(name, data) {
    results[name] = data;
    if (results.users && results.posts && results.comments) {
      res.send(results);
    }
  }
}
This is greatly simplified, and of course you should also check for errors and timeouts on the db calls, but it allows you to wait for all the async commands to complete before sending the data.
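
If your database layer returns promises (an assumption; the method names below just mirror the pseudo code), the same aggregation collapses to Promise.all:

function handleRequest(req, res) {
  // Assumes db.getUsers/getPosts/getComments return promises.
  Promise.all([db.getUsers(), db.getPosts(), db.getComments()])
    .then(function (values) {
      res.send({ users: values[0], posts: values[1], comments: values[2] });
    })
    .catch(function (err) {
      res.status(500).send(err.message);
    });
}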
