Node.js ending HTTPS response early

I have a Node.js application that sends HTTPS requests to an external API and retrieves JSON data (in the form of a hex buffer) in response.
However, when I try to pull large data sets from the API, the data is incomplete. The JSON cuts off about a third of the way through, and trying to parse it produces an error. I don't think it's a problem with my parsing (toString), because res.complete is never triggered, so the response is clearly not finishing.
Is there a way to force my request to wait for res.complete to finish?
My request looks like this:
const req = https.get(pull_options, res => {
    res.on('data', d => {
        if (res.complete) {
            resolve(d.toString());
        } else {
            console.error("Connection terminated while message was still being sent.");
        }
    });
});
I really don't think it's a problem with the API cutting me off, because I'm able to pull the same data set with nearly identical code in Python with no issues.

let output = '';
const req = https.get(pull_options, res => {
    res.on('data', d => {
        output += d;
    });
    res.on('end', () => {
        resolve(output);
    });
});
Moving the resolve into the 'end' handler, so that all the data could come in first, worked for me.
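For reference, here is a minimal sketch of the whole fix wrapped in a Promise, so the caller can simply await the complete body (pull_options stands for whatever request options were already in use; the 'aborted' handler is an optional extra for surfacing a connection that really was cut off):
const https = require('https');

function pullData(pull_options) {
    return new Promise((resolve, reject) => {
        const req = https.get(pull_options, res => {
            let output = '';
            // Accumulate every chunk; a single 'data' event is not the whole body.
            res.on('data', d => { output += d; });
            // Resolve only once the full response has arrived.
            res.on('end', () => resolve(output));
            // Reject if the server closed the connection mid-response.
            res.on('aborted', () =>
                reject(new Error('Connection terminated while message was still being sent.')));
        });
        req.on('error', reject);
    });
}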

Related

How to construct and extract value from simple HTTPS request in Node.js?

I have a simple HTTPS request -
https://api.pro.coinbase.com/products/btc-eur/ticker
In the browser this returns one object. What's the simplest code that will allow me to retrieve and display this object (as is) in the terminal with Node?
const https = require('https')
const url = https.get('https://api.pro.coinbase.com/products/btc-eur/ticker')
const myObject = JSON.parse(url)
console.log(myObject)
A simple copy / paste of the above code in VSC returns the error SyntaxError: Unexpected token o in JSON at position 1.
@mamba76, welcome to the SO community. Please use the Node.js node-fetch package; it is much simpler to use, and you can install it using npm install.
The following code might help:
"use strict";
const fetch = require('node-fetch')
async function getValue() {
// Invoke the API.
// Wait until data is fetched.
let response = await fetch('https://api.pro.coinbase.com/products/btc-eur/ticker');
let value = await response.json();
return value;
}
getValue().then(result => {console.log(result.price);});
As a good practice, always assume that API calls over HTTP (whether inside your own network or outside it) take time to return data, and hence use the async/await pattern to make these requests.
Extending @Akshay.N's answer, and without using external dependencies:
const https = require('https');

https.get("https://api.pro.coinbase.com/products/btc-eur/ticker", res => {
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => {
        const myObject = JSON.parse(body);
        console.log(myObject);
    });
});
Now, what we're doing here is listening to the 'data' event for as long as data keeps coming in, appending each chunk to the variable body. Once the 'end' event fires, we take that as the signal that all the data has been received, and we can parse body into an object with JSON.parse (assuming the data was serialized as JSON; if it wasn't, JSON.parse will throw an error).
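Since JSON.parse throws on anything that isn't valid JSON, a slightly more defensive variant of the same snippet might wrap the parse in a try/catch (just a sketch; unnecessary for a well-behaved API):
const https = require('https');

https.get("https://api.pro.coinbase.com/products/btc-eur/ticker", res => {
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => {
        try {
            console.log(JSON.parse(body));
        } catch (err) {
            // The body was not valid JSON (e.g. an HTML error page).
            console.error('Failed to parse response:', err.message);
        }
    });
});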
This tutorial is helpful: https://davidwalsh.name/nodejs-http-request
Try something like this:
https.get("https://api.pro.coinbase.com/products/btc-eur/ticker", res => {
    res.on('data', (chunk) => { console.log(JSON.parse(chunk)); });
});
With Node (you need the request module):
// display as an object
(require("request")).get({
    url: "myurl",
    json: true
}, function (e, r, b) {
    console.log(b);
});

// display as a string
(require("request")).get({
    url: "myurl",
    json: false
}, function (e, r, b) {
    console.log(b);
});
With just curl in your terminal (without Node):
curl myurl

How to poll another server periodically from a node.js server?

I have a Node.js server A with MongoDB as its database.
There is another remote server B (it doesn't need to be Node-based) which exposes an HTTP GET API '/status' and returns either 'FREE' or 'BUSY' as the response.
When a user hits a particular API endpoint in server A (say, POST /test), I wish to start polling server B's status API every minute, until server B returns 'FREE' as the response. The user doesn't need to wait until server B returns 'FREE' (polling B is a background job in server A). Once server A gets a 'FREE' response from B, it shall send an email to the user.
How can this be achieved in server A, keeping in mind that the number of concurrent users can grow large?
I suggest you use Agenda. https://www.npmjs.com/package/agenda
With Agenda you can create recurring schedules, under which you can schedule jobs quite flexibly.
I suggest you use the request module to make HTTP GET/POST requests:
https://www.npmjs.com/package/request
Going from the example in the Node.js docs, I'd go with something like the code below. I tested it and it works. By the way, I'm assuming here that the API response is something like {"status":"BUSY"} or {"status":"FREE"}.
const http = require('http');

const poll = {
    pollB: function () {
        http.get('http://serverB/status', (res) => {
            const { statusCode } = res;

            let error;
            if (statusCode !== 200) {
                error = new Error(`Request Failed.\n` +
                    `Status Code: ${statusCode}`);
            }
            if (error) {
                console.error(error.message);
                res.resume();
            } else {
                res.setEncoding('utf8');
                let rawData = '';
                res.on('data', (chunk) => { rawData += chunk; });
                res.on('end', () => {
                    try {
                        const parsedData = JSON.parse(rawData);
                        // The important logic comes here
                        if (parsedData.status === 'BUSY') {
                            setTimeout(poll.pollB, 10000); // request again in 10 secs
                        } else {
                            // Call the background process you need to
                        }
                    } catch (e) {
                        console.error(e.message);
                    }
                });
            }
        }).on('error', (e) => {
            console.error(`Got error: ${e.message}`);
        });
    }
};

poll.pollB();
You'll probably want to play with this script and strip out the code you don't need, but that's homework ;)
Update:
To cope with a lot of concurrency in Node.js, I'd recommend implementing a cluster or using a framework. Here are some links to start researching the subject:
How to fully utilise server capacity for Node.js Web Apps
How to Create a Node.js Cluster for Speeding Up Your Apps
Node.js v7.10.0 Documentation :: cluster
ActionHero.js :: Fantastic node.js framework for implementing an API, background tasks, cluster using http, sockets, websockets
Use a library like request, superagent, or restify-clients to call server B. I would recommend you avoid polling and instead use a webhook when calling B (assuming you are also authoring B). If you can't change B, then setTimeout can be used to schedule subsequent calls on a 1-second interval.
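If you do end up polling with setTimeout, a minimal sketch using the request module might look like the following (the URL and the plain 'FREE'/'BUSY' response bodies are assumptions taken from the question):
const request = require('request');

// Poll server B until it reports 'FREE', then run the callback
// (e.g. send the notification email).
function pollUntilFree(url, onFree) {
    request.get(url, (err, res, body) => {
        if (!err && body === 'FREE') {
            onFree();
        } else {
            // Still busy (or errored): try again in 1 second.
            setTimeout(() => pollUntilFree(url, onFree), 1000);
        }
    });
}

pollUntilFree('http://serverB/status', () => console.log('B is free'));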

Multiple clients posting data in Node.js

I've read that in Node.js one should treat POST requests carefully, because the POST data may arrive in chunks, so it has to be handled like this, concatenating the chunks:
function handleRequest(request, response) {
    if (request.method == 'POST') {
        var body = '';
        request.on('data', function (data) {
            body += data;
        });
        request.on('end', function () {
            // the data is complete here
        });
    }
}
What I don't understand is how this code snippet will handle several clients at the same time. Let's say two separate clients start uploading large POST data. They will be added to the same body, mixing up the data...
Or will the framework handle this, triggering a separate instance of the handleRequest function for each request so that the data doesn't get mixed up in the body variable?
Thanks.
Given the request, response signature of your method, it looks like that's a listener for the request event.
Assuming that's correct, then this event is emitted for every new request, so as long as you are only concatenating new data to a body object that is unique to that handler (as in your current example), you're good to go.
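To make that concrete, here is a minimal sketch: every incoming request triggers a fresh invocation of the listener, so each invocation gets its own body variable in its own closure, and two concurrent uploads can never mix their data:
const http = require('http');

http.createServer(function handleRequest(request, response) {
    if (request.method === 'POST') {
        // `body` is local to this invocation, i.e. to this one request.
        let body = '';
        request.on('data', function (data) {
            body += data;
        });
        request.on('end', function () {
            response.end('received ' + body.length + ' bytes\n');
        });
    } else {
        response.end();
    }
}).listen(3000);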

Fetch data from multiple tables by sending only one request

I am using Node.js for server-side development and Backbone.js for client-side development. I want to fetch data from multiple tables (more than 3) by sending only one request to Node.js, but I can't merge the results with each other because of Node.js's asynchronous execution. What I have now sends a lot of GET requests to Node.js to fetch the data from all the tables, and because of this my site's performance has become slower. Please help if anyone has any idea.
I would create a method which aggregates the results from each of the requests and sends the response back. Basically each of your three async db calls would pass their data to the same method. That method would check to see if it had all of the data it needed to complete the request, and if it did, send the response.
Here is a pseudo code example:
function handleRequest(req, res) {
    var results = {};

    db.getUsers(function (data) {
        aggregate('users', data);
    });
    db.getPosts(function (data) {
        aggregate('posts', data);
    });
    db.getComments(function (data) {
        aggregate('comments', data);
    });

    function aggregate(name, data) {
        results[name] = data;
        if (results.users && results.posts && results.comments) {
            res.send(results);
        }
    }
}
This is greatly simplified; you should of course also check for errors and timeouts on the db calls, but this will let you wait for all the async commands to complete before sending the data.
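As one way to add that error handling, here is a sketch that wraps the same (hypothetical) db calls in promises and uses Promise.all; it assumes the db methods take Node-style (err, data) callbacks and that res is Express-style with status() and send():
// Turn one Node-style callback call into a Promise.
function callAsPromise(fn) {
    return new Promise((resolve, reject) => {
        fn((err, data) => err ? reject(err) : resolve(data));
    });
}

function handleRequest(req, res) {
    Promise.all([
        callAsPromise(cb => db.getUsers(cb)),
        callAsPromise(cb => db.getPosts(cb)),
        callAsPromise(cb => db.getComments(cb)),
    ]).then(([users, posts, comments]) => {
        // All three queries succeeded; send the aggregate.
        res.send({ users: users, posts: posts, comments: comments });
    }).catch(err => {
        // Any single failure rejects the whole aggregate.
        res.status(500).send({ error: err.message });
    });
}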

response.write failure in Node.js

I am trying to make a proxy server that gets a page from, for example, www.xxx.com, caches the response, and then sends it to the requesting browser.
To do so, on the server I create an HTTP client that requests the page from xxx.com. The response is returned in the form of chunks (Buffers). Since the number of chunks varies with the webpage, I put the chunks in an array of buffers and then send the elements of the array.
My problem is that not all the chunks are sent successfully. Is there any other way I can cache the data before sending it? (I know that I can send the data directly, but I need to send the cache instead, since I want to send it to more than one browser.)
To save the chunks I use:
var http = require('http');
var url = require('url');

function getURL(u) {
    u = url.parse(u);
    var client = http.createClient(u.port || 80, u.hostname);
    var request = client.request('GET', '/', {
        'Host': u.hostname
    });
    var cache = {};
    cache.data = [];
    request.end();
    request.on('response', function (response) {
        cache.statusCode = response.statusCode;
        cache.headers = response.headers;
        response.on('data', function (chunk) {
            cache.data.push(chunk);
        });
    });
}
To send the cache, I use:
function sendCache(response, cache) {
    var writeSuccess = [];
    response.writeHead(cache.statusCode, cache.headers);
    for (var i in cache.data) {
        // don't encode the data; leave it as it is
        writeSuccess[i] = response.write(cache.data[i], "binary");
        console.log("chunk " + i + " is of length "
            + cache.data[i].length + ". Write success: " + writeSuccess[i]);
    }
}
Here I log the return value of response.write to check whether it succeeded. The node.js API docs don't explain whether this function returns anything, so I just tried it out.
What I noticed is that response.write returned true for some chunks of the cache and false for others, whereas if I send the response directly without caching, response.write returns true for all chunks.
Please let me know if anyone notices something I am doing wrong, or knows a better way to cache the data (preferably in binary, so that all non-ASCII characters are cached too).
If you are trying to proxy requests in node.js you should try using https://github.com/nodejitsu/node-http-proxy, it will save you a lot of time and headaches.
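For illustration, a minimal proxy with that library might look like the following (a sketch based on the createProxyServer API from the library's README; the target URL is a placeholder):
const httpProxy = require('http-proxy');

// Forward every request arriving on port 8000 to the target host.
httpProxy.createProxyServer({ target: 'http://www.xxx.com' }).listen(8000);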
In the latest release of Node.js (v0.4.0), the issue of writing a response from a buffer was solved, so just updating to this version fixed my problem.
However, one has to know that response.write may still return false; this doesn't mean the chunk is not sent, just that it is not sent immediately (it is buffered, a leaky-bucket concept). This is what I was able to conclude from the comments inside the node.js library (I hope I am correct).
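For reference, that false return value signals backpressure: the chunk is queued in memory rather than dropped, and the 'drain' event fires once the queue has flushed. A sketch of a sendCache that honors this might look like:
function sendCache(response, cache) {
    response.writeHead(cache.statusCode, cache.headers);
    let i = 0;
    function writeNext() {
        while (i < cache.data.length) {
            // write() returns false when the internal buffer is full.
            if (!response.write(cache.data[i++])) {
                // Pause and resume writing once the buffer has drained.
                response.once('drain', writeNext);
                return;
            }
        }
        response.end();
    }
    writeNext();
}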
