I'm trying to get data from the Keepa API for my app. The status code of my request is 200, but I'm getting "SyntaxError: Unexpected token in JSON at position 0" for every request.
response.on("data", function(data){
  const asinData = JSON.parse(data);
  console.log(asinData);
  res.send();
});
Can you print this "data"? I'd guess there is an error in it. I think "data" is a "serverResponse" object and "serverResponse.data" is what you want to see there, so try console.logging that.
Is the response object coming from the http core module's get() method? If so, this may be helpful: https://nodejs.org/api/http.html#http_http_get_options_callback.
Basically, the response object you are getting is an http.IncomingMessage instance, which is a readable stream. The data event is triggered on this object not when the response body has been fully received, but every time a small part of it - a chunk - arrives. You need to aggregate all of these chunks into a single piece of data before attempting to parse it into a JavaScript object.
Also, be aware that the chunks are emitted as Buffers by default, not as strings. You can make the stream emit strings instead by setting the stream encoding before you start reading the chunks.
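For instance, here is a minimal sketch of that aggregation, assuming response is the http.IncomingMessage from http.get() and res is the outgoing response from the original snippet:
let body = '';
response.setEncoding('utf8'); // emit strings rather than Buffers
response.on('data', (chunk) => {
  body += chunk; // accumulate each chunk as it arrives
});
response.on('end', () => {
  // the full body has arrived, so it is now safe to parse
  const asinData = JSON.parse(body);
  console.log(asinData);
  res.send(asinData);
});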
I am using a GET request to fetch what I expect to be an XML document from an endpoint. The response has the following structure:
' <itunes:explicit>clean</itunes:explicit>\n' +
' <itunes:episode>11</itunes:episode>\n' +
' <itunes:episodeType>full</itunes:episodeType>\n' +
(This is from a console log in a Node.js function).
I haven't encountered a response like this before and am having trouble doing anything useful with it. I've tried:
Changing the response type and encoding of my GET function
Parsing the response with an XML parser - this throws an error
Removing the newline and + characters manually with regex (I'd like to avoid this, but it doesn't seem to work anyway)
It's worth saying that the response looks as you'd expect in a browser window.
Am I missing something fundamental about how this data is encoded/structured, and what is the best way to turn it into something I can work with?
Rookie error. In case anyone else stumbles across this: I expected the response from my Axios GET request to be the XML itself. The XML was actually in the data property of the response:
const response = await axios.get(url);
const myXML = response.data;
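From there the XML string can go straight into a parser. A minimal sketch, assuming the xml2js package (any XML parser would do) and the same url as above:
const axios = require('axios');
const xml2js = require('xml2js');

async function fetchFeed(url) {
  const response = await axios.get(url);
  // the XML string lives on response.data, not on the response object itself
  return xml2js.parseStringPromise(response.data);
}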
badStream.pipe(res)
When badStream throws an error, the response is not terminating and the request in the browser is stuck in a pending state.
badStream.on('error', function() { this.end(); }).pipe(res)
I've tried the above to no avail. What's the proper way to handle the error in this case? Thanks for any help.
In Node.js, an error on a readable stream that is piped to the HTTP response stream just unpipes it from the response stream, but does not otherwise do anything to the response stream it was piped to. That leaves the response hanging as an open socket, with the browser still waiting for it to finish (as you observed). As such, you have to handle the error manually and do something to the target stream.
badStream.pipe(res);
badStream.on('error', err => {
  // log the error and prematurely end the response stream
  console.log(err);
  res.end();
});
Because this is an HTTP response and you are already in the middle of sending the response body, the status and headers have already gone out, so there isn't much you can do partway through.
Ultimately, you're going to have to call res.end() to terminate the response so the browser knows the request is done. If there's a Content-Length header on the response (the length was known ahead of time), then terminating the response stream before it's done will cause the browser to see that it didn't get the whole response, and thus know that something went wrong.
If there's no Content-Length on the response, then it really depends on what type of data you're sending. If you're just sending text, the browser probably won't know there's an error, because the text response will simply end. If it's human-readable text, you could send a visible marker such as "ERROR, ERROR, ERROR - response ended prematurely" so that a human might recognize the response is incomplete.
If it's structured data such as JSON, XML, or any multipart response, then hanging up the socket prematurely will probably lead to a parsing error that the client will notice. Unfortunately, HTTP just doesn't make provisions for mid-response errors, so it's left to individual applications to detect and handle them.
FYI, here's a pretty interesting article that covers a lot about error handling with streams. And note that using stream.pipeline() instead of .pipe() does much more complete error handling: it gives you a single callback that is called for an error in either stream, and it automatically calls .destroy() on all the streams involved. In many ways, stream.pipeline(src, dest) is meant to replace src.pipe(dest).
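A minimal sketch of that pipeline() approach, assuming the same badStream and res as in the snippets above:
const { pipeline } = require('stream');

pipeline(badStream, res, (err) => {
  if (err) {
    // pipeline has already unpiped and destroyed both streams; just log the failure
    console.log('stream failed:', err);
  }
});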
I have a server that is streaming JSON objects to an endpoint. Here is a simplified example:
app.get('/getJsonObjects', function (req, res) {
  res.write(JSON.stringify(json1));
  res.write(JSON.stringify(json2));
  res.write(JSON.stringify(json3));
  res.write(JSON.stringify(json4));
  res.write(JSON.stringify(json5));
  res.end();
});
Then, client-side, using browser-request, I'm trying to do:
var r = request(url);
r.on('data', function(data) {
  console.log(JSON.parse(data));
});
The problem is that despite streaming valid stringified JSON to the endpoint, the chunks I'm getting back from the request are just text chunks that don't necessarily align with the start and end of the JSON strings written on the server, so JSON.parse(data) will sometimes fail.
What is the best way to stream these chunks of JSON in the same way that they were written to the endpoint?
This is an async problem. The server code you have provided is not guaranteed to send the data out in that order.
You will either have to accumulate the chunks on the client side and work out their order there, or use some sort of accumulator on the server end and then output the JSON in order as it gets processed.
Edit:
It appears that res.write() can take an encoding and a callback parameter: res.write(chunk, encoding, callback). Note that the encoding here is a character encoding such as 'utf8'; chunked transfer is controlled by the Transfer-Encoding header, which Node.js applies automatically when no Content-Length is set.
https://nodejs.org/api/http.html#http_response_write_chunk_encoding_callback
If this fails, you can make a callback / promise chain using the callback parameter of res.write() to guarantee the order of the writes.
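For illustration, a minimal sketch of that chaining idea, wrapping the callback parameter of res.write() in a promise (writeChunk is a made-up helper; json1 through json5 stand in for the objects from the question):
const writeChunk = (res, obj) =>
  new Promise((resolve) => res.write(JSON.stringify(obj), 'utf8', resolve));

app.get('/getJsonObjects', async (req, res) => {
  // each write resolves only after its chunk has been flushed
  for (const obj of [json1, json2, json3, json4, json5]) {
    await writeChunk(res, obj);
  }
  res.end();
});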
I (as the client) am trying to post an image with restify, and the server just needs to save it.
req.pipe(fs.createWriteStream('test.jpg'));
is not working. An empty file is created, but nothing more. It works when I copy req.body into a buffer and then call fs.writeFile(...). I have also tried req.body.pipe, but that throws an error.
You're probably using a body-parser middleware that is already reading all of the data from the request, so there is nothing left to read. Try adjusting the placement of your route handler and/or body-parsing middleware if you want to read directly from the request object.
However, that will only work if the request contains nothing but the image data. Typically a request containing at least one file is formatted as multipart/form-data, so you cannot just pipe the request and expect raw image data.
So something else in your middleware chain, probably restify.bodyParser(), is already streaming the request body into a buffer or string as req.body, and you can't stream something twice. Find that middleware and disable it for this route if you want to handle the streaming straight to the filesystem yourself.
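A minimal sketch of that arrangement, assuming restify, a raw (non-multipart) image body, and that no body parser runs on this route:
const restify = require('restify');
const fs = require('fs');

const server = restify.createServer();

// no body parser registered here, so the request stream is still readable
server.post('/upload', (req, res, next) => {
  const out = fs.createWriteStream('test.jpg');
  req.pipe(out);
  out.on('finish', () => {
    res.send(201); // created
    next();
  });
  out.on('error', next);
});

server.listen(8080);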
I need to serialize the stream object that is passed to the callback of net.createServer():
var server = net.createServer(function (stream) {
  var json = JSON.stringify(stream);
However, when I do this I get a TypeError because the stream object contains circular references.
Is there a workaround for this?
@Jason is correct here. You want to put the data from the stream into Redis, not the stream itself. To do this, you must add event listeners to the stream for its data and end events. The data handler gives you a chunk of data with each callback; you can either write these to Redis in pieces or assemble them in memory and write the whole thing when the end callback fires. Here's an example you can follow.
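For instance, a minimal sketch of that data/end pattern, assuming a connected Redis client already in scope as redisClient and a made-up key name:
const net = require('net');

const server = net.createServer(function (stream) {
  const chunks = [];
  stream.on('data', (chunk) => chunks.push(chunk)); // collect each piece as it arrives
  stream.on('end', () => {
    // serialize the assembled data, not the stream object itself
    redisClient.set('stream:payload', Buffer.concat(chunks).toString('utf8'));
  });
});

server.listen(4000);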