Nodejs not sending http requests inside File ReadStream - node.js

I'm having a very strange error where my HTTP requests aren't sending at all. Once I send the request, execution just stops. I'm ingesting a CSV file through a ReadStream and sending requests inside the event listener blocks. If I try to send the request outside the listeners it sends fine, but I need to POST data from the file, so it has to be sent within them (unless there's a way to export the data that I don't know about).
Code:
const buffer = multipart.parse(event, true).fileName.content;
const file = stream.Readable.from(buffer);
const results = [];
let events;
file.pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    events = parseEvents(event, results);
    console.log("sending request");
    request({method: 'POST', url: 'https://httpbin.org/anything', json: {hello: 'world'}}, function (err, response, body) {
      console.log(err);
      console.log(response);
      console.log(body);
    });
    console.log("finished request");
  })
  .on('error', error => {
    console.log(error);
  });
Before you ask: I've also tried all kinds of requests. Using got, axios, and request, I've done awaits and tried to process it that way. I can actually get the promise, but if I await it, nothing happens. It's also not stuck in an infinite loop or anything, because when it's put in a for loop it tries every iteration and always returns nothing.
For more info: the console logs "sending request" and then "finished request", and that's it. If I go the promise route, it doesn't even log "finished request".

Per the documentation, url is not an option for http.request(), nor is json:
https://nodejs.org/dist/latest-v16.x/docs/api/http.html#httprequestoptions-callback

This looks like an async issue. After you fire your request, your code executes the next line rather than waiting for the response, and that's why you see the "sending" and "finished" logs back to back. I'm not sure how you implemented the promise solution before, and without seeing the rest of your code it's hard to debug, but something like this should work:
return new Promise((resolve, reject) => {
  file.pipe(csv())
    .on('data', (data) => results.push(data))
    .on('end', () => {
      events = parseEvents(event, results);
      console.log("sending request");
      request({method: 'POST', url: 'https://httpbin.org/anything', json: {hello: 'world'}}, function (err, response, body) {
        console.log(err);
        console.log(response);
        console.log(body);
        resolve(body);
      });
      console.log("finished request");
    })
    .on('error', error => {
      reject(error);
      console.log(error);
    });
});
Then wherever you return the promise to, you can call .then(() => { /* do stuff */ }).

Related

didn't get response when using res.write() inside of the fetch function

I am using res.write() (https://nodejs.org/api/http.html#responsewritechunk-encoding-callback) in Node.js,
and this fetch: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch
My situation is that I don't see any response when using res.write() inside of the fetch chain. For example, in the backend code below, I put res.write("456") inside the first then function after fetch, but I only see 123 returned on the frontend, not 456.
res.write("123");
fetch('http://example.com/movies.json')
  .then((response) => {response.json(), res.write("456")})
  .then((data) => console.log(data));
I have searched Google for a while but didn't find anything related. My guess is that this could be because of async usage.
I'd appreciate it if someone could give suggestions.
===== update =====
res is Express's res object
async sendText(res: Response){
  res.write("123")
  fetch('http://example.com/movies.json')
    .then((response) => {return response.json()})
    .then((data) => {
      console.log(data);
      res.write('456');
      res.end();
    });
}
Observed behavior: only 123 appears on the frontend.
VS
async sendText(res: Response){
  res.write("123")
  await fetch('http://example.com/movies.json')
    .then((response) => {return response.json()})
    .then((data) => {
      console.log(data);
      res.write('456');
      res.end();
    });
}
Observed behavior: both 123 and 456 appear on the frontend.
I have never used await and .then together before and don't fully understand the difference; searching the web right now.
You aren't checking for any errors or an unsuccessful response, so response.json() may be failing, preventing anything after it from executing.
Something like this should work better for you:
async sendText (res: Response) {
  res.write("123");
  const response = await fetch("http://example.com/movies.json");

  // check for an unsuccessful response
  if (!response.ok) {
    const error = new Error(`${response.status} ${response.statusText}`);
    error.text = await response.text();
    throw error;
  }

  const data = await response.json();
  res.write("456");
  res.end(); // finalise the HTTP response
  console.log(data); // log data for some reason ¯\_(ツ)_/¯
};
This will return a promise that resolves with undefined or rejects with either a networking error, HTTP error or JSON parsing error.
When calling it, you should handle any rejections accordingly
app.get("/some/route", async (req, res, next) => {
  try {
    await sendText(res);
  } catch (err) {
    console.error(err, err.text);
    next(err); // or perhaps `res.status(500).send(err)`
  }
});
I never use await and .then together before
Nor should you. It only leads to confusing code.
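To illustrate the difference the update stumbled on, here is a minimal, self-contained sketch of my own (no HTTP; Promise.resolve stands in for fetch): without await, the function returns before the handler runs; with await, it waits.

```javascript
// `log` records the order in which things happen.
async function withoutAwait(log) {
  Promise.resolve('data').then((d) => log.push(d)); // fire-and-forget
  log.push('returned'); // runs before the handler above
}

async function withAwait(log) {
  await Promise.resolve('data').then((d) => log.push(d));
  log.push('returned'); // runs after the handler
}

(async () => {
  const a = [];
  await withoutAwait(a);
  console.log(a[0]); // 'returned' - the handler had not run when the function returned

  const b = [];
  await withAwait(b);
  console.log(b); // ['data', 'returned']
})();
```

This is exactly why res.end() in the un-awaited version can run (or the handler never run at all) before res.write('456') happens.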

How can I force this NodeJS function to wait until fs.writeFile has completed?

I am implementing https.request per these instructions (https://nodejs.org/api/https.html#httpsrequesturl-options-callback), except instead of writing to stdout I am writing to a file. I want to wait at the end until the file writing process is complete. How can I do this?
process.stdout.write(d);
is changed to
fs.writeFile(path, d, err => {
  if (err) {
    console.error(err);
  } else {
    console.log("data => " + path)
  }
})
This is the entire code
const https = require('node:https');

const options = {
  hostname: 'encrypted.google.com',
  port: 443,
  path: '/',
  method: 'GET'
};

const req = https.request(options, (res) => {
  console.log('statusCode:', res.statusCode);
  console.log('headers:', res.headers);
  res.on('data', (d) => {
    process.stdout.write(d);
  });
});

req.on('error', (e) => {
  console.error(e);
});

req.end();
UPDATE
MTN posted a solution that works, but I realized that my code is slightly more complex: I read the response in chunks and save at the end. MTN's solution finishes early. Here is the code; can anyone help me fix it?
const request = https.request(url, (response, error) => {
  if (error) { console.log(error) }
  let data = '';
  response.on('data', (chunk) => {
    data = data + chunk.toString();
  });
  response.on('end', () => {
    fs.writeFile(path, data, err => {
      if (err) {
        console.error(err);
      } else {
        console.log("data => " + path)
      }
    })
  })
})
request.on('error', (error) => {
  console.error(error);
})
request.end()
The immediate answer is that whatever should happen after the file was written has to go into the callback, after your console.log. (There is nothing in your code that looks like it's supposed to run afterwards, though.)
But:
Your code would be a lot simpler if you'd:
1. Use a library for sending HTTP requests instead of the raw https module (for example: got, axios, node-fetch, ...). Not only do these take care of things like reading the body for you, they also have a promise interface, which allows you to do point 2.
2. Rewrite the code to use async/await.
Here is an example with got:
import got from 'got'
import { writeFile } from 'fs/promises'
const response = await got('https://encrypted.google.com').text()
await writeFile('test.html', response)
// Whatever should happen after the file was written would simply go here!
Note: This has to be an ES module because I used top-level await and import, and got doesn't even support CommonJS anymore. So either your package.json has to have "type": "module" or the file extension has to be .mjs.
You can use fs.writeFileSync() instead. It's synchronous, so it waits for the write to finish.
res.on("data", (d) => { fs.writeFile(/* params here */) })
Inside the fs.writeFile call, put whatever you want to do next in the last (callback) argument.

NodeJS - Recovering from a try-catch of a thrown error, in-mid Promises-Chain (asynchrous - "second path")

Recently I have been struggling with a problem caused by NodeJS's asynchronous nature.
I have a situation where I issue a request to one server, and if it doesn't work (because of a timeout, for example), I "replace" it by issuing a request to another server to supply the data, then continue execution.
Right now, once the catch method is called, I am not sure how to "go back" to the place where it stopped and continue the .then (promise) chain.
Of course I can write code after the .catch and watch it execute, but two things would probably happen:
1. That code will run asynchronously, "without waiting".
2. I'll have to replicate large chunks of code over and over, nesting them inside each other with promises and catch blocks, which will elevate the "promise-chaining hell" and is obviously, or probably, not the correct approach.
A short description of what I am trying to achieve:
const options1 = {
  method: 'GET',
  timeout: 1500,
  uri: 'https://www.example.com/'
}
const options2 = {
  method: 'GET',
  timeout: 1500,
  uri: 'https://www.example.com/'
}
const options3 = {
  method: 'GET',
  timeout: 1500,
  uri: 'https://www.example.com/'
}
//Code before
request(options1)
  .then(function (response) {
    //Server 1 is working - execute what's inside .then
    request(options3)
      .then(function (response) {
        //Got the data from server 1 or 2, doesn't matter, now get the required data from server 3
      })
      .catch(function (err) {
        //Timeout has been thrown, show an error and continue
        console.log('Server 3 error occurred, continuing.');
      });
  })
  .catch(function (err) {
    //Timeout has been thrown, show an error and continue
    request(options2)
      .then(function (response) {
      })
      .catch(function (err) {
        //Server 2 doesn't work either, abort and notify the user
        console.log('Server 2 error occurred, continuing.');
      });
    console.log('Server 1 error occurred, continuing.');
  });
Should I use an outside function in order to define those "recovery routes"?
Thank you.
You just return a promise to the then() scope, and the rest is taken care of by the promises. So, as you said yourself, you can attach a .then block after the catch and write post-execution code there, but you need to return the promise to the parent chain, which takes care of chaining and calling the attached callbacks consecutively.
Something like this:
//Code before
request(options1)
  .then(function (response) {
    console.log("Success 1")
    //Server 1 is working - execute what's inside .then
    return request(options3)
      .then(function (response) {
        //Got the data from server 1 or 2, doesn't matter, now get the required data from server 3
        console.log("Success 3")
      })
      .catch(function (err) {
        //Timeout has been thrown, show an error and continue
        console.log('Server 3 error occurred, continuing.');
      });
  })
  .catch(function (err) {
    console.log("Failed 1")
    //Timeout has been thrown, show an error and continue
    return request(options2)
      .then(function (response) {
        console.log("Success 2")
      })
      .catch(function (err) {
        console.log("Failed 2")
        //Server 2 doesn't work either, abort and notify the user
        console.log('Server 2 error occurred, continuing.');
      });
  })
  .then(() => {
    console.log("Post execution")
  })
Hope it helps.
You can validate this by throwing an error inside a then block (along with a timeout) to force the catch block to execute.
You should create a common function for what happens when the 1st/2nd request succeeds. This function can be called in both cases without code duplication.
request(options1)
  .then(function (response) {
    //Server 1 is working - execute what's inside .then
    return request3(response);
  })
  .catch(function (err) {
    //Timeout has been thrown, show an error and continue
    return request(options2)
      .then(function (response) {
        return request3(response);
      });
  });

function request3(response) {
  // return a promise here
}
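The same fallback logic reads more linearly with async/await. This is my own sketch, not either answerer's code: `getWithFallback` and `fetchFrom` are made-up names, and the stubs below stand in for real requests.

```javascript
// Try server 1; on failure, log and fall back to server 2. A rejection from
// server 2 propagates to the caller, which can then notify the user.
async function getWithFallback(fetchFrom, options1, options2) {
  let response;
  try {
    response = await fetchFrom(options1);
  } catch (err) {
    console.log('Server 1 error occurred, continuing.');
    response = await fetchFrom(options2);
  }
  return response; // "go back to the same place" - just keep coding here
}

// Usage with stubbed request functions (the options here are the stubs themselves):
const failing = async () => { throw new Error('timeout'); };
const working = async () => 'payload';
getWithFallback((o) => o(), failing, working).then(console.log); // "payload"
```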

Catch superagent request error before piping

I'm trying to pipe a file from service A through service B into my Postman client. Service A builds and delivers a CSV file, and service B (Node.js) has to pipe it to my client.
After a lot of research I have managed to successfully pipe the file into service B and then into Postman. Now I want to handle the ugly cases: what if the request token is invalid? What if I can't find the file?
As of this moment, I have found zero documentation or examples of how to successfully handle errors while piping a request using superagent.
This is what I have so far:
router.post("/csv", (req, res) => {
  download_csv(req.get("Authorization"), req.body.ids)
    .then((response) => {
      res.sendFile(path.resolve(response));
    })
    .catch((err) => {
      res.status(err.status).json(err.response.body);
    })
});
function download_csv(token, ids) {
  const stream = fs.createWriteStream("filters.csv")
  let request = agent
    .post(`${profiles}/api/documents/csv`)
    .set("authorization", token)
    .send({
      ids: ids,
      action: DOWNLOAD_CSV_PROFILES
    })
  request.on("response", res => {
    // Maybe I can use abort to handle this thing, but can't figure out how!
    // if (res.status !== 200) request.abort()
    console.log(res.status)
  })
  request.on("abort", () => {
    console.log("aborted")
    return new Promise((resolve, reject) => {
      resolve("request aborted")
    })
  })
  request.pipe(stream)
  return streamToPromise(stream);
}
function streamToPromise(stream) {
  return new Promise((resolve, reject) => {
    stream.on("error", (err) => {
      console.log("error in error")
    })
    stream.on("finish", () => {
      console.log("File saved")
      resolve(stream.path);
    });
  });
}
This code handles the creation of the file correctly. When I fake the token or misspell the Authorization header, I get a correct 401 response, but a file gets written anyway, with its contents being the authentication error.
Can anyone give me a hint on how to:
actually catch and manage the request when it fails
in such a case, escape the piping by going back to the Express context and just returning a failed Express response?
Many thanks!
If I understand you correctly, simply create the fs write stream in on('response') and make a small fix in the resolution:
function download_csv(token, ids) {
  return new Promise((resolve, reject) => {
    let request = agent
      .post(`${profiles}/api/documents/csv`)
      .set("authorization", token)
      .send({
        ids: ids,
        action: DOWNLOAD_CSV_PROFILES
      })
    request.on("response", res => {
      if (res.status === 200) {
        res
          .on("end", resolve)
          .pipe(fs.createWriteStream("filters.csv"));
      } else {
        reject();
      }
    })
    request.on("abort", reject);
  });
}
I'm not sure which "request" you're using, but assuming it's actually the request npm module, that will help.
Ideally, write the file to a temporary directory and move it when the promise is resolved, or delete it when rejected. This way you solve the issue of partial downloads.
If you want to make any on-the-fly transforms, check out my "scramjet". It'll make everything easier with promises.

Handling status >= 400 in streamed node.js request http client

I'm making an HTTP request using https://github.com/request/request and I want to receive JSON. The response will be seriously large, so I want to use a streaming approach to process it. However, the API I'm calling returns 'text/plain' for status >= 400, which means my JSONStream will bork. Code:
req = request.get({url: data_url});
req.pipe(require('JSONStream').parse([true]))
  .pipe(require('stream-batch')({maxItems: 1000}))
  .on('data', callback)
  .on('end', done);
Error:
Invalid JSON (Unexpected "I" at position 0 in state STOP)
("I" as in "Internal server error".) It seems request does not emit 'error' events for requests that complete, so adding req.on('error', (err) => ...) does not fire. I could add
req.on('response', function(res) {
  if (res.statusCode >= 400) { ... }
})
But then I can't seem to get at the error message in the body.
How can I get the error message, log it meaningfully, and stop processing?
Since the argument passed to the response event is itself a readable stream, you can create the pipeline inside its handler:
req.on('response', function(res) {
  if (res.statusCode >= 400) {
    let chunks = [];
    res.on('data', c => chunks.push(c))
      .on('end', () => {
        console.log('error', Buffer.concat(chunks).toString());
      });
  } else {
    res.pipe(require('JSONStream').parse([true]))
      .pipe(require('stream-batch')({maxItems: 1000}))
      .on('data', callback)
      .on('end', done);
  }
})