I'm new to node.js and Stack Overflow and I'm having a little trouble: I want to read two files and do something with them in a specific order. The problem is that I don't know which one will finish being read first, so I don't know how to make sure they will trigger in the right order.
To give an example let's say that I want to read two files and write them in the response:
fs.readFile('./file1', (err, data) => {
  res.write(data);
})
fs.readFile('./file2', (err, data) => {
  res.write(data);
})
How could I make sure the first file will be written before the second even if the second file is smaller than the first one?
I could do that:
fs.readFile('./file1', (err, data) => {
  res.write(data);
  fs.readFile('./file2', (err, data) => {
    res.write(data);
  })
})
But it would act like a blocking structure: the second file couldn't start being read before the end of the first one, and that's not the point of Node.js... Am I right?
P.S. Sorry if the question is dumb or for my poor English
One of the reasons (out of several) Promise and async/await exist is to solve exactly this kind of issue.
You can create a wrapper that returns a Promise:
function readSomething(filePath) {
  return new Promise((resolve, reject) => {
    fs.readFile(filePath, (err, data) => {
      if (err) reject(err);
      else resolve(data);
    });
  });
}
Then you can call it like this:
// called in parallel
const result1 = readSomething('./file1');
const result2 = readSomething('./file2');

result1.then((data) => {
  res.write(data) // or anything
  // to keep the order, chain the second promise
  return result2;
})
.then(data => res.write(data))
With async/await you can do this (await is valid only inside an async function):
async function handleResults() {
  res.write(await result1)
  res.write(await result2)
}
Note: Watch out for errors (reject cases). They become a little tricky in this situation. (I'm sure you can figure it out.)
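For what it's worth, a minimal sketch of one way the rejections could be handled in this pattern (assuming the readSomething wrapper above and the res object from the question):
async function handleResults() {
  // Start both reads in parallel, then await them in order.
  const result1 = readSomething('./file1');
  const result2 = readSomething('./file2');
  try {
    res.write(await result1);
    res.write(await result2);
  } catch (err) {
    // A failed read from either file ends up here.
    result2.catch(() => {}); // prevent an unhandled rejection if the second read fails after the first one did
    res.end('something went wrong');
  }
}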
In Node.js you can also utilise the power of EventEmitter. For the Observer pattern, you can also check out reactive programming as implemented by the rxjs library.
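For example, a rough sketch of the EventEmitter idea (res is assumed from the question; the event name and bookkeeping are purely illustrative): read both files in parallel, buffer whatever arrives, and flush in a fixed order once both are in.
const EventEmitter = require('events');
const fs = require('fs');

const emitter = new EventEmitter();
const results = {};

emitter.on('error', err => console.error(err));
emitter.on('read', (name, data) => {
  results[name] = data;
  if (results.file1 !== undefined && results.file2 !== undefined) {
    res.write(results.file1); // file1 always goes first, regardless of which read finished first
    res.write(results.file2);
  }
});

fs.readFile('./file1', (err, data) => err ? emitter.emit('error', err) : emitter.emit('read', 'file1', data));
fs.readFile('./file2', (err, data) => err ? emitter.emit('error', err) : emitter.emit('read', 'file2', data));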
// fs.promises.readFile returns a Promise (the plain callback version does not)
let file1 = fs.promises.readFile('./file1');
let file2 = fs.promises.readFile('./file2');

// await is only valid inside an async function
file1 = await file1;
file2 = await file2;

res.write(file1);
res.write(file2);
This way the reading takes place in parallel, but the writes stay in order. If you know beforehand that file1 is going to be comparatively big, you can swap the await statements, and then you can write in whichever order you want to.
Hello, I really need help with this issue: my last console.log executes BEFORE the for loop finishes and I don't know how to fix it. I really need access to my array nbfilm after the for loop.
Can someone help me?
What the console prints: (link)
client.db.query("SELECT name,id FROM film", function (err, result) {
if (err) throw err;
const catalog = new MessageEmbed()
.setTitle("Catalogue")
.setColor("#fcfe80")
.setFooter({text:"🍿 ・ PopFlix"})
let testresult =[]
let nbfilm =[]
for (let compteur of result){
testresult.push(compteur.id)
testresult.push(compteur.name)
}
console.log(testresult)
for (let compteur2 = 0; compteur2 < testresult.length; compteur2+=2){
client.db.query(`SELECT link FROM lien WHERE fid=${testresult[compteur2]}`, function (err,result) {
nbfilm.push(testresult[compteur2+1])
nbfilm.push(result.length)
console.log("nbfilm in for loop",nbfilm)
});
}
console.log("nbfilmAFTER",nbfilm)
});
The body of the loop delays execution. Because JavaScript handles I/O asynchronously, it is common for such calls to return a promise (or take a callback). In other words, the code is executed as expected, but the result of the actions will be visible only after all pending tasks have completed. In your case, adding async/await to the code might help; see the Node docs on it.
It looks like client.db.query is asynchronous. JS here works as expected because it doesn't wait for the query to be finished before moving to the next line.
It's now considered better practice to use async/await instead of callbacks. If you provide the package that you are using, we can provide a code example.
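In the meantime, a hedged sketch using the built-in util.promisify, assuming client.db.query follows the usual (err, result) callback convention (the helper name buildCatalog is made up):
const { promisify } = require('util');
const query = promisify(client.db.query).bind(client.db);

async function buildCatalog() {
  const films = await query("SELECT name,id FROM film");
  const nbfilm = [];
  for (const film of films) {
    // Each query is awaited, so the loop really finishes before the final log runs.
    const links = await query(`SELECT link FROM lien WHERE fid=${film.id}`);
    nbfilm.push(film.name, links.length);
  }
  console.log("nbfilmAFTER", nbfilm);
  return nbfilm;
}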
client.db.query() is an asynchronous function and won't execute until after the console.log(nbfilm) is executed regardless of how fast that query actually runs.
I'd recommend using Promise.all(). You will also have to "promisify" the query() function. Pass everything you want to resolve(), and then concatenate them in the final line as below.
let promises = [];
for (....) {
  promises.push(new Promise((resolve, reject) => {
    client.db.query("select ... ", (err, result) => {
      // all the same stuff
      resolve([testresult[compteur2+1], result.length]);
    });
  }))
}
Promise.all(promises)
  .then((results) => results.reduce((p, c) => p.concat(c), []))
  .then(data => console.log(data)); // do whatever you want with "data"
Here's a simplified demo:
https://codesandbox.io/s/promise-all-example-78r347
So my code is supposed to read some lines from a CSV file, convert them to an array of JSON objects, and return that array.
To read the file as a stream, I am using got, and then using it in fast-csv.
In order to return the resulting array, I put the entire thing into a Promise like this:
async GetPage(): Promise<{OutputArray: any[], StartingIndex: number}> {
  return new Promise(async (resolve, reject) => {
    const output: any[] = [];
    const startingIndex = this.currentLocation;
    try {
      parseStream(this.source, {headers: true, maxRows: this.maxArrayLength, skipRows: this.currentLocation, ignoreEmpty: true, delimiter: this.delimiter})
        .on('error', error => console.log(`parseStream: ${error}`))
        .on('data', row => {
          const obj = this.unflatten(row); // data is flattened JSON, need to unflatten it
          output.push(obj); // append to output array
          this.currentLocation++;
        })
        .on('end', (rowCount: number) => {
          console.log(`Parsed ${this.currentLocation} rows`);
          resolve({OutputArray: output, StartingIndex: startingIndex});
        });
    }
    catch (ex) {
      console.log(`parseStream: ${ex}`);
      throw new Error(ex);
    }
  })
}
Now when I call this once (await GetPage()) it works perfectly fine.
The problem is when I'm calling it a second time in a row. I'm getting the following:
UnhandledPromiseRejectionWarning: Error: Failed to pipe. The response has been emitted already.
I've seen a similar case over here: https://github.com/sindresorhus/file-type/issues/342 but from what I gather this is a different case, or rather if it's the same I don't know how to apply the solution here.
The GetPage is a method inside a class CSVStreamParser which is given a Readable in the constructor, and I create that Readable like this: readable:Readable = got.stream(url)
What confuses me is that my first version of GetPage did not include a Promise but rather accepted a callback (I just passed console.log to test it), and when I called it several times in a row there was no error; but it could not return a value, so I converted it to a Promise.
Thank you! :)
EDIT: I have managed to make it work by re-opening the stream at the start of GetPage(), but I am wondering if there is a way to achieve the same result without having to do so? Is there a way to keep the stream open?
First, remove both of the async, since you are already returning a Promise.
Then remove the try/catch block and the throw, since you shouldn't throw inside a promise executor. Instead, use the reject function.
GetPage(): Promise<{OutputArray: any[], StartingIndex: number}> {
  return new Promise((resolve, reject) => {
    const output: any[] = [];
    const startingIndex = this.currentLocation;
    parseStream(this.source, {headers: true, maxRows: this.maxArrayLength, skipRows: this.currentLocation, ignoreEmpty: true, delimiter: this.delimiter})
      .on('error', error => reject(error))
      .on('data', row => {
        const obj = this.unflatten(row); // data is flattened JSON, need to unflatten it
        output.push(obj); // append to output array
        this.currentLocation++;
      })
      .on('end', (rowCount: number) => {
        console.log(`Parsed ${this.currentLocation} rows`);
        resolve({OutputArray: output, StartingIndex: startingIndex});
      });
  });
}
Here are some resources to help you learn about async functions and promises.
fs.rename("${nombreHtml}.html",(err)=>{
console.log(err)
})
fs.appendFileSync("${nombreHtml}.html", htmlSeparado, () => { })
I try to run these two operations one after the other, but it doesn't work.
fs.rename is an asynchronous task.
By the time fs.rename finishes its execution, fs.appendFileSync has already tried appending data to an html file which did not exist yet at that point.
fs.rename ... awaiting callback
fs.append ... failing
fs.rename finished, file now has a new name.
You probably want to either place fs.appendFileSync inside the fs.rename callback, or switch to promises. (example at the bottom)
example that should work:
fs.rename("${nombreHtml}.html",(err)=>{
if (err) console.log(err)
else {
fs.appendFileSync("${nombreHtml}.html", htmlSeparado, () => { })
}
})
By the way, because synchronous functions block the event loop, and hence freeze your server while that function is handled, making it unavailable for any other request, using the filesystem's synchronous functions is rather less recommended for the general use case, as read/write/append operations are rather long. It is recommended to use the async versions of them, which take a callback or return a promise, as you have done with fs.rename.
fs has a built-in sub-module with promise versions of the same functions, which can be accessed via require('fs').promises.
This way you could just:
const { rename, appendFile } = require('fs').promises;

try {
  await rename("${nombreHtml}.html");
  await appendFile("${nombreHtml}.html", htmlSeparado);
} catch (error) {
  console.log(error);
}
I assume you want a template string so that the variables insert themselves into the string:
fs.rename(`${nombreHtml}.html`, (err) => {
  console.log(err)
})
fs.appendFileSync(`${nombreHtml}.html`, htmlSeparado, () => { })
Below is code:
var fs = require('fs')

for (let i = 0; i < 6551200; i++) {
  fs.appendFile('file', i, function (err) {
  })
}
When I run this code, after a few seconds, it shows:
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
and yet there is nothing in the file!
My questions are:
Why is there not a single byte in the file?
What causes the out-of-memory error?
How can I write to a file asynchronously in a for loop, no matter how many writes there are?
Thanks in advance.
Bottom line here is that fs.appendFile() is an asynchronous call and you simply are not "awaiting" that call to complete on each loop iteration. This has a number of consequences, including but not limited to:
The callbacks keep getting allocated before they are resolved, which results in the "heap out of memory" eventually being reached.
You are contending for a file handle, since the function you are employing is actually opening/writing/closing the given file, and if you don't wait for each call to do so in turn, then you're simply going to clash.
So the simple solution here is to "wait", and some modern syntax sugar makes that easy:
const fs = require('mz/fs');

const x = 6551200;

(async function() {
  try {
    const fd = await fs.open('file', 'w');
    for (let i = 0; i < x; i++) {
      await fs.write(fd, `${i}\n`);
    }
    await fs.close(fd);
  } catch (e) {
    console.error(e)
  } finally {
    process.exit();
  }
})()
That will of course take a while, but it's not going to "blow up" your system whilst it does its work.
The very first simplified thing is to just get hold of the mz library, which already wraps common nodejs libraries with modernized versions of each function supporting promises. This will help clean up the syntax a lot as opposed to using callbacks.
The next thing to realize is what was mentioned about that fs.appendFile() in how it is "opening/writing/closing" all in one call. That's not great, so what you would typically do is simply open and then write the bytes in a loop, and when that is complete you can actually close the file handle.
That "sugar" comes in modern versions, and though "possible" with plain promise chaining, it's still not really that manageable. So if you don't actually have a nodejs environment that supports that async/await sugar or the tools to "transpile" such code, then you might alternately consider using the asyncjs libary with plain callbacks:
const Async = require('async');
const fs = require('fs');

const x = 6551200;
let i = 0;

fs.open('file', 'w', (err, fd) => {
  if (err) throw err;
  Async.whilst(
    () => i < x,
    callback => fs.write(fd, `${i}\n`, err => {
      i++;
      callback(err)
    }),
    err => {
      if (err) throw err;
      fs.closeSync(fd);
      process.exit();
    }
  );
});
The same base principle applies, as we are "waiting" for each callback to complete before continuing. The whilst() helper here allows iteration until the test condition is met, and of course does not start the next iteration until data is passed to the callback of the iterator itself.
There are other ways to approach this, but those are probably the two most sane for a "large loop" of iterations. Common approaches such as "chaining" via .reduce() are really more suited to a "reasonably" sized array of data you already have, and building arrays of such sizes here has inherent problems of its own.
For instance, the following "works" (on my machine at least), but it really consumes a lot of resources to do it:
const fs = require('mz/fs');

const x = 6551200;

fs.open('file', 'w')
  .then(fd =>
    [...Array(x)].reduce(
      (p, e, i) => p.then(() => fs.write(fd, `${i}\n`)),
      Promise.resolve()
    )
    .then(() => fs.close(fd))
  )
  .catch(e => console.error(e))
  .then(() => process.exit());
So it's really not that practical to build such a large chain in memory and then allow it to resolve. You could put some "governance" on this, but the main two approaches as shown are a lot more straightforward.
For that case, you either have the async/await sugar available, as it is within current LTS versions of Node (LTS 8.x), or, where you are restricted to a version without that support, I would stick with the other tried and true "async helpers" for callbacks.
You can of course "promisify" any function with the last few releases of nodejs right "out of the box" as it were, since Promise has been a global thing for some time:
const fs = require('fs');

const fd = await new Promise((resolve, reject) =>
  fs.open('file', 'w', (err, fd) => {
    if (err) reject(err);
    else resolve(fd);
  })
);
So there really is no need to import libraries just to do that, but the mz library given as example here does all of that for you. So it's really up to personal preferences on bringing in additional dependencies.
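For example, the built-in util module's promisify can wrap the callback versions directly; here is a minimal sketch of the same open/write/close pattern (a tiny count is used just to keep the illustration short):
const fs = require('fs');
const { promisify } = require('util');

const open = promisify(fs.open);
const write = promisify(fs.write);
const close = promisify(fs.close);

(async function() {
  const fd = await open('file', 'w');
  for (let i = 0; i < 10; i++) {
    await write(fd, `${i}\n`);
  }
  await close(fd);
})().catch(e => console.error(e));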
JavaScript is single threaded, which means your code can execute only one function at a time. So when you execute an async function, its callback is queued to be executed later.
So in your code, you are queuing 6551200 calls, which of course crashes your app before "appendFile" starts working on any of them.
You can achieve what you want by splitting your loop into smaller loops, using async/await, or using iterators (see the batching sketch after the example below).
If what you are trying to achieve is as simple as your code, you can use the following:
const fs = require("fs");
function SomeTask(i=0){
fs.appendFile('file',i,function(err){
//err in the write function
if(err) console.log("Error", err);
//check if you want to continue (loop)
if(i<6551200) return SomeTask(i);
//on finish
console.log("done");
});
}
SomeTask();
In the above code, you write a single line, and when that is done, you call the next one.
This function is just for basic usage; for advanced usage it needs a refactor using JavaScript iterators. Check out Iterators and generators on the MDN web docs.
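As for the "splitting your loop into smaller loops" idea mentioned above, here is a rough sketch (the function name and batch size are purely illustrative) that builds each batch as one string and awaits one append per batch via fs.promises:
const fsp = require('fs').promises;

async function appendInBatches(total, batchSize = 1000) {
  for (let start = 0; start < total; start += batchSize) {
    const end = Math.min(start + batchSize, total);
    let chunk = '';
    for (let i = start; i < end; i++) {
      chunk += `${i}\n`;
    }
    // One awaited append per batch keeps the number of pending operations small.
    await fsp.appendFile('file', chunk);
  }
  console.log('done');
}

appendInBatches(6551200).catch(console.error);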
1 - The file is empty because none of the fs.appendFile calls ever finished; the Node.js process crashed before they could.
2 - The Node.js heap memory is limited and stores each pending callback until it returns, not only the "i" variable.
3 - You could try to use promises to do that:
"use strict";
const Bluebird = require('bluebird');
const fs = Bluebird.promisifyAll(require('fs'));
let promisses = [];
for (let i = 0; i < 6551200; i++){
promisses.push(fs.appendFileAsync('file', i + '\n'));
}
Bluebird.all(promisses)
.then(data => {
console.log(data, 'End.');
})
.catch(e => console.error(e));
But no logic can avoid a heap memory error for a loop this big. You could increase the Node.js heap memory or, the more reasonable way, process chunks of data on an interval:
'use strict';
const fs = require('fs');

let total = 6551200;
let interval = setInterval(() => {
  fs.appendFile('file', total + '\n', () => {});
  total--;
  if (total < 1) {
    clearInterval(interval);
  }
}, 1);
I understand using the Q library it's easy to wait on a number of promises to complete, and then work with the list of values corresponding to those promise results:
Q.all([
  promise1,
  promise2,
  .
  .
  .
  promiseN,
]).then(function(results) {
  // results is a list of all the values from 1 to n
});
What happens, however, if I am only interested in the single, fastest-to-complete result? To give a use case: Say I am interested in examining a big list of files, and I'm content as soon as I find ANY file which contains the word "zimbabwe".
I can do it like this:
Q.all(fileNames.map(function(fileName) {
  return readFilePromise(fileName).then(function(fileContents) {
    return fileContents.includes('zimbabwe') ? fileContents : null;
  });
})).then(function(results) {
  var zimbabweFile = results.filter(function(r) { return r !== null; })[0];
});
But I need to finish processing every file even if I've already found "zimbabwe". If I have a 2kb file containing "zimbabwe", and a 30tb file not containing "zimbabwe" (and suppose I'm reading files asynchronously) - that's dumb!
What I want to be able to do is get a value the moment any promise is satisfied:
Q.any(fileNames.map(function(fileName) {
  return readFilePromise(fileName).then(function(fileContents) {
    if (fileContents.includes('zimbabwe')) return fileContents;
    /*
    Indicate failure
    - Return "null" or "undefined"?
    - Throw an error?
    */
  });
})).then(function(result) {
  // Only one result!
  var zimbabweFile = result;
}).fail(function() { /* no "zimbabwe" found */ });
With this approach I won't be waiting on my 30tb file if "zimbabwe" is discovered in my 2kb file early on.
But there is no such thing as Q.any!
My question: How do I get this behaviour?
Important note: This should return without errors even if an error occurs in one of the inner promises.
Note: I know I could hack Q.all by throwing an error when I find the 1st valid value, but I'd prefer to avoid this.
Note: I know that Q.any-like behavior could be incorrect, or inappropriate in many cases. Please trust that I have a valid use-case!
You are mixing two separate issues: racing, and cancelling.
Racing is easy, either using Promise.race, or the equivalent in your favorite promise library. If you prefer, you could write it yourself in about two lines:
function race(promises) {
  return new Promise((resolve, reject) =>
    promises.forEach(promise => promise.then(resolve, reject)));
}
That will reject if any promise rejects. If instead you want to skip rejects, and only reject if all promises reject, then
function race(promises) {
  let rejected = 0;
  return new Promise((resolve, reject) =>
    promises.forEach(promise => promise.then(
      resolve,
      () => { if (++rejected === promises.length) reject(); }
    ))
  );
}
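For instance, with the reject-skipping variant, the question's use case might look roughly like this (readFilePromise and fileNames are assumed from the question):
race(fileNames.map(fileName =>
  readFilePromise(fileName).then(fileContents => {
    if (fileContents.includes('zimbabwe')) return fileContents;
    // Reject so this file is skipped instead of winning the race.
    return Promise.reject(new Error('no match in ' + fileName));
  })
)).then(
  zimbabweFile => { /* first file found to contain "zimbabwe" */ },
  () => { /* no "zimbabwe" found in any file */ }
);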
Or, you could use the promise inversion trick with Promise.all, which I won't go into here.
Your real problem is different--you apparently want to "cancel" the other promises when some other one resolves. For that, you will need additional, specialized machinery. The object that represents each segment of processing will need some way to ask it to terminate. Here's some pseudo-code:
class Processor {
  promise() { ... }
  terminate() { ... }
}
Now you can write your version of race as
function race(processors) {
  let rejected = 0;
  return new Promise((resolve, reject) =>
    processors.forEach(processor => processor.promise().then(
      () => {
        resolve();
        processors.forEach(p => p.terminate());
      },
      () => { if (++rejected === processors.length) reject(); }
    ))
  );
}
There are various proposals to handle promise cancellation which might make this easier when they are implemented in a few years.
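For concreteness, here is one hypothetical shape the Processor above could take for the file-search use case, assuming a readable stream whose destroy() is what terminate() calls (all names are illustrative, not part of the answer's API):
const fs = require('fs');

class FileSearchProcessor {
  constructor(fileName, word) {
    this.stream = fs.createReadStream(fileName, { encoding: 'utf8' });
    this._promise = new Promise((resolve, reject) => {
      let tail = '';
      this.stream.on('data', chunk => {
        // Keep a small tail so a match spanning two chunks is not missed.
        const text = tail + chunk;
        if (text.includes(word)) resolve(fileName);
        tail = text.slice(-word.length);
      });
      this.stream.on('error', reject);
      this.stream.on('end', () => reject(new Error('not found')));
    });
  }
  promise() { return this._promise; }
  terminate() { this.stream.destroy(); } // stop reading; a promise that already lost the race is simply never settled
}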