How can I make an async Buffer in Node.js?

I have searched and searched and can't find an async Buffer.from().
I have a for loop that calls Buffer.from(); it all works, but each call blocks my loop for 100-300 ms, and unfortunately that cannot be the case.
Do you have any solutions?
// Edit
await group.getIcon() returns a Promise<Buffer>, and I need to convert the result to base64.
The code looks like this:
try {
  groupIcon = Buffer.from(await group.getIcon()).toString("base64");
} catch (error) {
  console.log("Not found icon");
}
Does anyone know of a library on npmjs.com that does this kind of thing asynchronously?
This is the function whose result I need to convert to base64: https://multivit4min.github.io/TS3-NodeJS-Library/classes/teamspeakservergroup.html#geticon

Buffer is synchronous. You could process that buffer in a few slices, but I wouldn't recommend it unless the buffer is really big: it could be hard to handle, and I don't think it is good practice for your problem. I have a better idea and suggestion for speeding up your function.
If you don't have a good reason to await inside the loop, don't do it.
I don't know exactly what your code does, but my guess is you can modify it to something like the code below for better performance.
let yourFunc = async () => {
  /** Some code that produces the groups array */
  const groups = await getGroups();

  // Start all getIcon() calls without awaiting each one.
  // Note: a try/catch around push() would not catch a rejected promise,
  // so attach a .catch() to each promise instead.
  const iconPromises = [];
  for (let group of groups) {
    iconPromises.push(
      group.getIcon().catch(error => {
        console.log('Not found icon');
        return null;
      })
    );
  }

  // Or you can use Promise.allSettled for better error handling
  const icons = await Promise.all(iconPromises);
  const groupIcons = icons
    .filter(icon => icon !== null)
    .map(icon => Buffer.from(icon).toString('base64'));
  /** then do whatever you want */
};
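For completeness, here is a minimal sketch of the Promise.allSettled variant mentioned in the comment above (getGroups() and group.getIcon() are the same assumed helpers as in the code before):
let yourFuncSettled = async () => {
  const groups = await getGroups();
  // allSettled never rejects; each result reports its own outcome.
  const results = await Promise.allSettled(groups.map(group => group.getIcon()));
  const groupIcons = results
    .filter(result => result.status === 'fulfilled')
    .map(result => Buffer.from(result.value).toString('base64'));
  return groupIcons;
};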

Related

Typescript Handling promise errors

In a controller I have multiple calls to some methods that return a promise.
I'm going to use async/await and I have something like this:
try {
  let foo = await myFirstMethod();
  let bar = await mySecondMethod();
} catch (e) {
  // which method failed, of the two?
}
Yes, I know that I could split the calls into two separate try/catch statements, but I have to handle this scenario as well, and in addition I'd like to understand the best way to get a specific type of error response for each method.
Any help or suggestions are appreciated and welcome.
Thanks
What you're looking for is Promise.allSettled
let promises = [myFirstMethod(), mySecondMethod()];
Promise.allSettled(promises).then((results) => {
  // results is an array of objects representing the promises' final state
  // result.status is either "fulfilled" or "rejected"
  results.forEach(result => {
    console.log(result.status);
  });
});
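Since each entry in results corresponds by index to the promise you passed in, you can tell exactly which method failed. A minimal sketch (the error shape depends on what your methods reject with):
Promise.allSettled([myFirstMethod(), mySecondMethod()]).then(([first, second]) => {
  if (first.status === 'rejected') {
    console.log('myFirstMethod failed:', first.reason);
  }
  if (second.status === 'rejected') {
    console.log('mySecondMethod failed:', second.reason);
  }
  // Fulfilled entries expose their value instead:
  if (first.status === 'fulfilled') {
    console.log('foo =', first.value);
  }
});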

Nodejs loop through array of urls in a synchronous way

I've worked with Node for 2 years now but cannot solve the following requirements:
I have an array of ~50,000 parameters.
I need to loop through the array and make a GET request to the same URL each time, with the parameter appended.
I need to write the result of the URL call back to the array.
It has to be done one by one, as I cannot call the API with several concurrent requests.
I'm sure there is a simple solution, but everything I tried didn't make the code wait for the GET request to return. I know that doing things synchronously in Node is not the way we should do things, but in this particular situation it is by design that the process must not continue until the result comes back.
Any hint appreciated.
Regards
Use a for loop, use a means of doing the GET request that returns a promise (such as the got() library) and then use await to pause the for loop until your response comes back.
const got = require('got');

const yourArray = [...];

async function run() {
  for (let [index, item] of yourArray.entries()) {
    try {
      let result = await got(item.url);
      // do something with the result
    } catch (e) {
      // either handle the error here or throw to stop further processing
    }
  }
}

run().then(() => {
  console.log("all done");
}).catch(err => {
  console.log(err);
});
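To satisfy the requirement of writing each result back to the array, you can assign by index inside the loop. A minimal sketch (it assumes each array entry is an object with a url property, as above, and stores the response body on a hypothetical result property):
async function runAndStore() {
  for (let [index, item] of yourArray.entries()) {
    try {
      let response = await got(item.url);
      // Write the result of the call back onto the array entry.
      yourArray[index].result = response.body;
    } catch (e) {
      yourArray[index].result = null; // mark failed calls
    }
  }
}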

Converting Promise.all to gradual promise resolution (every 3 promises, for example) does not work

I have a list of promises and currently I am using Promise.all to resolve them.
Here is my code for now:
const pageFutures = myQuery.pages.map(async (pageNumber: number) => {
  const urlObject: any = await this._service.getResultURL(searchRecord.details.id, authorization, pageNumber);
  if (!urlObject.url) {
    // throw error
  }
  const data = await rp.get({
    gzip: true,
    headers: {
      "Accept-Encoding": "gzip,deflate",
    },
    json: true,
    uri: `${urlObject.url}`,
  });
  const objects = data.objects.filter((object: any) => object.type === "observed-data" && object.created);
  return new Promise((resolve, reject) => {
    this._resultsDatastore.bulkInsert(
      databaseName,
      objects
    ).then(succ => {
      resolve(succ);
    }, err => {
      reject(err);
    });
  });
});
const all: any = await Promise.all(pageFutures).catch(e => {
  console.log(e);
});
So as you see here I use Promise.all and it works:
const all: any = await Promise.all(pageFutures).catch(e => {
  console.log(e);
});
However, I noticed it affects the database performance-wise, so I decided to resolve only 3 of them at a time.
For that I was thinking of different approaches, like cwait, async-pool, or writing my own iterator,
but I get confused about how to do that.
For example, when I use cwait:
let promiseQueue = new TaskQueue(Promise, 3);
const all = new Promise.map(pageFutures, promiseQueue.wrap(() => {}));
I do not know what to pass inside the wrap, so I pass () => {} for now, plus I get:
Property 'map' does not exist on type 'PromiseConstructor'.
So whatever way gets it working (my own iterator or any library) I am OK with, as long as I have a good understanding of it.
I'd appreciate it if anyone could shed light on this and help me out of this confusion.
First some remarks:
Indeed, in your current setup, the database may have to process several bulk inserts concurrently. But that concurrency is not caused by using Promise.all. Even if you had left out Promise.all from your code, it would still have that behaviour. That is because the promises were already created, and so the database requests will be executed any way.
Not related to your issue, but don't use the promise constructor antipattern: there is no need to create a promise with new Promise when you already have a promise in your hands: bulkInsert() returns a promise, so return that one.
As your concern is about the database load, I would limit the work initiated by the pageFutures promises to the non-database aspects: they don't have to wait for each other's resolution, so that code can stay as it was.
Let those promises resolve with what you currently store in objects: the data you want to have inserted. Then concatenate all those arrays together to one big array, and feed that to one database bulkInsert() call.
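As a quick illustration of the promise constructor antipattern from the second remark, a minimal before/after sketch (datastore stands for this._resultsDatastore from the question):
// Antipattern: wrapping an existing promise in a new Promise.
function insertWrapped(datastore, databaseName, objects) {
  return new Promise((resolve, reject) => {
    datastore.bulkInsert(databaseName, objects)
      .then(succ => resolve(succ), err => reject(err));
  });
}

// Better: bulkInsert() already returns a promise, so just return it.
function insertDirect(datastore, databaseName, objects) {
  return datastore.bulkInsert(databaseName, objects);
}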
Here is how that could look:
const pageFutures = myQuery.pages.map(async (pageNumber: number) => {
  const urlObject: any = await this._service.getResultURL(searchRecord.details.id,
    authorization, pageNumber);
  if (!urlObject.url) {
    // throw error
  }
  const data = await rp.get({
    gzip: true,
    headers: { "Accept-Encoding": "gzip,deflate" },
    json: true,
    uri: `${urlObject.url}`,
  });
  // Return here, don't access the database yet...
  return data.objects.filter((object: any) => object.type === "observed-data"
    && object.created);
});
const all: any = (await Promise.all(pageFutures).catch(e => {
  console.log(e);
  return []; // in case of error, still return an array
})).flat(); // flatten it, so all data chunks are concatenated in one long array
// Don't create a new Promise with `new`, only to wrap another promise.
// It is an antipattern. Use the promise returned by `bulkInsert`.
return this._resultsDatastore.bulkInsert(databaseName, all);
This uses .flat(), which is rather new. In case you have no support for it, look at the alternatives provided on MDN.
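If .flat() is unavailable in your runtime, one common fallback (my sketch, not from the answer; arrayOfArrays stands for the resolved Promise.all result) is to concatenate the chunks yourself:
const arrayOfArrays = [[1, 2], [3], [4, 5]]; // stands for the resolved chunks
// One level of flattening, equivalent to arrayOfArrays.flat():
const flattened = [].concat(...arrayOfArrays);
// Or with reduce, which avoids spreading very large arrays:
const flattened2 = arrayOfArrays.reduce((acc, chunk) => acc.concat(chunk), []);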
First, you asked a question about a failing solution attempt. That is called an X/Y problem.
So in fact, as I understand your question, you want to delay some DB requests.
You don't want to delay the resolving of a Promise created by a DB request... No! Don't try that! The promise will resolve when the DB returns a result. It's a bad idea to interfere with that process.
I banged my head for a while against the library you tried, but I could not solve your issue with it. So I came up with the idea of just looping over the data and setting some timeouts.
I made a runnable demo here: Delaying DB request in small batch
Here is the code. Notice that I simulated some data and a DB request; you will have to adapt it. You also will have to adjust the timeout delay; a full second certainly is too long.
// That part is to simulate some data you would like to save.
// Let's make it a random amount for fun.
let howMuch = Math.ceil(Math.random() * 20)

// A fake data array...
let someData = []
for (let i = 0; i < howMuch; i++) {
  someData.push("Data #" + i)
}
console.log("Some fake data")
console.log(someData)
console.log("")

// So we have some data that looks real. (lol)
// We want to save it in small groups.

// And that is to simulate your DB request.
let saveToDB = (data, dataIterator) => {
  console.log("Requesting DB...")
  return new Promise(function (resolve, reject) {
    resolve("Request #" + dataIterator + " complete.")
  })
}

// Ok, we have everything. Let's proceed!
let batchSize = 3 // The amount of requests to do at once.
let delay = 1000 // The delay between each batch.

// Loop through all the data you have.
for (let i = 0; i < someData.length; i++) {
  if (i % batchSize == 0) {
    console.log("Splitting in batch...")
    // Process a batch on one timeout.
    setTimeout(() => {
      // An empty line to clarify the console.
      console.log("")
      // Grouping the requests by "batchSize", or fewer if we're almost done.
      for (let j = 0; j < batchSize; j++) {
        // If there still is data to process.
        if (i + j < someData.length) {
          // Your real database request goes here.
          saveToDB(someData[i + j], i + j).then(result => {
            console.log(result)
            // Do something with the result.
            // ...
          })
        } // END if there is still data.
      } // END sending requests for that batch.
    }, delay * (i / batchSize)) // The delay grows by one `delay` per batch.
  } // END splitting in batch.
} // END for each data.
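An alternative (my own sketch, not part of the original answer, reusing saveToDB and someData from the demo above) is to drive the batches with async/await and a sleep helper, which avoids scheduling all the timeouts up front:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function saveInBatches(someData, batchSize, delay) {
  for (let i = 0; i < someData.length; i += batchSize) {
    const batch = someData.slice(i, i + batchSize);
    // Fire one batch and wait for all of its requests to settle...
    await Promise.all(batch.map((item, j) => saveToDB(item, i + j)));
    // ...then pause before starting the next one.
    if (i + batchSize < someData.length) await sleep(delay);
  }
}

saveInBatches(someData, 3, 1000).then(() => console.log("All batches done."));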

when should callback functions in JavaScript be used

Can someone explain how we should know when to use a callback?
Like in the code given here as a link
(a snippet of the code is given below)
we see that in the readFile call inside fetchAll(cb), we used a callback, denoted by (cb), to read the content, parse it, stringify it and so on, but in the readFile call inside save() there was no need for (cb). So how can we know when to use a callback?
It's simple: just be aware of the nature of the methods you are using.
readFile is async, so it needs a callback. What it is basically saying is: "Hey, I am going to read the file you asked for, but while I read it you can do other stuff instead of waiting for me, and when I am finished I will come back to you." In your readFile call inside save() you still pass a callback:
(err, fileContent) => {
  // do stuff
}
Callbacks are used to handle async code so we can continue working while something else is happening and we do not want to stop and wait for it.
const fs = require('fs')
const path = require('path')

module.exports = class Product {
  constructor(title, imgurl, description, price) {
    this.title = title
    this.imgurl = imgurl
    this.description = description
    this.price = price
  }

  save() {
    const p = path.join(__dirname, '../', 'data', 'products.json')
    fs.readFile(p, (err, fileContent) => {
      let products = []
      if (!err) {
        products = JSON.parse(fileContent)
      }
      products.push(this)
      fs.writeFile(p, JSON.stringify(products), (err) => {
        console.log(err)
      })
    })
  }

  static fetchAll(cb) {
    const p = path.join(__dirname, '../', 'data', 'products.json')
    fs.readFile(p, (err, fileContent) => {
      if (err) {
        // return here so the callback is not invoked a second time below
        return cb([])
      }
      cb(JSON.parse(fileContent))
    })
  }
}
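For reference, a minimal usage sketch of the class above (the require path and field values are hypothetical):
// Assuming the class above lives in product.js:
const Product = require('./product')

// save() needs no callback from the caller; it handles its own async steps.
new Product('Book', 'http://example.com/book.png', 'A book', 9.99).save()

// fetchAll() hands its result to the callback you pass in.
Product.fetchAll(products => {
  console.log(products)
})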

Should loops be avoided in Node.JS or is there a special way to handle them?

Loops are blocking. They seem at odds with the idea of Node.js. How do you handle a flow where a for loop or a while loop seems to be the best option?
For example, if I want to print a table of a random number up to number * 1000, I would want to use a for loop. Is there a special way to handle this in Node.js?
Loops are not bad per se, but it depends on the situation. In most cases you will need to do some async stuff inside loops, though.
So my personal preference is to not use bare loops at all but instead go with the functional counterparts (forEach/map/reduce/filter). This way my code base stays consistent (and a sync loop is easily changed to an async one if needed).
const myArr = [1, 2, 3];

// sync loops
myArr.forEach(syncLogFunction);
console.log('after sync loop');

function syncLogFunction(entry) {
  console.log('sync loop', entry);
}

// now we want to change that into an async operation:
Promise.all(myArr.map(asyncLogFunction))
  .then(() => console.log('after async loop'));

function asyncLogFunction(entry) {
  console.log('async loop', entry);
  return new Promise(resolve => setTimeout(resolve, 100));
}
Notice how easily you can change between sync and async versions, the structure stays almost the same.
Hope this helps a bit.
If you are doing loops on data in memory (for example, you want to go through an array and add a prop to all objects), loops will work normally, but if you need to do something inside the loop like save values to a DB, you will run into some issues.
I realize this isn't exactly the answer, but it's a suggestion that may help someone. I found that one of the easiest ways to deal with this issue is to use a rate limiter with a forEach (I don't really like promises). This also gives the added benefit of being able to process things in parallel, but only move on when everything is done:
https://github.com/jhurliman/node-rate-limiter
var RateLimiter = require('limiter').RateLimiter;
var limiter = new RateLimiter(1, 5);

exports.saveFile = function (myArray, next) {
  var completed = 0;
  var totalFiles = myArray.length;
  myArray.forEach(function (item) {
    limiter.removeTokens(1, function () {
      // call some async function
      saveAndLog(item, function (err, result) {
        // check for errors
        completed++;
        if (completed == totalFiles) {
          // call next function
          exports.process();
        }
      });
    });
  });
};
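A minimal sketch of calling the export above (hypothetical file names; note that the snippet signals completion through exports.process() rather than through the next argument):
// Queue all items; the limiter paces the saveAndLog calls.
var files = ['a.txt', 'b.txt', 'c.txt'];
exports.saveFile(files);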
