I've got the following function, which works as expected in Parse Server Cloud Code; however, it's painfully slow.
The nested for loops, which internally call queries and save functions, are undoubtedly the root cause.
How can I refactor this code to do some async processing, or better yet, what methods are there to remove/edit the relations on an object? The documentation around this is very poor.
ClientLabels.applyClientLabels = async (req, res) => {
const { clients, labels } = req.params;
const user = req.user;
const objectIds = clients.map((client) => client.objectId);
const clientSaveList = [];
const clientClass = Parse.Object.extend('Clients');
const query = new Parse.Query(clientClass);
query.containedIn("objectId", objectIds);
const queryResult = await query.find({ sessionToken: user.getSessionToken() })
try {
for (const client of queryResult) {
const labelRelation = client.relation('labels');
const relatedLabels = await labelRelation.query().find({ sessionToken: user.getSessionToken() });
labelRelation.remove(relatedLabels);
for (const label of labels) {
label.className = "ClientLabels";
const labelRelationObj = Parse.Object.fromJSON(label)
labelRelation.add(labelRelationObj);
};
clientSaveList.push(client);
};
const saved = await Parse.Object.saveAll(clientSaveList, { sessionToken: user.getSessionToken() })
res.success(saved);
} catch (e) {
res.error(e);
};
}
Explanation of some weirdness:
I am having to call Parse.Object.fromJSON in order to make the client side label object a ParseObjectSubClass and allow operations on it such as adding relations.
You cannot use include on a relation query as you would with a Pointer, so there needs to be a query for relations all on its own. An array of pointers was ruled out as there is going to be an unknown number of labels applied.
There are a few things that can be done:
1. The creation of labels in the inner loop is invariant relative to the outer loop, so that can be done once, at the start.
2. There's no need to query the relation if you're just going to remove the related objects. Use unset() and then add to replace the relations.
3. This won't save much computation, but clientSaveList is superfluous; we can just save the query result...
ClientLabels.applyClientLabels = async (req, res) => {
const { clients, labels } = req.params;
const objectIds = clients.map((client) => client.objectId);
let labelObjects = labels.map(label => {
label.className = "ClientLabels";
return Parse.Object.fromJSON(label)
});
const query = new Parse.Query('Clients');
query.containedIn("objectId", objectIds);
const sessionToken = req.user.getSessionToken();
const queryResult = await query.find({ sessionToken: sessionToken })
try {
for (const client of queryResult) {
client.unset('labels');
client.relation('labels').add(labelObjects);
};
const saved = await Parse.Object.saveAll(queryResult, { sessionToken: sessionToken })
res.success(saved);
} catch (e) {
res.error(e);
};
}
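If the round trip for the initial query also matters, and you trust the incoming objectIds, a further sketch (untested) could skip the fetch entirely by building stub objects with createWithoutData; ACLs are still enforced when saveAll runs with the session token:

ClientLabels.applyClientLabels = async (req, res) => {
  const { clients, labels } = req.params;
  const sessionToken = req.user.getSessionToken();
  const labelObjects = labels.map(label => {
    label.className = "ClientLabels";
    return Parse.Object.fromJSON(label);
  });
  const ClientsClass = Parse.Object.extend('Clients');
  const clientObjects = clients.map(({ objectId }) => {
    // createWithoutData builds a local reference without fetching the row
    const client = ClientsClass.createWithoutData(objectId);
    client.unset('labels');
    client.relation('labels').add(labelObjects);
    return client;
  });
  try {
    const saved = await Parse.Object.saveAll(clientObjects, { sessionToken });
    res.success(saved);
  } catch (e) {
    res.error(e);
  }
};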
I've read the documentation several times, but I don't understand why I should need an extra layer (forEach) in my code when I read all of the data inside a collection using Firebase (Cloud Firestore).
Here is the original documentation:
https://firebase.google.com/docs/firestore/query-data/get-data#get_all_documents_in_a_collection
Here is my code:
async loadUsers(): Promise<User[]> {
const users = new Array<User>();
const snapshot = await this.firestore.collection('users').get();
snapshot.forEach((collection) => {
collection.docs.forEach(doc => {
users.push(doc.data() as User);
});
});
return users;
}
As I understand it should work like this:
async loadUsers(): Promise<User[]> {
const users = new Array<User>();
const snapshot = await this.firestore.collection('users').get();
snapshot.forEach(doc => {
users.push(doc.data() as User);
});
return users;
}
Error message:
"Property 'data' does not exist on type 'QuerySnapshot'."
.collection().get() does NOT return an array; it returns a QuerySnapshot, which has a property .docs, which is an array of QueryDocumentSnapshot, each of which has a method .data(), which returns the data read from the document.
Documentation
https://firebase.google.com/docs/reference/js/firebase.firestore.CollectionReference
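Putting that together, the usual pattern is (a minimal sketch, assuming firestore is an initialized instance):

const snapshot = await firestore.collection('users').get(); // QuerySnapshot
const users = snapshot.docs.map(doc => doc.data());         // QueryDocumentSnapshot -> data()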
In the new modular Firebase Firestore (version 9+), it should be like this:
import { getFirestore, collection, query, orderBy, getDocs } from 'firebase/firestore/lite'
async readAll() {
const firestore = getFirestore()
const collectionRef = collection(firestore, '/users')
let q = query(collectionRef, orderBy('createTimestamp', 'desc'))
const querySnapshot = await getDocs(q)
const items = []
querySnapshot.forEach(document => {
items.push(document.data())
})
return items
}
I could not find a property on querySnapshot that directly contains the whole array the way .docs did before, so you iterate with forEach, much like with onSnapshot.
Based on @LeadDreamer's answer, I managed to simplify the code. Note that you cannot await a .subscribe() call; the Observable returned by get() has to be converted to a Promise first (here with firstValueFrom from rxjs):
async loadUsers(): Promise<User[]> {
const users = new Array<User>();
const querySnapshot = await firstValueFrom(this.firestore.collection('users').get());
querySnapshot.docs.forEach(doc => {
users.push(doc.data() as User);
});
return users;
}
There seems to be no other way but to iterate.
// Modular SDK imports; db is your initialized Firestore instance
import { getFirestore, collection, query, getDocs } from "firebase/firestore";
const db = getFirestore();
const q = query(collection(db, "item"));
getDocs(q).then( response => {
const result = response.docs.map(doc=>({
id: doc.id,
...doc.data(),
}))
console.log(result);
}).catch(err=>console.log(err))
I'm using Back4app.
My Profile class schema has 4 File columns containing pictures.
So when I retrieve an object, I have to make an HTTP request for each file URL and get the byte data like this:
const data = await Parse.Cloud.httpRequest({url:profilePhoto.url()});
return data.buffer.toString('base64');
But for all four files, I have to make 4 HTTP requests to the server.
Is there any way to do a batch HTTP request so that with just 1 request I can get the data for all 4 files?
My main aim is to make as few requests to the server as possible.
There is no out-of-the-box way to retrieve multiple files with one request in Parse Server.
You could implement your own Parse Cloud Code function to retrieve multiple files, but you would have to manually combine them server side and separate them client side.
As a starting point you could look at packages like multistream that allow you to combine multiple file streams into one to get some inspiration.
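For instance, a rough Cloud Code sketch (untested; the function name and the urls parameter are made up) that fetches several files server side and returns them base64-encoded in one response:

Parse.Cloud.define("getFilesBatch", async (request) => {
  const { urls } = request.params; // hypothetical: an array of Parse.File URLs
  // Fetch all files in parallel on the server, one client round trip total
  const responses = await Promise.all(
    urls.map(url => Parse.Cloud.httpRequest({ url }))
  );
  // The client splits the payload back apart by index
  return responses.map(r => r.buffer.toString('base64'));
});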
You might be able to do something similar to what I've done in cloud code.
I had to load up a bunch of information at the start of my application, requiring many round trips to the server.
So I wrote a function called getUserData().
This does many unrelated queries, and jams all of the results into one big object. I then return the object from the function.
Here is the entire function:
console.log("startig getUserData");
var callCount = 0;
var lastLoadTime=0;
// Given a user, load all friends. Save the objects to ret.objects,
// and save the objectIds to ret.friends
//
// Note: we always load the exhaustive friend list, because
// otherwise, we would have no way of recognizing
// removed friendships.
//
async function loadFriends(user, ret) {
const friendQuery = user.relation("friends").query();
const friends = await findFully(friendQuery);
for(var i=0;i<friends.length;i++){
ret.friends[friends[i].id]=1;
ret.objects[friends[i].id]=friends[i];
};
}
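// findFully() isn't defined in this listing. Presumably it pages past the
// per-query row limit; a hedged sketch (an assumption, not the author's helper):
async function findFully(query) {
    const pageSize = 100;
    const results = [];
    var page;
    do {
        // fetch the next page and append it until a short page comes back
        query.limit(pageSize);
        query.skip(results.length);
        page = await query.find();
        results.push(...page);
    } while (page.length === pageSize);
    return results;
}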
// Given a user, load all owned cells. Save the objects to ret.owned,
// and save their objectIds to ret.ownedCells.
//
// Also, save the ids of members, which we will use to flesh out ret.objects with
// the objects who are not friends, but share a cell with the current user.
async function loadPublicCells(user, ret, memberIds) {
const ownedCellQ = new Parse.Query('PublicCell');
ownedCellQ.equalTo('owner',user);
const joinedCellQ = new Parse.Query('PublicCell');
joinedCellQ.equalTo('members',user);
const publicCellQ = Parse.Query.or(ownedCellQ,joinedCellQ);
publicCellQ.greaterThan("updatedAt",new Date(lastLoadTime));
const publicCells=await findFully(publicCellQ);
for(var i=0;i<publicCells.length;i++) {
const cell = publicCells[i];
ret.ownedCells[cell.id]=cell;
const owner = cell.get("owner");
if(owner==null)
continue;
ret.objects[cell.id]=cell;
if(owner.id === user.id) {
ret.ownedCells[cell.id]=1;
} else {
ret.joinedCells[cell.id]=1;
};
const memberQ = cell.relation("members").query();
const members = await findFully(memberQ);
if(ret.memberMap[cell.id]==null)
ret.memberMap[cell.id]={};
const map = ret.memberMap[cell.id];
for(var j=0;j<members.length;j++){
const member=members[j];
map[member.id]=1;
ret.objects[member.id]=member;
};
};
};
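// Note: the member queries in loadPublicCells run one cell at a time. A hedged,
// optional variant could fan them out in parallel (sketch; assumes findFully above):
async function loadCellMembers(cells, ret) {
    await Promise.all(cells.map(async (cell) => {
        const members = await findFully(cell.relation("members").query());
        const map = ret.memberMap[cell.id] || (ret.memberMap[cell.id] = {});
        for (const member of members) {
            map[member.id] = 1;
            ret.objects[member.id] = member;
        }
    }));
}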
// given a list of all members of all cells, load those objects and store
// them in ret.objects. We do not have to record which cells they belong
// to, because that information is in ret.memberMap
async function loadMembers(memberIds, ret) {
const memberQ = new Parse.Query(Parse.User);
var partIds;
while(memberIds.length){
partIds = memberIds.splice(0,100);
memberQ.containedIn('objectId',partIds);
const part = await findFully(memberQ);
for(var i=0;i<part.length;i++) {
ret.objects[part[i].id]=part[i];
}
};
};
// given a user, save all of the objectIds of people who have annoyed him with
// spam. We save only the ids, they don't go on ret.objects, because we only
// need to filter them out of things. The objectIds are sufficient.
//
// We always send all spam objects, otherwise we would not recognize deletions
async function loadUserSpams(user, ret) {
const userSpamsQ = new Parse.Query("_User");
userSpamsQ.equalTo("spamUsers",user);
userSpamsQ.greaterThan("updatedAt", new Date(lastLoadTime));
const userSpams = await findFully(userSpamsQ);
for(var i=0;i<userSpams.length;i++){
ret.userSpams[userSpams[i].id]=1;
};
};
// given a user, save all of the objectIds of people who have been annoyed *BY*
// him with spam. We save only the ids, they don't go on ret.objects, because we
// only need to filter them out of things. The objectIds are sufficient.
//
// We always send all spam objects, otherwise we would not recognize deletions
async function loadSpamUsers(user, ret) {
const spamUserR = user.relation('spamUsers');
const spamUserQ = spamUserR.query();
spamUserQ.greaterThan("updatedAt", new Date(lastLoadTime));
const spamUsers = await findFully(spamUserQ);
for(var i=0;i<spamUsers.length;i++){
ret.spamUsers[spamUsers[i].id]=1;
};
};
// given a user, save all of the objectIds of people to whom he has sent a
// friend request which is still pending. We save only the ids, they don't go
// on ret.objects, because we only need to filter them out of things. The
// objectIds are sufficient.
async function loadPendingFriends(user, ret) {
const request1Q = new Parse.Query('Request');
request1Q.equalTo("owner",user);
const request2Q = new Parse.Query('Request');
request2Q.equalTo("sentTo",user);
const requestQ = Parse.Query.or(request1Q,request2Q);
requestQ.equalTo("status",'PENDING');
const requests = await findFully(requestQ);
for(var i=0;i<requests.length;i++){
const request = requests[i];
const sentBy = request.get("owner");
if(sentBy==null){
console.warn("sentBy==null");
continue;
};
const sentTo = request.get("sentTo");
if(sentTo==null){
console.warn("sentTo==null");
continue;
};
console.dump({sentTo,sentBy});
if(sentBy.id==user.id){
ret["pendingFriends"][sentTo.id]=sentTo;
} else if ( sentTo.id==user.id ) {
ret["friendingPends"][sentBy.id]=sentBy;
};
};
};
// given a user, load all of his private cells. We do not store
// the user objects, because only friends will be in your private cells.
async function loadPrivateCells(user, ret) {
const privateCellQ = new Parse.Query('PrivateCell');
privateCellQ.equalTo("owner", user);
privateCellQ.greaterThan("updatedAt", new Date(lastLoadTime));
const privateCells = await findFully(privateCellQ);
for(var i=0;i<privateCells.length;i++) {
const cell = privateCells[i];
ret.objects[cell.id]=cell;
ret.privateCells[cell.id]=cell;
if(ret.memberMap[cell.id]==null)
ret.memberMap[cell.id]={};
const map = ret.memberMap[cell.id];
const memberQ = cell.relation("members").query();
const members = await findFully(memberQ);
for(var j=0;j<members.length;j++){
const member=members[j];
map[member.id]=1;
ret.objects[member.id]=member;
};
};
}
// we use objects as maps to weed out duplicate objects and cells.
// when we are done, we use this function to replace the object
// with an array of objects. we don't need to send the keys, since
// they already exist within the objects.
function objToValueList(k,ret){
const objs = [];
for( var id in ret[k] )
objs.push(ret[k][id]);
ret[k]=objs;
ret.counts[k]=objs.length;
};
// convert the objects which have been used to accumulate key lists
// to arrays of objectIds. k is the name of the list we are working
// on. ret[k] is the list itself.
function objToKeyList(k,ret) {
const objs = [];
for( var id in ret[k] ) {
objs.push(id);
};
ret[k]=objs;
ret.counts[k]=objs.length;
};
async function checkUserConsent(user){
const query = new Parse.Query("PrivacyPolicy");
query.descending("createdAt");
query.limit(1);
const res = await query.find();
if(res.length==0) {
return true;
};
const policy=res[0];
console.dump(policy);
console.log(policy);
const userConsent=user.get("lastConsent");
return userConsent!=null && userConsent.id == policy.id;
};
async function loadAlerts(user,ret) {
const q1 = new Parse.Query("Alert");
q1.equalTo("owner", user);
const q2 = new Parse.Query("Response");
q2.equalTo("owner", user);
const q3 = new Parse.Query("Alert");
q3.matchesKeyInQuery("objectId", "alert", q2);
const q = Parse.Query.or(q1,q3);
var time = new Date().getTime();
time -= 1000*86400;
time=Math.max(lastLoadTime, time);
// apply the time filter before running the query; adding it afterwards has no effect
q.greaterThan("updatedAt", new Date(time));
const list = await q.find();
for(var i=0;i<list.length;i++) {
const item=list[i];
ret.alerts[item.id]=1;
ret.objects[item.id]=item;
};
}
async function doGetUserData(user) {
if(!user)
return {fatal: 'not logged in!' };
const ret = {
owner: {},
privateCells: {},
friends: {},
alerts: {},
objects: {},
ownedCells: {},
joinedCells: {},
spamUsers: {},
userSpams: {},
pendingFriends: {},
friendingPends: {},
memberMap: {},
loadTime: lastLoadTime,
counts: {callCount: callCount++},
};
{
await user.fetch();
ret.owner=user.id;
const memberIds={};
ret.objects[user.id]=user;
console.log("loadFriends");
await loadFriends(user,ret);
console.log("loadPrivateCells");
await loadPrivateCells(user,ret,memberIds);
console.log("loadPublicCells");
await loadPublicCells(user,ret,memberIds);
console.log("loadPendingFriends");
await loadPendingFriends(user,ret);
console.log("loadUserSpams");
await loadUserSpams(user,ret);
console.log("loadSpamUsers");
await loadSpamUsers(user,ret);
console.log("loadAlerts");
await loadAlerts(user,ret);
const memberList=[];
for( var id in memberIds ) {
console.log(ret.objects[id]);
memberList.push(id);
};
console.log("loadMembers");
await loadMembers(memberList,ret);
}
for(var cell in ret.memberMap) {
var map = ret.memberMap[cell];
var list = [];
ret.memberMap[cell]=list;
for(var member in map) {
list.push(member);
};
}
delete ret.objects[user.id];
[
'friends', "friendingPends", 'pendingFriends',
'privateCells', 'ownedCells', 'joinedCells',
'userSpams', 'spamUsers', "alerts"
].forEach((k)=>{
objToKeyList(k,ret);
});
objToValueList('objects',ret);
delete ret.counts;
return ret;
}
async function getUserData(req) {
try {
var nextLoadTime=new Date().getTime();
const user = req.user;
console.log(user);
lastLoadTime = req.params.lastLoadTime;
if(lastLoadTime==null)
lastLoadTime=0;
lastLoadTime = new Date(lastLoadTime);
const ret = await doGetUserData(user);
ret.loadTime=nextLoadTime;
return ret;
} catch ( err ) {
console.log(err);
console.log(err.stack); // stack is a property, not a method
throw (`error getting data: ${err}`);
};
};
Parse.Cloud.define("getUserData", getUserData);
Something like this could easily be done to get your data for you. Like this solution, it is unlikely to be entirely pretty, but it would probably work.
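For completeness, the client side then needs only one round trip (sketch; parameter shape as used in getUserData above):

// Client side: one cloud call replaces the many individual queries.
const userData = await Parse.Cloud.run("getUserData", { lastLoadTime });
console.log(userData.friends, userData.ownedCells, userData.objects);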
I'm creating a NodeJS backend where a process reads in data from a source, checks for changes compared to the current data, makes those updates to MongoDB and reports the changes made. Everything works, except I can't get the changes reported, because I can't get the Mongoose update action to await.
The returned array from this function is then displayed by a Koa server. It shows an empty array, and in the server logs, the correct values appear after the server has returned the empty response.
I've dug through Mongoose docs and Stack Overflow questions – quite a few questions about the topic – but with no success. None of the solutions provided seem to help. I've isolated the issue to this part: if I remove the Mongoose part, everything works as expected.
const parseJSON = async xmlData => {
const changes = []
const games = await Game.find({})
const gameObjects = games.map(game => {
return new GameObject(game.name, game.id, game)
})
let jsonObj = require("../sample.json")
Object.keys(jsonObj.items.item).forEach(async item => {
const game = jsonObj.items.item[item]
const gameID = game["#_objectid"]
const rating = game.stats.rating["#_value"]
if (rating === "N/A") return
const gameObject = await gameObjects.find(
game => game.bgg === parseInt(gameID)
)
if (gameObject && gameObject.rating !== parseInt(rating)) {
try {
const updated = await Game.findOneAndUpdate(
{ _id: gameObject.id },
{ rating: rating },
{ new: true }
).exec()
changes.push(
`${updated.name}: ${gameObject.rating} -> ${updated.rating}`
)
} catch (error) {
console.log(error)
}
}
})
return changes
}
Everything works – the changes are found and the database is updated, but the reported changes are returned too late, because the execution doesn't wait for Mongoose.
I've also tried this instead of findOneAndUpdate():
const updated = await Game.findOne()
.where("_id")
.in([gameObject.id])
.exec()
updated.rating = rating
await updated.save()
The same results here: everything else works, but the async doesn't.
As @Puneet Sharma mentioned, you'll have to use map instead of forEach to get an array of promises, then await those promises (using Promise.all for convenience) before returning changes, which will by then have been populated:
const parseJSON = async xmlData => {
const changes = []
const games = await Game.find({})
const gameObjects = games.map(game => {
return new GameObject(game.name, game.id, game)
})
const jsonObj = require("../sample.json")
const promises = Object.keys(jsonObj.items.item).map(async item => {
const game = jsonObj.items.item[item]
const gameID = game["#_objectid"]
const rating = game.stats.rating["#_value"]
if (rating === "N/A") return
const gameObject = await gameObjects.find(
game => game.bgg === parseInt(gameID)
)
if (gameObject && gameObject.rating !== parseInt(rating)) {
try {
const updated = await Game.findOneAndUpdate(
{ _id: gameObject.id },
{ rating: rating },
{ new: true }
).exec()
changes.push(
`${updated.name}: ${gameObject.rating} -> ${updated.rating}`
)
} catch (error) {
console.log(error)
}
}
})
await Promise.all(promises)
return changes
}
(The diff, for convenience:
9,10c9,10
< let jsonObj = require("../sample.json")
< Object.keys(jsonObj.items.item).forEach(async item => {
---
> const jsonObj = require("../sample.json")
> const promises = Object.keys(jsonObj.items.item).map(async item => {
33a34
> await Promise.all(promises)
)
EDIT: a further refactoring would be to use that array of promises for the change descriptions themselves. Basically changePromises is an array of Promises that resolve to a string or null (if there was no change), so a .filter with the identity function will filter out the falsy values.
This method also has the advantage that changes will be in the same order as the keys were iterated over; with the original code, there's no guarantee of order. That may or may not matter for your use case.
I also flipped the if/elses within the map function to reduce nesting; it's a matter of taste really.
Ps. That await Game.find({}) will be a problem when you have a large collection of games; see the cursor sketch after the refactored code below.
const parseJSON = async xmlData => {
const games = await Game.find({});
const gameObjects = games.map(game => new GameObject(game.name, game.id, game));
const jsonGames = require("../sample.json").items.item;
const changePromises = Object.keys(jsonGames).map(async item => {
const game = jsonGames[item];
const gameID = game["#_objectid"];
const rating = game.stats.rating["#_value"];
if (rating === "N/A") {
// Rating from data is N/A, we don't need to update anything.
return null;
}
const gameObject = gameObjects.find(game => game.bgg === parseInt(gameID)); // Array#find is synchronous, no await needed
if (!(gameObject && gameObject.rating !== parseInt(rating))) {
// Game not found or its rating is already correct; no change.
return null;
}
try {
const updated = await Game.findOneAndUpdate(
{ _id: gameObject.id },
{ rating: rating },
{ new: true },
).exec();
return `${updated.name}: ${gameObject.rating} -> ${updated.rating}`;
} catch (error) {
console.log(error);
}
});
// Await for the change promises to resolve, then filter out the `null`s.
return (await Promise.all(changePromises)).filter(c => c);
};
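On that last point: if the games collection grows large, a hedged alternative (untested; assumes the Game documents carry the BGG id in a field, here called bgg, matching GameObject) is to stream the collection with a Mongoose cursor instead of loading it all with find({}):

const parseJSONStreaming = async () => {
  const changes = [];
  const jsonGames = require("../sample.json").items.item;
  // Index the incoming ratings by objectid so each streamed doc is a cheap lookup
  const ratingById = {};
  for (const key of Object.keys(jsonGames)) {
    const game = jsonGames[key];
    const rating = game.stats.rating["#_value"];
    if (rating !== "N/A") ratingById[game["#_objectid"]] = parseInt(rating);
  }
  // for await consumes the query cursor one document at a time
  for await (const game of Game.find().cursor()) {
    const rating = ratingById[game.bgg];
    if (rating !== undefined && game.rating !== rating) {
      changes.push(`${game.name}: ${game.rating} -> ${rating}`);
      game.rating = rating;
      await game.save();
    }
  }
  return changes;
};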
Many blogs suggest switching to Cloud Firestore because it's easy and well secured. Coming from Realtime Database, back when using Functions + RTDB, it was easy to navigate through document triggers, e.g. via ref.parent.
My setup is like this:
Users
{userid}
last_seen: "data"
{forms}
{formid}
However, I have added a document trigger with onCreate, and I want to get the value of last_seen:
exports.updateUser = functions.firestore.document('users/{userId}/forms/{formid}').onCreate((snap, context) => {
const newValue = snap.data();
console.log("test value : " + newValue.test); // works
console.log("form id: " + context.params.formid); // works
console.log("user last seen : " + newValue.last_seen); // doesn't work, can't access the parent collection data
});
I totally get the confusion with the switch to Firestore but it's almost the exact same way in this case.
In realtime, you have the snapshot:
exports.doStuff = functions.database.ref('/users/{userId}/forms/{formId}')
.onCreate((snapshot, context) => {
const ref = snapshot.ref;
const userRef = ref.parent.parent;
return userRef.once('value').then(parentSnap => {
const user = parentSnap.val();
const lastSeen = user.last_seen;
});
});
In Firestore:
exports.doStuff = functions.firestore.document('/users/{userId}/forms/{formId}')
.onCreate((snapshot, context) => {
const ref = snapshot.ref;
const userRef = ref.parent.parent;
return userRef.get().then(parentSnap => {
const user = parentSnap.data();
const lastSeen = user.last_seen;
});
});
Another thing to consider is you are passing the userId in your params so you could just build your own DocumentReference (assuming you're also using firebaseAdmin)
functions.firestore.document('/users/{userId}/forms/{formId}')
.onCreate((snapshot, context) => {
const userId = context.params.userId;
const userRef = firebaseAdmin.firestore().collection('users').doc(userId);
return userRef.get().then(parentSnap => {
const user = parentSnap.data();
const lastSeen = user.last_seen;
});
});
It also allows you to decouple your logic into functions you may use often; consider it a "helper" method. (NOTE: I switched to async/await by accident; it's a bit cleaner.)
functions.firestore.document('/users/{userId}/forms/{formId}')
.onCreate(async (snapshot, context) => {
const userId = context.params.userId;
const lastSeen = await getLastSeen(userId);
});
// == Helper Functions ==-------------------
export async function getLastSeen(userId) {
if (!userId) return Promise.reject('no userId');
// User Ref
const userSnap = await firebaseAdmin.firestore().collection('users').doc(userId).get();
return userSnap.data().last_seen;
}
Now you can use getLastSeen() whenever you need it, and if you make a change you only have to adjust that one function. If it's not something you call often then don't worry about it, but I would consider maybe a getUser() helper...
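For instance, a hypothetical getUser() helper along the same lines (a sketch, not tested):

// Hypothetical companion helper: fetch the whole user document once,
// so callers can read any field, not just last_seen.
export async function getUser(userId) {
  if (!userId) return Promise.reject('no userId');
  const userSnap = await firebaseAdmin.firestore().collection('users').doc(userId).get();
  return userSnap.data();
}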
In your code, snap is a DocumentSnapshot type object. As you can see from the linked API documentation, there is a ref property on that object that gets you a DocumentReference object pointing to the document that was added. That object has parent property that gives you a CollectionReference that points to the collection where the document exists, which also has a parent property. So, use these properties to navigate around your database as needed.
Get the reference where the change took place, move 2 levels up, and fetch the data with get():
exports.updateUser = functions.firestore.document('users/{userId}/forms/{formid}').onCreate( async (snap, context) => {
// Get the reference of the document that was created
const changeRef = snap.ref;
// Move to the grandparent level (2 levels up): the user document
const userIdRef = changeRef.parent.parent;
// Fetch the data
const snapshot = await userIdRef.get();
// Get the field
const lastSeen = snapshot.data().last_seen;
// Do your stuff...
return null;
});
I'm having a tremendously tough time organizing the flow here, as I'm self-taught, so I'm wondering if someone might be able to assist.
var channelIds = ['XYZ','ABC','QRS']
var playlistIds = [];
var videoIds = [];
ORDER OF PROCESS
1. Get All Playlist IDs: If returning Get Request JSON contains nextPageToken run Get Request again with that page before going to (2)
2. Get All Video IDs: If returning Get Request JSON contains nextPageToken run Get Request again with that page before going to (3)
3. Aggregate into Final Array: I need put all in an array such as:
var ArrFinal = [{channelId,playlistID,videoId},{channelId,playlistID,videoId},{channelId,playlistID,videoId}];
I don't necessarily need someone to write the whole thing. I'm trying to better understand the most efficient way to know when the previous step is done, while also handling the nextPageToken iteration.
I'm not familiar with the YouTube API, but what you basically need is a get function for each endpoint. This function should also take care of the nextPageToken.
Something like this (not tested):
'use strict';
const Promise = require('bluebird');
const request = Promise.promisifyAll(require('request'));
const playlistEndpoint = '/youtube/v3/playlists';
const baseUrl = 'https://www.googleapis.com'
const channelIds = ['xy', 'ab', 'cd'];
const getPlaylist = async (channelId, pageToken, playlists) => {
const url = `${baseUrl}${playlistEndpoint}`;
const qs = {
channelId,
maxResults: 25,
pageToken
};
try {
const playlistRequest = await request.getAsync({ url, qs, json: true }); // json: true so the body is parsed JSON
const nextPageToken = playlistRequest.body.nextPageToken;
// if we already had items, combine with the new ones
const items = playlists ? playlists.concat(playlistRequest.body.items) : playlistRequest.body.items;
if (nextPageToken) {
// if token, do the same again and pass results to function
return getPlaylist(channelId, nextPageToken, items);
}
// if no token we are finished
return items;
}
catch (e) {
console.log(e.message);
}
};
const getVideos = async (playlistId, pageToken, videos) => {
// pretty much the same as above
}
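// A hedged guess at getVideos, mirroring getPlaylist above (the endpoint and
// params are assumptions; check the YouTube Data API docs for the real shapes):
const videosEndpoint = '/youtube/v3/playlistItems';
const getVideosSketch = async (playlistId, pageToken, videos) => {
  const url = `${baseUrl}${videosEndpoint}`;
  const qs = { playlistId, maxResults: 25, pageToken };
  try {
    const videoRequest = await request.getAsync({ url, qs, json: true });
    const nextPageToken = videoRequest.body.nextPageToken;
    const items = videos ? videos.concat(videoRequest.body.items) : videoRequest.body.items;
    // recurse while the API reports another page, otherwise we are finished
    return nextPageToken ? getVideosSketch(playlistId, nextPageToken, items) : items;
  }
  catch (e) {
    console.log(e.message);
  }
};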
async function awesome(channelIds) {
const fancyArray = [];
await Promise.map(channelIds, async (channelId) => {
const playlists = await getPlaylist(channelId);
const videos = await Promise.map(playlists, async (playlist) => {
const videos = await getVideos(playlist.id); // assumes each playlist object exposes its id
videos.forEach(videoId => {
fancyArray.push({ channelId, playlistId: playlist.id, videoId })
})
});
});
return fancyArray;
}
awesome(channelIds)
UPDATE:
This may be a lot of concurrent requests; you can limit them by using
Promise.map(items, item => { somefunction() }, { concurrency: 5 });
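Applied to the sketch above, that would look something like:

// Limit the channel fan-out to 5 requests in flight at a time
await Promise.map(channelIds, channelId => getPlaylist(channelId), { concurrency: 5 });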