I have an orders collection and a products collection in my application. A user can have multiple products in a single order. What I want to do is calculate the amount for each product by reading from the products collection and then perform further actions. Below is what I have as of now.
exports.myfunc = functions.firestore.document('collection/{collid}')
  .onCreate(event => {
    let data = event.data.data();
    const products = data.products;
    const prices = [];
    _.each(products, (data1, index) => {
      const weight = data1.weight;
      const isLess = data1.isLess;
      firebaseAdmin.firestore().collection('collection').doc(data1.productId).onSnapshot(data2 => {
        let amount = weight === '1/2' ? data2.data().price1 : data2.data().price1 * weight;
        amount += isLess ? 50 : 0;
        prices.push(amount);
      });
    });
    //Do some task after _.each with new total
  });
But I am not able to make this work synchronously, so that I can store the actual amount for each product against its order and calculate a total to store in the document.
Could anyone please tell me how to achieve the above scenario? How can I work with promises and then callbacks?
You can map the products array to promises, like this:
var productPromises = products.map(product => {
  return new Promise((resolve, reject) => {
    firebaseOperation()...onSnapshot(resolve)
  })
})

Promise.all(productPromises).then(results => {
  // process all results at once
})
First, don't use onSnapshot() with Cloud Functions. That attaches a listener that stays listening indefinitely, until you remove it. That's not what you want at all, because functions can't execute indefinitely.
Instead, use get(), which returns a promise that resolves when the fetch is complete.
Also, you could consider accumulating all the documents you want to access into an array and use getAll() (with the spread operator on the array) to fetch them all.
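For illustration, here is a minimal sketch of what the function from the question could look like with get() instead of onSnapshot(), keeping the same (older) functions API and the field names (weight, isLess, price1, productId) from the question; how the total is used at the end is just an assumption:

exports.myfunc = functions.firestore.document('collection/{collid}')
  .onCreate(event => {
    const data = event.data.data();
    const products = data.products;

    // One get() per product: a one-time read that resolves with a DocumentSnapshot.
    const pricePromises = products.map(product => {
      return firebaseAdmin.firestore()
        .collection('collection')
        .doc(product.productId)
        .get()
        .then(snap => {
          let amount = product.weight === '1/2'
            ? snap.data().price1
            : snap.data().price1 * product.weight;
          amount += product.isLess ? 50 : 0;
          return amount;
        });
    });

    // Return the chain so Cloud Functions waits for all the reads to finish.
    return Promise.all(pricePromises).then(prices => {
      const total = prices.reduce((sum, price) => sum + price, 0);
      // ...store the per-product amounts and the total here...
      return null;
    });
  });

With getAll(), the document references could instead be collected into an array and fetched in one round trip, e.g. firebaseAdmin.firestore().getAll(...refs).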
Related
I have a collection of teams containing around 80 000 documents. Every Monday I would like to reset the scores of every team using firebase cloud functions. This is my function:
exports.resetOrgScore = functions.runWith(runtimeOpts).pubsub.schedule("every monday 00:00").timeZone("Europe/Oslo").onRun(async (context) => {
  let batch = admin.firestore().batch();
  let count = 0;
  let overallCount = 0;
  const orgDocs = await admin.firestore().collection("teams").get();
  orgDocs.forEach(async (doc) => {
    batch.update(doc.ref, { score: 0.0 });
    if (++count >= 500 || ++overallCount >= orgDocs.docs.length) {
      await batch.commit();
      batch = admin.firestore().batch();
      count = 0;
    }
  });
});
I tried running the function on a smaller collection of 10 documents and it works fine, but when running the function on the "teams" collection it returns "Cannot modify a WriteBatch that has been committed". I tried returning the promise like this (code below) but that doesn't fix the problem. Thanks in advance :)
return await batch.commit().then(function () {
  batch = admin.firestore().batch();
  count = 0;
  return null;
});
There are three problems in your code:
You use async/await with forEach(), which is not recommended: the problem is that the callback passed to forEach() is not awaited, see more explanations here or here.
As the error says, you "Cannot modify a WriteBatch that has been committed". Because the forEach() callbacks are not awaited, some iterations keep calling batch.update() on a batch that has already been committed, before batch = admin.firestore().batch() has replaced it, and that is exactly what the error complains about.
Just as important, you don't return the promise returned by the asynchronous methods. See here for more details.
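To illustrate those three points before the recommended solution below, here is a minimal sketch (an assumption about how the original function could be restructured, not the approach used further down) that awaits the commits with a plain loop, starts a fresh batch only after the previous one has been committed, and returns the resulting promise:

exports.resetOrgScore = functions.runWith(runtimeOpts).pubsub
  .schedule("every monday 00:00")
  .timeZone("Europe/Oslo")
  .onRun(async (context) => {
    const orgDocs = await admin.firestore().collection("teams").get();

    let batch = admin.firestore().batch();
    let count = 0;

    // for...of plays well with await; forEach() does not.
    for (const doc of orgDocs.docs) {
      batch.update(doc.ref, { score: 0.0 });
      if (++count >= 500) {
        await batch.commit();               // wait for the commit...
        batch = admin.firestore().batch();  // ...then start a brand new batch
        count = 0;
      }
    }
    if (count > 0) {
      await batch.commit();                 // commit the last, partially filled batch
    }
    return null;                            // onRun() gets the async function's promise
  });

Note that this still loads all ~80,000 documents with a single get(), which can be slow and memory-hungry; the query-limited, recursive approach below avoids that.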
You'll find in the doc (see the Node.js tab) code that deletes all the docs of a collection by recursively using a batch. It is easy to adapt it to update the docs instead, as follows. Note that we use a dateUpdated flag to select the docs for each new batch: with the original code the docs were deleted, so no flag was needed...
const runtimeOpts = {
  timeoutSeconds: 540,
  memory: '1GB',
};

exports.resetOrgScore = functions
  .runWith(runtimeOpts)
  .pubsub
  .schedule("every monday 00:00")
  .timeZone("Europe/Oslo")
  .onRun((context) => {
    return new Promise((resolve, reject) => {
      updateQueryBatch(resolve).catch(reject);
    });
  });

async function updateQueryBatch(resolve) {
  const db = admin.firestore();

  const snapshot = await db
    .collection('teams')
    .where('dateUpdated', '==', "20210302")
    .orderBy('__name__')
    .limit(499)
    .get();

  const batchSize = snapshot.size;
  if (batchSize === 0) {
    // When there are no documents left, we are done
    resolve();
    return;
  }

  // Update the documents in a batch
  const batch = db.batch();
  snapshot.docs.forEach((doc) => {
    batch.update(doc.ref, { score: 0.0, dateUpdated: "20210303" });
  });
  await batch.commit();

  // Recurse on the next process tick, to avoid
  // exploding the stack.
  process.nextTick(() => {
    updateQueryBatch(resolve);
  });
}
Note that the above Cloud Function is configured with the maximum value for the timeout, i.e. 9 minutes.
If it appears that all your docs cannot be updated within 9 minutes, you will need to find another approach, for example using the Admin SDK from one of your servers, or cutting the work into pieces and running the CF several times.
Per the Firebase Cloud Functions docs, "Events are delivered at least once, but a single event may result in multiple function invocations. Avoid depending on exactly-once mechanics, and write idempotent functions."
Looking at the Firestore Cloud Function doc example below of a restaurant rating, an incremented counter is used to calculate the total number of ratings. What are some of the best ways to maintain the accuracy of this counter using an idempotent function?
Is it reasonable to store the context.eventId in a subcollection document field and only execute the function if the new context.eventId is different?
function addRating(restaurantRef, rating) {
  // Create a reference for a new rating, for use inside the transaction
  var ratingRef = restaurantRef.collection('ratings').doc();

  // In a transaction, add the new rating and update the aggregate totals
  return db.runTransaction((transaction) => {
    return transaction.get(restaurantRef).then((res) => {
      if (!res.exists) {
        throw "Document does not exist!";
      }

      // Compute new number of ratings
      var newNumRatings = res.data().numRatings + 1;

      // Compute new average rating
      var oldRatingTotal = res.data().avgRating * res.data().numRatings;
      var newAvgRating = (oldRatingTotal + rating) / newNumRatings;

      // Commit to Firestore
      transaction.update(restaurantRef, {
        numRatings: newNumRatings,
        avgRating: newAvgRating
      });
      transaction.set(ratingRef, { rating: rating });
    });
  });
}
Is it reasonable to store the context.eventId in a subcollection document field and only execute the function if the new context.eventId is different?
Yes, for your use case, using the Cloud Function eventId is the best solution to make your Cloud Function idempotent. I guess you have already watched this Firebase video.
In the Firebase doc from which you took the code in your question, you will find, at the bottom, similar code for a Cloud Function. I've adapted this code as follows, in order to check whether a doc with ID = eventId exists in a dedicated ratingUpdateIds subcollection:
exports.aggregateRatings = functions.firestore
  .document('restaurants/{restId}/ratings/{ratingId}')
  .onWrite(async (change, context) => {
    try {
      // Get value of the newly added rating
      const ratingVal = change.after.data().rating;
      const ratingUpdateId = context.eventId;

      // Get a reference to the restaurant
      const restRef = db.collection('restaurants').doc(context.params.restId);

      // Get a reference to the ratingUpdateId doc
      const ratingUpdateIdRef = restRef.collection("ratingUpdateIds").doc(ratingUpdateId);

      // Update aggregations in a transaction
      await db.runTransaction(async (transaction) => {
        const ratingUpdateIdDoc = await transaction.get(ratingUpdateIdRef);
        if (ratingUpdateIdDoc.exists) {
          // The CF is retried
          throw "The CF is being retried";
        }

        const restDoc = await transaction.get(restRef);

        // Compute new number of ratings
        const newNumRatings = restDoc.data().numRatings + 1;

        // Compute new average rating
        const oldRatingTotal = restDoc.data().avgRating * restDoc.data().numRatings;
        const newAvgRating = (oldRatingTotal + ratingVal) / newNumRatings;

        // Update restaurant info and set ratingUpdateIdDoc
        transaction
          .update(restRef, {
            avgRating: newAvgRating,
            numRatings: newNumRatings
          })
          .set(ratingUpdateIdRef, { ratingUpdateId });
      });

      return null;
    } catch (error) {
      console.log(error);
      return null;
    }
  });
PS: I made the assumption that the Cloud Function eventId can be used as a Firestore document ID. I didn't find any doc or info stating the opposite. In case using the eventId as an ID would be a problem, since you execute the transaction in a Cloud Function (and therefore use the Admin SDK), you could query the document based on a field value (where you store the eventId) instead of getting it through a Reference based on its ID.
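As a minimal sketch of that alternative (the helper name assertNotRetried is hypothetical, and restRef and the ratingUpdateIds subcollection are the same as in the code above):

// Hypothetical helper: throws if a doc already records this eventId.
// Admin SDK transactions can also read queries with transaction.get().
async function assertNotRetried(transaction, restRef, eventId) {
  const dupQuery = restRef
    .collection('ratingUpdateIds')
    .where('ratingUpdateId', '==', eventId)
    .limit(1);
  const dupSnap = await transaction.get(dupQuery);
  if (!dupSnap.empty) {
    throw "The CF is being retried";
  }
}

It would be called at the start of the transaction, and the event would then be recorded with transaction.set(restRef.collection('ratingUpdateIds').doc(), { ratingUpdateId: context.eventId }) instead of using the eventId as the document ID.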
I have a master collection in firestore with a couple hundred documents (which will grow to a few thousand in a couple of months).
I have a use case, where every time a new user document is created in /users/ collection, I want all the documents from the master to be copied over to /users/{userId}/.
To achieve this, I have created a firebase cloud function as below:
// setup for new user
exports.setupCollectionForUser = functions.firestore
  .document('users/{userId}')
  .onCreate((snap, context) => {
    const userId = context.params.userId;
    db.collection('master').get().then(snapshot => {
      if (snapshot.empty) {
        console.log('no docs found');
        return;
      }
      snapshot.forEach(function(doc) {
        return db.collection('users').doc(userId).collection('slave').doc(doc.get('uid')).set(doc.data());
      });
    });
  });
This works; the only problem is that it takes forever (~3-5 minutes) for only about 200 documents. This has been such a bummer because a lot depends on how fast these documents get copied over. I was hoping this would take no more than a few seconds at most. Also, the documents show up all at once and not as they are written, or at least they seem to.
Am I doing anything wrong? Why should it take so long?
Is there a way I can break this operation into multiple reads and writes so that I can guarantee a minimum number of documents within a few seconds and not wait until all of them are copied over?
Please advise.
If I am not mistaken, correctly managing the parallel writes with Promise.all() and returning the promise chain should normally improve the speed.
Try to adapt your code as follows:
exports.setupCollectionForUser = functions.firestore
  .document('users/{userId}')
  .onCreate((snap, context) => {
    const userId = context.params.userId;
    return db.collection('master').get().then(snapshot => {
      if (snapshot.empty) {
        console.log('no docs found');
        return null;
      } else {
        const promises = [];
        const slaveRef = db.collection('users').doc(userId).collection('slave');
        snapshot.forEach(doc => {
          promises.push(slaveRef.doc(doc.get('uid')).set(doc.data()));
        });
        return Promise.all(promises);
      }
    });
  });
I would suggest you watch the 3 videos about "JavaScript Promises" from the Firebase video series which explain why it is key to return a Promise or a value in a background triggered Cloud Function.
Note also that if you are sure you have fewer than 500 documents to save in the slave collection, you could use a batched write, as sketched below. (You could use it for more than 500 docs, but then you would have to manage several successive batched writes...)
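A minimal sketch of that batched variant, reusing the names from the code above and assuming the master collection holds fewer than 500 documents:

return db.collection('master').get().then(snapshot => {
  if (snapshot.empty) {
    console.log('no docs found');
    return null;
  }
  const batch = db.batch();
  const slaveRef = db.collection('users').doc(userId).collection('slave');
  snapshot.forEach(doc => {
    // Queue every copy in the batch instead of issuing individual set() calls.
    batch.set(slaveRef.doc(doc.get('uid')), doc.data());
  });
  return batch.commit();  // one atomic commit for all the copies
});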
I am working on a Node.js application which uses the WordPress JSON API as a kind of headless CMS. When the application spins up, we query out to the WP database and pull in the information we need (using Axios), manipulate it, and store it temporarily.
Simple enough - but one of our post categories in the CMS has a rather large number of entries. For some godforsaken reason, WordPress has capped the API request limit to a maximum of 99 posts at a time, and requires that we write a loop that can send concurrent API requests until all the data has been pulled.
For instance, if we have 250 posts of some given type, I need to hit that route three separate times, specifying the specific "page" of data I want each time.
Per the docs, https://developer.wordpress.org/rest-api/using-the-rest-api/pagination/, I have access to a ?page= query string that I can use to send these requests concurrently. (i.e. ...&page=2)
I also have access to X-WP-Total in the headers object, which gives me the total number of posts within the given category.
However, these API calls are part of a nested promise chain, and the whole process needs to return a promise I can continue chaining off of.
The idea is to make it dynamic so it will always pull all of the data, and return it as one giant array of posts. Here's what I have, which is functional:
const request = require('axios');

module.exports = (request_url) => new Promise((resolve, reject) => {
  // START WITH SMALL ARBITRARY REQUEST TO GET TOTAL NUMBER OF POSTS FAST
  request.get(request_url + '&per_page=1').then(
    (apiData) => {
      // SETUP FOR PROMISE.ALL()
      let promiseArray = [];

      // COMPUTE HOW MANY REQUESTS WE NEED
      // ALWAYS ROUND TOTAL NUMBER OF PAGES UP TO GET ALL THE DATA
      const totalPages = Math.ceil(apiData.headers['x-wp-total'] / 99);

      for (let i = 1; i <= totalPages; i++) {
        promiseArray.push(request.get(`${request_url}&per_page=99&page=${i}`));
      }

      resolve(
        Promise.all(promiseArray)
          .then((resolvedArray) => {
            // PUSH IT ALL INTO A SINGLE ARRAY
            let compiledPosts = [];
            resolvedArray.forEach((axios_response) => {
              // AXIOS MAKES US ACCESS W/RES.DATA
              axios_response.data.forEach((post) => {
                compiledPosts.push(post);
              });
            });
            // RETURN AN ARRAY OF ALL POSTS REGARDLESS OF LENGTH
            return compiledPosts;
          })
          .catch((e) => { console.log('ERROR'); reject(e); })
      );
    }
  ).catch((e) => { console.log('ERROR'); reject(e); });
});
Any creative ideas to make this pattern better?
I have exactly the same question. In my case, I use Vue Resource:
this.$resource('wp/v2/media').query().then((response) => {
  let pagesNumber = Math.ceil(response.headers.get('X-WP-TotalPages'));
  for (let i = 1; i <= pagesNumber; i++) {
    this.$resource('wp/v2/media?page=' + i).query().then((response) => {
      this.medias.push(response.data);
      this.medias = _.flatten(this.medias);
      console.log(this.medias);
    });
  }
});
I'm pretty sure there is a better workaround to achieve this.
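For what it's worth, the same Promise.all() pattern from the question reads a bit more cleanly with async/await; a sketch, assuming axios and the &per_page=/&page= query strings and X-WP-Total header described above:

const axios = require('axios');

// Fetch every page of posts concurrently and merge them into one array.
async function fetchAllPosts(requestUrl) {
  // Tiny first request just to read the X-WP-Total header.
  const first = await axios.get(`${requestUrl}&per_page=1`);
  const totalPages = Math.ceil(first.headers['x-wp-total'] / 99);

  const pageRequests = [];
  for (let page = 1; page <= totalPages; page++) {
    pageRequests.push(axios.get(`${requestUrl}&per_page=99&page=${page}`));
  }

  const responses = await Promise.all(pageRequests);
  // Each axios response keeps its posts in .data; flatten them into a single array.
  return responses.flatMap(response => response.data);
}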
I am using the viewer.getProperties(dbId, onSuccessCallback, onErrorCallback) method in order to get properties for objects in my viewer. I want to run the method for all selected objects, extract a subset of the properties for each object, and present the subsets in a table.
var subsets = [];
var selectFunctions = [];

handleSelection(selection, addProps, onError);

function handleSelection(selection, onSuccess, onError) {
  for (var i = 0; i < selection.length; i++)
    selectFunctions.push(_viewer.getProperties(selection[i], onSuccess, onError));
}

function addProps(data) {
  var props = [];
  for (var prop in data.properties) {
    //Add property to props if some condition is true...
  }
  subsets.push(props);
}

Promise.all(_selectFunctions).then(function () {
  console.log("Handled all selections");
  //Add subsets to table...
}).catch(function (error) {
  console.log("ERRROR");
});
Since getProperties is running asynchronously I am not able to wait for all objects before the table is updated. The table is updated with one object at a time, and we would rather update all at once. Blocking IO is not a problem.
As the code shows, I have been looking into Promise.all() from bluebird.js in order to control execution and wait for all getProperties calls to return, but so far without success.
Regards,
Torjus
This question is actually unrelated to the use of the Viewer; you would need to look at some documentation on how to use Promises in order to wait for the completion of multiple requests running in parallel.
Here is some pseudo-code that may help you (ES6 syntax); I'm skipping error handling for the sake of clarity:
// wrap the async method in a promise so you can wait for its completion
const getPropertiesAsync = (id) => {
  return new Promise((resolve, reject) => {
    _viewer.getProperties(id, (result) => {
      resolve(result)
    }, (error) => {
      reject(error)
    })
  })
}

// create an array of asynchronous tasks for each component you want to get props on
const propTasks = componentIds.map((id) => {
  return getPropertiesAsync(id)
})

// promise version
Promise.all(propTasks).then((results) => {
  // populate table with results
})

// OR async ES7 syntax
const results = await Promise.all(propTasks)
// populate table with results
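And a possible way to tie it back to the selection handling from the question (buildTableForSelection is just an illustrative name, and the filtering condition is left as a placeholder):

// Wait for every getProperties() call before touching the table.
async function buildTableForSelection(selection) {  // selection: array of dbIds, as in the question
  const results = await Promise.all(selection.map(getPropertiesAsync));

  const subsets = results.map(result =>
    result.properties.filter(prop => {
      // ...keep the property if some condition is true...
      return true;
    })
  );

  // ...add subsets to the table here, all at once...
  return subsets;
}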
Here is an article I wrote about using async/await with the viewer, but since the topic is much broader you should be able to find a lot more documentation by looking over the web by yourself:
Getting rid of JavaScript callbacks using async/await
Hope that helps