I was attempting to fetch all documents from a collection in a Node.js environment. The documentation advises the following:
import * as admin from "firebase-admin";
const db = admin.firestore();
const citiesRef = db.collection('cities');
const snapshot = await citiesRef.get();
console.log(snapshot.size);
snapshot.forEach(doc => {
console.log(doc.id, '=>', doc.data());
});
I have 20 documents in the 'cities' collection. However, the logging statement for the snapshot size comes back as 0.
Why is that?
Edit: I can write to the Firestore without issue. I can also get details of a single document, for example:
const city = await citiesRef.doc("city-name").get();
console.log(city.id);
will log city-name to the console.
Ensure that Firebase has been initialized, and verify that the collection name matches your database exactly; hidden spaces and letter case can break the link to Firestore. One way to test this is to create a new document within the collection to validate the path:
db.collection('cities').doc("TEST").set({test:"value"}).catch(err => console.log(err));
This should create a document at the correct path, and the catch handler will surface any issues with Security Rules.
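If initialization is the problem, a minimal server-side setup looks like this (a sketch; applicationDefault() assumes the GOOGLE_APPLICATION_CREDENTIALS environment variable points at a service-account key):
import * as admin from "firebase-admin";
// Initialize exactly once per process. Inside Cloud Functions,
// admin.initializeApp() with no arguments also works.
admin.initializeApp({
  credential: admin.credential.applicationDefault(),
});
const db = admin.firestore();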
Update
To list all documents in a collection, you can use the Admin SDK's listDocuments() method from a server environment such as Cloud Functions, but this does not reduce the number of reads:
const documentReferences = await admin.firestore()
.collection('someCollection')
.listDocuments()
const documentIds = documentReferences.map(it => it.id)
To reduce reads, aggregate the data into the parent document or into a dedicated collection; this doubles the writes for any update, but it cuts the read count down to a minimum.
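As a rough sketch of that pattern (the collection and field names here are illustrative, not from the question): every write to a city document also updates a single summary document, so listing all cities later costs one read.
const batch = db.batch();
const cityRef = db.collection('cities').doc('tokyo');
const summaryRef = db.collection('aggregates').doc('cities');
batch.set(cityRef, { name: 'Tokyo' });
// Mirror only the fields needed for listings into the summary document.
batch.set(summaryRef, { tokyo: { name: 'Tokyo' } }, { merge: true });
await batch.commit();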
Related
I have a Cloud Function that uses the BigQuery client to run operations. Ultimately I'm looking to check whether a row with a specific id exists, so I built the following function (for now I'm just trying to make the request work and log the results):
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery({
projectId: 'myProjectID',
keyFilename: 'mykey.json'
});
await idExist('MyID--Ak2aRpL0','myProjectID.myDatasetID.myTableName')
async function idExist (id,table){
console.log('Checking if it exists')
const query = `SELECT * FROM \`${table}\` WHERE id='${id}'`;
const options = {
query: query,
location: table
}
const [job] = await bigquery.createQueryJob(options);
console.log(`Job '${job.id}' started.`);
const [rows] = await job.getQueryResults();
console.log('Resulted Rows:');
rows.forEach(row => console.log(row));
}
When I run this I get: Error: Location projectID.datasetId.tableName does not support this operation. at new ApiError (/node_modules/@google-cloud/common/build/src/util.js:75:15)
Things I tried:
Making sure that the service account I'm using has the "BigQuery Data Editor" and "BigQuery User" roles. Both should give me access to create jobs and manage tables.
Different combinations of the location value: 'datasetID.tableName' and 'tableName' only. All produced the same error.
Running the SAME query in the SQL workspace in the browser (with an account that has the Data Editor + User roles). This worked fine and retrieved the row perfectly.
Re-checked :) that the SA contains 'Data Editor' and 'User' roles
Edit: As suggested I also tried running SELECT 1 AS TEST and SELECT count(1) FROM myProjectID.myDatasetID.myTableName with the same result.
Notes:
The function is a Firebase function, and the service account is the firebase-adminsdk one; I added the BigQuery roles to it.
Inserting a row using the client works fine:
await bigquery.dataset('datasetID').table('tableName').insert(myObject);
Querying with the query() function works, so maybe it's an issue with creating jobs? I would prefer to use jobs here to keep the function from running too long.
const result = await bigquery.query(query);
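For reference, this is what the job-based call looks like with location set to a geographic region instead of the table path ('US' here is only an assumption about where the dataset lives), and with the id passed as a query parameter:
const options = {
  query: 'SELECT COUNT(1) AS n FROM `myProjectID.myDatasetID.myTableName` WHERE id = @id',
  params: { id: 'MyID--Ak2aRpL0' },
  location: 'US', // the dataset's geographic region, not the table path
};
const [job] = await bigquery.createQueryJob(options);
const [rows] = await job.getQueryResults();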
I would really appreciate some help/guidance on this issue. Let me know If I'm missing any helpful information.
Thanks!!
I observed a huge number of reads on my Firebase console, and I was wondering if this might come from my "referral function".
This function works perfectly fine, but I was wondering whether it could end up producing a huge number of reads as the app scales.
My question: does this function imply that every time a user comes in, it will account for a number of reads equal to the number of users in my collection?
Also, since this function is an onUpdate trigger, will it redo the work every time a document is updated?
I would not mind some resources on the topic, because I found it unclear on Firebase's website.
I hope my questions are clear!
Thank you very much!
export const onReferralInfoUpdate = functions.firestore
    .document('users/{userUid}')
.onUpdate(async (change, context) => {
const before = change.before.data();
const after = change.after.data();
const currentUserUid = after["uid"];
if (before.godfather_code == after.godfather_code){
console.log('Text did not change')
return null
}
const godfatherUserSnapshot = await db.collection('users').where("referral_code", "==", after.godfather_code).get();
const godfather = godfatherUserSnapshot.docs[0].data();
const godfatherUid = godfather["uid"];
const userRef = db.collection('users').doc(after.uid);
const godfather_code = after.godfather_code
await userRef.update({godfather_code})
console.log(`the text before was >> ${before.godfather_code} << and after is ${after.godfather_code}` )
let batch = db.batch();
const updateGodfather = db.collection('users').doc(godfatherUid);
batch.update(updateGodfather, {
reward: admin.firestore.FieldValue.increment(100),
godChildUid: admin.firestore.FieldValue.arrayUnion(currentUserUid),
});
return batch.commit();
});
Yes, the where("referral_code", "==", after.godfather_code).get() call will fetch all the documents matching the query every time the onUpdate() function triggers, and you'll be charged N reads (N = number of matched documents). The Admin SDK doesn't have any caching like the client SDKs do.
Does this function imply that every time a user comes in, it will account for a number of read equivalent to the number of users in my collection ?
No, not the number of documents in the users collection; only the documents matching your query, as mentioned.
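Since the function only ever reads godfatherUserSnapshot.docs[0], one way to cap the cost per trigger is to limit the query to a single match, e.g.:
const godfatherUserSnapshot = await db.collection('users')
    .where("referral_code", "==", after.godfather_code)
    .limit(1)
    .get();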
I have written a function which gets a QuerySnapshot of all documents changed within the past 24 hours in Firestore. I loop through this QuerySnapshot to get the relevant information, which I want to save into maps that are unique per user. Every user generates 10 documents a day on average, so every map gets written about 10 times. Now I'm wondering if the whole thing is scalable, or whether it will hit the 500-writes-per-transaction limit in Firebase as more users use the app.
The limit I'm speaking about is documented in the Google documentation.
Furthermore, I'm pretty sure that my code is really slow, so I'm thankful for every optimization.
exports.setAnalyseData = functions.pubsub
.schedule('every 24 hours')
.onRun(async (context) => {
const date = new Date().toISOString();
const convertedDate = date.split('T');
//Get documents (that could be way more than 500)
const querySnapshot = await admin.firestore().collectionGroup('exercises').where('lastModified', '>=', `${convertedDate[0]}`).get();
//iterate through documents
querySnapshot.forEach(async (doc) => {
//some calculations
//get document to store the calculated data
const oldRefPath = doc.ref.path.split('/trainings/');
const newRefPath = `${oldRefPath[0]}/exercises/`;
const document = await getDocumentSnapshotToSave(newRefPath, doc.data().exercise);
document.forEach(async (doc) => {
//check if value exists
const getDocument = await admin.firestore().doc(`${doc.ref.path}`).collection('AnalyseData').doc(`${year}`).get();
if (getDocument && getDocument.exists) {
await document.update({
//map filled with data which gets added to the existing map
})
} else {
await document.set({
//set document if it is not existing
}, {
merge: true
});
await document.update({
//update document after set
})
}
})
})
})
The code you have in your question does not use a transaction on Firestore, so it is not tied to the limit you quote/link.
I'd still recommend putting a limit on your query though, and processing the documents in batches (a couple of hundred per batch is reasonable) so that you don't put an unpredictable memory load on your code.
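A sketch of that batching idea, paging through the result set with a query cursor (the page size of 200 and the ISO-string cutoff are assumptions, not from the original code):
// assumes lastModified is stored as an ISO date string
const cutoff = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();
const pageSize = 200;
let last = null;
while (true) {
  let query = admin.firestore().collectionGroup('exercises')
      .where('lastModified', '>=', cutoff)
      .orderBy('lastModified')
      .limit(pageSize);
  if (last) query = query.startAfter(last);
  const page = await query.get(); // reads at most pageSize documents
  if (page.empty) break;
  for (const doc of page.docs) {
    // process doc as before
  }
  last = page.docs[page.docs.length - 1];
}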
When I create a new document in the note collection, I want to update the quantity in the info document. What am I doing wrong?
exports.addNote = functions.region('europe-west1').firestore
.collection('users/{userId}/notes').onCreate((snap,context) => {
const uid = admin.user.uid.toString();
var t;
db.collection('users').doc('{userId}').collection('info').doc('info').get((querySnapshot) => {
querySnapshot.forEach((doc) => {
t = doc.get("countMutable").toString();
});
});
let data = {
countMutable: t+1;
};
db.collection("users").doc(uid).collection("info").doc("info").update({countMutable: data.get("countMutable")});
});
You have... a lot going on here. A few problems:
You can't trigger Firestore functions on collections; you have to supply a document.
It isn't clear you're being consistent about how to treat the user id.
You aren't using promises properly (you need to chain them, and return them out of the function if you want them to execute properly).
I'm not clear about the relationship between the userId context parameter and the uid you are getting from the auth object. As far as I can tell, admin.user isn't actually part of the Admin SDK.
You risk multiple function calls doing an increment at the same time and giving inconsistent results, since you aren't using a transaction or the increment operation.
The document won't be created if it doesn't already exist. Maybe this is ok?
In short, this all means you can do this a lot more simply.
This should do it, though. I'm assuming the uid you want is the one on the document that triggered the update. If not, adjust as necessary.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();
exports.addNote = functions.firestore.document('users/{userId}/notes/{noteId}').onCreate((snap,context) => {
const uid = context.params.userId;
return db.collection("users").doc(uid).collection("info").doc("info").set({
countMutable: admin.firestore.FieldValue.increment(1)
}, { merge: true });
});
If you don't want to create the info document if it doesn't exist, and instead you want to get an error, you can use update instead of set:
return db.collection("users").doc(uid).collection("info").doc("info").update({
countMutable: admin.firestore.FieldValue.increment(1)
});
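If the logic ever needs to read the current value before writing (rather than doing a blind increment), a transaction keeps concurrent invocations consistent. A rough sketch:
return db.runTransaction(async (t) => {
  const infoRef = db.collection("users").doc(uid).collection("info").doc("info");
  const snap = await t.get(infoRef);
  const current = snap.exists ? (snap.get("countMutable") || 0) : 0;
  t.set(infoRef, { countMutable: current + 1 }, { merge: true });
});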
I am using Firebase Cloud Functions and the Firebase Realtime Database.
My database structure is:
-users
  -userid32
  -userid4734
    -flag=true
  -userid722
    -flag=false
  -userid324
I want to query only the users whose field 'flag' is true.
What I am doing currently is going over all the users and checking them one by one. But this is not efficient, because we have a lot of users in the database and it takes more than 10 seconds for the function to run:
const functions = require('firebase-functions');
const admin = require("firebase-admin");
admin.initializeApp(functions.config().firebase);
exports.test1 = functions.https.onRequest((request, response) => {
// Read Users from database
//
admin.database().ref('/users').once('value').then((snapshot) => {
var values = snapshot.val(),
current,
numOfRelevantUsers,
res = {}; // Result string
numOfRelevantUsers = 0;
// Traverse through all users to check whether the user is eligible to get discount.
for (const val in values)
{
current = values[val]; // Assign current user to avoid values[val] calls.
// Do something with the user
}
...
});
Is there a more efficient way to make this query and get only the relevant records? (and not getting all of them and checking one by one?)
You'd use a Firebase Database query for that:
admin.database().ref('/users')
.orderByChild('flag').equalTo(true)
.once('value').then((snapshot) => {
const numOfRelevantUsers = snapshot.numChildren();
When you need to loop over child nodes, please don't treat the resulting snapshot as an ordinary JSON object. While that may work here, it will give unexpected results when you order on a value with an actual range. Instead, use the built-in Snapshot.forEach() method:
snapshot.forEach(function(userSnapshot) {
console.log(userSnapshot.key, userSnapshot.val());
});
Note that all of this is fairly standard Firebase Database usage, so I recommend spending some extra time in the documentation for both the Web SDK and the Admin SDK for that.
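Putting those pieces together, a minimal version of the function could look like this (assuming an ".indexOn": "flag" rule on /users so the query is served from an index; without it the SDK logs a warning and filters on the client):
exports.test1 = functions.https.onRequest((request, response) => {
  return admin.database().ref('/users')
      .orderByChild('flag').equalTo(true)
      .once('value')
      .then((snapshot) => {
        const numOfRelevantUsers = snapshot.numChildren();
        snapshot.forEach((userSnapshot) => {
          console.log(userSnapshot.key, userSnapshot.val());
        });
        response.send(`Relevant users: ${numOfRelevantUsers}`);
      });
});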