How to read data for multiple user ids in a single request in Cloud Functions on Firebase - node.js

I am using Firebase Database for my Android application and Cloud Functions. Firebase stores data in JSON format, and each user is assigned a unique id.
I want to know if there is any way to get multiple users' data in a single request. The code below fetches data only for the single user id passed in:
db
  .ref("/users/" + request.query.userId)
  .once("value")
  .then(function(snapshot) {
I have 10 user ids whose data I want to fetch in a single request in Node.js. What is the way to do this?

Store them in an array like this:
var promArr = []
var snapshotData = []
userIds.forEach(function(userId) { // your array of unique user ids
  promArr.push(
    db
      .ref("/users/" + userId)
      .once("value")
      .then(function(snapshot) {
        snapshotData.push(snapshot.val())
      })
  )
})
return Promise.all(promArr).then(function() {
  // loop through snapshotData here
})

This can be written much more cleanly:
const records = [
  admin.database().ref(`users/${userId1}`).once('value'),
  admin.database().ref(`users/${userId2}`).once('value'),
];
return Promise.all(records).then(snapshots => {
  const record1 = snapshots[0].val();
  const record2 = snapshots[1].val();
});
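With the ids in an array, the same pattern handles any number of users by mapping ids to reads. Below is a minimal runnable sketch; the in-memory `db` stub, the `fakeData`, and the ids are made up for illustration, and with the real SDK you would use `admin.database().ref(...)` instead:

```javascript
// Plain in-memory stub standing in for admin.database():
// ref(path).once('value') resolves to a snapshot-like object with val().
const fakeData = {
  'users/u1': { name: 'Alice' },
  'users/u2': { name: 'Bob' },
};
const db = {
  ref: (path) => ({
    once: () => Promise.resolve({ val: () => fakeData[path] || null }),
  }),
};

// Map each id to a read, then wait for all the reads at once.
function fetchUsers(userIds) {
  const reads = userIds.map((id) => db.ref(`users/${id}`).once('value'));
  return Promise.all(reads).then((snapshots) =>
    snapshots.map((snap) => snap.val())
  );
}

fetchUsers(['u1', 'u2']).then((users) => {
  console.log(users); // [ { name: 'Alice' }, { name: 'Bob' } ]
});
```

The results come back in the same order as the ids, so `users[i]` always belongs to `userIds[i]`.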

Related

Google Cloud Functions Firestore Limitations

I have written a function that gets a QuerySnapshot of all documents changed in the past 24 hours in Firestore. I loop through this QuerySnapshot to get the relevant information from the docs, which I want to save into maps that are unique per user. Every user generates on average 10 documents a day, so every map gets written about 10 times on average. Now I'm wondering whether the whole thing is scalable, or whether it will hit the 500-writes-per-transaction limit in Firestore as more users use the app.
The limitation I'm speaking about is documented in the Google documentation.
Furthermore, I'm pretty sure my code is really slow, so I'm thankful for every optimization.
exports.setAnalyseData = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async (context) => {
    const date = new Date().toISOString();
    const convertedDate = date.split('T');
    // Get documents (that could be way more than 500)
    const querySnapshot = await admin.firestore().collectionGroup('exercises').where('lastModified', '>=', `${convertedDate}`).get();
    // iterate through documents
    querySnapshot.forEach(async (doc) => {
      // some calculations
      // get document to store the calculated data
      const oldRefPath = doc.ref.path.split('/trainings/');
      const newRefPath = `${oldRefPath[0]}/exercises/`;
      const document = await getDocumentSnapshotToSave(newRefPath, doc.data().exercise);
      document.forEach(async (doc) => {
        // check if value exists
        const getDocument = await admin.firestore().doc(`${doc.ref.path}`).collection('AnalyseData').doc(`${year}`).get();
        if (getDocument && getDocument.exists) {
          await document.update({
            // map filled with data which gets added to the existing map
          })
        } else {
          await document.set({
            // set document if it is not existing
          }, {
            merge: true
          });
          await document.update({
            // update document after set
          })
        }
      })
    })
  })
The code you have in your question does not use a transaction on Firestore, so it is not subject to the limit you quote/link.
I'd still recommend putting a limit on your query, though, and processing the documents in reasonable batches (a couple of hundred being reasonable) so that you don't put an unpredictable memory load on your code.
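The batching advice can be sketched as a generic cursor loop. In Firestore the page fetch would be a query using `.orderBy(...)`, `.startAfter(cursor)` and `.limit(batchSize)`; here an in-memory page function stands in so the sketch is self-contained and runnable:

```javascript
// Generic batched processing: fetch pages of up to `batchSize` items until
// a short page signals the end. `fetchPage(cursor, batchSize)` stands in
// for a Firestore query with .orderBy(...).startAfter(cursor).limit(batchSize).
async function processInBatches(fetchPage, handle, batchSize) {
  let cursor = null;
  for (;;) {
    const page = await fetchPage(cursor, batchSize);
    for (const item of page) await handle(item);
    if (page.length < batchSize) break; // short page: no more data
    cursor = page[page.length - 1];     // last item becomes the next cursor
  }
}

// Demo with an in-memory "collection" of 5 items and a batch size of 2.
const data = [1, 2, 3, 4, 5];
const fetchPage = async (cursor, n) => {
  const start = cursor === null ? 0 : data.indexOf(cursor) + 1;
  return data.slice(start, start + n);
};

const seen = [];
processInBatches(fetchPage, async (x) => seen.push(x), 2)
  .then(() => console.log(seen)); // [ 1, 2, 3, 4, 5 ]
```

With a real Firestore query the cursor would be the last `DocumentSnapshot` of the page; the loop shape stays the same, and memory use is bounded by the batch size rather than the full result set.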

How to update a quantity in another document when creating a new document in the firebase firestore collection?

When I create a new document in the note collection, I want to update the quantity in the info document. What am I doing wrong?
exports.addNote = functions.region('europe-west1').firestore
  .collection('users/{userId}/notes').onCreate((snap, context) => {
    const uid = admin.user.uid.toString();
    var t;
    db.collection('users').doc('{userId}').collection('info').doc('info').get((querySnapshot) => {
      querySnapshot.forEach((doc) => {
        t = doc.get("countMutable").toString();
      });
    });
    let data = {
      countMutable: t+1;
    };
    db.collection("users").doc(uid).collection("info").doc("info").update({countMutable: data.get("countMutable")});
  });
You have... a lot going on here. A few problems:
You can't trigger firestore functions on collections, you have to supply a document.
It isn't clear you're being consistent about how to treat the user id.
You aren't using promises properly (you need to chain them, and return them out of the function if you want them to execute properly).
I'm not clear about the relationship between the userId context parameter and the uid you are getting from the auth object. As far as I can tell, admin.user isn't actually part of the Admin SDK.
You risk multiple function calls doing an increment at the same time giving inconsistent results, since you aren't using a transaction or the increment operation. (Learn More Here)
The document won't be created if it doesn't already exist. Maybe this is ok?
In short, this all means you can do this a lot more simply.
This should do it, though. I'm assuming that the uid you actually want is the one on the document that is triggering the update. If not, adjust as necessary.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

exports.addNote = functions.firestore.document('users/{userId}/notes/{noteId}').onCreate((snap, context) => {
  const uid = context.params.userId;
  return db.collection("users").doc(uid).collection("info").doc("info").set({
    countMutable: admin.firestore.FieldValue.increment(1)
  }, { merge: true });
});
If you don't want to create the info document if it doesn't exist, and instead you want to get an error, you can use update instead of set:
return db.collection("users").doc(uid).collection("info").doc("info").update({
  countMutable: admin.firestore.FieldValue.increment(1)
});
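To see why the atomic increment matters: with a plain read-then-write, two concurrent invocations can both read the same old value, and one update is lost. A toy simulation with a plain object (no Firestore involved, the variable names are made up):

```javascript
// Toy simulation of the lost-update problem: a plain object stands in
// for the info document.
const doc = { countMutable: 0 };

// Two "concurrent" invocations both read before either writes.
const readA = doc.countMutable; // 0
const readB = doc.countMutable; // 0
doc.countMutable = readA + 1;   // writes 1
doc.countMutable = readB + 1;   // also writes 1 - the first update is lost

console.log(doc.countMutable);  // 1, not 2

// An atomic increment (what FieldValue.increment(1) gives you server-side)
// applies each update to the current value, so nothing is lost.
const doc2 = { countMutable: 0 };
const increment = (d, n) => { d.countMutable += n; };
increment(doc2, 1);
increment(doc2, 1);
console.log(doc2.countMutable); // 2
```

This is exactly the race the original read-then-update code is exposed to; `FieldValue.increment` (or a transaction) moves the read-modify-write to the server, where it is applied atomically.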

Datastore Query filters accumulating on subsequent calls

I am making a RESTful API. I have the following endpoint
/users/getByState/stateId
I have several users in the database. 3 users are from say, Texas, 1 from New York.
When I call the endpoint the first time, either by
/user/getByState/tx
or by
/user/getByState/ny
I get a result, but calling the endpoint immediately afterwards with the other state id returns an empty array and the message {"moreResults": "NO_MORE_RESULTS"}
Sending the query to the console log shows me that the first time the query only has one filter.. lets say
FILTER:{stateId:'tx'}
But the second time, instead of changing the filter option to 'ny', it instead adds another filter, so now in the console log I see
FILTER:{stateId:'tx'}
FILTER:{stateId:'ny'}
which will obviously always return an empty array, because it will never find stateId='tx' AND stateId='ny'.
I don't understand why the cloud datastore client is adding a filter to a CONST!!
If I call the endpoint 7 times, I see 7 filters. Only after I redeploy the filters "clear"
How do I clear the filters before running the query again? I have searched the Cloud Datastore documentation, but there is no information about filters accumulating after each call.
Am I missing something? This is my code:
const Datastore = require('@google-cloud/datastore');
const datastore = Datastore();
const query = datastore.createQuery("user");

exports.get_user_by_state = (req, res, next) => {
  const pageCursor = req.query.cursor;
  const userState = req.params.stateId;
  const selectQuery = query
    .filter('stateId', userState);
  console.log(selectQuery);
  selectQuery.run({cache: false})
    .then((results) => {
      res.json(results);
    })
    .catch(err => res.status(500).json(err));
}
You need a new query object for each and every query, so the query object should be created inside the handler.
const Datastore = require('@google-cloud/datastore');
const datastore = Datastore();

exports.get_user_by_state = (req, res, next) => {
  const pageCursor = req.query.cursor;
  const query = datastore.createQuery("user");
  const userState = req.params.stateId;
  const selectQuery = query
    .filter('stateId', userState);
  console.log(selectQuery);
  selectQuery.run({cache: false})
    .then((results) => {
      res.json(results);
    })
    .catch(err => res.status(500).json(err));
}
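The underlying cause is that the query builder mutates the object it is called on: each `filter()` call appends to the query's internal filter list and returns the same object rather than a copy, so a module-level `const` query accumulates one filter per request. A tiny stub (an assumption-level sketch of the real builder's behaviour, not the actual library code) makes this visible:

```javascript
// Minimal stand-in for a mutating query builder like Datastore's:
// filter() appends to an internal list and returns the same object.
function createQuery(kind) {
  return {
    kind,
    filters: [],
    filter(field, value) {
      this.filters.push({ field, value });
      return this; // same object, not a copy
    },
  };
}

const query = createQuery('user'); // module-level const, shared across calls

// Two "requests" against the shared query:
query.filter('stateId', 'tx');
query.filter('stateId', 'ny');
console.log(query.filters.length); // 2 - filters accumulate on the shared object

// A fresh query per request avoids the accumulation:
const fresh = createQuery('user').filter('stateId', 'ny');
console.log(fresh.filters.length); // 1
```

Note that `const` only prevents rebinding the variable; it does not make the object immutable, which is why the shared query keeps growing until a redeploy recreates the module.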

How to store documents in an ArangoDb graph using ArangoJs?

I am using the latest version of ArangoDB and ArangoJS from a Node.js application. I have the following two vertex collections:
users
tokens
The tokens vertex collection contains the security tokens issued to users in the users collection. I have an edge definition named token_belongs_to connecting tokens to users.
How do I store a newly generated token belonging to an existing user using ArangoJS?
I am going to assume you are using ArangoDB 2.7 with the latest version of arangojs (4.1 at the time of this writing) as the API has changed a bit since the 3.x release of the driver.
As you don't mention using the Graph API the easiest way is to simply use the collections directly. Using the Graph API however adds benefits like orphaned edges automatically being deleted when any of their vertices are deleted.
First you need to get a reference to each collection you want to work with:
var users = db.collection('users');
var tokens = db.collection('tokens');
var edges = db.edgeCollection('token_belongs_to');
Or if you are using the Graph API:
var graph = db.graph('my_graph');
var users = graph.vertexCollection('users');
var tokens = graph.vertexCollection('tokens');
var edges = graph.edgeCollection('token_belongs_to');
In order to create a token for an existing user, you need to know the _id of the user. The _id of a document is made up of the collection name (users) and the _key of the document (e.g. 12345678).
If you don't have the _id or _key you can also look up the document by some other unique attribute. For example, if you have a unique attribute email that you know the value of, you could do this:
users.firstExample({email: 'admin@example.com'})
  .then(function (doc) {
    var userId = doc._id;
    // more code goes here
  });
Next you'll want to create the token:
tokens.save(tokenData)
  .then(function (meta) {
    var tokenId = meta._id;
    // more code goes here
  });
Once you have the userId and tokenId you can create the edge to define the relation between the two:
edges.save(edgeData, userId, tokenId)
  .then(function (meta) {
    var edgeId = meta._id;
    // more code goes here
  });
If you don't want to store any data on the edge you can substitute an empty object for edgeData or simply write it as:
edges.save({_from: userId, _to: tokenId})
  .then(...);
So the full example would go something like this:
var graph = db.graph('my_graph');
var users = graph.vertexCollection('users');
var tokens = graph.vertexCollection('tokens');
var edges = graph.edgeCollection('token_belongs_to');
Promise.all([
  users.firstExample({email: 'admin@example.com'}),
  tokens.save(tokenData)
])
  .then(function (args) {
    var userId = args[0]._id;  // result from first promise
    var tokenId = args[1]._id; // result from second promise
    return edges.save({_from: userId, _to: tokenId});
  })
  .then(function (meta) {
    var edgeId = meta._id;
    // Edge has been created
  })
  .catch(function (err) {
    console.error('Something went wrong:', err.stack);
  });
Attention - syntax changes:
Edge creation:
const { Database, CollectionType } = require('arangojs');
const db = new Database();
const collection = db.collection("collection_name");
if (!(await collection.exists()))
  await collection.create({ type: CollectionType.EDGE_COLLECTION });
await collection.save({_from: 'from_id', _to: 'to_id'});
https://arangodb.github.io/arangojs/7.1.0/interfaces/_collection_.edgecollection.html#create

Redis: How to save (and read) Key-Value pairs at once by namespace/rule?

I want to utilize Redis for saving and reading a dynamic list of users.
Essentially, Redis is a key-value store. How can I read all the saved users at once? (for example, by creating a namespace like "users/user_id")
And since I am a Redis beginner,
Do you think the use of Redis in the above case is proper/efficient?
Thanks.
When using key/values for storing objects, you should create a domain-specific key by combining the domain name with the unique id. For example, a user data model might look like this:
// typical user data model
var User = function(params) {
  if (!params) params = {};
  this.id = params.id;
  this.username = params.username;
  this.email = params.email;
  // other stuff...
};
Then the domain key could be created like this:
var createUserDomainKey = function(id) {
  return 'User:' + id;
};
If the id was 'e9f6671440e111e49f14-77817cb77f36' the key would be this:
User:e9f6671440e111e49f14-77817cb77f36
Since Redis stores string values, you need to serialize the user object, probably with JSON. Assuming a valid user object, you would do something like this:
var client = redis.createClient(),
    key = createUserDomainKey( user.id ),
    json = JSON.stringify( user );
client.set( key, json, function(err, result) {
  if (err) throw err; // do something here...
  // result is 'OK'
});
For simple fire-hose queries returning all users, you can do this:
var client = redis.createClient();
client.keys( createUserDomainKey( '*' ), function(err, keys) {
  if (err) throw err; // do something here
  // keys contains all the keys matching 'User:*'
});
Note that the Redis folks discourage the use of KEYS in production, so a better approach is to build your own index using a sorted set, but if your user list is limited to a few hundred, there is no problem.
And since it returns a list of keys, you need to loop through and get each user, then parse the JSON string to recover the real object. Assuming a populated list of keys, you could do something like this:
var client = redis.createClient(),
    users = [];
var loopCallback = function(err, value) {
  if (!err && value) {
    // parse and add the user model to the list
    users.push( JSON.parse( value ) );
  }
  // pull the next key, if available
  var key = keys.pop();
  if (key) {
    client.get( key, loopCallback );
  } else {
    // list is complete, so return users, probably through a callback
  }
};
// start the loop
loopCallback();
This is a good general-purpose solution, but there are others using sorted sets that are more efficient when you want the entire list on each access. This solution gives you the ability to get a single user object when you know the ID.
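The sorted-set index pattern keeps all the user keys in one place, so a full scan becomes one ZRANGE for the keys plus one MGET for the values instead of a KEYS call. A runnable sketch of the idea; the in-memory `client` below is a stub standing in for a real Redis client (the real calls would be ZADD, ZRANGE, and MGET on a connected client, and the index key name is made up):

```javascript
// In-memory stub mimicking the three Redis commands the pattern needs.
const store = { strings: new Map(), zset: new Map() };
const client = {
  set: (k, v) => store.strings.set(k, v),
  zadd: (key, score, member) => store.zset.set(member, score),
  zrange: () => [...store.zset.keys()], // all indexed members
  mget: (keys) => keys.map((k) => store.strings.get(k)),
};

const createUserDomainKey = (id) => 'User:' + id;

// Save: write the JSON value AND index the key in a sorted set.
function saveUser(user) {
  const key = createUserDomainKey(user.id);
  client.set(key, JSON.stringify(user));
  client.zadd('User:index', Date.now(), key); // score: insertion time
}

// Load all: one ZRANGE for the keys, one MGET for the values.
function getAllUsers() {
  const keys = client.zrange('User:index', 0, -1);
  return client.mget(keys).map((json) => JSON.parse(json));
}

saveUser({ id: 'a1', username: 'alice' });
saveUser({ id: 'b2', username: 'bob' });
console.log(getAllUsers().map((u) => u.username)); // [ 'alice', 'bob' ]
```

The important part is that every write also maintains the index, so reads never need to scan the keyspace; deleting a user would likewise need a ZREM on the index key.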
I hope this helps.
ps: a full implementation with unit tests of this can be found in the AbstractBaseDao module of this project.
