Firebase compound queries always get empty results - node.js

I'm trying to create a query in my Node.js server (the Cloud Functions section) in Firebase.
I created a collection with 3 documents, each with 2 fields: email and timestamp.
When I query by email, I get the right documents:
await admin.firestore().collection('sessions').where('email', '==', 'email#gmail.com').get()
When I query by timestamp, I also get the right documents:
await admin.firestore().collection('sessions').where('timestamp', '>', 1601164800).get()
but...
when I query by both email and timestamp, I get no documents:
await admin.firestore().collection('sessions').where('email', '==', email).where('timestamp', '>', 1601164800).get()
The way I understand it, multiple where clauses act as a logical AND, so if the same records come back from the first query and from the second query, they should come back from the third query as well.
I also read that I need to create a composite index when combining an equality operator with a range operator, so I created one - and no luck. :(
I created all the data manually - the collection, the documents, and the index.
Is there something I'm missing?
collection data
indexes definition
This is the full code for getting the docs.
I return the result array to the client, and I get an empty array.
async function getInfo() {
  let query = admin.firestore().collection('sessions')
    .where('email', '==', 'email#gmail.com')
    .where('timestamp', '>', 1601164800);
  let dbData = await query.get();
  let result = [];
  dbData.forEach(doc => {
    let data = doc.data();
    // this log is to see how many docs I get
    logger.log(data);
    result.push(data);
  });
  return result;
}

What about iterating over dbData.docs?
logger.log(`Documents retrieved: ${dbData.size}`)
return dbData.docs.map(doc => doc.data())
https://googleapis.dev/nodejs/firestore/latest/QuerySnapshot.html
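As a runnable sketch of that suggestion (the `fakeSnapshot` below is only a stand-in for a real `QuerySnapshot`, which exposes `.size`, `.empty` and `.docs`):

```javascript
// Stand-in for a Firestore QuerySnapshot, so the sketch runs offline.
const fakeSnapshot = {
  size: 2,
  empty: false,
  docs: [
    { data: () => ({ email: 'email#gmail.com', timestamp: 1601200000 }) },
    { data: () => ({ email: 'email#gmail.com', timestamp: 1601300000 }) },
  ],
};

function snapshotToArray(snapshot) {
  // Logging size first shows whether the query matched anything at all.
  console.log(`Documents retrieved: ${snapshot.size}`);
  return snapshot.docs.map(doc => doc.data());
}

const result = snapshotToArray(fakeSnapshot);
```

If `size` logs a non-zero count but your array is still empty, the problem is in the iteration; if it logs 0, the problem is in the query or the index.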

Related

How to get multiple collection group's documents at once from firestore?

So here I have multiple sub-collections (subjects) in different docs (grades), and I want to get all the sub-collections' documents (questions) at once. I tried to get them using collection group queries. The only problem I'm facing in my code: sometimes it returns all the docs (questions), but sometimes it doesn't. What is the issue?
This is what I have tried:
const getAllQuestions = (request, response) => {
  const subjects = ['Maths', 'English'];
  const questionsArray = [];
  subjects.forEach((subject, index) => {
    db.collectionGroup(subject)
      .get()
      .then((querySnapshot) => {
        querySnapshot.forEach((doc) => {
          questionsArray.push({ ...doc.data(), id: doc.id });
        });
        if (index == subjects.length - 1) {
          response.status(200).json({
            status: 200,
            data: questionsArray,
            length: questionsArray.length,
          });
        }
      });
  });
};
If you don't want to get the subcollections from all grades, but only from one of them, you should not use a collection group query but instead specify the entire path to the collection you want to query/read:
db.collection('quizQuesDb/Grade 5/'+subject)
.get()
If you want to perform a query across all collections of a certain name under a specific path, see: CollectionGroupQuery but limit search to subcollections under a particular document
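A sketch of how those per-subject, full-path queries could be combined with Promise.all, so the response is only built after every subject's query has finished (the `fakeDb` below stands in for the real Firestore client so the example runs; the path is the one from the answer):

```javascript
function getAllSubjectQuestions(db, subjects) {
  // One query per subject, all under the same grade document,
  // resolved together so no subject's results can be missed.
  return Promise.all(
    subjects.map(subject => db.collection('quizQuesDb/Grade 5/' + subject).get())
  ).then(snapshots =>
    snapshots.flatMap(qs => qs.docs.map(doc => ({ ...doc.data(), id: doc.id })))
  );
}

// Minimal fake client, just so the sketch runs without Firestore.
const fakeDb = {
  collection: path => ({
    get: () => Promise.resolve({
      docs: [{ id: 'q1', data: () => ({ subject: path.split('/').pop() }) }],
    }),
  }),
};

getAllSubjectQuestions(fakeDb, ['Maths', 'English'])
  .then(questions => console.log(questions.length));
```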

Mongodb, how to get all array's each matching first element in DB?

So I have a scenario that requires selecting, for every person, the first matching transaction.
What I am currently doing is using a loop:
const array_list = await Person.find({});
const results = [];
for (const element in array_list) {
  const temp_result = await Transactions.find({ _id: array_list[element]._id }).sort({ $natural: -1 }).limit(1);
  results.push(temp_result);
}
This works, but I figure this may slow down the whole process if the database grows larger. Is there any way to speed this up?
You can simplify this to use a single db-query for finding the transaction documents by specifying the $in operator, where you pass in the _ids of the person array_list:
const elementIds = array_list.map(elem => elem._id);
await Transactions.find({_id: { $in: elementIds }}).sort({$natural: -1 }).limit(1);
The above solution still requires two db-reads - if you want to do it in a single query, you can run an aggregation query where you $lookup the transaction documents for each person (see the examples on the doc page).
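For reference, that $lookup pipeline could look roughly like this (the `personId` field on Transactions is an assumption about the schema - adjust it to however your transactions reference their person):

```javascript
// Aggregation sketch: for every person, join in their transactions and
// keep only the first element of the joined array.
const pipeline = [
  {
    $lookup: {
      from: 'transactions',      // the Transactions collection name
      localField: '_id',
      foreignField: 'personId',  // assumed link field - adjust to your schema
      as: 'transactions',
    },
  },
  { $project: { firstTransaction: { $arrayElemAt: ['$transactions', 0] } } },
];
// Usage (not run here): Person.aggregate(pipeline)
```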

Firestore: Searching the element inside document array if it exists or not

I have a collection with some documents. Each document contains some arrays of strings. I want to know whether a given string is inside a specific document's array. I have seen queries that find documents using array-contains, but I already have the document - I just want to check whether the string exists inside that document's array.
var dbRef = dbConnection.db.collection('posts').doc(req.body.post_id);
dbRef.where('likes', 'array-contains', req.body.user_id).get()
  .then(data => {
    console.log(data);
  })
  .catch(err => {
    console.log(err);
  });
I have a document with a specific id, which I know. That document contains an array named likes, which stores some strings. I want to know whether a given string exists inside that array. I am getting the following error:
TypeError: dbRef.where is not a function
Then I tried without giving a document id. That worked and returned documents, but I want to search inside a specific document's array.
Your dbRef points to a (single) document, and you can't query a document.
If you want to query the documents in the posts collection, you're looking for:
var dbRef = dbConnection.db.collection('posts');
dbRef.where('likes', 'array-contains', req.body.user_id).get()
...
You can query for both document ID and array contains with:
db.collection('books').where(firebase.firestore.FieldPath.documentId(), '==', 'fK3ddutEpD2qQqRMXNW5').get()
var dbRef = dbConnection.db.collection('posts');
dbRef
  .where(firebase.firestore.FieldPath.documentId(), '==', req.body.post_id)
  .where('likes', 'array-contains', req.body.user_id)
  .get()
...
Alternatively, you can simply read the document with your original code, and then check client-side whether the array contains the value:
var dbRef = dbConnection.db.collection('posts').doc(req.body.post_id);
dbRef.get()
  .then(doc => {
    if (doc.data().likes.indexOf(req.body.user_id) >= 0) {
      // ... the post is liked by the user
    }
  })
  .catch(err => {
    console.log(err);
  });

How to do a query with every result of a query?

I'm trying to build an application using MongoDB and Node.JS. I have 3 models: User, Ride, Participating.
Participating contains a userID and a rideID. It is almost like SQL logic: Participating links the two other models.
I'd like to, given a userID, return every Ride via the Participating model.
I tried to use a forEach, as the first request returns an array:
router.get('/getAllRide/:userID', function(req, res) {
  let userID = req.params.userID;
  let return = [];
  Participating.find({ _idUser: userID })
    .then(participating => {
      participating.forEach(element => {
        Ride.find({ _id: element._id })
          .exec()
          .then(ride => {
            retour.push(ride);
          });
      });
      res.status(200).json(return);
    });
});
At the end of this code, the return array is empty, while it is supposed to contain every Ride whose _id appears in a Participating entity.
OK, there are a couple of issues here:
return is a keyword. You probably shouldn't be using it as a variable name.
Database calls are asynchronous. forEach loops are synchronous. This means that you're immediately going to be returning retour (which looks undefined).
Mongoose has tools to populate nested relationships -- it's best not to do it in application code. Even if you are doing this in application code, it's likely best not to iterate over your results & do new finds -- instead, it's better to construct a single find query that returns all of the new documents you need.
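For instance, a single find built from the first query's results (assuming, as in the question's code, that each Participating element's _id is the ride id):

```javascript
// Build one $in query instead of issuing one find per element.
function buildRideQuery(participating) {
  const rideIds = participating.map(element => element._id);
  return { _id: { $in: rideIds } }; // pass this to Ride.find(...)
}
```

`Ride.find(buildRideQuery(participating))` then fetches every matching ride in one round trip instead of N.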
If you did want to do this in application code, you'd want to either use async/await or Promise.all:
const toReturn = [];
const findPromises = participating.map(element => {
  return Ride.find({ _id: element._id })
    .exec()
    .then(result => toReturn.push(result));
});
return Promise.all(findPromises).then(() => res.status(200).json(toReturn));
(Note: rather than using Promise.all, if you're using Bluebird you could instead use Promise.map.)
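The async/await variant mentioned above could be sketched like this (`findById` stands in for the `Ride.find(...).exec()` call, so the sketch runs without a database):

```javascript
// Awaiting inside the loop keeps the results in the same order as the
// input elements, at the cost of running the finds sequentially.
async function collectRides(participating, findById) {
  const toReturn = [];
  for (const element of participating) {
    toReturn.push(await findById(element._id));
  }
  return toReturn;
}

// Stand-in for a real database lookup.
const findById = id => Promise.resolve({ rideId: id });

collectRides([{ _id: 1 }, { _id: 2 }], findById)
  .then(rides => console.log(rides.length));
```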

Inserting multiple records with pg-promise

I have a scenario in which I need to insert multiple records. I have a table with columns id (a fk from another table), key (char), and value (char). The input that needs to be saved would be an array of the above data. Example:
I have some array objects like:
lst = [];
obj = {};
obj.id = 123;
obj.key = 'somekey';
obj.value = '1234';
lst.push(obj);
obj = {};
obj.id = 123;
obj.key = 'somekey1';
obj.value = '12345';
lst.push(obj);
In MS SQL, I would have created a TVP and passed it in. I don't know how to achieve this in Postgres.
So what I want to do is save all the items from the list in a single query in Postgres, using the pg-promise library. I'm not able to find this in the documentation / understand it from the documentation. Any help appreciated. Thanks.
I am the author of pg-promise.
There are two ways to insert multiple records. The first, and most typical way is via a transaction, to make sure all records are inserted correctly, or none of them.
With pg-promise it is done in the following way:
db.tx(t => {
  const queries = lst.map(l => {
    return t.none('INSERT INTO table(id, key, value) VALUES(${id}, ${key}, ${value})', l);
  });
  return t.batch(queries);
})
  .then(data => {
    // SUCCESS
    // data = array of null-s
  })
  .catch(error => {
    // ERROR
  });
You initiate a transaction with method tx, then create all INSERT query promises, and then resolve them all as a batch.
The second approach is by concatenating all insert values into a single INSERT query, which I explain in detail in Performance Boost. See also: Multi-row insert with pg-promise.
For more examples see Tasks and Transactions.
Addition
It is worth pointing out that in most cases we do not insert a record id, but rather have it generated automatically. Sometimes we want to get the new id-s back, and in other cases we don't care.
The examples above resolve with an array of null-s, because batch resolves with an array of individual results, and method none resolves with null, according to its API.
Let's assume that we want to generate the new id-s, and that we want to get them all back. To accomplish this we would change the code to the following:
db.tx(t => {
  const queries = lst.map(l => {
    return t.one(
      'INSERT INTO table(key, value) VALUES(${key}, ${value}) RETURNING id',
      l, a => +a.id);
  });
  return t.batch(queries);
})
  .then(data => {
    // SUCCESS
    // data = array of new id-s;
  })
  .catch(error => {
    // ERROR
  });
i.e. the changes are:
we do not insert the id values
we replace method none with one, to get one row/object from each insert
we append RETURNING id to the query to get the value
we add a => +a.id to do the automatic row transformation. See also pg-promise returns integers as strings to understand what that + is for.
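A quick illustration of that coercion (the `row` object mimics what the driver hands back for a RETURNING id):

```javascript
// The driver returns numeric ids as strings; unary plus turns the
// value back into a number.
const row = { id: '42' };

// The same transform passed to t.one in the example above.
const transform = a => +a.id;

const newId = transform(row); // 42 as a number, not the string '42'
```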
UPDATE-1
For a high-performance approach via a single INSERT query see Multi-row insert with pg-promise.
UPDATE-2
A must-read article: Data Imports.
