How to get a list of keys from a database in quick.db? - node.js

I want to get a list of keys inside my database. I set values with db.set(`fuel_${car}`, amount of fuel), and then I want to get a list of the fuel of all cars. Here is my code:
/*set the fuel of lol9*/
client.on('message', async message => {
  if (message.content === 'db') {
    const m = await db.get('fuel');
    message.channel.send(`${m}`);
  }
});

The quick.db documentation states:
.all() -> array
This function returns the entire active table as an array.
await db.all()
// -> [Array]
This being said, you could do this:
const carFuelArray = [];
// db.all() resolves to the whole table; in quick.db each entry is { ID, data }
const entries = await db.all();
entries.forEach(entry => {
  if (entry.ID.startsWith("fuel_")) {
    // if you want to log "fuel_{car}"
    carFuelArray.push(entry.ID);
    // if you want to log the "fuel_{car}" key value
    carFuelArray.push(entry.data);
  }
});
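For example, wired into the message handler from the question (a sketch; it assumes the same client and db objects, and the { ID, data } entry shape described above):

client.on('message', async message => {
  if (message.content === 'db') {
    const entries = await db.all();
    // keep only the fuel keys and format each one as "car: amount"
    const fuelList = entries
      .filter(entry => entry.ID.startsWith('fuel_'))
      .map(entry => `${entry.ID.slice('fuel_'.length)}: ${entry.data}`);
    message.channel.send(fuelList.join('\n') || 'No fuel entries found.');
  }
});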

Related

Google Firestore array_contains query doesn't work

I am trying to run a simple query to find other documents that contain some ID. Here is how it looks, and here is what I am trying to get. I don't see a reason for it to work this way. I tried this code in Cloud Functions for Firestore, but it doesn't work.
I tried this code:
exports.updateDietDaysWhenMealChanges = functions.firestore
  .document("Posilek/{posilekId}")
  .onUpdate((change, context) => {
    const posilekId = context.params.posilekId;
    const posilekAfter = change.after.data();
    return db.collection("DietDays")
      .where("Meals", "array-contains", { ID: posilekId })
      .get()
      .then(snapshot => {
        if (snapshot.empty) {
          functions.logger.log("No matching DietDay found");
          return null;
        } else {
          return Promise.all(snapshot.docs.map(dietDayDoc => {
            const dietDayId = dietDayDoc.id;
            const meals = dietDayDoc.data().Meals;
            const mealIndex = meals.findIndex(meal => meal.ID === posilekId);
            meals[mealIndex] = { ID: posilekId, Portions: posilekAfter.Portions };
            functions.logger.log(`Editing meal in DietDay with ID: ${dietDayId}`);
            return dietDayDoc.ref.update({ Meals: meals });
          }));
        }
      });
  });
I also tried a manual query.
The array-contains operator can only check for exact matches between an item in the array and the value you pass. So this code only matches documents whose Meals array contains an item with exactly those two fields and values:
.where("Meals", "array-contains", { ID: posilekId, Portions: 12.5 })
There is no way to do a partial match on an object in the array.
The common workaround is to add an additional field (e.g. MealIDs) that contains just the value you want to filter on:
MealIDs: ["ohN....", "..."]
With that additional array, you can then filter with:
.where("MealIDs", "array-contains", posilekId)

Get all documents in collection using Cloud Firestore

I have read several pieces of documentation, but I don't understand why I need an extra layer (forEach) in my code when I read all of the data inside a collection using Firebase (Cloud Firestore).
Here is the original documentation:
https://firebase.google.com/docs/firestore/query-data/get-data#get_all_documents_in_a_collection
Here is my code:
async loadUsers(): Promise<User[]> {
  const users = new Array<User>();
  const snapshot = await this.firestore.collection('users').get();
  snapshot.forEach((collection) => {
    collection.docs.forEach(doc => {
      users.push(doc.data() as User);
    });
  });
  return users;
}
As I understand it, it should work like this:
async loadUsers(): Promise<User[]> {
  const users = new Array<User>();
  const snapshot = await this.firestore.collection('users').get();
  snapshot.forEach(doc => {
    users.push(doc.data() as User);
  });
  return users;
}
Error message:
"Property 'data' does not exist on type 'QuerySnapshot'."
.collection().get() does NOT return an array; it returns a QuerySnapshot, which has a property .docs, which is an array of QueryDocumentSnapshot, each of which has a method .data(), which returns the data read from the document.
Documentation
https://firebase.google.com/docs/reference/js/firebase.firestore.CollectionReference
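Spelled out as a minimal sketch (assuming the plain Firebase v8 Web SDK, where get() returns a Promise<QuerySnapshot>, and the User type from the question):

import firebase from 'firebase/app';
import 'firebase/firestore';

// QuerySnapshot -> .docs (QueryDocumentSnapshot[]) -> .data() (document fields)
async function loadUsers(): Promise<User[]> {
  const snapshot = await firebase.firestore().collection('users').get();
  return snapshot.docs.map(doc => doc.data() as User);
}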
In the new modular Firebase Firestore (version 9+) it should be like this:
import { getFirestore, collection, query, orderBy, getDocs } from 'firebase/firestore/lite'

async readAll() {
  const firestore = getFirestore()
  const collectionRef = collection(firestore, '/users')
  const q = query(collectionRef, orderBy('createTimestamp', 'desc'))
  const querySnapshot = await getDocs(q)
  const items = []
  querySnapshot.forEach(document => {
    items.push(document.data())
  })
  return items
}
I could not find a property on querySnapshot that exposes the whole result array directly the way .docs used to, so iterating with forEach, as you would with onSnapshot, works here. (The last snippet below does use response.docs, though, so the array is still available on the snapshot.)
Based on @LeadDreamer's answer, I could manage to simplify the code:
async loadUsers(): Promise<User[]> {
  const users = new Array<User>();
  // In AngularFire, get() returns an Observable, so convert it to a promise:
  // awaiting a subscription would return before the data arrives
  const querySnapshot = await this.firestore.collection('users').get().toPromise();
  querySnapshot.docs.forEach(doc => {
    users.push(doc.data() as User);
  });
  return users;
}
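Note: toPromise() is deprecated in RxJS 7+. A sketch of the same method using firstValueFrom instead:

import { firstValueFrom } from 'rxjs';

async loadUsers(): Promise<User[]> {
  // firstValueFrom resolves with the first value the Observable emits
  const querySnapshot = await firstValueFrom(this.firestore.collection('users').get());
  return querySnapshot.docs.map(doc => doc.data() as User);
}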
There seems to be no other way but to iterate.
const q = query(collection(db, "item"));
getDocs(q).then(response => {
  const result = response.docs.map(doc => ({
    id: doc.id,
    ...doc.data(),
  }));
  console.log(result);
}).catch(err => console.log(err));

How do I run an update while streaming using pg-query-stream and pg-promise?

I am trying to load 50000 items from the database, each with text in them, tag them, and update the tags.
I am using pg-promise and pg-query-stream for this purpose.
I was able to get the streaming part working properly, but updating has become problematic with so many update statements.
Here is my existing code:
const QueryStream = require('pg-query-stream')
const JSONStream = require('JSONStream')

function prepareText(title, content, summary) {
  let description
  if (content && content.length) {
    description = content
  } else if (summary && summary.length) {
    description = summary
  } else {
    description = ''
  }
  return title.toLowerCase() + ' ' + description.toLowerCase()
}

async function tagAll({ db, logger, tagger }) {
  // you can also use pgp.as.format(query, values, options)
  // to format queries properly, via pg-promise;
  const qs = new QueryStream(
    'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY pubdate DESC, feed_item_id DESC'
  )
  try {
    const result = await db.stream(qs, (s) => {
      // initiate streaming into the console:
      s.pipe(JSONStream.stringify())
      s.on('data', async (item) => {
        try {
          s.pause()
          // eslint-disable-next-line camelcase
          const { feed_item_id, title, summary, content } = item
          // Process text to be tagged (prepareText expects title, content, summary)
          const text = prepareText(title, content, summary)
          const tags = tagger.tag(text)
          // Update tags per post
          await db.query(
            'UPDATE feed_items SET tags=$1 WHERE feed_item_id=$2',
            // eslint-disable-next-line camelcase
            [tags, feed_item_id]
          )
        } catch (error) {
          logger.error(error)
        } finally {
          s.resume()
        }
      })
    })
    logger.info(
      'Total rows processed:',
      result.processed,
      'Duration in milliseconds:',
      result.duration
    )
  } catch (error) {
    logger.error(error)
  }
}

module.exports = tagAll
The db object is the one from pg-promise, and tagger.tag(text) simply extracts an array of tags from the text, which ends up in the variable tags.
Too many update statements are executing from what I can see in the diagnostics. Is there a way to batch them?
If you can do everything in one SQL statement, you should! Here you're paying the price of a round trip between Node and your DB for each row of your table, and that will take most of the time of your query.
The request can be implemented in pure SQL:
update feed_items set tags = case
  when (content = '') is false then lower(title) || ' ' || lower(content)
  when (summary = '') is false then lower(title) || ' ' || lower(summary)
  else lower(title)
end;
This statement updates the whole table at once. I'm sure it'd be orders of magnitude faster than your method: on my machine, with a table containing 100000 rows, the update takes about 600ms.
Some remarks:
You don't need to ORDER BY to update. Since ordering is quite slow, it's better not to.
I guess the LIMIT part was there because the query is too slow? If that is the case, you can drop it: 50000 rows is not a big table for Postgres.
I bet this pg-query-stream thing does not really stream data out of the DB; it only gives you a stream-like API over results it gathered earlier. No problem with that, but I thought there might be a misconception here.
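If you go this route, the whole update becomes a single round trip through pg-promise (a sketch, assuming the same db object as in the question):

// one statement, one round trip, instead of one UPDATE per row
await db.none(
  `UPDATE feed_items SET tags = CASE
     WHEN (content = '') IS FALSE THEN lower(title) || ' ' || lower(content)
     WHEN (summary = '') IS FALSE THEN lower(title) || ' ' || lower(summary)
     ELSE lower(title) END`
)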
This is the best I could come up with to batch the queries inside the stream, so that we don't need to load all the data in memory or run too many queries. If anyone knows a better way to batch, especially with t.sequence, feel free to add another answer.
const QueryStream = require('pg-query-stream')
const JSONStream = require('JSONStream')

const BATCH_SIZE = 5000

async function batchInsert({ db, pgp, logger, data }) {
  try {
    // https://vitaly-t.github.io/pg-promise/helpers.ColumnSet.html
    const cs = new pgp.helpers.ColumnSet(
      [
        { name: 'feed_item_id', cast: 'uuid' },
        { name: 'tags', cast: 'varchar(64)[]' },
      ],
      {
        table: 'feed_items',
      }
    )
    const query =
      pgp.helpers.update(data, cs) + ' WHERE v.feed_item_id=t.feed_item_id'
    await db.none(query)
  } catch (error) {
    logger.error(error)
  }
}

async function tagAll({ db, pgp, logger, tagger }) {
  // you can also use pgp.as.format(query, values, options)
  // to format queries properly, via pg-promise;
  const qs = new QueryStream(
    'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY pubdate DESC, feed_item_id DESC'
  )
  try {
    const queryValues = []
    const result = await db.stream(qs, (s) => {
      // initiate streaming into the console:
      s.pipe(JSONStream.stringify())
      s.on('data', async (item) => {
        try {
          s.pause()
          // eslint-disable-next-line camelcase
          const { feed_item_id, title, summary, content } = item
          // Process text to be tagged (prepareText is the helper from above;
          // it expects the arguments in the order title, content, summary)
          const text = prepareText(title, content, summary)
          const tags = tagger.tag(text)
          queryValues.push({ feed_item_id, tags })
          if (queryValues.length >= BATCH_SIZE) {
            // drain the accumulated rows and flush them in one statement
            const data = queryValues.splice(0, queryValues.length)
            await batchInsert({ db, pgp, logger, data })
          }
        } catch (error) {
          logger.error(error)
        } finally {
          s.resume()
        }
      })
    })
    // flush whatever is left after the stream ends
    await batchInsert({ db, pgp, logger, data: queryValues })
    return result
  } catch (error) {
    logger.error(error)
  }
}
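For reference, pgp.helpers.update(data, cs) with the ColumnSet above expands the whole batch into a single multi-row statement of roughly this shape (a sketch; pg-promise takes care of the quoting and of the ::uuid / ::varchar(64)[] casts):

UPDATE "feed_items" AS t
SET "feed_item_id" = v."feed_item_id", "tags" = v."tags"
FROM (VALUES (...), (...)) AS v("feed_item_id", "tags")
WHERE v.feed_item_id = t.feed_item_id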

Using node.js and graphql - I'm trying to display the total number of students for each school, but with Promise.all() it returns empty

async studentCount(@Parent() school: School): Promise<number> {
  let studentIds = []
  let cohortStudentIds = []
  // find all the programs for each school
  const programs = await this.programService.find({ schoolId: school.id })
  // find all the cohorts for each program
  programs.map(async program => {
    // collect the student ids per cohort within array
    const cohortsWithStudents = await this.cohortService.getStudentsForCohortsByProgramId(program.id)
    // find all the students by looping through each cohort and save in cohortStudentIds[]
    cohortsWithStudents.map(async (cohort) => {
      await cohort.students.map(async (student) => { await cohortStudentIds.push(student.id) })
    });
    // collect all the student id arrays into 1 big array
    studentIds = await [
      ...studentIds,
      ...cohortStudentIds
    ]
  })
  await Promise.all([programs, cohortStudentIds, studentIds])
    .then((values) => {
      console.log(values)
      return values
    });
  console.log('xxx')
  // return the number of students per school
  return uniq(studentIds).length
}
You're passing an async function as the callback to map(), which results in an array of promises, but you're never waiting for those.
In contrast, you're awaiting a lot of things you don't need to wait for, as they are not promises: the return value of push, the return value of map, and the array literal.
You should write something like:
async studentCount(@Parent() school: School): Promise<number> {
  const studentIds = new Set()
  const programs = await this.programService.find({ schoolId: school.id })
  await Promise.all(programs.map(async program => {
  //    ^^^^^^^^^^^ wait for all the promises that map() creates
    const cohortsWithStudents = await this.cohortService.getStudentsForCohortsByProgramId(program.id)
    for (const cohort of cohortsWithStudents) {
      for (const student of cohort.students) {
        studentIds.add(student.id)
      }
    }
  }))
  return studentIds.size
}

Sequelize: how to transform a list of objects

I have this code:
let employees = []

async function fillEmployeesList() {
  await Employe.findAll().then(allEmployes => {
    employees = allEmployes;
    console.log('All employees: ' + allEmployes[0].nom);
  });
}
And I want to transform the raw objects into proper Employee objects. How can I do this? allEmployes is currently a list of Sequelize model instances, and I want it to be a list of plain Employee objects. (Sorry, English is not my first language.)
You can try this:
let employees = []

async function fillEmployeesList() {
  const allEmployees = await Employe.findAll();
  employees = allEmployees.map(employee => {
    return employee.toJSON();
  });
}
I'm just adding this return employee.toJSON(); line of code.
If you are looking to assign the employees data directly to a variable without using a callback, and to manipulate that data as per your requirements, you can do it like this:
let employees = []

async function fillEmployeesList() {
  const allEmployees = await Employe.findAll();
  employees = allEmployees.map(employee => {
    return { "nom": employee.nom, "other_key": employee.name };
  });
  console.log(employees); // will print the modified employees data
}
If you only need the raw data and don't want to update anything, you can pass { raw: true } to get plain objects:
let employees = []

async function fillEmployeesList() {
  await Employe.findAll({ raw: true }).then(allEmployes => {
    employees = allEmployes;
    console.log('All employees: ' + allEmployes[0].nom);
  });
}
You can read about it here: http://docs.sequelizejs.com/manual/models-usage.html#raw-queries
