How to get itemcount where child value is false with firebase functions? - node.js

Edit: I wrote a query that should return all items where ticked is true. I try to set the count from tickedItemsQuery, but it tells me count is not a function; numChildren is not a function either. How can I get the count of the query?
exports.tickedBoxesCount = functions.database.ref('/listItems/{list_id}').onWrite(event => {
    const ref = event.data.ref.parent;
    const list_id = event.params.list_id;
    const tickediItemsQuery = ref.orderByChild("ticked").equalTo(true);
    admin.database().ref(`/lists/${list_id}/tickedCount`).set(tickediItemsQuery.count());
});

This line of code creates a query:
const tickediItemsQuery = ref.orderByChild("ticked").equalTo(true)
Note that it does not execute the query; it merely defines it. So there's no way to know the number of matching children from this line alone.
To determine the count, you'll need to execute the query and check the number of children in the resulting snapshot:
return tickediItemsQuery.once("value").then(function(snapshot) {
    return admin.database().ref(`/lists/${list_id}/tickedCount`).set(snapshot.numChildren());
});
Note that I also added the necessary return statements, to ensure Cloud Functions knows when your (asynchronous) read and write operations are done.
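Putting the two pieces together, the whole function might look something like this (a sketch that keeps the question's structure and the older event-style onWrite signature; newer Cloud Functions SDKs pass (change, context) instead of event):
exports.tickedBoxesCount = functions.database.ref('/listItems/{list_id}').onWrite(event => {
    const ref = event.data.ref.parent;
    const list_id = event.params.list_id;
    const tickediItemsQuery = ref.orderByChild("ticked").equalTo(true);
    // execute the query, then write the number of matching children
    return tickediItemsQuery.once("value").then(snapshot => {
        return admin.database().ref(`/lists/${list_id}/tickedCount`).set(snapshot.numChildren());
    });
});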

Related

Filling a database with after promises parsing a text file

I'm trying to build a DB with bluebird and sqlite3 to manage a lot of "ingredients".
So far I've managed to parse a file and extract some data from it using a regex.
Every time a line matches the regex I want to check in the database whether an element with the same name has already been inserted; if so, the element is skipped, otherwise it must be inserted.
The problem is that some elements get inserted more than once.
The code partially works, and I say it partially works because if I remove the line of code where I check for the existence of an element with the same name, there are many more duplicate rows.
Here's the piece of code:
lineReader.eachLine(FILE_NAME, (line, last, cb) => {
    let match = regex.exec(line);
    if (match != null) {
        let matchedName = match[1].trim();
        // This function returns a Promise for all the rows with a matching name
        ingredientsRepo.getByName(matchedName)
            .then((entries) => {
                if (entries.length > 0) {
                    console.log("ALREADY INSERTED INGREDIENT: " + matchedName)
                } else {
                    console.log("ADDING " + matchedName)
                    ingredientsRepo.create(matchedName)
                }
            })
    }
});
I know I'm missing something about Promises but I can't understand what I'm doing wrong.
Here's the code of both getByName(name) and create(name):
create(name) {
    return this.dao.run(
        `INSERT INTO ingredients (name) VALUES (?)`,
        [name]
    )
}

getByName(name) {
    return this.dao.all(
        'SELECT * FROM ingredients WHERE name == ?',
        [name]
    )
}
ingredientsRepo.getByName(matchedName) returns a promise, which means it's asynchronous. I am also guessing that ingredientsRepo.create(matchedName) is asynchronous, because you are inserting something into a DB.
Your loop doesn't wait for these async functions to complete. Hence the loop could already be on the 50th iteration when the .then(...) is called for the first iteration.
So let's say the first 10 elements have the same name. Since the asynchronous functions take some time to complete, the name will not be inserted into the DB until, say, the 20th iteration of the loop. So for the first 10 elements, the name does not exist in the DB.
It's a timing issue.
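One way to fix it (a sketch, assuming the line-reader style eachLine where calling the third callback argument cb() advances to the next line) is to defer cb() until the lookup and the optional insert have settled, so the lines are processed one at a time:
lineReader.eachLine(FILE_NAME, (line, last, cb) => {
    let match = regex.exec(line);
    if (match == null) {
        cb();                  // nothing to do on this line, move on
        return;
    }
    let matchedName = match[1].trim();
    ingredientsRepo.getByName(matchedName)
        .then((entries) => {
            if (entries.length > 0) {
                console.log("ALREADY INSERTED INGREDIENT: " + matchedName);
                return;
            }
            console.log("ADDING " + matchedName);
            return ingredientsRepo.create(matchedName);
        })
        .then(() => cb())      // only advance once the lookup/insert has finished
        .catch((err) => {
            console.log("ERROR for " + matchedName + ": " + err);
            cb();              // keep going even if this line failed
        });
});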

Queries being run twice on sql.js

I'm using SQLite with sql.js in my project and I have been having some trouble with my implementation. It seems like the queries are being run on the database twice, because for INSERT statements I get 2 records in the DB.
The way I do it, I create the SQL and then pass it on to this method (the opts variable contains all of the data being put into the database):
prepareStatementAndCompileResults(db, sql, opts){
    const stmt = db.prepare(sql);
    const result = stmt.getAsObject(opts);
    var rows = [];
    if(!this.isEmpty(result)){ // isEmpty is a simple method that checks for empty objects
        rows.push(result);
    }
    while(stmt.step()) {
        var row = stmt.getAsObject();
        rows.push(row);
    }
    this.saveToFile(db);
    stmt.free();
    return rows;
},
Here is a sample SQL INSERT that is being run twice
INSERT OR IGNORE INTO tag_event (tag_id, event_id, unique_string)
VALUES (:tag_id,:event_id, :unique);
Here is what the opts variable would look like for this query:
var opts = {
    [':tag_id']: 1,
    [':event_id']: 1,
    [':unique']: '1-1'
}
Because you're pushing it into rows two times.
// if not empty will add to row
if(!this.isEmpty(result)){ // isEmpty is a simple method that checks for empty objects
    rows.push(result);
}
// not sure what step() does but I'm assuming this will also run
while(stmt.step()) {
    var row = stmt.getAsObject();
    rows.push(row);
}
Verify by using a debugger, or just console.log(rows) after the while loop and before the save.
So, what it turns out I needed to do was bind the variables to the prepared statement before stepping through the results, rather than binding them through getAsObject. This is much more efficient: my API response time on a local test went from 785ms to 14.5ms.
prepareStatementAndCompileResults(db, sql, opts){
    const rows = [];
    const stmt = db.prepare(sql);
    stmt.bind(opts);
    while(stmt.step()) {
        var row = stmt.getAsObject();
        rows.push(row);
    }
    this.saveToFile(db);
    stmt.free();
    return rows;
},
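For example, the INSERT from the question could then be run through the fixed method like this (names reused from the question, assuming it's called from the same object as before):
var rows = this.prepareStatementAndCompileResults(
    db,
    `INSERT OR IGNORE INTO tag_event (tag_id, event_id, unique_string)
     VALUES (:tag_id, :event_id, :unique);`,
    { [':tag_id']: 1, [':event_id']: 1, [':unique']: '1-1' }
);
// rows stays empty for an INSERT; for a SELECT it would contain the matching records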

how to check if any of given queries return any result in knex

I have two queries:
a) select id from ingredietns where name = my_param;
b) select word_id from synonyms where name = my_param;
Both return 0 or 1 rows. I can also add limit 1 if needed (or first() in knex).
I can translate each into knex like this:
knex("ingredients").select('id').where('name', my_param) //do we need first()?
knex("synonyms").select('word_id').where('name', my_param) //do we need first()?
I need a function called "ingredientGetOrCreate(my_param)". This function would:
a) check if either of the above queries returns a result
b) if one of them does, return ingredients.id or synonyms.word_id - only one could be returned
c) if the record doesn't exist in either table, do a knex insert and return the newly added id from the function
d) later on, I'm also not sure I understand how to call this newly created function.
The function ingredientGetOrCreate would be used later as a separate function or in the following scenario (like a "loop") that doesn't work for me either:
knex("products") // for each product
.select("id", "name")
.map(function (row) {
var descriptionSplitByCommas = row.desc.split(",");
Promise.all(descriptionSplitByCommas
.map(function (my_param) {
// here it comes - call method for each param and do insert
ingredientGetOrCreate(my_param)
.then(function (id_of_ingredient) {
knex('ingredients_products').insert({ id_of_ingredient });
});
...
I am stuck with knex and Promise queries because of the asynchronous part. Any clues, please?
I thought I could somehow use Promise.all or Promise.some to call both queries.
P.S. This is my first day with node.js, Promises and knex.
As far as I decode your question, it consists of two parts:
(1) You need to implement upsert logic (get-or-create logic).
(2) Your get part requires querying not a single table, but a pair of tables in a specific order. The table names imply that this is some sort of aliasing engine inside your application.
Let's start with (2). This could definitely be solved with two queries, just as you sense.
// pull the first matching value out of a result set (null if there are no rows)
function pick_name (rows)
{
    if (! rows.length) return null
    return rows[0].name
}

// you can sequence the queries
function ingredient_get (name)
{
    return knex('ingredients')
        .select('id as name').where('name', name)   // alias the column so pick_name can read it
        .then(pick_name)
        .then(found =>                               // a different variable, so the original `name` isn't shadowed
        {
            if (found) return found
            return knex('synonyms')
                .select('word_id as name').where('name', name)
                .then(pick_name)
        })
}

// or run them in parallel
function ingredient_get (name)
{
    var q_ingredients = knex('ingredients')
        .select('id as name').where('name', name)
        .then(pick_name)
    var q_synonyms = knex('synonyms')
        .select('word_id as name').where('name', name)
        .then(pick_name)
    return Promise.all([ q_ingredients, q_synonyms ])
        .then(([name1, name2]) =>
        {
            return name1 || name2
        })
}
Important notions here:
Both forms work well and return the first occurrence or JS' null.
The first form optimizes the number of queries to the DB.
The second form optimizes answer time.
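Either version is then consumed the same way, e.g. (with a made-up ingredient name):
ingredient_get('sugar')
    .then(found =>
    {
        // found is the matching ingredients.id / synonyms.word_id, or null
        console.log(found)
    })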
However, you can go deeper and use more SQL. There's a special tool for such a task called COALESCE. You can consult your SQL documentation; here's COALESCE in PostgreSQL 9. The main idea of COALESCE is to return the first non-NULL argument, or NULL otherwise. So you can leverage this to optimize both the number of queries and the answer time.
function ingredient_get (name)
{
    // preparing but not executing both queries
    var q_ingredients = knex('ingredients')
        .select('id').where('name', name)
    var q_synonyms = knex('synonyms')
        .select('word_id').where('name', name)
    // put them in COALESCE
    return knex.raw('SELECT COALESCE(?, ?) AS name', [ q_ingredients, q_synonyms ])
        .then(pick_name)
}
This solution guarantees a single query, and furthermore the DB engine can optimize execution in any way it sees appropriate.
Now let's solve (1): we now have ingredient_get(name), which returns Promise<string | null>. We can use its output to trigger the create logic or return our value.
function ingredient_get_or_create (name, data)
{
    return ingredient_get(name)
        .then(found =>               // again, avoid shadowing the `name` we want to insert and return
        {
            if (found) return found
            // …do your insert logic here
            return knex('ingredients').insert({ name, ...data })
                // guarantee homogeneous output across get/create calls:
                .then(() => name)
        })
}
Now ingredient_get_or_create does your desired upsert logic.
UPD1:
We already have ingredient_get_or_create, which returns Promise<name> in either scenario (get or create).
a) If you need to do any specific logic after that, you can just use then:
ingredient_get_or_create(…)
    .then(() => knex('another_table').insert(…))
    .then(/* another logic after all */)
In promise language that means «do that action (then) if the previous one was OK (ingredient_get_or_create)». In most cases that is what you need.
b) To implement a for-loop with promises you have multiple different idioms:
// use some form of parallelism
var qs = [ 'name1', 'name2', 'name3' ]
    .map(name =>
    {
        return ingredient_get_or_create(name, data)
    })
var q = Promise.all(qs)
Please note that this is aggressive parallelism and you'll get as many parallel queries in flight as your input array provides.
If that's not desired, you need to limit the parallelism or even run the tasks sequentially. Bluebird's Promise.map is a way to run a map analogous to the example above, but with a concurrency option available. Consult the docs for details.
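For example (a sketch using that concurrency option):
// at most 3 ingredient lookups/inserts in flight at any time
Promise.map([ 'name1', 'name2', 'name3' ],
    (name) => ingredient_get_or_create(name, data),
    { concurrency: 3 })
    .then(/* logic after all limited-parallel calls are OK */)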
There's also Bluebird's Promise.mapSeries, which conceptually is an analogue of a for-loop but with promises: a map that runs sequentially. See the docs for details.
Promise.mapSeries([ 'name1', 'name2', 'name3' ],
    (name) => ingredient_get_or_create(name, data))
    .then(/* logic after all mapSeries are OK */)
I believe the last is what you need.

SuiteScript 2 result set

I am writing a script in 2.0, here is the snippet.
var resultSet = [];
var result = search.load({
    id: 'customsearch_autosend_statement_invoice'
}).run().each(function( item ) {
    resultSet.push(item);
});
If I run this saved search in the normal interface I get lots of rows, 2000 plus, but if I run this code I get back 1 row; even using the each function and adding the items to another array, I only get one row. I don't see anything in the documentation about this. Can anyone tell me why this is? I'm stumped. Thanks in advance for any help.
I found the answer, though not because I should have seen it: the examples make no attempt to tell you that you have to return true from the each callback in order to keep receiving rows. So the answer is that at the end of the "each" function you must return a true value to receive the next row, like this (thanks for your effort if I missed your post):
var resultSet = [];
var result = search.load({
    id: 'customsearch_autosend_statement_invoice'
}).run().each(function( item ) {
    resultSet.push(item);
    return true;
});
It's detailed in the documentation: the callback function returns a boolean which can be used to stop or continue the iteration:
Use a developer-defined function to invoke on each row in the search results, up to 4000 results at a time. The callback function must use the following signature: boolean callback(result.Result result); The callback function takes a search.Result object as an input parameter and returns a boolean which can be used to stop the iteration with a value of false, or continue the iteration with a value of true.
...
mySearch.run().each(function(result) {
    var entity = result.getValue({
        name: 'entity'
    });
    var subsidiary = result.getValue({
        name: 'subsidiary'
    });
    return true;
});
...
ResultSet.each(callback)
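Since the return value controls the iteration, it can also be used to stop early; for example, to collect only the first 100 results (a sketch following the same pattern as the code above):
var firstHundred = [];
search.load({
    id: 'customsearch_autosend_statement_invoice'
}).run().each(function(result) {
    firstHundred.push(result);
    return firstHundred.length < 100; // true keeps iterating, false stops
});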

synchronous execution of database function in node.js

There are two collections, products and stock, in my database. Each product has multiple stock entries, with different suppliers and product attributes.
I have selected the products from the products collection and run a loop for each product. I need to append the price and offer price from the stock collection to the products I have already selected.
I think the for loop completes its execution before executing the find method on the stock collection. I need to execute everything in the loop in a serial manner (not asynchronously). Please check the code below and help me to solve this. I'm new to node.js and MongoDB.
collection = db.collection('products');
collection.find().toArray(function(err, abc) {
    var finalout = [];
    for (var listProducts in abc) {
        var subfinal = {
            '_id'         : abc[listProducts]['_id'],
            'min_price'   : abc[listProducts]['value']['price'],
            'stock'       : abc[listProducts]['value']['stock'],
            'name'        : abc[listProducts]['value']['name'],
            'price'       : '',
            'offer_price' : '',
        };
        collection = db.collection('stock');
        collection.find({ "product": abc[listProducts]['_id'], "supplier": abc[listProducts]['value']['def_supplier'] }).toArray(function(err, newprod) {
            for (var row in newprod) {
                subfinal['price'] = newprod[row]['price'];
                subfinal.offer_price = newprod[row]['offer_price'];
            }
            finalout.push(subfinal);
        });
    }
    console.log(finalout);
});
Yes, the loop is running and starting the get requests on the database all at the same time. It is possible, as you said, to run them all sequentially; however, it's probably not what you're looking for. Doing it this way would take more time, since each request to Mongo would need to wait for the previous one to finish.
Considering that each iteration of the loop doesn't depend on the previous ones, all you're really looking for is a way to know once ALL of the operations are finished. Keep in mind that these can end in any order, not necessarily the order in which they were initiated in the loop.
There was also an issue in the creation of the subfinal variable. Since the same variable name is being used for all iterations, when they all come back they'll be using the final assignment of the variable (all results would be written to the same subfinal, which would have been pushed multiple times into the result). To fix that, I've wrapped the entire item-processing iteration in another function to give each subfinal variable its own scope.
There are plenty of modules to ease this kind of management, but here is one way, using a counter and a callback and no additional installs, to run some code only after all the Mongo calls are finished:
collection = db.collection('products');
collection.find().toArray(function(err, abc) {
    var finalout = [];
    // the counters and helpers live inside this callback so they can see abc and finalout
    var finished = 0;                        // incremented every time an iteration is done
    var total = Object.keys(abc).length;     // number of products to process

    for (var listProducts in abc) {
        processItem(listProducts);
    }

    function processItem(listProducts) {
        var subfinal = {
            '_id'         : abc[listProducts]['_id'],
            'min_price'   : abc[listProducts]['value']['price'],
            'stock'       : abc[listProducts]['value']['stock'],
            'name'        : abc[listProducts]['value']['name'],
            'price'       : '',
            'offer_price' : '',
        };
        collection = db.collection('stock');
        collection.find({ "product": abc[listProducts]['_id'], "supplier": abc[listProducts]['value']['def_supplier'] }).toArray(function(err, newprod) {
            for (var row in newprod) {
                subfinal['price'] = newprod[row]['price'];
                subfinal.offer_price = newprod[row]['offer_price'];
            }
            finalout.push(subfinal);
            finished++;
            if (finished === total) { // if this is the last one to finish
                allFinished(finalout);
            }
        });
    }

    function allFinished(finalout) {
        console.log(finalout);
    }
});
It could be written more concisely with fewer variables, but it should be easier to understand this way.
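As a side note, if the MongoDB driver version you use returns promises when toArray() is called without a callback, the same "wait for all of them" idea can be expressed with Promise.all (a sketch under that assumption, keeping the question's data layout and its "last stock row wins" behaviour):
db.collection('products').find().toArray()
    .then(function (products) {
        // start one stock lookup per product and wait for all of them
        return Promise.all(products.map(function (product) {
            return db.collection('stock')
                .find({ "product": product._id, "supplier": product.value.def_supplier })
                .toArray()
                .then(function (stock) {
                    var last = stock[stock.length - 1]; // same "last row wins" as the original loop
                    return {
                        '_id'         : product._id,
                        'min_price'   : product.value.price,
                        'stock'       : product.value.stock,
                        'name'        : product.value.name,
                        'price'       : last ? last.price : '',
                        'offer_price' : last ? last.offer_price : ''
                    };
                });
        }));
    })
    .then(function (finalout) {
        console.log(finalout);
    })
    .catch(console.error);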
