Is there any way of chaining queries dynamically?
For example, given the following GET request
/collection?field1=value1&field2=value2&sort=field3 asc
It is easy without the sort query
/collection?field1=value1&field2=value2
var query = {}
for (var key in req.query) {
    query[key] = req.query[key]
}
Collection.find(query)
But how do I build the query if the GET request has optional keys such as sort, expand, and select, which map to Collection.sort, Collection.populate, and Collection.select respectively?
In other words, suppose you have a dynamic array of Query methods:
queries = [populate, select, sort]
Would the solution be the following:
var query = Collection.find()
for (var q of queries)
    query = query[q](/* arguments for that method */)
You just iterate through the query parameters and separate out the ones that are operations versus actual query criteria. Using your examples:
// sample data for req.query
const req = {
    query: {
        sort: "field9",
        field1: "someValue",
        field2: "otherValue",
        field3: "highValue"
    }
};
const queries = new Map();
const operations = new Map([
    ["populate", false],
    ["sort", false],
    ["select", false]
]);
for (const [key, value] of Object.entries(req.query)) {
    if (operations.has(key)) {
        operations.set(key, value);
    } else {
        queries.set(key, value);
    }
}
// here:
// queries contains the non-operation pairs
// operations (where not false) contains the operation value,
// e.g. sort => "field9"
console.log("operations:");
for (let [key, value] of operations.entries()) {
    console.log(`${key} => ${value}`);
}
console.log("queries:");
for (let [key, value] of queries.entries()) {
    console.log(`${key} => ${value}`);
}
To run the query, you'd then check which operations are present and branch your code and query accordingly.
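For example, here is a minimal sketch of that branching, assuming Collection is a Mongoose model (so that find returns a chainable query and sort, select, and populate exist as query methods, as the question implies):
// Sketch only: Collection is assumed to be a Mongoose model, as in the question.
let dbQuery = Collection.find(Object.fromEntries(queries));
for (const [op, value] of operations.entries()) {
    if (value !== false) {
        // e.g. dbQuery = dbQuery.sort("field9")
        dbQuery = dbQuery[op](value);
    }
}
dbQuery.exec().then(docs => console.log(docs)).catch(console.error);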
Related
I'd like to write a wrapper function select(db: any, ids: number[]): Promise<Cat[]> that returns an array of Cat rows fetched from the DB by ID. The function should return the entire array of rows.
Below is one approach I've written. Instead of calling db.get on every ID in a for-loop as I do below, is it possible to pass my entire ids: number[] array as a parameter to a single db.all / db.each query?
// dbmethods.ts
async function select(db: any, ids: number[]): Promise<Cat[]> {
    let query = "SELECT * FROM cats_table WHERE id = ?;";
    let cats_back: Cat[] = [];
    for (let i = 0; i < ids.length; i++) {
        let cat: Promise<Cat> = new Promise(async function (resolve, reject) {
            await db.get(query, ids[i], (err: Error, row: any) => {
                if (err) {
                    reject(err);
                } else {
                    let cat: Cat = {
                        index: row.id,
                        cat_type: row.cat_type,
                        health: row.health,
                        num_paws: row.num_paws
                    };
                    resolve(cat);
                }
            });
        });
        cats_back.push(await cat);
    }
    return cats_back;
}
and
// index.ts
let ids = create_many_ids(10_000); // returns an array of unique ordered ints between 0 and 10K
let res = await select(db, ids);
console.log(res); // successfully prints my cats
Benchmarks on my select function above suggest that it takes 300ms to select 10_000 rows by ID. That seems a little long to me; 10K rows shouldn't take that long with SQLite's select-by-id. How can I be more efficient?
SELECT * FROM cats_table WHERE id IN (SELECT value FROM json_each(?));
The query parameter is a string representing a JSON array of ids, e.g., '[1, 2, 4]'
See this tutorial for further details
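For example, here is a rough sketch of how the whole id array could be bound as a single JSON string, assuming a node-sqlite3 style db handle like the one in the question (a sketch under those assumptions, not a drop-in replacement):
// Sketch: db is assumed to be a node-sqlite3 style handle, as in the question.
async function selectByIds(db: any, ids: number[]): Promise<any[]> {
    const query = "SELECT * FROM cats_table WHERE id IN (SELECT value FROM json_each(?));";
    return new Promise((resolve, reject) => {
        // One round trip: the whole ids array is bound as a single JSON string.
        db.all(query, JSON.stringify(ids), (err: Error, rows: any[]) => {
            if (err) reject(err);
            else resolve(rows);
        });
    });
}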
I have a field of type map that contains maps of data in Firestore.
I am trying to retrieve this data using a Cloud Function in Node.js. I can get the document and the data from the field, but I can't get it in a usable way. I have tried every solution I can find on SO and Google, but the below is the only code that gives me access to the data. I obviously need to be able to access each field within the map individually. In Swift I build an array of [String: Any], but I can't get that to work in Node.js.
const docRef = dbConst.collection('Comps').doc('XEVDk6e4AXZPkNprQRn5Imfcsah11598092006.724980');
return docRef.get().then(docSnap => {
    const tagets = docSnap.get('targets')
    console.log(tagets);
}).catch(result => { console.log(result) });
This is what I am getting back in the console.
In Swift I do the following and have so far not been able to find an equivalent in TypeScript. (I don't need to build the custom object, just the ability to access the keys and values.)
let obj1 = doc.get("targets") as! [String:Any]
for objs in obj1 {
    let obs = objs.value as! [String:Any]
    let targObj = compUserDetails(IDString: objs.key, activTarg: obs["ActivTarget"] as! Double, stepTarg: obs["StepTarget"] as! Double, name: obs["FullName"] as! String)
}
UPDATE
After spending a whole day working on it, I thought I had a solution using the below:
const docRef = dbConst.collection('Comps').doc('XEVDk6e4AXZPkNprQRn5Imfcsah11598092006.724980');
return docRef.get().then(docSnap => {
    const tagets = docSnap.get('targets') as [[string, any]];
    const newDataMap = [];
    for (let [key, value] of Object.entries(tagets)) {
        const tempMap = new Map<String,any>();
        console.log(key);
        const newreWorked = value;
        tempMap.set('uid', key);
        for (let [key1, value1] of Object.entries(newreWorked)) {
            tempMap.set(key1, value1);
            newDataMap.push(tempMap);
        };
    };
    newDataMap.forEach(element => {
        const name = element.get('FullName');
        console.log(name);
    });
});
However, the new data map has 6 separate mapped objects, 3 of each of the original objects from the cloud. I can now iterate through and get the data for a given key, but I have 3 times as many objects.
So after two days of searching and getting very close, I finally worked out a solution. It is very similar to the code above, but this works. It may not be the "correct" way, but it works; feel free to make other suggestions.
return docRef.get().then(docSnap => {
    const tagets = docSnap.get('targets') as [[string, any]];
    const newDatarray = [];
    for (let [key, value] of Object.entries(tagets)) {
        const tempMap = new Map<String,any>();
        const newreWorked = value;
        tempMap.set('uid', key);
        for (let [key1, value1] of Object.entries(newreWorked)) {
            tempMap.set(key1, value1);
        };
        newDatarray.push(tempMap);
    };
    newDatarray.forEach(element => {
        const name = element.get('FullName');
        const steps = element.get('StepTarget');
        const avtiv = element.get('ActivTarget');
        const UID = element.get('uid');
        console.log(name);
        console.log(steps);
        console.log(avtiv);
        console.log(UID);
    });
}).catch(result => { console.log(result) });
I made this into a little function that gets the underlying object from a map:
function getMappedValues(map) {
    var tempMap = {};
    for (const [key, value] of Object.entries(map)) {
        tempMap[key] = value;
    }
    return tempMap;
}
For an object with an array of maps in firestore, you can get the value of the first of those maps like so:
let doc = { // Example firestore document data
    items: {
        0: {
            id: "1",
            sr: "A",
        },
        1: {
            id: "2",
            sr: "B",
        },
        2: {
            id: "3",
            sr: "B",
        },
    },
};
console.log(getMappedValues(doc.items[0]));
which would read { id: '1', sr: 'A' }
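Applied to the targets field from the original question, the same Object.entries idea gives plain key/value access without building a Map (a sketch, reusing the docRef and field names from the question's snippets):
// Sketch: docRef and the target field names come from the question above.
return docRef.get().then(docSnap => {
    const targets = docSnap.get('targets');
    for (const [uid, target] of Object.entries(targets)) {
        // target.FullName, target.StepTarget, target.ActivTarget per the question
        console.log(uid, target.FullName, target.StepTarget, target.ActivTarget);
    }
}).catch(err => console.log(err));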
Running a Node.js serverless backend through AWS.
Main objective: to filter and list all LOCAL jobs (table items) that include the available services and zip codes provided to the filter.
I'm passing in multiple zip codes and multiple available services.
data.radius would be an array of zip codes, something like this: [ '93901', '93902', '93905', '93906', '93907', '93912', '93933', '93942', '93944', '93950', '95377', '95378', '95385', '95387', '95391' ]
data.availableServices would also be an array, something like this: ['Snow removal', 'Ice Removal', 'Salting', 'Same Day Response']
I am trying to make an API call that returns only items that have a matching zipCode from the array of zip codes provided by data.radius, and whose packageSelected has a match in the data.availableServices array provided.
API CALL
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";
export async function main(event, context) {
    const data = JSON.parse(event.body);
    const params = {
        TableName: "jobs",
        FilterExpression: "zipCode = :radius, packageSelected = :availableServices",
        ExpressionAttributeValues: {
            ":radius": data.radius,
            ":availableServices": data.availableServices
        }
    };
    try {
        const result = await dynamoDbLib.call("query", params);
        // Return the matching list of items in response body
        return success(result.Items);
    } catch (e) {
        return failure({ status: false });
    }
}
Do I need to map the array of zip codes and available services first for this to work?
Should I be using comparison operators?
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LegacyConditionalParameters.QueryFilter.html
Is a sort key value or partition key required to query and filter? (The table has a sort key and partition key, but I would like to avoid using them in this call.)
I'm not 100% sure how to go about this, so if anyone could point me in the right direction that would be wonderful and greatly appreciated!
I'm not sure what your dynamodb-lib refers to but here's an example of how you can scan for attribute1 in a given set of values and attribute2 in a different set of values. This uses the standard AWS JavaScript SDK, and specifically the high-level document client.
Note that you cannot use an equality (==) test here; you have to use an inclusion (IN) test. And you cannot use query; you must use scan.
const AWS = require('aws-sdk');
let dc = new AWS.DynamoDB.DocumentClient({'region': 'us-east-1'});
const data = {
    radius: [ '93901', '93902', '93905', '93906', '93907', '93912', '93933', '93942', '93944', '93950', '95377', '95378', '95385', '95387', '95391' ],
    availableServices: ['Snow removal', 'Ice Removal', 'Salting', 'Same Day Response'],
};
// These hold ExpressionAttributeValues
const zipcodes = {};
const services = {};
data.radius.forEach((zipcode, i) => {
    zipcodes[`:zipcode${i}`] = zipcode;
})
data.availableServices.forEach((service, i) => {
    services[`:services${i}`] = service;
})
// These hold FilterExpression attribute aliases
const zipcodex = Object.keys(zipcodes).toString();
const servicex = Object.keys(services).toString();
const params = {
    TableName: "jobs",
    FilterExpression: `zipCode IN (${zipcodex}) AND packageSelected IN (${servicex})`,
    ExpressionAttributeValues : {...zipcodes, ...services},
};
dc.scan(params, (err, data) => {
    if (err) {
        console.log('Error', err);
    } else {
        for (const item of data.Items) {
            console.log('item:', item);
        }
    }
});
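If your dynamodb-lib wrapper simply proxies the method name and params to the document client (an assumption, since that library isn't shown in the question), the same filter could presumably be plugged into the question's handler roughly like this:
// Sketch only: assumes dynamoDbLib.call(method, params) proxies to the
// DocumentClient, which the question's code suggests but does not show.
const params = {
    TableName: "jobs",
    FilterExpression: `zipCode IN (${zipcodex}) AND packageSelected IN (${servicex})`,
    ExpressionAttributeValues: { ...zipcodes, ...services },
};
const result = await dynamoDbLib.call("scan", params);
return success(result.Items);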
I have three queries on Firestore based on a time range (24, 12, and 6 hours). I am using Promise.all and it works. As you can see from the code, I am accessing the result of each query by using an index into the returned snapshot array. I have read that the returned values will be in the order of the Promises passed, regardless of completion order.
Now, I want to be able to pass an object to Promise.all because my number of queries will be unpredictable: basically, I will be looping over a number of vehicles, building the same 3 queries for each, and passing them all to a single Promise.all. And when Promise.all returns, I want to be able to know which vehicle and time range each snapshot is for.
So instead of an array, I want to pass this object to Promise.all:
{"vehicle1_24":query, "vehicle1_12":query, "vehicle1_6":query,
"vehicle2_24":query, "vehicle2_12":query, "vehicle2_6":query}
code
var queries = [
    vehicleRef.collection('telemetry').where('time_stamp', '<', today).where('time_stamp', '>', yesterday).get(),
    vehicleRef.collection('telemetry').where('time_stamp', '<', today).where('time_stamp', '>', twelveHours).get(),
    vehicleRef.collection('telemetry').where('time_stamp', '<', today).where('time_stamp', '>', sixHours).get()
]
for (var i = 0; i < queries.length; i++) {
    queryResults.push(
        queries[i]
    )
}
Promise.all(queryResults)
    .then(snapShot => {
        const yesterdayResult = result => getEnergy(result);
        const twelveHourResult = result => getEnergy(result);
        const sixHourResult = result => getEnergy(result);
        allYesterdayResult += yesterdayResult(snapShot[0])
        allTwelveHourResult += twelveHourResult(snapShot[1])
        allSixHourResult += sixHourResult(snapShot[2])
        console.log("Done updating vehicle ", vehicle)
        // return res.send({"Result" : "Successful!"})
    }).catch(reason => {
        console.log(reason)
        // return res.send({"Result" : "Error!"})
    })
This feature does not exist natively, but should be fairly easy to write, something along the lines of
async function promiseAllObject(obj) {
    // Convert the object into an array of Promise<{ key: ..., value: ... }>
    const keyValuePromisePairs = Object.entries(obj).map(([key, valuePromise]) =>
        valuePromise.then(value => ({ key, value }))
    );
    // Awaits on all the promises, getting an array of { key: ..., value: ... }
    const keyValuePairs = await Promise.all(keyValuePromisePairs);
    // Turn it back into an object.
    return keyValuePairs.reduce(
        (result, { key, value }) => ({ ...result, [key]: value }),
        {}
    );
}
promiseAllObject({ foo: Promise.resolve(42), bar: Promise.resolve(true) })
    .then(console.log); // { foo: 42, bar: true }
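Applied to the question's vehicle queries, the keys then tell you which vehicle and time range each snapshot belongs to (a sketch, reusing the query-building code and getEnergy from the question):
// Sketch: vehicleRef, today, yesterday, twelveHours and getEnergy come from the question.
promiseAllObject({
    vehicle1_24: vehicleRef.collection('telemetry').where('time_stamp', '<', today).where('time_stamp', '>', yesterday).get(),
    vehicle1_12: vehicleRef.collection('telemetry').where('time_stamp', '<', today).where('time_stamp', '>', twelveHours).get(),
}).then(snapshots => {
    // e.g. snapshots.vehicle1_24 is the 24-hour snapshot for vehicle 1
    console.log(getEnergy(snapshots.vehicle1_24));
});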
You can use the following code to transform your object into an array that you will pass to Promise.all()
var queriesObject = {"vehicle1_24":query, "vehicle1_12":query, "vehicle1_6":query, "vehicle2_24":query, "vehicle2_12":query, "vehicle2_6":query};
// Of course, queriesObject can be an object with any number of elements
var queries = [];
for (var key in queriesObject) {
    if (queriesObject.hasOwnProperty(key)) {
        queries.push(queriesObject[key]);
    }
}
Promise.all(queries);
You will receive the results of Promise.all as an array of fulfillment values in the same order as the queries array; see Promise.all: Order of resolved values and https://www.w3.org/2001/tag/doc/promises-guide#aggregating-promises
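If you also need to know which key each result belongs to (the original concern), you can zip the results back to the keys by index, since Object.keys returns the own keys in the same order the for...in loop above pushed the values (a sketch using the queriesObject from this answer):
// Sketch: relies on queries being pushed in Object.keys order, as in the loop above.
Promise.all(queries).then(results => {
    const keys = Object.keys(queriesObject);
    results.forEach((snapshot, i) => {
        // results[i] came from queries[i], which came from keys[i]
        console.log(keys[i], snapshot);
    });
});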
I'm writing a small utility to copy data from one sqlite database file to another. Both files have the same table structure - this is entirely about moving rows from one db to another.
My code right now:
let tables: Array<string> = [
    "OneTable", "AnotherTable", "DataStoredHere", "Video"
]
tables.forEach((table) => {
    console.log(`Copying ${table} table`);
    sourceDB.each(`select * from ${table}`, (error, row) => {
        console.log(row);
        destDB.run(`insert into ${table} values (?)`, ...row) // this is the problem
    })
})
row here is a JS object with all the keyed data from each table. I'm certain there's a simple way to do this that doesn't involve escaping stringified data.
If your database driver has not blocked ATTACH, you can simply tell the database to copy everything:
ATTACH '/some/where/source.db' AS src;
INSERT INTO main.MyTable SELECT * FROM src.MyTable;
You could iterate over the row and set up the query with dynamically generated parameters and references.
let tables: Array<string> = [
    "OneTable", "AnotherTable", "DataStoredHere", "Video"
]
tables.forEach((table) => {
    console.log(`Copying ${table} table`);
    sourceDB.each(`select * from ${table}`, (error, row) => {
        console.log(row);
        const keys = Object.keys(row); // ['column1', 'column2']
        const columns = keys.toString(); // 'column1,column2'
        let parameters = {};
        let values = '';
        // Generate values and named parameters
        keys.forEach((r) => {
            var key = '$' + r;
            // Generates '$column1,$column2' (no leading comma)
            values = values ? values.concat(',', key) : key;
            // Generates { $column1: 'foo', $column2: 'bar' }
            parameters[key] = row[r];
        });
        // SQL: insert into OneTable (column1,column2) values ($column1,$column2)
        // Parameters: { $column1: 'foo', $column2: 'bar' }
        destDB.run(`insert into ${table} (${columns}) values (${values})`, parameters);
    })
})
Tried editing the answer by #Cl., but was rejected. So, adding on to the answer, here's the JS code to achieve the same:
let sqlite3 = require('sqlite3-promise').verbose();
let sourceDBPath = '/source/db/path/logic.db';
let tables = ["OneTable", "AnotherTable", "DataStoredHere", "Video"];
let destDB = new sqlite3.Database('/your/dest/logic.db');
await destDB.runAsync(`ATTACH '${sourceDBPath}' AS sourceDB`);
await Promise.all(tables.map(table => {
    return new Promise(async (res, rej) => {
        await destDB.runAsync(`
            CREATE TABLE ${table} AS
            SELECT * FROM sourceDB.${table}`
        ).catch(e => {
            console.error(e);
            rej(e);
        });
        res('');
    })
}));
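If the destination tables already exist (as in the original question, where both files share the same schema), the same attach approach can insert into them instead of creating new ones; a sketch reusing the destDB handle and runAsync from the snippet above:
// Sketch: after the ATTACH above, copy rows into the existing tables
// (reuses tables, destDB and runAsync from the snippet above).
for (const table of tables) {
    await destDB.runAsync(`INSERT INTO ${table} SELECT * FROM sourceDB.${table}`);
}
await destDB.runAsync(`DETACH sourceDB`);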