Running a Node.js serverless backend through AWS.
Main objective: to filter and list all LOCAL jobs (table items) that include the available services and zip codes provided to the filter.
I'm passing in multiple zip codes and multiple available services.
data.radius would be an array of zip codes, equal to something like this: [ '93901', '93902', '93905', '93906', '93907', '93912', '93933', '93942', '93944', '93950', '95377', '95378', '95385', '95387', '95391' ]
data.availableServices would also be an array, equal to something like this: ['Snow removal', 'Ice Removal', 'Salting', 'Same Day Response']
I am trying to make an API call that returns only items whose zipCode matches one of the zip codes provided by data.radius, and whose packageSelected matches one of the services provided by data.availableServices.
API CALL
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";

export async function main(event, context) {
  const data = JSON.parse(event.body);
  const params = {
    TableName: "jobs",
    FilterExpression: "zipCode = :radius, packageSelected = :availableServices",
    ExpressionAttributeValues: {
      ":radius": data.radius,
      ":availableServices": data.availableServices
    }
  };
  try {
    const result = await dynamoDbLib.call("query", params);
    // Return the matching list of items in response body
    return success(result.Items);
  } catch (e) {
    return failure({ status: false });
  }
}
Do I need to map the array of zip codes and available services first for this to work?
Should I be using comparison operators?
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LegacyConditionalParameters.QueryFilter.html
Is a sort key value or partition key required to query and filter? (The table has a sort key and partition key, but I would like to avoid using them in this call.)
I'm not 100% sure how to go about this, so if anyone could point me in the right direction that would be wonderful and greatly appreciated!
I'm not sure what your dynamodb-lib refers to, but here's an example of how you can scan for attribute1 in a given set of values and attribute2 in a different set of values. This uses the standard AWS JavaScript SDK, specifically the high-level document client.
Note that you cannot use an equality (==) test here; you have to use an inclusion (IN) test. And you cannot use query, but must use scan: a Query always requires an equality condition on the partition key, which you want to avoid in this call.
const AWS = require('aws-sdk');

let dc = new AWS.DynamoDB.DocumentClient({'region': 'us-east-1'});

const data = {
  radius: [ '93901', '93902', '93905', '93906', '93907', '93912', '93933', '93942', '93944', '93950', '95377', '95378', '95385', '95387', '95391' ],
  availableServices: ['Snow removal', 'Ice Removal', 'Salting', 'Same Day Response'],
};

// These hold ExpressionAttributeValues
const zipcodes = {};
const services = {};

data.radius.forEach((zipcode, i) => {
  zipcodes[`:zipcode${i}`] = zipcode;
});

data.availableServices.forEach((service, i) => {
  services[`:services${i}`] = service;
});

// These hold FilterExpression attribute aliases
const zipcodex = Object.keys(zipcodes).toString();
const servicex = Object.keys(services).toString();

const params = {
  TableName: "jobs",
  FilterExpression: `zipCode IN (${zipcodex}) AND packageSelected IN (${servicex})`,
  ExpressionAttributeValues: {...zipcodes, ...services},
};

dc.scan(params, (err, data) => {
  if (err) {
    console.log('Error', err);
  } else {
    for (const item of data.Items) {
      console.log('item:', item);
    }
  }
});
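If you want to drop this into an async handler like the one in the question, the same scan can be awaited instead of using a callback. A minimal sketch, assuming the params object built above:

// Same scan, promise style, for an async/await handler.
async function run() {
  const result = await dc.scan(params).promise();
  console.log(result.Items);
}
run().catch(console.error);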
I want to get data from DynamoDB, sorted by timestamp. Can anyone help? My code is given below.
const AWS = require("aws-sdk");
const dynamoDbClient = new AWS.DynamoDB.DocumentClient();
const USERS_TABLE = process.env.USERS_TABLE;

const getNews = async (req, res) => {
  try {
    //dynamodb params
    const params = {
      TableName: USERS_TABLE,
      FilterExpression: "PK = :this",
      ExpressionAttributeValues: { ":this": "newsTable" },
    };
    //get dynamodb data
    const data = await dynamoDbClient.scan(params).promise();
    res.status(200).send({ data: data });
  } catch (e) {
    return res.status(400).send({ message: e.message });
  }
};

module.exports = { getNews };
Option 1: Keep Scan; Sort client-side
Works for small tables only: a single Scan call reads only the first 1 MB of data in the table.
If you're doing a Scan operation as in your code example, it's impossible to get the results sorted by DynamoDB. The only way to sort them is client-side, after you download all your data from the database.
Replace:
res.status(200).send({ data: data });
With:
res.status(200).send({ data: data.Items.sort((a, b) => b.date - a.date) });
However, this is not recommended: a Scan operation without pagination reads only the first 1 MB of data in your table, so you could get partial results. Possible solutions are:
Option 2: (recommended) Don't use Scan; use Query; sort by sort key
This will work if you have your timestamp in the sort key of the table.
Don't use Scan; use Query -- that way you can sort your data by the sort key (SK) by passing ScanIndexForward: false to get the most recent results first.
Assuming you have a table schema like this, where a timestamp is in the sort key:
PK          SK          email
newsTable   2022-01-01  some-1@example.com
newsTable   2022-02-01  some-2@example.com
newsTable   2022-03-01  some-3@example.com
You can change your code from:
const params = {
  TableName: USERS_TABLE,
  FilterExpression: 'PK = :this',
  ExpressionAttributeValues: {':this': 'newsTable'},
};
//get dynamodb data
const data = await dynamoDbClient.scan(params).promise();
To:
const params = {
  TableName: USERS_TABLE,
  KeyConditionExpression: 'PK = :this',
  ExpressionAttributeValues: {':this': 'newsTable'},
  ScanIndexForward: false,
};
//get dynamodb data
const data = await dynamoDbClient.query(params).promise();
And it will return results already sorted by the database.
If you don't have a timestamp in your sort key, and you cannot add it, you can add a Local Secondary Index or a Global Secondary Index.
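Querying such an index looks almost the same as Option 2. A sketch, assuming a hypothetical GSI named byDate whose partition key is PK and whose sort key holds the timestamp:

// Hypothetical GSI 'byDate': partition key PK, sort key = a timestamp attribute.
const params = {
  TableName: USERS_TABLE,
  IndexName: 'byDate',
  KeyConditionExpression: 'PK = :this',
  ExpressionAttributeValues: { ':this': 'newsTable' },
  ScanIndexForward: false, // newest first
};
const data = await dynamoDbClient.query(params).promise();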
Option 3: Keep Scan, but paginate; Sort client-side
Works if you cannot change the DB schema and cannot switch your code to the Query operation.
Beware: it will be much more expensive and much slower. The larger the table, the slower it gets.
If you absolutely need to use Scan, you need to paginate through all the pages of the Scan operation and then sort the results in your JS code, as described before. I've developed a handy library that scans in parallel and supports pagination.
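A minimal sketch of that paginate-then-sort approach, reusing the params object from the Scan example above:

// Scan returns at most 1 MB per call; follow LastEvaluatedKey to read everything.
async function scanAll(params) {
  const items = [];
  let lastKey;
  do {
    const page = await dynamoDbClient
      .scan({ ...params, ExclusiveStartKey: lastKey })
      .promise();
    items.push(...page.Items);
    lastKey = page.LastEvaluatedKey;
  } while (lastKey);
  return items;
}

// Inside an async handler:
// const items = await scanAll(params);
// items.sort((a, b) => b.date - a.date); // assumes a numeric `date` attribute, as in Option 1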
I have a field of type map that contains maps of data in Firestore.
I am trying to retrieve this data using a Cloud Function in Node.js. I can get the document and the data from the field, but I can't get it in a usable way. I have tried every solution I can find on SO and Google, but the below is the only code that gives me access to the data. I obviously need to be able to access each field within the map individually. In Swift I build an array of String:Any, but I can't get that to work in Node.js.
const docRef = dbConst.collection('Comps').doc('XEVDk6e4AXZPkNprQRn5Imfcsah11598092006.724980');

return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets');
  console.log(targets);
}).catch(result => { console.log(result) });
This is what I am getting back in the console.
In Swift I do the following, and so far I have not been able to find an equivalent in TypeScript. (I don't need to build the custom object, just the ability to access the keys and values.)
let obj1 = doc.get("targets") as! [String: Any]
for objs in obj1 {
    let obs = objs.value as! [String: Any]
    let targObj = compUserDetails(IDString: objs.key, activTarg: obs["ActivTarget"] as! Double, stepTarg: obs["StepTarget"] as! Double, name: obs["FullName"] as! String)
}
UPDATE
After spending a whole day working on it, I thought I had a solution using the below:
const docRef = dbConst.collection('Comps').doc('XEVDk6e4AXZPkNprQRn5Imfcsah11598092006.724980');

return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets') as [[string, any]];
  const newDataMap = [];
  for (let [key, value] of Object.entries(targets)) {
    const tempMap = new Map<String, any>();
    console.log(key);
    const newreWorked = value;
    tempMap.set('uid', key);
    for (let [key1, value1] of Object.entries(newreWorked)) {
      tempMap.set(key1, value1);
      newDataMap.push(tempMap); // note: this push sits inside the inner loop
    };
  };
  newDataMap.forEach(element => {
    const name = element.get('FullName');
    console.log(name);
  });
}).catch(result => { console.log(result) });
However, the new data map has 6 separate mapped objects, 3 of each of the original objects from the cloud. I can now iterate through and get the data for a given key, but I have 3 times as many objects.
So after two days of searching and getting very close, I finally worked out a solution. It is very similar to the code above, but this works. It may not be the "correct" way, but it works. Feel free to make other suggestions.
return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets') as [[string, any]];
  const newDataArray = [];
  for (let [key, value] of Object.entries(targets)) {
    const tempMap = new Map<String, any>();
    const newreWorked = value;
    tempMap.set('uid', key);
    for (let [key1, value1] of Object.entries(newreWorked)) {
      tempMap.set(key1, value1);
    };
    newDataArray.push(tempMap); // push now happens once per target, outside the inner loop
  };
  newDataArray.forEach(element => {
    const name = element.get('FullName');
    const steps = element.get('StepTarget');
    const activ = element.get('ActivTarget');
    const uid = element.get('uid');
    console.log(name);
    console.log(steps);
    console.log(activ);
    console.log(uid);
  });
}).catch(result => { console.log(result) });
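For what it's worth, the same flattening can be done without Map at all. A compact sketch, assuming the same 'targets' shape as above:

return docRef.get().then(docSnap => {
  const targets = docSnap.get('targets');
  // Flatten { uid: { FullName, StepTarget, ActivTarget } } into an array of plain objects.
  const list = Object.entries(targets).map(([uid, t]) => ({ uid, ...t }));
  for (const { uid, FullName, StepTarget, ActivTarget } of list) {
    console.log(uid, FullName, StepTarget, ActivTarget);
  }
}).catch(result => { console.log(result) });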
I made this into a little function that gets the underlying object from a map:
function getMappedValues(map) {
  var tempMap = {};
  for (const [key, value] of Object.entries(map)) {
    tempMap[key] = value;
  }
  return tempMap;
}
For an object with an array of maps in Firestore, you can get the value of the first of those maps like so:
let doc = { // Example firestore document data
  items: {
    0: {
      id: "1",
      sr: "A",
    },
    1: {
      id: "2",
      sr: "B",
    },
    2: {
      id: "3",
      sr: "B",
    },
  },
};

console.log(getMappedValues(doc.items[0]));
which would read { id: '1', sr: 'A' }
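Worth noting: since the map field arrives in Node.js as a plain object, a shallow copy gives the same result:

const firstItem = { ...doc.items[0] }; // also { id: '1', sr: 'A' }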
I'm a newbie at Node.js, and I'm using Node.js (v8.7.0), sqlite3 and Express.
I have two tables in a SQLite database:
releases (id, title, image)
links (id, url)
Each "release" has one or more "links" associated with it.
I can get all the releases using:
dbh.all("SELECT * FROM releases ORDER BY id DESC", (err, rows) => { ... })
And I can get all the links for a given release using:
dbh.all("SELECT * FROM links WHERE id = ?", (err, rows) => { ... })
But I can't figure out how to add a "links" property to each "release", containing its corresponding links, so that I can feed the resulting object to Mustache and generate an HTML page.
I know that storing hierarchical data inside of a relational database is not the best idea, and I could easily do this using PHP, but I really want to learn how to use NodeJS.
This is what I've come up with so far:
var sqlite3 = require("sqlite3")

function main() {
  db = new sqlite3.Database("releases.sqlite3")
  all = []
  db.each(
    "SELECT * FROM releases ORDER BY id DESC",
    (err, release) => {
      release.links = []
      db.all("SELECT url FROM links WHERE id = ?", [release.id], (err, links) => {
        links = links.map((e) => { return e.url })
        release.links = links
        // line above: tried
        // links.forEach((e) => { release.links.push(e.url) })
        // too, but that didn't work either.
      })
      all.push(release)
    },
    (complete) => { console.log(all) }
  )
}

main()
Though, when I run it, it inevitably shows:
links: []
Every time. How can I fix this?
Thank you in advance.
Edit 1:
This SQL snippet generates the database, and populates it with some data.
CREATE TABLE `links` ( `id` TEXT, `url` TEXT );
CREATE TABLE `releases` ( `id` TEXT, `title` TEXT, `image` TEXT );
INSERT INTO links VALUES
('rel-001', 'https://example.com/mirror1'),
('rel-001', 'https://example.com/mirror2');
INSERT INTO releases VALUES
('rel-001', 'Release 001', 'https://example.com/image.jpg');
The goal is to have something like this:
{
  releases: [
    {
      id: 'rel-001',
      title: 'Release 001',
      image: 'https://example.com/image.jpg',
      links: [
        'https://example.com/mirror1',
        'https://example.com/mirror2'
      ]
    }
  ]
}
Try to see if both queries are being executed by adding console.log in the callbacks. Moreover, you should push the links only within the second callback, since before that callback fires the value does not exist, so you would be pushing an empty value. Also, you don't need to initialize release.links = []. all will only be completely filled after all queries have executed, so we need to execute console.log(all) in the last child callback:
function main() {
  all = []
  var parentComplete = false;
  db.each("SELECT * FROM releases ORDER BY id DESC", (err, release) => {
    db.all("SELECT url FROM links WHERE id = ?", [release.id], (err, links) => {
      release.links = links.map(e => e.url);
      all.push(release);
      if (parentComplete) {
        console.log(all);
      }
    })
  },
  (complete) => {
    parentComplete = true;
  })
}

main();
P.S. In order to get the result you want, you will need to initialize all as an object: all = {releases: []}
function main() {
  all = {releases: []};
  var parentComplete = false;
  db.each("SELECT * FROM releases ORDER BY id DESC", (err, release) => {
    db.all("SELECT url FROM links WHERE id = ?", [release.id], (err, links) => {
      release.links = links.map(e => e.url);
      all.releases.push(release);
      if (parentComplete) {
        console.log(all);
      }
    })
  },
  (complete) => {
    parentComplete = true;
  })
}

main();
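An alternative worth sketching: because both tables share the id column, a single LEFT JOIN avoids the callback-sequencing problem entirely. A sketch, assuming the schema from the question:

db.all(
  `SELECT r.id, r.title, r.image, l.url
     FROM releases r LEFT JOIN links l ON l.id = r.id
    ORDER BY r.id DESC`,
  (err, rows) => {
    if (err) { console.log(err); return; }
    // Group the flat rows into one release object per id.
    const byId = new Map();
    for (const { id, title, image, url } of rows) {
      if (!byId.has(id)) byId.set(id, { id, title, image, links: [] });
      if (url) byId.get(id).links.push(url);
    }
    console.log({ releases: [...byId.values()] });
  }
);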
I'm writing a small utility to copy data from one sqlite database file to another. Both files have the same table structure - this is entirely about moving rows from one db to another.
My code right now:
let tables: Array<string> = [
  "OneTable", "AnotherTable", "DataStoredHere", "Video"
]

tables.forEach((table) => {
  console.log(`Copying ${table} table`);
  sourceDB.each(`select * from ${table}`, (error, row) => {
    console.log(row);
    destDB.run(`insert into ${table} values (?)`, ...row) // this is the problem
  })
})
row here is a JS object with all the keyed data from each table. I'm certain that there's a simple way to do this that doesn't involve escaping stringified data.
If your database driver has not blocked ATTACH, you can simply tell the database to copy everything:
ATTACH '/some/where/source.db' AS src;
INSERT INTO main.MyTable SELECT * FROM src.MyTable;
You could iterate over the row and set up the query with dynamically generated parameters and references.
let tables: Array<string> = [
  "OneTable", "AnotherTable", "DataStoredHere", "Video"
]

tables.forEach((table) => {
  console.log(`Copying ${table} table`);
  sourceDB.each(`select * from ${table}`, (error, row) => {
    console.log(row);
    const keys = Object.keys(row); // ['column1', 'column2']
    const columns = keys.toString(); // 'column1,column2'
    let parameters = {};
    let values = [];
    // Generate values and named parameters
    keys.forEach((r) => {
      var key = '$' + r;
      // Generates ['$column1', '$column2']
      values.push(key);
      // Generates { $column1: 'foo', $column2: 'bar' }
      parameters[key] = row[r];
    });
    // SQL: insert into OneTable (column1,column2) values ($column1,$column2)
    // Parameters: { $column1: 'foo', $column2: 'bar' }
    // join() avoids the leading comma a string concat would produce
    destDB.run(`insert into ${table} (${columns}) values (${values.join(',')})`, parameters);
  })
})
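An equivalent sketch with positional placeholders, since sqlite3's run() also accepts an array of values:

tables.forEach((table) => {
  sourceDB.each(`select * from ${table}`, (error, row) => {
    const keys = Object.keys(row);
    const placeholders = keys.map(() => '?').join(','); // '?,?'
    destDB.run(
      `insert into ${table} (${keys.join(',')}) values (${placeholders})`,
      keys.map((k) => row[k])
    );
  });
});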
Tried editing the answer by @CL., but it was rejected. So, adding on to that answer, here's the JS code to achieve the same:
let sqlite3 = require('sqlite3-promise').verbose();

let sourceDBPath = '/source/db/path/logic.db';
let tables = ["OneTable", "AnotherTable", "DataStoredHere", "Video"];
let destDB = new sqlite3.Database('/your/dest/logic.db');

async function copyTables() {
  await destDB.runAsync(`ATTACH '${sourceDBPath}' AS sourceDB`);
  await Promise.all(tables.map(table =>
    destDB.runAsync(`
      CREATE TABLE ${table} AS
      SELECT * FROM sourceDB.${table}`
    ).catch(e => {
      console.error(e);
      throw e;
    })
  ));
}

copyTables();
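One caveat, as an observation on top of this answer: if the destination tables already exist, as the question implies, CREATE TABLE ... AS will fail. In that case the INSERT form from the SQL answer above is the one to use:

// Copy rows into an existing table instead of creating a new one.
await destDB.runAsync(`INSERT INTO ${table} SELECT * FROM sourceDB.${table}`);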
I am using NodeJS, PostgreSQL and the amazing pg-promise library. In my case, I want to execute three main queries:
Insert one tweet in the table 'tweets'.
In case there are hashtags in the tweet, insert them into another table, 'hashtags'.
Then link both tweet and hashtag in a third table, 'hashtagmap' (a many-to-many relational table).
Here is a sample of the request's body (JSON):
{
  "id": "12344444",
  "created_at": "1999-01-08 04:05:06 -8:00",
  "userid": "#postman",
  "tweet": "This is the first test from postman!",
  "coordinates": "",
  "favorite_count": "0",
  "retweet_count": "2",
  "hashtags": {
    "0": {
      "name": "test",
      "relevancetraffic": "f",
      "relevancedisaster": "f"
    },
    "1": {
      "name": "postman",
      "relevancetraffic": "f",
      "relevancedisaster": "f"
    },
    "2": {
      "name": "bestApp",
      "relevancetraffic": "f",
      "relevancedisaster": "f"
    }
  }
}
All the fields above should be included in the table "tweets", except hashtags, which in turn should be included in the table "hashtags".
Here is the code I am using, based on Nested transactions from the pg-promise docs, inside a Node.js module. I guess I need nested transactions because I need to know both tweet_id and hashtag_id in order to link them in the hashtagmap table.
// Columns
var tweetCols = ['id','created_at','userid','tweet','coordinates','favorite_count','retweet_count'];
var hashtagCols = ['name','relevancetraffic','relevancedisaster'];

//pgp Column Sets
var cs_tweets = new pgp.helpers.ColumnSet(tweetCols, {table: 'tweets'});
var cs_hashtags = new pgp.helpers.ColumnSet(hashtagCols, {table: 'hashtags'});

return {
  // Transactions
  add: body =>
    rep.tx(t => {
      return t.one(pgp.helpers.insert(body, cs_tweets) + " ON CONFLICT(id) DO UPDATE SET coordinates = " + body.coordinates + " RETURNING id")
        .then(tweet => {
          var queries = [];
          for (var i = 0; i < body.hashtags.length; i++) {
            queries.push(
              t.tx(t1 => {
                return t1.one(pgp.helpers.insert(body.hashtags[i], cs_hashtags) + " ON CONFLICT(name) DO UPDATE SET fool = 'f' RETURNING id")
                  .then(hash => {
                    t1.tx(t2 => {
                      return t2.none("INSERT INTO hashtagmap(tweetid,hashtagid) VALUES(" + tweet.id + "," + hash.id + ") ON CONFLICT DO NOTHING");
                    });
                  });
              })
            );
          }
          return t.batch(queries);
        });
    })
}
The problem is that with this code I am able to successfully insert the tweet, but nothing happens after that. I cannot insert the hashtags, nor link the hashtags to the tweets.
Sorry, but I am new to coding, so I guess I didn't understand how to properly return from the transaction and how to perform this simple task. Hope you can help me.
Thank you in advance.
Jean
Improving on Jean Phelippe's own answer:
// Columns
var tweetCols = ['id', 'created_at', 'userid', 'tweet', 'coordinates', 'favorite_count', 'retweet_count'];
var hashtagCols = ['name', 'relevancetraffic', 'relevancedisaster'];

//pgp Column Sets
var cs_tweets = new pgp.helpers.ColumnSet(tweetCols, {table: 'tweets'});
var cs_hashtags = new pgp.helpers.ColumnSet(hashtagCols, {table: 'hashtags'});

return {
  /* Tweets */
  // Add a new tweet and update the corresponding hash tags
  add: body =>
    db.tx(t => {
      return t.one(pgp.helpers.insert(body, cs_tweets) + ' ON CONFLICT(id) DO UPDATE SET coordinates = ' + body.coordinates + ' RETURNING id')
        .then(tweet => {
          var queries = Object.keys(body.hashtags).map(key => {
            return t.one(pgp.helpers.insert(body.hashtags[key], cs_hashtags) + ' ON CONFLICT(name) DO UPDATE SET fool = $1 RETURNING id', 'f')
              .then(hash => {
                return t.none('INSERT INTO hashtagmap(tweetid, hashtagid) VALUES($1, $2) ON CONFLICT DO NOTHING', [+tweet.id, +hash.id]);
              });
          });
          return t.batch(queries);
        });
    })
      .then(data => {
        // transaction was committed;
        // data = [null, null,...] as per t.none('INSERT INTO hashtagmap...
      })
      .catch(error => {
        // transaction rolled back
      })
},
NOTES:
As per my notes earlier, you must chain all queries, or else you will end up with loose promises
Stay away from nested transactions, unless you understand exactly how they work in PostgreSQL (read this, and specifically the Limitations section).
Avoid manual query formatting, it is not safe; always rely on the library's query formatting (see the sketch after these notes).
Unless you are passing the result of the transaction somewhere else, you should at least provide the .catch handler.
P.S. Syntax like +tweet.id is the same as parseInt(tweet.id), just shorter, in case those are strings ;)
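To illustrate the note on query formatting: here is how the concatenated coordinates value in the upsert above could be passed as a formatting parameter instead. A sketch, not part of the original answer, to be used inside the same transaction callback:

// Same upsert, but the value is formatted by the library rather than concatenated.
t.one(
  pgp.helpers.insert(body, cs_tweets) +
    ' ON CONFLICT(id) DO UPDATE SET coordinates = $1 RETURNING id',
  [body.coordinates]
);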
For those who face a similar problem, I will post the answer.
Firstly, my mistakes:
In the for loop: body.hashtags.length doesn't exist, because I am dealing with an object (very basic mistake here). Changed to Object.keys(body.hashtags).length.
Why use so many transactions? Following the answer by vitaly-t in Interdependent Transactions with pg-promise, I removed the extra transactions. It's not yet clear to me how you can open one transaction and use the result of one query in another query within the same transaction.
Here is the final code:
// Columns
var tweetCols = ['id','created_at','userid','tweet','coordinates','favorite_count','retweet_count'];
var hashtagCols = ['name','relevancetraffic','relevancedisaster'];

//pgp Column Sets
var cs_tweets = new pgp.helpers.ColumnSet(tweetCols, {table: 'tweets'});
var cs_hashtags = new pgp.helpers.ColumnSet(hashtagCols, {table: 'hashtags'});

return {
  /* Tweets */
  // Add a new tweet and update the corresponding hashtags
  add: body =>
    rep.tx(t => {
      return t.one(pgp.helpers.insert(body, cs_tweets) + " ON CONFLICT(id) DO UPDATE SET coordinates = " + body.coordinates + " RETURNING id")
        .then(tweet => {
          var queries = [];
          for (var i = 0; i < Object.keys(body.hashtags).length; i++) {
            queries.push(
              t.one(pgp.helpers.insert(body.hashtags[i], cs_hashtags) + " ON CONFLICT(name) DO UPDATE SET fool = 'f' RETURNING id")
                .then(hash => {
                  t.none("INSERT INTO hashtagmap(tweetid,hashtagid) VALUES(" + tweet.id + "," + hash.id + ") ON CONFLICT DO NOTHING");
                })
            );
          }
          return t.batch(queries);
        });
    }),