Denormalizing in Firebase Cloud Functions - copy data from other docs on create - Node.js

I'm trying to figure out how to create a merged/denormalized document for an order document in Firebase, as described in the "Five Uses for Cloud Functions" Firebase video (https://www.youtube.com/watch?v=77XmRDtOL7c).
The user creates the basic document, and a Cloud Function pulls in data from several other documents to create the desired result.
Here's a basic example of what I'd like to accomplish:
exports.orderCreate = functions.firestore
  .document('orders/{docId}').onCreate((snap, context) => {
    const id = context.params.docId;
    const orderDoc = snap.data();
    const branchId = orderDoc.branchId;
    const branchDoc = admin.firestore().collection('branches').doc(branchId);
    const bn = branchDoc.brandName;
    const ln = branchDoc.locationName;
    const logo = branchDoc.imageURL;
    return admin.firestore().collection('orders')
      .doc(id).set({
        branchBrandName: bn,
        branchLocationName: ln,
        branchLogo: logo
      }, { merge: true });
  });
Which way do I wave my hands to make this work? Thanks!

With admin.firestore().collection('branches').doc(branchId) you only declare a DocumentReference. To get the values of the document's fields, you need to call its asynchronous get() method.
So the following should do the trick:
exports.orderCreate = functions.firestore
  .document('orders/{docId}').onCreate((snap, context) => {
    const id = context.params.docId;
    const orderDoc = snap.data();
    const branchId = orderDoc.branchId;
    const branchDoc = admin.firestore().collection('branches').doc(branchId);
    return branchDoc.get()
      .then(branchDocSnapshot => {
        const bn = branchDocSnapshot.data().brandName;
        const ln = branchDocSnapshot.data().locationName;
        const logo = branchDocSnapshot.data().imageURL;
        return admin.firestore().collection('orders')
          .doc(id).set({
            branchBrandName: bn,
            branchLocationName: ln,
            branchLogo: logo
          }, { merge: true });
      });
  });
You may need to deal with the case where the branch doc does not exist, depending on your data model and app logic; see the Firestore documentation on getting data.
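For that case, a minimal sketch of the check inside the then() callback could look like this (returning null to bail out is just one possible choice):
return branchDoc.get()
  .then(branchDocSnapshot => {
    // Bail out if the referenced branch document does not exist
    if (!branchDocSnapshot.exists) {
      console.log(`Branch ${branchId} not found, skipping denormalization`);
      return null;
    }
    const branchData = branchDocSnapshot.data();
    return admin.firestore().collection('orders').doc(id).set({
      branchBrandName: branchData.brandName,
      branchLocationName: branchData.locationName,
      branchLogo: branchData.imageURL
    }, { merge: true });
  });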

How can I store the value of a promise and use it once resolved?

I am currently developing an app which interacts with Uniswap, and I have developed a wrapper class to contain the info and variables I'll need about some pair (e.g. DAI/WETH).
As some of these values are asynchronous, I have coded an async build() function to get them before calling the constructor, so I can store them. I want to store the result of this build function, which is an instance of the class I have defined, inside a variable to use later, but I need to know whether the Promise that the build function returns has resolved before using it. How can I do that?
Here is the code of the class:
'use strict'
const { ChainId, Fetcher, WETH, Route, Trade, TradeType, TokenAmount } = require('@uniswap/sdk')
const { toChecksumAddress } = require('ethereum-checksum-address')
const Web3 = require('web3')
const web3 = new Web3()
const chainId = ChainId.MAINNET;
let tok1;
let tok2;
let pair;
let route;
let trade;

class UniswapTokenPriceFetcher
{
    constructor(async_params)
    {
        async_params.forEach((element) => {
            if (element === 'undefined')
            {
                throw new Error('All parameters must be defined')
            }
        });
        this.trade = async_params[0];
        this.route = async_params[1];
        this.pair = async_params[2];
        this.tok1 = async_params[3];
        this.tok2 = async_params[4];
    }

    static async build(token1, token2)
    {
        var tok1 = await Fetcher.fetchTokenData(chainId, toChecksumAddress(token1))
        var tok2 = await Fetcher.fetchTokenData(chainId, toChecksumAddress(token2))
        var pair = await Fetcher.fetchPairData(tok1, tok2)
        var route = new Route([pair], tok2)
        var trade = new Trade(route, new TokenAmount(tok2, web3.utils.toWei('1', 'Ether')), TradeType.EXACT_INPUT)
        return new UniswapTokenPriceFetcher([trade, route, pair, tok1, tok2])
    }

    getExecutionPrice6d = () =>
    {
        return this.trade.executionPrice.toSignificant(6);
    }

    getNextMidPrice6d = () =>
    {
        return this.trade.nextMidPrice.toSignificant(6);
    }
}

module.exports = UniswapTokenPriceFetcher
Thank you everybody!
EDIT: I know Uniswap only pairs with WETH so one of my token variables is unnecessary, but the problem remains the same! Also keep in mind that I want to store an instance of this class for later use inside another file.
You should either call the build function with await:
const priceFetcher = await UniswapTokenPriceFetcher.build(token1, token2)
or chain a then onto it:
UniswapTokenPriceFetcher.build(token1, token2).then(priceFetcher => {...})
I don't see any other way.
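For the follow-up about storing the instance for later use in another file, one common pattern is to cache the pending Promise at module scope and await it wherever the instance is needed. A minimal sketch, assuming hypothetical file names and placeholder token addresses:
// priceFetcherInstance.js (hypothetical module name)
const UniswapTokenPriceFetcher = require('./UniswapTokenPriceFetcher')

// Placeholder addresses; replace with the real token contract addresses
const DAI_ADDRESS = '0x...'
const WETH_ADDRESS = '0x...'

// Kick off the async construction once; every module that requires this
// file shares the same pending Promise.
module.exports = UniswapTokenPriceFetcher.build(DAI_ADDRESS, WETH_ADDRESS)

// elsewhere.js
const fetcherPromise = require('./priceFetcherInstance')

async function printPrice() {
  // Awaiting the cached Promise guarantees build() has resolved before use
  const priceFetcher = await fetcherPromise
  console.log(priceFetcher.getExecutionPrice6d())
}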

Azure Cosmos + Gremlin NodeJS, how to submit fluent query as script (not bytecode -- i know its not supported yet)

I'm trying to write fluent Gremlin queries in Node.js for Cosmos DB, even though they get submitted as strings. I've been through the documentation and seen it mentioned in a few GitHub threads that although bytecode isn't yet supported, it is possible to submit the query as a script.
The code I have so far:
Configuring the client function:
export const CosmosConn = async (): Promise<driver.Client> => {
  try {
    const cosmosKey: string = await GetSecret('cosmos-key');
    const cosmosEndpoint: string = await GetSecret('cosmos-endpoint');
    const authenticator: driver.auth.PlainTextSaslAuthenticator = new gremlin.driver.auth.PlainTextSaslAuthenticator(
      '/dbs/main/colls/main',
      cosmosKey
    );
    const client: driver.Client = new gremlin.driver.Client(cosmosEndpoint, {
      authenticator,
      traversalsource: 'g',
      rejectUnauthorized: true,
      mimeType: 'application/vnd.gremlin-v2.0+json'
    });
    return client;
  } catch (err) {
    console.error(err);
  }
};
The two functions below are temporary, as I'll await that CosmosConn several times for every query, but this is for an Azure Function so I'm not optimizing yet:
export const Graph = async (query: gremlin.process.Bytecode): Promise<any> => {
  const db = await CosmosConn();
  const translator = new gremlin.process.Translator(
    new gremlin.process.AnonymousTraversalSource()
  );
  return db.submit(translator.translate(query));
};

export const getGremlin = async () => {
  const db = await CosmosConn();
  return gremlin.process.traversal().withRemote(db);
};
Now when I try to use it:
const g = await getGremlin();
const query = g
  .V()
  .hasLabel('client')
  .getBytecode();
const test = await Graph(query);
This of course throws an error:
Gremlin Query Syntax Error: Script compile error: Unexpected token: 'Object'; in input: '[objectObject'. # line 1, column 9.
Have you tried printing translator.translate(query) prior to submitting?
From my experience, the translator is very limited in its support for non-trivial queries.
According to Microsoft, they plan to support the fluent API in Dec '19, so it's probably better to wait for official support.
It was the types preventing me from initialising my translator in a way that works with CosmosDB.
const translator = new gremlin.process.Translator('g' as any);
works.
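Tying that back to the Graph() helper from the question, a minimal sketch (assuming the CosmosConn() function defined above; this is just one way to wire it up):
export const Graph = async (query: gremlin.process.Bytecode): Promise<any> => {
  const db = await CosmosConn();
  // Initialising the Translator with the traversal source name works with Cosmos DB;
  // the cast to any is only needed to satisfy the current type definitions.
  const translator = new gremlin.process.Translator('g' as any);
  // translate() turns the bytecode into a Groovy-style string such as
  // "g.V().hasLabel('client')", which Cosmos DB can execute as a script.
  return db.submit(translator.translate(query));
};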
Down below is an example of using Translator in TypeScript to convert a bytecode query to a string query for Cosmos DB. I don't recommend this solution; as the other response pointed out, it's limited. Use AWS Neptune instead, or wait until MS implements bytecode queries in Cosmos DB.
async function test(): Promise<void> {
  // Connection:
  const traversal = Gremlin.process.AnonymousTraversalSource.traversal;
  const DriverRemoteConnection = Gremlin.driver.DriverRemoteConnection;
  const g = traversal().withRemote(new DriverRemoteConnection("ws://localhost:8182/gremlin"));
  // Create translator
  const translator = new Gremlin.process.Translator(g);
  // Convert bytecode query to string query for CosmosDB:
  console.log(translator.translate(g.V().hasLabel('person').values('name').getBytecode()))
}

test();
Here is the link to the test cases that show how to get getBytecode() translation working:
https://github.com/apache/tinkerpop/blob/master/gremlin-javascript/src/main/javascript/gremlin-javascript/test/unit/translator-test.js#L31
Edit:
Here is a sample test case from the above link:
it('should produce valid script representation from bytecode glv steps', function () {
  const g = new graph.Graph().traversal();
  const script = new Translator('g').translate(g.V().out('created').getBytecode());
  assert.ok(script);
  assert.strictEqual(script, 'g.V().out(\'created\')');
});

Remove users storage folder with firebase cloud functions

When a user deletes their account, I want to remove their storage files along with their data.
I am able to do a multi-path delete for the RTDB; how can I do this but also remove their files from Storage too?
I have tried chaining on a .then but it makes everything fail...
ex...
.then(() => {
  const bucket = gcs.bucket(functions.config().firebase.storageBucket);
  const path = `categories/${uid}`;
  return bucket.file(path).delete();
})
I wish it was faster to test your functions without always deploying because it has taken soooo much time to try making this work...
Here is my working code:
exports.removeUserFromDatabase = functions.auth.user()
  .onDelete(function(user, context) {
    var uid = user.uid;
    const deleteUserData = {};
    deleteUserData[`users/${uid}`] = null;
    deleteUserData[`feed/${uid}`] = null;
    deleteUserData[`friends/${uid}`] = null;
    deleteUserData[`profileThumbs/${uid}`] = null;
    deleteUserData[`hasUnreadMsg/${uid}`] = null;
    deleteUserData[`userChatRooms/${uid}`] = null;
    deleteUserData[`userLikedPosts/${uid}`] = null;
    deleteUserData[`userLikedStrains/${uid}`] = null;
    return admin.database().ref('/friends').orderByChild(`${uid}/uid`).equalTo(uid)
      .once("value").then((friendsSnapshot) => {
        friendsSnapshot.forEach((friendSnapshot) => {
          deleteUserData[`/friends/${friendSnapshot.key}/${uid}`] = null;
        });
        return admin.database().ref().update(deleteUserData)
      })
      .then(() => {
        // const bucket = gcs.bucket(functions.config().firebase.storageBucket);
        const bucket = admin.storage().bucket();
        const path = `categories/${uid}`;
        return bucket.file(path).delete();
      })
  });
I feel like it's because I am not dealing with the promise correctly; I just don't know where this is going wrong.
My code snippet currently works until I chain the .then().
Cheers.
Your current code is not returning anything from the top level, meaning the function may get terminated at any point while it's still writing to the database.
You'll want to return admin.database()... and then chain the additional then() after it.
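A minimal sketch of that structure, with the database fan-out trimmed down; the bucket.deleteFiles({ prefix }) call is a swapped-in assumption that categories/${uid} is a folder of files rather than a single object (bucket.file(path).delete() only removes one object):
exports.removeUserFromDatabase = functions.auth.user()
  .onDelete((user, context) => {
    const uid = user.uid;
    const deleteUserData = { [`users/${uid}`]: null /* ...other paths... */ };
    // Return the whole chain so Cloud Functions waits for every step to finish
    return admin.database().ref().update(deleteUserData)
      .then(() => {
        const bucket = admin.storage().bucket();
        // Delete every object under the user's folder (prefix), not a single file
        return bucket.deleteFiles({ prefix: `categories/${uid}` });
      });
  });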

Index messed up if I upload more than one file at once

I've got the following Firebase function that runs once a file is uploaded to Firebase Storage.
It basically gets the file's URL and saves a reference to it in Firestore. I need to save the references in a way that lets me query them randomly from my client, and sequential indexes seem to be the best fit for this requirement.
For the Firestore references I need the following things:
- doc IDs must go from 0 to n (n being the index of the last document)
- a --stats-- doc keeping track of n (it gets incremented every time a document is uploaded)
To achieve this I've written the following Node.js script:
const incrementIndex = admin.firestore.FieldValue.increment(1);

export const image_from_storage_to_firestore = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const splittedPath = filePath!.split("/");
    // if we are in the images section
    // path = emotions/$emotion/photos/$photographer/file.jpeg
    if (splittedPath[0] === "emotions" && splittedPath[2] === "photos") {
      const emotion = splittedPath[1];
      const photographer = splittedPath[3];
      const file = bucket.file(filePath!);
      const indexRef = admin.firestore().collection("images")
        .doc("emotions").collection(emotion).doc("--stats--");
      const index = await indexRef.get().then((doc) => {
        if (!doc.exists) {
          return 0;
        } else {
          return doc.data()!.index;
        }
      });
      if (index === 0) {
        await admin.firestore().collection("images")
          .doc("emotions")
          .collection(emotion)
          .doc("--stats--")
          .set({index: 0});
      }
      console.log("(GOT INDEX): " + index);
      let imageURL;
      await file
        .getSignedUrl({
          action: "read",
          expires: "03-09-2491"
        })
        .then(signedUrls => {
          imageURL = signedUrls[0];
        });
      console.log("(GOT URL): " + imageURL);
      var docRef = admin.firestore()
        .collection("images")
        .doc("emotions")
        .collection(emotion)
        .doc(String(index));
      console.log("uploading...");
      await indexRef.update({index: incrementIndex});
      await docRef.set({ imageURL: imageURL, photographer: photographer });
      console.log("finished");
      return true;
    }
    return false;
  });
Getting to the problem:
It works perfectly if I upload the files one by one.
It messes up the index if I upload more than one file at once, because two concurrent uploads will read the same index value from --stats-- and one will overwrite the other.
How would you solve this problem? Would you use another approach instead of the indexed one?
You should use a Transaction in which you:
- read the value of the index (from the "--stats--" document),
- write the new index, and
- write the value of the imageURL in the "emotion" doc.
See also the reference docs about transactions; a sketch of the transactional update is shown below.
This way, if the index value is changed in the "--stats--" document while the Transaction is being executed, the Cloud Function can catch the Transaction failure and generate an error that finishes it.
In parallel, you will need to enable retries for this background Cloud Function, so that it is retried if the Transaction failed in a previous run.
See this documentation item https://firebase.google.com/docs/functions/retries, including the video from Doug Stevenson which is embedded in the doc.
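As a minimal sketch of that transactional read-and-increment, reusing the indexRef, emotion, imageURL and photographer values from the question's function (folding the imageURL write into the same transaction is an assumption about how you want to structure it):
const result = await admin.firestore().runTransaction(async (transaction) => {
  // Read the current index inside the transaction
  const statsSnap = await transaction.get(indexRef);
  const index = statsSnap.exists ? statsSnap.data()!.index : 0;
  const docRef = admin.firestore()
    .collection("images")
    .doc("emotions")
    .collection(emotion)
    .doc(String(index));
  // Both writes either succeed together or the transaction fails and can be retried
  transaction.set(indexRef, { index: index + 1 });
  transaction.set(docRef, { imageURL: imageURL, photographer: photographer });
  return index;
});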

How to get the key of a child value with Cloud Functions for Firebase?

I have code that triggers when a user_course is added
export const writePesertaMatkul = functions.database
  .ref('/user_course/{user_uid}/') // the child path being watched
  .onCreate((snapsot, context) => { // triggered on create
    const user_uid = context.params.user_uid;
    const matkulData = snapsot.val(); // dataSnapshot
  });
The logged value is { courses_5: 'PEMROGRAMAN GUI' }, which is great, but there is one problem: I just want to store courses_5 (the key). How can I achieve that?
SOLUTION
export const writePesertaMatkul = functions.database
  .ref('/user_course/{user_uid}/{matkul_id}') // the child path being watched
  .onCreate((snapsot, context) => { // triggered on create
    const user_uid = context.params.user_uid;
    const matkulData = snapsot.val(); // dataSnapshot
    const matkulID = context.params.matkul_id;
    // const matkulKey = snapsot.key;
    console.log("keynya :", snapsot.key);
    console.log("uidnya :", user_uid);
    const updates = {};
    updates[snapsot.key + "/" + user_uid] = "true";
    return admin.database().ref('/course_peserta/').update(updates);
  });
Use this code to get the parent key via snapsot.key.
Just use the key property of a DataSnapshot, see https://firebase.google.com/docs/reference/node/firebase.database.DataSnapshot#key
So,
const matkulKey = snapsot.key;
should do the trick.
(Note that you use the variable name snapsot in your code, and not snapshot)
