Firebase Admin in Cloud Functions data manipulation problem - node.js

Firebase Realtime Database structure
freepacks contains 2 important elements:
current, which is the quizpack ID that I download from the mobile app (the pack itself is retrieved from quizpacks).
history, a node where I add, over time, every quizpack ID that has been in current, via a scheduled Cloud Function.
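Roughly, the structure described above looks like this (a sketch with placeholder pack IDs, since the original screenshot is not included):
quiz
  freepacks
    current: "pack_003"
    history
      pack_001: "01/01/2021 10:00:00"
      pack_002: "02/01/2021 10:00:00"
  quizpacks
    pack_001: { ... }
    pack_002: { ... }
    pack_003: { ... }
    pack_004: { ... }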
What I need to do EVERY TIME THE CLOUD FUNCTION EXECUTES
Step 1: Add value of current in history with the current timestamp.
Step 2: Substitute the current value with another quizpack ID that is not in history.
Done
How I tried to accomplish this
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.scheduledFunction = functions.pubsub.schedule('* * * * *').onRun((context) => {
  // Current timestamp
  const dt = new Date();
  const timestamp = `${
    (dt.getMonth()+1).toString().padStart(2, '0')}/${
    dt.getDate().toString().padStart(2, '0')}/${
    dt.getFullYear().toString().padStart(4, '0')} ${
    dt.getHours().toString().padStart(2, '0')}:${
    dt.getMinutes().toString().padStart(2, '0')}:${
    dt.getSeconds().toString().padStart(2, '0')}`
  // Download the entire node 'freepacks'
  return admin.database().ref('quiz/freepacks').once('value').then((currentSnap) => {
    const currentPack = currentSnap.val().current;
    // Add current to history
    admin.database().ref('quiz/freepacks/history/' + currentPack).set(timestamp);
    // Download entire history node
    admin.database().ref('quiz/freepacks/history').once('value').then((historySnap) => {
      const history = historySnap.val();
      console.log('HISTORY: ' + history);
      // Download entire quizpacks node
      admin.database().ref('quiz/quizpacks').once('value').then((quizpacksSnap) => {
        for (quizpack in Object.keys(quizpacksSnap.val())) {
          console.log('ITERATING: ' + quizpack);
          // Add the new current if it isn't in history
          if (historySnap[quizpack] == undefined) {
            admin.database().ref('quiz/freepacks/current').set(quizpack);
            break;
          }
        }
      });
    })
  });
});
What I get from the previous code
Starting point:
First execution: history updates correctly, but current is not updated.
Second execution and so on...
current doesn't update anymore (it is stuck on 0).
My experience with JavaScript and Firebase Admin is ~0... What is the problem with my code? Thanks in advance for the help!

First of all, read/write operations return promises, so you should handle them; in this answer I used async-await syntax. .ref("quiz/freepacks") fetches the complete node, i.e. both current and history, so you don't have to query the history node again as in the original code. The other changes are just JavaScript tweaks, such as using .find() instead of a for-loop.
Try changing your function to:
exports.scheduledFunction = functions.pubsub
  .schedule("* * * * *")
  .onRun(async (context) => {
    // Build the MM/DD/YYYY HH:mm:ss timestamp string
    const dt = new Date();
    const timestamp =
      `${(dt.getMonth() + 1).toString().padStart(2, "0")}/` +
      `${dt.getDate().toString().padStart(2, "0")}/` +
      `${dt.getFullYear().toString().padStart(4, "0")} ` +
      `${dt.getHours().toString().padStart(2, "0")}:` +
      `${dt.getMinutes().toString().padStart(2, "0")}:` +
      `${dt.getSeconds().toString().padStart(2, "0")}`;
    // Download the entire node 'freepacks' (current + history in one read)
    const currentSnap = await admin
      .database()
      .ref("quiz/freepacks")
      .once("value");
    // Current free pack ID and the IDs that have already been free
    const currentPack = currentSnap.val().current || "default";
    const freeHistoryIDs = [
      ...Object.keys(currentSnap.val().history || {}),
      currentPack,
    ];
    // Add the current free pack to the history
    await admin
      .database()
      .ref("quiz/freepacks/history/" + currentPack)
      .set(timestamp);
    // Download the entire quizpacks node and pick a pack not in the history
    const quizpacksSnap = await admin
      .database()
      .ref("quiz/quizpacks")
      .once("value");
    const quizPackIDs = Object.keys(quizpacksSnap.val() || {});
    const newPack = quizPackIDs.find((id) => !freeHistoryIDs.includes(id));
    console.log(newPack);
    if (!newPack) {
      console.log("No new pack found");
    }
    return admin.database().ref("quiz/freepacks/current").set(newPack || "none");
  });
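As a side note (not part of the original answer): if the history timestamp only needs to be machine-readable rather than a formatted string, the Realtime Database can stamp it server-side, which avoids the manual date formatting entirely:
// Alternative sketch: let the database write the server time (milliseconds since epoch)
await admin
  .database()
  .ref("quiz/freepacks/history/" + currentPack)
  .set(admin.database.ServerValue.TIMESTAMP);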

Related

Firebase Cloud Function not working as expected

exports.resetDailyFinalKills = functions.pubsub
  .schedule("58 16 * * *")
  .onRun(async (context) => {
    const players = firestore.collection("players");
    const goodtimes = await players.where("final_kills", ">", 0);
    goodtimes.forEach((snapshot) => {
      snapshot.ref.update({final_kills: 0});
    });
    return null;
  });
So I have this cloud function, and when I force-run it nothing happens at all: it just says the function was successful, but the final_kills field never gets updated. Can anyone help?
I obviously have a player here with a final_kills value greater than 0, so why doesn't this reset it back down to zero?
Not sure if I am missing something here, but:
You are iterating over the Query object Firebase creates when you call where() on your collection; you never actually fetch the data from the database.
const players = firestore.collection("players");
// fetch the objects from firestore
const goodtimes = await players.where("final_kills", ">", 0).get();
// iterate over the docs you receive
goodtimes.docs.forEach((snapshot) => {
  snapshot.ref.update({ final_kills: 0 });
});
Edit (regarding your comment):
Make sure you set your timezone properly after your .schedule() function:
// timezone in my case
functions.pubsub.schedule('5 11 * * *').timeZone('Europe/Berlin')
Check this list as a reference for your correct timezone.

Firebase Functions and Express: listen to firestore data live

I have a website that runs its frontend on Firebase Hosting and its server, written in Node.js with Express, on Firebase Functions.
What I want is to have redirect links on my website, so that I can map, for example, mywebsite.com/youtube to my YouTube channel. I create these links from my admin panel and add them to my Firestore database.
My data is roughly something like this:
The first way I approached this was by querying my Firestore database on every request, but that is expensive and slow.
Another way I tried was setting up some kind of background listener on the Firestore database that would always provide up-to-date data, but unfortunately that did not work, because Firebase Functions suspends the main function once the current request's execution ends.
Lastly, and most conveniently, I configured an API route that is called from my Admin Panel whenever the data changes and saves the new data to a JSON file. This worked locally but not in production, because apparently the Firebase Functions filesystem is read-only, so files can't be edited after deployment. After some research I found out that Firebase Functions allows writing to the tmp directory, so I went ahead with that and deployed it, but again Firebase Functions resets the tmp folder when a request's execution ends.
here is my api request code which updates the utm_data.json file in the tmp directory:
// my firestore provider
const db = require('../db');
const fs = require('fs');
const os = require('os')
const mkdirp = require('mkdirp');

const updateUrlsAPI = (req, res) => {
  // we wanna get the utm list from firestore, and update the file
  // tmp/utm_data.json
  // query data from firestore
  db.collection('utmLinks').get().then(async function(querySnapshot) {
    try {
      // get the path to `tmp` folder depending on
      // the os running this program
      let tmpFolderName = os.tmpdir()
      // create `tmp` directory if not exists
      await mkdirp(tmpFolderName)
      let docsData = querySnapshot.docs.map(doc => doc.data())
      let tmpFilePath = tmpFolderName + '/utm_data.json'
      let strData = JSON.stringify(docsData)
      fs.writeFileSync(tmpFilePath, strData)
      res.send('200')
    } catch (error) {
      console.log("error while updating utm_data.json: ", error)
      res.send(error)
    }
  });
}
and this is my code for reading the utm_data.json file on an incoming request:
const readUrlsFromJson = (req, res) => {
  var url = req.path.split('/');
  // the url will be in the format of: 'mywebsite.com/routeName'
  var routeName = url[1];
  try {
    // read the file ../tmp/utm_data.json
    // {
    //   'createdAt': Date
    //   'creatorEmail': string
    //   'name': string
    //   'url': string
    // }
    // our [routeName] should match [name] of the doc
    let tmpFolderName = os.tmpdir()
    let tmpFilePath = tmpFolderName + '/utm_data.json'
    // read links list file and assign it to the `utms` variable
    let utms = require(tmpFilePath)
    if (!utms || !utms.length) {
      return undefined;
    }
    // find the link matching the routeName
    let utm = utms.find(utm => utm.name == routeName)
    if (!utm) {
      return undefined;
    }
    // if we found the doc,
    // then we'll redirect to the url
    res.redirect(utm.url)
  } catch (error) {
    console.error(error)
    return undefined;
  }
}
Is there something I am doing wrong, and if not, what is an optimal solution for this case?
You can initialize the Firestore listener in global scope. From the documentation,
The global scope in the function file, which is expected to contain the function definition, is executed on every cold start, but not if the instance has already been initialized.
This should keep the listener active even after the function's execution has completed, for as long as that specific instance keeps running (which should be around ~30 minutes). Try refactoring the code as shown below:
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

let listener = false;
// Store all utmLinks in global scope
let utmLinks: any[] = [];

const initListeners = () => {
  functions.logger.info("Initializing listeners");
  admin
    .firestore()
    .collection("utmLinks")
    .onSnapshot((snapshot) => {
      snapshot.docChanges().forEach(async (change) => {
        functions.logger.info(change.type, "document received");
        switch (change.type) {
          case "added":
            utmLinks.push({ id: change.doc.id, ...change.doc.data() });
            break;
          case "modified":
            const index = utmLinks.findIndex(
              (link) => link.id === change.doc.id
            );
            utmLinks[index] = { id: change.doc.id, ...change.doc.data() };
            break;
          case "removed":
            utmLinks = utmLinks.filter((link) => link.id !== change.doc.id);
          default:
            break;
        }
      });
    });
  return;
};

// The HTTPs function
export const helloWorld = functions.https.onRequest(
  async (request, response) => {
    if (!listener) {
      // Cold start, no listener active
      initListeners();
      listener = true;
    } else {
      functions.logger.info("Listeners already initialized");
    }
    response.send(JSON.stringify(utmLinks, null, 2));
  }
);
This example stores all UTM links in an array in global scope; the array won't be persisted in new instances, but you won't have to query every link for every request. The onSnapshot() listener will keep utmLinks updated.
The output in logs should be:
If you want to persist this data permanently and avoid querying on every cold start, you can try a long-running environment such as Google Compute Engine, which keeps running, unlike Cloud Functions instances that eventually time out.
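To tie this back to the original redirect use case, the Express handler could then look the route up in the in-memory array instead of reading a file. A sketch, assuming each document's name field holds the route as in the question:
const readUrlsFromMemory = (req, res) => {
  // the url will be in the format of: 'mywebsite.com/routeName'
  const routeName = req.path.split("/")[1];
  // look the route up in the array kept fresh by the onSnapshot listener
  const utm = utmLinks.find((link) => link.name === routeName);
  if (!utm) {
    res.status(404).send("Not found");
    return;
  }
  res.redirect(utm.url);
};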

Pub/Sub Cloud Function does not Update Document in Subcollection

I am trying to update a field in my document in Firestore. The general location of the document would be /games/{userId}/userGames/{gameId}. Each game has a property called "status" which changes according to the game's start and end time.
As you can guess, if the start time is bigger than the "now" timestamp and the status is "TO_BE_PLAYED", the game will begin and the status will become 1, "BEING_PLAYED". Also, if the end time is bigger than the "now" timestamp and the status is "BEING_PLAYED", the game will end, and therefore the status will become 2, "PLAYED". I want to create a cloud function that is capable of doing this.
However, even though the function logs 'ok', the values are never updated. Unfortunately, I do not have that much experience in JavaScript either.
THE CODE
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
// Firestore handle used by the queries below (missing from the original snippet)
const db = admin.firestore();

const STATUS_PLAYED = 2;
const STATUS_BEING_PLAYED = 1;
const STATUS_TO_BE_PLAYED = 0;

exports.handleBeingPlayedGames = functions.runWith({memory: "2GB"}).pubsub.schedule('* * * * *')
  .timeZone('Europe/Istanbul') // Users can choose timezone - default is America/Los_Angeles
  .onRun(async () => {
    // current time & stable
    // was Timestamp.now();
    const now = admin.firestore.Timestamp.fromDate(new Date());
    const querySnapshot = await db.collection("games").get();
    const promises = [];
    querySnapshot.forEach(doc => {
      const docRef = doc.ref;
      console.log(docRef);
      promises.push(docRef.collection("userGames").where("status", "==", STATUS_BEING_PLAYED).where("endtime", "<", now).get());
    });
    const snapshotArrays = await Promise.all(promises);
    const promises1 = [];
    snapshotArrays.forEach(snapArray => {
      snapArray.forEach(snap => {
        promises1.push(snap.ref.update({
          "status": STATUS_PLAYED,
        }));
      });
    });
    return Promise.all(promises1);
  });

exports.handleToBePlayedGames = functions.runWith({memory: "2GB"}).pubsub.schedule('* * * * *')
  .onRun(async () => {
    // current time & stable
    // was Timestamp.now();
    const now = admin.firestore.Timestamp.fromDate(new Date());
    const querySnapshot = await db.collection("games").get();
    const promises = [];
    querySnapshot.forEach(async doc => {
      const docData = await doc.ref.collection("userGames").where("status", "==", STATUS_TO_BE_PLAYED).where("startTime", ">", now).get();
      promises.push(docData);
    });
    const snapshotArrays = await Promise.all(promises);
    const promises1 = [];
    snapshotArrays.forEach(snapArray => {
      snapArray.forEach(snap => {
        promises1.push(snap.ref.update({
          "status": STATUS_BEING_PLAYED,
        }));
      });
    });
    return Promise.all(promises1);
  });
Okay, so this answer goes out to anyone else trying to solve this problem.
First I tried to solve it by brute force, without much thinking, and tried to read the value from the subcollection directly. However, as I searched, I found that denormalizing (flattening) the data actually solves the problem.
I created a new path, /status/{gameId}, with the endTime, startTime, and status fields, so everything sits on a single level, and I updated it using promises. Sometimes denormalizing data can be your savior.
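A rough sketch of what such a flattened query could look like (the collection and field names are assumptions based on the description above):
// /status/{gameId} docs keep startTime, endTime and status at the top level,
// so one collection query per run is enough (a composite index is required)
const now = admin.firestore.Timestamp.now();
const toStart = await admin.firestore()
  .collection("status")
  .where("status", "==", STATUS_TO_BE_PLAYED)
  .where("startTime", "<", now)
  .get();
await Promise.all(
  toStart.docs.map((doc) => doc.ref.update({ status: STATUS_BEING_PLAYED }))
);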
How can startTime be greater than now? Is it set by default to a date in the future?
My current assumption is that a game cannot set its status to STATUS_BEING_PLAYED because of the inconsistency with startTime. Moreover, a game cannot have the status STATUS_PLAYED, because that depends on it having STATUS_BEING_PLAYED, which it cannot have.
My recommendation would be to set the fields startTime and endTime to null by default. If you do so, you can check whether a game has to be set to STATUS_BEING_PLAYED with this:
doc.ref.collection("userGames")
.where("status", "==", STATUS_TO_BE_PLAYED)
.where("startTime", "<", now)
.where("endTime", "==", null)
.get();
You could check if a game has to be on STATUS_PLAYED with this (exactly as you did):
docRef.collection("userGames")
.where("status", "==", STATUS_BEING_PLAYED)
.where("endtime", "<", now)
.get();
Now there's something you should wonder: is this the best approach to changing a game's status? You are querying the whole game library of every user every single minute, and since read operations are charged, this approach implies meaningful costs. Maybe you should simply update the game's status when the game is started and closed.
Also notice that the equals operation is ==, not =.

Index messed up if I upload more than one file at once

I've got the following Firebase function that runs once a file is uploaded to Firebase Storage.
It basically gets the file's URL and saves a reference to it in Firestore. I need to save the references in a way that lets me query them randomly from my client, and indexes seem to be the best fit for this requirement.
For the Firestore references I need the following things:
doc IDs must go from 0 to n (n being the index of the last document)
a --stats-- doc keeping track of n (it gets incremented every time a document is uploaded)
To achieve this I've written the following node.js script:
const incrementIndex = admin.firestore.FieldValue.increment(1);

export const image_from_storage_to_firestore = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const splittedPath = filePath!.split("/");
    // if we are in the images section
    // path = emotions/$emotion/photos/$photographer/file.jpeg
    if (splittedPath[0] === "emotions" && splittedPath[2] === "photos") {
      const emotion = splittedPath[1];
      const photographer = splittedPath[3];
      const file = bucket.file(filePath!);
      const indexRef = admin.firestore().collection("images")
        .doc("emotions").collection(emotion).doc("--stats--");
      const index = await indexRef.get().then((doc) => {
        if (!doc.exists) {
          return 0;
        } else {
          return doc.data()!.index;
        }
      });
      if (index === 0) {
        await admin.firestore().collection("images")
          .doc("emotions")
          .collection(emotion)
          .doc("--stats--")
          .set({index: 0});
      }
      console.log("(GOT INDEX): " + index);
      let imageURL;
      await file
        .getSignedUrl({
          action: "read",
          expires: "03-09-2491"
        })
        .then(signedUrls => {
          imageURL = signedUrls[0];
        });
      console.log("(GOT URL): " + imageURL);
      var docRef = admin.firestore()
        .collection("images")
        .doc("emotions")
        .collection(emotion)
        .doc(String(index));
      console.log("uploading...");
      await indexRef.update({index: incrementIndex});
      await docRef.set({ imageURL: imageURL, photographer: photographer });
      console.log("finished");
      return true;
    }
    return false;
  });
Getting to the problem:
It works perfectly if I upload the files one by one.
It messes up the index if I upload more than one file at once, because two concurrent uploads will read the same index value from --stats-- and one will overwrite the other.
How would you solve this problem? Would you use another approach instead of the index-based one?
You should use a Transaction in which you:
read the value of the index (from "--stats--" document),
write the new index and
write the value of the imageURL in the "emotion" doc.
See also the reference docs about transactions.
This way, if the index value is changed in the "--stats--" document while the Transaction is being executed, the Cloud Function catches the Transaction failure and raises an error, which finishes the function.
In parallel, you will need to enable retries for this background Cloud Function, so that it is retried if the Transaction failed in a previous run.
See this documentation item https://firebase.google.com/docs/functions/retries, including the video from Doug Stevenson which is embedded in the doc.
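A minimal sketch of what that transaction could look like, keeping the collection layout and the variable names (emotion, imageURL, photographer) from the question:
const statsRef = admin.firestore()
  .collection("images").doc("emotions")
  .collection(emotion).doc("--stats--");
await admin.firestore().runTransaction(async (tx) => {
  // all reads first
  const statsDoc = await tx.get(statsRef);
  const index = statsDoc.exists ? statsDoc.data().index : 0;
  const imageRef = admin.firestore()
    .collection("images").doc("emotions")
    .collection(emotion).doc(String(index));
  // then the writes: bump the counter and store the image doc atomically
  tx.set(statsRef, { index: index + 1 }, { merge: true });
  tx.set(imageRef, { imageURL: imageURL, photographer: photographer });
});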

Firebase Cloud Functions deletes nodes directly after rather than 24 hours later

My goal is to delete all the message nodes 24 hours after they were sent, using Firebase Cloud Functions and the Realtime Database. I tried copying and pasting the answer from this post, however for some reason the messages are deleted directly after they are created rather than 24 hours later. If someone could help me solve this problem I would really appreciate it. I have tried multiple different answers based on the same issue and they haven't worked for me.
Here is my index.js file:
'use strict';

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Cut off time. Child nodes older than this will be deleted.
const CUT_OFF_TIME = 24 * 60 * 60 * 1000; // 24 hours in milliseconds.

exports.deleteOldMessages = functions.database.ref('/Message/{chatRoomId}').onWrite(async (change) => {
  const ref = change.after.ref.parent; // reference to the parent
  const now = Date.now();
  const cutoff = now - CUT_OFF_TIME;
  const oldItemsQuery = ref.orderByChild('seconds').endAt(cutoff);
  const snapshot = await oldItemsQuery.once('value');
  // create a map with all children that need to be removed
  const updates = {};
  snapshot.forEach(child => {
    updates[child.key] = null;
  });
  // execute all updates in one go and return the result to end the function
  return ref.update(updates);
});
And my database structure is:
In the comments you indicated that you're using Swift. From that and the screenshot it turns out that you're storing the timestamp in seconds since 1970, while the code in your Cloud Functions assumes it is in milliseconds.
The simplest fix is:
// Cut off time. Child nodes older than this will be deleted.
const CUT_OFF_TIME = 24 * 60 * 60 * 1000; // 24 hours in milliseconds.

exports.deleteOldMessages = functions.database.ref('/Message/{chatRoomId}').onWrite(async (change) => {
  const ref = change.after.ref.parent; // reference to the parent
  const now = Date.now();
  const cutoff = (now - CUT_OFF_TIME) / 1000; // convert to seconds
  const oldItemsQuery = ref.orderByChild('seconds').endAt(cutoff);
  const snapshot = await oldItemsQuery.once('value');
  // create a map with all children that need to be removed
  const updates = {};
  snapshot.forEach(child => {
    updates[child.key] = null;
  });
  // execute all updates in one go and return the result to end the function
  return ref.update(updates);
});
Also see my answer here: How to remove a child node after a certain date is passed in Firebase cloud functions?
