I have a Firebase database of the following form: https://imgur.com/ar8A3DN
I would like two functions:
1. refExists, which checks whether a given child exists in the database, so that
refExists('datasets') = true
refExists('foo') = false
2. createChild, which creates a new child.
My firebase database instance is declared via:
const accountKeyPath = path.join(__dirname, 'path/to/serviceAccountKey.json')
const accountKey = require(accountKeyPath);
const firebaseAdmin = admin.initializeApp(accountKey);
const dbRef = firebaseAdmin.database().ref('datasets');
The interesting thing is that dbRef and this code, which I would expect to produce an error:
const badRef = firebaseAdmin.database().ref('foo')
both output the same thing. So it's unclear how to check for the existence of foo when ref('datasets') and ref('foo') behave the same way.
The way to check whether an element exists is to retrieve a snapshot of it: if the snapshot's value is null, the element does not exist.
Adding elements is as simple as calling set on the desired element path with a data object.
// Resolves to true if any data exists at the given path, false otherwise.
function refExists(path) {
  return firebaseAdmin.database().ref(path).once('value')
    .then((snap) => snap.val() !== null);
}
// Creates (or overwrites) the child at the given path with the supplied data.
function addRef(newPath, data) {
  return firebaseAdmin.database().ref(newPath).set(data);
}
In one version of my published application, there is a bug that creates a document in my collection where one of the fields (called "Popularity") equals the string "null" instead of the string form of a number, say "4.5".
As a result, on the client side, when a user reads the data and calls Double.parseDouble("null"), it throws an error, whereas it should have been calling Double.parseDouble("4.5").
I want to add a Cloud Function trigger that listens to any document created in that collection and, if the created document has this field equal to "null", updates it to "0.0".
My firestore is built as follows:
Users (collection) - > userId (document) -> fields (Popularity, ID, Title)
I'm new to Cloud Functions and I am not sure whether I'm using .update correctly at the end, as all I could find were examples for .onUpdate, not for .onCreate.
I tried to use the following:
const functions = require('firebase-functions');
exports.createUser = functions.firestore
.document('Users/{userId}')
.onCreate((snap, context) => {
const newValue = snap.data();
const Popularity = newValue.Popularity;
if (Popularity != "null") {
return null;
}
if (Popularity == "null") {
return snap.update({
Popularity: "0.0"
}, {merge: true});
}
});
But I got the following error (twice) in my log:
TypeError: snap.update is not a function
I also tried to use:
exports.createUser = functions.firestore
.document('Users/{bookId}')
.onWrite((change, context) => {
const data = change.after.data();
const previousData = change.before.data();
const oldDocument = change.before.data();
if (data.Popularity != 'null' && previousData.Popularity != 'null') {
return null;
}
if (data.Popularity == 'null' || previousData.Popularity == 'null') {
return change.after.ref.set({Popularity: '0.0'}, {merge: true});
}
});
But then I got:
TypeError: Cannot read property 'Popularity' of undefined
Is there anything else I'm missing here?
Thank you
It looks like snap is a DocumentSnapshot object, which (if you check its documentation) does indeed not have an update() method. To update the document that the snapshot came from, you'll want to call snap.ref.update(...).
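Applied to your first attempt, that change looks roughly like this (note that update() on a DocumentReference takes only the field map, so the {merge: true} argument is dropped):
exports.createUser = functions.firestore
  .document('Users/{userId}')
  .onCreate((snap, context) => {
    const Popularity = snap.data().Popularity;
    if (Popularity !== "null") {
      return null;
    }
    // update() lives on the DocumentReference, not on the snapshot itself
    return snap.ref.update({ Popularity: "0.0" });
  });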
When a user deletes their account, I want to remove their storage files along with their data.
I am able to do a multi path delete for the RTDB, how can I do this but also remove files from storage too?
I have tried chaining on a .then but it makes everything fail...
ex...
.then(() => {
const bucket = gcs.bucket(functions.config().firebase.storageBucket);
const path = `categories/${uid}`;
return bucket.file(path).delete();
})
I wish it was faster to test your functions without always deploying because it has taken soooo much time to try making this work...
Here is my working code:
exports.removeUserFromDatabase = functions.auth.user()
.onDelete(function(user, context) {
var uid = user.uid;
const deleteUserData = {};
deleteUserData[`users/${uid}`] = null;
deleteUserData[`feed/${uid}`] = null;
deleteUserData[`friends/${uid}`] = null;
deleteUserData[`profileThumbs/${uid}`] = null;
deleteUserData[`hasUnreadMsg/${uid}`] = null;
deleteUserData[`userChatRooms/${uid}`] = null;
deleteUserData[`userLikedPosts/${uid}`] = null;
deleteUserData[`userLikedStrains/${uid}`] = null;
return admin.database().ref('/friends').orderByChild(`${uid}/uid`).equalTo(uid)
.once("value").then((friendsSnapshot) => {
friendsSnapshot.forEach((friendSnapshot) => {
deleteUserData[`/friends/${friendSnapshot.key}/${uid}`] = null;
});
return admin.database().ref().update(deleteUserData)
})
.then(() => {
// const bucket = gcs.bucket(functions.config().firebase.storageBucket);
const bucket = admin.storage().bucket();
const path = `categories/${uid}`;
return bucket.file(path).delete();
})
});
I feel like it's because I am not dealing with the promise correctly, I just don't know where this is going wrong.
My code snippet currently works until I chain the .then().
Cheers.
Your current code is not returning anything from the top-level code, meaning it may get terminated at any point while it's writing to the database.
You'll want to return admin.database()... and then chain the additional then() after it.
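In a trimmed sketch (reusing the paths from your code), the shape is:
exports.removeUserFromDatabase = functions.auth.user()
  .onDelete((user, context) => {
    const uid = user.uid;
    const deleteUserData = {};
    deleteUserData[`users/${uid}`] = null;
    // ...the other database paths to clear...

    // Return the whole chain so the function waits for every step.
    return admin.database().ref().update(deleteUserData)
      .then(() => {
        const bucket = admin.storage().bucket();
        return bucket.file(`categories/${uid}`).delete();
      });
  });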
I've got the following firebase function to run once a file is uploaded to firebase storage.
It basically gets the file's URL and saves a reference to it in Firestore. I need to save these references in a way that lets me query them randomly from my client, and indexes seem to best fit this requirement.
For the Firestore reference I need the following things:
doc ids must go from 0 to n (n being the index of the last document)
a --stats-- doc keeping track of n (it gets incremented every time a document is uploaded)
To achieve this I've written the following Node.js script:
const incrementIndex = admin.firestore.FieldValue.increment(1);
export const image_from_storage_to_firestore = functions.storage
.object()
.onFinalize(async object => {
const bucket = gcs.bucket(object.bucket);
const filePath = object.name;
const splittedPath = filePath!.split("/");
// if we are in the images path
// path = emotions/$emotion/photos/$photographer/file.jpeg
if (splittedPath[0] === "emotions" && splittedPath[2] === "photos") {
const emotion = splittedPath[1];
const photographer = splittedPath[3];
const file = bucket.file(filePath!);
const indexRef = admin.firestore().collection("images")
.doc("emotions").collection(emotion).doc("--stats--");
const index = await indexRef.get().then((doc) => {
if (!doc.exists) {
return 0;
} else {
return doc.data()!.index;
}
});
if (index === 0) {
await admin.firestore().collection("images")
.doc("emotions")
.collection(emotion)
.doc("--stats--")
.set({index: 0});
}
console.log("(GOT INDEX): " + index);
let imageURL;
await file
.getSignedUrl({
action: "read",
expires: "03-09-2491"
})
.then(signedUrls => {
imageURL = signedUrls[0];
});
console.log("(GOT URL): " + imageURL);
var docRef = admin.firestore()
.collection("images")
.doc("emotions")
.collection(emotion)
.doc(String(index));
console.log("uploading...");
await indexRef.update({index: incrementIndex});
await docRef.set({ imageURL: imageURL, photographer: photographer });
console.log("finished");
return true;
}
return false;
});
Getting to the problem:
It works perfectly if I upload the files one by one.
It messes up the index if I upload more than one file at once, because two concurrent uploads will read the same index value from --stats-- and one will overwrite the other.
How would you solve this problem? Would you use another approach instead of the indexed one?
You should use a Transaction in which you:
read the value of the index (from "--stats--" document),
write the new index and
write the value of the imageURL in the "emotion" doc.
See also the reference docs about transactions.
This way, if the index value is changed in the "--stats--" document while the Transaction is being executed, the Cloud Function can catch the Transaction failure and generate an error that ends its execution.
In parallel, you will need to enable retries for this background Cloud Function, so that it is retried if the Transaction failed in a previous run.
See this documentation item https://firebase.google.com/docs/functions/retries, including the video from Doug Stevenson which is embedded in the doc.
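A minimal sketch of that Transaction, reusing the --stats-- and image document paths from your function (emotion, imageURL and photographer are the variables already computed inside your onFinalize handler):
const statsRef = admin.firestore()
  .collection("images").doc("emotions")
  .collection(emotion).doc("--stats--");

await admin.firestore().runTransaction(async (transaction) => {
  // all reads must happen before any writes inside a transaction
  const statsDoc = await transaction.get(statsRef);
  const index = statsDoc.exists ? statsDoc.data()!.index : 0;

  const imageRef = admin.firestore()
    .collection("images").doc("emotions")
    .collection(emotion).doc(String(index));

  // if --stats-- is modified concurrently, the transaction fails and is retried,
  // so two uploads can no longer end up with the same index
  transaction.set(statsRef, { index: index + 1 });
  transaction.set(imageRef, { imageURL: imageURL, photographer: photographer });
});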
I am sort of new to NodeJS and I'm learning as I code but I can't wrap my head around Promise/then.
Here is the piece of code - I'm using a library function to Read database values.
var collection = 'Students';
var query = {};
query.name = 'name';
//readFromDatabase returns a promise that resolves to the document found, or null if not found
var temp = readFromDatabase(collection, query).then(function(studentData) {
var result = {
resultDetails: {
username: studentData.username,
password: studentData.password
}
};
return callback(null,resultDetails);
});
But when I look at the value in temp, it contains {"isFulfilled":false,"isRejected":false}!! How can I get the result details into temp?
You have to think of Promises as containers for values. readFromDatabase returns such a container, which might eventually hold the requested value unless the query fails. Your temp variable points to the container, not the response. The properties isFulfilled and isRejected are attributes of the Promise telling you that it has neither been resolved with a value nor rejected with an error.
To get to the response you have to use the then method. The function you register there will be called for you, when the query yields a result or an error.
var collection = 'Students';
var query = {};
query.name = 'name';
var temp = null;
readFromDatabase(collection, query).then(function(studentData) {
var result = {
resultDetails: {
username: studentData.username,
password: studentData.password
}
};
temp = result;
});
// temp is still null here
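In other words, work with the value inside then (or return the promise and keep chaining); a rough sketch with your readFromDatabase call:
readFromDatabase(collection, query)
  .then(function(studentData) {
    var result = {
      resultDetails: {
        username: studentData.username,
        password: studentData.password
      }
    };
    // this runs only once the query has actually finished,
    // so the data is available here rather than in an outer variable
    return callback(null, result.resultDetails);
  })
  .catch(function(err) {
    return callback(err);
  });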
I have the following queries, which start with the getById method firing; once that fires and extracts data from another document, it saves the result into the race document.
I want to be able to cache the data for ten minutes after I save it. I have taken a look at the cacheman library and am not sure it is the right tool for the job. What would be the best way to approach this?
getById: function(opts,callback) {
var id = opts.action;
var raceData = { };
var self = this;
this.getService().findById(id,function(err,resp) {
if(err)
callback(null);
else {
raceData = resp;
self.getService().getPositions(id, function(err,positions) {
self.savePositions(positions,raceData,callback);
});
}
});
},
savePositions: function(positions,raceData,callback) {
var race = [];
_.each(positions,function(item) {
_.each(item.position,function(el) {
race.push(el);
});
});
raceData.positions = race;
this.getService().modelClass.update({'_id' : raceData._id },{ 'positions' : raceData.positions },callback(raceData));
}
I have recently coded and published a module called Monc. You can find the source code over here. It provides several useful methods to store, delete and retrieve data kept in memory.
You may use it to cache Mongoose queries with simple chaining, as in:
test.find({}).lean().cache().exec(function(err, docs) {
//docs are fetched into the cache.
});
Otherwise you may need to take a look at the core of Mongoose and override the prototype in order to provide a way to use cacheman as you originally suggested.
Create a Node module and have it extend Mongoose as:
monc.hellocache(mongoose, {});
Inside your module you should extend the Mongoose.Query.prototype
exports.hellocache = module.exports.hellocache = function(mongoose, options, Aggregate) {
  //require cacheman's in-memory engine
  var CachemanMemory = require('cacheman-memory');
  var cache = new CachemanMemory();
  var m = mongoose;
  m.execAlter = function(caller, args) {
    //check the cache here and fall back to the original exec on a miss
  };
  //route every query through execAlter
  m.Query.prototype.exec = function(arg1, arg2) {
    return m.execAlter.call(this, 'exec', arguments);
  };
};
Take a look at Monc's source code as it may be a good reference on how you may extend and chain Mongoose methods
I will explain with the npm redis package, which stores key/value pairs on a cache server. The keys are the queries, and Redis stores only strings.
We have to make sure that keys are unique and consistent, so the key should encode the query and also the name of the model/collection the query is applied to.
When you query, inside the mongoose library there is
function Query(conditions, options, model, collection) {} //constructor function
which is responsible for the query. On this constructor's prototype,
Query.prototype.exec = function exec(op, callback) {}
is the function responsible for executing queries, so we have to patch this function and have it perform these tasks:
first, check whether we have any cached data related to the query
if yes, respond to the request with the cached data right away and return
if no, execute the query, update our cache, and then respond
const redis = require("client");
const redisUrl = "redis://127.0.0.1:6379";
const client = redis.createClient(redisUrl);
const util = require("util");
//client.get does not return promise
client.get = util.promisify(client.get);
const exec = mongoose.Query.prototype.exec;
//mongoose code is written using classical prototype inheritance for setting up objects and classes inside the library.
mongoose.Query.prototype.exec = async function() {
//crate a unique and consistent key
const key = JSON.stringify(
Object.assign({}, this.getQuery(), {
collection: this.mongooseCollection.name
})
);
//see if we have value for key in redis
const cachedValue = await redis.get(key);
//if we do return that as a mongoose model.
//the exec function expects us to return mongoose documents
if (cachedValue) {
const doc = JSON.parse(cacheValue);
return Array.isArray(doc)
? doc.map(d => new this.model(d))
: new this.model(doc);
}
const result = await exec.apply(this, arguments); //now exec function's original task.
client.set(key, JSON.stringify(result),"EX",6000);//it is saved to cache server make sure capital letters EX and time as seconds
};
If we store values as an array of objects, we need to make sure that each object is individually converted to a mongoose document.
this.model on the Query instance is the model for that query; calling new this.model(obj) converts a plain object into a mongoose document.
Note that if you are storing nested values, use client.hset and client.hget instead of client.get and client.set, as in the sketch below.
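For nested values, a rough sketch of the same idea with a Redis hash (using the collection name as the top-level hash key is an arbitrary choice for illustration):
// promisify hget the same way as get
client.hget = util.promisify(client.hget);

// inside the patched exec: read one field of the collection's hash
const cachedValue = await client.hget(this.mongooseCollection.name, key);

// ...and write the result back under the same hash/field
client.hset(this.mongooseCollection.name, key, JSON.stringify(result));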
Now that we have monkey-patched Query.prototype.exec, you do not need to export this function; wherever you have a query operation in your code, mongoose will run the code above.