How can I implement a Parse Server Cloud Code function to write to Firebase?

How can I implement a Cloud Code function on a Back4App Parse Server to write data to the Firebase Realtime Database?
My first approach is below.
Thanks for any help.
const fb = require('firebase'); // not working
const Produto = Parse.Object.extend("Produto");
const Fornecedor = Parse.Object.extend("Fornecedor");
const Users = Parse.Object.extend("Users");

Parse.Cloud.define("CriarUsu", async (request) => {
  const user = new Parse.User();
  const fb = new fb(); // not working
  return ("tudo ok");
});
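A minimal sketch of one possible direction (not verified on Back4App) is to use the firebase-admin SDK with a service account instead of the client-side firebase package; the service account path, database URL, usuarios path and nome parameter below are placeholders:

const admin = require('firebase-admin');
// Hypothetical path to a Firebase service account key deployed with the cloud code
const serviceAccount = require('./firebase-service-account.json');

if (admin.apps.length === 0) {
  admin.initializeApp({
    credential: admin.credential.cert(serviceAccount),
    databaseURL: 'https://<your-project-id>.firebaseio.com' // placeholder
  });
}

Parse.Cloud.define("CriarUsu", async (request) => {
  // Write a record to the Firebase Realtime Database
  await admin.database().ref('usuarios').push({
    nome: request.params.nome // hypothetical request parameter
  });
  return "tudo ok";
});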

Related

How to write a function for a specific document in Firestore?

Below is the function I have used from a YouTube tutorial.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.androidPushNotification = functions.firestore
  .document('MyMoneyNotifications/{MyPercentageMoney}')
  .onCreate((snapshot, context) => {
    admin.messaging().sendToTopic(
      "new_user_forums",
      {
        notification: {
          title: snapshot.data().title,
          body: snapshot.data().body
        }
      });
  });
It is working fine, but I want it to trigger for the structure below.
Notifications (collection)
  -> UserId (document)
    -> MyNotification (collection)
      -> notify1
      -> notify2
Now I want to check whether a specific user has any new notifications. How do I listen to the "MyNotification" collection in Firebase Functions?
If you are trying to listen to sub-collections, then you can set the path to:
exports.androidPushNotification = functions.firestore
  .document('Notifications/{userId}/MyNotification/{notificationId}')
You can read more about this in the documentation
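Combined with the working function from the question, the handler might look like the sketch below (it assumes the new notification documents carry the same title and body fields and should go to the same topic):

exports.androidPushNotification = functions.firestore
  .document('Notifications/{userId}/MyNotification/{notificationId}')
  .onCreate((snapshot, context) => {
    // context.params.userId identifies which user the notification belongs to
    return admin.messaging().sendToTopic(
      "new_user_forums",
      {
        notification: {
          title: snapshot.data().title,
          body: snapshot.data().body
        }
      });
  });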

Using Firebase Cloud Functions to fan out data

I'm extremely new to using Firebase Cloud Functions, and I am struggling to find the error in my code. It is supposed to trigger on a Firestore write and then copy that document into the feeds of all users who follow the user who posted.
My current code is below:
exports.fanOutPosts = functions.firestore
  .document('posts/{postId}')
  .onCreate((snap, context) => {
    var db = admin.firestore();
    const post = snap.data();
    const userID = post['author'];
    const postCollectionRef = db.collection('friends').document(userID).collection('followers');
    return postCollectionRef.get()
      .then(querySnapshot => {
        if (querySnapshot.empty) {
          return null;
        } else {
          const promises = []
          querySnapshot.forEach(doc => {
            promises.push(db.collection('feeds').document(doc.key).collection('posts').document(post.key).update(data));
          });
          return Promise.all(promises);
        }
      });
  });
So this successfully deploys to Firebase, but it receives this error when a document is created:
TypeError: db.collection(...).document is not a function
at exports.fanOutPosts.functions.firestore.document.onCreate (/workspace/index.js:22:60)
Line 22 is const postCollectionRef = db.collection('friends').document(userID).collection('followers');
I am unsure why this line is causing the error on .get(), but if anyone could point me in the right direction it would be much appreciated!
Given that this is the nodejs API, you'll want to use doc() instead of document(). Other languages might use document().
I found this info via the Admin SDK on CollectionReference https://googleapis.dev/nodejs/firestore/latest/CollectionReference.html
According to the reference, the collection reference should be defined as follows:
const postCollectionRef = db.collection(`friends/${userId}/followers`);
Using template literals will allow you to dynamically add variables into the collection ref.
I would also take a look into the else logic to use template literals within your return statement.
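Applying doc() and template literals to the original function gives something like the sketch below. It assumes each follower document's ID is the follower's user ID and that the whole post document should be copied, since the original update(data) call referenced an undefined data variable:

exports.fanOutPosts = functions.firestore
  .document('posts/{postId}')
  .onCreate((snap, context) => {
    const db = admin.firestore();
    const post = snap.data();
    const userID = post['author'];
    // doc() instead of document() in the Node.js Admin SDK
    const followersRef = db.collection(`friends/${userID}/followers`);
    return followersRef.get().then(querySnapshot => {
      if (querySnapshot.empty) {
        return null;
      }
      const promises = [];
      querySnapshot.forEach(doc => {
        // doc.id is assumed to be the follower's user ID; snap.id is the new post's ID
        promises.push(db.doc(`feeds/${doc.id}/posts/${snap.id}`).set(post));
      });
      return Promise.all(promises);
    });
  });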

Saving to two different Firestore databases with dialogflow

I'm making an Actions on Google project that needs to add data to two different Cloud Firestore databases. For some reason, when I trigger the intent, it only saves to the original Cloud Firestore, not the new one.
For simplicity, I'm going to refer to the original Cloud Firestore as "DB1" and the new/second one as "DB2".
Here's what I had tried:
const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const admin = require('firebase-admin');
const {google} = require('googleapis');
const {
  <Basically All of the libraries / AOG functions>
} = require('actions-on-google');

const defaultAppConfig = {"<FIREBASE CREDENTIALS FOR DB1>"};
const OtherAppConfig = {"<FIREBASE CREDENTIALS FOR DB2>"};

const defaultApp = admin.initializeApp(defaultAppConfig); // DB1
const otherApp = admin.initializeApp(OtherAppConfig, 'Other'); // DB2

const db = admin.firestore(functions.config(defaultApp).firebase); // DB1
const ab = admin.firestore(functions.config(otherApp).firebase); // DB2

const app = dialogflow({
  debug: true,
  clientId: '<DIALOGFLOW CREDENTIALS>'
});

app.intent('DB2 Write', (conv) => {
  conv.ask('Okay I made the write to DB2');
  var data = {
    name: 'This is a Test write'
  };
  var setDoc = ab.collection('Test').doc('Write').set(data);
});

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
Sorry if some parts are unnecessary; I wanted to include as much information as I could (I might be missing something that someone else sees).
To sum up what I expected: when I trigger the 'DB2 Write' intent, it should write 'This is a Test write' to DB2; however, it just keeps writing the message/data to DB1.
How do I get this working so it writes to my second Cloud Firestore ("DB2") when this intent is triggered?
Thanks for the help!
Note: If it makes a difference, I'm using the Dialogflow inline editor for this code.
____________________________________________________________________________
Update: Here is what I have tried/updated; it still writes to DB1:
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

const otherAdmin = require('firebase-admin');
otherAdmin.initializeApp({
  credential: otherAdmin.credential.cert(OtherAppConfig)
}, 'Other');
const ab = otherAdmin.firestore();
and as well:
admin.initializeApp(defaultAppConfig);
var otherApp = admin.initializeApp(OtherAppConfig, 'other');
console.log(admin.app().name); // '[DEFAULT]'
console.log(otherApp.name); // 'other'
// Use the shorthand notation to retrieve the default app's services
var db = admin.firestore(functions.config().firebase);
// Use the otherApp variable to retrieve the other app's services
var ab = otherApp.firestore(functions.config().firebase);
I'd like to note that the credentials I'm using for "OtherAppConfig" and "defaultAppConfig" were taken from the Firebase private key (i.e. Firebase console > Project overview > Service accounts > Generate private key). Could this be the problem?
I think the problem is this:
A Dialogflow project and a Firebase project are the same under the hood. This is convenient, as your Firebase Functions intuitively know how to connect with Dialogflow and your database without a lot of manual configuration.
However, if you have two databases from different Cloud projects, you will need some additional configuration to connect securely. I'm not sure what your AppConfigs contain, but they may not be set up sufficiently. As such, the Firebase setup may be pulling the default (current project) app and database when you're grabbing the functions config.
For your second project, you may want to download the service key. Then you can load it as a file, or directly as JSON, in your startup routine.
The snippet below should work the way you want.
// Setup 1st db, this project's db
const admin = require('firebase-admin');
admin.initializeApp(); // Initialize with default params as we get it by default
const db = admin.firestore();

// Setup 2nd db
// Note: require() returns the same module instance, so the second app needs a
// name to avoid a duplicate-app error.
const otherAdmin = require('firebase-admin');
const myOtherServiceKey = { ... }; // service account key of the second project
const otherApp = otherAdmin.initializeApp({
  credential: otherAdmin.credential.cert(myOtherServiceKey)
}, 'other');
const ab = otherApp.firestore(); // This should be the DB of our second project
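For completeness, a sketch of the intent handler from the question writing through ab; returning the promise keeps the fulfillment alive until the write completes:

app.intent('DB2 Write', (conv) => {
  conv.ask('Okay I made the write to DB2');
  // 'ab' is the Firestore instance of the second, explicitly-credentialed app
  return ab.collection('Test').doc('Write').set({ name: 'This is a Test write' });
});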

Firebase Admin Node.js Read/Write Database

It seems I can't find a proper way to use the Admin SDK's read/write functions in Cloud Functions. I am working on a messaging function that reads new messages created in the Realtime Database with Cloud Functions (Node.js) and uses the snapshot to reference a path. Here is my initial exports function:
var messageRef = functions.database.ref('Messages/{chatPushKey}/Messages/{pushKey}');
var messageText;

exports.newMessageCreated = messageRef.onCreate((dataSnapshot, context) => {
  console.log("Exports function executed");
  messageText = dataSnapshot.val().messageContent;
  var chatRef = dataSnapshot.key;
  var messengerUID = dataSnapshot.val().messengerUID;
  return readChatRef(messengerUID, chatRef);
});
And here is the function that reads from the value returned:
function readChatRef(someUID, chatKey){
  console.log("Step 2");
  admin.database.enableLogging(true);
  var db;
  db = admin.database();
  var userInfoRef = db.ref('Users/' + someUID + '/User Info');
  return userInfoRef.on('value', function(snap){
    return console.log(snap.val().firstName);
  });
}
In the Firebase Cloud Functions log I can see all of the console.logs except the one inside return userInfoRef.on.... Is my syntax incorrect? I have attempted several other variations for reading the snapshot. Perhaps I am not using callbacks efficiently? I know for a fact that my service account key and admin features are up to date.
If there is another direction I should be focusing on, please let me know.
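A minimal sketch of a common pattern for one-off reads in Cloud Functions (an assumption, not a confirmed fix for this case): once() returns a promise the function can wait on, whereas on() attaches a persistent listener whose return value is not a promise, so the function may terminate before the callback runs.

function readChatRef(someUID, chatKey) {
  var db = admin.database();
  var userInfoRef = db.ref('Users/' + someUID + '/User Info');
  // once() resolves with a single snapshot that the function can await
  return userInfoRef.once('value').then(function(snap) {
    console.log(snap.val().firstName);
    return snap.val();
  });
}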

How to authenticate to the Dialogflow API with gcloud credentials

I have a Node.js app that makes requests to a Dialogflow agent. I currently use a temporary token-based request, but how can I change this to authenticate through Google service credentials? (https://cloud.google.com/docs/authentication/getting-started). I have a credential created (with billing added) and the service account JSON file.
I would like to use the Dialogflow package for Node (https://www.npmjs.com/package/dialogflow), but I don't understand how to use it with the JSON file.
const projectId = 'ENTER_PROJECT_ID_HERE';
const sessionId = 'quickstart-session-id';
const query = 'hello';
const languageCode = 'en-US';
// Instantiate a DialogFlow client.
const dialogflow = require('dialogflow');
const sessionClient = new dialogflow.SessionsClient();
// Define session path
const sessionPath = sessionClient.sessionPath(projectId, sessionId);
The package's example uses a project ID and a session ID, but not a JSON file like the Google services example (or like BigQuery in How to authenticate with gcloud big query using a json credentials file?). Also, where can I get this project ID and session ID?
Please, can someone help me or point me to a better way to do this? Thanks.
First you have to create a service account and download its credentials as a JSON file to your local system.
There are three ways to use those credentials for authentication/authorisation with the Dialogflow library.
Method 1
Create an environment variable GOOGLE_APPLICATION_CREDENTIALS whose value is the absolute path of that JSON credentials file. With this method, the Google library implicitly loads the file and uses those credentials for authentication; we don't need to do anything in our code relating to the credentials file.
export GOOGLE_APPLICATION_CREDENTIALS="<absolute-path-of-json-file>" # for UNIX,LINUX
# then run your code, google library will pick credentials file and loads it automatically
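With this method, the client from the question's quickstart can be constructed without passing any credentials in code, for example:

// GOOGLE_APPLICATION_CREDENTIALS is picked up from the environment
const dialogflow = require('dialogflow');
const sessionClient = new dialogflow.SessionsClient();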
Method 2
Assuming you know the absolute path of your JSON file, put it as the value of the credentials_file_path variable in the snippet below.
// You can find your project ID in your Dialogflow agent settings
const projectId = '<project-id-here>';
const sessionId = '<put-chat-session-id-here>';
// const sessionid = 'fa2d5904-a751-40e0-a878-d622fa8d65d9'
const query = 'hi';
const languageCode = 'en-US';
const credentials_file_path = '<absolute-file-path-of-JSON-file>';
// Instantiate a DialogFlow client.
const dialogflow = require('dialogflow');
const sessionClient = new dialogflow.SessionsClient({
  projectId,
  keyFilename: credentials_file_path,
});
Method 3
You can note down the project_id, client_email, and private_key from the JSON file and use them explicitly in your code for authentication.
// You can find your project ID in your Dialogflow agent settings
const projectId = '<project-id-here>';
const sessionId = '<put-chat-session-id-here>';
// const sessionid = 'fa2d5904-a751-40e0-a878-d622fa8d65d9'
const query = 'hi';
const languageCode = 'en-US';
const credentials = {
  client_email: '<client-email-here>',
  private_key: '<private-key-here>',
};
// Instantiate a DialogFlow client.
const dialogflow = require('dialogflow');
const sessionClient = new dialogflow.SessionsClient({
  projectId,
  credentials,
});
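Whichever method is used, a request can then be sent with detectIntent. A sketch following the package's quickstart field names:

const sessionPath = sessionClient.sessionPath(projectId, sessionId);

const request = {
  session: sessionPath,
  queryInput: {
    text: {
      text: query,
      languageCode: languageCode,
    },
  },
};

sessionClient
  .detectIntent(request)
  .then(responses => {
    const result = responses[0].queryResult;
    console.log(`Query: ${result.queryText}`);
    console.log(`Response: ${result.fulfillmentText}`);
  })
  .catch(err => console.error('detectIntent error:', err));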
Here is how you can do it with a service account. The code sample is in Kotlin, but it can definitely be translated to the Node.js SDK:
val credentialsProvider = FixedCredentialsProvider.create(
    ServiceAccountCredentials.fromStream(
        Classes.getResourceAsStream([YOUR JSON CONFIG FILE GOES HERE])))
val sessionsSettings = SessionsSettings.newBuilder()
    .setCredentialsProvider(credentialsProvider)
    .build()
sessionsClient = SessionsClient.create(sessionsSettings)
You can get the service account from the Dialogflow settings: click on the service account link, and then create a JSON config file in your Cloud console.
