Dialogflow Fulfillment - Multiple Intents - dialogflow-es

I need some help creating the fulfillment for the intents I have created in Dialogflow. I have created 15 intents, integrated them, and tested them, and they work fine. I am stuck on fulfillment and unable to proceed because I am confused about the fulfillment setup, since I believe it has to be done for every intent that I have created. I am unsure whether this can be completed in one click using the Fulfillment link on the left pane. Does it work if I just click on Fulfillment directly and deploy? I am really confused. Please help me out.

Setting up Fulfillment is a multi-step process.
Enabling Fulfillment
Select Fulfillment on the left navigation
If your fulfillment code will be running at a remote webhook, enable "Webhook" and enter the URL for your webhook.
If you don't have a place to run your fulfillment code, you can also use the Inline Editor to get started. Enable this, and you'll be entering your code here directly.
Save the configuration.
Enabling for each Intent
While this sets the Fulfillment that will be used for your project, you must still enable this for each Intent that should call it.
Go back to the Intent listing and select an Intent.
Scroll towards the bottom of the page to the Fulfillment section.
Turn "Enable webhook call for this intent" on.
Save the configuration.
Repeat this for every Intent that you want to process using Fulfillment.
Deploying your webhook
You will also need to write your webhook to handle the various Intents that are triggered. The code for the Inline Editor can be a good place to start.
In the intentMap, you will need to add an entry mapping the Intent name to a function that does the handling when that Intent triggers the webhook. You can have a different handler function for each Intent, use the same function for some, have those functions call other functions - whatever you need.
A couple of things to note, however:
If your handler needs to do something asynchronous (access a database, make a network call, etc.), then you need to make sure you return a Promise (see the sketch after the sample code below).
If you're using the Inline Editor and you're making network calls outside of Google's network, then you need to upgrade your Firebase subscription to the Blaze plan. (You will still likely be able to work with the free tier of that plan.)
'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  function welcome(agent) {
    agent.add(`Welcome to my agent!`);
  }

  function fallback(agent) {
    agent.add(`I didn't understand`);
    agent.add(`I'm sorry, can you try again?`);
  }

  function handlerOne(agent) {
    agent.add(`This is handler one`);
  }

  function handlerThree(agent) {
    agent.add(`This is handler three`);
  }

  // Run the proper function handler based on the matched Dialogflow intent name
  let intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  intentMap.set('Default Fallback Intent', fallback);
  intentMap.set('intent.one', handlerOne);
  intentMap.set('intent.two', handlerOne);
  intentMap.set('intent.three', handlerThree);
  agent.handleRequest(intentMap);
});
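As noted above, a handler that does asynchronous work must return a Promise so that agent.handleRequest() waits for it before sending the response. A minimal sketch, where handlerAsync, lookupExamDate, and intent.four are hypothetical names used only for illustration:

// Hypothetical async helper standing in for a real database or HTTP call.
function lookupExamDate(subject) {
  return Promise.resolve('the 30th');
}

function handlerAsync(agent) {
  // Returning the Promise chain tells agent.handleRequest() to wait before replying.
  return lookupExamDate('biology')
    .then(date => {
      agent.add(`Your next exam is on ${date}.`);
    })
    .catch(err => {
      console.error(err);
      agent.add(`Sorry, I couldn't look that up right now.`);
    });
}

// Wired up like any other handler, e.g.: intentMap.set('intent.four', handlerAsync);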

You can create a map in your webhook, where every intent is mapped to a corresponding handler in fulfillment.
Here is some sample code:
const express = require("express");
const { WebhookClient } = require("dialogflow-fulfillment");
const { welcome, defaultFallback } = require("./intents/welcomeExit");

const app = express();

app.post("/dialogflow", express.json(), (req, res) => {
  const agent = new WebhookClient({ request: req, response: res });
  let intentMap = new Map();
  intentMap.set("Default Welcome Intent", welcome);
  intentMap.set("Default Fallback Intent", defaultFallback);
  agent.handleRequest(intentMap);
});

app.listen(process.env.PORT || 8080);
This example is written in Node.js; you can use any other language supported by Dialogflow.
To see how it gets configured in Dialogflow, check this link.
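The ./intents/welcomeExit module imported above is not shown in the answer; a hypothetical sketch of what it might export:

// intents/welcomeExit.js - hypothetical contents of the module required above
function welcome(agent) {
  agent.add("Welcome! How can I help you today?");
}

function defaultFallback(agent) {
  agent.add("Sorry, I didn't get that. Could you rephrase?");
}

module.exports = { welcome, defaultFallback };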

Related

How to add Suggestion Chips through Fulfillment for the Dialogflow Messenger integration?

So I have the Dialogflow Messenger embedded in a website and want to add some Suggestion chips. It's easy through the Custom Payload Response type and they show up just fine.
But how do I add them through fulfillment?
I currently have a custom webhook set up, and the idea is to have something like this:
if (x) {
  agent.add('blablabla');
  agent.add(new Suggestion('One'));
} else {
  agent.add('blablabla');
  agent.add(new Suggestion('Two'));
}
new Suggestion doesn't work though, so is there another way of doing this?
I was thinking about something like this:
agent.add(new Payload({
  "richContent": [
    [
      {
        "options": [
          {
            "text": "One"
          },
          {
            "text": "Two"
          }
        ],
        "type": "chips"
      }
    ]
  ]
}));
Essentially trying to insert the Custom Payload directly into the response JSON, if that makes any sense. But yeah no idea how to actually do it. Anyone know how?
It is unclear to me what exactly you mean by new Suggestion() doesn't work. Do you mean the suggestion chips do not show in Dialogflow Messenger? Do they show in Dialogflow itself?
Let me share a few points:
As far as I know, the structure agent.add(new Suggestion("One")); should work. I tried a simple example and it works fine in the Dialogflow UI, with this code:
const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  function welcome(agent) {
    agent.add("What is your favorite animal?");
    agent.add(new Suggestion("Dog"));
    agent.add(new Suggestion("Cat"));
  }

  function fallback(agent) {
    agent.add(`I didn't understand`);
    agent.add(`I'm sorry, can you try again?`);
  }

  let intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  intentMap.set('Default Fallback Intent', fallback);
  agent.handleRequest(intentMap);
});
If suggestion chips are not rendered even in the Dialogflow UI, I would suggest trying the previous code to rule out any potential issues with your Dialogflow setup. You may also need to upgrade some dependencies, e.g. "dialogflow-fulfillment": "^0.6.1".
Some integrations, like Google Assistant, use the Suggestions library from actions-on-google. See, for example, the official Google Assistant code example. You may try to follow a similar approach if it fits your use case, although I do not think that is the case here. As a reference you can check this github issue.
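If the chips render in the Dialogflow UI but not in Dialogflow Messenger, another option worth trying is to send the richContent chips as a custom payload from fulfillment. This is only a hedged sketch using the Payload class exported by dialogflow-fulfillment; the richContent structure is taken from the question, and the platform constant and options may need adjusting for your library version:

const { Payload } = require('dialogflow-fulfillment');

function chipsHandler(agent) {
  agent.add('Pick one:');
  const richContent = {
    richContent: [
      [
        {
          type: 'chips',
          options: [
            { text: 'One' },
            { text: 'Two' }
          ]
        }
      ]
    ]
  };
  // Send it as a raw custom payload so the Dialogflow Messenger integration can render the chips.
  agent.add(new Payload(agent.UNSPECIFIED, richContent, { rawPayload: true, sendAsMessage: true }));
}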

Wait some seconds before agent's reply

I'm trying to build a very simple Dialogflow app for Actions on google.
What I had in mind was a very simple timer, where every X seconds the agent tells the user "X seconds left".
I'm using the Fulfillment section in Dialogflow. What I've tried is a simple setTimeout that includes another agent.add, but this seems to be ignored by Dialogflow when I deploy it:
function startTimer(agent) {
  agent.add("Timer started! 20 seconds from now.");
  setTimeout(function(){
    agent.add("10 seconds left!");
  }, 10000);
  agent.add("Time out.");
}

let intentMap = new Map();
intentMap.set('timer', startTimer);
agent.handleRequest(intentMap);
The response from the Assistant is simply "Timer started" and "Time out", without the X seconds remaining. Is there any way to add a reply after an intent has started? Thanks!
EDIT | As suggested, I have tried SSML, but the tags are displayed on the screen when they are spoken by the Assistant.
const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });

  function startTimer(agent) {
    agent.add("Something to say");
    agent.add(`<speak><seq><media begin="30s"><speak>30 seconds</speak></media><media begin="30s"><speak>1 minute</speak></media></seq></speak>`);
    agent.add(new Suggestion(`Quit`));
  }

  let intentMap = new Map();
  intentMap.set('timer-go', startTimer);
  agent.handleRequest(intentMap);
});
It's not possible for an Action to start a conversation, and the fulfillment code (your function) must return within 10 seconds, or the Google Assistant will close the Action with a time-out warning.
Your setTimeout is not working because this code runs in the cloud: the response is sent back to the Assistant when your handler finishes, and you are only adding items in a callback that fires after that, without returning anything (such as a Promise) for the library to wait on.
This page from the Dialogflow documentation explains how back-end fulfillment works with Dialogflow / Google Assistant.
Instead, you can use SSML in your response and control when each part is spoken.
e.g.
<speak>
  <seq>
    <media begin="0s">
      <speak>Timer started! 20 seconds from now</speak>
    </media>
    <media begin="10.0s">
      <speak>10 seconds left!</speak>
    </media>
  </seq>
</speak>
Also, check the SSML documentation for more information.
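Regarding the EDIT about the tags showing up on screen: when SSML is sent as a plain text response, the raw markup is displayed. One possible workaround, sketched here under the assumption that you keep the dialogflow-fulfillment WebhookClient and add the actions-on-google library as a dependency, is to wrap the SSML in a SimpleResponse that carries separate speech and display text:

const { SimpleResponse } = require('actions-on-google'); // assumed extra dependency

function startTimer(agent) {
  // Pull the Actions on Google conversation object from the WebhookClient.
  const conv = agent.conv();
  conv.ask(new SimpleResponse({
    // Spoken output: the SSML with the timed <media> elements.
    speech: `<speak><seq><media begin="0s"><speak>Timer started! 20 seconds from now</speak></media><media begin="10s"><speak>10 seconds left!</speak></media></seq></speak>`,
    // Display output: shown on screen instead of the raw tags.
    text: 'Timer started! 20 seconds from now.'
  }));
  agent.add(conv);
}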

nodejs Dialogflow v2 close a conversation from the fulfillment

How do I end my conversation from the webhook?
Marking it within Dialogflow does nothing; it basically does not stop, since I am using the webhook for fulfillment.
And if I add it to the code as below, then it does not play the media.
// Import the Dialogflow module from the Actions on Google client library.
// https://github.com/actions-on-google/actions-on-google-nodejs
const {dialogflow, Suggestions, MediaObject, Image} = require('actions-on-google');

// Import the firebase-functions package for Cloud Functions for Firebase fulfillment.
const functions = require('firebase-functions');

// Node util module used for creating dynamic strings
const util = require('util');

// Instantiate the Dialogflow client with debug logging enabled.
const app = dialogflow({
  debug: true
});

// Do common tasks for each intent invocation
app.middleware((conv, framework) => {
  console.log(`Intent=${conv.intent}`);
  console.log(`Type=${conv.input.type}`);
  console.log(`Arguments=${conv.arguments}`);
  console.log(`Arguments=${typeof(conv.arguments)}`);
  // Determine if the user input is by voice
  conv.voice = conv.input.type === 'VOICE';
  if (!(conv.intent === 'Default Fallback Intent' || conv.intent === 'No-input')) {
    // Reset the fallback counter for error handling
    conv.data.fallbackCount = 0;
  }
});

app.intent('Play Sound', (conv, {SoundType, duration}) => {
  const suggestions1 = new Suggestions('do this ', 'do that', 'do nothing');
  const simple_response = 'this is a response from the webhook';
  conv.ask(simple_response);
  conv.ask(new MediaObject({
    name: SoundType,
    url: some_mp3file_url,    // placeholder for the actual MP3 URL
    icon: new Image({
      url: some_image_url,    // placeholder for the actual image URL
      alt: 'Media icon'
    })
  }));
  conv.ask(suggestions1);
  // If I close from the code it does not play the sound
  conv.close();
  // If I comment out the close statement above then it does not close, and toggling on
  // "set this intent as the end of conversation" does not seem to help.
});

// Presumably present in the full code (trimmed in the question): export the app as the
// Cloud Function that Dialogflow calls.
exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
Update - This was in fact a bug, as pointed out by one of the comments. It was reported to Google and they fixed it in April or May.
I can duplicate the issue, but it appears to be a bug - playing audio as part of the response and having the conversation close after the audio finishes used to work. It is clearly supposed to be supported - the documentation and the simulator state that Suggestions aren't required if this is a final response.
The workaround is to create an additional Intent that handles the actions_intent_MEDIA_STATUS event. This Intent would then close the conversation (see the sketch below).
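A hedged sketch of that workaround, continuing the actions-on-google code above and assuming a Dialogflow intent (called 'Media Status' here, a hypothetical name) that has the actions_intent_MEDIA_STATUS event attached:

// Triggered by the actions_intent_MEDIA_STATUS event once media playback finishes.
app.intent('Media Status', (conv) => {
  // The audio has finished, so it is now safe to end the conversation.
  conv.close('Thanks for listening. Goodbye!');
});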

Invoke a Dialogflow event with a specific device source

After trying and trying countless times, I am asking for your help to call a Dialogflow event (GoogleHome) from a specific Google Home device.
Through Node.js I managed to successfully call a Dialogflow event and I get the fulfillment response. All perfect, except that I need my Google Home device to speak the fulfillment; I do not need a text-only answer.
My goal is to have my Google Home device speak first, without the user saying "Ok, Google", and then wait for a response from the user.
I did not find anything on the web; my attempts stop at invoking the Dialogflow event and getting a response in the console.
This is the code I have tried for the fulfillment:
// Assumes elsewhere in the module: const dialogflow = require('dialogflow');
//                                  const uuid = require('uuid');
test: async function () {
  console.log("[funcGHTalk|test] CALLED");

  const projectId = "[[projectid]]";
  const LANGUAGE_CODE = 'it-IT';
  let eventName = "[[eventname]]";
  const sessionId = uuid.v4();

  const sessionClient = new dialogflow.SessionsClient();
  const sessionPath = sessionClient.sessionPath(projectId, sessionId);

  // The text query request.
  const request = {
    session: sessionPath,
    queryInput: {
      event: {
        name: eventName,
        languageCode: LANGUAGE_CODE
      },
    },
  };

  // Send request and log result
  const responses = await sessionClient.detectIntent(request);
  console.log('Detected intent');
  const result = responses[0].queryResult;
  console.log(result);
  console.log(`  Query: ${result.queryText}`);
  console.log(`  Response: ${result.fulfillmentText}`);
  if (result.intent) {
    console.log(`  Intent: ${result.intent.displayName}`);
  } else {
    console.log(`  No intent matched.`);
  }
}
The code you have written is using the Dialogflow Detect Intent API. This is meant to run on consoles and servers to send a message to Dialogflow, which will parse it, determine which Intent it matches, call fulfillment with that information, and return all the results.
You don't need to run this on a Google Home, since the Google Assistant does all this for you.
What I think you're looking for is to develop fulfillment with Actions on Google and the Dialogflow Fulfillment API. This handles things on the other end: after Dialogflow determines which Intent matches what the user has said, and if that Intent has fulfillment enabled, it sends the information to your webhook, which is running on a cloud server somewhere. You would then process it and send a reply (using either the actions-on-google library or the dialogflow-fulfillment library is easiest), and it would be sent back to the Assistant.
You indicated that you want the Action to let your Google Home device "speak first, without the word 'Ok, Google', and wait for a response from the user". This is much more complicated, and not really possible to do with the Google Home device right now. Most Actions have the user initiating the conversation with "Ok Google, talk to my test app" or whatever the name of the Action is.
You don't indicate how you expect to trigger the Home to begin talking, but you may wish to look into notifications to see if those fit your model; however, notifications don't work with the Home right now, just the Assistant on mobile devices.

How to uniquely identify a user on Dialogflow fullfilment?

I need to send a unique identifier to my web service through Dialogflow Fulfillment so that I can recognize who is making the request.
For that I need to uniquely identify a user in Dialogflow Fulfillment, but I can't find how to get a token or something like that inside the Inline Editor so that I can identify the device that is making the request.
I tried to see what is inside the agent variable.
But I found nothing that I could use to identify the user who is making the request to my web service.
I also tried to get the userStorage, as shown in How to identify unique users with Diagflow, but it gives me the error:
Cannot read property 'userStorage' of undefined
at verificarBiologia (/user_code/index.js:37:76)
at WebhookClient.handleRequest (/user_code/node_modules/dialogflow-fulfillment/src/dialogflow-fulfillment.js:273:44)
at exports.dialogflowFirebaseFulfillment.functions.https.onRequest (/user_code/index.js:52:9)
at cloudFunction (/user_code/node_modules/firebase-functions/lib/providers/https.js:26:47)
at /var/tmp/worker/worker.js:686:7
at /var/tmp/worker/worker.js:670:9
at _combinedTickCallback (internal/process/next_tick.js:73:7)
at process._tickDomainCallback (internal/process/next_tick.js:128:9)
Probably because the variable user is undefined.
This is my code:
// See https://github.com/dialogflow/dialogflow-fulfillment-nodejs
// for Dialogflow fulfillment library docs, samples, and to report issues
'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

  function welcome(agent) {
    agent.add(`Welcome to my agent!`);
  }

  function fallback(agent) {
    agent.add(`I didn't understand`);
    agent.add(`I'm sorry, can you try again?`);
  }

  function verificarBiologia(agent) {
    agent.add('Inicio do metodo');
    console.log('Build 059');
    console.log(agent);
    let payload = request.body.originalDetectIntentRequest.payload;
    console.log(payload);
    let userStorage = request.body.originalDetectIntentRequest.payload.user.userStorage || JSON.stringify({});
    let userId;
    console.log("userStorage", userStorage);
    userStorage = JSON.parse(userStorage);
    console.log("userStorage_after_parsing", userStorage);
    agent.add('Final do metodo');
  }

  // Run the proper function handler based on the matched Dialogflow intent name
  let intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  intentMap.set('Default Fallback Intent', fallback);
  intentMap.set('VerificarBiologia', verificarBiologia);
  agent.handleRequest(intentMap);
});
EDIT
The request body is as follows:
{
  "responseId": "f4ce5ff7-ac5f-4fec-b5bd-4e5007e4c2de",
  "queryResult": {
    "queryText": "Quando tenho prova de biologia?",
    "parameters": {
      "disciplinaBiologia": "biologia"
    },
    "allRequiredParamsPresent": true,
    "fulfillmentText": "Você tem uma prova de biologia no dia 30. Tire suas dúvidas com o professor.",
    "fulfillmentMessages": [
      {
        "text": {
          "text": [
            "Você tem uma prova de biologia no dia 30. Não deixe de fazer os exercícios."
          ]
        }
      }
    ],
    "intent": {
      "name": "projects/verificadorprovas/agent/intents/020017a0-e3a9-46f0-9a2e-d93009f5ac42",
      "displayName": "VerificarBiologia"
    },
    "intentDetectionConfidence": 1,
    "languageCode": "en"
  },
  "originalDetectIntentRequest": {
    "payload": {
    }
  },
  "session": "projects/verificadorprovas/agent/sessions/3700fddf-3572-4221-fffc-a0dc1bf28330"
}
Can someone help me to do that? What do I have to do to get something that I can use to identify the user?
Thanks in advance
Testing things in the Dialogflow simulator with "Try it now" does not simulate the Actions on Google environment. To do that, you need to use the Actions on Google Simulator, which you can get to by clicking on "See how it works in the Google Assistant" a few lines down.
If you are planning to target the Google Assistant using Actions on Google, then the best way to identify a user is by using Account Linking. Check out the following link for more about fetching user information for the Google Assistant.
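If full Account Linking is more than you need, here is a hedged sketch of the userStorage approach the question attempted, written with agent.conv() from dialogflow-fulfillment plus the actions-on-google library. It only works when the request really comes from the Google Assistant (which is why the empty originalDetectIntentRequest.payload above crashed), and the stored value only persists across conversations for verified users:

const { v4: uuidv4 } = require('uuid'); // assumed dependency (uuid v7+) for generating an anonymous ID

function verificarBiologia(agent) {
  // agent.conv() returns the Actions on Google conversation object,
  // or null when the request did not come from the Google Assistant.
  const conv = agent.conv();
  if (!conv) {
    agent.add('Please test this from the Actions on Google simulator or a real device.');
    return;
  }
  // Reuse the ID kept in the Assistant's userStorage, or mint a new one on first contact.
  if (!conv.user.storage.userId) {
    conv.user.storage.userId = uuidv4();
  }
  conv.ask(`Your anonymous id is ${conv.user.storage.userId}`);
  agent.add(conv); // hand the updated conversation (including userStorage) back to the WebhookClient
}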
