My Dialogflow agent is matching intents accurately, but I want to use the same agent to recognize queries in other languages and match the proper intents.
I'm using Actions on Google and Dialogflow, with a webhook for most of the intents, which I developed in Node.js using the actions-on-google npm library.
I tried Dialogflow middleware to detect the language of the user query and then translate it into English, but the middleware is triggered after intent matching.
So what I want to achieve is to translate the user query into English first and only then match one of the intents.
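Something like this rough sketch is what I have in mind for the translation step, assuming the @google-cloud/translate client library (toEnglish is just a name I made up):

    const {Translate} = require('@google-cloud/translate').v2;
    const translate = new Translate();

    // Detect the query language and translate it to English
    // before any intent matching happens.
    async function toEnglish(userQuery) {
      const [detection] = await translate.detect(userQuery);
      if (detection.language === 'en') {
        return {text: userQuery, sourceLanguage: 'en'};
      }
      const [translated] = await translate.translate(userQuery, 'en');
      return {text: translated, sourceLanguage: detection.language};
    }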
My setup:
I'm using actions-on-google to create a custom action for Google Assistant. Every intent is a fallback in Dialogflow that is redirected to my server, where I match the intent with a custom NLP engine and then respond to it using the actions-on-google package.
The issue:
But when a query is suicidal, for example "I want to kill myself", Google stops the action and gives its own response instead of letting my custom action handle it. Is there a way to avoid this? It also happens with certain words like "Save"; in that case Google stops the action and asks me if I want to save an event or whatnot.
Can I override this behavior?
You're running into a "feature" that Google calls "no-match yielding". In some cases, if you're using a Fallback Intent in Dialogflow and the Google Assistant itself can handle the question, it will do so.
To avoid this, you can use regular Intents that match the @sys.any system entity.
However, if all you're doing for everything is sending it to your NLU, then you don't need to be using Dialogflow at all. You can just use the Actions SDK (v2) to send all TEXT intents to your NLU for processing.
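As a rough sketch of that Actions SDK route with the actions-on-google Node.js library (myNlu is a hypothetical stand-in for your own NLU engine):

    const {actionssdk} = require('actions-on-google');
    const app = actionssdk();

    // MAIN fires on invocation; TEXT fires for every raw user utterance after that.
    app.intent('actions.intent.MAIN', (conv) => {
      conv.ask('Hi! What can I do for you?');
    });

    app.intent('actions.intent.TEXT', async (conv, input) => {
      // `input` is the raw text of the user's query; hand it to your own NLU.
      const reply = await myNlu.handle(input); // hypothetical NLU call
      conv.ask(reply);
    });

The resulting app is an HTTP handler, so you can mount it on Express or expose it as a Firebase Function.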
Any ideas or examples on how to detect an intent using the Dialogflow detectIntent API (https://cloud.google.com/dialogflow/docs/reference/rest/v2/projects.agent.sessions/detectIntent) from a JSON file?
Thank you in advance.
The function is to be hosted on Firebase Functions, and I am using Node.js.
I will try to explain what I am trying to do.
So I have a bunch of intents on Dialogflow already, and I know that the starting intent is going to be the Welcome intent. After the Welcome intent, I want to shape the conversation according to the intent list in the JSON file.
The entire flow is something like this.
Bot: Hello! What form would you like to fill out today?
User: Leave form
Dialogflow matches "leave form" with the leave form entity and pulls the leave form JSON from storage. The JSON contains the fields that are required to fill out the form.
Bot: Hi, so I would need some details to fill this form.
Bot: Let's start with your name ..... etc etc
So, instead of detecting the intent from the user's side, I need to determine the intent from the JSON. The webhook fulfillment is currently hosted on Firebase Functions, and the JSON is stored in Firebase Storage (not the database).
The detectIntent API only helps you detect the intent from text that you send to the Dialogflow agent through the API. So you cannot read the intent from the JSON file using the detectIntent API; the intent must be imported into the Dialogflow agent.
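For reference, a minimal detectIntent call from Node.js might look like this, assuming the @google-cloud/dialogflow client library (projectId and sessionId are placeholders):

    const dialogflow = require('@google-cloud/dialogflow');

    async function detectIntent(projectId, sessionId, text) {
      const sessionClient = new dialogflow.SessionsClient();
      const sessionPath = sessionClient.projectAgentSessionPath(projectId, sessionId);

      const [response] = await sessionClient.detectIntent({
        session: sessionPath,
        queryInput: {
          text: {text, languageCode: 'en'},
        },
      });

      // queryResult carries the matched intent and its fulfillment text.
      return response.queryResult;
    }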
I would like to give my Dialogflow-based chatbot a name, so that when I type "Mels" it recognizes that I am referring to it. Is there any way to do that?
There are two ways to do it.
Explicit invocation: the user has to specifically ask to chat with your chatbot. For example, my chatbot's name is "SukhiBot". If the user types "talk to SukhiBot" in Google Assistant, the chatbot is started and takes care of the further conversation.
Implicit invocation: you define an intent specific to the trigger. For example, if your chatbot provides bill/invoice information, you set up specific intents with keywords around billing and invoicing. When Dialogflow determines that particular intent from the user's text, your chatbot is invoked. You can set this up in the Google Assistant integration screen in Dialogflow.
To invoke your app via Google Assistant, you use invocation phrases. For instance, in "Ok Google, talk to nutrition calculator", "nutrition calculator" is your app's name.
You can set your invocation phrases in the Actions on Google console.
Some helpful resources:
https://developers.google.com/actions/discovery/
I have developed a complex Dialogflow bot, which uses complex dialogs to get data from the user and store it in a database, and then responds to users by querying that database.
Can I use the same logic/webhooks/code from an Alexa skill? I don't want to write such complex logic again for the Alexa skill.
What I want is: whenever the user triggers an Alexa intent, transfer that intent to my Dialogflow webhook to handle it. Is that possible? If so, can you please point me to any documentation/examples/tutorials?
My Dialogflow model consists of 4 slot types:
Date
Number
any
some custom slots
I am certain this is not possible straight away, as the REST API of Dialogflow is different from that of Alexa. Also, Alexa integration is not fully supported in Dialogflow the way Facebook or Slack are. If your code is well written and the business logic is separate from the platform request/response mapping, then you will be able to use the same business logic in your Alexa webhook code. You just need to write the code for consuming Alexa's requests and responses in this case.
Yes, this is possible. While Dialogflow and Alexa have different webhook JSON formats, fundamentally they both do the same thing. You will need to handle parsing the JSON to get what you need, and then formatting the response so each uses its particular format, but the logic you are using should still be sound and available to both.
Dialogflow lets you export the model into an Alexa-compatible format that you can paste into the Alexa Skills Kit. This helps at least a bit.
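As a rough illustration of that separation, here is one hypothetical shape: a single platform-neutral handler with a thin adapter per platform (the request/response fields are simplified, and handleIntent is a made-up name):

    // Shared business logic, independent of any platform.
    async function handleIntent(intentName, slots) {
      // ...query the database, build a reply string...
      return `Here is what I found for ${intentName}`;
    }

    // Dialogflow webhook adapter (v2 fulfillment JSON, simplified).
    async function dialogflowWebhook(req, res) {
      const qr = req.body.queryResult;
      const reply = await handleIntent(qr.intent.displayName, qr.parameters);
      res.json({fulfillmentText: reply});
    }

    // Alexa webhook adapter (Alexa Skills Kit IntentRequest JSON, simplified).
    async function alexaWebhook(req, res) {
      const intent = req.body.request.intent;
      const reply = await handleIntent(intent.name, intent.slots);
      res.json({
        version: '1.0',
        response: {outputSpeech: {type: 'PlainText', text: reply}},
      });
    }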
I'm trying to use a Translate API webhook to enable multi-language functionality in my Dialogflow agent.
The flow is:
1. The intents are configured in English.
2. When text is input to the bot in any other language, it goes to the Default Fallback Intent, where the webhook is called to translate that text. The translated output and the user's language context are returned.
3. If I knew exactly which intent should be triggered next in the sequence, I could just configure a follow-up event for it. But here the bot's output is the translated text, which shouldn't be printed; it needs to be matched against all the English intents to see which one is a hit. Then the output defined in the matching intent should be translated back into the user's language and produced as output.
I'm not sure how to send the translated text from the fulfillment back through all the intents so it matches accordingly. Please help?
For your step 3, you'll want to call Dialogflow's API with the translated text. If you're using Dialogflow v1, you'll be calling the /query endpoint. If you're using Dialogflow v2, you'll need to set up a session and then use the detectIntent action.
You'll then take the result you get back from this call, translate it back into the user's language, and send the result to them.
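A minimal sketch of that round trip with the v2 route, assuming the @google-cloud/translate and @google-cloud/dialogflow client libraries (projectId, sessionId, and userLang are placeholders):

    const {Translate} = require('@google-cloud/translate').v2;
    const dialogflow = require('@google-cloud/dialogflow');

    const translate = new Translate();
    const sessions = new dialogflow.SessionsClient();

    async function answerInUserLanguage(projectId, sessionId, userText, userLang) {
      // 1. Translate the user's text to English so it can match the English intents.
      const [english] = await translate.translate(userText, 'en');

      // 2. Run detectIntent against the agent with the English text.
      const [response] = await sessions.detectIntent({
        session: sessions.projectAgentSessionPath(projectId, sessionId),
        queryInput: {text: {text: english, languageCode: 'en'}},
      });

      // 3. Translate the matched intent's response back to the user's language.
      const [reply] = await translate.translate(
          response.queryResult.fulfillmentText, userLang);
      return reply;
    }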