I've built a webhook for Dialogflow intent fulfillment. Is there any way to include formatting in the text responses provided by the webhook? Ideally I'd like to bold, italicize, make links, and add newlines. Especially newlines.
I've tried including \n in my response, but no joy.
Actions on Google supports a limited subset of Markdown in cards, but in a Simple Response only the newline works:

- new line: a double space followed by \n
- **bold**
- *italics*

More info in the documentation under Responses.
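For reference, a minimal sketch of the newline convention described above (the helper name `with_newline` is just for illustration); bold and italics would only render in cards:

```python
# Sketch: build display text for a Simple Response using the
# markdown-style newline (two spaces followed by \n).
def with_newline(*lines):
    """Join display lines with the double-space + newline sequence."""
    return "  \n".join(lines)

text = with_newline("First line", "Second line")
print(repr(text))  # 'First line  \nSecond line'
```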
I have created a simple conversational flow in Dialogflow that accepts various questions and speaks pre-programmed replies, all defined in a series of intents. There are no external hooks etc.
When used on a screen-based device (e.g. a mobile phone) I want to display more text than is spoken (displayText). For example:
User: "What colour is the sky?"
Bot: "Blue" (spoken and displayed on screen). "At night it is black." (additional information displayed on screen only.)
I want to do the same for each intent.
What is the simplest way of achieving that please? I would prefer to keep most of it in Dialogflow and to write the minimum amount of code possible.
It's OK, I found the solution, thanks. In Dialogflow intents, under Response, there are two tabs: Default and Google Assistant. Under Google Assistant there is an option, Customise audio output. When you select that you get two input fields, one for text and one for speech.
So to use the above example under intent training phrase I entered "What colour is the sky?"
Under Default Response I entered "Blue"
Under Google Assistant response, Text Output field I entered: "Blue. At night it is black."
Under Google Assistant response, Speech Output field I entered: "Blue".
It works perfectly on both Google Home (voice only) and the Assistant on a mobile phone (it speaks "Blue" but displays "Blue. At night it is black.").
It doesn't even seem necessary to enter anything in Default Response. It works fine on Google Home and Assistant on the phone without it. Not sure about other platforms though.
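For anyone doing the same thing from a webhook instead of the Dialogflow console, a rough sketch of the equivalent response, assuming the Dialogflow v2 webhook format with a Google Assistant `simpleResponse` payload (the `build_response` helper is just for illustration):

```python
# Sketch (assumption): produce the same speak-vs-display split from a
# webhook by returning a simpleResponse with separate textToSpeech and
# displayText fields in the Google payload.
import json

def build_response(speech, display):
    """Build a webhook response that speaks one string but displays another."""
    return {
        "payload": {
            "google": {
                "richResponse": {
                    "items": [{
                        "simpleResponse": {
                            "textToSpeech": speech,   # what the Assistant says
                            "displayText": display,   # what appears on screen
                        }
                    }]
                }
            }
        }
    }

print(json.dumps(build_response("Blue", "Blue. At night it is black."), indent=2))
```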
The bot which I have created within Dialogflow is using a webhook to link to our external site.
One of the intents we have for the bot is to search for knowledge within the site. Originally, in the Request Knowledge intent, we had a phrase which was an @sys.any parameter, which would then be the search term.

However, because the whole phrase was an @sys.any parameter, this intent would be prioritised over most other intents.

We are trying to get users to use natural language when using the bot, but people still just type in one word or a phrase for the search function.

What I would like, if possible, is to have a fallback intent which acts as the search function. So if the bot cannot successfully match the one word, it would then run a search for this word.

I am not sure if this would fix the problem or just produce more issues.
If anyone has solved something similar to this, I would greatly appreciate the help. Sorry if this is simple to do, I am all new to the whole Dialogflow world!
You can turn fulfillment on for Fallback Intents, and these will be sent to your webhook. The JSON includes the full text of what was entered.
However... the results will be noisier, since some of what reaches the webhook will be conversational text that simply wasn't picked up by one of the other Intents.
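A rough sketch of such a fallback handler, assuming the Dialogflow v2 webhook format (the raw user input arrives in `queryResult.queryText`); `search_site` is a hypothetical stand-in for the site's real search backend:

```python
# Sketch (assumption): a minimal Fallback Intent handler that treats
# whatever the user typed as a search term.
def search_site(term):
    # Placeholder: in practice, call the external site's search API here.
    return ["result for '%s'" % term]

def handle_fallback(request_json):
    """Extract the raw query text and run it through site search."""
    term = request_json["queryResult"]["queryText"]
    results = search_site(term)
    return {"fulfillmentText": "I found: " + "; ".join(results)}

print(handle_fallback({"queryResult": {"queryText": "holiday policy"}}))
```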
Please excuse me if this is a really basic question - I'm very much still learning, and I just cannot find a solution.
I'm trying to use the standard basic text responses in Dialogflow, which, from what I understand, should work.
What I want to do, is have a set statement (Okay, let's see what I can find), then a random pick from a list, then another set statement, essentially stacking the responses in Dialogflow (see screenshot).
It works absolutely fine in Dialogflow's test console; however, it doesn't do what I want when I take it into the Actions on Google simulator.
Have I made a stupid error, missed a toggle switch somewhere, or am I trying to do something unsupported?
To surface text responses defined in Dialogflow's default Response tab, go to the Google Assistant response tab and turn on the switch labelled "Use response from the DEFAULT tab as the first response.":
With Dialogflow (API.AI) I find that vessel names are not matched well when the input comes from Google Home.
It seems the speech-to-text engine completely ignores them and just does speech-to-text based on its dictionary, so Dialogflow can't match the resulting text at the end.
Is it really like that, or is there some way to improve it?
Thanks and best regards
I'd recommend looking at Dialogflow's training feature to identify where the Google Assistant's speech recognition may not have worked the way you expect. In those cases, you'll see how Google's speech recognition detected words you may not have accounted for. Where you'd like to match these unrecognized words to an entity value, simply add them as synonyms.
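To illustrate the effect of those synonyms locally (the vessel names and misheard variants below are made-up examples; in Dialogflow the mapping would live in a custom entity, not in code):

```python
# Sketch: map misrecognized transcripts back to canonical vessel names,
# the way entity synonyms do in Dialogflow.
VESSEL_SYNONYMS = {
    "Aurora": ["aurora", "a roar a", "or aura"],      # misheard variants
    "Sea Sprite": ["sea sprite", "see spright"],
}

def resolve_vessel(transcript):
    """Return the canonical vessel name matched by any synonym, else None."""
    lowered = transcript.lower()
    for canonical, synonyms in VESSEL_SYNONYMS.items():
        if any(s in lowered for s in synonyms):
            return canonical
    return None

print(resolve_vessel("book a trip on see spright"))  # Sea Sprite
```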
I use API.AI to understand what the end user wants from my app.
One of my app's functionalities is translating text using the Google API.
For that I need to catch the string between quotation marks.
How can I catch it?
You could use @sys.any for that. It's a system entity that can capture any kind of string within a sentence structure.
I've posted a picture below as an example of how I achieved this with the expected output.
API.ai Example
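If @sys.any ends up capturing more than just the quoted part, the quoted string can also be extracted on the webhook side with a regular expression; a minimal sketch (the `quoted_text` helper is just for illustration):

```python
# Sketch: pull out the first double-quoted substring from an utterance.
import re

def quoted_text(utterance):
    """Return the first string enclosed in double quotes, or None."""
    match = re.search(r'"([^"]*)"', utterance)
    return match.group(1) if match else None

print(quoted_text('translate "good morning" to French'))  # good morning
```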