I'm new to API.ai. I read the docs, but I didn't understand how API.ai works with many parameters.
I'll try to explain with an example:
I have management software which manages members/actions/projects, where I can get the actions of any member on any project using the normal interface.
Let's replace this with a smart bot, where the chat would run as I expect below:
USER: I want to see my actions for ANY PROJECT NAME HERE
Bot: Your action is XXXXXX.
OR
USER: Give me all the members of the project ANY PROJECT NAME
Bot: Members are "1-2-3-4-5-...."
I think you get what I mean; if you need more, I can explain further. How can I make API.ai understand this?
For API.ai to 'remember' values (i.e. store and retrieve information such as the names of projects, actions and team members), you will need to connect API.ai to a webhook/database of your own; there isn't any way for API.ai to do this on its own.
Once you connect API.ai to a custom webhook/database, you can use the variables that API.ai parses for you to run your query. You simply need to build the intents corresponding to the search and the parameters involved.
Here's how the process would flow:
User asks "I want to see my actions for [ANY PROJECT NAME HERE]"
API.ai recognizes this as the intent 'search-action' with the parameter $project_name, since you have set the intent up in API.ai that way.
Your custom webhook receives a JSON payload from API.ai, which in this case would look like this:
{
  "id": "REDACTED",
  "timestamp": "2017-04-19T03:18:18.028Z",
  "lang": "en",
  "result": {
    "source": "agent",
    "resolvedQuery": "I want to see my actions for project Unicorn",
    "action": "search-action",
    "actionIncomplete": false,
    "parameters": {
      "project_name": "project Unicorn"
    },
    "contexts": [],
    "metadata": {
      "intentId": "REDACTED",
      "webhookUsed": "false",
      "webhookForSlotFillingUsed": "false",
      "intentName": "Search - Actions"
    },
    "fulfillment": {
      "speech": "",
      "messages": [
        {
          "type": 0,
          "speech": ""
        }
      ]
    },
    "score": 1
  },
  "status": {
    "code": 200,
    "errorType": "success"
  },
  "sessionId": "REDACTED"
}
So your webhook has logic that recognizes when result.action is 'search-action' and then runs a database search for actions in the project given by result.parameters.project_name.
Your webhook then fulfills the API.ai request or, alternatively, sends a message to the messaging platform directly (e.g. Facebook Messenger).
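For illustration, here is a minimal sketch of such a webhook in Python with Flask. The find_actions() lookup is a hypothetical stand-in for your own database query, and the response uses API.ai's v1 webhook fields ("speech"/"displayText"):

from flask import Flask, jsonify, request

app = Flask(__name__)

def find_actions(project_name):
    # Hypothetical stand-in for your own database lookup,
    # e.g. SELECT action FROM actions WHERE project = %s
    return ["XXXXXX"]

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(force=True)
    result = body.get("result", {})

    if result.get("action") == "search-action":
        project_name = result.get("parameters", {}).get("project_name", "")
        actions = find_actions(project_name)
        speech = "Your actions for {}: {}".format(project_name, ", ".join(actions))
        return jsonify({"speech": speech, "displayText": speech})

    return jsonify({"speech": "Sorry, I couldn't find anything for that request."})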
I have managed to retrieve context variables from the webhook using the dialogflow-fulfillment agent.context.get() function; however, it does not provide the context variables that aren't set as an input context on that specific intent.
Intent A: "What city are you in?"
Through the Dialogflow UI, the output context placeholder 'city' is set.
User answers => "Montréal"
Here I set the output context 'city' with the following code through the fulfiller:
agent.context.set('city', 100, {cityName: "Montréal"})
Intent B: "When will it freeze?"
Intentionally, no input/output context is set in Dialogflow, as I want the user to be able to ask this question without context.
When the intent is matched, Dialogflow sends the following request to my fulfiller.
{ "responseId": "496a4ed8-8421-4297-b464-adc54632cc93-32d6a6f2", "queryResult": { "queryText": "when will it freeze", "parameters": {}, "allRequiredParamsPresent": true, "fulfillmentText": "It will freeze next week", "fulfillmentMessages": [ { "text": { "text": [ "It will freeze next week" ] } } ], "intent": { "name": "projects/project-name/locations/global/agent/intents/7df0bb1d-f4ef-4e69-b9c4-4b6a2f91c95f", "displayName": "zone.freezing_dates via Fulfiller" }, "intentDetectionConfidence": 1, "diagnosticInfo": { "webhook_latency_ms": 59 }, "languageCode": "fr", "sentimentAnalysisResult": { "queryTextSentiment": { "score": 0.1, "magnitude": 0.1 } } }, "webhookStatus": { "message": "Webhook execution successful" }, "agentId": "5d8b71c7-cd3a-453a-b446-6d00334f1ae8" }
As you can see, the context parameters aren't present in the webhook request, even though the 'city' context was set by intent A.
Therefore, how do we retrieve the context variables from the user's session (and not from the intent's webhook)?
I have a feeling it has to do with the agent session path as described here, but in that sample it's part of the intent (again), whereas I expect a way to retrieve the agent context outside of the intent's scope.
As a note, I know the 'city' context variable exists because I can see it through the Dialogflow UI when I add it to the response text.
I have an application within Watson Assistant that consumes many services from other endpoints, and I would like to call this conversation (from Watson) within a Google Assistant conversation for a certain intent. For example, I will develop a rich conversation on Google Assistant, and in one of the options I will call Watson's conversation.
I tried the following, but it didn't work. Does anyone know of an example that can help me?
{"locale": "pt-BR",
"actions": [
{
"description": "Launch intent",
"name": "MAIN",
"fulfillment": {
"conversationName": "mainConversation"
},
"intent": {
"name": "actions.intent.MAIN"
}
},
{
"description": "Direct access",
"name": "BUY",
"fulfillment": {
"conversationName": "ExampleAction"
},
"intent": {
"name": "com.example.ExampleAction.BUY",
"trigger": {
"queryPatterns": [
"teste",
"azul",
"start"
]
}
}
}
],
"conversations": {
"mainConversation": {
"name": "mainConversation",
"url": "https://us-central1-ericanovo-798cc.cloudfunctions.net/webhook",
"fulfillmentApiVersion": 2
},
"BUY": {
"name": "ExampleAction",
"url": "https://orquestrador-sulamerica-teste.mybluemix.net/api/v1/chat/google?externaltoken=574213c0-e904-11e9-9970-ff484aa25334",
"fulfillmentApiVersion": 2
}
}
}
thanks
That won't work because the webhook for everything published under the same project has to be the same URL. You are expected to handle all the Intents and "actions" at that webhook.
In your case, you would also need to make sure the request is formatted the way the Watson API would be expecting it. The Assistant will send it using the Conversation Webhook Format, and it sounds like you would send it using Watson's Analyze Text API.
You're not showing any of your code, so it is difficult to be sure - but the first would be in a JSON format that you can extract. You can then use a library in Node (such as request-promise) to make the calls to Watson. Based on the result from Watson, you'd need to format the results as a response and return it to the Assistant.
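As a rough sketch of that flow (in Python here purely for illustration, though the answer suggests Node with request-promise): receive the Conversation Webhook Format request, pull out the user's raw text, forward it to your Watson orchestrator, and wrap whatever comes back in a conversation response. The payload shape sent to the orchestrator and the 'reply' field read back from it are assumptions; adjust them to whatever your Watson endpoint actually expects.

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# Your own orchestrator endpoint (taken from the action package above).
WATSON_URL = "https://orquestrador-sulamerica-teste.mybluemix.net/api/v1/chat/google"

@app.route("/webhook", methods=["POST"])
def webhook():
    app_request = request.get_json(force=True)
    # Conversation Webhook Format: the raw user utterance is in inputs[].rawInputs[].query
    query = app_request["inputs"][0]["rawInputs"][0]["query"]

    # Hypothetical request/response contract with the Watson orchestrator.
    watson = requests.post(
        WATSON_URL,
        params={"externaltoken": "YOUR_EXTERNAL_TOKEN"},
        json={"text": query},
    ).json()
    reply = watson.get("reply", "Sorry, no answer from Watson.")

    # Wrap Watson's answer in a Conversation Webhook Format response.
    return jsonify({
        "expectUserResponse": True,
        "expectedInputs": [{
            "inputPrompt": {
                "richInitialPrompt": {
                    "items": [{"simpleResponse": {"textToSpeech": reply}}]
                }
            },
            "possibleIntents": [{"intent": "actions.intent.TEXT"}]
        }]
    })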
It isn't clear why you'd need multiple webhooks specifically, although it is certainly possible that some Intents may make different API calls than others.
Keep in mind that your custom Intents will only be valid on invocation. Subsequent Intents will all be TEXT Intents.
I am building a bot for Google Assistant. I have enabled the fulfillment section for some intents. Dialogflow sends the request to the fulfillment URL, the URL is executed, and a hard-coded response is returned. I can see the response in the Assistant simulator. Everything works fine except one thing: the request is empty. I can't access fields that are supposed to be present in the request.
I have accessed the same URL using a POST request from Python code and it displays the parameters, so there are no issues in the code. I think I am missing some configuration option.
I was expecting the post body in the following format:
POST body:
{
  "responseId": "ea3d77e8-ae27-41a4-9e1d-174bd461b68c",
  "session": "projects/your-agents-project-id/agent/sessions/88d13aa8-2999-4f71-b233-39cbf3a824a0",
  "queryResult": {
    "queryText": "user's original query to your agent",
    "parameters": {
      "param": "param value"
    },
    "allRequiredParamsPresent": true,
    "fulfillmentText": "Text defined in Dialogflow's console for the intent that was matched",
    "fulfillmentMessages": [
      {
        "text": {
          "text": [
            "Text defined in Dialogflow's console for the intent that was matched"
          ]
        }
      }
    ],
    "outputContexts": [
      {
        "name": "projects/your-agents-project-id/agent/sessions/88d13aa8-2999-4f71-b233-39cbf3a824a0/contexts/generic",
        "lifespanCount": 5,
        "parameters": {
          "param": "param value"
        }
      }
    ],
    "intent": {
      "name": "projects/your-agents-project-id/agent/intents/29bcd7f8-f717-4261-a8fd-2d3e451b8af8",
      "displayName": "Matched Intent Name"
    },
    "intentDetectionConfidence": 1,
    "diagnosticInfo": {},
    "languageCode": "en"
  },
  "originalDetectIntentRequest": {}
}
But when I print the post data using print(request.POST), it comes back empty.
One more thing: does Dialogflow append the action name to the end of the fulfillment URL? If so, I will have to handle the logic separately. I have done it without considering the action name, but a lot of my stuff is hacked together, so I just want to be sure.
On another note, is Dialogflow good enough? It has worked fine on a few examples similar to what it was trained on. How many training samples does it need to work properly? What is the underlying algorithm used in Dialogflow? Or should I use the fulfillment URL and handle everything on my own? I am inclined towards the latter; I do not have too much faith in the existing chatbots.
Any help is appreciated.
If the Fallback Intent is the one being triggered, then you wouldn't get any parameters since this means that nothing else matched.
Got it. Using request.body solves the problem: I parsed it with json.loads and accessed the parameters.
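For anyone hitting the same thing, here is a minimal sketch of that fix in a Django view (names are illustrative). Dialogflow posts a JSON body rather than form data, so request.POST stays empty and the raw body has to be parsed instead:

import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt
def webhook(request):
    # Dialogflow sends a JSON body, not form-encoded data,
    # so request.POST is empty; parse request.body instead.
    body = json.loads(request.body)
    params = body.get("queryResult", {}).get("parameters", {})
    return JsonResponse({"fulfillmentText": "Received parameters: {}".format(params)})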
I have an API.ai agent that sends a request (coming from the user) to a webhook which needs a lot of processing (more than 5 seconds) to produce the answer. As far as I know, there is no way to increase the response timeout in API.ai.
So I have created two intents. The first one simply calls my webhook to start processing the result, and at the same time the webhook replies to the user with "Your request is under processing...".
The second intent has an event and action. The purpose of the new event is just to display the result to the user.
Once the result is ready, my backend application sends a curl request to trigger the event of the second intent, with the necessary parameters such as sessionId, v, timezone, etc.
I have received the following JSON from API.AI (I created an example to simplify my case):
{ "id": "de31ee96-c42f-4f2d-8461-ee39279ec2ed", "timestamp": "2017-09-27T13:39:46.932Z", "lang": "en", "result": {
"source": "agent",
"resolvedQuery": "custom_event",
"action": "test",
"actionIncomplete": false,
"parameters": {
"user_name": "Sam"
},
"contexts": [
{
"name": "welcoming-followup",
"parameters": {
"name.original": "",
"user_name": "Sam",
"name": "",
"user_name.original": ""
},
"lifespan": 2
}
],
"metadata": {
"intentId": "c196a388-16ac-4966-b55c-7cd999a7d680",
"webhookUsed": false,
"webhookForSlotFillingUsed": "false",
"intentName": "Welcoming"
},
"fulfillment": {
"speech": "Hello Sam",
"messages": [
{
"type": 0,
"speech": "Hello Sam"
}
]
},
"score": 1.0 }, "status": {
"code": 200,
"errorType": "success" }, "sessionId": "67cb28fd-6871-750c-d668-d0b736b763ec" }
Here is the curl command that was sent by my backend:
curl -X POST -H "Content-Type: application/json; charset=utf-8" -H "Authorization: Bearer I INSERTED THE CORRECT CODE HERE" --data "{'event':{ 'name': 'custom_event', 'data': {'name': 'Sam'}}, 'timezone':'America/New_York', 'lang':'en', 'sessionId':'a6ac2555-4b19-40f8-92ec-397f6a042dde'}" "https://api.api.ai/v1/query?v=20150910"
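For reference, the same event trigger can be expressed with Python's requests library (a sketch equivalent to the curl above; substitute your own client access token and session ID):

import requests

CLIENT_ACCESS_TOKEN = "YOUR_CLIENT_ACCESS_TOKEN"  # same bearer token as in the curl command

payload = {
    "event": {"name": "custom_event", "data": {"name": "Sam"}},
    "timezone": "America/New_York",
    "lang": "en",
    "sessionId": "a6ac2555-4b19-40f8-92ec-397f6a042dde",
}

response = requests.post(
    "https://api.api.ai/v1/query",
    params={"v": "20150910"},
    headers={"Authorization": "Bearer {}".format(CLIENT_ACCESS_TOKEN)},
    json=payload,
)
print(response.json())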
As shown in the above JSON, the API.ai agent received the trigger successfully. But the response that I have specified in the "Response" section does not appear to the user.
I attached a screenshot for the second intent in the API.ai agent.
Note: I tried the agent in the developer console, the Web Demo and Slack. None of them showed me (as a user) the specified response.
I am not sure if I did something wrong.
[Screenshot of the second intent]
API.AI is not really meant to handle event-driven activities. It is meant to be the intermediary in a conversation - so the normal pattern is:
User says something
API.AI processes this, possibly with a webhook, and sends a response.
Devices such as Google Home do not have a way to receive a notification, so unless the user says something (step 1), you will never get to step 2.
When you try to trigger it manually, API.AI treats your trigger as step 1 and replies to your trigger. It has no way to send that reply back to the Assistant because it isn't having a conversation with the Assistant at that moment - it is having a conversation with whatever you used to trigger it manually.
There isn't really a good way to do what you want right now. We know notifications are coming to the Assistant eventually (it was announced at I/O 2017), but we don't know if it will have an API or what it will look like. The Transaction API does have notifications as part of it, but Transactions are meant for activities where you are purchasing or reserving something. If you need to, you can use something like Firebase Cloud Messaging to let your user know they can ask for the result, but that's a sub-optimal experience.
I have an intent where I might say 'Transfer 4 to Bob' and it identifies this as 'Transfer for to Bob'.
Also, I might say 'Transfer 10 to Bob' and it identifies this as 'Transfer 102 Bob', treating the word 'to' as a 2 appended to the end of the previous number.
What is the best way to get API.AI to recognise these parts correctly, so that 4 is not 'for' and 'to' is not 2?
You mentioned that you're using the Actions on Google platform. This means that speech recognition - the process of translating what the user says into text - is happening before the data gets to API.AI.
The problem you're experiencing is that Actions on Google is misrecognizing some numbers as words, e.g. four becomes for.
Because this happens before - and separately from - API.AI, you won't be able to fix the misrecognition.
Below, I'll explain how you can work around this issue in API.AI. However, it's also worth thinking about how you could make your conversation design as robust as possible so that issues like this are less likely to cause problems.
One way you could increase robustness would be to mark the number as a required parameter in API.AI so the user is prompted if it isn't detected due to a recognition error. In that case, the dialog would go like this:
User: Give me four lattes.
App: Sure, four lattes coming up.
User: Give me for lattes.
App: How many do you want?
User: Four.
App: Sure, four lattes coming up.
Regardless, here's a workaround you can use to help recover from misrecognition:
In your intent, provide examples of these commonly misrecognized values. Highlight and mark them as numbers.
Test your intent out in the console and you'll see that "for" is now matched as a "number" entity with the value "for".
In your fulfillment webhook, check the parameter for this value and convert it to the appropriate number using a dictionary (see the sketch after the JSON below). Here's the JSON for the above query:
{
  "id": "994c4e39-be49-4eae-94b0-077700ef87a3",
  "timestamp": "2017-08-03T19:50:26.314Z",
  "lang": "en",
  "result": {
    "source": "agent",
    "resolvedQuery": "Get me for lattes",
    "action": "",
    "actionIncomplete": false,
    "parameters": {
      "drink": "lattes",
      "number": "for" // NOTE: Convert this to "4" in your webhook
    },
    "contexts": [],
    "metadata": {
      "intentId": "0e1b0e72-78ba-4c61-a4fd-a73788034de1",
      "webhookUsed": "false",
      "webhookForSlotFillingUsed": "false",
      "intentName": "get drink"
    },
    "fulfillment": {
      "speech": "",
      "messages": [
        {
          "type": 0,
          "speech": ""
        }
      ]
    },
    "score": 1
  },
  "status": {
    "code": 200,
    "errorType": "success"
  },
  "sessionId": "8b0891c1-50c8-43c6-99c4-8f77261acf86"
}
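Here is a minimal sketch of that conversion step in a Python webhook; the sound-alike mapping is an illustrative assumption that you would extend with whichever misrecognitions you actually see:

# Common speech-recognition sound-alikes mapped to the numbers they were meant to be.
SOUND_ALIKE_NUMBERS = {
    "for": 4,
    "fore": 4,
    "to": 2,
    "too": 2,
    "free": 3,
    "ate": 8,
}

def normalize_number(raw_value):
    """Return the 'number' parameter as an int, recovering from misrecognized words."""
    try:
        return int(float(raw_value))
    except (TypeError, ValueError):
        return SOUND_ALIKE_NUMBERS.get(str(raw_value).strip().lower())

# Example with the webhook request shown above:
parameters = {"drink": "lattes", "number": "for"}
print(normalize_number(parameters["number"]))  # -> 4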