I have a Google Assistant app already in production, which gives First Aid treatment procedures whenever someone is in need. I built this app with Dialogflow. It was running well for the past few months, but now whenever I call something after the app's first statement, it gives me this error:
Request contains an invalid argument.
The query pattern 'Where can I get $SchemaOrg_Number:ordinal aid information?'
contains an undefined parameter (name: 'ordinal', type: 'SchemaOrg_Number').
This error comes when I try to deploy my app from Dialogflow to Google Assistant, and from the error message it is not clear exactly which part of the program is faulty. I have no idea where to go from here and I'd love your help. Thanks a tonne!
Here's my Dialogflow Screen
Here's the AoG Simulator Screen where the error happened
This error is most likely caused by incorrect phrases in your app's Actions. To check this, go to the Actions tab in the AoG console and check every intent for errors. Also, make sure that every parameter used in a query pattern is defined below the phrases.
Actions menu screenshot
Parameters screenshot
Related
I have an action that no longer works when issuing an utterance with "ask [invocation name] [utterance]"; it just returns "The agent returned an empty TTS".
However, if I set up ngrok and try to debug it, the request never gets sent to ngrok and thus never hits my endpoint (backend code base).
Looking for next steps to debug this issue.
Note, "Talk to [invocation name]" works perfectly fine.
Also, if I use the "Try it now" feature for utterance testing within Dialogflow, with the same utterance, it triggers my endpoint.
The intent that is to be triggered is set as "Implicit Invocation"
While testing in the Simulator, the Request tab is empty, Response is empty, Debug shows "The agent returned an empty TTS.", and Errors is empty as well. Stackdriver logs do not even show this request being made.
While searching for other ideas, it appears this is possibly the same issue
Try to change your invocation name.
I had the same problem and that minor change fixed it!
Maybe it is because you have a reserved word in your invocation name.
I have created an action currently under review for approval and there is a problem (better described below) that shows only when the user is a member of the Google review team.
Five coworkers/friends of mine from different countries (India, Germany, Switzerland, Italy) have tested the app from the Android Google Assistant and the Actions console simulator, and the problem never occurred. The application worked flawlessly.
I hope someone can explain why this happens and suggest a solution. I worked hard on this app and I cannot publish it and get real users until this problem is solved.
TECHNICAL DETAILS ON THE PROBLEM:
Basically, the context name received from the Google team tester crashes the Google Cloud API parser. I am using C# in my webhook with Google.Cloud.Dialogflow.V2 1.0.0-beta02.
Example of Context name received when tester from google action team uses the application
projects/blitzy-84d12/agent/environments/__aog-2/users/-/sessions/ABwppHEMNPVl9O-OVXEOzT_ch6uSa_cZ08pHV6YUF5kpkSwZHDVmk6ShexlLi50yWFMkktYClMDX01z9/contexts/sendmessageprocessing
Example of context name received when I or my friends test the application (using an Android phone or the Actions console simulator):
projects/voice-rider/agent/sessions/ABwppHFyb9HLDSmiETz6d91QveUw0kTIjC5T1kJmNF2QVKrRBrHtTvR3t83lhU9hVxORZ8rXBbQBtRQ/contexts/tellmemypositionprocessing
Error produced while parsing the context name with Google.Cloud.Dialogflow.V2.ContextName.Parse(String contextName)
errMsg=Name does not match template: incorrect number of segments
The source code of the error can be seen at line 272 of this GitHub source:
https://github.com/googleapis/gax-dotnet/blob/master/Google.Api.Gax/PathTemplate.cs
It's quite easy to understand why the Google Cloud API parser throws the exception: it is receiving a context name with more segments than expected. By segments we mean the number of strings separated by a slash '/' character. Comparing the context name from my tests with the one from the Google team tester, we can see the difference:
projects/blitzy-84d12/agent/environments/__aog-2/users/-/sessions/ABwppHEMNPVl9O-OVXEOzT_ch6uSa_cZ08pHV6YUF5kpkSwZHDVmk6ShexlLi50yWFMkktYClMDX01z9/contexts/sendmessageprocessing
projects/voice-rider/agent/sessions/ABwppHFyb9HLDSmiETz6d91QveUw0kTIjC5T1kJmNF2QVKrRBrHtTvR3t83lhU9hVxORZ8rXBbQBtRQ/contexts/tellmemypositionprocessing
In the Google team tester's context name there is this additional block:
environments/__aog-2/users/-/
composed of four additional "alien" segments that crash the Google API parser.
I could try to replace the Google API parser with my own parser, but I really don't want to go down that path because of the side effects and future compatibility problems it could produce.
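To make the mismatch concrete, here is a language-agnostic sketch (in JavaScript rather than the C# webhook) that counts the segments of both context names from above. A pragmatic workaround, if strict template parsing is not required, is to take whatever follows the final `contexts` segment as the short context name instead of calling `ContextName.Parse`; the `shortContextName` helper below is hypothetical, not part of any Google library.

```javascript
// Hypothetical sketch: count path segments and extract a context's short
// name without a strict template parse. The names are the ones quoted above.
const reviewerName =
  'projects/blitzy-84d12/agent/environments/__aog-2/users/-/sessions/' +
  'ABwppHEMNPVl9O-OVXEOzT_ch6uSa_cZ08pHV6YUF5kpkSwZHDVmk6ShexlLi50yWFMkktYClMDX01z9/' +
  'contexts/sendmessageprocessing';
const normalName =
  'projects/voice-rider/agent/sessions/' +
  'ABwppHFyb9HLDSmiETz6d91QveUw0kTIjC5T1kJmNF2QVKrRBrHtTvR3t83lhU9hVxORZ8rXBbQBtRQ/' +
  'contexts/tellmemypositionprocessing';

const segments = (name) => name.split('/');

// A strict template like projects/*/agent/sessions/*/contexts/* expects
// 7 segments; the reviewer's name carries 4 extra (environments/*/users/*).
console.log(segments(reviewerName).length); // 11
console.log(segments(normalName).length);   // 7

// Lenient extraction: the short context name is whatever follows the
// final 'contexts' segment, regardless of how many segments precede it.
const shortContextName = (name) => {
  const parts = segments(name);
  const i = parts.lastIndexOf('contexts');
  return i >= 0 ? parts[i + 1] : null;
};

console.log(shortContextName(reviewerName)); // 'sendmessageprocessing'
console.log(shortContextName(normalName));   // 'tellmemypositionprocessing'
```

The same last-segment extraction could be done in the C# webhook with a plain `String.Split('/')`, sidestepping the template check entirely.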
Thank you for your support
I have a Dialogflow agent and I want to sync it with Google Assistant.
For the "User says" section, I want to use Template mode. It works fine in API.AI, but when I click the option 'See how it works in Google Assistant', it throws an error like 'Request contains an invalid argument.
Query pattern contains invalid characters in custom intent '999154f0-830a-480b-bdc9-2e9e89f45cdd': $SchemaOrg_Color:color $SchemaOrg_Text:query $quantifier:quantifier #sys.unit-currency:unit-currency $Filters:Filters'
After checking each and every annotation, I came to know that #sys.unit-currency:unit-currency is creating the issue. It works with the API.AI agent but not with Google Assistant.
I've attached a screenshot of the intent. What needs to be corrected? I went through many blogs but was not able to find any help regarding this.
During our testing, we were unable to complete at least one of the behaviors or actions advertised by your app. Please make sure that a user can complete all core conversational flows listed in your registration information or recommended by your app.
Thank you for submitting your assistant app for review!
During testing, your app was unable to complete a function detailed in the app's description. The reviewer interacted with the app by saying: “how many iphones were sold in the UK?” and the app replied “I didn't get that. Can you try with other question?” and left the conversation.
How can I resolve the above point to approve my Google Assistant action skills?
Without seeing the code in question or the intent you think should be handling this in Dialogflow, it is pretty difficult - but we can generalize.
It sounds like you have two issues:
Your fallback intent that generated the "I didn't get that" message is closing the conversation. This means that either the "close conversation" checkbox is checked in Dialogflow, you're using the app.tell() method when you should be using app.ask() instead, or the JSON you're sending back has close conversation set to true.
You don't have an intent to handle the question about how many iPhones were sold in the UK. This could be because you just don't list anything like that as a sample phrase, or the two parameters (the one for object type and the one for location) aren't using entity types that would match.
It means that somewhere, either in your app description or in a Dialogflow intent (the reviewers have full access to see what's in your intents), you hinted that “how many iphones were sold in the UK?” would be a valid question. Try changing the description/intents to properly match the restrictions of your app.
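The ask/tell distinction in the first point can be sketched without the actions-on-google SDK: what matters in the response JSON is whether the conversation is marked as expecting another user turn. This is a minimal sketch under that assumption (the `buildResponse` helper is hypothetical, not the SDK), not the library's actual implementation:

```javascript
// Minimal sketch of the difference between ask() and tell() in terms of the
// response JSON: ask() keeps the mic open, tell() ends the conversation.
// (Hypothetical helper, not the actions-on-google SDK itself.)
function buildResponse(speech, expectUserResponse) {
  return {
    speech,
    data: { google: { expect_user_response: expectUserResponse } },
  };
}

// A fallback intent should re-prompt and keep the conversation open:
const fallback = buildResponse(
  "I didn't get that. Can you try another question?",
  true // ask(): conversation continues
);

// tell() would instead close the conversation after speaking:
const goodbye = buildResponse('Goodbye!', false);

console.log(fallback.data.google.expect_user_response); // true
console.log(goodbye.data.google.expect_user_response);  // false
```

If the reviewer's failing utterance lands in a fallback that behaves like `goodbye` here, the conversation ends exactly as described in the rejection.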
I've created a simple helper in API.AI and added a few intents with responses that link to Google Actions. When I link my agent to a Google project and test it with those intents, I get the following errors:
expected_inputs[0].input_prompt.rich_initial_prompt: the first element must be a 'simple_response' or a 'structured_response'.
and
expected_inputs[0].input_prompt.rich_initial_prompt.items[0].basic_card.image: 'accessibility_text' is required.
These are both classified as Malformed Response Errors, but I don't really understand, seeing as I didn't write any code; I simply used the UI for API.AI and Google projects.
Any ideas?
The problem is that Actions on Google responses still require a text response to be displayed and/or spoken in addition to the card responses. So in the Actions on Google section of the response, you must either turn on "Use response from the DEFAULT tab as the first response":
or you must add a Simple Response:
When you enter a Basic Card, if you enter an Image URL, you must also enter the Image Accessibility Text:
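As a rough sketch of what the two error messages are checking for, the rich response's first item must be a simple response, and any basic-card image must carry accessibility text. The field names below follow the snake_case paths quoted in the errors; treat the exact shape (and the example URL and strings) as assumptions and verify against the current Actions on Google documentation:

```javascript
// Sketch of a rich response that satisfies both validations from the
// question: a simple response comes FIRST, and the basic card's image
// includes accessibility text. Field names follow the quoted error paths.
const richInitialPrompt = {
  items: [
    {
      simple_response: {
        text_to_speech: 'Here is what I found.', // spoken text (example)
        display_text: 'Here is what I found.',   // shown on screens (example)
      },
    },
    {
      basic_card: {
        title: 'Example card',
        image: {
          url: 'https://example.com/image.png',         // hypothetical URL
          accessibility_text: 'Example product photo',  // required with an image
        },
      },
    },
  ],
};

// The two checks implied by the error messages:
console.log('simple_response' in richInitialPrompt.items[0]);                          // true
console.log(Boolean(richInitialPrompt.items[1].basic_card.image.accessibility_text)); // true
```

In the Dialogflow UI, the checkbox and the Simple Response / Accessibility Text fields shown in the screenshots fill in exactly these pieces for you.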