I have been writing an Actions on Google app tied into Dialogflow with a .NET fulfillment back end. My app needs coarse location to function. However, in testing the location is always empty when I use the coarse location permission. When I change the permission to precise, it sends lat/long back, but when I use coarse it does not send back the address entered in the simulator's location box.
Edit: The device type is set to speaker at the top.
Edit 2: Added an image album:
https://imgur.com/a/C6XPATL
According to the documentation:
Currently, precise location only returns lat/lng coordinates on phones
and a geocoded address on voice-activated speakers. Coarse location
only works on voice-activated speakers.
To make sure you are testing against a speaker in the simulator, select the speaker device icon before you invoke the action.
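For completeness, here is a rough sketch of what requesting the coarse location permission could look like from a .NET webhook. This assumes the Dialogflow v2 conversation webhook format and Newtonsoft.Json; the class name and opt-in context string are illustrative, not taken from the original question:

using System.Collections.Generic;
using Newtonsoft.Json;

public static class PermissionResponses
{
    // Builds the webhook JSON that asks the Assistant for coarse device location.
    // On a voice-activated speaker this should come back as a geocoded address,
    // per the documentation quoted above.
    public static string AskForCoarseLocation()
    {
        var response = new
        {
            payload = new
            {
                google = new
                {
                    expectUserResponse = true,
                    systemIntent = new
                    {
                        intent = "actions.intent.PERMISSION",
                        data = new Dictionary<string, object>
                        {
                            ["@type"] = "type.googleapis.com/google.actions.v2.PermissionValueSpec",
                            ["optContext"] = "To find results near you",
                            ["permissions"] = new[] { "DEVICE_COARSE_LOCATION" }
                        }
                    }
                }
            }
        };

        return JsonConvert.SerializeObject(response);
    }
}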
Related
I am trying to have different input views for different devices in Bixby.
For example, on a Bixby tablet, I would want to allow the user to select multiple switch-inputs in a single screen. For the Bixby speaker, on the other hand, I would like to break this up and only prompt the user for one input at a time. Is this possible?
You can definitely customize how information is presented to the user depending on the device being used.
First, you will need to declare the devices your capsule will support by defining the targets as shown below:
capsule {
  id (playground.example)
  version (0.1.0)
  format (3)
  runtime-flags {
    modern-prompt-rejection
    support-halt-effect-in-computed-inputs
  }
  targets {
    target (bixby-mobile-en-US)
    target (bixby-mobile-en-GB)
  }
}
Once your targets have been declared, you can create views for them. When you create a view in Bixby Studio, the "Create New File" pop-up window lets you choose which device the view is for. You will end up with multiple views presenting the same information across different devices, and Bixby will use the correct one depending on the device.
Additionally, you can also use Hands-Free List Navigation to further refine the behavior of your views.
TV devices do not support touch input; users have to use a remote control. We need to move focus by pressing the left, up, right, and down direction keys, and when the target widget gets focus, press the OK button to respond to the key event. But I cannot find any Flutter interface for this interaction. Can anyone help?
There are SystemChannels for this.
I haven't tried it myself, but it looks like this should do what you need (a rough sketch is at the end of this answer):
DartDocs - SystemChannels.keyEvent
A JSON BasicMessageChannel for keyboard events.
DartDocs - SystemChannels.textInput: https://www.dartdocs.org/documentation/flutter/0.0.41-dev/services/SystemChannels/textInput-constant.html
A JSON MethodChannel for handling text input.
This channel exposes a system text input control for interacting with
IMEs (input method editors, for example on-screen keyboards). There is
one control, and at any time it can have one active transaction.
Transactions are represented by an integer. New Transactions are
started by TextInput.setClient. Messages that are sent are assumed to
be for the current transaction (the last "client" set by
TextInput.setClient). Messages received from the shell side specify
the transaction to which they apply, so that stale messages
referencing past transactions can be ignored.
The latter is used in https://github.com/flutter/flutter/blob/4389f07024a4c69f7223401abd4d0ab3ecc45698/packages/flutter/lib/src/services/text_input.dart
There are known issues with physical keyboards, though, that might cause this use case not to work:
https://github.com/flutter/flutter/issues/11177
https://github.com/flutter/flutter/issues/7943
https://github.com/flutter/flutter/issues/9347
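For what it's worth, here is a rough, untested sketch of listening on SystemChannels.keyEvent. It assumes the Android embedding forwards remote key presses as raw key events with 'type' and 'keyCode' fields; treat those field names and the handler shape as assumptions:

import 'package:flutter/services.dart';

// Untested sketch: handle raw key events forwarded from the platform side,
// e.g. D-pad presses from an Android TV remote.
void listenForRemoteKeys() {
  SystemChannels.keyEvent.setMessageHandler((dynamic message) async {
    final Map<dynamic, dynamic> event = message as Map<dynamic, dynamic>;
    if (event['type'] == 'keydown') {
      // 19/20/21/22 are the Android key codes for DPAD_UP/DOWN/LEFT/RIGHT.
      final dynamic code = event['keyCode'];
      print('D-pad key code: $code');
    }
    return null;
  });
}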
Yep, Flutter doesn't support D-pad navigation yet. But I have an Android Smart TV, and if I connect a Bluetooth mouse I am able to navigate, swipe, click, etc. in my Flutter app on the TV.
How do I display the list of available audio outputs when Bluetooth is connected, and switch to the audio route the user selects?
I tried checking AVAudioSession.sharedInstance().currentRoute.outputs, but it always returns only one output route.
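For reference, the check described above looks roughly like this; note that currentRoute only describes the route that is active right now, which is why a single output comes back:

import AVFoundation

// Minimal sketch of the check described in the question: currentRoute only
// reflects the route that is currently active, so it reports one output.
let session = AVAudioSession.sharedInstance()
for output in session.currentRoute.outputs {
    print("Output port: \(output.portName)")
}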
We are trying to capture a phone number. Actually many other numbers, like amounts, zip, etc. We are using Google Home.
The URLs below are JSON payloads we received on the fulfillment side. The entity name is TheNumber.
One JSON is from when we set up the entity as #sys.number, the other from when it was #sys.phone-number.
https://s3.amazonaws.com/xapp-bela/gh/number-test.json
https://s3.amazonaws.com/xapp-bela/gh/phone-number-test.json
The first problem is that the Google Assistant is really struggling to recognize number sequences, like phone numbers or zip codes. But even when it gets them right (according to the originalRequest in the JSON payload), the entity still has the wrong value by the time it arrives at the fulfillment side.
I guess my question is what am I doing wrong? Is anybody seeing the same problems?
Not sure this will help, since this is more about talking to the Google Home device, but... I too was having a similar issue with a long number. If you use #sys.number-sequence as part of your intent's context, it will allow you to recite much longer numbers without the device interrupting you. In your Node.js code, you can grab the argument for that number-sequence for use in your Google Home agent.
if (assistant.getArgument('number-sequence') != null) { <do something> }
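Roughly, the handler could look like this. It assumes the actions-on-google Node.js client that provides getArgument, tell, and ask; the function name and prompts are illustrative:

// Illustrative sketch built around the getArgument call above;
// strip whitespace from the recognized sequence before using it.
function handleNumberSequence(assistant) {
  const raw = assistant.getArgument('number-sequence');
  if (raw != null) {
    const digits = String(raw).replace(/\s+/g, '');
    assistant.tell('I heard ' + digits);
  } else {
    assistant.ask('What is the number?');
  }
}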
I'm working on a Windows Phone app where I want to let the user search for a place and locate it on the Map control. I'm using GeocodeQuery to search for a search term:
private void SearchForTerm(String searchTerm)
{
    myGeocodeQuery = new GeocodeQuery();
    myGeocodeQuery.SearchTerm = searchTerm;
    myGeocodeQuery.GeoCoordinate = new GeoCoordinate(0, 0);
    myGeocodeQuery.QueryCompleted += GeocodeQuery_QueryCompleted;
    myGeocodeQuery.QueryAsync();
}
The problem is that this code only works with location services or Wi-Fi turned on. With only location services on and Wi-Fi off, I can't search for every place, even if it appears on my map but I haven't downloaded the detailed maps.
For example, I can zoom in on Italy and see Rome; not the streets in detail, but the name of the city is visible. When I search for "Rome", I get 0 results.
It looks like the location search feature is not usable offline, even with maps stored on the phone. I didn't find any tutorial or example explaining this feature in detail, and I have the same problem with this example on Nokia Developer.
Windows Phone gathers location data from three sources:
Location services
Network (data)
SIM
Combining the three sources gives the best result. You can get data from location services alone, but data from location services aggregated with what the phone gets from Wi-Fi or SIM location is supposed to give the most accurate location.
Just read this once. Also use

myGeolocator.DesiredAccuracyInMeters = value;

for more accuracy.
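As a rough sketch of that, assuming the Windows Phone 8 Windows.Devices.Geolocation API (the accuracy, age, and timeout values are just illustrative):

using System;
using System.Threading.Tasks;
using Windows.Devices.Geolocation;

public class LocationHelper
{
    // Request a tighter fix, then use it (for example) as the GeoCoordinate
    // passed to GeocodeQuery in the question above.
    public async Task<Geoposition> GetPositionAsync()
    {
        var myGeolocator = new Geolocator();
        myGeolocator.DesiredAccuracyInMeters = 50; // ask for roughly 50 m accuracy

        // Accept a cached fix up to 1 minute old; give up after 10 seconds.
        return await myGeolocator.GetGeopositionAsync(
            TimeSpan.FromMinutes(1), TimeSpan.FromSeconds(10));
    }
}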