Pass http://yousite/sample.mp3 to an external Android app to stream audio

I have a link, http://yousite/sample.mp3, and I want to pass it to the default audio player on Android.
I'm doing:
RssItem item = (RssItem) adapter.getItem(position);
Uri uri = Uri.parse(item.getLink());
Intent intent = new Intent(Intent.ACTION_VIEW, uri);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intent.setType("audio/mp3");
startActivity(intent);
and it says:
android.content.ActivityNotFoundException: No Activity found to handle Intent { act=android.intent.action.VIEW typ=audio/mp3 }
If I omit intent.setType("audio/mp3");, it opens in the browser with an audio player in it, which is clearly not what I want. I then tried to share:
shareIntent.setAction(Intent.ACTION_SEND);
shareIntent.putExtra(Intent.EXTRA_STREAM, uri);
and there was no application found to share to!
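For what it's worth, the likely cause is that Intent.setType() clears any data URI previously set on the intent, so the first intent above ends up with a MIME type but no URL for a player to match. A minimal sketch of the usual fix, using setDataAndType() to keep both (the "audio/mpeg" type is an assumption; the broader "audio/*" also matches most players):

RssItem item = (RssItem) adapter.getItem(position);
Uri uri = Uri.parse(item.getLink());

Intent intent = new Intent(Intent.ACTION_VIEW);
// setDataAndType() sets the URI and the MIME type together;
// calling setType() on its own clears any URI set before it.
intent.setDataAndType(uri, "audio/mpeg");
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(intent);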

How can I set the default answer in a Q&A Azure bot

I want to change the Default Answer in a QnA Maker Azure Framework Bot, but I can't find the field that controls this value. I'm reading the documentation (though it looks like it uses an older interface), and I'm trying to find this field, but without result.
Here's my current configuration screen:
I'm assuming that you're referring to these docs: QnaMaker - Change Default Answer
They're a little confusing, but the key part is:
You can override this default response in the bot or application code
calling the endpoint.
Where the docs have this image:
What they actually mean is that, in the QnAMaker Test Console, you can edit the default answer from your Application Settings. Be sure to Save, Train, and Publish your app, or the setting may not show up.
There's also kind of a way that you can use this setting for your default answer in a bot:
In Node/JS, your bot will not receive that DefaultAnswer at all. It receives nothing if there isn't a match, so you have to hard code it with something like:
const qnaResults = await this.qnaMaker.getAnswers(context);

// If an answer was received from QnA Maker, send the answer back to the user.
if (qnaResults[0]) {
    await context.sendActivity(qnaResults[0].answer);

// If no answers were returned from QnA Maker, show this reply.
// Note: .getAnswers() does NOT return the default answer from the App Service's Application Settings
} else {
    const defaultAnswer = 'No QnA Maker answers were found. This example uses a QnA Maker Knowledge Base that focuses on smart light bulbs. To see QnA Maker in action, ask the bot questions like "Why won\'t it turn on?" or "I need help."'
    await context.sendActivity(defaultAnswer);
}
When creating an Azure Web Bot, one of the default Web Chat clients is a fork of Microsoft's BotBuilder-Samples project, specifically 49 - QnAMaker All Features.
The source code for Dialog/QnAMakerBaseDialog.cs defines the constant DefaultNoAnswer:
public const string DefaultNoAnswer = "No QnAMaker answers found.";
And then uses that value when returning a response from GetQnAResponseOptionsAsync:
protected async override Task<QnADialogResponseOptions> GetQnAResponseOptionsAsync(DialogContext dc)
{
    var noAnswer = (Activity)Activity.CreateMessageActivity();
    noAnswer.Text = DefaultNoAnswer; // <- used right here

    var cardNoMatchResponse = (Activity)MessageFactory.Text(DefaultCardNoMatchResponse);

    var responseOptions = new QnADialogResponseOptions
    {
        ActiveLearningCardTitle = DefaultCardTitle,
        CardNoMatchText = DefaultCardNoMatchText,
        NoAnswer = noAnswer,
        CardNoMatchResponse = cardNoMatchResponse,
    };

    return responseOptions;
}
This particular sample repo doesn't appear to leverage the DefaultAnswer configuration key anywhere.
You can opt to include it when available by updating the noAnswer.Text like this:
- noAnswer.Text = DefaultNoAnswer;
+ noAnswer.Text = this._configuration["DefaultAnswer"] ?? DefaultNoAnswer;
You'll also have to pass the configuration object in through the dependency injection system. See this commit for a full example.
In qnamakerBaseDialog.js, find the line below:
var noAnswer = ActivityFactory.DefaultNoAnswer;
Remove the ActivityFactory. prefix and rebuild the code:
constructor(knowledgebaseId, authkey, host) {
    // ActivityFactory.
    var noAnswer = DefaultNoAnswer;
    var filters = [];
    super(knowledgebaseId, authkey, host, noAnswer, DefaultThreshold, DefaultCardTitle, DefaultCardNoMatchText,
        DefaultTopN, ActivityFactory.cardNoMatchResponse, filters, QNAMAKER_BASE_DIALOG);
    this.id = QNAMAKER_BASE_DIALOG;
}

Notification not displaying on Android device (Azure Notification Hub)

I'm trying to send a push notification to my Android emulator. When the notification is sent, the emulator receives it but does not display it.
I'm using this code to display it.
void SendNotification(string messageBody)
{
    var intent = new Intent(this, typeof(MainActivity));
    intent.AddFlags(ActivityFlags.ClearTop);
    var pendingIntent = PendingIntent.GetActivity(this, 0, intent, PendingIntentFlags.OneShot);

    NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder(this)
        .SetContentTitle("FCM Message")
        .SetSmallIcon(Resource.Drawable.icon)
        .SetContentText(messageBody)
        .SetChannelId("my-chanel")
        .SetAutoCancel(true)
        .SetContentIntent(pendingIntent)
        .SetSound();

    var notificationManager = NotificationManager.FromContext(this);
    notificationManager.Notify(0, notificationBuilder.Build());
}
But when the notification is received, nothing happens, and the log of my device gives me this:
[Mono] Assembly Ref addref ODEON.Android[0xa31db5c0] -> System.Core[0xa1f2f900]: 7
[MyFirebaseMsgService] From: 241571420247
[MyFirebaseMsgService] noti:
[Mono] DllImport searching in: '__Internal' ('(null)').
[Mono] Searching for 'java_interop_jnienv_call_boolean_method'.
[Mono] Probing 'java_interop_jnienv_call_boolean_method'.
[Mono] Found as 'java_interop_jnienv_call_boolean_method'.
[Mono] Assembly Ref addref Xamarin.Android.Support.Compat[0xa31dca60] -> Java.Interop[0xa1f2f780]: 8
[Notification] Use of stream types is deprecated for operations other than volume control
[Notification] See the documentation of setSound() for what to use instead with android.media.AudioAttributes to qualify your playback use case
Can anyone tell me why it doesn't display?
Thanks in advance!
If the SendNotification method is called, use the code below to show notifications:
var intent = new Intent(this, typeof(MainActivity));
intent.AddFlags(ActivityFlags.ClearTop);
var pendingIntent = PendingIntent.GetActivity(this, 0, intent, PendingIntentFlags.OneShot);

NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder(this)
    .SetSmallIcon(Resource.Drawable.ic_stat_ic_notification)
    .SetContentTitle("Title")
    .SetContentText(messageBody)
    .SetChannelId(CHANNEL_ID) // must match the channel created in CreateNotificationChannel below
    .SetAutoCancel(true)
    .SetContentIntent(pendingIntent);

NotificationManagerCompat notificationManager = NotificationManagerCompat.From(this);
notificationManager.Notify(0, notificationBuilder.Build());
Also, if you target Android 8.0+ devices (which I'm sure you will), see to it that you register a notification channel in your MainActivity's OnCreate method. The id you pass to SetChannelId must belong to a channel that was actually created; your original code sets the channel id "my-chanel" but never creates that channel, which is exactly the case where Android 8+ silently drops the notification:
void CreateNotificationChannel()
{
    if (Build.VERSION.SdkInt < BuildVersionCodes.O)
    {
        // Notification channels are new in API 26 (and not a part of the
        // support library). There is no need to create a notification
        // channel on older versions of Android.
        return;
    }

    var channel = new NotificationChannel(CHANNEL_ID, "FCM Notifications", NotificationImportance.Default)
    {
        Description = "Firebase Cloud Messages appear in this channel"
    };

    var notificationManager = (NotificationManager)GetSystemService(NotificationService);
    notificationManager.CreateNotificationChannel(channel);
}

UWP: How to check incoming requests from a BLE device?

How do I check all incoming requests from a paired BLE device to the current device?
I think it's possible with events; maybe UWP has the needed event, or I must implement a custom one, but which is the right way?
Microsoft has explanations about the GATT Server, but I think that's not what I need, because I don't need a server with services and characteristics; I only need to check incoming requests and parse the passed data in my application.
I haven't found a sure way to check incoming requests, but I used a trick.
The application can subscribe to notifications from the device (in my case a Mi Band 2) and receive data from it through ValueChanged.
I attach the ValueChanged handler once in App.xaml.cs, after connecting and pairing the device, and it works across the whole application; I don't need to attach it again and again.
Here is the App.xaml.cs part of the code:
protected async override void OnLaunched(LaunchActivatedEventArgs e)
{
    Frame rootFrame = Window.Current.Content as Frame;
    MiBand2SDK.MiBand2 band = new MiBand2SDK.MiBand2();
    var page = typeof(Views.AuthPage);

    // Checking for device availability and current session
    if (_LocalSettings.Values["isAuthorized"] != null
        && await band.ConnectAsync())
    {
        if (e.PreviousExecutionState == ApplicationExecutionState.NotRunning && await band.Auth.AuthenticateAsync())
            page = typeof(Views.MainPage);
        else if (band.Auth.IsAuthenticated())
            page = typeof(Views.MainPage);

        // Here we are: this attaches the notification handler for responses from the band.
        band.HeartRate.SetNotificationHandler();
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("Not Authenticated...");
    }

    // other part of code...
Here is the HeartRate.SetNotificationHandler() code:
public async void SetNotificationHandler()
{
    _heartRateMeasurementCharacteristic = await Gatt.GetCharacteristicByServiceUuid(HEART_RATE_SERVICE, HEART_RATE_MEASUREMENT_CHARACTERISTIC);
    Debug.WriteLine("Subscribe for HeartRate notifications from band...");

    // Just subscribe for notifications and set ValueChanged. That's all.
    if (await _heartRateMeasurementCharacteristic.WriteClientCharacteristicConfigurationDescriptorAsync(GattClientCharacteristicConfigurationDescriptorValue.Notify) == GattCommunicationStatus.Success)
        _heartRateMeasurementCharacteristic.ValueChanged += HeartRateMeasurementCharacteristicValueChanged;
}
Hope it helps someone...

Integrating api.ai into Android with text input

I'm trying to integrate api.ai with Android, and I have followed the steps required for that. I need to integrate text instead of speech: I want to receive text as input from the user and display it. Can anyone please suggest a solution for this?
Step 1: Create the configuration for API.AI

final AIConfiguration config = new AIConfiguration("<Client access token>",
        AIConfiguration.SupportedLanguages.English,
        AIConfiguration.RecognitionEngine.System);

aiService = AIService.getService(this, config);
aiService.setListener(this);

aiDataService = new AIDataService(config);
aiRequest = new AIRequest();
Step 2: Set your text here
aiRequest.setQuery(message);
Step 3: Get your response from API.AI using an AsyncTask
new AsyncTask<AIRequest, Void, AIResponse>() {
    @Override
    protected AIResponse doInBackground(AIRequest... aiRequests) {
        final AIRequest request = aiRequests[0];
        try {
            // use the request passed in to execute()
            final AIResponse response = aiDataService.request(request);
            return response;
        } catch (AIServiceException e) {
            e.printStackTrace(); // don't swallow the error silently
        }
        return null;
    }

    @Override
    protected void onPostExecute(AIResponse response) {
        if (response != null) {
            Result result = response.getResult();
            String reply = result.getFulfillment().getSpeech();
            sendMessage(message);
            mimicOtherMessage(reply);
            mListView.setSelection(mAdapter.getCount() - 1);
        }
    }
}.execute(aiRequest);
If I understand right, you do not need voice input for your Android application and simply want the user to type in text, which you then pass on to API.AI for further processing and a possible response.
If the above is correct, then you do not need any of the voice-capability activities. Simply take in the text from the user and pass it on to the API.AI HTTP API.
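If you'd rather not go through the SDK for this, here is a minimal sketch of that HTTP call (the v1 endpoint, the v= version parameter, and the JSON field names are assumptions based on api.ai's v1 REST API; <Client access token> and <session id> are placeholders you fill in). Run it off the main thread, for example inside the AsyncTask shown above:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

String postTextQuery(String message) throws Exception {
    // Assumed api.ai v1 REST endpoint; the SDK call above wraps the same API.
    URL url = new URL("https://api.api.ai/v1/query?v=20150910");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Authorization", "Bearer <Client access token>");
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);

    // sessionId ties consecutive queries into one conversation
    String body = "{\"query\": \"" + message + "\", \"lang\": \"en\", \"sessionId\": \"<session id>\"}";
    try (OutputStream os = conn.getOutputStream()) {
        os.write(body.getBytes("UTF-8"));
    }

    // The reply text is at result.fulfillment.speech in the returned JSON,
    // matching result.getFulfillment().getSpeech() in the SDK example above.
    StringBuilder response = new StringBuilder();
    try (BufferedReader in = new BufferedReader(
            new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
        String line;
        while ((line = in.readLine()) != null) {
            response.append(line);
        }
    }
    return response.toString();
}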

Play Media (.mp4) on the receiver app

I am trying to play/push an MP4 to my receiver app on my whitelisted device.
I am able to launch my receiver app (a web page with a video tag).
Once I launch my receiver app, from my sender app (another web page) I do this:
var mediaRequest = new MediaLoadRequest(url);
cast_api.loadMedia(activityId, mediaRequest, callback);
My receiver looks like this:
initReceiver = function () {
    _remoteMedia.setMediaElement(videoSurface);
    _remoteMedia.onOpen = mediaOnOpen;
    _remoteMedia.onLoad = mediaOnLoad;
    _remoteMedia.onLoadMetadataError = mediaMetaDataError;
    _remoteMedia.onMetadataLoaded = mediaMetaDataLoaded;
    _receiver.start();
};

mediaOnLoad = function (channel, message) {
    _remoteMedia.load(channel, message);
    _remoteMedia.sendSuccessResponse(channel, message);
};

mediaMetaDataLoaded = function (channel, message) {
    console.log("mediaMetaDataLoaded", message);
};

mediaMetaDataError = function (channel, message) {
    console.log("mediaMetaDataError", message);
};

mediaOnOpen = function (event) {
    console.log("mediaOnOpen", event);
};
On the console output the last message I see is this, after the code hits _remoteMedia.load:
[ 41.321s] [cast.receiver.RemoteMedia] loading media
and nothing happens after that. The media is a valid URL from my Dropbox... an MP4.
Any ideas what I am doing wrong here?
Thanks!
You actually can just use our default Receiver and it will play your content. No need to provide all those extra functions.
