I’m trying to play an MP3 live stream and I’m using a “media response” as shown in the Actions on Google guide. Here is the code:
if (!conv.surface.capabilities.has('actions.capability.MEDIA_RESPONSE_AUDIO')) {
  conv.ask('Sorry, this device does not support audio playback.');
} else {
  // MediaObject and Suggestions are imported from the 'actions-on-google' package.
  conv.ask(new MediaObject({
    name: 'Radio one',
    url: 'my_streaming_url.mp3',
    description: 'A funky Jazz tune',
  }));
  conv.ask(new Suggestions(['Radio two']));
}
All works fine, but there is about 20 seconds of audio latency on Google Home and Google Home Mini. There is no latency in the Google Assistant Android app or the Actions on Google simulator, and none if “url” points to a plain MP3 file instead of a live stream. Any idea why there is this delay?
Google Home's media player appears to buffer roughly 20-30 seconds of playable audio before starting.
If you control the Icecast streaming server, increase the <burst-size> value, either in the <limits> section or per <mount>. By default, it is set to 65536 bytes.
You can work out the burst size (in bytes) needed for a given startup buffer by calculating:
bitrateKbps * bufferSeconds * 1024 / 8
For a 128 kbps stream and a 20-second buffer, that gives 327680.
(Also, make sure the server's <queue-size> is larger than the burst size.)
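For reference, here is a minimal sketch of what that could look like in icecast.xml; the values are illustrative (a 128 kbps stream with a ~20-second burst), not taken from the original post:

<limits>
    <!-- send ~20 seconds of 128 kbps audio immediately at connect time -->
    <burst-size>327680</burst-size>
    <!-- the queue must be larger than the burst -->
    <queue-size>524288</queue-size>
</limits>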
The ~20-second latency with live streaming URLs appears to be a general thing on Google Home (Mini) and Chromecast, not only when playback is initiated from Google Assistant but also when using the Google Cast API. I have no idea why they use so much buffering.
How can I play a specific YouTube video on my Google Hub via Google Actions? I know I can use a Basic Card to display images and text, and even a link (although that link does not show up on the Hub).
I specifically want to be able to trigger playback of a YouTube video on my Google Hub.
Actions are not able to start playing video content. Media responses are only for audio.
I have a similar need. After a chat with an action on google, I want to play user requested youtube videos (chains-of) on a local "big screen" (TV-like / PC).
A workaround solution could be:
you build an Action that selects one or more videos;
the Action also acts as a server for the client described below;
the Action communicates (via SSE, WebSocket, plain HTTP, ...) with a client browser page containing a small JavaScript program that dynamically displays the video (the video ID is sent over the SSE client-server channel).
Below is a rough JS script (I'm not a web developer) that just gives you the idea:
<script>
function loadVideoWithId(id) {
  const tvEmbedMode = "embed/" // alternative: "tv#/watch?v="
  const url = `https://www.youtube.com/${tvEmbedMode}${id}?fs=1&autoplay=1&loop=1`
  const iframe = `<iframe src="${url}" width="1600" height="900" allowfullscreen frameborder="0"></iframe>`
  document.write(iframe)
}
loadVideoWithId('hHW1oY26kxQ')
</script>
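To sketch the missing SSE wiring (the /events endpoint, the port, and the pushVideoId helper are hypothetical, not from the original post), the page above would subscribe to the server and call loadVideoWithId whenever the Action pushes a new video ID:

// Client side, added to the page above: subscribe to the server's event stream.
const source = new EventSource('/events');
source.onmessage = (event) => {
  // The server sends the bare YouTube video ID as the payload.
  loadVideoWithId(event.data);
};

// Server side (plain Node.js): keep the connected pages and push IDs to them.
const http = require('http');
const clients = [];

http.createServer((req, res) => {
  if (req.url === '/events') {
    res.writeHead(200, {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Connection': 'keep-alive',
    });
    clients.push(res);
    req.on('close', () => clients.splice(clients.indexOf(res), 1));
  }
}).listen(8080);

// Called from the Action's fulfillment when the user picks a video.
function pushVideoId(id) {
  clients.forEach((res) => res.write(`data: ${id}\n\n`));
}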
I'm writing a bot in Node.js using the MS Bot Framework. To send attachments, I'm actually using the filestream buffer as the contentUrl, e.g.
...
// Buffer.from replaces the deprecated new Buffer(...) constructor.
var base64 = Buffer.from(filedata).toString('base64');
var msg = new builder.Message()
    .setText(session, text)
    .addAttachment({
        contentUrl: util.format('data:%s;base64,%s', contentType, base64),
        contentType: contentType
    });
session.send(msg);
...
where contentType is the proper mimetype for the file in question.
When I test this locally (using the Bot Framework Emulator), this works perfectly for both image and audio files - messages with image attachments display the image, and messages with audio attachments show the audiocard allowing for playback, etc.
However, when I test this through FB Messenger, the images work fine, but the audio messages simply never appear in FB. Not even the text of the message comes through; it's as if the entire message is lost. The dialog just skips over the message containing the audio attachment, and I'm not seeing any errors server-side.
This is happening with both mp3 and wav test audio files, each under 1 MB (smaller than many of the image files I've successfully tested).
Is there some trick to sending playable audio files to the FB Messenger channel specifically?
Thanks!
I wasn't (yet) able to get a response from FB support, but after further testing, it looks like there is a filesize limit on audio files FB Messenger will accept.
Specifically, I was able to get a sample file of ~45 KB to send and display in Messenger successfully, but a larger file of ~400 KB was dropped (it appeared to send successfully from the server side, but never showed up in Messenger).
Strangely, some of my much larger image files went through, so it seems like this same limit does not exist for image attachments.
Will do some further testing, but it seems the ultimate solution will be either to compress my audio files heavily, or to host them somewhere else and send a URL instead of a filestream, as sketched below.
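For the hosting route, a minimal sketch that mirrors the question's code (the URL is hypothetical):

// Instead of a data: URI, point contentUrl at a hosted file.
var msg = new builder.Message()
    .setText(session, text)
    .addAttachment({
        contentUrl: 'https://example.com/audio/clip.mp3', // hypothetical hosted location
        contentType: 'audio/mpeg'
    });
session.send(msg);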
I am using the navigator.webkitGetUserMedia API to capture the desktop and the microphone to capture audio. When I make the following call
navigator.webkitGetUserMedia({
  audio: true,
  video: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: id,
      maxWidth: screen.width,
      maxHeight: screen.height
    }
  }
}, gotStream, getUserMediaError);
I am getting a screen capture error. Does this API not support the above scenario?
I am able to capture audio and desktop video individually but not together. Also, since I am capturing desktop and not the webcam video, does that make any difference?
Chrome does not allow you to request an audio stream alongside a chromeMediaSource video stream.
See Why Screen Sharing Fails here for more info.
You may be able to circumvent this by sending two separate getUserMedia requests - one for the audio stream and one for the desktop video - and then combining the resulting tracks, as in the sketch below.
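A minimal sketch of that two-request approach, reusing the gotStream and getUserMediaError callbacks from the question (merging the tracks via MediaStream.addTrack is an assumption on my part, not something the question confirms):

// First request: desktop video only (no audio allowed with chromeMediaSource).
navigator.webkitGetUserMedia({
  audio: false,
  video: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: id,
      maxWidth: screen.width,
      maxHeight: screen.height
    }
  }
}, function (desktopStream) {
  // Second request: microphone audio only.
  navigator.webkitGetUserMedia({ audio: true, video: false }, function (micStream) {
    // Merge the microphone track into the desktop stream before handing it on.
    desktopStream.addTrack(micStream.getAudioTracks()[0]);
    gotStream(desktopStream);
  }, getUserMediaError);
}, getUserMediaError);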
I'm trying to get my receiver to play an mp3 file hosted on the server with the following function
playSound_: function(mp3_file) {
  var snd = new Audio("audio/" + mp3_file);
  snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver in my local browser, however, it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or Web Audio API. Simplest is probably a MediaElement.
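A minimal media-element version of the function above, assuming the receiver page contains an <audio id="player"></audio> element (the id is made up):

// Reuse a single <audio> element in the receiver page instead of new Audio().
playSound_: function(mp3_file) {
  var player = document.getElementById('player');
  player.src = 'audio/' + mp3_file;
  player.play();
},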
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get the MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or in both.
I'm pretty sure you don't need a streaming server just to play a video file from an HTTP URL. However, it is a requirement for applications on the App Store that stream large amounts of media over cellular:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server, not client side).
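If the server does provide an HLS playlist, the client usage stays essentially the same; a minimal sketch (the playlist URL is hypothetical):

// Same MonoTouch client code, pointed at an HLS playlist instead of a raw mp4.
string url = @"http://testhost.com/test/playlist.m3u8"; // hypothetical HLS URL
var nsurl = NSUrl.FromString(url);
var mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
this.View.AddSubview(mp.View);
mp.PrepareToPlay();
mp.Play();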