I work with the ONVIF protocol. When I get the profiles, both have an audio encoder configuration, but only the first one (main stream) plays sound. How can I turn on audio on the second (sub-stream) profile?
Check the different media profiles you can get from the Media and Media2 services. Probably the second profile does not have an AudioEncoderConfiguration and/or AudioSourceConfiguration attached to it.
You can use the operations of the Media service (if the device supports Profile S) or the Media2 service (if it supports Profile T) to modify the profiles.
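For the Media service this usually means attaching the existing audio source and audio encoder configurations to the sub-stream profile with AddAudioSourceConfiguration and AddAudioEncoderConfiguration (with Media2 the equivalent is a single AddConfiguration call). Here is a minimal sketch in Node 18+ using the built-in fetch; the endpoint URL, profile token and configuration tokens are placeholders you would read from GetProfiles and GetAudioEncoderConfigurations, and the WS-Security UsernameToken header that most cameras require is omitted:

// Attach the existing audio source and audio encoder configurations
// to the sub-stream profile via raw SOAP calls to the ver10 Media service.
const MEDIA_ENDPOINT = 'http://192.168.1.64/onvif/media_service'; // placeholder

function soapBody(operation, profileToken, configurationToken) {
  return `<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
      xmlns:trt="http://www.onvif.org/ver10/media/wsdl">
    <s:Body>
      <trt:${operation}>
        <trt:ProfileToken>${profileToken}</trt:ProfileToken>
        <trt:ConfigurationToken>${configurationToken}</trt:ConfigurationToken>
      </trt:${operation}>
    </s:Body>
  </s:Envelope>`;
}

async function attachAudio(profileToken, audioSourceToken, audioEncoderToken) {
  const calls = [
    ['AddAudioSourceConfiguration', audioSourceToken],
    ['AddAudioEncoderConfiguration', audioEncoderToken],
  ];
  for (const [operation, token] of calls) {
    const res = await fetch(MEDIA_ENDPOINT, {
      method: 'POST',
      headers: { 'Content-Type': 'application/soap+xml; charset=utf-8' },
      body: soapBody(operation, profileToken, token),
    });
    if (!res.ok) throw new Error(`${operation} failed: HTTP ${res.status}`);
  }
}

// Example usage with placeholder tokens:
// attachAudio('SubStreamProfileToken', 'AudioSourceConfigToken', 'AudioEncoderConfigToken');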
I am trying to add music to my Dialogflow agent. I don't want to add it from the Dialogflow console; I want to add it from the webhook. Can you please tell me how to add the music from the webhook? I am trying this code, but it's not working:
app.intent('Music', (conv) => {
var speech = '<speak><audio src="soundbank://soundlibrary/ui/gameshow/amzn_ui_sfx_gameshow_countdown_loop_32s_full_01"/>Did not get the audio file</speak>';
});
Also, I want to use an interrupt keyword that will stop this music. Is there a predefined way to do this, or, if it has to be user-defined, how do I interrupt the music and proceed with the rest of my code?
Firstly, to be able to add music, it needs to be hosted at a publicly accessible HTTPS endpoint; see the docs. So make sure you can access your file even in a private browsing mode such as Chrome's incognito.
Secondly, if you choose to use SSML to play your audio, the audio becomes part of the speech response. With this approach you won't be able to create any custom interruptions or other controls over the music; the user can only stop it by stopping your Action or saying "Okay Google" again to interrupt your response.
If you want to let your users control the music you send them, have a look at the media responses in the actions-on-google library.
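As a rough sketch, assuming an Express webhook built with the actions-on-google v2 client library; the MP3 and icon URLs are placeholders and must be publicly reachable over HTTPS:

const express = require('express');
const bodyParser = require('body-parser');
const { dialogflow, MediaObject, Image, Suggestions } = require('actions-on-google');

const app = dialogflow();

app.intent('Music', (conv) => {
  // SSML route: the audio is part of the spoken response and cannot be
  // paused or interrupted other than by stopping the Action.
  // conv.ask('<speak><audio src="https://example.com/music.mp3">Could not play the audio file.</audio></speak>');

  // Media response route: the platform provides playback controls.
  conv.ask('Playing your music.'); // a simple response must accompany the media object
  conv.ask(new MediaObject({
    name: 'My track',
    url: 'https://example.com/music.mp3', // placeholder; must be public HTTPS
    description: 'Background music',
    icon: new Image({ url: 'https://example.com/icon.png', alt: 'Music icon' }),
  }));
  conv.ask(new Suggestions('Stop')); // chips keep the conversation open on screen devices
});

express().use(bodyParser.json(), app).listen(3000);

With a media response the Assistant shows its own playback controls, so you don't have to implement a custom interrupt keyword yourself.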
I am trying to make a music player app that plays music one after the other. Here is a sample conversation:
User: play Beyblade sound
Appl: Playing "Beyblade Burst: The LOUDEST Beyblade?" by "Kevo"
User: turn it down
Appl: <default fallback>
What can I do here? Is there a way I can ask Google to handle that request without closing my app?
It is a known bug currently that volume controls aren't available inside a Conversational Action (either when playing a Media response or audio through SSML).
I have an audio file (e.g. an MP3) on Google Drive, accessible at a direct URL of the form https://drive.google.com/file/d/audio_file_id/view, where a small timeline scrubber and a play/pause button let the user hear the audio (and supposedly control the position in the file, but this is not working for me in Chrome on a Mac; advice on this is appreciated).
I would like to start playback at a specific time. How can this be done? Is there a way to do it like one can with video files hosted on Google Drive, by adding a parameter such as t=15s to the video URL, as on YouTube?
The Drive API doesn't have an equivalent of the YouTube API's seekTo method, but you can add t=123s at the end of the URL and it will work just like it.
sample:
https://drive.google.com/file/d/0Bz6447wI7cGV12546VBmQ2M/view?t=651s
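For completeness, a tiny sketch of building such a link programmatically (the file ID is a placeholder):

// Append a start-time parameter to a Drive preview link.
const url = new URL('https://drive.google.com/file/d/FILE_ID/view');
url.searchParams.set('t', '651s'); // start playback 651 seconds in
console.log(url.toString());       // https://drive.google.com/file/d/FILE_ID/view?t=651s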
I want to implement an audio card like SoundCloud's. When I made an audio card, it worked fine on the web, but when I played the same card in the mobile app, it redirected to the web browser. I want my audio to play in the app itself rather than redirecting and opening a new page on the web. Let me know if anyone has an idea how to implement this.
The only way to implement an audio card yourself would be to use Twitter's player card option. On mobile, that will still end up redirecting to a browser.
The SoundCloud card is a special card implemented by Twitter and is not generally available to other sites.
I am new to Spotify apps. I am trying to display a video on my Spotify app's advertisement page. However, none of the things I have tried work (JW Player, Flowplayer, YouTube iframe, HTML5). Any hints?
Greetings!
Any external hosts whose assets you request need to be added to the RequiredPermissions section of the app manifest:
http://developer.spotify.com/technologies/apps/guidelines/developer/#applicationmanifest
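For example, if your video and player assets were served from example.com (a placeholder), the relevant entry in manifest.json would look roughly like this, with the other required manifest fields omitted:

{
  "RequiredPermissions": [
    "http://example.com",
    "https://example.com"
  ]
}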
In order to display a video in Spotify you have to fulfill the following requirements:
Your video file has to be stored on a third-party server.
You have to use an open-source video player (e.g. JW Player).
You have to create an HTML file that will be your lightbox; this is the file in which you embed the video and the player library.
You need an image file for the background of the player.
The destination URL must open in a new browser window.
And don't forget to add the address of your host to the permissions in the Spotify app's manifest file!
Example: here