How to use an RTSP live stream link from the Tuya API?

I've got a link from the Tuya API Explorer using the service "IoT Video Live Stream". I want to know where I can use this link to stream my camera's video. I have the video in my Tuya app, but I want to use this link directly.
Here's an example of the API return.
{"result": {
"url": "rtsps://eb3ba0.....aa418b.....:4UBYKMX9T.....T0#aws-tractor1.tuyaus.com:443/v1/proxy/echo_show/b6721582601.....b46ac1b71.....bbb3d*****bf7.....},"success": true,"t": 1642462403630}

Related

How to integrate Google Home CameraStream with AWS kinesis video streaming?

We have a home camera device that uses AWS Kinesis Video Streams for WebRTC.
We are trying to integrate Google Home to enable command-based live streaming, using https://developers.google.com/assistant/smarthome/traits/camerastream for the setup. As per this link, we need the following info to proceed with streaming:
{
  "cameraStreamIceServers": "[{\"urls\": \"stun:stun.l.partner.com:19302\"},{\"urls\": \"turn:192.158.29.39:3478?transport=udp\",\"credential\": \"JZEOEt2V3Qb0y27GRntt2u2PAYA=\",\"username\": \"XXXXXXX\"},{\"urls\": \"turn:XXX.XXX.29.39:3478?transport=tcp\",\"credential\": \"XXXXXXXXXXX=\",\"username\": \"XXXXXXXXXXXXXXXXX\"}]",
  "cameraStreamSignalingUrl": "https://example.com/signaling/answer",
  "cameraStreamOffer": "o=- 4611731400430051336 2 IN IP4 127.0.0.1...",
  "cameraStreamProtocol": "webrtc"
}
cameraStreamSignalingUrl - URL endpoint for retrieving and exchanging camera and client session description protocols (SDPs). The client should return the signaling URL which uses the cameraStreamAuthToken as the authentication token in the request header.
cameraStreamOffer - Offer session description protocol (SDP).
We also checked the Kinesis documentation but haven't found any clue on how to get this information. As we are beginners at this, any help would be appreciated.
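One possible direction: Kinesis Video Streams exposes its TURN servers through the GetIceServerConfig API, which could be used to build the cameraStreamIceServers value. Below is a rough sketch with the AWS SDK for JavaScript v2; the channel ARN and region are placeholders, and the exact wiring into your fulfillment is an assumption:

// Fetch STUN/TURN servers for a Kinesis Video Streams signaling channel
// and serialize them in the shape CameraStream expects.
const AWS = require("aws-sdk");

async function getCameraStreamIceServers(channelARN, region) {
  const kinesisVideo = new AWS.KinesisVideo({ region });

  // Look up the HTTPS endpoint for this signaling channel.
  const { ResourceEndpointList } = await kinesisVideo
    .getSignalingChannelEndpoint({
      ChannelARN: channelARN,
      SingleMasterChannelEndpointConfiguration: { Protocols: ["HTTPS"], Role: "VIEWER" },
    })
    .promise();
  const endpoint = ResourceEndpointList[0].ResourceEndpoint;

  // Ask the signaling service for TURN servers.
  const signaling = new AWS.KinesisVideoSignalingChannels({ region, endpoint });
  const { IceServerList } = await signaling
    .getIceServerConfig({ ChannelARN: channelARN })
    .promise();

  // Prepend the regional STUN server and reshape to RTCIceServer entries.
  const iceServers = [
    { urls: `stun:stun.kinesisvideo.${region}.amazonaws.com:443` },
    ...IceServerList.map((s) => ({
      urls: s.Uris,
      username: s.Username,
      credential: s.Password,
    })),
  ];

  // cameraStreamIceServers is a JSON-encoded string, not a raw array.
  return JSON.stringify(iceServers);
}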

How to use Azure Media Service Live Stream with Apple AirPlay

I am using Azure Media Services to live stream an event. My source is an HD-SDI feed captured via an AJA KONA LHi card and sent to Azure using Wirecast.
I'm using the default Wirecast settings for Azure Media Services, with standard encoding and the 720p preset, and the Azure Media Player embedded in a private site.
Everything seems to work; however, iOS devices are unable to AirPlay the stream. The video plays correctly on the device, but no AirPlay controls are available.
If I use the exact same setup to stream my webcam, the AirPlay controls are available. Is there some specific configuration required to make this work reliably?
On iPad, Azure Media Player streams videos in DASH by default. You need to specify the techOrder to give HLS a higher priority, e.g.:
var myOptions = {
    // Listing "html5" first makes iOS use native HLS playback, which enables AirPlay.
    techOrder: ["html5", "azureHtml5JS", "flashSS"],
    autoplay: false,
    controls: true,
    width: "640",
    height: "400"
};
var myPlayer = amp("azuremediaplayer", myOptions);
If you don't see the AirPlay button on the video player, you can swipe down on the upper-right region of the screen to bring up the Quick Controls and access AirPlay.
If this still doesn't work, reply with the url of your webpage and I can take a look.

Google Nest Hub can't play HLS

My questions:
I push an HLS stream to the Google Nest Hub (GNH) using the action.devices.commands.GetCameraStream response format. The GNH does nothing but show a loading UI for a few seconds.
Is something wrong with my HLS file?
How can I get logs from the GNH to help me debug?
What I know so far:
I have tried pushing an mp4 URL (1080p, under 60 fps) to the GNH, and that works well.
I have tried converting the mp4 to HLS with several libraries, including ffmpeg and Bento4.
Here is the JSON I send to the GNH:
{
  "payload": {
    "commands": [{
      "status": "SUCCESS",
      "states": {
        "cameraStreamAccessUrl": "http://path/of/stream.m3u8"
      },
      "ids": ["....."]
    }]
  },
  "requestId": "My_Request_Id"
}
It seems that you are missing the required property cameraStreamSupportedProtocols. Try adding the protocol and see if you are able to get the stream to work. This will load the default cast camera receiver since you are trying to play HLS content. If you are still seeing an issue with playback, it could be that your stream is malformed and needs to be revised.
Playback logs will only be available to you if you create your own basic receiver app and specify it in your response using the cameraStreamReceiverAppId property. To learn more about creating a Cast receiver app, see the overview (https://developers.google.com/cast/docs/web_receiver) and the basic receiver guide (https://developers.google.com/cast/docs/web_receiver/basic). There is also a default camera receiver sample in the Cast samples GitHub (https://github.com/googlecast/CastCameraReceiver).
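For illustration, a hedged sketch of where that property would live: cameraStreamSupportedProtocols belongs in the device's SYNC attributes rather than the EXECUTE states, so the SYNC entry for the camera might look like this (the id is a placeholder):

{
  "id": ".....",
  "type": "action.devices.types.CAMERA",
  "traits": ["action.devices.traits.CameraStream"],
  "attributes": {
    "cameraStreamSupportedProtocols": ["hls"],
    "cameraStreamNeedAuthToken": false,
    "cameraStreamNeedDrmEncryption": false
  }
}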

Google Action - playing youtube video on google hub

How can I play a specific YouTube video on my Google Hub via Google Actions? I know I can use a Basic Card to display images, text, and even a link (although that link does not show up on the Hub).
I specifically want to be able to trigger and play a YouTube video on my Google Hub.
Actions are not able to start playing video content. Media responses are only for audio.
I have a similar need. After a chat with an Action on Google, I want to play user-requested YouTube videos (chains of them) on a local "big screen" (TV-like / PC).
A workaround could be:
You build an Action that selects one or more videos.
The Action also acts as a server for the client described below.
The Action communicates (via SSE, WebSocket, HTTP...) with a client browser page containing a small JavaScript program that dynamically displays the video (the id is sent via SSE from server to client).
Here is a rough JS script (I'm not a web developer) that just gives you the idea:
<script>
// Embed a YouTube player for the given video id (fullscreen allowed,
// autoplay and loop enabled). An alternative URL prefix is "tv#/watch?v=".
function loadVideoWithId(id) {
  const url = `https://www.youtube.com/embed/${id}?fs=1&autoplay=1&loop=1`;
  const iframe = `<iframe src="${url}" width="1600" height="900" allowfullscreen frameborder="0"></iframe>`;
  document.write(iframe);
}
loadVideoWithId('hHW1oY26kxQ')
</script>
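To close the loop on the SSE idea above, a hypothetical companion snippet could listen for video ids pushed by the Action's server and swap the embed in place; the /video-events endpoint name is an assumption:

<script>
// Create one reusable iframe instead of rewriting the document per video.
const player = document.createElement("iframe");
player.width = "1600";
player.height = "900";
player.allowFullscreen = true;
document.body.appendChild(player);

// Listen for YouTube video ids pushed by the server over SSE.
const source = new EventSource("/video-events");
source.onmessage = (event) => {
  // event.data carries the video id selected during the conversation.
  player.src = `https://www.youtube.com/embed/${event.data}?fs=1&autoplay=1&loop=1`;
};
</script>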

Play Audio from receiver website

I'm trying to get my receiver to play an mp3 file hosted on the server with the following function
playSound_: function(mp3_file) {
var snd = new Audio("audio/" + mp3_file);
snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver in my local browser, it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or the Web Audio API. The simplest is probably a MediaElement.
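A minimal sketch of the MediaElement approach, assuming the mp3 files sit under audio/ next to the receiver page: declare one <audio> tag in the receiver HTML and reuse it, rather than constructing a fresh Audio object per sound.

<audio id="sfx" preload="auto"></audio>

<script>
// Reuse a single, preloaded <audio> element; re-creating the media
// pipeline for every sound is a common source of startup delay.
function playSound(mp3File) {
  var sfx = document.getElementById("sfx");
  sfx.src = "audio/" + mp3File;
  sfx.play();
}
</script>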
