Livestreaming audio: get metadata - google-cast

With the help of this forum, I have my app playing on Chromecast now. I am trying to figure out how, or whether, it is possible to retrieve the live-streaming audio metadata and display it on my TV. I am using the default media player/receiver and the Cast Companion Library.
The sample Cast video app does not play a live stream, and I have had no luck figuring out how to implement this. For now I am just displaying a static message, but I know the users will complain and want the live song info.

As far as providing metadata goes, there is no difference between live and buffered streams; you provide the metadata in the same MediaInfo object and load that on your receiver.
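As a rough illustration, this is what loading a live audio stream with metadata looks like using the Chrome Web Sender API rather than the Cast Companion Library the question mentions (the Android library exposes equivalent MediaInfo/MediaMetadata builders). The stream URL is a placeholder:

```typescript
// Sketch (Chrome Web Sender API): attach metadata to a live audio stream and
// load it on the default media receiver. The stream URL is a placeholder; the
// Android Cast Companion Library has equivalent MediaInfo/MediaMetadata builders.
declare const chrome: any; // provided by the Cast sender SDK script

const STREAM_URL = 'https://example.com/live/station.mp3'; // hypothetical

function loadLiveStream(session: any, title: string, artist: string): void {
  const metadata = new chrome.cast.media.MusicTrackMediaMetadata();
  metadata.title = title;    // what the default receiver shows on the TV
  metadata.artist = artist;

  const mediaInfo = new chrome.cast.media.MediaInfo(STREAM_URL, 'audio/mp3');
  mediaInfo.streamType = chrome.cast.media.StreamType.LIVE;
  mediaInfo.metadata = metadata;

  const request = new chrome.cast.media.LoadRequest(mediaInfo);
  session.loadMedia(
    request,
    () => console.log('live stream loaded with metadata'),
    (err: unknown) => console.error('load failed', err),
  );
}
```

As far as I know, the default receiver only shows the metadata supplied at load time, so to keep the now-playing song info current you would have to reload with updated metadata or move to a custom receiver.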

Related

How can I record video from a webcam on the client with lossless frame pixels through the browser?

I need to build a website that records a person from their camera (they must allow camera access first), but I need to record frame by frame with lossless pixels.
I tried to figure this out with some options:
opencv.js - I couldn't figure it out; it uses the browser video element, which changes the pixels through compression, right?
ngx-webcam - I read that it captures lossless images, but not video
The other issue is that I need to send the frames to the server.
Should I save the frames on the client, process them on the client machine, and then send the result to the server?
Is there an option to send the video frame data to the server for future use?
Someone told me to build an agent that performs these actions and sends the data in chunks, but I don't really know how to do that, and I need clarification and some guidance on how to start building something like it.
If anyone has example code or anything that can point me toward the solution, it would be very helpful.
I've created something similar before using RecordRTC.
It takes advantage of WebRTC and is pretty straightforward: record the video locally and upload it as a file.
https://github.com/muaz-khan/RecordRTC
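For what it's worth, here is a minimal sketch of that record-locally-then-upload flow with RecordRTC; the /upload endpoint is a placeholder for your own server route. Note that the WebM output is still compressed, so it is not lossless frame data.

```typescript
// Sketch: record the webcam locally with RecordRTC, then upload the result as
// a single file. The '/upload' endpoint is a placeholder for your own server.
import RecordRTC from 'recordrtc';

async function recordAndUpload(durationMs: number): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new RecordRTC(stream, { type: 'video', mimeType: 'video/webm' });

  recorder.startRecording();
  await new Promise((resolve) => setTimeout(resolve, durationMs));

  recorder.stopRecording(async () => {
    const blob = recorder.getBlob();                       // the finished recording
    const form = new FormData();
    form.append('video', blob, 'recording.webm');
    await fetch('/upload', { method: 'POST', body: form }); // placeholder endpoint
    stream.getTracks().forEach((t) => t.stop());            // release the camera
  });
}
```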

Google Action should play radio stream

I need to develop a Google Action which streams an audio/radio stream.
I thought about using a media response.
But the documentation says: "Audio for playback must be in a correctly formatted .mp3 file. Live streaming is not supported."
Documentation
Can someone give me a hint on what I have to do to stream an audio stream? I found a German Google Action, "Baden FM", which streams their radio, but I'm not sure how they do it.
Kind Regards
Stefan
The only ways to do this currently:
Stream it in chunks of MP3 files, using the callback at the end of streaming to stream the next chunk (see the sketch below this answer)
Get listed on TuneIn, Radio.com, or iHeartRadio. From observation, Baden FM seems to be using TuneIn
Through an App Action
Use a website link that starts streaming via a BrowseCarousel or Button
The last two options are not helpful if you're going after non-browser-enabled devices.
I also saw this thread, which has some insight on MP3 size/duration: How can I tell Actions on Google to stream audio?
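Here is a rough sketch of the first option using the actions-on-google Node.js library: serve the stream as pre-cut MP3 chunks and queue the next chunk from the MEDIA_STATUS callback. The chunk URLs, intent names, and conversation-data field are placeholders, and the fulfillment still has to be mounted on an HTTPS webhook.

```typescript
// Sketch of the "MP3 chunks" option with the actions-on-google library:
// play one pre-cut chunk, then queue the next one from the MEDIA_STATUS
// callback. Chunk URLs and intent names are placeholders.
import { dialogflow, MediaObject, Suggestions } from 'actions-on-google';

interface ConvData { chunk?: number }

const app = dialogflow<ConvData, {}>();

// Hypothetical location of the pre-cut MP3 chunks.
const chunkUrl = (n: number) => `https://example.com/radio/chunk-${n}.mp3`;

app.intent('play radio', (conv) => {
  conv.data.chunk = 0;
  conv.ask('Starting the stream.');   // a media response needs a simple response too
  conv.ask(new MediaObject({ name: 'Radio - part 1', url: chunkUrl(0) }));
  conv.ask(new Suggestions('Stop'));  // required on devices with a screen
});

// Dialogflow intent mapped to the actions_intent_MEDIA_STATUS event,
// fired when the current chunk finishes playing.
app.intent('media status', (conv) => {
  const next = (conv.data.chunk ?? 0) + 1;
  conv.data.chunk = next;
  conv.ask('Continuing.');
  conv.ask(new MediaObject({ name: `Radio - part ${next + 1}`, url: chunkUrl(next) }));
  conv.ask(new Suggestions('Stop'));
});

// `app` still needs to be mounted on an HTTPS webhook (Express, Cloud Functions, ...).
```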
Google Actions do not currently support live audio streaming. I'm in contact with them, but it seems they have no ETA for supporting this.
I was successful doing this with an MP3 live stream:
NPR: https://npr-ice.streamguys1.com/live.mp3?ck=1597372625378
but not with MPD:
BBC test stream: https://rdmedia.bbc.co.uk/dash/ondemand/testcard/1/client_manifest-audio.mpd
or with the HLS stream that my company uses (.m3u8; I can't publish the link publicly).
Note: I added the links as text/code since I'm not sure whether those companies' policies are cool with them being indexed.

Can I record and upload video with the YouTube API?

Can I use the YouTube API in a web app to record and save/upload a video without it being broadcast live?
Use the Videos: insert method to upload video using the YouTube API. As for saving/downloading, I think it's against their Terms of Service:
You shall not download any Content unless you see a “download” or
similar link displayed by YouTube on the Service for that Content.
You can see this SO post for further reference on downloading content from YouTube.
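If you just need the upload part, here is a minimal Node.js sketch of Videos: insert using the googleapis client; the OAuth2 client and the file path are placeholders, and the recording itself would be produced elsewhere (for example in the browser) and sent to your server first.

```typescript
// Sketch: upload an already-recorded file with the YouTube Data API's
// Videos: insert method (googleapis Node.js client). The OAuth2 client and
// the file path are placeholders.
import fs from 'fs';
import { google } from 'googleapis';

async function uploadRecording(auth: InstanceType<typeof google.auth.OAuth2>): Promise<void> {
  const youtube = google.youtube({ version: 'v3', auth });

  const res = await youtube.videos.insert({
    part: ['snippet', 'status'],
    requestBody: {
      snippet: { title: 'My recording', description: 'Uploaded via the API' },
      status: { privacyStatus: 'private' },                    // uploaded, not broadcast live
    },
    media: { body: fs.createReadStream('./recording.webm') },  // placeholder path
  });

  console.log('Uploaded video id:', res.data.id);
}
```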
YouTube does not support recording straight to their platform anymore.
https://support.google.com/youtube/answer/57409?hl=en

Nest Camera Video Streaming in VLC player

I got a public share Nest camera address from my friend.
Instead of using a web browser to watch the video, I want to use the VLC player to stream it. This would allow me to use many other features of VLC to do video analytics on the video.
How can I do it?
I was able to do this with these steps:
Go to the public video share URL. It should be something like this: http://video.nest.com/live/pSgnOZ0s4t
If you use the developer tools in Chrome and watch the network traffic, look for a URL ending in .m3u8; it will be something like this: https://stream-delta.dropcam.com/nexus_aac/37451e60aeac457f9800704f1662147e/playlist.m3u8
Once you have that, open the file in a text editor; you will find something like this inside:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=400816,CODECS="avc1.77.31,mp4a.40.2",RESOLUTION=1280x720
chunklist_w391480529.m3u8
The stream URL is then
https://stream-delta.dropcam.com/nexus_aac/37451e60aeac457f9800704f1662047e/chunklist_w391480529.m3u8
Once you have this, install livestreamer to extract the video like this:
livestreamer "hls://https://stream-delta.dropcam.com/nexus_aac/37451e60aeac457f9800704f1662047e/chunklist_w391480509.m3u8" best -o nest_video.ts
This will save the file to your disk.
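If you want to automate the playlist-to-chunklist lookup from the steps above, here is a small sketch; the playlist URL is the anonymized example from this answer, and it assumes Node 18+ so the global fetch is available.

```typescript
// Sketch: automate the playlist -> chunklist lookup from the steps above.
// The playlist URL is the anonymized example from this answer; assumes Node 18+
// so that the global fetch is available.
const playlistUrl =
  'https://stream-delta.dropcam.com/nexus_aac/37451e60aeac457f9800704f1662047e/playlist.m3u8';

async function getChunklistUrl(url: string): Promise<string> {
  const res = await fetch(url);
  const playlist = await res.text();

  // The first non-comment line of the master playlist is the variant
  // playlist (chunklist_xxx.m3u8), relative to the playlist's own URL.
  const line = playlist
    .split('\n')
    .map((l) => l.trim())
    .find((l) => l.length > 0 && !l.startsWith('#'));
  if (!line) throw new Error('No variant playlist found in ' + url);

  return new URL(line, url).toString();
}

getChunklistUrl(playlistUrl).then((u) => console.log('Stream URL:', u));
```

The resulting chunklist URL can usually also be opened directly in VLC (Media > Open Network Stream) instead of saving the file with livestreamer.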
I used this to avoid a Nest Aware subscription. Unfortunately, they charge a lot for that service, considering someone can just save the video to disk and upload it to a cheap cloud option...
I wrote a page that takes a public Nest video URL and returns an HLS .m3u8 streaming URL:
get media url for nest/ dropcam cameras

Record Screen's Happenings (Audio + Video)

I am new to WebRTC and want to implement a system like video conferencing, live streaming, or Skype using WebRTC and NodeJS.
I am confused about one thing, as it is one of our client's requirements: suppose that whatever is happening on the page (it may be video conferencing, say one moderator answering many audience members one by one) should be captured as one video that continuously records all of this together and sends a live stream to the server to save in our database.
Is this kind of thing implementable or not?
Any help, please.
You can capture video by grabbing JPEG images from a canvas element. You could also capture the entire page (if there are numerous videos on the same page) by grabbing the page itself through Chrome.
For audio, recording remote audio with the Audio API is still an issue, but locally grabbed audio is not a problem.
RecordRTC and my modified version record streams either to a file or through websockets, respectively.
See Capture a page to a stream for how to record or screen-share an entire page of Chrome.
If you have multiple different videos that are not all on the same page but want to combine them all, I would suggest recording them as above and then combining and syncing them up server side (not in JavaScript, but probably in C or C++).
If you MUST record remote audio, then I would suggest having those particular pages send their audio data over websockets themselves, so that you can sync it up with their video and with the other sessions.
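For the canvas approach mentioned above, here is a minimal sketch that draws a video element into a canvas on an interval and ships JPEG frames to the server over a WebSocket; the endpoint URL and frame rate are placeholders, and audio would be sent separately as described.

```typescript
// Sketch: grab JPEG frames of a playing <video> element via a canvas and send
// them to the server over a WebSocket. The endpoint URL and frame rate are
// placeholders; audio is handled separately, as described above.
function streamVideoFrames(video: HTMLVideoElement, fps = 10): () => void {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext('2d')!;
  const socket = new WebSocket('wss://example.com/frames'); // hypothetical endpoint

  const timer = window.setInterval(() => {
    if (socket.readyState !== WebSocket.OPEN) return;
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    // toBlob avoids the base64 overhead of toDataURL
    canvas.toBlob((blob) => blob && socket.send(blob), 'image/jpeg', 0.9);
  }, 1000 / fps);

  // Returns a stop function that releases the interval and the socket.
  return () => {
    window.clearInterval(timer);
    socket.close();
  };
}
```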
