How to display an RTSP stream from an IP camera in a web app - http-live-streaming

I'm looking for a way to display multiple camera streams (up to 200 cameras) in a single web application (only one stream will be visible at a time).
My initial plan was to connect the web app to the cameras using RTSP, but that protocol is not supported by most browsers. I have found some sources suggesting it should be possible with a third-party plugin, but so far no luck.
Another idea I had was to deploy a Kubernetes cluster with a transcoding service for each camera that converts the RTSP stream into an HLS stream, which a web app can play. But this means defining a hard link between each transcoder pod and each camera.
So my question: is there an easy way of using RTSP streams in a web app? Or what do you think is a viable way to handle this many cameras in a web app?
Many thanks!
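For reference, here is a minimal sketch of the per-camera transcoding idea I had in mind, assuming ffmpeg is installed on the host; the camera URL and output directory below are placeholders:

```typescript
import { spawn } from "child_process";

// Hypothetical example values -- replace with a real camera URL and segment directory.
const cameraUrl = "rtsp://camera-01.example.local/stream1";
const outputDir = "/var/www/hls/camera-01";

// One ffmpeg process per camera: copy the H.264 video as-is and repackage it
// into a short rolling HLS playlist that a browser player can fetch.
const ffmpeg = spawn("ffmpeg", [
  "-rtsp_transport", "tcp",        // RTSP over TCP is usually more reliable than UDP
  "-i", cameraUrl,
  "-c:v", "copy",                  // no re-encode if the camera already sends H.264
  "-c:a", "aac",
  "-f", "hls",
  "-hls_time", "2",                // 2-second segments
  "-hls_list_size", "5",           // keep a small live window
  "-hls_flags", "delete_segments", // drop old segments to bound disk usage
  `${outputDir}/index.m3u8`,
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
ffmpeg.on("exit", (code) => console.log(`ffmpeg exited with code ${code}`));
```

Started on demand (e.g. when a user selects a camera) rather than permanently per pod, something like this would avoid running 200 transcoders at once.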

Related

RTMP CDN to cloud streamer or free streamer

I am new to RTMP and live streaming.
I have my own RTMP server, but the issue is distribution; I was looking for a simple RTMP streaming CDN that can support audio streaming with HLS or DASH output.
Or something free similar to YouTube Live, but for audio, with embeddable HTML.
As of recently (January 2022), most CDNs support only file-based streaming protocols like HLS/DASH/CMAF; even if you publish the stream over RTMP or WebRTC, the CDN converts it to one of these protocols.
If you want to build a low-latency live streaming application with RTMP-like lag, HTTP-FLV is recommended, and you need a CDN that supports HTTP-FLV rather than RTMP. HTTP-FLV works well on PC and mobile; please read this post.
You could build your own CDN with an open-source media-server cluster, for example SRS Edge to deliver HTTP-FLV, running on AWS EC2.
For a CDN that supports HTTP-FLV, you could check Tencent Cloud Streaming Services, which supports publishing via RTMP and delivery via HLS/HTTP-FLV/WebRTC.
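As a rough illustration of the playback side, here is a minimal sketch using the open-source flv.js player to play an HTTP-FLV stream in the browser; the stream URL and video element id are placeholders:

```typescript
import flvjs from "flv.js";

// Placeholder URL -- point this at your HTTP-FLV edge (e.g. an SRS Edge server).
const streamUrl = "https://cdn.example.com/live/stream.flv";

const videoElement = document.getElementById("player") as HTMLVideoElement;

if (flvjs.isSupported()) {
  // Create a player for a live FLV stream and attach it to the <video> element.
  const player = flvjs.createPlayer({
    type: "flv",
    isLive: true,
    url: streamUrl,
  });
  player.attachMediaElement(videoElement);
  player.load();
  player.play();
}
```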

Azure Communication Services (Calling SDK) - How many video streams are supported?

I am very confused about the Calling SDK specs. They are clear about the fact that only one video stream can be rendered at a time, see here...
BUT when I try out the following sample, I get video streams for all members of the group call. When I try the other example (both from MS), it behaves as written in the specs... So I am totally confused as to why this other example can render more than one video stream in parallel. Can anybody tell me how to understand this? Is it possible or not?
EDIT: I found out that both examples work with multiple video streams. So it is cool that the service provides more than the specs say, but I don't get why the specs describe limitations that don't exist...
Only one video stream is supported in the ACS Web (JS) Calling SDK; multiple video streams can be rendered for incoming calls, but A/V quality is not guaranteed at this stage for more than one video. Support for 4 (2x2) and 9 (3x3) streams is on the roadmap, and we'll publish support as network bandwidth paired with quality assurance testing and verification is identified and completed.
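For reference, rendering a remote participant's video with the JS Calling SDK roughly follows the pattern below. This is a minimal sketch based on the @azure/communication-calling package; it assumes an established call and a container element with id "video-container", and omits error handling and stream-availability events:

```typescript
import { Call, RemoteVideoStream, VideoStreamRenderer } from "@azure/communication-calling";

// Render the first remote participant's first video stream into a container <div>.
async function renderFirstRemoteVideo(call: Call): Promise<void> {
  const participant = call.remoteParticipants[0];
  const stream: RemoteVideoStream | undefined = participant?.videoStreams[0];
  if (!stream) return;

  const renderer = new VideoStreamRenderer(stream);
  const view = await renderer.createView({ scalingMode: "Crop" });
  document.getElementById("video-container")?.appendChild(view.target);
}
```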

Azure Media Services: How to stream from a web browser

I am new to video streaming and trying to implement a simple app to stream video from the web browser.
I decided to use Azure Media Services for this purpose and found a lot of tutorials on their site about streaming to a channel using desktop encoders like OBS, Wirecast, etc...
What I want to achieve is to stream video from the web browser using a webcam. I am trying to find out whether there is a solution for this with Azure Media Services or whether everything has to be implemented from scratch.
Thanks in advance
You will need RTMP streaming support for this, and RTMP isn't natively supported in browsers. One approach is to use a web application with RTMP relay support - e.g. restream.io - as this lets you use a streaming protocol that is natively supported in the browser (e.g. WebRTC) to send the stream out, which the relay service then converts to RTMP. If you need to build your own web application, you would have to implement this essentially from scratch, and you can leverage a streaming media framework such as GStreamer for it.
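A minimal, hedged sketch of the browser side of that relay approach: capture the webcam with getUserMedia, record it with MediaRecorder, and ship the chunks to a relay server over a WebSocket, where something like GStreamer or ffmpeg would re-mux them into RTMP. The WebSocket URL is a placeholder and the server side is not shown:

```typescript
// Capture the webcam and microphone, then stream recorded chunks to a relay server.
async function startWebcamRelay(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  // Placeholder endpoint -- the relay service would re-mux these chunks into RTMP.
  const socket = new WebSocket("wss://relay.example.com/ingest");

  const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp8,opus" });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // each chunk is a WebM fragment
    }
  };

  // Emit a chunk every second; smaller timeslices lower latency but add overhead.
  socket.onopen = () => recorder.start(1000);
}
```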

How does online radio live stream music, and are there resources available to build one with Node.js?

I'm a little curious about how live streaming web applications work. Recently I've wanted to build something like an online radio that can live stream to all clients: music, speech, etc. I'm quite familiar with Java Spring MVC and Node.js. If there are resources using these technologies, it would be really helpful for me to see how this works. Thanks in advance.
There are two good articles about it:
Streaming Audio on the Web with NodeJS
Using NodeJS to Stream a Radio Broadcast
You may also find this module helpful:
https://www.npmjs.com/package/websockets-streaming-audio
The best way to do this is to use Node.js as your source application and leave the actual serving of streams to existing servers. There's no reason to reinvent streaming on the web if you can get all the flexibility you need by writing the source end.
The flow will look like this:
Your Radio Source App --> Icecast (or similar) --> Listeners
Inside your app itself:
Raw audio sources --> Codecs (MP3, AAC w/ADTS, etc.) --> Icecast Source Client
Basically, you'll need to create a raw PCM audio stream using whatever method fits your use case. From there, you'll send that stream off to a handful of codecs, configured with different bitrates. Which bitrates and quality you use is up to you, based on the bandwidth available to your users and the quality tradeoff you prefer. These days, I usually have 64k streams for bad mobile connections and 256k streams for good connections. As long as you have at least a 128k stream in there, you'll be putting out acceptable quality.
The Icecast source client can be a simple HTTP PUT these days. The old method is very similar... instead of PUT, the verb was SOURCE. (There are some other minor differences as well, but that's the gist.)
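To make that source-client step concrete, here is a minimal hedged sketch in Node.js that PUTs an already-encoded MP3 stream to an Icecast mount. The host, mount point, password, and input file are placeholders, and the encoder stage (PCM piped through an MP3/AAC encoder) is assumed to exist upstream:

```typescript
import fs from "fs";
import http from "http";

// Stand-in for your encoder output: in a real radio app this would be the
// MP3/AAC bytes coming out of your codec stage, not a file on disk.
const audioSource = fs.createReadStream("encoded-show.mp3");

// Icecast source credentials are the user "source" plus the source password (placeholder here).
const auth = Buffer.from("source:hackme").toString("base64");

// Recent Icecast versions accept a plain HTTP PUT to the mount point.
const req = http.request({
  method: "PUT",
  host: "icecast.example.com", // placeholder host
  port: 8000,
  path: "/radio.mp3",          // mount point listeners will connect to
  headers: {
    Authorization: `Basic ${auth}`,
    "Content-Type": "audio/mpeg",
    "Ice-Name": "My Node Radio", // optional stream metadata headers
    "Ice-Public": "0",
  },
});

req.on("response", (res) => console.log(`Icecast responded: ${res.statusCode}`));

// Keep writing encoded audio for as long as the broadcast runs.
audioSource.pipe(req);
```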

Scalable cloud storage

I'm going to publish a video on a web page for streaming. I expect more than 100,000 visits per day within a month. I want to upload my video to a server (or service) that offers the same bandwidth to all clients, even if there are hundreds of thousands of clients connected simultaneously.
I will connect the player to the external video.
Note: I cannot use YouTube or Vimeo because the video uses 360° technology, so I need to use my custom player.
Please, could you suggest any service that offers this?
Thanks!!
I would say this is mostly a question of the streaming technology you'd like to use, not of storage alone.
E.g. if you wish to stream via a binary protocol like RTMP, you'll have to use software like Wowza for transcoding and delivery. Hence the load balancing needed to use bandwidth properly will also be handled by that streaming server, e.g. a Wowza load balancer.
So you should decide which protocols and other technologies you plan to use. That will narrow your search parameters.
