I would like to stream live audio from my browser to my icecast server.
So far I managed to record the audio in the browser and store it as a .WAV file.
I was thinking of using a Node.js server to receive the audio, but I don't know how to get the audio from there to the Icecast stream clients.
Does anybody know how to make the link between the Node.js server and the Icecast server? (They can both be on the same machine.)
You can try Webcaster:
http://webcast.github.io/
It has an example for Node.js.
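If you end up wiring it yourself instead, here is a minimal sketch (not Webcaster's API, just plain Node.js) of a source client that pushes already-encoded audio to Icecast over its HTTP source protocol; the host, mount point and password are assumptions, and a reasonably recent Icecast 2.4.x is assumed:

    // Minimal Icecast source client sketch (Node.js, no external deps).
    // Assumes Icecast on localhost:8000, mount /live.mp3, source password "hackme".
    const http = require('http');
    const fs = require('fs');

    const req = http.request({
      host: 'localhost',
      port: 8000,
      method: 'PUT',               // Icecast 2.4+ accepts HTTP PUT for source connections
      path: '/live.mp3',
      headers: {
        'Authorization': 'Basic ' + Buffer.from('source:hackme').toString('base64'),
        'Content-Type': 'audio/mpeg',
        'Ice-Name': 'Browser live stream',
        'Ice-Public': '0',
      },
    });

    // Pipe already-encoded audio into the request body. The WAV recorded in the
    // browser would normally be encoded to MP3/Ogg first; a file is used here
    // only as a stand-in for the audio arriving from the browser.
    fs.createReadStream('live.mp3').pipe(req);

    req.on('response', (res) => console.log('Icecast answered', res.statusCode));
    req.on('error', (err) => console.error('Source connection failed', err));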
I am using youtube_stream_capture to download an ongoing live stream from YouTube. youtube_stream_capture downloads chunks something like this
I was wondering if there is any way to stream these chunks to an RTMP server, maybe to Node Media Server.
I haven't tried anything yet; I did a lot of Google searching but found nothing on this. I don't know how to get started.
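One direction that may work, assuming the chunks are plain media segments that FFmpeg can read and that Node Media Server is listening for RTMP on its default port (both assumptions), is to feed the chunks into FFmpeg over stdin and let FFmpeg push to the RTMP ingest; a rough sketch:

    // Rough sketch: pipe downloaded chunks into FFmpeg, which pushes to an RTMP server.
    // Assumes ffmpeg is installed and the chunk files concatenate into a valid stream.
    const { spawn } = require('child_process');
    const fs = require('fs');

    // Hypothetical list of chunk files, in playback order.
    const chunks = ['chunk_0001.ts', 'chunk_0002.ts', 'chunk_0003.ts'];

    const ffmpeg = spawn('ffmpeg', [
      '-re',                 // read input at its native rate, like a live source
      '-i', 'pipe:0',        // read the concatenated chunks from stdin
      '-c', 'copy',          // don't re-encode, just remux
      '-f', 'flv',           // RTMP expects FLV
      'rtmp://localhost/live/stream',  // assumed ingest URL on the media server
    ], { stdio: ['pipe', 'inherit', 'inherit'] });

    // Write the chunks to ffmpeg's stdin one after another.
    (async () => {
      for (const file of chunks) {
        await new Promise((resolve, reject) => {
          const rs = fs.createReadStream(file);
          rs.on('end', resolve);
          rs.on('error', reject);
          rs.pipe(ffmpeg.stdin, { end: false });  // keep stdin open between chunks
        });
      }
      ffmpeg.stdin.end();
    })();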
What I am trying to do is get a video + audio stream from the front end and host the live stream as MP4 that's accessible in the browser.
I was able to find information on WebRTC, socket.io and RTMP, but I'm not really sure what tool to use / what's best suited for something like this.
Also, a follow-up question: my front end is an iOS app. So in what format would I send the live stream to the server?
It depends on which live streaming protocol you want to play in the player; as @Brad said, HLS is the most common protocol for players.
Note: besides HLS, a native iOS app is able to use fijkplayer or FFmpeg to play almost any live streaming format, like HLS, RTMP or HTTP-FLV, even MKV. However, the most straightforward solution is HLS; you only need a <video> tag to play MP4 or HLS, and MSE is also an option, using flv.js/hls.js to play live streams on iOS/Android/PC; this post is about these protocols.
The stream flow is like this:
FFmpeg/OBS ---RTMP---->--+
                         +--> Media Server ---> HLS/HTTP-FLV ---> Player
Browser ----WebRTC---->--+
The protocol used to push to the media server (or to receive in your Node server) depends on your encoder: RTMP or H5 (WebRTC):
For RTMP, you could use FFmpeg or OBS to push the stream to your media server (see the example command after this list).
If you want to push the stream from HTML5, the only way is to use WebRTC.
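For example, a typical FFmpeg push looks roughly like this (the input file, server address and application/stream names are placeholders for whatever your media server expects):

    ffmpeg -re -i input.mp4 -c copy -f flv rtmp://localhost/live/livestream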
The media server converts the publisher's protocol to the player's protocol, since they use different protocols in live streaming right now (as of 2022.01); please read more in this post.
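On the playback side, as the note above says, HLS only needs a <video> tag plus hls.js on browsers without native HLS support; a minimal sketch, assuming the media server exposes an HLS playlist at /live/livestream.m3u8:

    <!-- Minimal HLS playback sketch; the playlist URL is an assumption. -->
    <video id="player" controls autoplay muted></video>
    <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
    <script>
      const video = document.getElementById('player');
      const src = '/live/livestream.m3u8';           // assumed playlist URL
      if (video.canPlayType('application/vnd.apple.mpegurl')) {
        video.src = src;                             // Safari/iOS play HLS natively
      } else if (Hls.isSupported()) {
        const hls = new Hls();                       // hls.js uses MSE on other browsers
        hls.loadSource(src);
        hls.attachMedia(video);
      }
    </script>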
I am able to play an RTSP stream on a web page using the live555 server, but I need to extract the frames from the RTSP stream and store them in a file.
Can anyone guide me on how to do this?
If you are already familiar with live555, try using the simple RTSP client application that comes with the live555 package. The application is called openRTSP and is located in the testProgs folder. It can read an input RTSP stream and save it to a file.
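For example, something along these lines (the stream URL is a placeholder; check openRTSP's usage output for the exact options in your build):

    openRTSP -4 -d 30 rtsp://192.168.1.10/stream1 > capture.mp4

The -4 flag writes an MP4 file to stdout and -d stops after the given number of seconds; individual frames can then be pulled out of the saved file with another tool such as FFmpeg (e.g. ffmpeg -i capture.mp4 frame_%04d.png).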
Is it possible to capture all the audio on my PC (from a web browser) and stream it over the LAN?
I use the Yandex Music (music.yandex.ru) service. I am logged into my Yandex account and I don't have any audio files, just the online stream. I want to make something like LAN radio: users will visit an HTML page located on our server and listen to my audio stream.
Can I use Icecast or similar software to stream non-file audio?
Or should I connect my PC's line out to line in (or mic) and read the audio stream via Java or Flash? Any ideas?
Have you tried looking at things like Jack and Soundflower? These allow you to reroute the audio from one program to another. You could then reroute the sound into Java or Flash and go from there.
https://rogueamoeba.com/freebies/soundflower/
http://jackaudio.org/
You can try WebRTC and the MediaStream API for that. You can get audio from the user's audio device or from a stream they are playing in the browser. You can find documentation on those APIs on the MDN pages.
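A minimal browser-side sketch, assuming you just want microphone audio (capturing the audio of another tab or the whole system is more restricted and typically goes through getDisplayMedia, subject to browser support and user permission):

    // Ask for microphone audio and hand the resulting MediaStream to a recorder or WebRTC.
    navigator.mediaDevices.getUserMedia({ audio: true })
      .then((stream) => {
        // Option 1: record it in chunks
        const recorder = new MediaRecorder(stream);
        recorder.ondataavailable = (e) => {
          // e.data is a Blob of encoded audio you could upload or stream onward
          console.log('got', e.data.size, 'bytes');
        };
        recorder.start(1000); // emit a chunk roughly every second

        // Option 2: add stream's tracks to an RTCPeerConnection for WebRTC instead
      })
      .catch((err) => console.error('No audio permission:', err));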
Good day! I'm a newbie at video streaming. Can you help me find good ways to make video streaming secure?
I'm having some issues with security on my video hosting project.
I am creating a web page which calls a video stream hosted on a different server from where my web page is deployed.
Server 1 (the web page with the video embed) calls the video to stream from Server 2 (the video host).
The problem is that they are hosted on completely different networks. Should Server 2, where the video is hosted, be private and only allow Server 1 to fetch the video stream, creating a server-to-server transfer of data, or should it be public so that clients can access it directly?
Can you help me decide what to do to secure my videos?
I badly need some ideas on this... thanks guys!
How are you streaming and what streaming protocol are you using?
Server-to-server won't help in securing the video. It is better to stream the video directly from Server 2 (the video host) to the client, so that it isn't an extra overhead for Server 1 (the web page with the video embed). You need a secure way to protect your video on Server 2; if Server 2 is not secure, streaming it through Server 1 won't help either.
Here are details of the security level of different video streaming approaches.
1. Progressive download. This can be done using the normal HTTP protocol. In this approach you can see the video URL in the browser, and once you have the URL you can download it as a normal file download. Security is very low here: even if you sign the video URL, the user can download the video easily.
2. Streaming. You can stream the video using a different protocol such as RTMP. In this approach you can't download the video directly, but good capture software can still record the video stream and save it to the PC.
3. Secure streaming. There are protocols such as RTMPE. I have tried only RTMPE; with this protocol, the streaming content is encrypted on the server and decrypted on the client, so capture software won't be able to grab the video stream.
Along with approach 3, if you sign the video URL, it adds more security. Hope this helps.
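As a sketch of what signing a video URL can look like on the server side (Node.js; the secret, path and expiry scheme are assumptions, and the video host has to verify the same signature before serving the file):

    // Generate an expiring, HMAC-signed video URL (sketch).
    const crypto = require('crypto');

    const SECRET = 'replace-with-a-real-secret';   // shared only with the video server

    function signVideoUrl(path, ttlSeconds) {
      const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
      const signature = crypto
        .createHmac('sha256', SECRET)
        .update(`${path}:${expires}`)
        .digest('hex');
      return `${path}?expires=${expires}&sig=${signature}`;
    }

    // The video host re-computes the HMAC and rejects expired or tampered URLs.
    console.log(signVideoUrl('/videos/lesson-01.mp4', 300)); // valid for 5 minutes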