Continuous audio download stream

I'm looking to set up a server which will read from some audio input device and serve that audio continuously to clients.
The audio doesn't necessarily need to be played by the client in real time; I just want the client to be able to start downloading from the point at which they join and stop when they leave.
So say the server broadcasts 30 seconds of audio data, a client could connect 5 seconds in and download 10 seconds of it (giving them 0:05 - 0:15).
Can you do this kind of partial download over TCP, starting whenever the client connects, and end up with a playable audio file?
Sorry if this question is a bit too broad and not a 'how do I set variable x to y' kinda question. Let me know if there's a better forum to post this in.

Disconnect the concepts of file and connection. They're not related. A TCP connection simply supports the reliable transfer of data, nothing more. What your application chooses to send over that connection is its business, so you need to design your application so that it sends the data you want.
It sounds like what you want is a simple progressive HTTP internet radio stream, which is commonly provided by SHOUTcast and Icecast servers. I recommend Icecast to get started. The user connects, they optionally get a small buffer of a few seconds up front to get them started, and when they disconnect, that's it.
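The same idea can be sketched without Icecast in a few lines of Node.js. This is a minimal sketch, assuming a continuous MP3 stream arrives on stdin (e.g. piped in from ffmpeg); every connected client simply receives whatever bytes arrive after the moment it joined:

// fanout.js - minimal sketch; assumes an MP3 stream on stdin,
// e.g.: ffmpeg -f alsa -i default -f mp3 - | node fanout.js
const http = require('http');

const clients = new Set();

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  clients.add(res);                            // feed from this moment on
  req.on('close', () => clients.delete(res));  // stop when the client leaves
}).listen(8000);

// Each chunk read from the input is written to every connected client,
// so each client's download begins wherever the live stream happens to be.
process.stdin.on('data', (chunk) => {
  for (const res of clients) res.write(chunk);
});

MP3 suits this because decoders resynchronize on the next frame header, so a download that starts mid-stream still ends up playable.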

Related

Nodejs Audio Stream - Main Producing Client -> Server -> Multiple client listeners

I want to build an internet radio station using nodejs. My architecture needs to be like the following: there is one producer who records live audio, and this live audio data needs to be sent to the server. On the server side, I need to save the live audio data in some audio format (for future playback) and also simultaneously stream the live audio to multiple clients. Can somebody please point me towards some implementations or libraries available to achieve this? I have read several posts and stackoverflow answers but couldn't find anything related to my needs.
Is WebRTC needed? I don't want clients to get a peer-to-peer connection, as I also want to save the live audio on the server. Any help would be appreciated.
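A hedged sketch of that save-and-relay server in Node: the /ingest and /listen endpoint names are invented for illustration, and the producer is assumed to POST an encoded audio stream:

// relay.js - sketch only; /ingest and /listen are hypothetical endpoints.
const http = require('http');
const fs = require('fs');

const listeners = new Set();
const archive = fs.createWriteStream('broadcast.mp3'); // saved for future playback

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/ingest') {
    // The producer streams encoded audio in the request body.
    req.on('data', (chunk) => {
      archive.write(chunk);                       // keep a copy on disk
      for (const l of listeners) l.write(chunk);  // and relay it live
    });
    req.on('end', () => res.end());
  } else if (req.url === '/listen') {
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
    listeners.add(res);
    req.on('close', () => listeners.delete(res));
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8000);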

Ways to broadcast audio from WebAudio API to server-side and then to connected clients

I am developing a collaborative instrument playing game, where multiple users will play an instrument (a synthesizer or sampler, using the WebAudio API). In my first prototype I've set up a keyboard that sends note/volume signals via Socket.io to the server, and when the server gets that signal it sends it back to all connected sockets, which will play the corresponding note.
You might have guessed it right: there's a massive amount of lag and inconsistency as to the order of arrival of notes.
What are some efficient ways that I can send the output of WebAudio to the server, and have it broadcast to all connected users, so I have some sort of consistency?
You could try using a MediaStream by adding a MediaStreamAudioDestinationNode to your audio node graph as a destination, and use that stream with either WebRTC or RecordRTC to send it to your server.
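A browser-side sketch of that node graph, with an oscillator standing in for the actual instrument and the standard MediaRecorder used in place of RecordRTC:

// Browser-side sketch: route the Web Audio graph into a MediaStream.
const ctx = new AudioContext();
const osc = ctx.createOscillator();               // stands in for the instrument
const dest = ctx.createMediaStreamDestination();  // exposes dest.stream
osc.connect(dest);
osc.connect(ctx.destination);                     // optional local monitoring
osc.start();

// dest.stream is a MediaStream, so it can feed WebRTC or a recorder.
const recorder = new MediaRecorder(dest.stream);
recorder.ondataavailable = (e) => {
  // e.data is an encoded Blob chunk; POST it or socket.emit it to the server.
};
recorder.start(250); // deliver a chunk roughly every 250 ms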
Here is some info I found you could look at.
It does talk about using the getUserMedia method, but both getUserMedia and MediaStreamAudioDestinationNode produce a MediaStream object. This info has some ideas on how you could send a MediaStream to your server. However, it does say that the stream needs to be recorded first; it can't be sent while it's live and running.
Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia
I hope this helps :)

WebSocket Streaming dual videos

Wanted to clear up a few questions about WebSocket.
Is it possible to stream video from server to client and client to server at the same time, something like video calling?
Can the server stream two videos to a single client at a time?
Regarding the first question, yes you can. There are already wrappers that simplify that task, such as BinaryJS.
As for your second question, it would require a little extra work. Once a bidirectional link between client and server is established, the client will treat every incoming message as part of the same stream, so multiplexing two videos in one stream means each message has to carry a marker that lets the client separate them (see the sketch below).
It would be a better idea to open a second connection (to the same server) to stream the second video.
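A sketch of that marker idea, assuming the ws npm package on the server; the one-byte stream ID framing is a convention invented here, not part of WebSocket itself:

// Server-side sketch using the ws package; the 1-byte stream ID is our own framing.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

// Prefix each binary message with one byte saying which video it belongs to.
function send(ws, streamId, chunk) {
  ws.send(Buffer.concat([Buffer.from([streamId]), chunk]));
}

wss.on('connection', (ws) => {
  // In reality these chunks would come from two live video encoders.
  send(ws, 0, Buffer.from('chunk of video A'));
  send(ws, 1, Buffer.from('chunk of video B'));
});

// Browser side, demultiplexing:
//   ws.binaryType = 'arraybuffer';
//   ws.onmessage = (e) => {
//     const bytes = new Uint8Array(e.data);
//     const streamId = bytes[0];           // 0 or 1
//     const payload = bytes.subarray(1);   // hand to the matching decoder
//   };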

Node.js module for WebRTC data channels usage?

I am writing a multiplayer real-time game for the browser with the server as a master instance and the clients as input devices and slaves to show the graphics.
I have to send out changes in the game world very often and very fast, and it doesn't matter if some of the data occasionally gets lost on the way, because a couple of milliseconds later the next update will arrive anyway.
Right now I am using Socket.io to talk between the server and the clients, but this uses TCP, which sometimes makes updates arrive unnecessarily late.
I know that there is WebRTC with data channels, where I would be able to send my updates over UDP, which would be very awesome and exactly what I want. And it even seems to be implemented in Firefox and Chrome already: https://stackoverflow.com/a/12864512/63779
What I need now is some Node library which would allow me to use data channels to send my data (for now just JSON strings) over UDP to the clients, which are browsers. In the browser I would be able to use webkitRTCPeerConnection(), but I have no idea how to start something like that on the Node server. Any suggestions? If there is no Node module for that, would it be possible to write something in some other language and just send the data via Unix domain sockets or something?
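A hedged sketch of the data-channel setup, assuming the wrtc (node-webrtc) npm package, which exposes RTCPeerConnection to Node; signaling (the offer/answer exchange) is omitted and could run over the existing Socket.io connection:

// Sketch using the wrtc package; signaling is omitted.
const { RTCPeerConnection } = require('wrtc');

const pc = new RTCPeerConnection();

// ordered:false + maxRetransmits:0 gives UDP-like delivery over SCTP:
// a lost or late update is simply dropped instead of delaying newer ones.
const channel = pc.createDataChannel('game-state', {
  ordered: false,
  maxRetransmits: 0,
});

channel.onopen = () => {
  setInterval(() => {
    channel.send(JSON.stringify({ t: Date.now(), state: {} }));
  }, 50); // ~20 world updates per second
};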

how to create a RTSP streaming server

So I am trying to create an RTSP server that streams music.
I do not understand how the server plays music and how different requests get whatever is playing at that time.
So, to organize my questions:
1) how does the server play a music file?
2) what does the request to the server look like to get what's currently playing?
3) what does the response look like, to get the music playing in the client that requested it?
First: READ THIS (RTSP), and THEN READ THIS (SDP), and then READ THIS (RTP). Then you can ask more sensible questions.
It doesn't; the server streams little parts of the audio data to the client, telling it when each part is to be played.
There is no such request. If you want, you can have a URL for live streaming and, in response to the RTSP DESCRIBE request, tell the client what is currently on.
Read the first (RTSP) document; it's all there! The answer to your question looks like this:
RTSP/1.0 200 OK
CSeq: 3
Session: 123456
Range: npt=now-
RTP-Info: url=trackID=1;seq=987654
But to get the music playing you will have to do a lot more to initiate a streaming session.
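To make the "little parts with timing" answer concrete, here is a hedged sketch of building a single RTP packet in Node. The 12-byte header layout comes from RFC 3550; the addresses, sizes, and clock numbers are illustrative, and payload-format specifics (e.g. RFC 2250 for MPEG audio) are glossed over:

// Sketch of one RTP packet: the timestamp field tells the client when
// the enclosed audio samples should be played (header per RFC 3550).
const dgram = require('dgram');
const sock = dgram.createSocket('udp4');

function rtpPacket(seq, timestamp, ssrc, payloadType, payload) {
  const header = Buffer.alloc(12);
  header[0] = 0x80;                          // version 2, no padding/extension
  header[1] = payloadType & 0x7f;            // e.g. 14 = MPEG audio (RFC 3551)
  header.writeUInt16BE(seq & 0xffff, 2);     // sequence number (detects loss)
  header.writeUInt32BE(timestamp >>> 0, 4);  // media clock: sets playout time
  header.writeUInt32BE(ssrc >>> 0, 8);       // identifies this stream
  return Buffer.concat([header, payload]);
}

let seq = 0, ts = 0;
const chunk = Buffer.alloc(418);             // stand-in for one encoded MP3 frame
sock.send(rtpPacket(seq++, ts += 2160, 0x1234abcd, 14, chunk), 5004, '127.0.0.1');
// 2160 ticks = 24 ms at the 90 kHz RTP clock used for MPEG audio.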
You should first be clear about what RTSP and RTP are. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in communications systems to control streaming media servers, whereas most RTSP servers use the Real-time Transport Protocol (RTP) for media stream delivery. RTP uses UDP to deliver the packet stream. Try to understand these concepts.
Then have a look at this project:
http://sourceforge.net/projects/unvedu/
This is an open-source project developed by our university, which is used to stream video (MKV) and audio files over UDP.
You can also find a .NET implementation of RTP and RTSP at https://net7mma.codeplex.com/ which includes an RTSP client and server implementation and many other useful utilities, e.g. implementations of many popular digital media container formats.
The solution has a modular design and better performance than ffmpeg or libav at the current time.
