Re-stream icecast stream through nodejs - node.js

I want to play our Icecast stream through Node.js so that I can read metadata and push another audio file in at key parts.
What I am wondering is why the following script won't allow a user to hear the stream.
var http = require('http'),
    request = require('request'),
    remote = 'http://stream.radiomedia.com.au:8003/stream';

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'audio/mpeg',
    'Content-Length': 1500
  });
  // e.g. http://somewhere.com/noo.bin
  var remoteUrl = remote + req.url;
  request(remoteUrl).pipe(res);
}).listen(8080);

'Content-Length': 1500
That's your primary problem. You need to leave the Content-Length unspecified, as it's indefinite for your stream.
Also, this will cause the server to use chunked transfer encoding, which many clients these days can handle just fine. Some can't, so if legacy client compatibility matters to you, you'll have to disable chunked transfer encoding.
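For reference, a minimal corrected version of the handler might look like this (the commented-out line is the chunked-encoding workaround; whether you need it depends on your clients):
var http = require('http'),
    request = require('request'),
    remote = 'http://stream.radiomedia.com.au:8003/stream';

http.createServer(function (req, res) {
  // Uncomment for legacy clients that can't handle chunked encoding:
  // res.useChunkedEncodingByDefault = false;

  // No Content-Length header: the stream is indefinite.
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  request(remote + req.url).pipe(res);
}).listen(8080);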
play our Icecast stream through Node.js, so that I can read metadata and push another audio file at key parts.
This isn't a trivial thing to do. MP3 uses the concept of a bit reservoir, so you cannot arbitrarily trim the stream, even on frame boundaries, unless you disable the bit reservoir on the encoder, which causes a pretty significant degradation in quality.
For more information, see my answer here: Is it possible to splice advertisements or messages dynamically into an MP3 file via a standard GET request?

Related

minimize latency and bandwidth usage between node.js servers

I have a Node.js script which sends data frequently to a Node.js server. Right now the client uses the request module to send POST requests, and the server handles them with the built-in http server.
client:
const request = require('request');

// some iteration over time for sending events
request.post('http://localhost:8080/event',
  { json: true, body: body },
  (err, res, body) => {
    callback(err)
  })
Server:
const http = require('http')

const server = http.createServer(function(req, res) {
  let body = []
  req.on('data', body.push.bind(body))
  req.on('end', () => {
    console.log(Buffer.concat(body).toString())
    res.end()
  })
})
I want to minimize the bandwidth usage and latency.
Here are my thoughts:
I was thinking of using HTTP/2, as it may have better performance (not sure, just from some R&D), but it's still experimental in Node.js.
Right now it's using HTTP; will WebSocket make any difference in this case, since a socket can send data over a single connection without sending headers every time?
What would be the best approach to minimize the latency and bandwidth usage?
Note: I am not expecting to use any third-party service provider for this (I want to improve things from a coding perspective).
If the question is WebSocket vs. HTTP/2, I'd go with WebSocket.
WebSocket frames are a bit smaller: 2 to 14 bytes of overhead per frame, compared to HTTP/2's fixed 9 bytes. For small messages, therefore, WebSocket offers less overhead, which should result in better performance.¹
As of right now, WebSocket is still better supported in servers, proxies and firewalls.
If the question is more general "how to reduce latency", then I'd abandon JavaScript and switch to C++ and zero-copy binary communication.
¹ Though, to be sure, benchmark both and compare.
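As a rough illustration of the WebSocket route with the ws package (the port and message shape are made up for the example): there is a single persistent connection, so the per-request HTTP headers disappear after the initial handshake.
const WebSocket = require('ws');

// Server: receive events over one long-lived connection.
const wss = new WebSocket.Server({ port: 8080 });
wss.on('connection', (socket) => {
  socket.on('message', (message) => {
    console.log('event received:', message.toString());
  });
});

// Client: connect once, then send many small messages.
const ws = new WebSocket('ws://localhost:8080');
ws.on('open', () => {
  setInterval(() => ws.send(JSON.stringify({ t: Date.now() })), 1000);
});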

What is the most efficient way of sending files between NodeJS servers?

Introduction
Say that on the same local network we have two Node.js servers set up with Express: Server A for the API and Server F for the form.
Server A is an API server that takes the request and saves it to a MongoDB database (files are stored as Buffers and their details as other fields).
Server F serves up a form, handles the form post, and sends the form's data to Server A.
What is the most efficient way to send files between two NodeJS servers where the receiving server is Express API? Where does the file size matter?
1. HTTP Way
If the files I'm sending are PDF files (that won't exceed 50 MB), is it efficient to send the whole contents as a string over HTTP?
The algorithm is as follows:
Server F handles the file request using https://www.npmjs.com/package/multer and saves the file
then Server F reads this file and makes an HTTP request via https://github.com/request/request along with some details on the file
Server A receives this request and turns the file contents from string to Buffer and saves a record in MongoDB along with the file details.
In this algorithm, both Server A (when storing into MongoDB) and Server F (when sending it over to Server A) have read the file into memory, and the request between the two servers was about the same size as the file. (Are 50 MB requests alright?)
However, one thing to consider is that, with this method, I would be using the Express style of API for the whole process, and it would be consistent with the rest of the app, where the /list and /details requests are also defined in the routes. I like consistency.
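One way to avoid both servers holding the whole file in memory would be to stream the upload into the request instead of reading it first; the request package supports piping, so a sketch on Server F might look like this (the URL and the use of multer's req.file.path are assumptions for illustration):
var fs = require('fs');
var request = require('request');

// Stream the saved upload straight to Server A instead of
// buffering the whole file on Server F first.
fs.createReadStream(req.file.path)
  .pipe(request.post('http://server-a.local/api/files'))
  .on('response', function (response) {
    console.log('Server A responded with', response.statusCode);
  });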
2. Socket.IO Way
In contrast to this algorithm, I've explored the https://github.com/nkzawa/socket.io-stream way, which broke away from the consistency of the HTTP API on Server A (as the handlers for socket.io events are defined not in the routes but in the file that has var server = http.createServer(app);).
Server F handles the form data as such in routes/some_route.js:
router.post('/', multer({dest: './uploads/'}).single('file'), function (req, res) {
  var api_request = {};
  api_request.name = req.body.name;
  //add other fields to api_request ...
  var has_file = req.hasOwnProperty('file');
  var io = require('socket.io-client');
  var transaction_sent = false;
  var socket = io.connect('http://localhost:3000');
  socket.on('connect', function () {
    console.log("socket connected to 3000");
    if (transaction_sent === false) {
      var ss = require('socket.io-stream');
      var stream = ss.createStream();
      ss(socket).emit('transaction new', stream, api_request);
      if (has_file) {
        var fs = require('fs');
        var filename = req.file.destination + req.file.filename;
        console.log('sending with file: ', filename);
        fs.createReadStream(filename).pipe(stream);
      }
      if (!has_file) {
        console.log('sending without file.');
      }
      transaction_sent = true;
      //get the response via socket
      socket.on('transaction new sent', function (data) {
        console.log('response from 3000:', data);
        //there might be a better way to close socket. But this works.
        socket.close();
        console.log('Closed socket to 3000');
      });
    }
  });
});
I said I'd be dealing with PDF files that are < 50 MB. However, if I use this program to send larger files in the future, is socket.io a better way to handle 1 GB files, since it uses streams?
This method does send the file and the details across but I'm new to this library and don't know if it should be used for this purpose or if there is a better way of utilizing it.
Final thoughts
What alternative methods should I explore?
Should I send the file over SCP and make an HTTP request with the file details, including where I've sent it, thus separating the protocols for files and API requests?
Should I always use streams because they don't store the whole file in memory? (That's how they work, right?)
This https://github.com/liamks/Delivery.js ?
References:
File/Data transfer between two node.js servers: this got me to try the socket-stream way.
transfer files between two node.js servers over http: for the HTTP way.
There are plenty of ways to achieve this, but not as many to do it right!
Socket.IO and WebSockets are efficient when you use them with a browser, but since you don't, there is no need for them.
The first method you can try is the built-in net module of Node.js: it will make a TCP connection between the servers and pass the data.
You should also keep in mind that you need to send chunks of data, not the entire file; the socket.write method of the net module seems to be a good fit for your case. Check it out: https://nodejs.org/api/net.html
But depending on the size of your files and the concurrency, memory consumption can be quite large.
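A minimal sketch of that approach (the host name, port, and file paths are placeholders); piping fs.createReadStream into the TCP socket sends the file chunk by chunk, so neither side has to hold the whole file in memory:
var net = require('net');
var fs = require('fs');

// Server A: write incoming chunks straight to disk as they arrive.
net.createServer(function (socket) {
  socket.pipe(fs.createWriteStream('./received.pdf'));
}).listen(7000);

// Server F: open a TCP connection and stream the file over it.
var client = net.connect(7000, 'server-a.local', function () {
  fs.createReadStream('./upload.pdf').pipe(client);
});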
If you are running Linux on both servers, you could even skip Node entirely and send the files with a simple Linux command called scp:
nohup scp -rpC /var/www/httpdocs/* remote_user@remote_domain.com:/var/www/httpdocs &
You can even do this from Windows to Linux, or the other way around.
http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
The scp client for Windows is pscp.exe.
Hope this helps!

Stream audio simultaneously from soundcloud source with node

I am using the SoundCloud API from a Node server. I want to stream an audio track simultaneously to multiple users.
I tried something like this (using the code from this question: Streaming audio from a Node.js server to HTML5 <audio> tag), but it does not work. Any idea how I could do this?
var radio = require("radio-stream");
var http = require('http');
var url = "http://api.soundcloud.com/tracks/79031167/stream?client_id=db10c5086fe237d1718f7a5184f33b51";

var stream = radio.createReadStream(url);
var clients = [];

stream.on("connect", function() {
  console.error("Radio Stream connected!");
  console.error(stream.headers);
});

stream.on("data", function (chunk) {
  if (clients.length > 0) {
    for (client in clients) {
      clients[client].write(chunk);
    }
  }
});

stream.on("metadata", function(title) {
  console.error(title);
});

var server = http.createServer(function(req, res) {
  res.writeHead(200, {
    "Content-Type": "audio/mpeg",
    'Transfer-Encoding': 'chunked'
  });
  clients.push(res);
  console.log('Client connected; streaming');
});

server.listen("8000", "0.0.0.0");
console.log('Server running at http://127.0.0.1:8000');
There are several problems
Follow Redirects
The radio-stream module that you're using hasn't been updated in 4 years. That's an eternity in Node.js time. I recommend not using it, as there are undoubtedly compatibility issues with current and future versions of Node.js. At a minimum, there are much better ways of handling this now with the new streams API.
In any case, that module does not follow HTTP redirects. The SoundCloud API is redirecting you to the actual media file.
Besides, the radio-stream module is built to demux SHOUTcast/Icecast style metadata, not MP3 ID3 data. It won't help you.
All you need is a simple http.get(). You can then either follow the redirect yourself, or use the request package. More here: How do you follow an HTTP Redirect in Node.js?
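For illustration, a sketch of following the redirect by hand with http.get (error handling omitted; note that if the Location header points at an https URL, you would need the https module instead):
var http = require('http');

function getFollowingRedirects(url, callback) {
  http.get(url, function (res) {
    // The SoundCloud API answers with a redirect to the media file.
    if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
      getFollowingRedirects(res.headers.location, callback);
    } else {
      callback(res);
    }
  });
}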
Chunked Encoding
Many streaming clients cannot deal with chunked encoding. Node.js (correctly) adds it when you have streaming output. For our purposes though, let's disable it.
res.useChunkedEncodingByDefault = false;
https://stackoverflow.com/a/11589937/362536
Building a Coherent Stream
In theory, you can just append MPEG stream after MPEG stream and all will work fine. In practice, this doesn't work. ID3 tags will corrupt the stream. One file might be in a different sample rate than the other file and most software will not be able to switch the hardware to that new sample rate on the fly. Basically, you cannot reliably do what you're trying to do.
The only thing you can do is re-encode the entire stream by playing back these audio files, and getting a solid stream out the other end. This gives you the added bonus that you can handle other codecs and formats, not just MP3.
To handle many of your codec issues, you can utilize FFmpeg. However, you're going to need a way to play back those files to FFmpeg for encoding.
Rate Limiting
You must stream audio at the rate of playback. (You can send an initial buffer to get clients started quickly, but you can't keep slamming data to them as fast as possible.) If you don't do this, you will run out of memory on the server very quickly, as clients will lower their TCP window size down to zero and stay there until the audio has caught up enough to allow buffering more data. Since you're not using pipe, your streams are in flowing mode and will buffer indefinitely on the server. Now, this is actually a good thing in some ways, because it prevents one slow client from slowing down the others. It's a bad thing, though, in that your code is streaming as fast as possible and not at the rate of playback.
If you play back the audio to another encoder, use the RTC (real-time clock) over several seconds as a clock. It doesn't have to be perfect; that's what client buffers are for. If you're playing back to an audio device, it has its own clock, of course, which will be used.
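As a rough sketch of pacing with a clock (the bitrate constant and the clients array from the question's code are assumptions; a real implementation would track clock drift rather than trust setInterval):
var BYTES_PER_SECOND = 128000 / 8; // assuming a 128 kbps MP3 stream
var INTERVAL_MS = 250;
var bytesPerTick = BYTES_PER_SECOND * (INTERVAL_MS / 1000);

var queue = []; // chunks buffered from the source stream

setInterval(function () {
  var budget = bytesPerTick;
  // Push roughly one tick's worth of audio to every client.
  while (budget > 0 && queue.length > 0) {
    var chunk = queue.shift();
    budget -= chunk.length;
    clients.forEach(function (client) {
      client.write(chunk);
    });
  }
}, INTERVAL_MS);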
What you should actually do
You've stumbled into a huge project. I strongly recommend using Liquidsoap instead. There are ways you can control it from Node.js. From there, use a server like Icecast for your streaming.
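As an example of driving it from Node.js: Liquidsoap can expose a telnet command interface (it has to be enabled in your Liquidsoap config; port 1234 is its conventional default), which the net module can talk to. The command sent here is just 'help' for illustration:
var net = require('net');

// Assumes Liquidsoap is running with its telnet server enabled.
var ls = net.connect(1234, '127.0.0.1', function () {
  ls.write('help\n'); // lists the commands your Liquidsoap script exposes
});

ls.on('data', function (data) {
  console.log(data.toString());
});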

node (socket) live audio stream / broadcast

Please, is there any easy way to stream (broadcast) a media file (ogg, mp3, spx...) from the server to the client (browser) via Node.js and possibly Socket.IO?
I have to record audio input on the server side and then be able to play it in realtime for many clients.
I've been messing with binary.js or socket.io streams but wasn't able to get it right.
I've tried to encode audio input with Speex, Vorbis or LAME and then load it via fs to the client, but I haven't been successful. Or do I have to capture PCM and then decode it in the browser?
Any suggestion on this; nothing I've found has ever helped me.
Many thanks for any tips, links and ideas.
You'll want to look for packages that work with Streams; from there it's just about piping your streams to the output as necessary. Using Express or just the built-in HTTP server, you can accomplish this quite easily. Here's an example built around osx-audio, which provides a PCM stream; lame, which can encode a stream to MP3; and Express:
var Webcast = function(options) {
  var lame = require('lame');
  var audio = require('osx-audio');
  var fs = require('fs');

  // create the Encoder instance
  var encoder = new lame.Encoder({
    // input
    channels: 2,        // 2 channels (left and right)
    bitDepth: 16,       // 16-bit samples
    sampleRate: 44100,  // 44,100 Hz sample rate
    // output
    bitRate: options.bitrate,
    outSampleRate: options.samplerate,
    mode: (options.mono ? lame.MONO : lame.STEREO) // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  });

  var input = new audio.Input();
  input.pipe(encoder);

  // set up an express app
  var express = require('express')
  var app = express()
  app.get('/stream.mp3', function (req, res) {
    res.set({
      'Content-Type': 'audio/mpeg', // note: 'audio/mpeg3' is not a standard MIME type
      'Transfer-Encoding': 'chunked'
    });
    encoder.pipe(res);
  });

  var server = app.listen(options.port);
}

module.exports = Webcast;
How you get your input stream might be the most interesting part, but that will depend on your implementation. The popular request package is built around Streams as well though, so it might just be an HTTP request away!
On the web browser side you have the HTML5 video element and the audio element; both of them take sources. Each web browser supports different codecs natively, so you'll want to watch out for that if you're trying to stream MP3.
You don't need socket.io; you only need HTTP. Your app reads a file, music.ogg, and for each chunk it reads, it sends it through the HTTP server. It will be one single HTTP request that's kept open until the file is transferred.
Here's how your html will look:
<audio src="http://example.com/music.ogg"></audio>
And your nodejs code will be something like this (haven't tested this):
var http = require('http');
var fs = require('fs');

http.createServer(function(request, response) {
  var inputStream = fs.createReadStream('/path/to/music_file.ogg');
  inputStream.pipe(response);
}).listen(8080);
I'm only using the ReadableStream.pipe method on the inputStream, plus the http and fs modules, in the above code. If you want to transcode the audio file (for example, from mp3 to ogg), you'll want to find a module that does that and pipe the data from the file into the transcoder and then into the response:
// using some magical transcoder
inputStream.pipe(transcoder).pipe(response);
pipe will call end on the response whenever the input stream finishes, so the HTTP request will be finished as soon as the file is done being read (and transcoded).
You can do this with Node and WebRTC. There are some tools ready to use, like SimpleWebRTC or EasyRTC. From what I've already tested, video is still troublesome, but audio works great.

Store WebM file in Redis (NodeJS)

I'm searching for a solution to store a WebM file in Redis.
Let me explain the situation:
The Node.js server receives a WebM file from a client and saves it to the server's file system.
Then it has to save this file in Redis, because I don't want to manage both Redis and the file system. This way I can delete the video with just a Redis command.
I was thinking of reading the file with fs.readFile() and then saving it into a Buffer, but I don't know which encoding to use, and I don't know how to reverse this process to give the WebM video back to a client when it makes a request.
Is this a good way to proceed? Any suggestions?
PS: I use formidable to upload the file.
EDIT: I found a way to proceed, but there's another problem:
var file = fs.readFileSync("./video.webm");
client.set("video1", file, function(){
  client.get("video1", function(err, data) {
    var buffer = new Buffer(data, 'binary');
    // file ≠ buffer
  });
});
Is this an encoding problem, like Unicode/UTF-8/ASCII?
Maybe Node and Redis use different encodings?
Solution found!
The problem arises when you create the client object.
Usually this is what is done:
var client = redis.createClient();
and the return_buffers param defaults to false.
Whereas if you create it this way:
var client = redis.createClient(6379, '127.0.0.1', {
  return_buffers: true,
  auth_pass: null
});
everything goes right! ;)
This is the issue page that helped me.
I don't know much about Node.js and WebM files.
Redis stores strings as arrays of 8-bit C chars, so it is binary-safe. Check the JS code and configuration to ensure your JS Redis client sends and receives data as byte arrays rather than UTF-8 strings; there is probably a bad conversion of the data in JS.
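Building on the accepted fix, a quick way to verify the round trip is binary-safe: with return_buffers: true, Buffer.compare should report the stored bytes as identical to the original file (0 means equal):
var fs = require('fs');
var redis = require('redis');

var client = redis.createClient(6379, '127.0.0.1', { return_buffers: true });
var file = fs.readFileSync('./video.webm');

client.set('video1', file, function () {
  client.get('video1', function (err, data) {
    console.log(Buffer.compare(file, data)); // 0: identical bytes
  });
});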
