I'm trying to build a mobile app that plays audio files, but the files are big (more than 100 MB) and I don't want to save them to the device; I want to stream the file to the browser instead.
I'm building a prototype with Ionic Framework and PhoneGap plugins. This is the project's GitHub repository:
https://github.com/cmarrero01/famvoice
My server runs Node.js, so my first attempt was to stream the audio file from Node.js to the audio tag, something like this:
var fs = require('fs');

app.get('/record/:userId/:playListId/:recordName', function (req, res) {
    var userId = req.params.userId;
    var playListId = req.params.playListId;
    var recordName = req.params.recordName;

    if (!userId || !playListId || !recordName) {
        res.end();
        return;
    }

    var path = "./records/" + recordName;
    if (!fs.existsSync(path)) {
        res.end();
        return;
    }

    res.setHeader("content-type", "audio/mpeg");
    var stream = fs.createReadStream(path, { bufferSize: 64 * 1024 });
    stream.pipe(res);
});
And in the Ionic HTML, the audio tag:
<audio controls preload="none" autoplay>
<source ng-src="{{audioPlay | trustAsResourceUrl}}" type="audio/mpeg" />
</audio>
This was a bad, bad, bad idea: when I set the src on the audio tag, the browser waits for the whole file to download before it starts playing. That isn't really streaming; it's the same as downloading the file and then playing it.
So, my second thought was ffmpeg and ffserver. I installed ffmpeg and ran ffserver with this config file:
Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 1000
CustomLog -
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
ACL allow 127.0.0.1
</Feed>
<Stream status.html>
Format status
ACL allow 127.0.0.1
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
<Stream test.mp3>
Feed feed1.ffm
Format mp2
AudioCodec libmp3lame
AudioBitRate 128
AudioChannels 1
AudioSampleRate 44100
NoVideo
</Stream>
And then ran the following command:
ffmpeg -i test.mp3 http://IP:PORT/feed1.ffm
This command sends the audio file test.mp3 to the feed, and if you access the URL http://IP:PORT/test.mp3 you can listen to it... BUT, HERE IS MY BIG PROBLEM...
My Problems are:
ffmpeg sends the file to the feed, but once the process finishes, if you try to access that URL again you can't hear anything and the page just stays loading.
Users are supposed to be able to select a file to listen to. How do I stream a specific file to a specific user without changing the stream for all users? I suppose I need one feed per user?
Is there some way to send the stream to Node.js and have Node.js send it to the app with the correct codecs and so on?
If ffmpeg is not the best way, what is?
Thanks.
Related
I'm using a video stream on a Raspberry Pi with raspivid and ffmpeg in a Node app. Running them in the terminal (without Node) will stream for hours, but when I run them as Node child processes (I spawn two, one for each) it works great for a little over 3 minutes and then the stream stops. The child processes are still running and I'm not seeing any errors.
The gist of my code:
const { spawn } = require('child_process')

let camera = spawn('raspivid', args)
let ffmpeg = spawn('ffmpeg', args)

camera.stdout.on('data', (data) => {
  ffmpeg.stdin.write(data)
})
Any ideas why it is stopping after 3 minutes? Thanks!
Use the video.mkv format.
It happened to me with the mp4 format, and when I switched to mkv it worked.
I have an HTTP server set up using Node.js that responds with an HTML file on port 3000. This HTML file has a script tag that requests a file using a relative path.
Example HTML file:
<script src="../helloworld.js"></script>
Now req.url in the Node.js HTTP server callback returns only /helloworld.js instead of ../helloworld.js.
Node.js file:
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    if (req.url == '/') {
        // read and return the html file
    } else {
        console.log(req.url); // prints /helloworld.js instead of ../helloworld.js
        // read helloworld.js from the filesystem
    }
}).listen(3000);
From the server's side, if the URL is already at the root, ../ means nothing.
That is, on the URL
www.example.com/notroot
using
../somefile.js
fetches it from
www.example.com/somefile.js
But if the URL is already at the root, i.e.:
www.example.com
using
../somefile.js
will not go any higher, since there is no parent directory to access.
Also, you don't need the fs module to fetch files on the client side; it is only used to read files on the server side (where ../ WILL work).
But since you used a script tag, I assume this is a DOM element in the client's browser and not on the server side.
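The resolution happens in the browser before the request is ever sent, which is why the server only sees the already-resolved path. Node's WHATWG `URL` class applies the same resolution rules, so you can reproduce it; the example hosts below are the ones from the answer above:

```javascript
// The browser resolves the relative path against the page URL and
// only sends the resolved path to the server. `..` from a top-level
// page cannot climb above the root, so it collapses to `/`.
const fromRoot = new URL('../helloworld.js', 'http://www.example.com/');
console.log(fromRoot.pathname); // "/helloworld.js"

// From a nested page, `..` does strip one path segment.
const fromNested = new URL('../somefile.js', 'http://www.example.com/notroot/page.html');
console.log(fromNested.pathname); // "/somefile.js"
```

So by the time the request reaches the server, the `../` prefix is already gone, and req.url can only ever be a rooted path.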
I've been trying to make a server that can visualize music (this is what I have so far). That's been successful, but I want to make it work with YouTube videos. I've found a lot of repositories on GitHub for YouTube-video-to-audio conversion that make this reasonably doable, but in order to deploy a server on Heroku that can host temporary audio files of YouTube videos in the format I want, I'd need to include ffmpeg in a buildpack, and I'm not sure how to go about doing that. This is the Heroku buildpack for Node.js, but I don't really understand how it works.
TL;DR: What steps would I need to follow after forking the heroku-buildpack-nodejs repository on github in order to successfully deploy a node.js server to heroku and run this code?
var conversionProcess = child_process.spawn(
'ffmpeg',
['-i', 'some_youtube_audio.mp3', 'some_youtube_audio.webm'],
{
cwd: __dirname + '/tmp'
}
);
The documentation for this function is in the Node.js API docs, by the way.
You should use the multi buildpack: https://github.com/ddollar/heroku-buildpack-multi
Then use the Node buildpack as well as an ffmpeg buildpack: https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest
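With heroku-buildpack-multi, the buildpacks to run are listed in a `.buildpacks` file at the root of your app, one URL per line. Something like the following should work, though the exact URLs are my assumption, so double-check them against each repo's README:

```
https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest.git
https://github.com/heroku/heroku-buildpack-nodejs.git
```

The multi buildpack then runs each listed buildpack in order, so ffmpeg ends up on the PATH before the Node buildpack builds your app.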
I'm trying to play raw PCM data, which comes in over socket.io (using Node.js), with ALSA. I previously used node-speaker, which solved my problem, but I can't install it on my target device. Now I'm trying to do it with the Node.js "fs" module:
...
var pcm = fs.createWriteStream("audio.pcm");
socket.on('stream', function(data) {
console.log(data);
pcm.write(data);
});
....
Afterwards I immediately run the aplay command:
aplay -t raw -c 1 -r 48000 -f S16_LE audio.pcm
I'm able to listen to my data with a 2-3 second delay (it depends on how quickly I ran the command above), but it crashes after 5-10 seconds without any messages. I guess this is not the right way to play live PCM. What is the best way to do it?
I am looking to read an audio/video stream using Node.js, but I haven't found anything that explains how to achieve this.
Please advise how this can be done.
Thanks
VidStreamer.js is a simple streamer for Flash and HTML5-style videos. It supports HTTP pseudostreaming and works with JW Player's bitrate switching.
You can install it using
npm install vid-streamer
then give your audio/video file paths in config/vidStreamer.json.
If that doesn't work, clone the https://github.com/meloncholy/vid-streamer.git repository and make the appropriate configuration. Sample configuration:
{
"mode": "development",
"forceDownload": false,
"random": false,
"rootFolder": "/path/to/videos/",
"rootPath": "videos/",
"server": "VidStreamer.js/0.1.4"
}
To make a standalone video streamer, try something like this:
var http = require("http");
var vidStreamer = require("vid-streamer");
var app = http.createServer(vidStreamer);
app.listen(3000);
console.log("VidStreamer.js up and running on port 3000");