Play video from a memory stream inside a xamarin iOS app - xamarin.ios

I am building an app in Xamarin that users will use for playing movies. I need the app to play the movie from a memory stream, not from a URL or file directly.
Is there a player control or library that I can use to play the movie from a memory stream?

Since there is no API for this, what you can do is serve your in-memory representation of the media file at a URL.
Use the HttpListener class to create an HTTP server embedded in your application, have it listen on, say, port 9000, and then have the movie player use an NSUrl like this:
new NSUrl ("http://127.0.0.1:9000/myfile.mov")

Generally it's best to use MPMoviePlayerViewController (doc here) for playing videos on iOS. Sadly, it requires an NSUrl to play the video.
What you'll have to do in order to use it, is save your stream to a temporary file.
Such as:
string path = Path.GetTempFileName();
using (var yourStream = GetYourStream())
using (var fileStream = File.Create(path))
{
await yourStream.CopyToAsync(fileStream);
}
var controller = new MPMoviePlayerViewController(NSUrl.FromFilename(path));
//show the controller
Not quite ideal, but it should work.

Related

How to save a webRTC stream into a file on server with nodejs?

I get my stream from my client like this:
webrtc_connection.ontrack = async (e) => {
//TODO : RECORD
}
How can I record / save it into a file on server? Apparently nodejs does not have MediaRecorder, so I am at loss for going further.
There are two options. The first is to use MediaRecorder + Socket.IO + FFmpeg. Here is an example of how to stream from the browser to RTMP via node.js; instead of streaming, you can just save it to a file:
1. Draw your video on a canvas and use canvas.captureStream() to get a MediaStream from the canvas;
2. append your audio to the MediaStream you got in the previous step using MediaStream.addTrack();
3. use MediaRecorder to get raw data from the MediaStream;
4. send this raw data via WebSockets to node.js;
5. use FFmpeg to decode and save your video data to a file.
The second is to use node-webrtc. You can join your WebRTC room from the server as another participant and record the media tracks using FFmpeg. Here is an example.

best way to save images on a mongoose domain

I'm new to node.js and I'm trying to make an application which saves photos of users, just like a normal application. The user can set a profile picture and could add other pictures to their wall as well.
I'm also done with the other parts of my application, but I'm trying to figure out the best way to save those images, since my application should be able to scale to a large number of users.
I referenced :
How to upload, display and save images using node.js and express ( to save images on server)
and also: http://blog.robertonodi.me/managing-files-with-node-js-and-mongodb-gridfs/ (to save images on mongo via grid-fs)
and I'm wondering what would be the best option.
So, could you please suggest which approach I should go with?
Thanks,
It depends on your application's needs. One thing I've done for a similar application was to create an abstraction over the file storage logic of the server application:
var DiskStorage = require('./disk');
var S3Storage = require('./s3');
var GridFS = require('./gridfs');

function FileStorage(opts) {
  if (opts.type === 'disk') {
    this.storage = new DiskStorage(opts);
  }
  if (opts.type === 's3') {
    this.storage = new S3Storage(opts);
  }
  if (opts.type === 'gridfs') {
    this.storage = new GridFS(opts);
  }
}

FileStorage.prototype.store = function(opts, callback) {
  this.storage.store(opts, callback);
};

FileStorage.prototype.serve = function(filename, stream, callback) {
  this.storage.serve(filename, stream, callback);
};

module.exports = FileStorage;
Basically you will have different implementations of the logic for storing user-uploaded content, and when you need to, you can scale from local file storage or Mongo GridFS to, say, S3. For a seamless transition, when you store the user-file relationship in your database, also store the file provider (local or S3).
Saving images directly to the local file system can sometimes get complicated once there is a lot of uploaded content; you can easily run into limitations like How many files can I put in a directory?. GridFS should not have such a problem, and I've had a pretty good experience using MongoDB for file storage, but this varies from application to application.
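To illustrate storing the provider alongside the relationship, the record kept for each upload can carry the backend name, so files can later be served from (or migrated between) backends. This is a minimal sketch with a made-up record shape, not tied to any particular ODM:

```javascript
// Backends the FileStorage abstraction might dispatch to.
const PROVIDERS = ['disk', 's3', 'gridfs'];

// Build the metadata record stored alongside the user -> file relationship.
function fileRecord(userId, filename, provider) {
  if (!PROVIDERS.includes(provider)) {
    throw new Error('Unknown storage provider: ' + provider);
  }
  return { userId: userId, filename: filename, provider: provider };
}
```

When serving, the record tells you which backend to read from, so migrating old files to S3 becomes an update of the `provider` field plus a copy of the bytes.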

Is it possible to read music meta data from a Meteor JS app?

I understand it is possible to access a device's native functions with Meteor JS (like camera, geolocation, accelerometer, etc).
Is there some way of getting the meta data of music that is currently playing in a Meteor JS app? (or any other broadcasted data for that matter)
If you use the HTML5 player (<audio> or <video> tags), there is an event you can track in Meteor:
Template.AudioPlayer.events({
  'loadedmetadata #audio_player': function (e) {
    // and you can access the metadata from e.target.whatever
  }
});

Stream audio simultaneously from soundcloud source with node

I am using the Soundcloud api from a node server. I want to stream an audio track simultaneously to multiple users.
I tried something like this (using the code from this question: Streaming audio from a Node.js server to HTML5 <audio> tag), but it does not work. Any idea how I could do this?
var radio = require("radio-stream");
var http = require('http');

var url = "http://api.soundcloud.com/tracks/79031167/stream?client_id=db10c5086fe237d1718f7a5184f33b51";
var stream = radio.createReadStream(url);

var clients = [];

stream.on("connect", function() {
  console.error("Radio Stream connected!");
  console.error(stream.headers);
});

stream.on("data", function (chunk) {
  if (clients.length > 0) {
    for (var client in clients) {
      clients[client].write(chunk);
    }
  }
});

stream.on("metadata", function(title) {
  console.error(title);
});

var server = http.createServer(function(req, res) {
  res.writeHead(200, {
    "Content-Type": "audio/mpeg",
    'Transfer-Encoding': 'chunked'
  });
  clients.push(res);
  console.log('Client connected; streaming');
});

server.listen("8000", "0.0.0.0");
console.log('Server running at http://127.0.0.1:8000');
There are several problems
Follow Redirects
The radio-stream module that you're using hasn't been updated in 4 years. That's an eternity in Node.js API terms. I recommend not using it, as there are undoubtedly compatibility issues with current and future versions of Node.js. At a minimum, there are much better ways of handling this now with the new streams API.
In any case, that module does not follow HTTP redirects. The SoundCloud API is redirecting you to the actual media file.
Besides, the radio-stream module is built to demux SHOUTcast/Icecast style metadata, not MP3 ID3 data. It won't help you.
All you need is a simple http.get(). You can then either follow the redirect yourself, or use the request package. More here: How do you follow an HTTP Redirect in Node.js?
Chunked Encoding
Many streaming clients cannot deal with chunked encoding. Node.js (correctly) adds it when you have streaming output. For our purposes though, let's disable it.
res.useChunkedEncodingByDefault = false;
https://stackoverflow.com/a/11589937/362536
Building a Coherent Stream
In theory, you can just append MPEG stream after MPEG stream and all will work fine. In practice, this doesn't work. ID3 tags will corrupt the stream. One file might be in a different sample rate than the other file and most software will not be able to switch the hardware to that new sample rate on the fly. Basically, you cannot reliably do what you're trying to do.
The only thing you can do is re-encode the entire stream by playing back these audio files, and getting a solid stream out the other end. This gives you the added bonus that you can handle other codecs and formats, not just MP3.
To handle many of your codec issues, you can utilize FFmpeg. However, you're going to need a way to play back those files to FFmpeg for encoding.
Rate Limiting
You must stream audio at the rate of playback. (You can send an initial buffer to get clients started quickly, but you can't keep slamming data at them as fast as possible.) If you don't do this, you will run out of memory on the server very quickly, as clients will lower their TCP window size to zero and stay there until the audio has caught up enough to allow buffering more data. Since you're not using pipe, your streams are in flowing mode and will buffer indefinitely on the server. That's actually a good thing in some ways, because it prevents one slow client from slowing down the others. It's a bad thing, though, in that your code streams as fast as possible rather than at the rate of playback.
If you play back the audio to another encoder, use the real-time clock sampled over several seconds as your reference. It doesn't have to be perfect; that's what client buffers are for. If you're playing back to an audio device, it has its own clock, of course, which will be used.
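As a rough illustration of pacing (the constant bitrate and interval here are assumptions; real code should derive timing from the frame headers):

```javascript
// Bytes of audio per second for a given constant bitrate in kbit/s.
function bytesPerSecond(bitrateKbps) {
  return (bitrateKbps * 1000) / 8;
}

// Push slices of `buffer` to each client once per interval, sized so the
// overall rate matches playback instead of "as fast as possible".
function streamAtPlaybackRate(buffer, clients, bitrateKbps, intervalMs) {
  const chunkSize = Math.floor(bytesPerSecond(bitrateKbps) * (intervalMs / 1000));
  let offset = 0;
  const timer = setInterval(() => {
    if (offset >= buffer.length) {
      clearInterval(timer);
      return;
    }
    const chunk = buffer.slice(offset, offset + chunkSize);
    offset += chunkSize;
    for (const client of clients) client.write(chunk);
  }, intervalMs);
  return timer;
}
```

At 128 kbit/s this works out to 16,000 bytes per second, so a 500 ms interval sends 8,000-byte chunks.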
What you should actually do
You've stumbled into a huge project. I strongly recommend using Liquidsoap instead; there are ways to control it from Node.js. From there, use a server like Icecast for your streaming.

node (socket) live audio stream / broadcast

Please, is there any easy way to stream (broadcast) a media file (ogg, mp3, spx...) from the server to the client (browser) via Node.js, and possibly Socket.IO?
I have to record audio input on the server side and then play it in real time for many clients.
I've been messing with binary.js and socket.io streams but wasn't able to get it right.
I've tried encoding the audio input with Speex, Vorbis, or LAME and then loading it via fs to the client, but I haven't been successful. Or do I have to capture PCM and then decode it in the browser?
Any suggestions on this; nothing I've found has ever helped me.
Many thanks for any tips, links, and ideas.
You'll want to look for packages that work on Streams and from there it's just about piping your streams to output as necessary. Using Express or just the built-in HTTP you can accomplish this quite easily. Here's an example built around osx-audio which provides a PCM stream, lame which can encode a stream to mp3, and Express:
var Webcast = function(options) {
  var lame = require('lame');
  var audio = require('osx-audio');

  // create the Encoder instance
  var encoder = new lame.Encoder({
    // input
    channels: 2,        // 2 channels (left and right)
    bitDepth: 16,       // 16-bit samples
    sampleRate: 44100,  // 44,100 Hz sample rate

    // output
    bitRate: options.bitrate,
    outSampleRate: options.samplerate,
    mode: (options.mono ? lame.MONO : lame.STEREO) // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  });

  var input = new audio.Input();
  input.pipe(encoder);

  // set up an express app
  var express = require('express');
  var app = express();

  app.get('/stream.mp3', function (req, res) {
    res.set({
      'Content-Type': 'audio/mpeg',
      'Transfer-Encoding': 'chunked'
    });
    encoder.pipe(res);
  });

  var server = app.listen(options.port);
};

module.exports = Webcast;
How you get your input stream might be the most interesting part, but that will depend on your implementation. The popular request package is built around Streams as well though, so it might just be an HTTP request away!
On the web browser you have the HTML5 video element and the audio element. Both of them have sources. Each web browser supports different codecs natively. So you'll want to watch out for that if you're trying to stream mp3.
You don't need socket.io; you only need HTTP. Your app reads a file, music.ogg, and for each chunk it reads, it sends it through the HTTP server. A single HTTP request is kept open until the file is transferred.
Here's how your html will look:
<audio src="http://example.com/music.ogg"></audio>
And your nodejs code will be something like this (haven't tested this):
var http = require('http');
var fs = require('fs');

http.createServer(function (request, response) {
  response.writeHead(200, { 'Content-Type': 'audio/ogg' });
  var inputStream = fs.createReadStream('/path/to/music_file.ogg');
  inputStream.pipe(response);
}).listen(8000);
I'm only using the ReadableStream.pipe method on the inputStream and the http and fs modules for the above code. If you want to transcode the audio file (for example, from mp3 to ogg) you'll want to find a module that does that and pipe the data from the file into the transcoder then into response:
// using some magical transcoder
inputStream.pipe(transcoder).pipe(response);
The pipe method will call end on the response stream whenever it's finished writing, so the HTTP request will be finished as soon as the file is done being read (and transcoded).
You can do this with node and WebRTC. There are some ready-to-use tools like SimpleWebRTC or EasyRTC. From what I've tested, video is still trouble, but audio works great.
