I have created an audio stream with libVLC using the code below:
using System;
using System.Windows.Forms;
using LibVLCSharp.Shared;

public partial class Form1 : Form
{
    private LibVLC libVLC;
    private MediaPlayer mp;

    private void button1_Click(object sender, EventArgs e)
    {
        libVLC = new LibVLC();
        mp = new MediaPlayer(libVLC);

        // Re-stream the AAC web radio as MP3 over HTTP on port 8080.
        Media media = new Media(libVLC, new Uri("http://listen.technobase.fm/tunein-aac-hd-pls"));
        media.AddOption(":sout=#transcode{vcodec=none,acodec=mp3,ab=128,channels=2,samplerate=44100,scodec=none}:http{dst=:8080/stream.mp3}");
        media.AddOption(":sout-keep");
        media.AddOption(":no-sout-all");

        mp.Play(media);
    }
}
The stream is up and running and is accessible via VLC; the MIME type is also registered in IIS. But if I try to connect to the stream from my IIS site, it doesn't work. I use the following HTML5 audio element:
<audio controls="controls" autoplay="autoplay">
    <source src="http://localhost:8080/stream.mp3" type="audio/mpeg" runat="server" />
    Your browser does not support this stream!
</audio>
Could it be that there is a permission issue with my IIS? What kind of permissions does IIS need to access a local stream on my server?
I have partly solved the issue. The main problem was that my IIS site runs over HTTPS while the stream runs over plain HTTP, so the browser blocks it. If I turn off the URL rewrite to HTTPS, the stream plays in my browser.
I just have to find out how to create an HTTPS stream with libVLC.
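Since the page is served over HTTPS, the browser treats the plain-HTTP stream as mixed content. I'm not aware of a simple way to make VLC's http stream output serve TLS directly, so one possible workaround (just a sketch, not something I have verified) is to put an HTTPS reverse proxy in front of the VLC output, either with IIS (Application Request Routing) or with a small script. Below is a minimal Node.js sketch of that idea; the key.pem/cert.pem files, port 8443 and the localhost:8080 upstream are assumptions:
// Hypothetical HTTPS front end for the libVLC HTTP output on localhost:8080.
const https = require('https');
const http = require('http');
const fs = require('fs');

const tls = {
    key: fs.readFileSync('key.pem'),   // private key for your certificate
    cert: fs.readFileSync('cert.pem')  // your certificate
};

https.createServer(tls, (req, res) => {
    // Forward the request to VLC's http sout and pipe the MP3 bytes back over TLS.
    const upstream = http.get({ host: '127.0.0.1', port: 8080, path: '/stream.mp3' }, (up) => {
        res.writeHead(up.statusCode, up.headers);
        up.pipe(res);
    });
    upstream.on('error', () => res.end());
}).listen(8443);
The audio element would then point at https://localhost:8443/stream.mp3 instead of the plain HTTP address.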
How can I play a specific YouTube video on my Google Hub via Google Actions? I know I can use a Basic Card to display images, text, and even a link (although that link does not show up on the Hub).
I specifically want to be able to trigger or play a YouTube video on my Google Hub.
Actions are not able to start playing video content. Media responses are only for audio.
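For audio, a Media Response is available. A minimal fulfillment sketch with the actions-on-google Node.js library (the intent name and the audio URL are placeholders):
// Hypothetical Dialogflow fulfillment: streams audio via a Media Response.
const { dialogflow, MediaObject, Suggestions } = require('actions-on-google');

const app = dialogflow();

app.intent('play_audio', (conv) => {
    conv.ask('Here is the audio.');                  // a plain response must accompany the media
    conv.ask(new MediaObject({
        name: 'Sample track',
        url: 'https://example.com/audio.mp3',        // placeholder; replace with your own HTTPS audio URL
    }));
    conv.ask(new Suggestions('Stop'));               // suggestion chips so the conversation can continue on screened devices
});

exports.fulfillment = app;                           // export for your webhook (e.g. a Cloud Function)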
I have a similar need. After a chat with an Action on Google, I want to play user-requested YouTube videos (chains of them) on a local "big screen" (a TV-like device or PC).
A workaround solution could be:
1. You build an Action that selects one or more videos.
2. The Action also acts as a server for the client described below.
3. The Action communicates (via SSE, WebSocket, HTTP, ...) with a client browser page containing a small JavaScript program that dynamically displays the video (the id being sent over the SSE client-server channel), as sketched after the script below.
Here below is the rough JS script (I'm not a web developer); it just gives you the idea:
<script>
// Writes a full-page YouTube embed for the given video id.
function loadVideoWithId(id) {
    const url = `https://www.youtube.com/embed/${id}?fs=1&autoplay=1&loop=1`
    const iframe = `<iframe src="${url}" width="1600" height="900" allowfullscreen frameborder="0"></iframe>`
    document.write(iframe)
}
loadVideoWithId('hHW1oY26kxQ')
</script>
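To make the page react to the Action instead of hard-coding an id, the client could subscribe with an EventSource and swap the iframe whenever a new id arrives. A rough sketch (the /events endpoint name is an assumption):
<script>
// Hypothetical SSE client: the Action's server pushes a YouTube video id per message.
const source = new EventSource('/events')
source.onmessage = (event) => {
    const id = event.data.trim()                     // expect the raw video id as the payload
    const url = `https://www.youtube.com/embed/${id}?fs=1&autoplay=1&loop=1`
    let frame = document.getElementById('player')
    if (!frame) {
        // Create the iframe once and keep reusing it.
        frame = document.createElement('iframe')
        frame.id = 'player'
        frame.width = 1600
        frame.height = 900
        frame.allowFullscreen = true
        document.body.appendChild(frame)
    }
    frame.src = url                                  // swap the video without rewriting the page
}
</script>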
I'm trying to get my receiver to play an mp3 file hosted on the server with the following function
playSound_: function(mp3_file) {
var snd = new Audio("audio/" + mp3_file);
snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver in my local browser it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or the Web Audio API. The simplest is probably a MediaElement.
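As a rough sketch (the element id is made up), the receiver page could host a single audio element in its DOM and reuse it instead of constructing a new Audio object on every call:
<audio id="sfx"></audio>
<script>
// Point the existing media element at the hosted file and play it.
function playSound_(mp3_file) {
    var el = document.getElementById('sfx')
    el.src = 'audio/' + mp3_file   // same relative path as in the question
    el.play()
}
</script>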
We want to play a Smooth Streaming URL by providing its Manifest file to a Chromecast device.
We could play the following on the Chromecast device:
1. .mp4 files
2. .ismv files
3. .isma files
But if we provide a Manifest file such as the following, we are not able to play it on the Chromecast device:
http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest
Please let us know how to play a Smooth Streaming URL on a Chromecast device.
Or do we need to play the .ismv files one by one in a loop?
The Chromecast supports Smooth Streaming content through the Media Player Library: https://developers.google.com/cast/docs/player
Below is a bare-bones implementation.
Google provides a fuller example on GitHub which takes advantage of the MediaManager and accounts for other streaming formats: https://github.com/googlecast/CastMediaPlayerStreamingDRM
// Create a video element for the Media Player Library to drive.
var $mediaElement = $('<video>').attr('autoplay', ''),
    mediaElement = $mediaElement[0],
    mediaUrl = "http://playready.directtaps.net/smoothstreaming/SSWSS720H264/SuperSpeedway_720.ism/Manifest",
    mediaHost,
    mediaPlayer;

// Start the receiver and attach the video element to the page.
cast.receiver.CastReceiverManager.getInstance().start();
$('body').append(mediaElement);

// Bind the video element and the manifest URL to a player host.
mediaHost = new cast.player.api.Host({
    mediaElement: mediaElement,
    url: mediaUrl
});

// Load the stream using the Smooth Streaming protocol.
var protocol = cast.player.api.CreateSmoothStreamingProtocol(mediaHost);
mediaPlayer = new cast.player.api.Player(mediaHost);
mediaPlayer.load(protocol);
Microsoft's test files (including the ISM) do not return the CORS header required by the Chromecast. Enable CORS on your own server and it will work.
I've encountered this too, and it works if I host the files myself with CORS enabled.
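For example (a sketch only, assuming Node.js with Express; the folder and port are made up), a minimal CORS-enabled host for the Smooth Streaming files could look like this:
// Serve the .ism/.ismv/.isma files yourself with the CORS header the player needs.
const express = require('express');
const app = express();

app.use((req, res, next) => {
    res.setHeader('Access-Control-Allow-Origin', '*');   // the header Microsoft's test server omits
    next();
});

app.use(express.static('media'));                        // folder containing the Smooth Streaming assets

app.listen(8080, () => console.log('CORS-enabled media server on port 8080'));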
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get videos streaming from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else in implementing MPMoviePlayerController for video streaming that I am missing? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or in a combination of both.
I'm pretty sure you don't need a streaming server just to play a video file from an HTTP URL. However, it is a requirement for applications on the App Store to use HTTP Live Streaming in this situation:
"Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See 'Requirements for Apps.'"
The good news is that your existing code should not have to be changed to handle this (it's server, not client side).
I have tried String url = "38.101.195.5:9156" and I am facing java.io.IOException: 11 - Error in HTTP operation.
The code is here:
connection = (HttpConnection) Connector.open(url);
dataIn = connection.openDataInputStream();
If the device supports JSR 234 (Advanced Multimedia Supplements), then you can use that API: http://jcp.org/en/jsr/detail?id=234
From the description of the javax.microedition.amms.control.tuner package:
"This package contains Controls for various tuner settings. These Controls, if they are supported, can typically be fetched from a radio Player (for example, a Player created by Manager.createPlayer("capture://radio"))."
See more in the public interface TunerControl.