Embedding a streaming audio page into an iframe - audio

I am trying to figure out a way to embed a URL that streams audio into an iframe, or whatever the most cross-browser-compatible frame would be. I would like it to autoplay on mobile devices, which I assume would be no issue since it's already streaming all the time anyway.
My stream url is
http://67.212.165.106:8028/stream
I tried enclosing this URL in an iframe, but it doesn't work. Thanks for any assistance with this.
Here's where I have that:
http://radio.baseballpodcasts.net/IframePlayer.html

You're linking directly to a stream, so there's nothing to put in an iframe. What you need is an <audio> element.
<audio src="http://67.212.165.106:8028/stream" controls preload="none" autoplay></audio>
Note that autoplay does not normally work: modern browsers block autoplay with sound until the user has interacted with your site. If users come to your site and click play often enough, the browser may eventually allow it.
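Since autoplay may be blocked, you can attempt playback programmatically and inspect the promise that play() returns in modern browsers; a minimal sketch (the element ID player is made up for this example):

```html
<audio id="player" src="http://67.212.165.106:8028/stream" controls preload="none"></audio>
<script>
  var player = document.getElementById("player");
  // play() returns a promise in modern browsers; it rejects when
  // the autoplay policy blocks playback without a user gesture.
  var attempt = player.play();
  if (attempt !== undefined) {
    attempt.catch(function () {
      // Autoplay was blocked: leave the controls visible so the
      // user can start playback with a tap or click instead.
    });
  }
</script>
```

This way the stream starts automatically where the browser permits it, and degrades to the normal controls where it does not.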

Related

How To Capture WebRTC Stream From Different Page In NodeJS

I'm new to Node.js and the WebRTC concept. I'm trying to make an audio stream in which one page is the music controller and the other page just plays whatever stream the controller page is playing.
I based the idea on this link:
https://webrtc.github.io/samples/src/content/capture/video-pc/
But instead of video I just want audio. I'm able to make it work when they are on the same page, but there is a problem when capturing a stream from a different page/URL. Node.js cannot access DOM elements, so I'm stuck. I tried accessing the controller page's audio element using document.getElementById, but it's not working. Please help me get past this.

Is there a standard for using the Azure Media Player?

I've been looking into Azure Media Services and I've been able to create a program that copies my video blob from my website storage to my media service storage account and create an asset/asset file from it. Then I've got it encoding for adaptive streaming.
The issue I'm having is on playback. I'm wanting to use the Azure Media Player as it shows great promise in detecting the environment and providing the correctly encoded video for streaming.
When I use the iframe approach (found here) it works, but I feel I'm losing some ability to customize -- also it's breaking in Safari on Mac.
<iframe class="video-preview" src="//aka.ms/azuremediaplayeriframe?url=[MANIFEST URL HERE]&autoplay=false" name="azuremediaplayer" allowfullscreen></iframe>
The other method (found here) utilizes the <video> tag along with css & js files put in the header.
Header code:
<link href="//amp.azure.net/libs/amp/1.1.0/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
<script src="//amp.azure.net/libs/amp/1.1.0/azuremediaplayer.min.js"></script>
<script>
amp.options.flashSS.swf = "//amp.azure.net/libs/amp/1.1.0/techs/StrobeMediaPlayback.2.0.swf"
amp.options.flashSS.plugin = "//amp.azure.net/libs/amp/1.1.0/techs/MSAdaptiveStreamingPlugin-osmf2.0.swf"
amp.options.silverlightSS.xap = "//amp.azure.net/libs/amp/1.1.0/techs/SmoothStreamingPlayer.xap"
</script>
Video code:
<video id="azuremediaplayer" class="azuremediaplayer amp-default-skin amp-big-play-centered video-preview" controls data-setup='{"nativeControlsForTouch": false}'>
<source src="[MANIFEST URL HERE]" type="application/vnd.ms-sstr+xml" />
<p class="amp-no-js">To view this video please enable JavaScript, and consider upgrading to a web browser that supports HTML5 video</p>
</video>
The data-setup attribute is supposed to activate the <video> tag and turn it into an Azure Media Player, but that's not happening for me.
So, my question is: what method is preferred/standard? I know that's difficult to pin down as it's still very young and is always changing, but just wanted to see what everyone else's experiences were.
The iframe approach which is on the demo website is currently a proof of concept (see the warning on the page "Note: this embed code is for demo purposes only. Do not use in production"). It is meant to serve as a way to show that the player can work in an iframe. This will expand over time, but the flexibility of the iframe currently is limited to how you want to design the parameters.
In general, the approach you take depends on what you are trying to achieve (meaning depending on the level of flexibility you require). In general, the current recommended approach is to use the JS and CSS method directly on your page.
Now, for the issues you are having, it would be great to understand what you are seeing.
1. For the iframe issue in Safari on Mac, what are you seeing? I just tried the following on OS X Yosemite in Safari and it seems to be working fine:
<iframe src="//aka.ms/azuremediaplayeriframe?url=%2F%2Famssamples.streaming.mediaservices.windows.net%2F91492735-c523-432b-ba01-faba6c2206a2%2FAzureMediaServicesPromo.ism%2Fmanifest&autoplay=false" name="azuremediaplayer" scrolling="no" frameborder="no" align="center" height="280px" width="500px" allowfullscreen></iframe>
2. Are you able to view the samples provided in the documentation? Here is the list of samples; specifically, you should look at the basic video tag sample. You will need to make sure a source is added to the video tag for the auto-detection to work.
If you are still having issues, please reach out to ampinfo@microsoft.com
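If the data-setup auto-initialization never fires, the player can also be set up explicitly in JavaScript via the amp() entry point the library exposes; a sketch under that assumption, keeping the manifest URL as a placeholder:

```html
<video id="azuremediaplayer" class="azuremediaplayer amp-default-skin" controls></video>
<script>
  // Initialize the player manually instead of relying on data-setup.
  var myPlayer = amp("azuremediaplayer", {
    nativeControlsForTouch: false,
    autoplay: false
  }, function () {
    // Player is ready.
  });
  myPlayer.src([{
    src: "[MANIFEST URL HERE]",
    type: "application/vnd.ms-sstr+xml"
  }]);
</script>
```

Initializing in script also makes it easier to see errors in the console if the source fails to load.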

Record Screen's Happenings(Audio+Video)

I am new to WebRTC and want to implement a system like video conferencing, live streaming, or Skype using WebRTC and Node.js.
I am confused about one thing, which is one of our client's requirements. Suppose something is happening on a page, say a video conference where one moderator answers many audience members one by one. There should be one video that continuously records all of this together and sends a live stream to the server to save in our database.
Is this kind of thing implementable or not?
Any help please.
You can capture video by grabbing JPEG images from a canvas element. You could also capture the entire page (if there are numerous videos on the same page) by grabbing the page itself through Chrome.
For audio, recording remote audio with the Audio API is still an issue, but locally grabbed audio is not.
RecordRTC and my modified version can record streams either to file or through WebSockets, respectively.
See Capture a page to a stream for how to record or screen-share an entire page in Chrome.
If you have multiple different videos that are not all on the same page but want to combine them all, I would suggest recording them as above and then combining and syncing them up server side (not in JavaScript, but probably in C or C++).
If you MUST record remote audio, then I would suggest that those particular pages send their audio data over WebSockets themselves, so that you can sync it up with their video and with the other sessions.
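The canvas-based capture mentioned above can be sketched roughly as follows (the element ID and the 100 ms interval are assumptions for illustration):

```html
<script>
  var video = document.getElementById("source-video"); // hypothetical ID
  var canvas = document.createElement("canvas");
  var ctx = canvas.getContext("2d");
  var frames = [];

  // Grab a JPEG snapshot of the playing video several times per second.
  setInterval(function () {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    frames.push(canvas.toDataURL("image/jpeg"));
    // In practice each frame would be sent over a WebSocket to the
    // server rather than accumulated in memory like this.
  }, 100);
</script>
```

The frame rate and JPEG quality trade off bandwidth against smoothness; the server side would reassemble the frames into a video.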

Will the chromecast play media segment files (.ts) from an m3u8 playlist?

I've noticed a lot of websites use m3u8 playlists in their HTML5 video tags, and the segment files inside those playlists appear to be H.264 encoded, so I'm guessing the container is the only thing the Chromecast doesn't support in this case. I know very little about video containers and codecs, though, so I may be making no sense. With all this in mind, is there any chance the Chromecast will one day play those files?
Here is an example http://stream.gravlab.net/003119/sparse/v1d30/posts/2014/barcelona/barcelona.m3u8
Thanks.
Yes. You can, using either the Default Receiver, a Styled Receiver, or a Custom Receiver with the Media Player Library. Of course, you (the owner of the data) must turn on CORS headers for the m3u8 manifest, any sub-manifests, and for the segments and any keys on your server / CDN to support this. This requirement is due to our player being written in JavaScript and running in Chrome on the Chromecast device.
Note: for the Default Receiver and Styled Receiver, the origin to allow CORS from is www.gstatic.com. For a Custom Receiver, it is the URL where you host your receiver.
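As an illustration of the CORS requirement, here is a minimal nginx configuration fragment that would expose HLS files with the header the Default/Styled Receiver needs (the location path is a placeholder):

```
location /hls/ {
    # Allow the Chromecast receiver hosted on gstatic.com to fetch
    # manifests, segments, and keys cross-origin.
    add_header Access-Control-Allow-Origin "https://www.gstatic.com";
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t ts;
    }
}
```

The equivalent header must be sent by whatever server or CDN actually serves the playlist and segments.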

How to catch a flash stream url from browser plugin

My question has similar point like this one.
I'm wondering how I can catch, from a browser add-on, the media URL that a SWF loads. Let's say the YouTube Flash player starts playing or loading some video (say, via HTTP) and I want to know that URL, just like the browser plugins from "RealDownloader" and "Moyea YouTube FLV Downloader" do. I'm a newbie in plugin development and Flash, and I want to know which technologies this might involve: XPCOM, NPAPI, ActiveX, or simple API hooking. Any ideas how this may be accomplished?
NPAPI plugins typically ask the browser to load data for them, they don't do it themselves. This means that a browser extension can intercept these requests. This can be done for example by implementing a content policy. Requests initiated by a plugin will cause a shouldLoad call with type OBJECT_SUBREQUEST.
The simpler option is using HTTP observers - but this way you won't recognize requests initiated by Flash, they will look just like any other request processed by the browser.
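A rough sketch of the content-policy hook in a classic (XUL-era) Firefox extension, hedged since the exact registration details vary by SDK version:

```
var { Ci } = require("chrome"); // Add-on SDK style access to XPCOM

var policy = {
  shouldLoad: function (contentType, contentLocation, requestOrigin,
                        context, mimeTypeGuess, extra) {
    // Requests initiated by a plugin (e.g. Flash) arrive with
    // TYPE_OBJECT_SUBREQUEST, so the media URL can be read here.
    if (contentType === Ci.nsIContentPolicy.TYPE_OBJECT_SUBREQUEST) {
      console.log("Plugin requested: " + contentLocation.spec);
    }
    return Ci.nsIContentPolicy.ACCEPT; // never block, just observe
  }
};
```

The policy object still has to be registered as an nsIContentPolicy component for shouldLoad to be called.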
Firebug does that, and it's open source. Why not study it a little?
https://github.com/firebug/
It's easy if you only want to get the URL from a single SWF on a single website. For example, if all you need are URLs from that SWF, you can keep only one instance of your browser open and use a tool to intercept its HTTP requests.
