Stream RTSP URL in Windows Media Player 12 - rtsp

I have created a live streaming server using the RTSP protocol on my machine, with the media link
rtsp://192.168.xx.xx/livedata.sdp
When I open the link in VLC on another machine on the network, streaming works fine.
But when I open the same link in Windows Media Player, I get the following error:
"Windows Media Player cannot play the file because the specified protocol is not supported. If you typed a URL in the Open URL dialog box, try using a different transport protocol (for example, "http:" or "rtsp:")."
I have searched the net for this error and tried all kinds of things, such as re-registering wmnetmgr.dll and checking the permissions of WMSDKNS.XML, but nothing worked. I have also checked the settings under Tools -> Options -> Network tab and ticked all the checkboxes, i.e.
RTSP/TCP, RTSP/UDP, HTTP.
I am using Windows Media Player 12.0.7600.16385.
I am attaching the screenshot of the error below.
Please suggest a solution.
Thanks in advance.

Related

Raspberry Pi IFTTT DO Button

I recently put together a Raspberry Pi garage door opener and it seems to be working well at this point. I used this as a reference and used his scripts and webpage with slight fixes/modifications. I can now log into the site I'm hosting from the pi with Apache2 and open and close my door and view my webcam stream. I am using DuckDNS and port forwarding port 80 to my pi currently.
What I would like to do next is set up IFTTT Maker Channel integration with the goal of using a DO Button to control the door from my Android Wear watch. The problem is I don't know how to set up the pi to receive an HTTP request from IFTTT.
Essentially what I need to learn how to do is have the pi listen for this request and run a script (setting GPIO pin 17 to high for a half second). Presumably once I figure this out I would also be able to use Tasker/AutoVoice for Google Now integration.
Thanks in advance for any help.
So I figured it out and want to post my limited knowledge answer for anyone looking to do the same, it was actually easier than I was making it.
I had a couple of problems from the start:
1. My whole HTML root folder requires a password.
2. I didn't understand what a GET request was doing in this instance.
3. My script files weren't working properly.
Here's how I was able to correct these issues and get it working:
1. Send the username/password in the GET request. Not extremely secure, but oh well: http://user:password@address.org/file.php
2. A basic GET request essentially just "loads" the given URL, as if you had typed it into a browser address bar and hit enter. This means that if my script file is working correctly, it runs the script.
3. I corrected my script files.
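The steps above can be sketched as a small HTTP listener on the Pi. This is a hedged illustration, not the original author's setup (which used Apache2 and PHP): the /garage path, the token query parameter, port 8080, and the GPIO pulse details are all assumptions for the sake of example.

```python
# Minimal HTTP listener for a Raspberry Pi garage-door trigger.
# IFTTT's Maker Channel sends a request to a URL you choose; this
# handler pulses a GPIO pin in response. Path, token, port, and pin
# number below are illustrative assumptions.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

TRIGGER_PATH = "/garage"   # hypothetical path IFTTT will request
SECRET_TOKEN = "changeme"  # hypothetical shared secret in the query string


def pulse_door_pin():
    """Set GPIO pin 17 high for half a second, then low again."""
    try:
        import RPi.GPIO as GPIO  # only available on the Pi itself
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(17, GPIO.OUT)
        GPIO.output(17, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(17, GPIO.LOW)
        GPIO.cleanup(17)
    except ImportError:
        pass  # not running on a Pi; do nothing


class DoorHandler(BaseHTTPRequestHandler):
    # Swappable callback so the handler can be exercised without real GPIO.
    on_trigger = staticmethod(pulse_door_pin)

    def do_GET(self):
        url = urlparse(self.path)
        token = parse_qs(url.query).get("token", [""])[0]
        if url.path == TRIGGER_PATH and token == SECRET_TOKEN:
            self.on_trigger()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"door triggered\n")
        else:
            self.send_response(403)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the console quiet


def serve(port=8080):
    HTTPServer(("", port), DoorHandler).serve_forever()


if __name__ == "__main__":
    serve()
```

With this running, pointing the IFTTT Maker action at http://user:password@address.org:8080/garage?token=changeme (or whatever path/token you pick) would hit do_GET and pulse the pin; the token check is a cheap guard against random requests on a forwarded port.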

Send content to chromecast from native application

Is it possible to send video to the chromecast device from a native application? It would be nice to share any window on a system instead of only chrome tabs. Also, is there any documentation of the communication used by chrome to communicate with the chromecast? It is my understanding that the chromecast essentially loads content from an embedded chrome instance, but there appears to be more direct ways of communicating with the device since it is able to stream content from a chrome tab using the extension.
You need to whitelist your receiver device if you are developing a receiver application. That would be a Chrome app that runs on the receiver's Chrome instance.
You need to whitelist a sender URL if you are developing a Chrome app that will cast its contents.
Video casting works by sending a url to the receiver device, which the device will load directly.
Tab casting works by encoding the tab contents using WebM/Opus (in the Chrome cast extension) and streaming that to the receiver device. (This is limited to 720p, see this question)
Chrome apps can only use Video casting.
The Chromecast extension is going to be the only way to stream directly to the device.
So the answer to your question is no, you cannot stream video directly to the device. The receiver must load the video from the url you provide.
There is some speculation about whether the receiver can be provided with a local URL or if the content must already be available on the internet. This has yet to be clarified.
From how I understand the Chromecast architecture:
You can display any URL you want on the TV (you have to whitelist your app and register the URL first). It must be a URL. This can include HTML, JS, CSS, etc. Anything that is already on the internet.
To receive data from a device (say, the URL of a video to load), you must implement logic to interpret messages from channels. These messages are encoded as JSON, which makes it difficult to send videos or pictures (binary data). It is obviously easiest to upload things like this to some website, and have the receiver display them.
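To see why JSON-encoded channel messages make binary payloads awkward, here is a small illustration (in Python, purely for demonstration; the message shapes are invented and not the actual Cast protocol): JSON has no raw-bytes type, so an image or video frame must be base64-encoded, which inflates it by roughly a third, whereas a URL message stays tiny.

```python
# Illustrative only: comparing a URL-style channel message with one that
# tries to carry binary data inline. Message fields are hypothetical.
import base64
import json

# Sending a URL is cheap: the receiver just loads it.
url_message = json.dumps({"type": "LOAD", "url": "http://example.com/video.mp4"})

# Sending binary data directly requires base64 (JSON cannot hold raw bytes),
# which inflates the payload by about 33%.
binary_payload = bytes(range(256))  # stand-in for image/video bytes
encoded = base64.b64encode(binary_payload).decode("ascii")
blob_message = json.dumps({"type": "BLOB", "data": encoded})

print(len(url_message))                          # small, fixed-size message
print(len(blob_message) > len(binary_payload))   # base64 + JSON overhead
```

This is why uploading the asset somewhere and sending the receiver a URL is the path of least resistance.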
People have asked, "well, then how does the tab/screen sharing work?" The JSON encoding is just what Google provides in their SDK. In their own source, they don't have this restriction.
Update:
It turns out you can actually stream local videos to your TV by just opening the local file in Chrome, and then casting that to your TV.

Using Chromecast Android API to push a URL to the "ChromeCast" App

So far I've been able to write a webpage that pushes a URL to the (what I'm calling) native app on the Chromecast device. Through this API I can open a "video_playback" app that sends the URL and some other info, just like the webpage at http://googlecast.github.io/cast-chrome/, to my device, and my video plays just fine...
Now I want to do that with the Android API, but it treats that receiver "app" as if it doesn't exist. With some more poking around I found that the actual name of the app is ChromeCast, but all I've been able to do is get a blank screen or a 404 to show up. Is this not supported in the Android API (i.e. am I forced to write my own receiver), or am I doing something wrong?
I am perfectly able to open a YouTube app through the Android API and load a video, so most of my code is fine. It seems I just need to figure out what application name and arguments to use in the ApplicationSession.startSession() function.
Any help would be appreciated.
How are you starting your session (which version of startSession() are you using)?
It sounds like you are starting your session OK, but then you need to send the URL of the video via MediaProtocolMessageStream.loadMedia():
https://developers.google.com/cast/reference/android/javadoc/reference/com/google/cast/MediaProtocolMessageStream#loadMedia(java.lang.String, com.google.cast.ContentMetadata, boolean)

streaming video by origin URL with azure media services

I'm trying to make an app with Smooth Streaming, so I'm building it from examples
like these.
As a result I have many URLs. Some of them are URLs for the files I encoded; they look like:
<mediaservicename>.blob.core.windows.net/asset-d66c43e8-a142-4618-8539-39a2bbb14300/BigBuckBunny_650.mp4?sv=2012-02-12&se=2013-06-23T15%3A21%3A16Z&sr=c&si=aff41a1d-6c8a-4387-8c2f-84272a776ff2&sig=8OPuwW6Kssn2EVQYwqUXkUocc7Qhf0xM62rS9aSPsMk%3D
And one of the URLs is like:
<mediaservicename>.origin.mediaservices.windows.net/6eca30d3-badd-4f45-bc29-264303ffe84a/BigBuckBunny_3400.ism/Manifest
When I try to play the first one on the Windows Azure portal, that's OK.
But when I try to play the second one on the Windows Azure portal, there is an error: "we are unable to connect to the content you've requested. We apologize for the inconvenience".
When I try to play them both in my app with Silverlight, they do not play either, nor do they on smf.cloudapp.net / healthmonitor.
Maybe there are some errors in the examples on the Windows Azure site? Or what else can it be?
The first URL you copied cannot be used in a Smooth Streaming player, but the second one can be, if you have created a valid origin locator with a valid access policy.
Can you copy the code you have used to generate these URLs, please?
Hope this helps
Julien

Why can't I pass parameters to Flash using the 'file://' protocol?

System info
Flash Player version: 10_1_102_65
OS: Linux Debian 6.0.2
Web browser: Mozilla Iceweasel 3.5.16
Problem description
I have a Flash file that uses parameters to show output on the screen. Unfortunately I don't have the sources and can't modify/review it.
I can successfully run the Flash file and pass parameters to it using the http:// protocol. For example,
#> iceweasel http://localhost/40.swf?channel_id=1
shows the Flash content correctly in the browser.
But when I try to load the Flash file from disk:
#> iceweasel file:///home/user/40.swf?channel_id=1
Flash can't read the passed parameter and shows invalid output.
It's really strange, because when I downgraded the Flash plugin (from 10_1_102_65 to 9), both protocols work! So something changed in Flash Player after version 9.
Unfortunately I can't use Flash Player 9 in my production environment, so I need to resolve the issue with Flash Player 10.
Question
How can I pass parameters to Flash using the 'file://' protocol?
Any help is appreciated.
Thanks.
As Ignacio says, GET parameters are part of HTTP and won't work with the file:// protocol, but one thing you could try is to supply the channel_id as a FlashVar. Internally, the ActionScript code in the SWF normally accesses GET parameters and FlashVars the same way (using loaderInfo.parameters).
GET only exists in HTTP. You cannot use a query string when accessing a local file. And since you don't have the source, it's near impossible to provide an alternative.
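To make the distinction concrete, here is a small Python sketch (illustrative only; Flash itself is not involved). The query string is syntactically present in both URLs, but only in the HTTP case is there a server/plugin pipeline that actually delivers it to the SWF; with file://, the path maps straight to a file on disk and nothing consumes the query.

```python
# Parsing both URLs from the question: the query component exists either
# way, but only HTTP has something on the other end to interpret it.
from urllib.parse import urlparse, parse_qs

http_url = "http://localhost/40.swf?channel_id=1"
file_url = "file:///home/user/40.swf?channel_id=1"

for url in (http_url, file_url):
    parts = urlparse(url)
    print(parts.scheme, parts.path, parse_qs(parts.query))
```

Both lines print the same parsed parameters, which is exactly the point: the difference is not in the URL syntax but in what handles the request.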