Is ChromeCast only for Video?

Can ChromeCast and/or the Google Cast protocol run apps that are not video?
In other words, is it possible to create an HTML5 app that runs on the ChromeCast and is controlled by the smartphone/tablet?

Yes, the receiver is an arbitrary HTML5 application. Of course, you should make sure it's suitable to be displayed on a TV with no attached keyboard and mouse, etc.
See the Google Cast receiver documentation for some more detail.
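To make the "controlled by the smartphone/tablet" part concrete, here is a minimal, hypothetical sketch of an Android sender launching a custom HTML5 receiver. It assumes the Google Play services Cast sender API of that era; APP_ID is a placeholder for the application ID you would register in the Google Cast Developer Console.

    import com.google.android.gms.cast.Cast;
    import com.google.android.gms.common.api.GoogleApiClient;
    import com.google.android.gms.common.api.ResultCallback;

    public class ReceiverLauncher {
        // Placeholder: the app ID assigned when you register your receiver.
        private static final String APP_ID = "YOUR_APP_ID";

        // Call once the GoogleApiClient is connected to the selected cast route.
        public static void launchReceiver(GoogleApiClient apiClient) {
            Cast.CastApi.launchApplication(apiClient, APP_ID)
                    .setResultCallback(new ResultCallback<Cast.ApplicationConnectionResult>() {
                        @Override
                        public void onResult(Cast.ApplicationConnectionResult result) {
                            if (result.getStatus().isSuccess()) {
                                // The HTML5 receiver page is now running on the TV;
                                // open a message channel here to drive it from the phone.
                            }
                        }
                    });
        }
    }

Once the application is launched, the phone and the receiver page exchange messages over a channel, which is how the phone ends up controlling the app on the TV.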

Related

Manage playback of audio stream from device to chromecast

I have been searching for the best practice for stopping playback on the device when the Chromecast is selected. Right now I connect to an audio stream and it starts playing on the Chromecast fine, but it also keeps playing on my phone. I had hoped there was some kind of automatic switch that was supposed to occur. Is it up to me to manage all of this? If so, what are the best practices for starting/resuming playback when switching back and forth between the Chromecast and the device? It is a live stream, so there is no way to pause and pick up where it left off.
Are there certain callbacks that I need to watch for to make the switch?
Yes, it is your responsibility to manage the behavior of your app. Our UX Design Checklist outlines the flow we recommend; for example, when you start streaming to a cast device, you stop the local playback. How you stop playback locally depends on your application, but what you should use is the set of callbacks that the Android Cast SDK provides for learning about the success of your cast control commands and about state changes on the receiver. These callbacks can tell you whether your application launch was successful, whether the media is playing or paused, and when the metadata for the media has changed. Look at our SDK documentation to see which ones are appropriate for your case. We also have a number of sample projects that do most of these tasks.
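As a rough illustration of that flow (not from the original answer), here is a sketch assuming the Google Play services Cast API and a hypothetical localPlayer object standing in for your in-app stream player:

    import android.support.v7.media.MediaRouter;
    import com.google.android.gms.cast.RemoteMediaPlayer;

    public class CastPlaybackSwitcher extends MediaRouter.Callback {
        private final RemoteMediaPlayer remotePlayer = new RemoteMediaPlayer();

        @Override
        public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo route) {
            // Stop the local stream, as the UX Design Checklist recommends.
            // localPlayer.stop();  // your own player, not part of the Cast SDK

            // Track receiver-side state changes (playing, paused, buffering).
            remotePlayer.setOnStatusUpdatedListener(
                    new RemoteMediaPlayer.OnStatusUpdatedListener() {
                        @Override
                        public void onStatusUpdated() {
                            // Query remotePlayer.getMediaStatus() and update the UI.
                        }
                    });
        }

        @Override
        public void onRouteUnselected(MediaRouter router, MediaRouter.RouteInfo route) {
            // Cast session ended: rejoin the live stream locally.
            // localPlayer.play();
        }
    }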

How to implement Connect and Play

I have an app that streams live audio, and it will connect and play through the Chromecast with no errors (play and connect). Now I am trying to figure out how to play through the Chromecast if the user connects first, before playing the audio. Per the UI guidelines, you have to display a cast button on all activities. So my app has a main activity with two play buttons for different stations plus the cast button, and then another activity for when a station is playing, entered after you select which station you want to play (i.e., the IP address gets selected depending on which button you choose). In that activity's onRouteSelected() I switch to the Chromecast and stop local playback on the device.
My question is: how do I call onRouteSelected() to get the Chromecast going if the Chromecast has already been connected in the previous activity? I have looked at the sample apps and cannot figure out how to do this.
Take a look at the CastVideos-android sample project, which uses the CastCompanionLibrary to maintain state and manage most of the cast-related work. You can either use the library or, if you want to do it yourself, see how things are done there.
If you are already connected to a Chromecast device, you are not going to get a new call to onRouteSelected(), so you need to maintain the connectivity state across your activities (say, in a singleton or in your Application class); that is what CastCompanionLibrary does.
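A minimal sketch of that singleton idea (CastCompanionLibrary does this with far more detail; the names here are illustrative):

    public final class CastState {
        private static final CastState INSTANCE = new CastState();
        private volatile boolean connected;

        private CastState() {}

        public static CastState get() { return INSTANCE; }

        public boolean isConnected() { return connected; }

        // Call from onRouteSelected()/onRouteUnselected() in whichever
        // activity owns the MediaRouter callback.
        public void setConnected(boolean value) { connected = value; }
    }

In the second activity you would then check CastState.get().isConnected() in onCreate() and start remote playback directly, instead of waiting for an onRouteSelected() call that will never come.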

Build own Chromecast device

The Chromecast device is a "receiver device [that] runs a scaled-down Chrome browser with a receiver application". Can I download and install this receiver app in a Chrome browser, for example on my Windows notebook?
I have implemented a complete Chromecast v2 receiver, called YouMap ChromeCast Receiver, available in the Google Play store and the Amazon store; see the xda-developers thread here: http://forum.xda-developers.com/android-tv/chromecast/app-youmap-chromecast-receiver-android-t3161851
The current Chromecast protocol is completely different from the original DIAL-based protocol. Right now only YouTube still uses the old protocol, for which Chromecast maintains backward compatibility.
Discovery is mDNS, exactly the same as the Apple TV Bonjour protocol.
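For illustration, that discovery step can be reproduced with the third-party JmDNS library; Chromecast advertises the service type "_googlecast._tcp.local.". A minimal sketch, assuming JmDNS is on the classpath:

    import java.net.InetAddress;
    import javax.jmdns.JmDNS;
    import javax.jmdns.ServiceEvent;
    import javax.jmdns.ServiceListener;

    public class CastDiscovery {
        public static void main(String[] args) throws Exception {
            JmDNS jmdns = JmDNS.create(InetAddress.getLocalHost());
            jmdns.addServiceListener("_googlecast._tcp.local.", new ServiceListener() {
                public void serviceAdded(ServiceEvent event) {
                    // Ask for the full record (addresses, port, TXT entries).
                    event.getDNS().requestServiceInfo(event.getType(), event.getName());
                }
                public void serviceRemoved(ServiceEvent event) {}
                public void serviceResolved(ServiceEvent event) {
                    System.out.println("Found cast device: " + event.getInfo());
                }
            });
            Thread.sleep(10000);  // browse for ten seconds, then clean up
            jmdns.close();
        }
    }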
The most difficult part is device authentication: the sender and the receiver perform handshakes by exchanging keys and certificates in a way that is extremely difficult to crack. Apple TV does the same using FairPlay encryption.
The next difficult part is the mirroring protocol, which is also very complicated; you need to deal with packet splitting and packet retransmission. Overall, the Chromecast mirroring protocol is well designed: better than Miracast and better than AirPlay mirroring (I have also implemented both of those, so I know what I am talking about).
When I get a chance, I will write more here.
The Chromecast device works using the DIAL protocol. It is entirely possible to emulate this protocol with some simple code that listens on the multicast group for discovery and then handles the HTTP requests to launch applications. It is then the launched application that communicates with the casting device, I believe using the RAMP protocol.
Luckily for us, the applications the Chromecast uses are mostly web applications, meaning our device emulator just needs to launch a web browser and point it at a specific URL when it receives an application request.
For example, the YouTube app, after device discovery and after establishing where the applications are located (part of DIAL), will send an HTTP POST request containing a pairing key to /<apps url>/YouTube. All the emulating device needs to do then is open https://www.youtube.com/tv?<pairing key> in a browser window. From there, I believe, communication for controlling the YouTube app is sent not through the casting device itself but through the open tabs on the casting device and the emulator.
This is my understanding of how the Chromecast device works, and specifically the YouTube app, from looking at https://github.com/dz0ny/leapcast, a Python emulator that has YouTube and Google Music working.
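As a concrete sketch of the discovery step described above (assumptions: the standard 239.255.255.250:1900 SSDP multicast group that DIAL uses, and a hypothetical LOCATION URL for the emulator's device description):

    import java.net.DatagramPacket;
    import java.net.InetAddress;
    import java.net.MulticastSocket;
    import java.nio.charset.StandardCharsets;

    public class DialResponder {
        public static void main(String[] args) throws Exception {
            InetAddress group = InetAddress.getByName("239.255.255.250");
            MulticastSocket socket = new MulticastSocket(1900);
            socket.joinGroup(group);

            byte[] buf = new byte[1024];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                String msg = new String(packet.getData(), 0, packet.getLength(),
                        StandardCharsets.UTF_8);
                // Answer DIAL M-SEARCH queries with our device-description URL.
                if (msg.startsWith("M-SEARCH")
                        && msg.contains("urn:dial-multiscreen-org:service:dial:1")) {
                    String reply = "HTTP/1.1 200 OK\r\n"
                            + "ST: urn:dial-multiscreen-org:service:dial:1\r\n"
                            // Hypothetical address; point at your own HTTP server.
                            + "LOCATION: http://192.168.1.50:8008/ssdp/device-desc.xml\r\n"
                            + "\r\n";
                    byte[] out = reply.getBytes(StandardCharsets.UTF_8);
                    socket.send(new DatagramPacket(out, out.length,
                            packet.getSocketAddress()));
                }
            }
        }
    }

The HTTP server behind that LOCATION URL would then serve the device description and handle the POST to /<apps url>/YouTube, opening the browser as described above.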
Google is in the process of open-sourcing some parts of the Chromecast:
https://code.google.com/p/chromium/codesearch#chromium/src/chromecast/
https://code.google.com/p/chromium/issues/list?q=label:Chromecast
So theoretically you can build a similar device.

Can the Chromecast device be programmed to act as a sender?

I wanted to know if it is possible to program one Chromecast device to send video to another Chromecast device acting as a receiver. I have browsed the API reference on the Google developer website; it doesn't seem to be supported, but it also doesn't say it isn't supported.
This is not currently possible, and probably never will be. Chromecast devices don't "source" video; they only transfer it from Wi-Fi to the HDMI output. It is possible to play the same stream on two different Chromecast devices, but there is no way to synchronize them at the moment. In theory, this is something the devices could support.

Can J2ME access the camera (image capture) event on an N73 device

I am working on a project where I need to catch the image-capture event.
It's for the Nokia N73, running S60 3rd Edition.
Is there any possible way to do this using J2ME only (without using Symbian)?
Description:
The J2ME application runs in the background; when an image is captured with the camera, the J2ME application starts and comes to the front, takes the captured image, transfers it to the J2ME app, and displays it on screen.
If this is not possible using J2ME, is there any way using Symbian? Can anyone provide a tutorial or code snippet?
It is not possible to catch the native camera's capture event from J2ME. You'd need to get the user to start your app first, then access the camera from your app (using JSR 135; spec here, introduction and examples here). Then you can use the captured image however you wish.
HTH
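For reference, a minimal JSR-135 capture sketch along those lines (hedged: locator strings vary by phone, and some S60 devices want "capture://devcam0" instead of the generic "capture://video"):

    import javax.microedition.lcdui.Display;
    import javax.microedition.lcdui.Form;
    import javax.microedition.lcdui.Image;
    import javax.microedition.lcdui.Item;
    import javax.microedition.media.Manager;
    import javax.microedition.media.Player;
    import javax.microedition.media.control.VideoControl;
    import javax.microedition.midlet.MIDlet;

    public class CameraMidlet extends MIDlet {
        public void startApp() {
            try {
                Player player = Manager.createPlayer("capture://video");
                player.realize();

                // Show a viewfinder inside a Form.
                VideoControl vc = (VideoControl) player.getControl("VideoControl");
                Form form = new Form("Viewfinder");
                form.append((Item) vc.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null));
                Display.getDisplay(this).setCurrent(form);
                player.start();

                // Take the snapshot; null requests the default encoding.
                byte[] jpeg = vc.getSnapshot(null);
                form.append(Image.createImage(jpeg, 0, jpeg.length));
            } catch (Exception e) {
                // MediaException / IOException: camera busy or unavailable.
            }
        }

        public void pauseApp() {}
        public void destroyApp(boolean unconditional) {}
    }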
The N73 in particular has a fairly large hardware limitation when you want to use the camera: the user has to manually open the camera cover before the camera can be used. Opening the cover launches the native camera application included in S60, and the user then needs to close that application. From that point on, J2ME can use the camera via the Mobile Media API defined in JSR-135. If the user reboots the phone, the camera cover needs to be re-opened before J2ME can use the camera again.
You may have better luck using J2ME and JSR-135 to capture images using the front camera on the N73.
I seriously doubt that J2ME would see the user pressing the camera key in javax.microedition.lcdui.Canvas.keyPressed();
JSR-135 doesn't really provide a system-wide camera capture event for J2ME.
