How to capture media events (play/pause/next etc.) in a Custom Receiver? - google-cast

Google Chromecast supports external control such as play, pause, next, and previous, both from the Google Home app and from an infrared remote (over HDMI-CEC).
How can these events be captured in a custom media receiver (using the CAF Receiver API) when the receiver has no media playing?

When no media is playing, the receiver is in the IDLE state: a sender is connected and the receiver app is loaded and running, but there is currently no playback, paused playback, or buffering operation ongoing.
The messages the receiver can now intercept or observe are essentially the same regardless of whether they were issued by a sender app, Google Home/Assistant, or CEC, and you can process them all the same way.
If you want to implement different behavior depending on which device sent the message (or just to track the source), have a look at the customData section of the request: you can set up your sender app to include some data there, but you have no influence on what messages issued by Google Home / Google Assistant or CEC look like, so customData will be empty in those cases.
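As a minimal sketch (assuming the standard cast.framework CAF Receiver SDK is loaded on the page), intercepting transport commands looks the same no matter where the command originated:

// Sketch of a CAF custom receiver observing transport commands. The
// interceptors fire whether the command came from a sender app, Google
// Home/Assistant, or HDMI-CEC; only sender apps can populate customData.
const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.PLAY, (request) => {
      // customData is only set if a sender app put it there; for
      // Assistant- or CEC-issued commands it will be empty.
      console.log('PLAY received, customData:', request.customData);
      return request;  // return the request so playback proceeds normally
    });

playerManager.setMessageInterceptor(
    cast.framework.messages.MessageType.PAUSE, (request) => {
      console.log('PAUSE received');
      return request;
    });

context.start();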

Related

Device connection event for Azure Iothub

The use case is to receive a device-connected event as soon as the device connects to the IoT hub.
There is a device connected/disconnected event that can be captured and routed to Event Hubs / Service Bus / Event Grid, but this event isn't triggered over AMQP if the message is sent by a .NET program, i.e. it is inconsistent.
Is there any mechanism available to get the most recent device-connected event in IoT Hub? Or is the heartbeat pattern the only mechanism available, or the most efficient one?
Your understanding is correct. The suggestion is to use the device heartbeat pattern.
The IoT Hub identity registry contains a field called connectionState. Only use the connectionState field during development and debugging. IoT solutions should not query the field at run time. For example, do not query the connectionState field to check if a device is connected before you send a cloud-to-device message or an SMS. We recommend subscribing to the device disconnected event on Event Grid to get alerts and monitor the device connection state. There is a tutorial on how to integrate Device Connected and Device Disconnected events from IoT Hub into your IoT solution.
If your IoT solution needs to know if a device is connected, you can implement the heartbeat pattern. In the heartbeat pattern, the device sends device-to-cloud messages at least once every fixed amount of time (for example, at least once every hour). Therefore, even if a device does not have any data to send, it still sends an empty device-to-cloud message (usually with a property that identifies it as a heartbeat). On the service side, the solution maintains a map with the last heartbeat received for each device. If the solution does not receive a heartbeat message within the expected time from the device, it assumes that there is a problem with the device.
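A minimal device-side sketch of the heartbeat pattern, here using the Node.js azure-iot-device SDK (the connection string environment variable and the one-hour interval are placeholders):

// Sketch: the device sends an empty "heartbeat" message once per hour so
// the service side can track liveness.
// Requires: npm install azure-iot-device azure-iot-device-mqtt
const { Client, Message } = require('azure-iot-device');
const { Mqtt } = require('azure-iot-device-mqtt');

const client = Client.fromConnectionString(
    process.env.IOTHUB_DEVICE_CONNECTION_STRING, Mqtt);

setInterval(() => {
  const heartbeat = new Message(JSON.stringify({}));  // empty payload
  // Application property that identifies this message as a heartbeat.
  heartbeat.properties.add('messageType', 'heartbeat');
  client.sendEvent(heartbeat, (err) => {
    if (err) console.error('heartbeat failed:', err.toString());
  });
}, 60 * 60 * 1000);  // at least once every hour

On the service side, the solution would update its per-device "last heartbeat" map whenever one of these messages arrives, and flag a device as suspect when the expected interval elapses without one.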

How to get the crash log for Receiver apps that are crashed on the chromecast device?

I am looking for a way to get the crash log for my receiver app running on a Chromecast device.
I have a custom sender app and a receiver app. I use a player to fetch the content URL from the sender app and play back the content in the receiver app. After some time of playback and data streaming between the sender and receiver apps, the receiver app suddenly crashes and sends a TCP packet with the RST flag set (meaning the receiver app has closed the connection). I tried to check the debug logging in the debugger tool, but as the receiver app is terminated by the crash, the debugger tool gets disconnected.
How can I get the crash log and error log of the receiver app running on the Chromecast device in this scenario?
You can try remote debugging the Chromecast device by navigating to your receiver's IP address on port 9222:
http://<receiver-ip>:9222
The logs will still be available on port 9222, and you can view them there.
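If the crash tears down the session before you can read anything, one option (a sketch, assuming a CAF receiver with Google's debug-layer script also included on the page) is to enable the Cast Debug Logger and log player errors as they happen, so they are emitted to the remote debugger and the on-screen overlay before the app dies:

// Sketch: enable verbose receiver-side logging so errors are captured
// before a crash. Assumes the debug layer script is also loaded:
// <script src="//www.gstatic.com/cast/sdk/libs/devtools/debug_layer/caf_receiver_logger.js"></script>
const context = cast.framework.CastReceiverContext.getInstance();
const playerManager = context.getPlayerManager();
const castDebugLogger = cast.debug.CastDebugLogger.getInstance();

context.addEventListener(cast.framework.system.EventType.READY, () => {
  castDebugLogger.setEnabled(true);     // start capturing logs
  castDebugLogger.showDebugLogs(true);  // overlay logs on the TV screen
});

// Log player errors as they occur; 'MyReceiver' is an arbitrary tag.
playerManager.addEventListener(
    cast.framework.events.EventType.ERROR, (event) => {
      castDebugLogger.error('MyReceiver',
          'Player error: ' + JSON.stringify(event));
    });

context.start();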

How come it is always the GATT server that exposes services?

Looking at various GATT-based profiles, it seems that services are always exposed in the GATT server rather than the GATT client. For instance, the Time Profile (TIP) has the server exposing the Current Time Service (CTS). So, if a phone is to update a heart rate monitor with the current time using TIP, the phone will be the server whereas the monitor will be the client. But since it is a heart rate monitor, the Heart Rate Profile expects the monitor to be a GATT server.
So, for a monitor that takes the current time from a phone, should it be a GATT client or server? Should it be set as a client whilst time syncing with the phone and set as a server otherwise? Should a custom profile be implemented such that the CTS is exposed in the client instead?
Thanks
The Generic Attribute Profile (GATT) defines how server and client communicate with each other using the Attribute Protocol for the purpose of transporting data. Client and server roles are determined when a procedure is initiated and released when the procedure ends. Hence, a device can act in both roles at the same time.
I would suggest reading the Bluetooth Core Specification; Part G, Section 2.2 explains the roles and configurations.
Client—This is the device that initiates commands and requests towards the server and can receive responses, indications and notifications sent by the server.
Server—This is the device that accepts incoming commands and requests from the client and sends responses, indications and notifications to a client.
Back to your question:
The Time Profile enables the device to get the date, time, time zone, and DST information, and to control functions related to the time.
In your case, the monitor will be the GATT client when it takes the time from a phone. However, it can be a server at the same time for another procedure (operation, request etc.) with the phone.
In short, client and server roles are not fixed to the devices. When your phone exposes the current time, it will be the server. Similarly, when it gets the current time from the monitor, it will be the client. There is no need to customize the profile. If you want your phone to get the current time from one device and expose it to another, just implement both the client and server roles of the same profile on your phone.
EDIT:
According to the TIP profile spec, to get the current time information, the GATT Read Characteristic Value sub-procedure shall be used with the handle of the Current Time Characteristic. The monitor, as a client, will read the Current Time Characteristic from the GATT table of the server (in this case, the phone). As soon as the monitor retrieves the value from the phone, it can update its own Current Time Characteristic value and expose it to its environment in three ways:
Notifying it to its subscribed clients (BLE notifications). If you do it this way, you will be customizing the Bluetooth TIP profile, since this procedure is not defined there (I had a quick look at the document and didn't see it).
Broadcasting it in the advertisement packet (doesn't require a BLE connection).
Having another BLE device connect to the monitor and read the Current Time Characteristic value. This is the recommended way if you want to use the Bluetooth SIG-defined TIP profile as a server.
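As a sketch of the client-side read described above, here is what it looks like with the Web Bluetooth API (standing in for the monitor's embedded BLE stack; 'current_time' is the SIG-assigned name for both the Current Time Service, 0x1805, and the Current Time Characteristic, 0x2A2B):

// Sketch: act as a GATT client and read the Current Time Characteristic
// from a peer that exposes the Current Time Service.
async function readCurrentTime() {
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: ['current_time'] }],  // CTS, UUID 0x1805
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService('current_time');
  const characteristic =
      await service.getCharacteristic('current_time');  // 0x2A2B

  // The GATT Read Characteristic Value sub-procedure.
  const value = await characteristic.readValue();

  // Exact Time 256 layout: uint16 year (little-endian), then month, day,
  // hours, minutes, seconds as single bytes.
  return new Date(
      value.getUint16(0, /* littleEndian= */ true),  // year
      value.getUint8(2) - 1,                         // month (1-12 -> 0-11)
      value.getUint8(3),                             // day
      value.getUint8(4),                             // hours
      value.getUint8(5),                             // minutes
      value.getUint8(6));                            // seconds
}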

Controlling multiple Chromecast Receivers from one Sender?

Is it possible to write a Chrome extension (or Android app) that creates multiple Senders, each connecting to a different Receiver?
In other words, I need to build an interface from which an operator can control the streams on multiple different Chromecasts in the vicinity - each will be playing a different video stream.
I understand from other posts that the chrome.cast API does not allow for this - that the Chrome extension may act as a single Sender only? This restriction seems arbitrary - I read somewhere that someone was able to control two devices by running two different versions of Chrome, so if this restriction exists in the Chrome API, it's not due to any limitation of the underlying protocol, correct? (What then, politics?)
Is there a lower-level API (perhaps on Android?) that would permit you to create multiple Senders and connect them to different Receivers?
I've seen some apps (such as Videostream) which appear to continue to run on the Receiver after you've closed the Sender. Might it be possible to, for example, launch a Receiver app on multiple devices, one at a time, have them identify themselves and connect to a local webserver, e.g. via WebSockets, and then have my webserver send messages to those Receiver apps to ask them to change videostreams?
As a last resort, is there an open specification of the underlying protocol?
There is nothing to stop you from writing a sender app that connects to a Chromecast, launches an app, and then disconnects from that device while letting the Chromecast continue running the app; you would need to make sure that you do not stop the receiver when it detects that there are no connected devices. Then, on the sender side, you can repeat the same process, but this time connect to a second device, and so on. The important thing to keep in mind is that your sender device cannot hold multiple concurrent connections to multiple devices (MediaRouter is a global instance); this means you cannot receive messages (status updates, etc.) from any Cast device except the one you are directly connected to at that time. Also, there is nothing to stop a different user from connecting to one of these devices and launching a different app.
To answer your other question, the underlying protocol is not open.
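On the receiver side, the "keep running after the sender disconnects" part might look like this sketch (CAF receiver; disableIdleTimeout is a real CastReceiverOptions flag, the rest is illustrative):

// Sketch: keep a CAF receiver alive after the last sender disconnects, so
// a sender can launch the app, disconnect, and move on to the next device
// while this one keeps playing.
const context = cast.framework.CastReceiverContext.getInstance();

context.addEventListener(
    cast.framework.system.EventType.SENDER_DISCONNECTED, (event) => {
      // Deliberately take no action here: per the answer above, the
      // receiver must not stop itself when no senders remain connected.
      console.log('Sender disconnected:', event.senderId);
    });

const options = new cast.framework.CastReceiverOptions();
options.disableIdleTimeout = true;  // don't time out when playback is idle
context.start(options);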

How to check if a ChromeCast Session is already in progress

The use case is that a user starts playback from their iPhone, let's say, and then picks up their iPad (both running my app) and wants to connect to and control the running video from this other iOS device.
On iOS, I do not see any way to determine if there is an instance of my receiver app already running on the Google Chromecast device. Once I create my session, it seems the only thing I can do is attach a new protocol message stream, which interrupts whatever might already be playing.
Is this supposed to be handled in the iOS client-side framework, or is there some coding I need to do in the HTML receiver app?
Thanks.
There is a way outside the API to determine if an app is running. Do an HTTP GET on the apps URL at the Chromecast's IP address: http://192.168.0.x:8008/apps/
If the HTTP response is 200, nothing is running. If the HTTP response is 204, then an app is running and the response will be redirected to a URL like http://192.168.0.x:8008/apps/GoogleMusic, which tells you which app is running.
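A quick sketch of that probe (Node 18+ global fetch; the address is a placeholder for the device's LAN IP, and this legacy DIAL-style endpoint on port 8008 may no longer exist on current firmware):

// Sketch: probe the Chromecast's legacy apps endpoint to see whether an
// app is running, mirroring the 200/204 behavior described above.
async function runningApp(ip) {
  const res = await fetch(`http://${ip}:8008/apps/`, { redirect: 'manual' });
  if (res.status === 200) return null;  // nothing running
  // 204: an app is running; the redirect target names it,
  // e.g. http://192.168.0.x:8008/apps/GoogleMusic
  return res.headers.get('location');
}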
Interestingly, Google Play Music cannot be controlled by two devices simultaneously, but YouTube can. I suspect Play Music is using RAMP, which is what the Cast SDK uses for media streams, while YouTube could be using a proprietary message stream to control media playback. So you might have to do the same if you want an app on a device to be controlled by multiple sender apps.
One method is to check the playStatus after you start your session and before you initiate a loadMedia(). If your app is already running, it should return a non-nil result (i.e. IDLE, PLAYING, ...).
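The question is about iOS, but the same idea in the Web Sender framework looks roughly like this sketch (assuming the Cast Web Sender library is loaded; 'YOUR_APP_ID' is a placeholder), joining the existing session instead of interrupting it:

// Sketch: join an existing session and check for a running media session
// before loading new media.
const castContext = cast.framework.CastContext.getInstance();
castContext.setOptions({
  receiverApplicationId: 'YOUR_APP_ID',
  // Automatically join a session started by the same app elsewhere.
  autoJoinPolicy: chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED,
});

const session = castContext.getCurrentSession();
const mediaSession = session && session.getMediaSession();
if (mediaSession) {
  // Media is already playing; attach controls instead of calling load().
  console.log('Existing playback state:', mediaSession.playerState);
}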
