Stream Audio to Google Cast Audio from Raspberry Pi

I'd like to send audio directly from a Raspberry Pi to a Google Cast Audio device.
I'm happy to do this via a fileserver hosted on the Pi (or similar), but no external services can be used; no internet connection is available during operation. Is this possible?

Using the Google Cast SDK, you would need a "sender" application to launch your app on a Cast device and send instructions to it. After that is done, your (HTML5) application on the Cast device can continue playback on its own. Whenever the application on the Cast device is brought down, you would need a sender to restart this process. The platforms supported by the Cast SDK are Android, iOS and Chrome, so the Raspberry Pi is not among them. Another issue in your scenario is that any Cast device requires internet connectivity; without it, the device will not boot up completely and won't be usable.

Related

Establish a connection between smartphone and PC via Bluetooth automatically

I'm trying to establish a connection between my PC running Ubuntu and my iPhone via Bluetooth automatically when it becomes available, after the devices have been paired manually beforehand. I've seen this work with certain peripherals, mainly audio: for example, my phone will automatically connect to a Bluetooth speaker when it is turned on and Bluetooth is active on my phone, and it automatically connects to my car's radio system via Bluetooth when I turn the car on.
I'm not able to connect my phone to my PC without first initiating the connection from the smartphone's Bluetooth menu. I'm thinking that I could possibly write an application for the PC to attempt to connect to the device every few minutes or something, but it seems that the phone needs to be the device to initiate the connection.
The only information that I need for what I'm trying to do ultimately is that the devices can pair successfully. Essentially I'm trying to build a sort of proximity trigger between my phone and my PC without using Wi-Fi and GPS - I can't use these for some specific reasons.
Is there any way to make this happen?
Yes, this should be doable as long as you use the Background Processing feature for iOS apps. In the example below, we'll have the PC be the peripheral and the phone be the central, but you can really have it working either way. You will need to do the following:-
The first connection needs to be performed in the foreground (this is due to iOS's background-execution limitations).
On the iOS side, you need an application that acts as a central that scans for and connects to the remote device (check this example as a starting point).
Upon connection, you need to bond with the PC. Bonding is important as it will prevent you from having to do the pairing again in the future. However, pairing/bonding is managed by the iPhone's OS, so you cannot trigger it from your application; the workaround is to have an encrypted characteristic on the PC side that will force the iPhone to bond (this is covered later).
On the PC side, you need to have a BlueZ script that acts as a peripheral that is always advertising. You can do this using bluetoothctl (check the examples here and here).
Before you start advertising, you need to have a GATT server on the PC side (to do this, check this example).
When registering characteristics, ensure that one of them has the encrypt-read property (you can find a full list of the properties here); a condensed sketch of such a server is shown after these steps.
Now, when you attempt to read this characteristic from the iOS side, the two devices should bond (make sure that your PC is bondable, which you can do via these commands).
Once the devices are paired, your iOS app needs to run in the background, constantly scanning and attempting to connect to the same peripheral (have a look at this and this example).
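Putting the PC-side pieces together, here is a condensed sketch of a GATT server exposing a single encrypt-read characteristic, modelled on the structure of the official BlueZ example-gatt-server (Python over D-Bus; the UUIDs, object paths and returned value are placeholders of my own, and advertising, agent setup and error handling are not shown):

```python
import dbus, dbus.mainloop.glib, dbus.service
from gi.repository import GLib

BLUEZ = 'org.bluez'
ADAPTER_PATH = '/org/bluez/hci0'                          # assumption: first adapter
GATT_MANAGER_IFACE = 'org.bluez.GattManager1'
GATT_SERVICE_IFACE = 'org.bluez.GattService1'
GATT_CHRC_IFACE = 'org.bluez.GattCharacteristic1'
SERVICE_UUID = '12345678-1234-5678-1234-56789abcdef0'     # placeholder UUIDs
CHRC_UUID    = '12345678-1234-5678-1234-56789abcdef1'

class Application(dbus.service.Object):
    """Root object exposing the GATT hierarchy via ObjectManager."""
    PATH = '/example/app'

    def __init__(self, bus):
        super().__init__(bus, self.PATH)
        self.service = BondService(bus)

    @dbus.service.method('org.freedesktop.DBus.ObjectManager',
                         out_signature='a{oa{sa{sv}}}')
    def GetManagedObjects(self):
        return {
            self.service.path: self.service.get_properties(),
            self.service.chrc.path: self.service.chrc.get_properties(),
        }

class BondService(dbus.service.Object):
    def __init__(self, bus):
        self.path = '/example/app/service0'
        super().__init__(bus, self.path)
        self.chrc = BondCharacteristic(bus, self)

    def get_properties(self):
        return {GATT_SERVICE_IFACE: {'UUID': SERVICE_UUID, 'Primary': True}}

class BondCharacteristic(dbus.service.Object):
    def __init__(self, bus, service):
        self.path = service.path + '/char0'
        self.service = service
        super().__init__(bus, self.path)

    def get_properties(self):
        return {GATT_CHRC_IFACE: {
            'UUID': CHRC_UUID,
            'Service': dbus.ObjectPath(self.service.path),
            # encrypt-read forces the central (the iPhone) to pair/bond
            # before it is allowed to read this value.
            'Flags': ['encrypt-read'],
        }}

    @dbus.service.method(GATT_CHRC_IFACE, in_signature='a{sv}', out_signature='ay')
    def ReadValue(self, options):
        return dbus.Array([dbus.Byte(0x42)], signature='y')  # placeholder value

def main():
    dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
    bus = dbus.SystemBus()
    app = Application(bus)
    manager = dbus.Interface(bus.get_object(BLUEZ, ADAPTER_PATH), GATT_MANAGER_IFACE)
    manager.RegisterApplication(dbus.ObjectPath(app.PATH), {},
                                reply_handler=lambda: print('GATT application registered'),
                                error_handler=lambda e: print('Failed to register:', e))
    GLib.MainLoop().run()

if __name__ == '__main__':
    main()
```

With this running, and the PC set to advertise and be bondable as described above, an encrypted read from the iPhone should trigger the pairing prompt and bond the two devices.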
You can find more useful information at the links below:-
Getting started with Bluetooth Low Energy
The Ultimate Guide to CoreBluetooth Development
How to manage Bluetooth devices on Linux using bluetoothctl

Read data from non-connectable BLE sensor?

I have a BLE thermometer and I'd like to record the data from it, as the monitoring application (available for both Android and iOS) that comes with it doesn't do that; it merely displays the data on the screen for a limited time.
I tried all the BLE apps I could find on both Android (I have a Nexus 5 with Android 4.4) and an iPad with iOS 7.1, but while some discovered the device, none of them could display any data, because they all try to connect to the sensor whereas the sensor returns 0 for kCBAdvDataIsConnectable.
Is this possible? It must be, because its own app does it, but I'm not too familiar with BLE and I may be missing something. I was surprised that none of the apps in the App/Play Store had the feature of "listening" to what a device sends without connecting. But then again, I'm not very familiar with BLE.
A few pointers are appreciated.
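For anyone in the same situation: a non-connectable sensor like this typically broadcasts its readings inside the advertisement itself, so "listening" means scanning passively rather than connecting. A minimal sketch from a PC using the Python bleak library (the library choice is mine, and how the temperature is packed into the manufacturer or service data depends on the specific thermometer):

```python
import asyncio
from bleak import BleakScanner

def on_advertisement(device, adv):
    # Non-connectable sensors put their readings in the advertisement,
    # usually in manufacturer-specific or service data.
    if adv.manufacturer_data or adv.service_data:
        print(device.address, adv.local_name, adv.manufacturer_data, adv.service_data)

async def main():
    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(30)      # listen for 30 seconds
    await scanner.stop()

if __name__ == "__main__":
    asyncio.run(main())
```

On iOS the same data arrives in the advertisementData dictionary of CoreBluetooth's centralManager(_:didDiscover:advertisementData:rssi:) callback, with no connection required.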

Does the content stream from WiFi directly to Chromecast, or from WiFi to Android to Chromecast?

Does the content stream from WiFi directly to Chromecast, or from WiFi to Android (or any other device) to Chromecast?
I know that other devices can be used to control the Chromecast, but I just want to know whether the content has to stream through your mobile, which would matter for, say, battery life.
The mobile device is only used for the initial content discovery phase where the user selects a video stream to watch. Once playback is initiated, the Chromecast device connects to the internet over WiFi and downloads the stream data without requiring the mobile device to be turned on.
More details available on the official Chromecast developer site.
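To illustrate that division of labour, here is a minimal sender sketch using the third-party pychromecast library (my addition, not from the official docs): the sender only hands the device a URL and a MIME type, and the Chromecast fetches the stream itself from then on.

```python
import pychromecast

# Discover Chromecasts on the local network (mDNS).
chromecasts, browser = pychromecast.get_chromecasts()
cast = chromecasts[0]          # assumption: at least one device was found
cast.wait()                    # wait until the device connection is ready

mc = cast.media_controller
# The sender only passes a URL and a MIME type; the Chromecast itself
# downloads the stream over its own WiFi connection from here on.
mc.play_media("http://example.com/video.mp4", "video/mp4")
mc.block_until_active()

# The sender can now disconnect or power off without stopping playback.
browser.stop_discovery()
```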

Build own Chromecast device

The Chromecast device is a "receiver device [that] runs a scaled-down Chrome browser with a receiver application". Can I download and install this receiver app in a Chrome browser, for example on my Windows notebook?
I have implemented a complete Chromecast v2 receiver, called YouMap ChromeCast Receiver, available in the Google Play store and the Amazon store; the xda-developers thread is here: http://forum.xda-developers.com/android-tv/chromecast/app-youmap-chromecast-receiver-android-t3161851
The current Chromecast protocol is completely different from the original DIAL-based protocol. Right now, only YouTube still uses the old protocol, for which the Chromecast maintains backward compatibility.
Discovery is via mDNS, exactly the same as the Apple TV's Bonjour protocol.
The most difficult part is device authentication: the sender and the receiver perform handshakes by exchanging keys and certificates in a way that is extremely difficult to crack. The Apple TV does the same using FairPlay encryption.
The next difficult part is the mirroring protocol, which is also very complicated; you need to deal with packet splitting and packet retransmission. Overall, the Chromecast mirroring protocol is well designed, better than Miracast and better than AirPlay mirroring (I have also implemented both of those, so I know what I am talking about).
When I get a chance, I will write more here.
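To make the discovery point concrete, here is a minimal sketch that browses for Chromecast devices over mDNS using the Python zeroconf library (the library choice is mine; Chromecasts advertise the _googlecast._tcp service, and the "fn" TXT record carries the friendly name):

```python
from zeroconf import ServiceBrowser, ServiceListener, Zeroconf

CAST_SERVICE = "_googlecast._tcp.local."

class CastListener(ServiceListener):
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info:
            # TXT records carry metadata such as the friendly name ("fn").
            print(name, info.parsed_addresses(), info.port, info.properties.get(b"fn"))

    def update_service(self, zc, type_, name):
        pass

    def remove_service(self, zc, type_, name):
        print("Removed:", name)

if __name__ == "__main__":
    zc = Zeroconf()
    ServiceBrowser(zc, CAST_SERVICE, CastListener())
    try:
        input("Browsing for Chromecast devices, press Enter to exit...\n")
    finally:
        zc.close()
```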
The Chromecast device works using the DIAL protocol. It is completely possible to emulate this protocol using some simple code to listen on the multicast group for discovery and then handle the HTTP requests to launch applications. It is then the launched application that communicates with the casting device, I believe using the RAMP protocol.
Luckily for us, the applications that the Chromecast device uses are mostly web applications, meaning our device emulator just needs to launch a web browser and point it at a specific URL when it receives an application request.
For example, with the YouTube app, after device discovery and establishing where the applications are located (part of DIAL), the sender will issue an HTTP POST request containing a pairing key to /<apps url>/YouTube. All the emulating device needs to do now is open https://www.youtube.com/tv?<pairing key> in a browser window. From here, I believe, communication for controlling the YouTube app is not sent through the DIAL connection but through the open tabs on the casting device and the emulator.
This is my understanding of how the Chromecast device works, and specifically the YouTube app, from looking at https://github.com/dz0ny/leapcast, which is a Python emulator that has YouTube and Google Music working.
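As a sketch of the discovery half of DIAL described above, the emulator can join the SSDP multicast group and answer M-SEARCH probes for the DIAL service type (Python; the LOCATION URL, IP, port and UUID are placeholders, and a real emulator such as leapcast also has to serve the device description and the /apps/... endpoints over HTTP):

```python
import socket

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900
DIAL_ST = "urn:dial-multiscreen-org:service:dial:1"
# Placeholder: a real emulator serves a UPnP device description at this URL.
LOCATION = "http://192.168.1.50:8008/ssdp/device-desc.xml"

def ssdp_responder():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", SSDP_PORT))
    # Join the SSDP multicast group so we receive M-SEARCH probes.
    mreq = socket.inet_aton(SSDP_ADDR) + socket.inet_aton("0.0.0.0")
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, addr = sock.recvfrom(4096)
        msg = data.decode(errors="ignore")
        # Only answer searches for the DIAL service type.
        if msg.startswith("M-SEARCH") and DIAL_ST in msg:
            reply = (
                "HTTP/1.1 200 OK\r\n"
                "CACHE-CONTROL: max-age=1800\r\n"
                f"LOCATION: {LOCATION}\r\n"
                f"ST: {DIAL_ST}\r\n"
                f"USN: uuid:00000000-0000-0000-0000-000000000000::{DIAL_ST}\r\n"
                "\r\n"
            )
            sock.sendto(reply.encode(), addr)

if __name__ == "__main__":
    ssdp_responder()
```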
Google is in the process of open-sourcing some parts of the Chromecast.
https://code.google.com/p/chromium/codesearch#chromium/src/chromecast/
https://code.google.com/p/chromium/issues/list?q=label:Chromecast
So theoretically you can build a similar device.

Data Exchange between applications over ActiveSync

Can anyone tell me how to send and receive data between two applications over an ActiveSync connection?
In my scenario there will be one application running on a desktop and another on a Windows Mobile device, and both these applications need to communicate with each other. The connection between the desktop and the mobile device can be ActiveSync over USB or Bluetooth. I need the applications to exchange a continuous stream of data, more like a chat application. Ideally, the mobile device application will be sending out data 10-15 times a second (maybe more) and the desktop application will receive the data and display it.
For example, consider the ‘Notes’ application for the mobile device. Basically it allows the user to save small textual notes. My application would be something similar, with the exception that it will send all the input it receives to the desktop application. The desktop app will receive the ‘inputs’ and process them.
Finally, I'm open to using any option other than ActiveSync, provided it supports Bluetooth.
You should check out the ActiveSync API documentation for information.
There is also an alternative solution, which I use: Windows Mobile brings up a temporary LAN when the device is connected over USB, so you can use Windows sockets for the communication and avoid ActiveSync, if that's not too much trouble for you. Usually, the device gets the IP 169.254.2.1 and the PC gets 169.254.2.2.
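As a sketch of that alternative, the desktop side can simply listen on a TCP port and the mobile application can connect to 169.254.2.2 over the ActiveSync link (Python is used here only for illustration; the port number and newline-delimited framing are my assumptions, and the desktop app could equally be written with .NET or Winsock):

```python
import socket

HOST = ""          # listen on all interfaces, including the 169.254.2.2 ActiveSync link
PORT = 5555        # arbitrary port chosen for this sketch

def run_desktop_listener():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        print(f"Waiting for the mobile device on port {PORT}...")
        conn, addr = srv.accept()
        with conn:
            print("Device connected from", addr)
            # The mobile app sends newline-delimited text 10-15 times a second;
            # read and display each message as it arrives.
            buf = b""
            while True:
                chunk = conn.recv(1024)
                if not chunk:
                    break
                buf += chunk
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    print("Received:", line.decode(errors="replace"))

if __name__ == "__main__":
    run_desktop_listener()
```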
