How can I retrieve the audio data sent to Chromecast Audio on Linux?

For verification purposes, is there any way to retrieve the audio data sent to a Chromecast Audio device on the Linux platform?
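When the sender app casts a plain media URL (rather than mirroring local audio), one way to check the audio is to ask the device what it is currently playing and fetch the same stream yourself. Below is a rough sketch, assuming the third-party node-castv2-client package and a placeholder device IP; it will not work for mirrored or DRM-protected streams.

```typescript
// Sketch using the third-party castv2-client npm package (no official
// typings, hence require + any). '192.168.1.50' is a placeholder for the
// Chromecast Audio's IP address.
const { Client, DefaultMediaReceiver } = require('castv2-client');

const client = new Client();
client.connect('192.168.1.50', () => {
  client.getSessions((err: Error | null, sessions: any[]) => {
    if (err || sessions.length === 0) {
      console.error('no active cast session');
      return client.close();
    }
    // Join the running media session instead of launching a new receiver.
    client.join(sessions[0], DefaultMediaReceiver, (e: Error | null, player: any) => {
      player.getStatus((_: Error | null, status: any) => {
        // contentId usually holds the stream URL the device is pulling;
        // fetch that URL yourself to inspect the audio data.
        console.log(status?.media?.contentId);
        client.close();
      });
    });
  });
});
```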

Related

Capture video frames of DRM-protected video in Chrome

I'm building a video player for my application, which will play DRM-protected videos (Widevine on Chrome and Firefox) using dash.js and video.js. As part of this application I want to be able to annotate the video and send the data back to the server.
Annotation data should be attached to a specific frame of the video rather than to timestamps, and the application should send the frame and its related annotation data to the server. Is it possible to capture the raw frames of a Widevine DRM-protected video in Chrome or Firefox and send them to the server using WebGL?
No, it's not possible to capture the decrypted frames of a DRM-protected video; allowing that would largely defeat the purpose of the DRM system. It's only possible for non-DRM-protected streams.

Can anyone explain how voice commands work via a Bluetooth remote (the Nexus Player remote) on Android (Nexus Player)?

Can anyone please elaborate on the following questions?
How does the Bluetooth stack handle audio data?
How are audio commands processed?
Do we need any service to process the audio data?
Thanks in advance.
Basically, voice commands over BLE require:
an audio codec to reduce the required bandwidth (ADPCM and SBC are common, Opus is emerging),
a method for streaming the audio over BLE,
decoding, plus a way to get the audio stream from the BLE daemon into a command-processing framework.
In the Android world, the command-processing framework is Google sauce (closed source) that most easily gets its audio from an ALSA device. What remains to be done is getting the audio from the remote into an ALSA device.
So for audio streaming, you either:
use a custom L2CAP channel or a custom GATT service; this requires a custom Android service app and/or modifications to Bluedroid to handle it, plus a way to inject the audio stream into ALSA, most likely through a "loopback" audio device driver, or
declare the audio as custom HID reports; Bluedroid then injects them back into the kernel, and a custom HID driver processes these reports and exposes an audio device.
Audio over BLE is not standardized, so implementations differ. The Nexus Player uses the HID approach: it streams ADPCM audio chunked into HID reports. A dedicated HID driver in the Android Linux kernel, hid-atv-remote.c, exposes an ALSA device in addition to the input device. Bluedroid knows nothing about the audio; all it does is forward HID reports from BLE to UHID. A decoder sketch for this kind of ADPCM payload follows.
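To make the codec step concrete, here is a minimal IMA ADPCM decoder sketch. The step/index tables are the standard IMA ones; the report framing, nibble packing order, and initial predictor/index state are illustrative assumptions, not the Nexus Player's actual report layout (real devices typically carry the initial state in each report header).

```typescript
// Minimal IMA ADPCM decoder sketch: packed 4-bit nibbles in, 16-bit PCM out.

const STEP_TABLE: number[] = [
  7, 8, 9, 10, 11, 12, 13, 14, 16, 17, 19, 21, 23, 25, 28, 31, 34, 37,
  41, 45, 50, 55, 60, 66, 73, 80, 88, 97, 107, 118, 130, 143, 157, 173,
  190, 209, 230, 253, 279, 307, 337, 371, 408, 449, 494, 544, 598, 658,
  724, 796, 876, 963, 1060, 1166, 1282, 1411, 1552, 1707, 1878, 2066,
  2272, 2499, 2749, 3024, 3327, 3660, 4026, 4428, 4871, 5358, 5894, 6484,
  7132, 7845, 8630, 9493, 10442, 11487, 12635, 13899, 15289, 16818, 18500,
  20350, 22385, 24623, 27086, 29794, 32767,
];
const INDEX_TABLE: number[] = [-1, -1, -1, -1, 2, 4, 6, 8];

// Assumption: high nibble first and zeroed initial predictor/step index.
function decodeImaAdpcm(data: Uint8Array): Int16Array {
  let predictor = 0;
  let index = 0;
  const pcm = new Int16Array(data.length * 2);
  let out = 0;
  for (let i = 0; i < data.length; i++) {
    for (const nibble of [data[i] >> 4, data[i] & 0x0f]) {
      const step = STEP_TABLE[index];
      // Reconstruct the difference from the 3 magnitude bits...
      let diff = step >> 3;
      if (nibble & 1) diff += step >> 2;
      if (nibble & 2) diff += step >> 1;
      if (nibble & 4) diff += step;
      // ...apply it with the sign bit, then clamp to the 16-bit range.
      predictor += nibble & 8 ? -diff : diff;
      predictor = Math.max(-32768, Math.min(32767, predictor));
      index = Math.max(0, Math.min(88, index + INDEX_TABLE[nibble & 7]));
      pcm[out++] = predictor;
    }
  }
  return pcm;
}
```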

Visualizing waveform for stored (not real-time) audio using Web Audio API

I'm using the MediaRecorder API to record a media stream from a microphone, and I want to draw a visualization of the recorded audio on a canvas. The Web Audio API's AnalyserNode visualizes real-time audio from a stream.
Is there a way to visualize the waveform of static/stored (as opposed to real-time) audio using the Web Audio API's AnalyserNode?
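One common approach sidesteps AnalyserNode entirely for stored audio: decode the recording with decodeAudioData, read the raw samples with getChannelData, and draw per-pixel min/max columns. A minimal sketch follows; the canvas id "waveform" and the single-channel drawing are assumptions for illustration.

```typescript
// Sketch: render a stored recording's waveform on a canvas, without
// AnalyserNode. Assumes a <canvas id="waveform"> exists in the page and
// that the Blob was assembled from MediaRecorder's dataavailable events.
async function drawWaveform(recording: Blob): Promise<void> {
  const canvas = document.getElementById('waveform') as HTMLCanvasElement;
  const ctx = canvas.getContext('2d')!;
  const audioCtx = new AudioContext();

  // Decode the entire recording up front; no real-time graph is needed.
  const buffer = await audioCtx.decodeAudioData(await recording.arrayBuffer());
  const samples = buffer.getChannelData(0); // first channel only

  // For each pixel column, draw a vertical line from that column's
  // minimum sample to its maximum sample.
  const perPixel = Math.max(1, Math.floor(samples.length / canvas.width));
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.beginPath();
  for (let x = 0; x < canvas.width; x++) {
    let min = 1.0;
    let max = -1.0;
    for (let i = x * perPixel; i < (x + 1) * perPixel && i < samples.length; i++) {
      min = Math.min(min, samples[i]);
      max = Math.max(max, samples[i]);
    }
    ctx.moveTo(x + 0.5, ((1 - max) / 2) * canvas.height);
    ctx.lineTo(x + 0.5, ((1 - min) / 2) * canvas.height);
  }
  ctx.stroke();
}
```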

How to query the capability of the audio output channel configuration via the Android API?

I am writing a sound player to play multi-channel audio files, and I need to know whether the running Android device can physically support multi-channel playback, i.e. whether the final output will be downmixed to 2.0 stereo or not.
Is there an API to get this information?
For example, some devices can output audio via an MHL or HDMI interface; in that case a multi-channel capability query should return true. Other devices will always downmix audio to stereo; in that case it should return false.
Thanks~

HDMI Passthrough for Audio

I have a stream that I want to cast to a smart TV, but it contains an audio format that Chromecast does not currently support. The smart TV itself is capable of playing this audio format. Is there any way I can tell the Chromecast to pass this audio through over HDMI?
No, there is no such functionality in Chromecast.
