I have a stream that I want to cast to a smart TV, but the stream contains an audio format that Chromecast does not currently support. The smart TV itself is capable of playing this format. Is there any way to tell the Chromecast to pass this audio through over HDMI?
No, there is no such functionality in Chromecast.
Our TV has trouble outputting sound to our speakers, which I can connect to a Google Chromecast Audio instead. We use a regular Chromecast on the TV, and I was wondering whether the TV's audio could be routed to the speakers. Thanks!
Not really.
With some hacks you could get this sort of working, but the two devices will never be in sync. Even if you synchronized the Chromecast devices perfectly, the TV itself would still add latency.
For verification purposes, is there any way to retrieve, on Linux, the audio data sent to a Chromecast Audio?
Can anyone please elaborate on the following questions?
How does the Bluetooth stack handle audio data?
How are audio commands processed?
Do we need any service to process the audio data?
Thanks in advance.
Basically, voice commands over BLE require:
an audio codec to reduce the required bandwidth (ADPCM and SBC are common; Opus is emerging),
an audio streaming method over BLE,
decoding, and getting the audio stream from the BLE daemon to a command-processing framework.
In the Android world, the command-processing framework is Google sauce (closed source) that most easily gets its audio from an ALSA device. What is left to do is getting audio from the remote into an ALSA device.
So for audio streaming, either you:
use a custom L2CAP channel or a custom GATT service; this requires a custom Android service app and/or modifications to Bluedroid to handle it, plus a way to inject the audio stream into ALSA, most likely via a "loopback" audio device driver, or
declare the audio as custom HID reports; this way Bluedroid injects them back into the kernel, and you then add a custom HID driver that processes these reports and exposes an audio device.
Audio over BLE is not standardized, so implementations do not all do the same thing. In the Nexus Player's case, the implementation uses HID: it streams ADPCM audio, chunked into HID reports. A special HID driver, "hid-atv-remote.c" in the Android Linux kernel, exposes an ALSA device in addition to the input device. Bluedroid knows nothing about the audio; all it does is forward HID reports from BLE to UHID.
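To make that pipeline concrete, here is a minimal sketch in Python of decoding IMA ADPCM codes carried in HID report payloads into 16-bit PCM samples. The step and index tables are the standard IMA ADPCM ones, but the report payload layout (two 4-bit codes per byte, high nibble first) and the way decoder state is carried across reports are my assumptions for illustration, not taken from hid-atv-remote.c or the remote's actual protocol:

```python
# Standard IMA ADPCM step-size and index-adjustment tables.
STEP_TABLE = [
    7, 8, 9, 10, 11, 12, 13, 14, 16, 17, 19, 21, 23, 25, 28, 31,
    34, 37, 41, 45, 50, 55, 60, 66, 73, 80, 88, 97, 107, 118, 130, 143,
    157, 173, 190, 209, 230, 253, 279, 307, 337, 371, 408, 449, 494, 544,
    598, 658, 724, 796, 876, 963, 1060, 1166, 1282, 1411, 1552, 1707,
    1878, 2066, 2272, 2499, 2749, 3024, 3327, 3660, 4026, 4428, 4871,
    5358, 5894, 6484, 7132, 7845, 8630, 9493, 10442, 11487, 12635, 13899,
    15289, 16818, 18500, 20350, 22385, 24623, 27086, 29794, 32767,
]
INDEX_TABLE = [-1, -1, -1, -1, 2, 4, 6, 8, -1, -1, -1, -1, 2, 4, 6, 8]

def decode_ima_adpcm(codes, predictor=0, index=0):
    """Decode 4-bit IMA ADPCM codes into 16-bit PCM samples.

    `predictor` and `index` carry decoder state across calls, so a
    stream chunked into many HID reports decodes continuously.
    """
    samples = []
    for code in codes:
        step = STEP_TABLE[index]
        diff = step >> 3
        if code & 1:
            diff += step >> 2
        if code & 2:
            diff += step >> 1
        if code & 4:
            diff += step
        predictor += -diff if code & 8 else diff
        predictor = max(-32768, min(32767, predictor))  # clamp to int16
        index = max(0, min(88, index + INDEX_TABLE[code & 0x0F]))
        samples.append(predictor)
    return samples, predictor, index

def decode_reports(reports):
    """Reassemble a PCM stream from a sequence of HID report payloads
    (bytes), assuming two 4-bit codes per byte, high nibble first."""
    predictor, index = 0, 0
    pcm = []
    for payload in reports:
        codes = []
        for byte in payload:
            codes.append(byte >> 4)
            codes.append(byte & 0x0F)
        chunk, predictor, index = decode_ima_adpcm(codes, predictor, index)
        pcm.extend(chunk)
    return pcm
```

In the real driver this decoding happens in the kernel, and the resulting samples are exposed through the ALSA device rather than returned to the caller.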
I am writing a sound player to play multi-channel audio files, and I need to know whether the running Android device can physically support multi-channel playback; that is, whether the final output will not be downmixed to 2.0 stereo.
Is there an API to get this information?
For example, some devices can play audio via an MHL or HDMI interface; in this case, a query for multi-channel support should return true.
And some devices will always mix the audio down to stereo; in this case, the result should return false.
Thanks~
As I understand it, streaming via Bluetooth is handled by the A2DP profile. While the SBC codec is the default, A2DP also supports AAC, MP3, and a few other codecs.
My question is: since Spotify files are in the Ogg Vorbis format (Ogg container, Vorbis codec), what is the best way to handle streaming via Bluetooth without quality loss? Is there a specific A2DP implementation? Are folks like Jambox etc. just using the SBC implementation?
Spotify's streaming format is an implementation detail hidden from all clients; assuming that it's Ogg Vorbis is not something you should do, and in some circumstances it is actually a false assumption.
Since you've managed to use every single Spotify tag in your question, I don't know which platform you're developing for. However, the correct thing to do is to take the PCM data the Spotify playback library gives you and feed it to whatever playback stack your target platform provides. On Android, iOS, macOS, etc., the system will handle audio output devices for you, including Bluetooth streaming.
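To illustrate the hand-off pattern, here is a minimal sketch in Python. The frame source is a made-up sine generator (not the Spotify library's actual API), and the standard-library wave module stands in for a platform playback stack; the point is only that the PCM bytes pass through unchanged, with the sink deciding where they go:

```python
import math
import struct
import wave

SAMPLE_RATE = 44100
CHANNELS = 2

def pcm_frames(n_frames, freq=440.0):
    """Stand-in for the PCM a playback library delivers: interleaved
    16-bit little-endian stereo frames of a sine tone (made up here)."""
    out = bytearray()
    for i in range(n_frames):
        sample = int(32767 * 0.3 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
        out += struct.pack("<hh", sample, sample)  # left, right
    return bytes(out)

def hand_off_to_sink(pcm, path="out.wav"):
    """Stand-in for the platform audio stack: write the PCM verbatim.
    A real app would hand the same bytes to AudioTrack, AVAudioEngine,
    etc., and the OS would route them (e.g. over Bluetooth A2DP)."""
    with wave.open(path, "wb") as w:
        w.setnchannels(CHANNELS)
        w.setsampwidth(2)           # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(pcm)
    return path

pcm = pcm_frames(SAMPLE_RATE)       # one second of audio
hand_off_to_sink(pcm)
```

The design point is that codec choice (SBC, AAC, and so on) belongs to the OS's Bluetooth stack, not to the app; the app only ever deals in decoded PCM.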