ChromeCast Smooth Streaming - google-cast

I would like to create an application that streams an .ism (Smooth Streaming) file.
I followed these instructions and got the app working:
https://developers.google.com/cast/docs/player
But the video still appears on my screen too; I want the video to play only on the TV, in full-screen mode.
Is there any way to modify this sample to do what I want?
Thank you very much!

If you are saying that the video plays both on your mobile device and on your TV, and you only want it on your TV, then modify your mobile application so that it does not play the movie on the phone. If you mean something else, please clarify.
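In other words, the sender app should hand the manifest URL to the Cast session rather than attach it to a local `<video>` element. A minimal sketch using the Cast Chrome Sender v2 API (`castSmoothStream` is a hypothetical helper; `session` is assumed to come from the SDK's session listener, and the Smooth Streaming MIME type is assumed to be what the receiver expects):

```javascript
// Hedged sketch: load the .ism manifest on the Cast session so playback
// happens only on the TV, not in the local page. Assumes the Cast sender
// SDK is loaded and `session` was obtained from the session listener.
function castSmoothStream(session, manifestUrl) {
  // 'application/vnd.ms-sstr+xml' is the Smooth Streaming manifest MIME type.
  const mediaInfo = new chrome.cast.media.MediaInfo(
    manifestUrl,
    'application/vnd.ms-sstr+xml'
  );
  const request = new chrome.cast.media.LoadRequest(mediaInfo);
  session.loadMedia(
    request,
    () => console.log('Playing on the TV'),
    err => console.error('Load failed', err)
  );
}
```

Because no local media element is created, nothing renders on the phone; the sender UI only needs play/pause/seek controls that send commands to the session.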

Related

Stream music from streaming platform (Deezer, Spotify, Soundcloud) to Web Audio API

Does any of you know a way to get the audio stream of a music platform and plug it into the Web Audio API?
I am building a music visualizer based on the Web Audio API. It currently reads sound from my computer's microphone and produces a real-time visualization. If I play music loud enough, my visualizer works!
But now I'd like to move on and read only the sound coming from my computer, so that the visualization responds only to the music and not to other sounds such as people chatting.
I know I can buffer an MP3 file into that API and it would work perfectly. But in 2020, streaming music via Deezer, Spotify, Soundcloud, etc. is very common.
I know they all have an API, but they typically offer an SDK where you cannot really do much more than "play" music; there is no easy access to the stream of audio data. Maybe I am wrong, and that is why I am asking for your help.
Thanks
The way to stream music into Web Audio is to use a MediaElementAudioSourceNode or a MediaStreamAudioSourceNode. However, these nodes will output silence unless you're allowed to access the data. This means you have to set the crossOrigin property correctly on your end, and the server also has to allow the access through CORS.
A Google search will help with setting up CORS, but many sites won't allow access unless you have the right permissions; in that case you are out of luck.
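A minimal sketch of that wiring (`connectStreamToAnalyser` is a hypothetical helper name; `audioCtx` is assumed to be an `AudioContext` created elsewhere, and the stream URL is assumed to be served with CORS headers):

```javascript
// Hedged sketch: wire a streaming <audio> element into Web Audio so an
// AnalyserNode can drive a visualizer.
function connectStreamToAnalyser(audioCtx, streamUrl) {
  const el = new Audio();
  // Ask the browser to fetch the stream with CORS; without this (and without
  // Access-Control-Allow-Origin on the server) the source node outputs silence.
  el.crossOrigin = 'anonymous';
  el.src = streamUrl;

  const source = audioCtx.createMediaElementSource(el);
  const analyser = audioCtx.createAnalyser();
  source.connect(analyser);
  analyser.connect(audioCtx.destination); // keep the music audible
  return { element: el, analyser };
}
```

You would then call `element.play()` from a user gesture and read `analyser.getByteFrequencyData(...)` on each animation frame to drive the visuals.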
I found a "no-code" workaround. At least on Ubuntu 18.04, I am able to tell Firefox to use my speakers as the "microphone input":
you just have to select the right "mic" in the list when the browser asks for microphone permission.
This solution is very convenient, since I do not need to write platform-specific binding code to access the audio stream.

Why can't I hear music after recording (Swift 2.3)

I am making a karaoke application. When headphones are not plugged in, the app works fine (my voice and the background music are recorded together). But when I do the same with headphones plugged in and then listen to the recording, I cannot hear the background music, although I hear my voice clearly. The code I used is attached below:
https://github.com/genedelisa/AVFoundationRecorder
Just a shot in the dark, but it might be the same issue here: How can I record the audio output of the iPhone? (like sounds of my app)
It seems like you can't record the audio your own app plays using AVFoundation; you have to use Core Audio to get at the low-level audio signal. Maybe that helps? :-)

Remote control audio player (e.g. Groove-Music)

I'd like to write an app for my Band 2 that simply changes the audio track when I hit a button on the Band. I need this because my car can't play audio via Bluetooth; I bought a Bluetooth dongle, but with it I'm unable to change tracks or even pause playback, so a three-button app on my Band 2 should do the job.
I've found a lot of examples that do this with a background audio player inside a self-written app.
But I'd like to remote-control the standard player, which for me is Groove Music.
Any suggestions on how to do that? Can you please give me a start?
Thanks,
Jo

WebRTC Audio Is completely garbled

Check it out here:
https://webrtc.github.io/samples/src/content/devices/input-output/
All the recorded audio sounds like aliens trying to communicate with you. I could have sworn WebRTC was working just last week, but this samples page is completely freaking out on both my laptop and my phone.
Does anyone know why the audio is garbled like that?

Sending audio to a Bluetooth-enabled speaker, iOS

I want to add a function to my app that lets the user choose to play the audio on a Bluetooth-enabled speaker. I have a Parrot Easydrive in my car, and this works for phone calls and, for example, the Dictafoon app, among others.
I understand that I should use the Core Audio framework. When a Bluetooth device is connected, it is said to be easy to stream audio to that connection. I am now looking for Core Audio sample code (or a book) that explains connecting and streaming to a Bluetooth device with Core Audio.
Can anyone shed some light on this? If there is another framework or sample code I can use, please mention it!
Many thanks in advance!
You don't write any Core Audio-specific code; it is the same process as playing audio via AirPlay.
Basically, you put an MPVolumeView into your UI, and the underlying framework redirects your audio output for you. Once you implement this, you will be able to use Bluetooth and any AirPlay-enabled device with your app.