How to cast local video using the Google Cast SDK on an iOS sender - google-cast

I searched for similar issues in this forum. In previous replies, someone suggested embedding a web service in the iOS app to implement this, and last year someone else suggested using MediaStore (but that does not apply to the iOS platform, and there was no follow-up in that thread).
I checked Google's relevant documentation, but I didn't find any API for casting local video or images; maybe I didn't read it carefully enough.
Do you have a good solution?

With the way that senders work, you are expected to implement your own local player. The requirement your app then needs to meet is to ensure that you can transition from the local player to the Cast device and vice versa. I would suggest looking at the Cast Videos sample for reference. We use a simple AVPlayer in the sample: https://github.com/googlecast/CastVideos-ios/blob/master/CastVideos-swift/Classes/LocalPlayerView.swift
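For the hand-off itself, here is a rough sketch (not the sample's exact code) of what loading a video onto the current Cast session looks like with the iOS Sender SDK. Note that the Cast device can only fetch media over the network, so a purely local file still has to be exposed over HTTP, for example by embedding a small web server in the app as mentioned in the question; the URL below is just a placeholder for whatever that server serves.

```swift
import GoogleCast

/// Hands playback over to the connected Cast device.
/// Assumes a Cast session has already been started via the Cast button, and that
/// `videoURL` (a placeholder here) is reachable by the Chromecast on the local network.
func castLocalVideo() {
    guard let videoURL = URL(string: "http://192.168.1.10:8080/videos/myclip.mp4"),
          let session = GCKCastContext.sharedInstance().sessionManager.currentCastSession,
          let remoteMediaClient = session.remoteMediaClient else {
        print("No active Cast session")
        return
    }

    let metadata = GCKMediaMetadata(metadataType: .movie)
    metadata.setString("My local clip", forKey: kGCKMetadataKeyTitle)

    let builder = GCKMediaInformationBuilder(contentURL: videoURL)
    builder.streamType = .buffered
    builder.contentType = "video/mp4"
    builder.metadata = metadata

    // Pause the local AVPlayer at this point, then resume playback on the receiver.
    remoteMediaClient.loadMedia(builder.build())
}
```

Transitioning back is the reverse: when the session ends, read the receiver's position (e.g. GCKRemoteMediaClient's approximateStreamPosition()) and seek the local AVPlayer to it.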

Related

Using Google Cloud Speech on mobile never returns an isFinal

I am currently implementing a ChatBot using Google Cloud Speech.
I am using socket.io to record a microphone stream and then sending it through Node to Google Cloud Speech.
Everything is working fine on my laptop and my Android phone (Nexus 5X, Chrome 68).
I record the audio and, having set single_utterance to true, I get a result with "isFinal" as soon as I pause speaking.
But if I set the language code to 'da-DK', I never get an "isFinal" result (unless I end the stream myself) on mobile. It works fine on my laptop, but not on mobile.
Has anyone experienced anything similar?
As a bonus info:
If I set interimResults to true, I do get multiple results, but they are just never isFinal.
So just to be clear: everything is working perfectly apart from the one case: da-DK on mobile.
As this behavior occurs only when using the da-DK language on a mobile device, it might be related to an internal service issue; therefore, I think you should take a look at the Issue Tracker tool, where you can raise a Speech-to-Text API issue in order to verify this scenario with the Google technical support team. That way, you will be able to share your code, audio files, and internal project information if the troubleshooting process requires it.
Additionally, I suggest taking a look at this link, which contains useful documentation and an example of using the Google Cloud Speech API in an Android environment that you can use as a reference for your project.

Is it possible to make a receiver app for a Chrome tab

I'm posting here because I didn't find any satisfying answer anywhere.
The question is quite simple. I see a lot of applications implementing the Cast feature on Android. The issue is that even though I have a brand new smart TV, it doesn't actually support the Cast feature of the majority of my apps.
For example, my TV has a YouTube app, so I can cast YouTube videos from the YouTube app on my phone to my TV.
Now I would like to cast my favorite streaming app to my TV, but my TV is not found. So I'm thinking: okay, let's try to make an app for my TV that will receive that kind of command.
I know that I can make an app for my TV. Before starting that ambitious project, I want to be sure that the Google Cast SDK will allow me to write such a receiver app.
Do you think this is possible? Or do we really need one receiver app for every sender app?
YouTube uses its own discovery protocol beyond what the Cast SDK supports. Apps need to integrate the Cast SDK in their senders and implement receivers that support their authentication and DRM.

How to develop Spotify desktop applications now that Libspotify is discontinued

I have done my due diligence and not found any other posts that answer this question, but as usual, if you know of a similar question, point me that way!
I noticed a long time back that Libspotify has been discontinued:
(https://developer.spotify.com/technologies/libspotify/)
So, my question is - what should we do for developing Desktop applications?
They do state: "We hope to be able to provide you with a new library for other platforms." But this has been going on since 2015!
I have seen many projects on GitHub still using Libspotify - so what should we do? An update was promised "in the upcoming months", but I've not seen anything yet.
What should we do for developing Desktop Applications?
We at Spotify don't currently provide playback as part of our platform offering outside of our iOS and Android SDKs, and I don't have any updates on that at the moment. As mentioned on the website, we hope to be able to provide playback SDKs for more platforms in the future. We don't support any new development on libspotify.
You can use the Spotify Web API to interact with Spotify in a variety of ways, including retrieving metadata and accessing or modifying user libraries and playlists, which may be useful. You can also use the AppleScript API to control playback on macOS, which may also help.
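To illustrate the AppleScript route, here is a minimal sketch (my own, not an official snippet) that drives the Spotify desktop client from Swift on macOS via NSAppleScript; it assumes the client is installed, and the track URI is just a placeholder.

```swift
import Foundation

/// Minimal sketch: control the Spotify desktop client on macOS via AppleScript.
/// The app may need the Apple Events / automation permission to do this.
func playSpotifyTrack(uri: String) {
    let source = """
    tell application "Spotify"
        play track "\(uri)"
    end tell
    """
    var error: NSDictionary?
    NSAppleScript(source: source)?.executeAndReturnError(&error)
    if let error = error {
        print("AppleScript error: \(error)")
    }
}

// Placeholder URI; any spotify:track:... URI works the same way.
playSpotifyTrack(uri: "spotify:track:xxxxxxxxxxxxxxxxxxxxxx")
```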
The Spotify Web API is pretty straightforward to use. Of course, it defines the protocol rather than implementing it, so it is OS-independent.
I put together a few classes to help unwrap some of the JSON parameters simply. These were written in Swift for macOS.
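For reference, this is the kind of unwrapping involved (a rough sketch, not the classes mentioned above): it fetches the current user's playlists from the Web API with URLSession and Codable, assumes you already have an OAuth access token, and only models the couple of fields it actually uses.

```swift
import Foundation

// Minimal Codable models for GET https://api.spotify.com/v1/me/playlists;
// only the fields used below are modeled.
struct PlaylistPage: Decodable {
    let items: [Playlist]
    let total: Int
}

struct Playlist: Decodable {
    let id: String
    let name: String
}

/// Fetches the current user's playlists. `accessToken` is assumed to come from
/// whatever OAuth flow the app already performs.
func fetchPlaylists(accessToken: String,
                    completion: @escaping ([Playlist]) -> Void) {
    var request = URLRequest(url: URL(string: "https://api.spotify.com/v1/me/playlists")!)
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let page = try? JSONDecoder().decode(PlaylistPage.self, from: data) else {
            completion([])
            return
        }
        completion(page.items)
    }.resume()
}
```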

Chrome-based slideshow app for Chromecast

I am trying to write a simple Chrome app to play a sequence of online pictures on my Chromecast device.
I have looked at some examples, but couldn't find anything I could tweak to get the simple behavior I needed. Maybe someone here could help by providing directions or advice on getting started with developing something like that for Chromecast.
UPDATE:
To give you a better idea, about the specifics, let me add some more details to my requirements.
It needs to be controlled from Chrome.
I want to pass a playlist with tens to hundreds of images so it can cycle through them in a loop.
After receiving the playlist, the Chromecast device should be able to continue on its own, without continuously asking for the next image.
This is actually similar to the Backdrop feature Google is planning to introduce, but I wanted to write something myself.
Thanks
If you don't want to develop your own Cast receiver, then you can use the media namespace channel and the Styled Media Receiver to display one photo at a time:
https://developers.google.com/cast/docs/styled_receiver
You will have to add the logic to advance from photo to photo in your sender app.
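As a rough illustration of that sender-driven approach (sketched here in Swift with the iOS Sender SDK, since that is what the rest of this page uses; a Chrome sender would make the equivalent Web Sender media calls), the sender just pushes the next image on a timer, because image items have no duration and do not advance on their own:

```swift
import GoogleCast

/// Sketch of sender-driven slideshow logic for the Styled Media Receiver.
/// The photo URLs and the 10-second interval are placeholders.
final class SlideshowController {
    private let photoURLs: [URL]
    private var index = 0
    private var timer: Timer?

    init(photoURLs: [URL]) {
        self.photoURLs = photoURLs
    }

    func start(interval: TimeInterval = 10) {
        showCurrentPhoto()
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.index = (self.index + 1) % self.photoURLs.count
            self.showCurrentPhoto()
        }
    }

    func stop() {
        timer?.invalidate()
    }

    private func showCurrentPhoto() {
        guard !photoURLs.isEmpty,
              let client = GCKCastContext.sharedInstance()
                  .sessionManager.currentCastSession?.remoteMediaClient else { return }

        let builder = GCKMediaInformationBuilder(contentURL: photoURLs[index])
        builder.streamType = .none          // a still image, not a stream
        builder.contentType = "image/jpeg"
        client.loadMedia(builder.build())
    }
}
```

Note that this is the first option, where the sender keeps driving the slideshow; the custom-receiver route below is what lets the device continue through the playlist on its own, as the question asks.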
If you are willing to develop your own custom receiver, then you can start with this Cast sample app:
https://github.com/googlecast/CastHelloText-android
It allows you to send messages to a custom receiver. You can use that to send the URLs of the photos and then you can add JavaScript logic in the receiver to play a slideshow.
Just to let you know, I tried various options and ended up writing a custom receiver and a Chrome sender application. This was really straightforward and exactly what I wanted.
See the links above for guidance and also examples here.

Is YouTube Data API v2 no longer supported?

I've been looking into getting a developer key for the YouTube Data API v2. Navigating to the page Google lists for this purpose gives me a blank page, so there is no way to actually get a developer key. Does this page show up blank for anyone else?
Is this a sign from Google that it's time to move to version 3? I would really prefer not to, since it has some anti-features that I want to avoid.
There were some transient issues with https://code.google.com/apis/youtube/dashboard/gwt/index.html that should be resolved.
To answer the larger question, using the Data API v2 is still an option, but we're definitely recommending the Data API v3 for new development. (There are still a few pieces of functionality that are only available in v2, but that list is not very long.)
I'm not sure what "anti-features" you're referring to, so it's hard to comment beyond that.
