Mouse and keyboard control using Twilio's screen share session

Here is my situation. I have used Twilio on my portal for creating a meeting + chat + screen share, and I want to add hand-over of mouse control as in join.me, Zoom, or TeamViewer. Is there any API, SDK, or other way to achieve that while using Twilio for screen sharing? Since I have already paid for Twilio, I cannot opt for another meeting integration. Or is it possible to leverage the facilities of another application together with Twilio?
Thanks

The nearest thing is to make use of the Twilio Programmable Video DataTracks API: a data track lets you send arbitrary low-latency messages between the participants of a Room, so you can transport mouse and keyboard events over it and apply them yourself on the controlled machine. See the sketch below.
Announcing Programmable Video DataTracks API
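As a rough illustration, here is a minimal sketch of how a DataTrack in the twilio-video JavaScript SDK could carry mouse events alongside the screen share. Everything beyond the SDK calls (the `applyRemoteMouseEvent` handler, the event shape) is an assumption for illustration: Twilio only transports the messages, and actually injecting input on the controlled machine needs something outside the browser.

```typescript
import { connect, LocalDataTrack, RemoteTrack } from 'twilio-video';

async function startRemoteControlSession(token: string, roomName: string) {
  // Publish a data track alongside the screen-share tracks.
  const controlTrack = new LocalDataTrack();
  const room = await connect(token, { name: roomName, tracks: [controlTrack] });

  // Controller side: forward local mouse events to the other participant.
  document.addEventListener('mousemove', (e: MouseEvent) => {
    controlTrack.send(JSON.stringify({ type: 'move', x: e.clientX, y: e.clientY }));
  });
  document.addEventListener('click', (e: MouseEvent) => {
    controlTrack.send(JSON.stringify({ type: 'click', x: e.clientX, y: e.clientY }));
  });

  // Controlled side: receive the events and hand them to whatever applies them.
  room.on('trackSubscribed', (track: RemoteTrack) => {
    if (track.kind === 'data') {
      track.on('message', (data: string) => {
        applyRemoteMouseEvent(JSON.parse(data)); // hypothetical handler you implement
      });
    }
  });

  return room;
}

// Placeholder: a web page can only simulate a cursor; true OS-level control
// requires a native or Electron helper listening for these messages.
function applyRemoteMouseEvent(event: { type: string; x: number; y: number }): void {
  console.log('remote mouse event', event);
}
```

In other words, the DataTrack gives you the low-latency transport; the join.me/TeamViewer-style hand-over itself (actually moving the remote cursor) still has to be implemented by an agent on the controlled side.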

Related

How to deploy a Dialogflow bot on Google assistant that will call a number?

I want to create a Dialogflow agent that will be deployed on the Google Assistant, that will get a phone number from a backend service, and will be able to call the number using the Google Assistant. Is it possible?
You can only play sound files or streams. The Google Assistant also doesn't provide you with the actual sound that was recorded, as it always converts the detected speech to text. That text is then delivered to your Action.
You could, however, call someone from your back-end using Twilio by synthesizing the text that was detected. Responding to whatever the person you're calling says would be hard as well.
I usually opt for sending text messages instead of calling when using Actions.
The platform does not support the ability to programmatically call telephone numbers through the user's phone.
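For the back-end route mentioned above, a minimal sketch with the Twilio Node.js helper library might look like this; the credentials, phone numbers, and the `detectedText` parameter are placeholders for illustration:

```typescript
import twilio from 'twilio';

// Credentials are read from the environment; both values are placeholders.
const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

async function callWithSynthesizedText(to: string, from: string, detectedText: string) {
  // Build TwiML that reads the detected text aloud when the callee answers.
  const twiml = new twilio.twiml.VoiceResponse();
  twiml.say(detectedText);

  const call = await client.calls.create({
    to,                      // the number fetched from your backend service
    from,                    // a Twilio number you own
    twiml: twiml.toString(), // Twilio synthesizes the speech
  });
  return call.sid;
}
```

Handling the callee's replies would still require extra work (e.g. a Gather verb with speech recognition), which is why responding to them is the hard part.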

How can I view custom events in Chatbase?

I am currently trying to integrate Chatbase into my Google Action project.
I want to use the Custom Events API (https://chatbase.com/documentation/events#documentation) to track purchases via my action.
Unfortunately if I send custom events to Chatbase, they don't show up in any reporting view.
I'm Sean with Chatbase support. It is on our roadmap to surface custom events in the funnels report; however, other priorities have delayed the release. Please stay tuned to the Chatbase Blog for updates on new feature releases.

Possible to return an image in the Google Actions webhook response?

From the docs it seems like SpeechResponse is the only documented type of response you can return:
https://developers.google.com/actions/reference/conversation#SpeechResponse
Is it possible to load an image or some other type of media in the Assistant conversation via API.AI or the Actions SDK? It seems like this is supported with API.AI for FB Messenger and other messengers:
https://docs.api.ai/docs/rich-messages#image
Thanks!
As of today, the Google Actions SDK supports Conversation Actions, which build a voice UI and are integrated with Google Home.
The API.AI integration with Google Actions can be checked out here as well; it currently shows no support for images in the response.
Once they provide an integration with Google Allo, they might start supporting images, videos, etc. in that messaging interface.
That feature seems to be present now. You can look it up in the docs at https://developers.google.com/actions/assistant/responses
Note: images are only supported on devices with a visual output, so Google Home obviously would not be able to display them, but devices with a screen do support a card with an image.
Pro tip: yes, you can.
What you want to do is represent your image/video as a URL within API.AI and render that URL as an image/video within your app.
See the working example.
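To make that concrete, here is a minimal sketch using the actions-on-google (v2) Node.js client library; the intent name `show_image` and the image URL are assumptions for illustration:

```typescript
import { dialogflow, BasicCard, Image } from 'actions-on-google';

const app = dialogflow();

app.intent('show_image', (conv) => {
  // The spoken part works on every surface, including Google Home.
  conv.ask('Here is the picture you asked for.');

  // Only attach the card on surfaces with a screen (phones, smart displays).
  if (conv.screen) {
    conv.ask(
      new BasicCard({
        title: 'Example image',
        image: new Image({
          url: 'https://example.com/picture.png',
          alt: 'Example picture',
        }),
      })
    );
  }
});

// Export `app` as your Dialogflow fulfillment webhook, e.g. a Cloud Function:
// exports.fulfillment = functions.https.onRequest(app);
```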

Can I let the client stream the content from Spotify in a browser based player?

Couldn't find anything on Google about that topic, so I'm asking here.
I had an idea for a web-based Spotify player (not like the official one) and I would like to know if it's possible to let the client (user) stream the content from Spotify instead of my server (app). It would be pretty expensive if my server had to stream the data and send it to the client :-/
Thanks!
Unfortunately, there is no web library that you can use for streaming content from Spotify. The closest is the Spotify Play Button, but that is a widget that remote-controls Spotify from the desktop client or Spotify's web player.
You are limited to using the 30-second previews, or to the Android or iOS SDK if you were to build a mobile version of your site. The SDKs allow full playback for Spotify Premium users.
There is a feature request for fetching full tracks on Spotify's Web API GitHub repo that you can watch or add comments to.
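As an illustration of the preview route, here is a minimal browser-side sketch; the track ID and OAuth access token are placeholders:

```typescript
// Fetch a track's 30-second preview from the Spotify Web API and play it.
async function playPreview(trackId: string, accessToken: string): Promise<void> {
  const res = await fetch(`https://api.spotify.com/v1/tracks/${trackId}`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  const track = await res.json();

  if (!track.preview_url) {
    throw new Error('No preview available for this track');
  }

  // preview_url points to a 30-second MP3 clip; full-track streaming is not
  // available through the Web API.
  const audio = new Audio(track.preview_url);
  await audio.play();
}
```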

Is there any function to act as a magnet trigger in the Google Cardboard API?

I looked into the Cardboard SDK for Android but have not found any function which can generate a magnet trigger. I could, however, find Bluetooth devices which can act as a magnet trigger, so they are likely using some API to do that. Is it possible to write an app which can generate a magnet trigger? Thanks in advance.
I think the solution is to write an accessibility service which receives the Bluetooth events and generates a TYPE_VIEW_CLICKED event.
