Dash.js player doesn't play when debugging in Ripple

I've embedded the dash.js player into the Multi-Device Hybrid App template and fed it a working live video link (tested with the same dash.js in a trivial HTML page). When the app starts in Ripple (Apache Ripple™, a web-based mobile environment simulator), I see debug info in VS with "net::ERR_CONNECTION_REFUSED" errors. After a few attempts it stops trying and doesn't play a thing.
However, to check whether the network works between client and server, I created a website with a test image on the server side and embedded this image into the application on the client side. The image loads, so at least the network is working. The firewall is disabled on both sides.
So the exact same MPEG-DASH manifest works inside an HTML page with embedded dash.js in the Chrome browser, but not inside the Hybrid App in Ripple. Yet a remote image can be loaded inside that app, so the network is fine and Ripple lets the app request remote resources.
What is the reason? How can I debug it?

Is HTML5 MSE (Media Source Extensions) supported within Ripple? You may also try it with some basic MSE sample code, or try another DASH web player such as dash-js or bitdash.
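One quick check worth running inside the Ripple-hosted app is MSE feature detection; a minimal sketch using only standard browser APIs (the codec string is just a common H.264/AAC example):

    // Probe for Media Source Extensions and for the codec dash.js will need.
    if ('MediaSource' in window) {
        var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
        console.log('MSE available; codec supported: ' +
            MediaSource.isTypeSupported(mimeCodec));
    } else {
        console.log('MSE is not available in this environment');
    }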

Running an app from the browser, and detecting if the app is installed or not

In my web app, I want to discover devices on the local network. The devices announce themselves with mDNS (Bonjour), and from everything I have read, it is impossible to go deep enough into the network layer to detect these devices from the browser.
However, I can do that from a desktop app. So what I need is to launch a desktop app from a custom URI. There is a lot of documentation about that, like this article. But if the user does not have the app, the link simply does nothing.
I noticed that a lot of apps like Slack, Discord, etc. that redirect to their desktop app do not leave you with a dead link if the app isn't installed. If you don't have the app, they make you download and install it.
How do they achieve this?
Thank you for reading!
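For what it's worth, one common pattern (a hypothetical sketch; I cannot confirm this is what Slack or Discord actually ship) is to fire the custom URI and fall back to a download page if the browser still has focus after a short timeout; 'myapp://open' and the download URL below are placeholders:

    // Try to launch the desktop app via its custom URI scheme.
    // If nothing handles the scheme, the page keeps focus and we
    // redirect to the download page after the timeout fires.
    function openOrDownload() {
        var fallback = setTimeout(function () {
            window.location.href = 'https://example.com/download';
        }, 1500);
        // If the app launches, the browser window loses focus,
        // so we cancel the fallback redirect.
        window.addEventListener('blur', function () {
            clearTimeout(fallback);
        }, { once: true });
        window.location.href = 'myapp://open';
    }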

How to make Azure Media Services hosted video autoplay on mobile?

I am working on a project whereby we are hosting and streaming video through Azure Media Services.
There is a particular video we have positioned as the hero background on entry to the site. On desktop the video autoplays and streams just fine, but on mobile it does not autoplay at all; it simply shows the preview image.
I'd love to be able to paste a link to the site, but unfortunately, due to the confidentiality of the project, I am not able to. However, if there is something in particular you'd like me to post to help support the question, please let me know.
The web app is built using Angular.
Does anyone know how to fix this problem, or can anyone point me in the right direction?
Check the browser platform you are targeting on mobile. Most mobile browsers have disabled autoplay; the user must now initiate all playback actions.
Since the release of iOS 10, Apple has allowed muted video autoplay: https://webkit.org/blog/6784/new-video-policies-for-ios/
Chrome 53 on Android also allows muted video autoplay: https://developers.google.com/web/updates/2016/07/autoplay
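In practice that means marking the video as muted (and, for iOS, inline) before triggering playback; a minimal sketch, where the 'video.hero' selector is an assumption about your markup:

    // Hero background video: muted + playsinline is what iOS 10+ and
    // Chrome 53+ require for autoplay without a user gesture.
    var video = document.querySelector('video.hero');
    video.muted = true;
    video.setAttribute('playsinline', '');  // iOS inline playback
    video.autoplay = true;
    var promise = video.play();
    if (promise !== undefined) {
        promise.catch(function (err) {
            // Autoplay was still blocked; leave the preview image showing.
            console.log('autoplay blocked:', err);
        });
    }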

Can I use Chromecast as a server?

I am studying the possibility of achieving the following:
We have a CMS that from time to time posts a media URL (a video hosted on the public internet) to a webhook.
We would like to forward this webhook post directly to a Chromecast which is plugged into a TV.
Questions:
1. Can a web server like Node.js be installed on a Chromecast?
2. Is it possible to use, for example, DynamicDNS to link the Chromecast to a domain name so the post from the webhook can be made?
Chromecast has a sender API which allows you to "send" content to a specific Chromecast. Right now, the sender API works on Android, iOS and Chrome OS. You can read more about it here: https://developers.google.com/cast/docs/sender_apps.
And here's how a receiver application that would receive your content on the Chromecast works: https://developers.google.com/cast/docs/receiver_apps. Or, if your content is a standard type, you can use a prebuilt receiver application without building your own.
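To make that concrete, here is a stripped-down sender flow using the Chrome sender API (a sketch only; the media URL is a placeholder, and it assumes the default media receiver rather than a custom receiver app):

    // Initialize the Cast API, then load a media URL on the default receiver.
    var sessionRequest = new chrome.cast.SessionRequest(
        chrome.cast.media.DEFAULT_MEDIA_RECEIVER_APP_ID);
    var apiConfig = new chrome.cast.ApiConfig(sessionRequest,
        function onSession(session) {},
        function onReceiverAvailability(availability) {});
    chrome.cast.initialize(apiConfig, function onInitialized() {
        chrome.cast.requestSession(function (session) {
            var mediaInfo = new chrome.cast.media.MediaInfo(
                'https://example.com/video.mp4', 'video/mp4');
            session.loadMedia(new chrome.cast.media.LoadRequest(mediaInfo),
                function onLoaded(media) {},
                function onError(e) { console.log(e); });
        }, function onError(e) { console.log(e); });
    }, function onError(e) { console.log(e); });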
To answer your specific questions:
Can a web server like Node.js be installed on a Chromecast?
No, not without enormous hacking and development on your part to essentially take over the hardware and get your own software running on it.
Is it possible to use, for example, DynamicDNS to link the Chromecast to a domain name so the post from the webhook can be made?
Not that I know of.
The Chromecast runs an Android-like Google Chrome operating system. It is possible to root it, but to my knowledge you will not be able to get a server onto it. I would suggest taking a look at the Raspberry Pi; you should be able to run a slim server on it. Once that is set up, it might be feasible to pass command-line commands to Chrome or another web browser to display the data you like. A browser is not strictly necessary, but I'm not sure of any other way to display the media.
A different approach would be to run a server anywhere (it could be in your home) and have something like a Raspberry Pi (any computer capable of displaying the content) connect to a web page hosted on that server. Using WebSockets, for example via socket.io, you could set it up so that the server sends messages (the URL of a video) to the open browser session. The JavaScript on your web page would then use that message to open the URL.
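A minimal sketch of that socket.io approach (server side; the 'play-url' event name is a placeholder, and the client page would listen for it and set the video source accordingly):

    // Node.js server: push a video URL to every connected browser session.
    var server = require('http').createServer();
    var io = require('socket.io')(server);
    server.listen(3000);

    io.on('connection', function (socket) {
        console.log('display client connected');
    });

    // Call this whenever the CMS posts a new media URL to your webhook.
    function pushUrl(url) {
        io.emit('play-url', url);  // the page's JavaScript plays this URL
    }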

Why am I getting a LAUNCH_ERROR, NOT_FOUND when attempting to play to an audio-only Google Cast device?

I recently purchased an LG Music Flow H3 to test my Google Cast app with an 'audio-only' device. I've enabled audio-only device support within the Google Cast Dashboard, and I've registered the device for development. My app works as expected when casting to Google Chromecasts; however, when attempting to load the receiver app on the LG device, I get the following error:
{"reason":"NOT_FOUND","requestId":1,"type":"LAUNCH_ERROR"}
Thinking that perhaps 3rd-party devices can't be registered for development, I went ahead and published my app. Unfortunately this did not address the problem.
Upon further investigation, I'm noticing that other Google Cast apps (e.g. Songza, TuneIn, Pandora, etc., on both Android and iOS) aren't able to play to the LG Music Flow H3 either.
I discovered that the only way to get the H3 to play from Google Cast apps (both mine and others) is to first run the LG Music Flow multi-room audio app. Running the LG Music Flow app appears to affect the device's _googlecast._tcp zeroconf service discoverability. And sometimes the H3 shows up in zeroconf, yet you still can't play to the speaker unless the LG Music Flow app is running.
This seems like very strange behavior. I called LG Tech Support, and they recommended I return the device and exchange it for a different one. I did this, and I still get the same result.
Is this how Google Cast on 3rd-Party devices is intended to work? Have I encountered a buggy 3rd-party implementation?
This is definitely not the intended behavior. You are supposed to be able to launch your 3rd-party application on the speaker, and it works for me.
The fact that you are not able to use any other 3rd-party apps (TuneIn, Pandora, etc.) indicates this is a general problem, not something specific to your app.
A couple of steps I would try:
- Set up your H3 speaker on Wi-Fi (vs. Ethernet) and get the latest software version using the Music Flow app.
- Have you factory reset the device? If you did, you will need to go through setup again; I have noticed Google Cast becomes enabled only after the first complete setup.
- Try rebooting, then try casting apps like TuneIn or Pandora (without using the Music Flow app) and see if that works. I have noticed in the past that if an app is loaded and behaving badly (for example, using tons of memory), it puts the speaker into a bad state that doesn't allow any app to work until reboot. So perhaps when you cast your app, it gets into a bad state that causes other apps to fail afterwards.
- Have you made sure not to consume too much memory, and to avoid graphics and video?

WebRTC Streaming between PC and Mobile Client

I would like to implement peer-to-peer communication between a mobile device (iOS and Android) and a Windows PC. The mobile app will stream the camera output to the PC (no audio is required), and on the PC the user will be able to capture a screenshot from the running stream. Below are the possibilities I am considering.
Option 1: Develop a web-based application which runs in the Google Chrome or Firefox browser on the Windows PC, and also develop a mobile client app for Android and iOS devices; using WebRTC, the mobile app streams its camera output to the website running in the PC's Chrome or Firefox browser, and the user can capture a screenshot from the running stream, which is then saved on the user's computer. The drawback of this solution is that I have to develop a website, which will not have access to the user's local file storage; a standalone desktop application is preferable because it can easily access the user's computer file system.
Option 2: Develop three applications:
1. A standalone desktop application which has all the features that require access to the computer's local file system.
2. A small website with just a single screen, used to display the mobile camera stream; the user captures output from that page, and a kind of watchdog service in the desktop app grabs the latest captured screen from the Chrome or Firefox browser.
3. A mobile client app running on the mobile device which streams the camera output to the PC using WebRTC.
The drawback of this solution is that it is not real-time, because the user has to work with two separate interfaces: capturing the screen in the PC's Chrome or Firefox browser, and then moving back to the PC application.
My understanding is that a serverless solution is not possible with WebRTC; a signaling server is required. I found some open-source WebRTC signaling servers, e.g. EasyRTC and signalmaster, which I would have to configure in my own environment (a sketch of what the signaling server relays follows below).
As this is my first WebRTC-based project, I would like your opinion on the solution I am considering: is it right, or is there a better way to achieve it?
Thanks
Suresh
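For orientation, here is roughly what the signaling server has to relay, sketched with plain browser WebRTC APIs; sendToPeer() is a placeholder for whatever transport EasyRTC or signalmaster gives you:

    // Mobile side: capture the camera and offer a one-way video stream.
    var pc = new RTCPeerConnection({
        iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
    });

    // ICE candidates travel through the signaling server too.
    pc.onicecandidate = function (e) {
        if (e.candidate) sendToPeer({ candidate: e.candidate });
    };

    navigator.mediaDevices.getUserMedia({ video: true, audio: false })
        .then(function (stream) {
            stream.getTracks().forEach(function (t) { pc.addTrack(t, stream); });
            return pc.createOffer();
        })
        .then(function (offer) { return pc.setLocalDescription(offer); })
        .then(function () {
            sendToPeer({ sdp: pc.localDescription });  // PC side answers back
        });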
Hi Suresh. iOS does not support WebRTC, but it is possible on Android.
My option would be node-webkit (a desktop app using HTML5, JavaScript, CSS3, Node.js and NPM):
https://github.com/rogerwang/node-webkit/wiki
For the mobile app, Intel XDK; but again, iOS does not support WebRTC:
http://xdk-software.intel.com/
You could use Twilio Video to do this.
You can build multi-party video calling into both web and native applications with the SDKs for:
JavaScript
iOS
Android
https://www.twilio.com/docs/api/video
You will also find the server-side starter apps, in various languages, that you need to get started quickly.
Using my preferred language, Python, as an example: a small Flask app handles token creation (managing user access for video conversations) in app.py, and the basic WebRTC functions can be found in quickstart.js.
Note: I work for Twilio.
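Whichever option you go with, the screenshot step on the PC side is plain canvas work once the remote stream is attached to a <video> element; a minimal sketch:

    // Draw the current video frame onto a canvas and export it as a PNG.
    function captureFrame(videoElement) {
        var canvas = document.createElement('canvas');
        canvas.width = videoElement.videoWidth;
        canvas.height = videoElement.videoHeight;
        canvas.getContext('2d').drawImage(videoElement, 0, 0);
        return canvas.toDataURL('image/png');  // hand this to a save/upload step
    }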
