Web Push notifications across multiple devices - browser

I was following this guide to setup Web Push notifications in our web app:
https://blog.elmah.io/how-to-send-push-notifications-to-a-browser-in-asp-net-core/
But independent of ASP.NET Core, I would like to know whether these push notifications can be received across multiple devices, since I cannot seem to get this working (yet) in my first tests.
Let's say:
A user logs in to a browser, for example Chrome (with the same user on both mobile and desktop)
Registers for web push notifications on our app (on desktop)
We send a notification to the registered user
--> Can push notifications be received on both mobile AND desktop?
--> Is that registration linked to a device, or to the logged-in user in the browser? (or something else?)

Currently I don't think this is possible. (If someone can show me otherwise, I'll update this.)
The reason is that when you subscribe to web push notifications, a service worker (a JavaScript file that runs in the background) has to be registered in that browser on that device in order to receive the notifications. If you allow notifications on desktop (for example), that doesn't mean the service worker is registered on mobile: the resulting push subscription is tied to that specific browser installation, not to the logged-in user, so each device has to subscribe on its own.
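For reference, here is a minimal sketch of the subscribe flow in the browser. Each browser installation (desktop Chrome, mobile Chrome, etc.) has to run it itself and produces its own subscription object; if the server stores every subscription keyed by the logged-in user, it can then push to each of that user's devices individually. The /api/push/subscribe endpoint and the service worker file name are placeholders:

```javascript
// Runs on every device/browser where the user enables notifications.
// Each call produces a device-specific subscription that must be sent to the server.
async function enablePush(vapidPublicKey) {
  // Register the service worker that will receive push events in the background.
  const registration = await navigator.serviceWorker.register('/service-worker.js');

  // Subscribe this particular browser installation to the push service.
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true,
    applicationServerKey: vapidPublicKey, // VAPID public key (Uint8Array)
  });

  // Store the subscription server-side, keyed by the logged-in user,
  // so the backend can push to all of that user's devices.
  await fetch('/api/push/subscribe', { // hypothetical endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(subscription),
  });
}
```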

Related

Running an app from the browser, and detecting if the app is installed or not

In my web app, I want to discover devices on the local network. The devices announce themselves with mDNS (Bonjour), and from everything that I have read, it's impossible to go deep enough into the network layer to detect these devices from the browser.
However, I can do that from a desktop app. Thus, what I need is to launch a desktop app from a custom URI. There is a lot of documentation about that, like this article. But if the user does not have the app, the link simply does nothing.
I noticed that a lot of apps like Slack, Discord, etc. that redirect to their desktop app do not leave you with a dead link if you don't have the app installed. If you don't have the app, they prompt you to download and install it.
How do they achieve this?
Thank you for reading!
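For context, one common pattern (not necessarily exactly what Slack or Discord do) is to attempt the custom-protocol launch and fall back to a download page if nothing handles it within a short timeout. The protocol name and download URL below are made up:

```javascript
// Try to open the desktop app via its custom protocol; if no handler reacts,
// fall back to the download page. The scheme and URL are placeholders.
function openDesktopAppOrDownload() {
  const timeoutMs = 1500;
  let appOpened = false;

  // If a protocol handler fires, the page typically loses focus.
  const onBlur = () => { appOpened = true; };
  window.addEventListener('blur', onBlur);

  // Attempt to launch the app.
  window.location.href = 'mydiscovery://open';

  setTimeout(() => {
    window.removeEventListener('blur', onBlur);
    if (!appOpened) {
      // Nothing handled the URI within the timeout: offer the installer instead.
      window.location.href = 'https://example.com/download';
    }
  }, timeoutMs);
}
```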

Can I use a Chromecast as a server?

I am studying the possibility of achieving the following:
We have a CMS that from time to time posts a media URL (a video hosted on the public internet) to a webhook.
We would like to forward this webhook post directly to a Chromecast that is plugged into a TV.
Questions:
1. Can a web server like Node.js be installed on a Chromecast?
2. Is it possible to use, for example, dynamic DNS to link the Chromecast to a domain name so the webhook post can be made to it?
Chromecast has a sender API which allows you to "send" content to a specific Chromecast. Right now, the sender API works on Android, iOS and Chrome. You can read more about it here: https://developers.google.com/cast/docs/sender_apps.
And here's how a receiver application that would receive your content on the Chromecast would work: https://developers.google.com/cast/docs/receiver_apps. If your content is a standard media type, you can use a prebuilt receiver application without building your own.
To answer your specific questions:
Can a web server like Node.js be installed on a Chromecast?
No, not without an enormous amount of hacking and development to basically take over the hardware and get your own software running on it.
Is it possible to use, for example, dynamic DNS to link the Chromecast to a domain name so the webhook post can be made to it?
Not that I know of.
The Chromecast runs an Android-like, Chrome-based operating system. It is possible to root it, but you will not (to my knowledge) be able to get a server onto it. I would suggest taking a look at the Raspberry Pi; you should be able to run a slim server on it. Once that is set up, it might be feasible to pass command-line arguments to Chrome or another web browser to display the content you want. A browser is not strictly necessary, but I'm not sure of any other way to display the media.
A different approach would be to have a server anywhere (it could be in your home) and have something like the Raspberry Pi (or any computer used for displaying the content) connect to a webpage hosted on that server. Using WebSockets, for example via socket.io, you could set it up so that the server sends messages (the URL of the video) to the browser session you have open. The JavaScript on your webpage would then use that message to open that URL.
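A rough sketch of that setup, assuming the CMS webhook posts JSON like { "url": "https://..." } to an Express endpoint and the display machine keeps a page open that plays whatever URL it receives (the route and event names are made up):

```javascript
// server.js - receives the CMS webhook and relays the URL to the open display page.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
app.use(express.json());
const server = http.createServer(app);
const io = new Server(server);

// The CMS posts { "url": "https://..." } here (endpoint name is made up).
app.post('/webhook', (req, res) => {
  io.emit('play', req.body.url); // forward the video URL to every connected display page
  res.sendStatus(204);
});

server.listen(3000);
```

On the display machine, the page served by this server connects with io() and simply sets the received URL as the source of a video element:

```javascript
// display page - plays whatever URL the server announces.
const socket = io();
socket.on('play', (url) => {
  const video = document.querySelector('video');
  video.src = url;
  video.play();
});
```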

Remote control a Chrome Extension

I've written a non-published (personal) Chrome extension that performs page checking and then performs actions such as opening new tabs if certain conditions are met. I would like to be able to "remote control" it from my phone though, e.g. turn it on or off or adjust its settings when I'm away from my desk.
I considered whether the extension could read/write a file in Dropbox, which I could then also edit from my phone or any other device. But I'm not sure whether extensions are allowed to arbitrarily read/write the filesystem, or whether only "apps" can. Any other suggestions?
Assuming you can't directly connect to your computer (otherwise wOxxOm's answer is valid)...
You could make a companion phone app and use GCM push messages: your phone would send a command to your server (which can easily be hosted on a free App Engine tier if it's just for your private use), and the server would push the message out.
Though it'll probably be much easier to just have that App Engine server up and providing a WebSocket endpoint that your extension can connect to in order to receive commands in real time, plus some sort of API / control panel on the web (authenticated, of course).
Any free web-server-based solution would lag, though; as much as 500 ms, I think.
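A rough sketch of the WebSocket idea on the extension side, assuming a persistent background page and a server you run yourself; the URL and command format are placeholders:

```javascript
// background.js (extension) - keep a control channel open to your own server.
// The URL and message format are assumptions for illustration.
const CONTROL_URL = 'wss://your-app.example.com/control';

function connect() {
  const socket = new WebSocket(CONTROL_URL);

  socket.onmessage = (event) => {
    const command = JSON.parse(event.data); // e.g. { "action": "disable" }
    if (command.action === 'disable') {
      chrome.storage.local.set({ enabled: false }); // requires the "storage" permission
    } else if (command.action === 'enable') {
      chrome.storage.local.set({ enabled: true });  // the rest of the extension checks this flag
    }
  };

  // Reconnect with a small delay if the server or network drops the connection.
  socket.onclose = () => setTimeout(connect, 5000);
}

connect();
```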
Try making a complementary native PC program: mobile remote-control apps usually have their PC counterpart running as a background service or as an application with just a tray icon. Such a program opens a TCP/UDP port on the PC, listens for commands from the mobile app, and can communicate with your extension via Chrome's native messaging API.
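On the extension side, talking to such a helper program looks roughly like this (the extension needs the "nativeMessaging" permission, and 'com.example.remote_host' is a placeholder for the host name you would register):

```javascript
// background.js - connect to the locally installed helper via native messaging.
// 'com.example.remote_host' must match the name in the native host's manifest (placeholder).
const port = chrome.runtime.connectNative('com.example.remote_host');

port.onMessage.addListener((message) => {
  // The helper forwards whatever it received from the phone,
  // e.g. { "action": "openTab", "url": "https://..." } (format is an assumption).
  if (message.action === 'openTab') {
    chrome.tabs.create({ url: message.url });
  }
});

port.onDisconnect.addListener(() => {
  console.log('Native host disconnected:', chrome.runtime.lastError?.message);
});
```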

Accessing file system in an iOS device

I'm an absolute newbie to the Xamarin world. I'm working on a web application where a user completes a long form (say some 100+ fields) and then submits it, which writes the information to a database. One of the requirements is that the user should be able to load the form and resume his work even when he is offline (no internet connection). I have used the HTML5 Application Cache, HTML5 local storage, Knockout.js and JavaScript so that every 2 seconds all of the form information is saved to the browser's local storage. But lately I noticed with a few users that the forms sometimes get deleted due to an iOS update. I also don't want to rely on the browser's cookies/cache to store this information.
I want to find out what my options are with Xamarin. Can I use a component like UIWebView in a Xamarin app to load my web application, and then access the iOS file system of that Xamarin app from the hosted web page?
Sure you can!
One launch image plus one screen with a UIWebView is all you need for this task. You can handle the UIWebView's events to save and load its state.
The good news is that such an app should be small enough to build with the free (Starter) version of the platform.
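On the web side, the periodic snapshot the question already describes can stay the same inside the UIWebView; the native shell can then read the serialized draft (for example via the web view's JavaScript evaluation) and persist it in the app's own file system, which survives browser-cache cleanups. A rough page-side sketch, with the form id and storage key as placeholders:

```javascript
// Snapshot the form every 2 seconds (as the question describes) and expose the
// serialized draft so a native shell can copy it into the app's file system.
// The form id and storage key are placeholders.
const FORM_ID = 'long-form';
const STORAGE_KEY = 'long-form-draft';

function saveDraft() {
  const form = document.getElementById(FORM_ID);
  const data = Object.fromEntries(new FormData(form));
  localStorage.setItem(STORAGE_KEY, JSON.stringify(data));
}

function restoreDraft() {
  const saved = JSON.parse(localStorage.getItem(STORAGE_KEY) || '{}');
  for (const [name, value] of Object.entries(saved)) {
    const field = document.querySelector(`#${FORM_ID} [name="${name}"]`);
    if (field) field.value = value;
  }
}

// The native wrapper can call this via the web view's JavaScript evaluation
// and write the returned string to a file inside the app sandbox.
window.getDraft = () => localStorage.getItem(STORAGE_KEY);

restoreDraft();
setInterval(saveDraft, 2000);
```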

WebRTC Streaming between PC and Mobile Client

I would like to implement peer-to-peer communication between a mobile device (iOS & Android) and a Windows PC. The mobile app will stream the camera output to the PC (no audio is required), and on the PC the user will be able to capture a screenshot from the running stream. Below are the possibilities I am considering.
Option 1: Develop a web-based application that runs in the Chrome or Firefox browser on the Windows PC, plus a mobile client app for Android and iOS. Using WebRTC, the mobile app streams the camera output to the website running in the PC's browser, and the user can capture a screenshot from the running stream, which is saved on the user's computer. The drawback of this solution is that a website cannot easily reach the user's local file storage; a standalone desktop application would be preferable because it can easily access the computer's file system.
Option 2: Develop 3 applications
A standalone desktop application containing all the features that need access to the computer's local file system.
A small website with just a single screen, used to display the mobile camera stream; the user captures the output from that page, and a kind of watchdog service in the desktop app grabs the latest captured screenshot from the Chrome or Firefox browser.
The third app would be the mobile client, which streams the camera output to the PC using WebRTC. The drawback of this solution is that it is not real-time, because the user has to work with two separate interfaces: the PC's Chrome or Firefox browser for the screen capture, and then a switch back to the desktop application afterwards.
My understanding is that it's not possible to have a serverless solution for WebRTC; a signaling server is required. I found some open-source WebRTC signaling servers, e.g. EasyRTC and signalmaster, which I would have to set up and configure in my own environment.
As this is my first WebRTC-based project, I would like to know your opinion of the solutions I am considering: are they right, or is there a better way to achieve this?
Thanks
Suresh
Hi Suresh, iOS does not support WebRTC, but it is possible on Android.
My suggestion for the desktop side is node-webkit (a desktop app built with HTML5, JavaScript, CSS3, Node.js and npm):
https://github.com/rogerwang/node-webkit/wiki
For the mobile app, the Intel XDK (but again, iOS does not support WebRTC):
http://xdk-software.intel.com/
You could use Twilio Video to do this.
You can build multi-party video calling into both web and native applications with the SDKs for:
JavaScript
iOS
Android
https://www.twilio.com/docs/api/video
You will also find server-side starter apps in various languages to get you started quickly.
In my preferred language, Python, for example, a small Flask app handles token creation for user access to video conversations in app.py, and the basic WebRTC functions can be found in quickstart.js.
Note: I work for Twilio.
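Whichever signaling stack you end up with, the screenshot requirement on the PC side is straightforward once the remote stream is attached to a video element: draw the element onto a canvas and export the frame. A minimal sketch (the element id is a placeholder):

```javascript
// Grab a still frame from the remote WebRTC <video> element (id is a placeholder).
function captureScreenshot() {
  const video = document.getElementById('remoteVideo');
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0);

  // Returns a PNG data URL; a desktop shell (e.g. node-webkit) could also write it to disk.
  return canvas.toDataURL('image/png');
}
```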

Resources