I am looking for a solution for video chat in a Xamarin.Forms app with an Azure backend. Azure does not currently support WebRTC, so my plan is to create two live streaming channels: one user's camera feeds the first channel, and the other user's camera feeds the second. Before building this out, I want to know whether it will work at all, and whether the performance will be acceptable.
Or should I go with SignalR instead?
Unfortunately, I think neither Azure Media Services nor SignalR will give you the low latency you need for a live video chat application.
I think your best bet when running on Azure will be to grab a virtual machine and install a 3rd-party product such as:
Kurento
jitsi
Wowza (which I think also offers its product as a SaaS)
Any other product you might find
Hope it helps!
I'm currently working on a module that analyzes statistics for videos from Azure Media Services. I want to know how I can get data such as average viewing time, number of views, and similar metrics. I'm pretty sure there must be an easy way to get this data, but I cannot find it. I found that Application Insights could be useful, and it seems I may have to track this information manually. I'm working on .NET 6. An example of code would be awesome. Thanks in advance!
PS: https://github.com/Azure-Samples/media-services-javascript-azure-media-player-application-insights-plugin/blob/master/options.md
I have found that Application Insights could be useful for my problem. Classes like TelemetryClient (from the Microsoft.ApplicationInsights package) seem relevant, but I can't find clear information about them.
No, there is no concept of client-side analytics or viewer analytics in Azure Media Services. You have to track and log things on your own on the client side. App Insights is a good solution for this, and there are some older samples out there showing how to do that with a player application.
Take a look at this sample - https://learn.microsoft.com/en-us/samples/azure-samples/media-services-javascript-azure-media-player-application-insights-plugin/media-services-javascript-azure-media-player-application-insights-plugin/
Just a warning: it is very old and probably very out of date. I would not use much of the code from that sample, as it uses SDKs from four years ago. Use it only as high-level guidance for what the architecture might look like.
Another solution would be to look at a 3rd-party service like Mux.com/Data that can plug into any player framework for client analytics.
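Since you are on .NET 6 and mentioned TelemetryClient, here is a minimal sketch of tracking a playback event manually with the Application Insights SDK. The event name, property names, and the watch-time metric are made up for illustration; the connection string placeholder has to be replaced with the one from your own App Insights resource.

```csharp
using System.Collections.Generic;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

// Minimal sketch: emit one custom event per playback session.
// "VideoPlayed", "videoId" and "watchTimeSeconds" are hypothetical names.
var config = TelemetryConfiguration.CreateDefault();
config.ConnectionString = "<your-app-insights-connection-string>";
var telemetry = new TelemetryClient(config);

telemetry.TrackEvent(
    "VideoPlayed",
    properties: new Dictionary<string, string>
    {
        ["videoId"] = "intro-clip-01",
        ["player"] = "azure-media-player"
    },
    metrics: new Dictionary<string, double>
    {
        ["watchTimeSeconds"] = 42.5
    });

// Flush before the process exits, otherwise buffered telemetry may be lost.
telemetry.Flush();
```

Once events are flowing, aggregates such as view counts and average watch time can be computed in the portal with a Kusto query over `customEvents`, e.g. summarizing `avg(todouble(customMeasurements["watchTimeSeconds"]))` for events named "VideoPlayed".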
I want to use an IP camera without IoT Edge support to live stream video footage to Azure, and I want to get insights from the video using Azure Video Analyzer for Media (aka Video Indexer).
I have come across two possible ways to achieve this in Azure:
1. I came across the LiveStreamAnalysis GitHub repo, but the functions fail to deploy because they use the older version of Media Services (v2). I have read about the newer version of Media Services, but didn't find a live stream sample to start with.
2. I found the Video Analyzer (preview) documentation, but I am only able to stream and record the live stream using a simulated IP camera.
I want to do further analysis on the video using the Video Indexer APIs, but I didn't find any way to achieve that with the second approach; the documentation only explains IoT Edge device pipelines and workflows.
How can I achieve this?
Thank you for bringing (1) to our attention; I reached out to the relevant contact.
There is no other built-in integration. Your option (2) uses Azure Video Analyzer (not Video Analyzer for Media), which is a different service.
The most promising path at the moment is (1), once the fix is in place.
Regarding (1): I am the main author of the Live Stream Analysis sample, and it is true that the functions need to be updated to use the AMS v3 API, and the logic apps need to be updated too, to use these new functions. I have started the work to build the new functions, but it is not complete. They will be posted to https://github.com/Azure-Samples/media-services-v3-dotnet-core-functions-integration/tree/main/Functions
You can see that there is a SubmitSubclipJob function, which will be key for the sample.
We have written two mobile apps and a web back end. The mobile apps are written in Xamarin, the back end in C# on Azure.
There is shared data between all three apps. Some of it is simple keyword tables, but some data tables will change; e.g. a mobile user moving around makes updates to a table, and those updates need to go back to the web app and then possibly out to the other apps.
We currently use SQLite on the mobile apps and follow an offline-first approach: when the user changes a table, we write to SQLite on the device and then sync to the server. If the user has no connectivity, a background process eventually syncs the data up to the server when possible.
All of this is custom code right now, and I am a little hesitant to continue down this path. We are in testing with about 4 users, but the expectation is to grow to thousands or tens of thousands of users within 6 to 18 months.
I think our approach might not scale. I would prefer to switch to an offline-first framework instead of continuing to roll our own.
Given our environment, I think the Azure Mobile SDK would be the obvious path to follow.
In general, would you choose an offline-first framework if your app will grow? In particular, does anyone have experience with the Azure Mobile SDK?
Note that your question will likely be closed because you're asking for an opinion/recommendation, but anyway...
From the Azure Mobile Apps Github repo:
Please note that the product team is not currently investing in any new feature work for Azure Mobile Apps.
Also to my knowledge, Microsoft has not announced any new SDK or upgrade path.
With that in mind, one option is to keep your custom code and strengthen it with code you'd extract from the SDK, or vice versa.
Assuming that your mobile app calls a web service, which then performs any necessary writes, you could load test a copy of your production environment to see if things fall over and at what point. I'm not a huge fan of premature optimization.
Assuming things do fall over, you could introduce a shock absorber between your web service endpoint and the database using a Service Bus Queue.
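As a sketch of that shock absorber, assuming the current Azure.Messaging.ServiceBus SDK and a hypothetical queue named "sync-updates": the web endpoint enqueues the change and returns immediately, and a separate worker drains the queue into the database at a rate the database can handle. The queue name and payload shape below are made up for illustration.

```csharp
using System;
using Azure.Messaging.ServiceBus;

// Hypothetical endpoint-side code: enqueue the mobile client's change
// instead of writing to the database inline.
await using var client = new ServiceBusClient("<service-bus-connection-string>");
ServiceBusSender sender = client.CreateSender("sync-updates");

var message = new ServiceBusMessage(BinaryData.FromObjectAsJson(new
{
    UserId = "user-123",
    Table = "UserLocations",
    ChangedAtUtc = DateTime.UtcNow,
    Payload = "{ \"lat\": 47.6, \"lon\": -122.3 }"
}));

await sender.SendMessageAsync(message);
```

A ServiceBusProcessor on the worker side then applies each message to the database, so a traffic spike only lengthens the queue instead of overloading SQL.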
I have two different Unity apps and I wish to connect them. My aim is to stream live video from App 2 to App 1, so I want App 2 to act as the server and sender, and App 1 to act as the receiver and show the video in a panel or RawImage. App 2 will run on iOS or Android and access the native camera; in the final product I want whatever is visible in App 2 to be visible in App 1's image (only the image). What I have achieved so far:
I have actually finished about 90% of both apps, and this is the last step I need. When it comes to servers and networking, I just don't have the required knowledge. Can someone tell me how I can do this? How can I stream live video from one Unity app to the other? Note: App 2 will run on an iPad or any Android phone, while App 1 will run on a normal desktop. Regarding the networking part, should I use Node.js, WebSockets, Unity networking? Even if I knew what to use, how would I stream it? This is the first project I've done that involves networking and servers, and I really don't have the experience needed for this. Any help is much appreciated.
Rather than trying to stream the app's video, it might be easier to use sockets or Unity's multiplayer services: recreate the relevant assets client-side and have the server send position updates.
In terms of what to use, Unity networking looks like a good choice. When I've done Unity client-server work before, I've used sockets, and that works okay, especially if you don't have too many things to keep track of.
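As a minimal sketch of the socket approach, here is a hypothetical Unity component that broadcasts an object's position over UDP each frame; a matching receiver in the other app would parse the three floats and apply them to its local copy of the object. The class name, IP address, and port are placeholders.

```csharp
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Hypothetical sender: attach to the object whose position should be
// mirrored in the receiving app.
public class PositionSender : MonoBehaviour
{
    UdpClient udp;

    void Start() => udp = new UdpClient();

    void Update()
    {
        Vector3 p = transform.position;
        // Semicolon-separated so the receiver can split and parse the floats.
        byte[] msg = Encoding.UTF8.GetBytes($"{p.x};{p.y};{p.z}");
        udp.Send(msg, msg.Length, "192.168.1.10", 9050); // receiver's address/port
    }

    void OnDestroy() => udp?.Close();
}
```

If you do need the actual camera image rather than a recreated scene, this approach won't be enough: you would have to encode the camera texture and ship it over a streaming protocol, which is where WebRTC-style plugins for Unity come in.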
I'm going to publish a video on a web page for streaming. I expect more than 100,000 visits per day within a month. I want to upload my video to a server (or service) that offers the same bandwidth to every client, even if there are hundreds of thousands of clients connected simultaneously.
I will connect the player to the external video.
Note: I cannot use YouTube or Vimeo because the video uses 360° technology, so I need to use my own custom player.
Please, could you suggest any service that offers this feature?
Thanks!!
I would say this is mostly a question of the streaming technology you'd like to use, not of the storage alone.
E.g. if you wish to stream via a binary protocol like RTMP, you'll have to use software like Wowza for transcoding and delivery; the load balancing needed to make proper use of bandwidth will then also be handled by that software's load balancer.
So you should decide which protocols and other technologies you plan to use. That will narrow your search parameters.