Can different apps share anchors stored in the anchors store? - hololens

I use one app on the HoloLens to save its anchors to the anchor store. Can I share these anchors with other apps?
Thanks.
YL

With Azure Spatial Anchors, it is possible to share anchors across HoloLens devices; if only one device is involved, you can share them across applications as well.
Unreal Engine 4.27 and UE 5.0 also have Holographic Remoting support.
Since you are working with just one device, you should be able to get this working using local spatial anchors. Here is a link to a sample:
https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unreal/unreal-spatial-anchors?tabs=426


Is there a standard way of sending GPS navigation over BLE?

I'm developing a 'smart watch' which is connected via BLE to an app on a phone.
My idea is to show the direction on the watch so the user doesn't have to remove the phone from their pocket while using their favorite application (Waze, Google Maps, ...)
Is there any standard to send navigation information (turn left/right, ...) to a smart device over BLE?
It seems there are apps that do this, but they are fully custom: https://www.youtube.com/watch?v=naAhe7DTKYM. I'm checking on the side of Android Auto, but it seems to me it's only by USB cable.
Off the top of my head, there are two adopted services that you could use for this purpose:
Location and Navigation Service (mostly for outdoor use):
"The Location and Navigation Service (LN Service) exposes location and navigation-related data from a Location and Navigation sensor (Server) intended for outdoor activity applications."
Indoor Positioning Service (mostly for indoor use):
"The Indoor Positioning Service exposes location information to support mobile devices to position themselves in an environment where GNSS signals are not available, for example in indoor premises. The location information is mainly exposed via advertising and the GATT-based service is primarily intended for configuration."
If neither is 100% suitable for your application, you can create a custom profile that contains one or both of these services in addition to a custom service (turn left/right, etc.). This way, any watch can connect to the phone and read the adopted services, and for any additional information the watch can add support for the custom service.
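To make the custom-service idea concrete, here is a minimal sketch of what the payload of such a custom "navigation instruction" characteristic could look like. Everything below (the instruction codes, field sizes, and function names) is entirely hypothetical and not part of any adopted Bluetooth SIG specification; it only illustrates packing turn-by-turn data into the small fixed-layout packets that BLE characteristics favour:

```python
import struct

# Hypothetical instruction codes for a custom navigation characteristic;
# nothing here comes from an adopted Bluetooth SIG profile.
TURN_LEFT, TURN_RIGHT, STRAIGHT, ARRIVE = range(4)

# Assumed packet layout (little-endian, as is conventional in BLE):
#   uint8  instruction code
#   uint16 distance to the manoeuvre, in metres
#   uint8  length of the UTF-8 street name that follows
_HEADER = "<BHB"

def encode_instruction(code, distance_m, street=""):
    """Build the characteristic value the phone would write/notify."""
    name = street.encode("utf-8")
    return struct.pack(_HEADER, code, distance_m, len(name)) + name

def decode_instruction(payload):
    """Parse a received characteristic value back into its fields."""
    code, distance_m, name_len = struct.unpack_from(_HEADER, payload)
    offset = struct.calcsize(_HEADER)
    street = payload[offset:offset + name_len].decode("utf-8")
    return code, distance_m, street
```

A watch that understands this layout can decode notifications from the phone; keeping the header fixed-size and the street name length-prefixed keeps each packet well under the typical BLE MTU.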
When I was looking for such a "standard way" to do turn-by-turn navigation (an official Bluetooth specification for navigation, similar to the official Battery or Heart Rate services), I didn't find anything suitable, but I still have a feeling I may have missed something.
The services Youssif Saeed suggested are for a different type of navigation, not turn-by-turn, unfortunately.
Some apps (Sygic, for example) may integrate their own BLE service.
I guess the application you linked reads notifications (posted by navigation apps like Waze or Google Maps), extracts instructions from them, and then sends them to an external device via BLE.

Hololens 2 offline synchronization

Did anyone try to synchronize the object's position between 2 Hololens 2 devices without an internet connection?
I am using Photon to do it for HoloLens 2 glasses on the same LAN. When one player moves an object, the others can see it, but the real position of the object in the room does not match, because each device has an independent coordinate system.
What is the best way to solve this problem?
Thanks!
What you need is to bind the hologram to the physical world. It is recommended to use local spatial anchors to ensure anchored holograms stay precisely in place, so that the hologram on different devices remains fixed at the same position in the physical world. Then follow the doc Local anchor transfers in Unity to let one HoloLens device export an anchor to be imported by a second HoloLens device.

Azure Media Services player

I'm feeling that this is a really dumb question, but my research tells me I have to create my own player. Is that true?
I have a link (publish URL) from Azure Media services like this:
http://streamvideotest.streaming.mediaservices.windows.net/4ed49a08-f82d-462e-a05e-acea910064a5/7g91d5d8-b213-406c-90d8-75a3a5e2456d.ism/Manifest
I would like to hand this out to a few people so they can play the video, or the live feed on that channel. But you need some type of player? I've tried Windows Media Player (Open URL), but that always fails.
It depends on which platform you want to reach.
If you are trying to play a live stream on a PC using Smooth Streaming, you could use the Silverlight player (http://smf.cloudapp.net/healthmonitor). Or, if you want to stream DASH through a modern browser (IE or Chrome), you can do it natively through the HTML5 video tag.
If you are trying to reach the iOS platform, you can do it natively by delivering an HLS stream: append (format=m3u8-aapl) to your link above.
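As a small sketch of that URL manipulation (the helper name is mine; the (format=m3u8-aapl) suffix for requesting HLS from a Smooth Streaming manifest is the one mentioned above):

```python
def with_format(manifest_url, fmt="m3u8-aapl"):
    """Append a dynamic-packaging format specifier to a Smooth Streaming
    manifest URL, e.g. ...ism/Manifest -> ...ism/Manifest(format=m3u8-aapl)."""
    return "%s(format=%s)" % (manifest_url, fmt)
```

The returned URL can then be handed to an HLS-capable player such as the native player on iOS.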
This article describes the different players you can use on each platform: https://msdn.microsoft.com/en-us/library/azure/dn223283.aspx.
Also, Azure Media Services has just rolled out the Azure Media Player, which detects the capabilities of the platform and feeds it the right streaming protocol using the right player technology. Please check out http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/.
Thanks,
Mingfei Yan
The Azure Media Player has officially been rolled out at: http://amsplayer.azurewebsites.net/
Of course, Mingfei is the Queen of Azure Media Services and the most definitive answer to any and all AMS questions!
Just to add to this thread: as Mingfei mentioned, it depends on where you want your video to work. However, if you have a browser-based offering, we'd recommend starting with the Azure Media Player.
http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/

Cirrus or Adobe Media Server

What is the difference between using Cirrus or FMS? What are the pros and cons the limitations and advantages of each?
Thank You
This link might be helpful: http://labs.adobe.com/technologies/cirrus/
Unlike Adobe Media Server, Cirrus does not support media relay, shared objects, scripting, etc. So with Cirrus, you can only develop applications where Flash Player endpoints communicate directly with each other.

iOS BLE - How to keep app active in the background?

I am trying to find a clever way to keep a BLE app active in the background on iOS 6, without breaking any of Apple's rules. I plan to use the phone as a peripheral device and another BLE circuit as the central. My app will automatically be opened when a user arrives to a building using geofencing. After that the iPhone will connect to the first BLE central device it sees (the device will be in its white list). The user will then be able to move throughout the building switching to different BLE "nodes".
My question is: What do I need to do in the background when a user is stationary at their desk so that the app does not get suspended due to memory resources?
My idea is based on this solution for a separate problem: There could potentially (not regularly) be 10-50 users in an area with only a few BLE "nodes" and I read at bluetooth.org that I could setup a dynamic connection system, basically rotating connections through all the users.
My idea is to setup a similar dynamic system where the central device (not the iPhone) disconnects the device on regular intervals (30-40 minutes) and then the iPhone will reconnect.
Is this feasible? Is it against the iOS development guidelines? I was unable to find anything explicit about this. I have also asked on the iOS developer forum, but unfortunately it is not as popular as this site.
Thanks in advance!
In Xcode, go to Project target -> Capabilities, enable Background Modes, and check "Uses Bluetooth LE Accessories".
Also add the following key to your .plist file:
Required background modes -> App communicates using CoreBluetooth
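For reference, the raw Info.plist key behind those checkboxes is UIBackgroundModes. Note that "App communicates using CoreBluetooth" corresponds to the bluetooth-central mode; since this question uses the phone as a peripheral, the "App shares data using CoreBluetooth" (bluetooth-peripheral) entry is the relevant one. A fragment showing both:

```xml
<key>UIBackgroundModes</key>
<array>
    <!-- "App shares data using CoreBluetooth": phone acting as a peripheral -->
    <string>bluetooth-peripheral</string>
    <!-- "App communicates using CoreBluetooth": phone acting as a central -->
    <string>bluetooth-central</string>
</array>
```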
