SKMaps: Turn LED for right/left turns

I would like to know if it is possible to use the SKMaps SDK to implement turn-by-turn navigation and toggle the iPhone's LED accordingly for right and left turns.
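SKMaps would only drive the navigation side; toggling the LED itself goes through AVFoundation's torch API. Below is a minimal sketch of the LED half, assuming you call it from whichever SKMaps routing/advice callback reports the next turn direction (the `didReceiveTurnAdvice` hook and the left/right flag are placeholders, not SKMaps API):

```swift
import AVFoundation

/// Flashes the iPhone's LED (torch) a given number of times.
func flashTorch(times: Int) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }

    // Alternate on/off every 0.25 s; two toggles per flash.
    for i in 0..<(times * 2) {
        DispatchQueue.main.asyncAfter(deadline: .now() + Double(i) * 0.25) {
            do {
                try device.lockForConfiguration()
                device.torchMode = (i % 2 == 0) ? .on : .off
                device.unlockForConfiguration()
            } catch {
                print("Torch unavailable: \(error)")
            }
        }
    }
}

// Hypothetical hook: call this from the SKMaps navigation/advice callback
// that tells you the upcoming turn (the callback name is an assumption).
func didReceiveTurnAdvice(isLeftTurn: Bool) {
    // e.g. one flash for a left turn, two for a right turn
    flashTorch(times: isLeftTurn ? 1 : 2)
}
```

The torch code is standard AVFoundation; the only SKMaps-specific part you need to verify is which routing delegate callback exposes the turn direction.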

Related

How can I configure MRTK to work with touch input in editor and on mobile devices?

I'm building an application that will run on both HoloLens and mobile devices (iOS/Android). I'd like to be able to use the same manipulation handlers on all devices with the goals:
Use ARFoundation for mobile device tracking and input
Use touch input with MRTK's ManipulationHandler, and otherwise handle touch input as normal (UI)
Simulate touch input in the editor (using a touch screen or mouse) but retain the keyboard/mouse controller for camera positioning.
So far I've tried/found:
MixedRealityPlayspace always parents the camera, so I added the ARSessionOrigin to that component, and all the default AR components to the camera (ARCameraManager, TrackedPoseDriver, ARRaycastManager, etc.)
Customizing the MRTK pointer profile to only contain MousePointer and TouchPointer.
Removing superfluous input data providers.
Disabling Hand Simulation in the InputSimulationService
Generally speaking, the method of adding the ARSessionOrigin to the MixedRealityPlayspace works as expected and ARFoundation is trivial to set up. However, I am struggling to understand how to get the ManipulationHandler to respond to touch input.
I've run into the following issues:
Dragging on a touch screen with a finger moves the camera (in the editor). Disabling the InputSimulationService fixes this, but then I'm unable to move the camera...
Even with the camera disabled, clicking and dragging does not affect the ManipulationHandler.
The debug rays are drawn in the correct direction, but the default TouchPointer rays are drawn in strange positions.
I've attached a .gif explaining this. This is using touch input in the editor. The same effect is observed running on device (Android).
This also applies to Unity UI (world-space canvas): clicking on a UI element does not trigger it (on device or in the editor), which suggests to me that this is a pointer issue rather than a handler issue.
I would appreciate some advice on how to correctly configure touch and mouse input both in the editor and on device, with the goal of raycasting from the screen point (using the camera's projection matrix) to create the pointer, and using two-finger touch in the same way that two hand rays are used.
Interacting with Unity UI in world space on a mobile phone is supposed to work in MRTK, but there are a few bugs in the input system preventing it from working. The issue is tracked here: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/5390.
The fix has not been checked in, but you can apply a workaround for now (thanks largely to the work you yourself did, newske!). The workaround is posted in the issue. Please see https://gist.github.com/julenka/ccb662c2cf2655627c95ffc708cf5a69. Just replace each file in MRTK with the version in the gist.

Does the MRTK include animated hand visualizers with grab functionality for use with their VR headsets?

Looking at the sample scenes, it seems like the MRTK uses a pointer system instead of actually giving you virtual hands that can be manipulated by pressing buttons or triggers on the controllers, and allowing you to grab objects.
Is there a built-in way to use hands?
There is currently no set of virtual hands built into MRTK for VR; however, it would be possible to rig up a custom visual for a controller that looks like hands, and animate the hands based on things like which buttons/triggers are pressed on the controllers.
Oculus provides rigged hand models in its SDK, which you can adapt for use in MRTK, as shown in the video below.
https://www.youtube.com/watch?v=F3e2lwqVPyc&list=PLCK8aOPy3e4JhG06GRdlJkmNXoITEewxV&index=4
You'd just need to do a little more work to get the hand rigging hooked up to the MRTK hands.

Detecting touchpad movement vs regular mouse programmatically on Linux

I love the mod4 + mouse-drag combo for moving/resizing windows in Awesome WM; it's very intuitive with a regular mouse. Now that I'm using Awesome WM on my laptop, however, I find this combo much more annoying with the touchpad than with a regular mouse.
The problem stems from the fact that I now need 3 fingers to perform a gesture that I could do with 2 before (1 to move on the touchpad, 1 to keep on the left-click at all times, and 1 on mod4). Alternatively, I can apply more force to the touchpad and keep it pressed as I drag my finger, which is not any better since it puts a lot of stress on the finger doing the dragging.
What I would like to do instead is have Awesome treat the left mouse button as pressed if both of the following conditions are met:
mod4 is pressed
movement event is coming from touchpad and not regular mouse
To do so, however, I need to be able to detect that the movement is coming from the touchpad. Is there a way to do so in Awesome WM/Linux? I've looked through the keysyms (http://wiki.linuxquestions.org/wiki/List_of_keysyms) but don't see anything for the mouse. I've also looked at the mouse.lua file in Awesome WM, but it doesn't seem to have anything to differentiate between the two either (https://github.com/awesomeWM/awesome/blob/master/lib/awful/mouse/init.lua). If there is a way to tell that the last coordinate change came from a touchpad on Linux, that would resolve the issue, as I could simply create a Lua file to run such a check whenever Mod4 is pressed.
To do so, however, I need to be able to detect that the movement is coming from the touchpad. Is there a way to do so in Awesome WM/Linux?
Nope, there is no such way in AwesomeWM. Sorry.
In X11, this is possible via the input extension. However, awesome does not use that extension.

UIWebView with Custom Gestures

As of right now I have a view with a UIWebView inside of it and some added custom gestures. Some examples of these gestures are Two Finger Slide Right to Go Back, Two Finger Slide Left to Go Forward, Two Finger Long Press for Refresh, etc. But now I'm facing an issue I knew I would have to face when I began developing this app:
All of the gestures work great unless the UIWebView is zoomed in. Even if it is zoomed in the tiniest bit (or you are able to scroll the web page horizontally), the gestures that require you to swipe left or right are suddenly disabled, because the UIWebView takes priority over these gestures.
If anyone can shine some light on this issue or even provide a work-around, I would be very grateful. Thanks!
I'm more familiar with OS X, but could you subclass UIWebView to remove this behavior?
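Along those lines, one hedged workaround (a sketch only, not tested against the original app; UIWebView exposes its scrollView from iOS 5 on and is deprecated in modern SDKs) is to attach the two-finger recognizers to the web view's scroll view and allow them to run simultaneously with the built-in pan/zoom gestures, so zooming or horizontal scrolling no longer swallows them:

```swift
import UIKit

class BrowserViewController: UIViewController, UIGestureRecognizerDelegate {
    let webView = UIWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        view.addSubview(webView)

        let back = UISwipeGestureRecognizer(target: self, action: #selector(goBack))
        back.direction = .right
        back.numberOfTouchesRequired = 2

        let forward = UISwipeGestureRecognizer(target: self, action: #selector(goForward))
        forward.direction = .left
        forward.numberOfTouchesRequired = 2

        for recognizer in [back, forward] {
            recognizer.delegate = self
            // Attach to the scroll view, i.e. the same gesture hierarchy
            // that normally intercepts horizontal swipes when zoomed in.
            webView.scrollView.addGestureRecognizer(recognizer)
        }
    }

    @objc func goBack() { webView.goBack() }
    @objc func goForward() { webView.goForward() }

    // Let our two-finger swipes be recognized at the same time as the
    // web view's own pan/zoom gestures instead of being cancelled by them.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        return true
    }
}
```

Requiring two touches keeps single-finger panning and pinch-zoom behaving normally; only the two-finger swipes are shared between the recognizers.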

iPhone SDK UIImagePicker Hide Controls

I know how to hide the camera controls (.showsCameraControls = NO), but if I do this I lose the button to switch from the rear to the front-facing camera, which I need. Is there a way to keep that button but lose the controls at the bottom?
I tried keeping all of the controls and overlaying on top of the bottom bar, but the bottom bar always stays on top of the cameraOverlayView, whatever I try. I think it used to work but doesn't in 5.0.
I also realise you can add your own button to switch between the two cameras (.cameraDevice), but I want to keep it looking as much like the proper interface as possible.
Any pointers are really appreciated. The whole point of this is that I need to call .takePicture myself but want the interface to look exactly like it normally does, with all of the default buttons.
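Since the overlay can't be pushed above the stock bottom bar, the usual compromise is the one hinted at above: hide the controls entirely and rebuild only the pieces you need on cameraOverlayView. A minimal sketch (the CameraController class, button titles, and frames are placeholders; it will not reproduce the exact system look the question is after):

```swift
import UIKit

class CameraController: UIViewController {
    let picker = UIImagePickerController()

    func presentCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        picker.sourceType = .camera
        picker.showsCameraControls = false   // hides the whole bottom bar

        // Custom overlay with just a flip button and a shutter button.
        let overlay = UIView(frame: UIScreen.main.bounds)

        let flip = UIButton(type: .system)
        flip.setTitle("Flip", for: .normal)
        flip.frame = CGRect(x: overlay.bounds.width - 80, y: 40, width: 60, height: 40)
        flip.addTarget(self, action: #selector(flipCamera), for: .touchUpInside)
        overlay.addSubview(flip)

        let shutter = UIButton(type: .system)
        shutter.setTitle("Shoot", for: .normal)
        shutter.frame = CGRect(x: overlay.bounds.midX - 40,
                               y: overlay.bounds.height - 100, width: 80, height: 60)
        shutter.addTarget(self, action: #selector(shoot), for: .touchUpInside)
        overlay.addSubview(shutter)

        picker.cameraOverlayView = overlay
        present(picker, animated: true)
    }

    @objc func flipCamera() {
        // Toggle between the rear and front-facing cameras.
        picker.cameraDevice = (picker.cameraDevice == .rear) ? .front : .rear
    }

    @objc func shoot() {
        picker.takePicture()
    }
}
```

This keeps .takePicture under your control while still offering a camera-flip button, at the cost of styling the two buttons yourself rather than inheriting the default interface.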
