How do I connect my custom button board to a PC via USB? - windows-10

Goal/Challenge:
I have been challenged to build a custom controller without using dedicated controller boards, e.g. a Raspberry Pi or an Arduino.
What I have done so far:
I have built a custom board that takes power from the red wire and outputs power on the coloured strip depending on which buttons the user has pressed (1 = on, 0 = off). There are 9 outputs and 1 power input.
My problem:
I am trying to connect this board to my PC via USB.
I have seen that some chips can be used to do this, but I have no idea which ones to use or how to connect them.
Help/info I need:
What could I use to connect my button board to a PC (Windows 10) via USB?
Extra:
I don't know if I'm posting this in the right place or using the right tags.
Please suggest any changes I need to make.

If you just want to make a custom button board, the easiest way is to buy some buttons (e.g. arcade buttons) and a zero-delay USB encoder.
It plugs into the PC via USB and works like any other controller; you can assign the desired function to each button in Windows itself.
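Once the encoder is plugged in, you can verify that Windows actually sees it as a game controller before mapping anything. A minimal sketch, assuming Python with the pygame package is installed (the reported name and button count will depend on your encoder):

import time
import pygame

# Quick check that the zero-delay USB encoder is detected as a game
# controller and that button presses come through.
pygame.init()
pygame.joystick.init()

if pygame.joystick.get_count() == 0:
    raise SystemExit("No game controller detected")

stick = pygame.joystick.Joystick(0)
stick.init()  # harmless on pygame 2, required on older versions
print("Detected:", stick.get_name(), "with", stick.get_numbuttons(), "buttons")

while True:
    pygame.event.pump()  # refresh joystick state without a window loop
    pressed = [i for i in range(stick.get_numbuttons()) if stick.get_button(i)]
    if pressed:
        print("Buttons down:", pressed)
    time.sleep(0.05)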

Related

How to turn on an unpaired BLE (Bluetooth Low Energy) device?

I have a Bluetooth Low Energy air purifier. The button that turns it on and puts it in pairing mode is broken, but the mask can still charge. My PC and phone can see the device but can't pair/connect to it (it needs an app), and the app is in its default state since I moved to a new phone (the old phone was factory reset; I forgot that the mask button was broken).
How do I turn the device on and put it in pairing mode without the power button working? Is there anything I can do for it?
I couldn't find a replacement button on eBay, and the product isn't sold in North America. It's the LG PuriCare Mask Gen 2. Is it a lost cause? T.T I found a teardown video of it:
https://youtu.be/VGjd6fnD-oE
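As a first check, a short BLE scan from the PC can confirm whether the mask is still advertising at all (and under what name) before worrying about pairing. A minimal sketch, assuming Python with the cross-platform bleak library; the "PuriCare" name filter is only a guess:

import asyncio
from bleak import BleakScanner

async def main():
    # Scan for a few seconds and list every advertising BLE device.
    devices = await BleakScanner.discover(timeout=10.0)
    for d in devices:
        print(d.address, d.name)
        # "puricare" is a guess at the advertised name; adjust as needed.
        if d.name and "puricare" in d.name.lower():
            print("Mask appears to be advertising:", d)

asyncio.run(main())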

Capturing an input device disables two-fingers-for-right-click capability

I have a convertible laptop for which there isn't great Linux support: the desktop environment is unable to detect when the device is in tablet mode, so the keyboard and touchpad are always active, which makes tablet mode almost useless. I've solved this problem by writing a simple Python script that grabs the keyboard and mouse input devices and proxies events to the system until a specific key sequence is received. At this point, it stops proxying events until the same key sequence is seen again.
The code is effectively a slightly more complex version of this example (which reproduces the problem):
import evdev
import selectors

# Grab the touchpad exclusively and mirror its events through a virtual
# uinput device so the rest of the system still sees them.
dev = evdev.InputDevice('/dev/input/event5')
ui = evdev.UInput.from_device(dev)
dev.grab()

selector = selectors.DefaultSelector()
selector.register(dev, selectors.EVENT_READ)

while True:
    for key, mask in selector.select(0.1):
        dev = key.fileobj
        for event in dev.read():
            cat = evdev.categorize(event)
            print(cat)
            ui.write_event(event)
/dev/input/event5 is the touchpad. The system has the following devices:
Available devices:
/dev/input/event0: Lid Switch
/dev/input/event1: Power Button
/dev/input/event2: Sleep Button
/dev/input/event3: Power Button
/dev/input/event4: AT Translated Set 2 keyboard
/dev/input/event5: SynPS/2 Synaptics TouchPad
/dev/input/event6: Wacom HID 5072 Pen
/dev/input/event7: Wacom HID 5072 Finger
/dev/input/event8: HDA Intel PCH Mic
/dev/input/event9: HDA Intel PCH Headphone
/dev/input/event10: HDA Intel PCH HDMI/DP,pcm=3
/dev/input/event11: PC Speaker
/dev/input/event12: ThinkPad Extra Buttons
/dev/input/event13: Integrated Camera: Integrated C
/dev/input/event14: Video Bus
When this code runs, movements and regular click actions work just fine, but a two-fingered click, which normally acts like a right-button click, no longer works. Since this code is just taking events and re-sending them, why is this behavior different? How would I get the two-fingered click to work as expected?
I think what you need is very similar to this, altered of course to suit your needs:
Python_Touchscreen_RightClick.py
If this doesn't work for you, you can also try using Gesture Events from python-libinput.
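For reference, the core idea of that script can be folded into the proxy above: most touchpads report BTN_TOOL_DOUBLETAP while exactly two contacts are down, so a physical click can be rewritten as a right-button click before forwarding. This is only a sketch of that approach (the exact codes your touchpad emits may differ):

import evdev
from evdev import ecodes

dev = evdev.InputDevice('/dev/input/event5')  # the touchpad
ui = evdev.UInput.from_device(dev)
dev.grab()

two_fingers_down = False
try:
    for event in dev.read_loop():
        if event.type == ecodes.EV_KEY and event.code == ecodes.BTN_TOOL_DOUBLETAP:
            # The touchpad holds this key down while two contacts are present.
            two_fingers_down = bool(event.value)
        if (event.type == ecodes.EV_KEY and event.code == ecodes.BTN_LEFT
                and two_fingers_down):
            # Rewrite the physical click as a right-button click.
            ui.write(ecodes.EV_KEY, ecodes.BTN_RIGHT, event.value)
            ui.syn()
            continue
        ui.write_event(event)
finally:
    dev.ungrab()
    ui.close()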

How can I configure MRTK to work with touch input in editor and on mobile devices?

I'm building an application that will run on both HoloLens and mobile devices (iOS/Android). I'd like to be able to use the same manipulation handlers on all devices, with the following goals:
Use ARFoundation for mobile device tracking and input
Use touch input with MRTK with ManipulationHandler and otherwise use touch input as normal (UI)
Simulate touch input in the editor (using a touch screen or mouse) but retain the keyboard/mouse controller for camera positioning.
So far I've tried/found:
MixedRealityPlayspace always parents the camera, so I added the ARSessionOrigin to that component, and all the default AR components to the camera (ARCameraManager, TrackedPoseDriver, ARRayCastManager, etc.)
Customizing the MRTK pointer profile to only contain MousePointer and TouchPointer.
Removing superfluous input data providers.
Disabling Hand Simulation in the InputSimulationService
Generally speaking, the method of adding the ARSessionOrigin to the MixedRealityPlayspace works as expected and ARFoundation is trivial to set up. However, I am struggling to understand how to get the ManipulationHandler to respond to touch input.
I've run into the following issues:
Dragging on a touch screen with a finger moves the camera (editor). Disabling the InputSimulationService fixes this, but then I'm unable to move the camera...
Even with the camera disabled, clicking and dragging does not affect the ManipulationHandler.
The debug rays are drawn in the correct direction, but the default touchpointer rays draw in strange positions.
I've attached a .gif explaining this. This is using touch input in the editor. The same effect is observed running on device (Android).
This also applies to Unity UI (world space canvas) whereby clicking on a UI element does not trigger (on device or in editor), which suggests to me that this is a pointer issue not a handler issue.
I would appreciate some advice on how to correctly configure the touch input and mouse input both in editor and on device, with the goal being a raycast from the screen point using the projection matrix to create the pointer, and use two-finger touch in the same way that two hand rays are used.
Interacting with Unity UI in world space on a mobile phone is supposed to work in MRTK, but there are a few bugs in the input system preventing it from working. The issue is tracked here: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/5390.
The fix has not been checked in, but you can apply a workaround for now (thanks largely to the work you yourself did, newske!). The workaround is posted in the issue. Please see https://gist.github.com/julenka/ccb662c2cf2655627c95ffc708cf5a69. Just replace each file in MRTK with the version in the gist.

Qt5 Windows 10 Mobile keyboard

On Windows 10 Mobile (UWP), the built-in virtual keyboard, which pops up as soon as you focus a text input element, does not change the screen size and therefore overlaps my input element.
Is there a possibility to
1. connect to an event like keyboard open/close
2. get the height of the virtual keyboard
or any other workaround?
I fixed it by getting the input method object with QGuiApplication::inputMethod(), connecting to its visibleChanged() signal,
and then requesting the rectangle with keyboardRectangle().
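The same signal/getter pair is reachable from Python as well, e.g. through the PyQt5 bindings. This is an illustrative desktop sketch of the idea, not the exact UWP setup from the question:

import sys
from PyQt5.QtWidgets import QApplication, QLineEdit
from PyQt5.QtGui import QGuiApplication

app = QApplication(sys.argv)

# QGuiApplication.inputMethod() gives access to the platform virtual keyboard.
im = QGuiApplication.inputMethod()

def on_keyboard_visibility_changed():
    # keyboardRectangle() reports the area covered by the on-screen keyboard,
    # which can be used to scroll or resize the focused input element.
    print("keyboard visible:", im.isVisible(), "rect:", im.keyboardRectangle())

im.visibleChanged.connect(on_keyboard_visibility_changed)

edit = QLineEdit("Tap here to open the virtual keyboard")
edit.show()
sys.exit(app.exec_())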

What is a Native keyboard?

I have a simple Java phone with touchscreen capabilities. In the 'Write Message' section, a simple/normal (non-QWERTY) keyboard is available to compose messages. Now my question is: is this default simple/normal (non-QWERTY) keyboard called the native keyboard, or is it something different altogether?
When a phone does not have a physical keyboard, we call the virtual keyboard (shown on screen) the native keyboard.
If you use LCDUI Forms or a TextBox, this keyboard is presented automatically by the Java Virtual Machine.
