How can I disable the on-screen keyboard for PixelSense Surface?

I am writing a program for Surface mode that has four movable boxes for users, each with textboxes to enter data. Since Windows only allows one on-screen keyboard at a time, I had to create four keyboards programmatically.
However, I can't find a way to disable the keyboard that pops up by default when I touch these textboxes.
I have read that something similar was accomplished on a tablet by editing a registry key for a specific program.
How can I do this for PixelSense?

Perhaps it is possible to set the textboxes to
IsEnabled="False"
and then catch the PreviewTouchDown event to display the keyboard.

Related

Windows touch keyboard appearing in controls without text input

In MFC, how do I stop the touch keyboard from appearing when selecting controls without text inputs? Specifically, CComboBox-derived and original CListBox controls. The issue occurs in a large C++ application for touchscreen tablets running Windows 10. Bizarrely, selecting any CEdit control in the application solves the issue until the application restarts.
I've been looking at InputScope, AutomationPeer, and even killing the keyboard process TabTip.exe after it appears, but none of these prevent the keyboard from showing in the first place.
A previous solution involved changing the "Automatically show touch keyboard" registry setting SOFTWARE\Microsoft\TabletTip\1.7\EnableDesktopModeAutoInvoke on focus change, but alerting the touch keyboard with SendNotifyMessage(HWND_BROADCAST, WM_SETTINGCHANGE) added unacceptable delay to the UI.
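For reference, a rough Win32/C++ sketch of that earlier registry-based approach could look like the following; it assumes the value lives under HKEY_CURRENT_USER\SOFTWARE\Microsoft\TabletTip\1.7 (as named above) and that the WM_SETTINGCHANGE broadcast is what makes the keyboard re-read it.

// Sketch only: toggle "Automatically show touch keyboard" and notify listeners.
// Assumption: the value sits under HKCU\SOFTWARE\Microsoft\TabletTip\1.7.
#include <windows.h>

bool SetDesktopModeAutoInvoke(bool enable)
{
    HKEY key = nullptr;
    if (RegOpenKeyExW(HKEY_CURRENT_USER, L"SOFTWARE\\Microsoft\\TabletTip\\1.7",
                      0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
        return false;

    DWORD value = enable ? 1 : 0;
    LONG status = RegSetValueExW(key, L"EnableDesktopModeAutoInvoke", 0, REG_DWORD,
                                 reinterpret_cast<const BYTE*>(&value), sizeof(value));
    RegCloseKey(key);
    if (status != ERROR_SUCCESS)
        return false;

    // Broadcast the settings change so the touch keyboard picks up the new value.
    // This is the call that added the unacceptable UI delay described above.
    SendNotifyMessageW(HWND_BROADCAST, WM_SETTINGCHANGE, 0, 0);
    return true;
}

Calling something like SetDesktopModeAutoInvoke(false) on each focus change mirrors the approach described above, delay included, so it is shown here only to make the rejected technique concrete.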

Enabling keyboard shortcuts to confirm Dialogs in AppleScript

I am looking for a way to allow a user to complete a Dialog entry using keyboard shortcuts. Is this possible?
Other questions have discussed assigning shortcuts to the options in an AppleScript dialog box, but not to the "Continue"/"Okay" etc. button.
The main difficulty is that I'm using a multi-line text entry form, so the Enter key simply creates a new line instead of targeting the default button as it conventionally would. I'm hoping cmd+enter can be assigned to the default button instead.
The line of script defining the dialog in question is:
set theResponse to display dialog "Enter tasks:" default answer "
" buttons {"Cancel", "Continue"} default button "Continue"
Running your AppleScript code from Script Editor on a US English MacBook Pro, whether or not something is typed in, fn+enter presses the Continue button.
The same keyboard shortcut works on a US English Apple Magic Keyboard when connected to the MacBook Pro, and I'd assume it would do the same on any US English Mac it was connected to. I only have the MacBook Pro to test with at the moment.
In macOS, by default, pressing the Tab key in this use case will not move between the controls, because the governing setting in System Preferences > Keyboard > Shortcuts does not allow it to act on all controls.
You must select one of the following options, depending on the version of macOS you are running, in order to use the Tab key on all controls.
If you see:
Full Keyboard Access: In windows and dialogs, press Tab to move keyboard focus between:
(•) Text boxes and lists only
( ) All Controls
Select: (•) All Controls
If you see:
[] Use keyboard navigation to move between controls
Press the Tab key to move focus forward and Shift Tab to move focus backward.
Check: [√] Use keyboard navigation to move between controls
With this done, one can then use tab, tab, enter to press the Continue button in the dialog box produced by the code shown in the OP.
Side note: One can also try fn+command+enter, as that was necessary from within a VMware macOS Catalina virtual machine that I also tested in.
⌘-Enter (on the numeric keypad) presses Continue
If you are in a multiline text field, hit the Tab key so that focus is on some element other than the text field. Then the Enter key should route properly to the dialog's default close button.

Detecting touchpad movement vs regular mouse programmatically on Linux

I love the mod4 + mouse-drag combo for moving/resizing windows in Awesome WM; it's very intuitive with a regular mouse. Now that I'm using Awesome WM on my laptop, however, I find this combo more annoying with the touchpad than with a regular mouse.
The problem stems from the fact that I now need 3 fingers to perform a gesture that I could do with 2 before (1 to move on the touchpad, 1 to hold the left-click at all times, and one on mod4). Alternatively, I can apply more force to the touchpad and keep it pressed as I drag my finger, which is not any better since it puts a lot of stress on the finger doing the dragging.
What I would like to do instead is have awesome treat the left mouse button as pressed if both of the following conditions are met:
mod4 is pressed
the movement event is coming from the touchpad and not a regular mouse
To do so, however, I need to be able to detect that the movement is coming from the touchpad. Is there a way to do so in Awesome WM/Linux? I've looked through the keysyms (http://wiki.linuxquestions.org/wiki/List_of_keysyms) but don't see anything for the mouse. I've also looked at the mouse.lua file in Awesome WM, but it doesn't seem to have anything to differentiate between the two either (https://github.com/awesomeWM/awesome/blob/master/lib/awful/mouse/init.lua). If there is a way to tell that the last coordinate change came from a touchpad on Linux, that would resolve the issue, as I could simply create a Lua file to run such a check whenever Mod4 is pressed.
To do so, however, I need to be able to detect that the movement is coming from the touchpad. Is there a way to do so in Awesome WM/Linux?
Nope, there is no such way in AwesomeWM. Sorry.
In X11, this is possible via the input extension. However, awesome does not use that extension.
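To make "via the input extension" concrete outside of awesome, a small C++ sketch against XInput2 can enumerate slave pointer devices and flag the ones whose name contains "touchpad"; the name check is only a heuristic, since device naming depends on the driver. Build it with something like g++ list_pointers.cpp -lX11 -lXi (the file name is just an example).

// Sketch: enumerate XInput2 slave pointer devices and flag likely touchpads.
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>
#include <cstdio>
#include <cstring>

int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy)
        return 1;

    int ndevices = 0;
    XIDeviceInfo *devices = XIQueryDevice(dpy, XIAllDevices, &ndevices);
    for (int i = 0; i < ndevices; ++i) {
        if (devices[i].use != XISlavePointer)
            continue;  // skip master devices and keyboards
        // Heuristic: most drivers put "touchpad" somewhere in the device name.
        // strcasestr is a GNU extension, which is fine on Linux/glibc.
        bool touchpad = strcasestr(devices[i].name, "touchpad") != nullptr;
        std::printf("id=%d name=\"%s\"%s\n", devices[i].deviceid, devices[i].name,
                    touchpad ? "  <-- probably the touchpad" : "");
    }
    XIFreeDeviceInfo(devices);
    XCloseDisplay(dpy);
    return 0;
}

This only shows that X11 can tell the two pointing devices apart; awesome itself would still need to be patched (or an external helper used) to act on that information.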

How can I change the keyboard shortcuts of Unity desktop (Ubuntu 14.04)?

I want to change the default shortcuts for some touchpad events and key combinations, for example swiping left or right with three fingers to switch workspaces, using Super + K to open the terminal, and so on. What can I do?
Thanks!
Click on the shutdown button --> System Settings --> Keyboard --> Shortcuts, then choose an item from the list on the left and modify it as you want.

How can I set the default orientation of labview windows?

Whenever I open a new LabVIEW project, it opens two small windows, one for the block diagram and one for the front panel. Since using LabVIEW effectively requires simultaneous use of both, is it possible to set things up such that, upon starting a new VI, it opens these two windows in pre-determined positions and sizes?
I do not know of a setting to do so (and think there is no such setting), but your problem is easily solved by pressing Ctrl+T when a new VI is opened.
Ctrl+T tiles the front panel on the left half of the screen and the block diagram on the right half. Pressing Ctrl+T a second time tiles the panel on the top half and the diagram on the bottom half.
Shortcuts In LabVIEW
Another workaround:
Create a new empty VI
Resize and reposition the front panel window as you wish
Do the same for the block diagram window
Save the VI as a template (.vit)
Double click the template to use it (position and size of windows will be as they were when saving)
Alternatively, if you want to do it manually every time, you can press Win+Left on one of the windows and Win+Right on the other. This will evenly distribute the two windows across the screen.
You can set the window position for individual VIs by pressing Ctrl+I to open the VI properties and setting the desired appearance under "Window Size".
