Emulate physical keyboard input on Windows? On-screen keyboard isn't enough. - keyboard

I have an unusual exploratory test automation task, which relies on detecting keyboard input (certain trigger words).
However, automating the task has proven very difficult, as the input apparently must pass through the keyboard driver. Setting up UI automation (like Sikuli, or even Ghost Mouse) to click the on-screen keyboard, or using Sikuli to 'type' or copy/paste text into any Windows UI, is not registered.
Any ideas on how to emulate physical keyboard input for the purpose of automation?

You could use an Arduino to emulate a physical keyboard (there are easy-to-use libraries available). If you use one of the more capable Arduinos with multiple serial ports, you can also send it the keypresses to emulate through the serial port. The computer will think it is a keyboard like any other, except you can control it with software.
Of course, this solution assumes you can connect something to a USB port of the test computer.
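As a rough illustration of the idea, here is a minimal sketch for a USB-HID-capable board such as a Leonardo or Micro (the board choice, pin-out and baud rate are my assumptions, not something specified above): characters arriving on the spare hardware serial port are replayed to the test machine as keystrokes.

    // Minimal sketch for a USB-HID-capable Arduino (e.g. Leonardo/Micro), which
    // has a second hardware serial port (Serial1) alongside the native USB link.
    // Bytes received on Serial1 are replayed to the host as real keystrokes.
    #include <Keyboard.h>

    void setup() {
      Serial1.begin(9600);   // control channel from the automation machine
      Keyboard.begin();      // start acting as a USB keyboard toward the test PC
    }

    void loop() {
      if (Serial1.available() > 0) {
        char c = Serial1.read();
        Keyboard.write(c);   // sends a full press-and-release for the character
      }
    }

On the automation side you then only need to open the serial port and write whatever text you want "typed"; to the test machine the input is indistinguishable from a physical keyboard.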

Related

How to disable the built-in keyboard in Windows 10 Pro using code?

What I want to happen is that every time I use my USB keyboard, my internal keyboard is automatically disabled.
It's a Chuwi brand laptop.
Your best bet is to do it manually (see the last paragraph). If you have coding skills, you can try these ideas, though they are not simple:
Use an always-running batch or PowerShell script to interface with devcon.exe (a rough sketch follows below).
Write a driver or low-level utility to do it yourself.
If not, you should disable it manually using Device Manager (devmgmt.msc). Check under Human Interface Devices and disable (or enable) the keyboard that's always there.
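To make the devcon.exe idea concrete, here is a small sketch (in C++ rather than batch/PowerShell, just to show the shape of it). The hardware ID is a placeholder; you would first have to look up the real ID of the internal keyboard with devcon find *keyboard*, and the program must run from an elevated prompt with devcon.exe on the PATH.

    // Sketch of the "script driving devcon.exe" idea.
    // INTERNAL_KBD_ID below is a placeholder; find the real hardware ID of the
    // built-in keyboard with:  devcon find *keyboard*
    #include <cstdlib>
    #include <string>

    int main(int argc, char* argv[]) {
        const std::string internalKbdId = "ACPI\\PNP0303";  // placeholder ID

        // Pass "disable" or "enable" as the first argument.
        std::string action = (argc > 1) ? argv[1] : "disable";
        std::string cmd = "devcon " + action + " \"" + internalKbdId + "\"";
        return std::system(cmd.c_str());
    }

An always-running version would wrap this in a loop that polls devcon find for the USB keyboard and toggles the internal one accordingly.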

Linux virtual keyboard and evdev

I'm writing some software for Linux that uses libevdev for input processing.
To my surprise, all the virtual on-screen keyboards I found simulate high-level X Window Server events. So they're not recognized by udev, don't appear in the /dev/input folder, and aren't visible with evtest.
Is there any software keyboard that is low-level enough for that? Or maybe some trick to achieve it?
There is a good reason why this is done this way. The /dev/input devices are devices that have some kind of physical (electrical, optical and/or mechanical) input. These are converted by the Linux kernel driver into something that generates EV_EVENTS. These events are processed by the xf86_input_evdev driver into X11 inputs, which are understood by the server. Since you can generate X11 inputs directly from an X11 program, it is quite a lot of work to create a device driver that accepts input on one side from an X11 app and generates kernel input events on the other. So while not impossible, it is a lot of work for no gain to create a driver or two for this purpose.
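To make the distinction concrete, this is roughly what those on-screen keyboards do (the XTest extension is my assumption about the mechanism; some use XSendEvent instead): they ask the X server to synthesize a key event. The event exists only inside X11, so udev, /dev/input and evtest never see it.

    // What a typical on-screen keyboard effectively does: ask the X server to
    // fake a key press. The event exists only inside X11; the kernel input
    // layer (/dev/input, evtest, libevdev) never sees it.
    // Build with: g++ fake_key.cpp -lX11 -lXtst
    #include <X11/Xlib.h>
    #include <X11/extensions/XTest.h>
    #include <X11/keysym.h>

    int main() {
        Display* dpy = XOpenDisplay(nullptr);
        if (!dpy) return 1;

        KeyCode kc = XKeysymToKeycode(dpy, XK_a);
        XTestFakeKeyEvent(dpy, kc, True, 0);   // press 'a', no delay
        XTestFakeKeyEvent(dpy, kc, False, 0);  // release 'a'
        XFlush(dpy);

        XCloseDisplay(dpy);
        return 0;
    }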

Capturing Global Keyboard Events On Linux With NodeJS

I have a headless Debian ARM machine that I'm running Node on. The device has hard buttons that are mapped to normal keyboard events using gpio-keys.
My goal is to capture the global events from both the hard buttons as well as any attached keyboards in Node. I need a solution that can capture the keydown/keyup events independently of the terminal that it's run in (it will be run over an SSH session). It doesn't have to be cross-platform, as long as it works on ARM Debian I'll accept it.
I am imagining something reading directly from whatever sysfs attributes are necessary, but that's not a requirement.
Can anyone help me on this? I've been stuck for a while.
One of the device files /dev/input/event* will represent the gpio-keys device. You can figure out which one in a number of ways; one easy one is to look at the contents of the uevent file for the device, e.g. /sys/class/input/event0/device/uevent. It'll contain a number of useful key-value properties.
Once you've figured out which device you want, you can open and read from it. It'll return a stream of struct input_event records, as defined in <linux/input.h>. These events correspond to presses and releases of each of your buttons.
You may also want to take a look for existing solutions for at least part of the problem, such as node-keyboard: https://github.com/Bornholm/node-keyboard
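Whether you do the low-level part behind a small native Node addon or parse the same bytes from Node directly, the structure is identical. Here is a minimal sketch of the read loop (the device path is a placeholder; pick the right /dev/input/eventN for your gpio-keys device as described above, and note you need read permission, typically root or the input group):

    // Minimal sketch: read key events straight from an evdev device node.
    #include <fcntl.h>
    #include <unistd.h>
    #include <linux/input.h>
    #include <cstdio>

    int main() {
        const char* path = "/dev/input/event0";  // placeholder device node
        int fd = open(path, O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct input_event ev;
        while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
            if (ev.type == EV_KEY) {
                // ev.value: 1 = press, 0 = release, 2 = autorepeat
                printf("key code %u %s\n", ev.code,
                       ev.value == 1 ? "down" : ev.value == 0 ? "up" : "repeat");
            }
        }
        close(fd);
        return 0;
    }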

Map stdin from second keyboard to a specific program / tty

I have a program (a bash script actually, console only) that scans or makes copies, etc., based on user input. It asks questions such as how many copies you would like to make, then scans the document and prints it to another printer. The program runs in a loop so it's always there when a user passes by, and using a keyboard or number pad you can easily operate it. It basically turns a simple scanner/printer combo into a complex multifunction device.
I can leave it running on a dedicated system just fine, but to save electricity and resources, I would love to have it run on a computer someone else is already using. There is a user who has a laptop on the same desk as the scanner, and I want her to be able to do her thing in Xorg as usual, but have this little program running on an external monitor. That part is easy, but separating input is not: first of all the window has to be in focus, and then any input from the laptop keyboard or the USB keyboard is sent to the program, obviously.
I can think of one way to do this: using VirtualBox, I can run a virtual machine without X, have it permanently SSH into the host OS (to which the USB scanner is connected), and have VirtualBox grab the USB keyboard input. But that seems excessive.
Does anyone know of a way to map input from a specific keyboard to a specific program or tty?

How can I write a virtual keyboard for the linux console?

I'd like some way to send keyboard input to a Linux machine without a keyboard. I'd like this input to be accepted in any Linux console, and in XBMC.
One idea I had would be to write some kind of virtual USB keyboard that would send the input that way, but maybe this is overcomplicating it.
Basically, what I'm looking for is Synergy without X (and with XBMC).
Anyone have suggestions on where to look?
You could write a user-space driver to inject events into the kernel, so you can SSH into your Linux box, run the driver, and inject the keypresses you type. It will be as if a physical keyboard is attached, and it will work with all programs. This gives good info on how to write user-space drivers. If you don't want to reinvent the wheel, the program TermKeyboard does exactly that.
Disclaimer: I wrote the program
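For reference, the kernel's uinput interface is the usual way such a user-space "driver" injects keys (this is my sketch of the general mechanism, not necessarily how TermKeyboard is implemented). It creates a virtual device under /dev/input, so the console, X and XBMC all receive the keys like they came from real hardware. It needs the uinput module, write access to /dev/uinput (typically root), and the UI_DEV_SETUP ioctl assumes a reasonably recent kernel.

    // Sketch of injecting a keystroke through /dev/uinput: the kernel creates a
    // virtual input device, so the 'a' below arrives exactly as if typed on a
    // physical keyboard.
    #include <fcntl.h>
    #include <unistd.h>
    #include <cstring>
    #include <sys/ioctl.h>
    #include <linux/uinput.h>

    static void emit(int fd, int type, int code, int value) {
        struct input_event ie {};
        ie.type = type;
        ie.code = code;
        ie.value = value;
        write(fd, &ie, sizeof(ie));
    }

    int main() {
        int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
        if (fd < 0) return 1;

        // Declare which events the virtual keyboard can produce.
        ioctl(fd, UI_SET_EVBIT, EV_KEY);
        ioctl(fd, UI_SET_KEYBIT, KEY_A);

        struct uinput_setup usetup {};
        usetup.id.bustype = BUS_USB;
        usetup.id.vendor  = 0x1234;   // arbitrary placeholder IDs
        usetup.id.product = 0x5678;
        std::strcpy(usetup.name, "virtual-console-keyboard");

        ioctl(fd, UI_DEV_SETUP, &usetup);
        ioctl(fd, UI_DEV_CREATE);
        sleep(1);  // give udev a moment to register the new device

        // Press and release 'a', each followed by a sync event.
        emit(fd, EV_KEY, KEY_A, 1);
        emit(fd, EV_SYN, SYN_REPORT, 0);
        emit(fd, EV_KEY, KEY_A, 0);
        emit(fd, EV_SYN, SYN_REPORT, 0);

        ioctl(fd, UI_DEV_DESTROY);
        close(fd);
        return 0;
    }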
There is a keyboard for fbcon on SourceForge. To compile it, you must add a string.h include:
http://sourceforge.net/projects/tabletvk/?source=dlp
