Is it possible to send SysRq commands with a barcode scanner? - linux

We have a kiosk application that runs Matchbox on top of Linux and has only a barcode scanner for input (no keyboard). It would be great to be able to print a barcode that, when scanned, sends commands like SysRq R etc., so that one could REISUB without having to disassemble the unit.
If there is not an existing way, could you patch the barcode driver to interpret a certain set of symbols and initiate the sequence?

Why do you need SysRq? Is the machine actually wedging itself or are you just trying to reboot cleanly? Why not just put a "reboot" command into whatever protocol you're using? What's wrong with simply doing a hard power cycle?
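If the kiosk software itself is still responsive when the scan happens, one option short of patching the driver is to have the application watch for a designated "magic" barcode string and then fire the SysRq sequence itself through /proc/sysrq-trigger. A minimal sketch of that handler (C++; the function name, the magic-barcode idea and the 2-second delay are assumptions, and writing to /proc/sysrq-trigger requires root and a live userspace, so this is a clean-ish REISUB rather than a substitute for the real key combo):

    #include <fcntl.h>
    #include <unistd.h>

    void emergency_reisub() {
        // r: unraw, e: SIGTERM all, i: SIGKILL all, s: sync, u: remount read-only, b: reboot
        const char sequence[] = "reisub";
        for (const char *cmd = sequence; *cmd; ++cmd) {
            int fd = open("/proc/sysrq-trigger", O_WRONLY);
            if (fd < 0) return;            // not root, or sysrq-trigger unavailable
            write(fd, cmd, 1);             // each single character triggers one SysRq action
            close(fd);
            sleep(2);                      // give each step time to finish
        }
    }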

Related

Emulate physical keyboard input on Windows? On-screen keyboard isn't enough.

I have an unusual exploratory test automation task which relies on detecting keyboard input (certain trigger words).
However, automating the task has proven very difficult, as the input apparently must pass through the keyboard driver. Thus, setting up UI automation (like Sikuli, or even Ghost Mouse) to click on the on-screen keyboard, or using Sikuli to 'type' or copy/paste text into any Windows UI, is not registered.
Any ideas of how to emulate physical keyboard input for the purpose of automation?
You could use an Arduino to emulate a physical keyboard (there are easy-to-use libraries available). Also, if you use one of the more capable Arduinos with multiple serial ports, you can tell it which keypresses to emulate through the serial port. The computer will think it is a keyboard like any other, except that you can control it with software.
Of course, this solution assumes you can connect something to the USB port of the test computer.
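As a rough illustration of the idea, here is a sketch for an ATmega32u4-class board such as a Leonardo or Micro (that board choice is an assumption: only boards with native USB can use the Keyboard library, and those boards expose the hardware UART as Serial1 alongside the USB port). It simply replays whatever arrives on the serial port as USB keystrokes:

    #include <Keyboard.h>

    void setup() {
      Serial1.begin(9600);   // control channel: the test software sends characters here
      Keyboard.begin();      // start acting as a USB HID keyboard
    }

    void loop() {
      // Replay anything received on the serial port as real USB keystrokes.
      if (Serial1.available() > 0) {
        char c = Serial1.read();
        Keyboard.write(c);   // press and release the key for this character
      }
    }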

Linux virtual keyboard and evdev

I'm writing some software for Linux that uses libevdev for input processing.
To my surprise, all the virtual on-screen keyboards I found simulate high-level X server events. So they're not recognized by udev, don't appear in the /dev/input folder, and aren't visible with evtest.
Is there any software keyboard that is low-level enough for that? Or maybe some trick to achieve it?
There is a good reason why it is done this way. The /dev/input devices represent hardware with some kind of physical (electrical, optical and/or mechanical) input. These are handled by Linux kernel drivers that generate EV_* events, and those events are then turned by the xf86-input-evdev driver into X11 input events understood by the server. Since you can generate X11 input directly from an X11 program, it is quite a lot of work to create a device driver that accepts input from an X11 app on one side and generates kernel input events on the other. So while not impossible, it is a lot of work for no gain to create a driver or two for this purpose.
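For reference, the kernel's uinput interface is the usual user-space route to a keyboard that does show up under /dev/input and is visible to udev, evtest and libevdev. A minimal sketch (assumptions: /dev/uinput is accessible to the process, the kernel is 4.5+ for UI_DEV_SETUP, and the vendor/product IDs and device name are made up):

    #include <linux/uinput.h>
    #include <sys/ioctl.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <string.h>

    static void emit(int fd, int type, int code, int value) {
        struct input_event ev = {};
        ev.type = type;
        ev.code = code;
        ev.value = value;
        write(fd, &ev, sizeof(ev));
    }

    int main() {
        int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
        if (fd < 0) return 1;

        ioctl(fd, UI_SET_EVBIT, EV_KEY);      // this device sends key events
        ioctl(fd, UI_SET_KEYBIT, KEY_A);      // register every key you plan to use

        struct uinput_setup usetup = {};
        usetup.id.bustype = BUS_USB;
        usetup.id.vendor  = 0x1234;           // arbitrary IDs for the fake device
        usetup.id.product = 0x5678;
        strcpy(usetup.name, "example-virtual-keyboard");

        ioctl(fd, UI_DEV_SETUP, &usetup);
        ioctl(fd, UI_DEV_CREATE);
        sleep(1);                             // give udev a moment to create the node

        emit(fd, EV_KEY, KEY_A, 1);           // press 'a'
        emit(fd, EV_SYN, SYN_REPORT, 0);
        emit(fd, EV_KEY, KEY_A, 0);           // release 'a'
        emit(fd, EV_SYN, SYN_REPORT, 0);

        sleep(1);
        ioctl(fd, UI_DEV_DESTROY);
        close(fd);
        return 0;
    }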

What is the tty subsystem for?

By now I have spent at least 10 hours trying to get my head around the famous blog post by Linus Akesson, and I'm still struggling. So let me ask my doubts about tty/ptty as a series of short questions.
1) Is the tty/ptty in user space or kernel space?
2) What is tty/ptty's connection to devices or drivers or some numbering or something?
3) The tty seems to be linked to something called the controlling terminal of a process. What is that relation, and is every process related to a terminal?
4) On the whole I still don't understand where the heck this terminal concept fits in. If a process wants to read something from stdin, can't it simply do so from the required device file? What exactly is the problem that the tty intends to take care of?
5) I read somewhere that there are attempts to move the tty from user space to kernel space. Is the tty simply a historical residue rather than a strong design feature?
A clarification (which might answer some of your questions):
I think you meant pty (and not ptty) which is pseudo-tty/pseudo-terminal.
A tty (/dev/ttyx), which stands for teletype, is the original kind of terminal (it used a line printer for output and a keyboard for input!). A terminal is basically just a user interface device that uses text for input and output.
A pty (/dev/pty/n) is a pseudo-terminal - it's a software implementation that appears to the attached program like a terminal, but instead of communicating directly with a "real" terminal, it transfers the input and output to another program. It's the end point of telnet/SSH or even the GNOME terminal.
For example, when you ssh into a remote machine and run ls, the ls output is sent to a pseudo-terminal, the other side of which is attached to the SSH daemon.
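To make the "transfers the input and output to another program" part concrete, here is a toy sketch (C++/POSIX, error handling omitted; treat it as an illustration, not as how sshd is actually written). The parent holds the master side of a pseudo-terminal, playing the role of sshd or a GUI terminal emulator, while the child runs a program that believes it is writing to a real terminal:

    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>

    int main() {
        int master = posix_openpt(O_RDWR);    // allocate a pty master/slave pair
        grantpt(master);
        unlockpt(master);

        if (fork() == 0) {
            setsid();                                      // new session...
            int slave = open(ptsname(master), O_RDWR);     // ...so the slave becomes the controlling terminal
            dup2(slave, 0);
            dup2(slave, 1);
            dup2(slave, 2);
            execlp("ls", "ls", "-l", (char *)NULL);        // the child thinks stdout is a terminal
            _exit(1);
        }

        // Parent: whatever the child "prints to its terminal" arrives here.
        char buf[256];
        ssize_t n;
        while ((n = read(master, buf, sizeof(buf))) > 0)
            fwrite(buf, 1, n, stdout);
        return 0;
    }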
EDIT:
As far as I know, the tty, and so the pty, are usermode, BUT they represent terminal drivers. What I mean is: the device file /dev/tty1 is the first virtual console. Most of the code lives in drivers/char, in the files tty_io.c, n_tty.c and vt.c (kernel source). In contrast to character devices, opening those files calls the tty_open routine, and trust me, it's way messier than opening a character device...
Tty/pty stands for the terminal drivers mentioned above, but they also stand for serial ports (the "numbering" you mentioned). I know very little about that, so I don't want to state incorrect data... but you can search the net for it (or someone else can continue from here).
EDIT2:
You have changed the question, so now it seems like I spoke out of context...
Anyway, the tty has many different roles even nowadays. The terminal driver is one of the ways user space and the kernel can "communicate"; there are several such mechanisms, such as terminal drivers, character devices, etc.
If you still have a question, please comment, and don't change the whole post...

Map stdin from second keyboard to a specific program / tty

I have a program (a bash script actually, console only) that scans or makes copies, etc., based on user input. It asks questions such as how many copies you would like to make, then scans the document and prints it to another printer. The program runs in a loop so it's always there when a user passes by, and with a keyboard or number pad you can easily operate it. It basically turns a simple scanner/printer combo into a complex multifunction device.
I can leave it running on a dedicated system just fine, but to save electricity and resources, I would love to have it run on a computer someone else is already using. There is a user who has a laptop on the same desk as the scanner, and I want her to be able to do her thing in Xorg as usual, but have this little program running on an external monitor. That part is easy, but separating input is not: first of all the window has to be in focus, and then any input from the laptop keyboard OR the USB keyboard is sent to the program, obviously.
I can think of one way to do this: using VirtualBox, I can run a virtual machine without X, have it permanently SSH into the host OS (to which the USB scanner is connected), and have VirtualBox grab the USB keyboard input. But that seems excessive.
Does anyone know of a way to map input from a specific keyboard to a specific program or tty?
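One possibility, offered only as a sketch: open the USB keyboard's event device directly and grab it exclusively with the EVIOCGRAB ioctl, so its keystrokes stop reaching X and the console and are seen only by a small helper that feeds your script. The by-id path below is a placeholder, and mapping key codes to answers is left out:

    #include <linux/input.h>
    #include <sys/ioctl.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>

    int main() {
        const char *dev = "/dev/input/by-id/usb-EXAMPLE_KEYBOARD-event-kbd";  // placeholder path
        int fd = open(dev, O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        ioctl(fd, EVIOCGRAB, 1);              // exclusive grab: other readers see nothing

        struct input_event ev;
        while (read(fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev)) {
            if (ev.type == EV_KEY && ev.value == 1)          // key-press events only
                printf("key code %d pressed\n", ev.code);    // translate codes into answers for the script
        }
        return 0;
    }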

How can I write a virtual keyboard for the linux console?

I'd like some way to send keyboard input to a Linux machine without a keyboard. I'd like this input to be accepted in any Linux console, and in XBMC.
One idea I had was to write some kind of virtual USB keyboard that would send the input that way, but maybe this is overcomplicating it.
Basically, what I'm looking for is Synergy without X (and with XBMC).
Anyone have suggestions on where to look?
You could write a user space driver to inject events into the kernel... so you can SSH into your Linux box, run the driver, and inject the keypresses you type. It will be as if a physical keyboard were attached, and it will work with all programs. This gives good info on how to write user space drivers... if you don't want to reinvent the wheel, the program TermKeyboard does exactly that.
Disclaimer: I wrote the program
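If only console programs need to see the input (the "any Linux console" part), another old trick is the TIOCSTI ioctl, which pushes characters into a tty's input queue as if they had been typed. This is a sketch under assumptions: it needs root, recent kernels can disable TIOCSTI (CONFIG_LEGACY_TIOCSTI), and XBMC would still want kernel-level key events via uinput as in the earlier sketch rather than tty input:

    #include <sys/ioctl.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        if (argc < 3) {
            fprintf(stderr, "usage: %s /dev/ttyN text\n", argv[0]);
            return 1;
        }

        int fd = open(argv[1], O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        for (const char *p = argv[2]; *p; ++p)
            ioctl(fd, TIOCSTI, p);            // inject one character at a time
        const char nl = '\n';
        ioctl(fd, TIOCSTI, &nl);              // finish with Enter
        close(fd);
        return 0;
    }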
There is a keyboard for fbcon on SourceForge; to compile it you must add a string.h include.
http://sourceforge.net/projects/tabletvk/?source=dlp
