I am working on an open source project, adding something that I need. It uses the XInput system to get input from the keyboard, and I cannot change that. However, for key presses I need the hardware scancodes, like those listed here. Is there a way to go from an XKeyPressedEvent to the KEY_* scan codes from input-event-codes.h, so I can know which key was pressed in a layout-independent way?
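Roughly, what I'm hoping for is something like the sketch below; the "- 8" evdev offset is just my guess at how the mapping might work, and whether that is actually reliable is really what I'm asking:

    /* Hypothetical sketch: map an X11 keycode back to a KEY_* code,
     * assuming the server uses the evdev/libinput mapping where
     * X keycode = kernel keycode + 8. */
    #include <X11/Xlib.h>
    #include <linux/input-event-codes.h>
    #include <stdio.h>

    static void handle_key_press(XKeyPressedEvent *ev)
    {
        unsigned int kernel_code = ev->keycode - 8;  /* assumed evdev offset */
        if (kernel_code == KEY_A)
            printf("hardware 'A' key pressed, regardless of layout\n");
    }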
In Linux (most distros?) there is a Keyboard Shortcuts settings app that lets you set key combinations for actions like minimizing a window, switching to the next workspace/desktop, or moving to the next media track. I want to connect to the event that is fired when the defined shortcut is pressed, rather than to the key combination itself, since the combination can change but the event is always the same event. Specifically, I want to capture the next/previous workspace/desktop events: not the workspace/desktop actually changing, but the keyboard request to make the change.
However, I cannot figure out where these events are surfaced. Does anyone know where I might connect to them? I would think GTK+ has some way to surface them, but even if I have to go down to X11/Xlib, I'm OK with that.
Ultimately I will code this in Python, but for now I'm just looking for a way to capture these events.
I've got a bit of an interesting problem here. There are plenty of threads I've found where people are working to hide or get rid of a cursor on an embedded Qt GUI...but I'm trying to get a cursor to show up on an embedded Qt GUI.
I inherited a project that was 'finished' some time ago, and the person who did the most work on the project has moved on. Fast forward to today and there is a need to add a cursor to this functional touchscreen GUI. The system OS is Yocto Linux, and it is running a Qt 5.4 application on a framebuffer.
I've scoured the Qt code and there is nothing there that would hide a cursor. I've added the appropriate QT_QPA_FB_HIDECURSOR=0 environment variable to my Qt startup script. I've experimented with adding a QCursor object to the GUI. Unfortunately none of these things are working. Using the QCursor I am sometimes able to get a cursor up on the screen, but it isn't tied to the touch input (the cursor shows up at the position I programmatically move it to, but it stays there when I interact with the GUI).
My touch input events are tied into Qt (via QT_QPA_GENERIC_PLUGINS=evdevtouch and QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS=/dev/input/event9:rotate=180), but for some reason that touch input cannot be tied to a cursor.
At this point I've spent a few days messing around with environment variables and startup script modifications, but nothing I've done has got the result I'm looking for.
Does anybody out there have some ideas on where to look for solutions to this problem?
Thanks!
Ian
So, now, three months later, I think my team and I have come up with a passable solution to this problem.
The path towards the solution started with the Qt Documentation on "Using libinput". The documentation boils down to a few important statements:
Parameters like the device node name can be set in the environment variables QT_QPA_EVDEV_MOUSE_PARAMETERS, QT_QPA_EVDEV_KEYBOARD_PARAMETERS and QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS
The mouse cursor shows up whenever QT_QPA_EGLFS_HIDECURSOR (for eglfs) or QT_QPA_FB_HIDECURSOR (for linuxfb) is not set and Qt's libudev-based device discovery reports that at least one mouse is available. When libudev support is not present, the mouse cursor always show up unless explicitly disabled via the environment variable.
The evdevtablet plugin provides basic support for Wacom and similar, pen-based tablets. It generates QTabletEvent events only. To enable it, pass QT_QPA_GENERIC_PLUGINS=evdevtablet in the environment or, alternatively, pass -plugin evdevtablet argument on the command-line. The plugin can take a device node parameter, for example QT_QPA_GENERIC_PLUGINS=evdevtablet:/dev/event1, in case the Qt's automatic device discovery (based either on libudev or a walkthrough of /dev/input/event*) is not functional or misbehaving.
So, in my system I have the device nodes: event0, event1, event2, event3, event4, event5, mice, and mouse0. Because I'm trying to get the mouse working, I made the assumption that I'd have to use the mouse0 node. This led to me setting these environment variables:
QT_QPA_GENERIC_PLUGINS=evdevmouse
QT_QPA_EVDEV_MOUSE_PARAMETERS=/dev/input/mouse0
Much to my frustration these environment variables led to nothing. After some time my team and I figured out how to get debug output from Qt source on our system:
Modifying source code in the qtbase directory under our Yocto build (roughly /yocto/poky/build/tmp/work/temp build directory/qtbase)
Copying qtbase/plugins/generic/libqevdevmouseplugin.so to my hardware (roughly /usr/lib/qt5/plugins/generic)
Running Qt from the command line
We quickly discovered that the input events coming from mouse0 and mice were basically garbage data. On our system we did set up EVDEV in the kernel, so the mouse input was also tied to the device node event0. When we tried setting the Qt mouse parameter to event0 we started to see debug output that looked like real data.
QT_QPA_GENERIC_PLUGINS=evdevmouse
QT_QPA_EVDEV_MOUSE_PARAMETERS=/dev/input/event0
However, the problem of no-mouse-pointer still remained. After a while we looked back at the Qt Documentation, specifically at the 2nd paragraph listed above. As a last ditch attempt we tried adding in the QT_QPA_FB_HIDECURSOR environment variable...
QT_QPA_GENERIC_PLUGINS=evdevmouse
QT_QPA_EVDEV_MOUSE_PARAMETERS=/dev/input/event0
QT_QPA_FB_HIDECURSOR=0
And...voila! After countless hours of debugging and reading documentation, we finally got a mouse pointer.
I think the main crux of our issue was misinterpreting the Qt Documentation.
The mouse cursor shows up whenever ... QT_QPA_FB_HIDECURSOR (for linuxfb) is not set
By "not set", Qt means explicitly defined as FALSE...not simply "not set" at all.
This solution will work for us, but it does leave at least one thing to be desired. Along the way I stumbled across this answer on Unix StackExchange, which points to the kernel documentation in input/input.txt. In section "3.2.2 mousedev" you can see the passage:
Each 'mouse' device is assigned to a single mouse or digitizer, except
the last one - 'mice'. This single character device is shared by all
mice and digitizers, and even if none are connected, the device is
present. This is useful for hotplugging USB mice, so that programs
can open the device even when no mice are present.
What this means for us is that while we can use event0 (which goes away when we unplug the mouse) for our mouse input event handling, we won't be able to support hot plugging without making some kernel/Qt-source modifications or figuring out how to get mice working as a Qt mouse input parameter.
So, the question of "why does event0 work and not mouse0/mice" still stands...but for now we've got a solution we can live with.
UPDATE: Now a little bit later we've figured out that udev was not working properly on our system. We added udev to the RDEPENDS in our package group for the Yocto build, and now we can set
QT_QPA_GENERIC_PLUGINS=evdevmouse
and we get a working mouse pointer with hotplug support.
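For reference, the Yocto side of that change was just adding udev to the runtime dependencies of our packagegroup recipe, along these lines (the exact recipe and variable naming depend on your build, so treat it as illustrative):

    RDEPENDS_${PN} += "udev"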
I don't know if this applies to your problem (I don't use Qt), but there is a HAVE_TOUCHSCREEN=1 variable in the machconfig file. It is normally located in your BSP layer, in a recipes-bsp/formfactor/formfactor directory.
Setting this to 1 makes the cursor invisible.
Try setting it to 0.
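For example (the layer and machine directory names here are placeholders for whatever your BSP uses):

    meta-yourbsp/recipes-bsp/formfactor/formfactor/yourmachine/machconfig:
    HAVE_TOUCHSCREEN=0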
I would like to create some windows on a Linux desktop for simple layout purposes. I need to avoid user input going to these windows (and I suppose preventing the windows from gaining focus should suffice for that to happen).
I think that I can do this with the xprop command, by setting the WM_HINTS property, but I haven't found specific documentation on how to do it.
By the way, for an mplayer window, I can do this by using the option -input nodefault-bindings:conf=/dev/null. I simply need a general solution which I can enforce at a low level on any application's window.
Thanks!
A window indicates whether it wants to receive keyboard input by setting the KeyPress and KeyRelease bits in its event mask. If you do not want your window to receive keyboard input, simply do not set those events in CreateWindow()'s event mask. See http://www.x.org/releases/X11R7.7/doc/xproto/x11protocol.html#requests:ChangeWindowAttributes for more information.
Additionally, you should also set the input focus hints for your window to "NoFocus", as described in section 4.1.7 of ICCCM: http://tronche.com/gui/x/icccm/sec-4.html#s-4.1.7
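Putting those two pieces together, a minimal and untested Xlib sketch might look like this (the geometry, background color, and the remaining contents of the event mask are just placeholders):

    #include <X11/Xlib.h>
    #include <X11/Xutil.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        int scr = DefaultScreen(dpy);
        XSetWindowAttributes attrs;
        /* Deliberately leave KeyPressMask/KeyReleaseMask out of the mask,
         * so the window never asks for keyboard events. */
        attrs.event_mask = ExposureMask | StructureNotifyMask;
        attrs.background_pixel = BlackPixel(dpy, scr);

        Window win = XCreateWindow(dpy, RootWindow(dpy, scr),
                                   0, 0, 400, 300, 0,
                                   CopyFromParent, InputOutput, CopyFromParent,
                                   CWEventMask | CWBackPixel, &attrs);

        /* ICCCM "No Input" focus model: tell the WM not to give us focus. */
        XWMHints hints;
        hints.flags = InputHint;
        hints.input = False;
        XSetWMHints(dpy, win, &hints);

        XMapWindow(dpy, win);
        XFlush(dpy);
        /* ... run your event loop here ... */
        return 0;
    }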
If you want to fiddle with other applications' windows, you should be able to change their attributes and hints, although this may result in undesirable behavior and/or side effects.
I've been looking for the answer for quite some time now. This is a project I have, but I can't manage to find a way to do it. The main idea would be to plug an additional keyboard into my computer that writes multiple characters when I hit a single key. For example, instead of writing a when I hit the a key, it would write \textbf{.
I already managed to find the keyboard layout file under Linux and to swap the a and b keys, but I cannot find a way to output multiple characters.
I know there are editors (like Texmaker or Kile) that have auto-completion, but most of the time I'm working on group projects, so we use writelatex.com, which does not offer auto-completion in its free plan! Besides, I'm doing this out of personal interest.
Thanks a lot.
Have a look at autokey. It can assign phrases to hotkeys. It requires X11.
Another option might be to use a powerful text editor like vim or emacs which both have features like this, and then copy/paste the text into writelatex.com.
Some browsers have add-ons that allow you to edit the contents of a text field on a web page with a chosen text editor.
Edit: In Xorg you can use the X Keyboard extension to, for example, change the meaning of individual keys. While you can configure the keyboard to generate (multibyte) Unicode characters, you cannot assign arbitrary character strings to one key, to the best of my knowledge.
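For example, with xmodmap (which edits the same core keyboard mapping) you can point a key at a single Unicode keysym, but only at one keysym, not a string; here keycode 38, which is usually the a key, is remapped to é:

    xmodmap -e "keycode 38 = U00E9"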
To scratch a personal itch, I'm writing something like a cross between a character map and an on-screen keyboard. When the user selects a character, I'd like to insert it into another application, specifically, the application that would next receive focus if my application were closed. Is there any way to do this? Right now, I work around it by just putting the character into the clipboard and terminating, leaving the user to hit paste in the other application, but usage would be far more streamlined if I could just insert the text programmatically.
I'm doing this in GTK and expect to run it only on Linux. But cross-platform solutions are also appreciated, and if GTK can't do it but some other toolkit can, I'll gladly switch.
This sounds like you should use libwnck, which is a GTK-related library that lets you manipulate windows on the desktop. The documentation is a little sparse, but the function wnck_screen_get_previously_active_window() seems promising.
From a WnckWindow you can get an X window ID, and perhaps from there you can use the X libraries to send a paste message (or even send it a "Ctrl-V" keypress event), perhaps with XSendEvent().
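A rough, untested sketch of that idea in C (I've used the XTEST extension, XTestFakeKeyEvent(), rather than XSendEvent(), since many applications ignore synthetic XSendEvent key presses; the function name and the Ctrl-V assumption are just illustrative):

    /* Build roughly with: pkg-config --cflags --libs libwnck-3.0 x11 xtst */
    #define WNCK_I_KNOW_THIS_IS_UNSTABLE
    #include <libwnck/libwnck.h>
    #include <X11/Xlib.h>
    #include <X11/keysym.h>
    #include <X11/extensions/XTest.h>

    static void paste_into_previous_window(Display *dpy, guint32 timestamp)
    {
        WnckScreen *screen = wnck_screen_get_default();
        wnck_screen_force_update(screen);       /* populate the window list */

        WnckWindow *prev = wnck_screen_get_previously_active_window(screen);
        if (prev == NULL)
            return;

        /* Hand focus back to that window (timestamp should come from the
         * GTK event that triggered the insertion), then fake Ctrl-V. */
        wnck_window_activate(prev, timestamp);

        KeyCode ctrl = XKeysymToKeycode(dpy, XK_Control_L);
        KeyCode v    = XKeysymToKeycode(dpy, XK_v);
        XTestFakeKeyEvent(dpy, ctrl, True,  0);
        XTestFakeKeyEvent(dpy, v,    True,  0);
        XTestFakeKeyEvent(dpy, v,    False, 0);
        XTestFakeKeyEvent(dpy, ctrl, False, 0);
        XFlush(dpy);
    }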
Very good question, by the way. I wish I could answer it more knowledgeably.