I need to build a simple app that takes (click only) input from 4 USB mice connected in addition to the built-in touchpad on a notebook. My preferred operating system for this setup would be Linux.
Any idea how I might be able to discern in an application which mouse a click came from? I'm open to C programming or whatever it takes. It's a simple, one-off project, so nothing too elaborate though.
For what it's worth, I think I found an answer to my question.
bobince's mention of xorg led me to look in /etc/X11/xorg.conf. That turns out to be full of comments like
# commented out by update-manager, HAL is now used
I had heard of HAL before, and not just in 2001. I tried man -k hal and found lshal, which lists 133 (!) HAL devices in my PC. In the entry for one of my mice, I found
linux.sysfs_path = '/sys/devices/pci0000:00/0000:00:0b.0/usb2/2-7/2-7:1.0/input/input6/event6'
which turns out to be a directory in the file system. Exploring from there, I discovered a reference back to /dev/input/mouse3. In fact, all my mice were sitting there in /dev/input!
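As an aside, the same name-to-node mapping can be pulled from /proc/bus/input/devices without going through HAL at all. Here is a minimal C sketch, assuming only the standard procfs layout (the N: lines carry device names, the H: lines list handlers such as "mouse3 event6"):

    #include <stdio.h>

    /* Print every input device's name ('N:' lines) and its handler
     * nodes ('H:' lines), so each physical mouse can be matched to
     * its /dev/input entry. */
    int main(void)
    {
        FILE *f = fopen("/proc/bus/input/devices", "r");
        char line[512];

        if (!f) {
            perror("/proc/bus/input/devices");
            return 1;
        }
        while (fgets(line, sizeof line, f))
            if (line[0] == 'N' || line[0] == 'H' || line[0] == '\n')
                fputs(line, stdout);
        fclose(f);
        return 0;
    }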
Wearing my superuser cape, I was able to read /dev/input/mouse3:
root@carl-ubuntu:/dev/input# od -t x1 -w6 mouse3
0000000 09 00 00 08 00 00
*
so it turns out a left mouse click is 09 00 00 08 00 00, consistently and repeatably.
Conclusion: I can read mouse clicks from /dev/input/mouseX. Having done chmod a+r on those files, I can even read them from a normal user account. I need to figure out a way to stop 4 mice running wild in the hands of probably drunk people from actually interacting directly with the GUI, but that's a problem for another day.
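For reference, here is a minimal C sketch of a click reader built on that discovery. It assumes the default 3-byte PS/2 protocol that /dev/input/mouseX speaks, where bit 0 of the first byte is the left button (hence 09 00 00 on press and 08 00 00 on release); the device path is just an example.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        /* Device path is an example - pass another /dev/input/mouseX. */
        const char *path = argc > 1 ? argv[1] : "/dev/input/mouse3";
        unsigned char pkt[3];               /* one 3-byte PS/2 packet */
        int was_down = 0;
        int fd = open(path, O_RDONLY);

        if (fd < 0) {
            perror(path);
            return 1;
        }
        while (read(fd, pkt, sizeof pkt) == sizeof pkt) {
            int down = pkt[0] & 0x01;       /* bit 0 = left button    */
            if (down && !was_down)          /* 09 00 00 -> press      */
                printf("left click on %s\n", path);
            was_down = down;                /* 08 00 00 -> release    */
        }
        close(fd);
        return 0;
    }

Run one copy per /dev/input/mouseX (or select(2) across all four file descriptors) and the program knows exactly which mouse clicked.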
MPX is where it's at for multiple-mouse and multitouch under Linux, but you'll need to be using xorg xserver 1.7 to get the ‘proper’ version of it; this is generally taken as part of X11R7.5, which has only just come out as ‘stable’ and has not been integrated by the distros yet. (Even xorg-edgers doesn't have it, though that's the place to keep an eye on if you're an Ubuntu-er.)
GTK+ seems to have had some work put into allowing you to detect which mouse clicked (GdkEvent.gdk_event_get_device), but I don't know what the timetable is for getting this into a full stable release. Nor do I know how far along Qt4 is with it. So in summary the situation is: it works if you're willing to put the time into grabbing, compiling and fixing stuff, but it's not smooth with mainstream Linux, yet.
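To make the GTK+ route concrete, here's a hedged sketch of what that would look like, assuming the GdkDevice API lands as proposed (gdk_event_get_device() plus gdk_device_get_name()); treat it as pseudocode until your GTK+ actually ships those calls:

    #include <gtk/gtk.h>

    /* Print which pointer device a button press came from. The callback
     * signature matches GTK+'s "button-press-event" signal. */
    static gboolean on_click(GtkWidget *w, GdkEventButton *ev, gpointer data)
    {
        GdkDevice *dev = gdk_event_get_device((GdkEvent *) ev);
        g_print("click from device: %s\n",
                dev ? gdk_device_get_name(dev) : "(unknown)");
        return FALSE;   /* let the event propagate */
    }

    int main(int argc, char **argv)
    {
        gtk_init(&argc, &argv);
        GtkWidget *win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        gtk_widget_add_events(win, GDK_BUTTON_PRESS_MASK);
        g_signal_connect(win, "button-press-event", G_CALLBACK(on_click), NULL);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }

Note that under MPX the device returned may be the virtual ‘master’ pointer; gdk_event_get_source_device(), where available, names the physical slave device instead.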
I'm not too sure where to start on this, but it sounds a lot to me like it'd be similar to getting multi-touch to work. Maybe start by looking for multi-touch drivers for Linux?
Also, luvieere's first link might be helpful.
Scenario:
I have a USB RFID reader
attaching it to the notebook, it works as a newly attached USB keyboard, i.e. without needing to install any drivers
when I touch the reader with an RFID tag
it enters the RFID number (like 0009339384\n) into my current window (for example a terminal/shell) - i.e. it even sends the \n
so it works exactly as if I had typed the numbers on my notebook's keyboard
The questions are:
is it possible to read the RFID reader directly, without any kernel-level drivers - e.g. something like cat /dev/keyboard1 ...
in other words, how can I determine from which "keyboard" the characters are coming?
I'm using OS X, but it would be nice to know the solution for Linux too.
Moreover, I want to attach two readers - so I definitely need to differentiate clearly between the two readers. And I want to use the RFID reader from a bash (or Perl) script, so I'm definitely looking for a solution that doesn't involve compiling "drivers"... Is it possible?
OS X identifies it as:
SYC ID&IC USB Reader:
Product ID: 0x0035
Vendor ID: 0xffff
Version: 1.00
Serial Number: 08FF20140315
Speed: Up to 1.5 Mb/sec
Manufacturer: Sycreader RFID Technology Co., Ltd
Location ID: 0x14100000 / 18
Current Available (mA): 500
Current Required (mA): 100
Extra Operating Current (mA): 0
EDIT: Okay, it looks like in Linux it can be done - I just found
this: https://unix.stackexchange.com/questions/72483/how-to-distinguish-input-from-different-keyboards
and also Accessing multiple keyboards input by C++ (or python) in linux.
For OS X there is an exact duplicate on Unix & Linux: https://unix.stackexchange.com/questions/228413/route-keyboard-through-only-dev-ttys000-on-mac-os-x - unfortunately, closed without any answer :(
OK, so - this is easily solvable in Linux. As the edits in the question show, there are already many similar questions like this one.
The solution is: reading the particular /dev/input/eventN device(s).
In my case, I'm using the Linux::Input Perl module. It works perfectly.
There's little point adding the Perl code here - the package comes with evtest.pl, so anyone can easily check how it works; a minimal C equivalent is sketched below for reference.
I still need to solve one issue: even when reading the device and nicely getting all the events from the RFID reader (4 events for one number), the RFID code is still inserted into the active window, as if it came from a keyboard. (That will be another question.)
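This is roughly what evtest.pl does under the hood - it reads struct input_event records straight from the event node. The path is an example, and the commented-out EVIOCGRAB line is an untested guess at the "still typed into the active window" problem: it asks the kernel for exclusive access, so the events stop reaching X.

    #include <fcntl.h>
    #include <linux/input.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        /* Event node is an example - find the reader's with evtest.pl. */
        const char *path = argc > 1 ? argv[1] : "/dev/input/event6";
        struct input_event ev;
        int fd = open(path, O_RDONLY);

        if (fd < 0) {
            perror(path);
            return 1;
        }
        /* ioctl(fd, EVIOCGRAB, 1);  <- untested guess: grabs the device
         * exclusively, so keystrokes stop reaching the active window */
        while (read(fd, &ev, sizeof ev) == sizeof ev)
            if (ev.type == EV_KEY && ev.value == 1)     /* key press */
                printf("%s: key code %u\n", path, ev.code);
        close(fd);
        return 0;
    }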
For OS X I don't have an easy solution yet, but for now I'm focusing on the Linux variant. :)
There is one thing that might help you solve this problem without resorting to programming in C: it is called multiseat. I haven't done it myself; I just know it exists. In general, it is a way for multiple people to work on the same computer at the same time, each using their own keyboard, mouse and monitor. It is not exactly what you are looking for, but it might be a possible way.
I have recently moved to Linux full time, and am enjoying the learning curve. However, one particular thing has me stumped big time: some of the Fn key combinations on my laptop are not working, specifically Volume up/down, Mute, etc. Combinations that are working include WLAN, Sleep, Video cycle, numeric pad, etc. I can rule out a hardware fault, since the keys worked perfectly fine on Windows 7 (although only when the laptop maker's hotkey software was installed).
I have scoured the net for possible explanations, and have come across the concepts of scancode (HW dependent), keycode and keysym. I think I understand the basics, and have found that console and X have their own mappings, and need to be remapped separately. The console uses the kernel mapping of scancodes to keycodes, but X for some reason has its own mapping. For my part, I have tried:
Set the boot parameter atkbd.softraw=0
Switched to console mode by Ctrl + Alt + F1
Used showkey --scancodes. Unfortunately, the keys that I am trying to get working do not show any scancode output.
Used dmesg to see if any "Unknown key pressed" events have occurred, but found none.
In my desperation, tried acpi_listen to see if the keys were firing ACPI events instead; only the sleep and video cycle keys do, the others output nothing.
At this point, I thought maybe I should try getting scancodes from the X environment itself, using xev, but no luck.
I have come here as a last resort only. I hope somebody has a good explanation as to why some of the function key combinations are not generating any output in the tools I have tried above. If it helps, I am using Linux Mint 17.3 Cinnamon, and the laptop is made by HCL. evtest shows the keyboard device to be AT Translated Set 2 keyboard. If more info is needed, I would be happy to oblige. Thanks.
EDIT: No relevant BIOS setting is available.
Confession: All my knowledge on this is based on what I have been reading up on Arch wiki, Ubuntu wiki, a whole lot of forum posts and other websites. So, if I am technically wrong about something, please bear with me, and correct me. I love learning this stuff :)
Yes, some keys on USB keyboards might not generate a scan code sent via the USB HID keyboard protocol but instead use a different USB protocol to communicate some user input. From what you've described, that's most likely what's happening here. You may be able to use programs from the evemu-tools package (that's the Debian name) or the older evtest program to find out more about what your particular device is doing for the keys that appear not to be sending keyboard scan codes.
(It also seems, from reading The Unix & Linux SE question "How to get all my keys to send keycodes" that there's something going on with keycodes above 255, but I'm not clear on what's going on there.)
There's also an error in your understanding of the layering:
The console uses the kernel mapping of scancodes to keycodes, but X for some reason has its own mapping.
This is not quite correct. The kernel maps scan codes to keycodes between 1 and 255; you can see this mapping with getkeycodes(8) and change it with setkeycodes(8) or udev. (The Arch Wiki page Map scancodes to keycodes has many details on this.) Not all scan codes have a mapping to a keycode; a scan code arriving with no translation entry is what would have produced the messages you looked for in dmesg, had there been any.
Only after the scan code is converted to a kernel keycode do the console and X11 have access to these; each has its own mechanism to translate keycodes to actions.
Note that the console program showkey -s does not show actual scan codes that have been received; it reads keycodes (as shown by showkey -k) and translates those back into scan codes using the kernel table shown by getkeycodes(8).
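If you want to poke at that kernel table programmatically rather than through getkeycodes(8), a small sketch using the EVIOCGKEYCODE ioctl follows; the event node is an assumption (find yours with evtest), and you'll need read permission on it:

    #include <fcntl.h>
    #include <linux/input.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        /* Event node is an assumption - find the keyboard's with evtest. */
        const char *path = argc > 1 ? argv[1] : "/dev/input/event0";
        unsigned int map[2];  /* map[0] = scan code in, map[1] = keycode out */
        int fd = open(path, O_RDONLY);

        if (fd < 0) {
            perror(path);
            return 1;
        }
        /* Walk the low scan codes and print the translation entries the
         * kernel currently has - the same table setkeycodes(8) edits. */
        for (map[0] = 0; map[0] < 128; map[0]++) {
            map[1] = 0;
            if (ioctl(fd, EVIOCGKEYCODE, map) == 0 && map[1] != 0)
                printf("scan code 0x%02x -> keycode %u\n", map[0], map[1]);
        }
        close(fd);
        return 0;
    }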
It might depend upon the X11 window manager. You should try using xev(1) to understand what is going on.
Maybe using some other desktop suite, like Xfce, LXDE, GNOME, KDE or IceWM, might help.
Maybe explicitly configuring your keyboard (e.g. in /etc/X11/xorg.conf...) might help.
I'm running an up-to-date Gentoo on my Sager NP8298 (Clevo P177SM-A), and I am heartbreakingly close to having all of my hardware running beautifully. I found a nice open source driver to run my keyboard backlight at this GitHub repo, but it was made for a Clevo chassis that doesn't have the touchpad light that mine does. Kinda tacky, I know, but the problem is that the default color for the touchpad light is blue, which can be kind of distracting when the keyboard is set to a different color.
I'd at least like to be able to turn the light off, if not control its color. I have a Windows install and am able to access the proprietary driver that came with the computer. I just don't quite know where to start on modifying this driver; if there were some Windows utilities I could use to see what the driver is doing and how to access the LED programmatically, it would be a huge help. Any ideas?
Other functionality that I'd like to add is Fn+Num pad 7 through 9 for toggling the left, center, and right part of the keyboard individually, and Fn+5 for a num pad light toggle, as the Windows driver does. I just need to know what signals need to be sent to the hardware and how to send them.
Whatever I end up with I'll be sure to fork the project and share the results with other users of this hardware.
You need the source code of the driver you want to change. With that and all the required bits and bobs (a.k.a. dependencies), you can change it to do whatever you want.
That said, there are quite a few things to consider. You need to know, at least at a reasonable level, the language used to build the driver and its platform dependencies, if any.
I've done similar work on some network drivers about 15 years ago, and no, it's not a fun job.
I'm looking into making a project with the Kinect to allow my Grandma to control her TV without being daunted by using the remote. So, I've been looking into basic gesture recognition. The aim will be to say turn the volume of the TV up by sending the right IR code to the TV when the program detects that the right hand is being "waved."
The problem is, no matter where I look, I can't seem to find a Linux-based tutorial which shows how to do something as a result of a gesture. One other thing to note is that I don't need any GUI apart from the debug window, as a GUI would slow my program down a fair bit.
Does anybody know of something that would let me check for a hand gesture in a loop and, when one is detected, control something - without any GUI at all, and on Linux? :/
I'm happy to go for any language but my experience revolves around Python and C.
Any help will be accepted with great appreciation.
Thanks in advance
Matt
In principle, this concept is great, but the number of features a remote offers is going to be hard to replicate with a set of gestures that an older person can memorize. They will probably be even less incentivized to do this (learning new things sucks) if they already have a solution (the remote), even though they really love you. I'm just warning you.
I recommend you use OpenNI and NITE. Note that the current version of OpenNI (2) does not have Kinect support. You need to use OpenNI 1.5.4 and look for the SensorKinect093 driver. There should be some gesture code that works for that (googling OpenNI Gesture yields a ton of results). If you're using something that expects OpenNI 2, be warned that you may have to write some glue code.
The basic control set would be Volume +/-, Channel +/-, Power on/off. But that will be frustrating if she wants to go from Channel 03 to 50.
I don't know how low-level you want to go, but a really, REALLY simple gesture recognizer could look at horizontal and vertical swipes of the right hand exceeding an (averaged) velocity threshold. Be warned: detected skeletons can get really wonky when people are sitting (that's actually a bit of what my PhD is on).
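To make that concrete, here is a toy C sketch of the velocity-threshold idea, decoupled from any Kinect library: feed it the tracked right hand's x coordinate each frame and it flags a horizontal swipe. The frame rate, window size and threshold are made-up numbers to tune against real skeleton data.

    #include <stdio.h>

    #define WINDOW 5        /* frames to average over (made up)   */
    #define FPS 30.0        /* Kinect skeleton frame rate         */
    #define THRESHOLD 1.2   /* swipe velocity in m/s (made up)    */

    /* Feed the right hand's x each frame; returns +1 for a right
     * swipe, -1 for a left swipe, 0 otherwise. */
    static int detect_swipe(double x)
    {
        static double hist[WINDOW];
        static int count = 0;
        int slot = count % WINDOW;
        double oldest = hist[slot];    /* sample from WINDOW frames ago */

        hist[slot] = x;
        count++;
        if (count <= WINDOW)
            return 0;                  /* not enough history yet */
        double v = (x - oldest) * FPS / WINDOW;  /* averaged velocity */
        if (v >  THRESHOLD) return  1;
        if (v < -THRESHOLD) return -1;
        return 0;
    }

    int main(void)
    {
        /* Synthetic track: hold still, then sweep the hand right. */
        double xs[] = { 0.00, 0.00, 0.00, 0.00, 0.00, 0.00,
                        0.05, 0.12, 0.20, 0.29, 0.39 };
        int i, n = (int)(sizeof xs / sizeof xs[0]);

        for (i = 0; i < n; i++)
            if (detect_swipe(xs[i]) > 0)
                printf("frame %d: right swipe\n", i);
        return 0;
    }

A real version would add a cooldown after each detected swipe so a single sweep of the arm doesn't fire several times.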
I have made a simple, mouse-controlled taskbar using a shell script. It's working very well, and uses rxvt-unicode to make the "graphics".
Unfortunately, when I moved this script from my netbook to my laptop, changed the size of the terminal window and updated the code, I discovered that my mouse reporting stopped working beyond column 95 (it always returns ! no matter where it is clicked beyond 95).
I discovered that there is a "limit" with mouse reporting, at column 95. My program now requires 123 columns, where before it was happening to fit into under 95.
I looked up the problem, and only found one reference to the 95 column limit. Most of what I found actually referred to a 223 column limit. If I had a 223 limit, I'd be utterly fine, but I do not understand how to get it switched over.
Basically, I do not understand enough of the problem to apply what I'm reading on google. Usually I can do my own fishing, but this problem got me.
I'm using this guide to tell me what escape sequence to use (I picked X10 click-only reporting, i.e. escape sequence \033[?9h).
how to get MouseMove and MouseClick in bash?
I found this, which mentioned a 95 column limit, but I could make little sense of it:
Emacs, unicode, xterm mouse escape sequences, and wide terminals
I am using small code snippets, more or less based on this:
http://www.imbe.net/bizen?Linux/Mouse_In_A_Terminal
I found others that mentioned not 223 but rather 255. My code seemed unaffected by this change.
I have solved my problem. What I never understood was that several of the mouse reporting settings can be active at the same time. When \033[?1015h and \033[?9h were combined, I started to get mouse reporting beyond the 95 mark. The 1015 mode is suitable for urxvt; I believe 1005h is the one used in xterm.
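For anyone landing here with the same problem, this is a minimal C sketch of the combination (C only for concreteness; the same escape sequences can be emitted from a shell script with printf). It assumes urxvt's 1015 report format, ESC [ b ; x ; y M with plain decimal parameters, which is what removes the byte-sized column ceiling:

    #include <stdio.h>
    #include <stdlib.h>
    #include <termios.h>
    #include <unistd.h>

    static struct termios saved;

    static void restore(void)
    {
        printf("\033[?9l\033[?1015l");     /* mouse reporting off */
        fflush(stdout);
        tcsetattr(STDIN_FILENO, TCSANOW, &saved);
    }

    int main(void)
    {
        struct termios raw;
        int b, x, y;

        tcgetattr(STDIN_FILENO, &saved);
        atexit(restore);
        raw = saved;
        raw.c_lflag &= ~(ICANON | ECHO);   /* read reports byte by byte */
        tcsetattr(STDIN_FILENO, TCSANOW, &raw);

        printf("\033[?1015h\033[?9h");     /* decimal coords + X10 clicks */
        fflush(stdout);

        /* Each click arrives as ESC [ b ; x ; y M, all in decimal,
         * so columns past 95 (or 223) decode fine. */
        while (scanf("\033[%d;%d;%dM", &b, &x, &y) == 3)
            fprintf(stderr, "button %d at column %d, row %d\n", b - 32, x, y);
        return 0;
    }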