I want to set up my embedded application as a HID device, with a separate process controlling the HID interface to allow dynamic connections to a PC. There seem to be many people out there who have done this, but what I would like to do is:
a) Understand how to configure my build (Freescale i.MX Linux using ltib) to include the USB APIs and headers (i.e. g_hid.h) in my build.
b) Where can I find an example application that does something like moving the mouse around the screen, to demonstrate the operation of the HID?
Thanx for your help!
http://lxr.linux.no/#linux+v3.3/Documentation/usb/gadget_hid.txt is an example of how to operate a mouse HID.
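To get a feel for what driving the gadget looks like from user space, here is a minimal sketch. It assumes the gadget was configured with a simple three-byte mouse report descriptor (buttons, dx, dy) and that it shows up as /dev/hidg0 - both depend on how you set the gadget up, so treat this as an outline rather than a drop-in program.

    /* Sketch: move the pointer to the right by writing relative-mouse
     * reports to the HID gadget's character device. */
    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/hidg0", O_RDWR);     /* gadget node name may differ */
        if (fd < 0)
            return 1;

        for (int i = 0; i < 50; i++) {
            char report[3] = { 0x00, 5, 0x00 };  /* buttons, dx, dy */
            if (write(fd, report, sizeof(report)) < 0)
                break;
            usleep(20000);                       /* ~20 ms between reports */
        }

        close(fd);
        return 0;
    }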
Outside of laptops, changing e.g. the brightness of monitors requires DDC/CI. This is best done in userspace, I believe. Loading the i2c-dev kernel module gives access to I2C buses under /dev/i2c-<number>. Unfortunately, monitors supporting DDC/CI are not the only things with I2C buses, and it is far from ideal to read/write on unrelated buses while trying to find out which connects to what.
It seems that I2C bus adapter drivers already categorize their buses: e.g. I2C_CLASS_DDC is exactly what I'm looking for.
Is there any way to see the adapter class of an i2c-dev device?
(Or equally good: any way to match the device I want to talk to for DDC/CI from X11 workspaces or similar?)
You can try to look at ddccontrol:
ddccontrol -p
This utility scans the I2C buses and somehow detects which ones are connected to monitors.
You can examine its source code and find a solution.
P.S.
This may not be an answer, but, sorry, I don't have enough "reputation" to write a comment.
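As an illustration of the general technique (not necessarily what ddccontrol itself does), one common trick is to probe each /dev/i2c-N for an EDID EEPROM at slave address 0x50 and check the returned header; buses that answer with a valid EDID header are very likely connected to a monitor. Rough sketch:

    /* Sketch: flag /dev/i2c-N buses that answer with an EDID header at
     * slave address 0x50 (the EDID EEPROM used by monitors). */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/i2c-dev.h>

    static int looks_like_monitor(const char *dev)
    {
        static const unsigned char edid_hdr[8] =
            { 0x00, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x00 };
        unsigned char buf[8];
        unsigned char offset = 0;
        int ok = 0;

        int fd = open(dev, O_RDWR);
        if (fd < 0)
            return 0;
        if (ioctl(fd, I2C_SLAVE, 0x50) == 0 &&      /* select the EDID EEPROM */
            write(fd, &offset, 1) == 1 &&           /* start reading at offset 0 */
            read(fd, buf, sizeof(buf)) == (ssize_t)sizeof(buf))
            ok = (memcmp(buf, edid_hdr, sizeof(edid_hdr)) == 0);
        close(fd);
        return ok;
    }

    int main(void)
    {
        char dev[32];
        for (int i = 0; i < 32; i++) {
            snprintf(dev, sizeof(dev), "/dev/i2c-%d", i);
            if (looks_like_monitor(dev))
                printf("%s answers with an EDID header\n", dev);
        }
        return 0;
    }

DDC/CI traffic then goes over the same bus (at a different slave address); the ddccontrol source shows the details, as suggested above.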
I'm trying to write a component that collects data about connected / attached devices.
My component should work on Linux as well as on Windows.
For the time being I succeeded doing that on Windows machines by querying the Win32_PnPEntity.
I'm looking for a way to programmatically collect data about all attached devices (i.e. USB devices, disks, Bluetooth, etc.) on Linux.
After searching the internet,
I didn't find any solution for getting all of this information.
As I said, on Windows I can query Win32_PnPEntity.
Is there a way to do the same in Linux?
(I'd rather not use utilities such as lshw, etc.)
Thanks,
Amit.
libusb offers examples/listdevs.c, and this code should run equally well on Windows and Linux.
Alternatively, you can simply poke around in /sys/bus/usb/devices. For example, entries like 1-2, 1-4 (a digit, a dash, and a digit) represent whole connected devices, and these directories contain manufacturer and product files describing the device.
I'd use the libusb approach for anything I wanted to distribute widely. If you're doing internally-owned code then the directory approach should work well. Changes to the interface should be few and far between.
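If you go the sysfs route, here is a rough sketch of the directory walk described above (error handling trimmed, attribute names as found on typical systems):

    /* Sketch: list whole USB devices by walking /sys/bus/usb/devices and
     * printing the manufacturer/product attributes where present. */
    #include <ctype.h>
    #include <dirent.h>
    #include <stdio.h>
    #include <string.h>

    static void print_attr(const char *devdir, const char *attr)
    {
        char path[512], value[256];
        FILE *f;

        snprintf(path, sizeof(path), "/sys/bus/usb/devices/%s/%s", devdir, attr);
        f = fopen(path, "r");
        if (!f)
            return;
        if (fgets(value, sizeof(value), f))
            printf("  %s: %s", attr, value);   /* value already ends in '\n' */
        fclose(f);
    }

    int main(void)
    {
        DIR *d = opendir("/sys/bus/usb/devices");
        struct dirent *e;

        if (!d)
            return 1;
        while ((e = readdir(d)) != NULL) {
            /* Whole devices look like "1-2" or "1-4.3"; skip interfaces
             * ("1-2:1.0"), root hubs ("usb1"), "." and "..". */
            if (!isdigit((unsigned char)e->d_name[0]) || strchr(e->d_name, ':'))
                continue;
            printf("%s\n", e->d_name);
            print_attr(e->d_name, "manufacturer");
            print_attr(e->d_name, "product");
        }
        closedir(d);
        return 0;
    }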
I'm writing some software for Linux that uses libevdev for input processing.
To my surprise, all the virtual on-screen keyboards I found simulate high-level X server events. So they're not recognized by udev, don't appear in the /dev/input folder, and aren't visible with evtest.
Is there any software keyboard that is low-level enough for that? Or maybe some trick for that?
There is a good reason why this is done this way. The /dev/input devices are devices that have some kind of physical (electrical, optical and/or mechanical) input. These are converted by the Linux kernel driver into something that generates EV events. These events are processed by the xf86-input-evdev driver into X11 input events, which are understood by the server. Since you can generate X11 input directly from an X11 program, it is quite a lot of work to create a device driver that accepts input from an X11 app on one side and generates kernel input events on the other. So while not impossible, it is a lot of work for no gain to create a driver or two for this purpose.
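That said, if what you need is an input source that udev, evtest and libevdev can see, the kernel's uinput interface lets an ordinary user-space program create a real /dev/input node and feed it key events (the same mechanism comes up for the virtual pen question further down). A minimal sketch, with an arbitrary device name and a single key, assuming the process is allowed to open /dev/uinput:

    /* Sketch: a user-space "keyboard" created through uinput; it appears
     * under /dev/input and types a single 'a'. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/uinput.h>

    static void emit(int fd, int type, int code, int value)
    {
        struct input_event ev;
        memset(&ev, 0, sizeof(ev));
        ev.type = type;
        ev.code = code;
        ev.value = value;
        write(fd, &ev, sizeof(ev));
    }

    int main(void)
    {
        int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
        if (fd < 0)
            return 1;

        ioctl(fd, UI_SET_EVBIT, EV_KEY);      /* we will send key events */
        ioctl(fd, UI_SET_KEYBIT, KEY_A);      /* ...for this one key */

        struct uinput_user_dev uidev;
        memset(&uidev, 0, sizeof(uidev));
        snprintf(uidev.name, UINPUT_MAX_NAME_SIZE, "example-soft-keyboard");
        uidev.id.bustype = BUS_VIRTUAL;
        write(fd, &uidev, sizeof(uidev));
        ioctl(fd, UI_DEV_CREATE);             /* node shows up in /dev/input */

        sleep(1);                             /* let listeners pick it up */
        emit(fd, EV_KEY, KEY_A, 1);           /* press */
        emit(fd, EV_SYN, SYN_REPORT, 0);
        emit(fd, EV_KEY, KEY_A, 0);           /* release */
        emit(fd, EV_SYN, SYN_REPORT, 0);

        sleep(1);
        ioctl(fd, UI_DEV_DESTROY);
        close(fd);
        return 0;
    }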
In most example projects for embedded systems there is a system file in which we can find structures for the different peripherals, as well as the memory mapping of the peripheral registers; in addition, there is a module per peripheral containing basic functions to manipulate the peripheral, like periph_enable, periph_write, periph_read. This is the architecture I have in mind when I tackle a new project.
I actually started to work with a BF609, but now with embedded Linux on it. My task consists of writing a communication driver that talks to another device via UART. As usual, I tried to look for the files I used to use, but in vain: I can't find the mapping of the different peripherals.
I started to read this book. I understand that the kernel sees each device as a file and that a driver is mainly the implementation of the open, close, read and write functions for that file, but I still don't understand how these functions communicate with the peripheral registers.
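For concreteness, the kind of bare-metal layout I mean looks roughly like this (the register names and base address below are invented purely to illustrate):

    /* A register block mapped at a fixed address plus thin helper functions;
     * everything here is made up for illustration. */
    #include <stdint.h>

    typedef struct {
        volatile uint32_t CTRL;     /* control register      */
        volatile uint32_t STATUS;   /* status register       */
        volatile uint32_t DATA;     /* transmit/receive data */
    } periph_regs_t;

    #define PERIPH0 ((periph_regs_t *)0x40001000u)   /* made-up base address */

    static inline void periph_enable(periph_regs_t *p)            { p->CTRL |= 1u; }
    static inline void periph_write(periph_regs_t *p, uint32_t v) { p->DATA = v; }
    static inline uint32_t periph_read(periph_regs_t *p)          { return p->DATA; }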
My questions:
1) How do device drivers recognise the mapping of the peripherals? Is there something I missed? Is there any example that explains how to implement simple read and write functions, via UART for example?
2) Where can I find the mapping of the peripherals in the Buildroot directory?
Thanks in advance
This is a fairly broad question, so I will try to keep it as focused as I can.
I currently own a Lenovo laptop with Ubuntu installed and touchscreen functionality, as well as a pressure-sensitive Bluetooth pen, and I've been trying to make the two work together as a cheap Cintiq-like tablet.
Unfortunately, the pen only has support for specific apps on iOS phones and tablets.
So after lots of research, I've managed to interface with the pen and create a uinput device for it, so I can register button clicks and pressure changes on the pen and even see them routed to GIMP when configuring the device through the Input Controllers menu.
The code I have so far for that interface is available here.
The trouble starts when trying to test it out with GIMP.
From what I gather, this is because GIMP assumes Wacom devices report their own position, treats touchscreen touches as mouse movements and only allows input from a single device at a time.
My question is, how can I work around this?
More specifically, how can I create a uinput device that would behave as a Wacom tablet and supersede/block the behavior I described?
Or is there a different solution, such as patching GIMP or writing a plugin for it?
Update (2014-06-07)
The code mentioned above now works.
I have written a blog post on the process of getting this to work: http://gerev.github.io/laptop-cintiq
As you said, Gimp expects you to provide ABS_X and ABS_Y along with ABS_PRESSURE in your driver - which is not strange, because you are using your virtual device as input, so it wouldn't make much sense to pick ABS_X and ABS_Y coordinates from one device and ABS_PRESSURE from another (although they will always be the same in this case). Maybe you can just read the current coordinates of the mouse and copy them as your own device coordinates.
As an example, the project GfxTablet does something similar to what you are trying: they have an Android application for tablets with a pen and use uinput to create a virtual device that works like a pressure-sensitive pen on Linux. I have used it and it worked like a charm in Gimp and MyPaint on my laptop, and I had no problem with having a mouse (or the touchpad) active at the same time as the uinput device (I think that Krita added support for generic pressure-sensitive devices recently). You can take a look at the source code of the driver here (surprisingly simple, to be fair).
Note that this is not faulty behavior on Gimp's part, because this is what is expected from a tablet-like device. Take a look at the event codes kernel documentation page, in the last section (Guidelines), where it is said that tablets must report ABS_X and ABS_Y. Moreover, they should use BTN_STYLUS and BTN_STYLUS2 to report the tool buttons and some BTN_TOOL_* (e.g. BTN_TOOL_PEN) to report activity (you can find all the available codes in input.h); however, these last ones do not seem that important, as GfxTablet does not implement them and still works without problems.
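To make the capability side concrete, here is a minimal sketch of a uinput device that declares ABS_X/ABS_Y/ABS_PRESSURE plus the stylus and tool buttons mentioned above, using the older uinput_user_dev interface. The axis ranges and device name are arbitrary examples, and the actual event emission (EV_ABS/EV_KEY batches terminated by EV_SYN) is only hinted at:

    /* Sketch: a virtual tablet-like device that advertises absolute X/Y,
     * pressure and the stylus buttons, so applications treat it as a tablet. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/uinput.h>

    int main(void)
    {
        int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
        if (fd < 0)
            return 1;

        ioctl(fd, UI_SET_EVBIT, EV_KEY);
        ioctl(fd, UI_SET_KEYBIT, BTN_TOOL_PEN);   /* "pen is in proximity" */
        ioctl(fd, UI_SET_KEYBIT, BTN_TOUCH);
        ioctl(fd, UI_SET_KEYBIT, BTN_STYLUS);     /* barrel buttons */
        ioctl(fd, UI_SET_KEYBIT, BTN_STYLUS2);

        ioctl(fd, UI_SET_EVBIT, EV_ABS);
        ioctl(fd, UI_SET_ABSBIT, ABS_X);
        ioctl(fd, UI_SET_ABSBIT, ABS_Y);
        ioctl(fd, UI_SET_ABSBIT, ABS_PRESSURE);

        struct uinput_user_dev uidev;
        memset(&uidev, 0, sizeof(uidev));
        snprintf(uidev.name, UINPUT_MAX_NAME_SIZE, "example-virtual-pen");
        uidev.id.bustype = BUS_VIRTUAL;
        uidev.absmin[ABS_X] = 0;        uidev.absmax[ABS_X] = 32767;
        uidev.absmin[ABS_Y] = 0;        uidev.absmax[ABS_Y] = 32767;
        uidev.absmin[ABS_PRESSURE] = 0; uidev.absmax[ABS_PRESSURE] = 2047;

        if (write(fd, &uidev, sizeof(uidev)) != (ssize_t)sizeof(uidev) ||
            ioctl(fd, UI_DEV_CREATE) < 0) {
            close(fd);
            return 1;
        }

        /* ... write struct input_event records here: BTN_TOOL_PEN/BTN_TOUCH,
         * ABS_X/ABS_Y/ABS_PRESSURE, each batch terminated by EV_SYN ... */

        pause();                      /* keep the device alive while testing */
        ioctl(fd, UI_DEV_DESTROY);
        close(fd);
        return 0;
    }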