Linux drawing to display without desktop manager

Let's say I have a Linux system (Ubuntu; specifically the NVIDIA-packaged one that ships with Jetson boards) with an HDMI output and a DP output.
The goal: display the desktop via HDMI, and display a GUI that I made with OpenGL via DP, without DP ever showing the desktop. If the process running the GUI exits, the display connected via DP should show nothing until the process is restarted.
So far I have researched the X server, but most of the materials just cover the X11 protocol and how to create a window. If I just open a window in full-screen mode it could work, but until the GUI is launched I would see part of the Ubuntu desktop there.
The GUI is basically a monitor with a CT overlay that sends touch input via I2C.
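A minimal sketch of the full-screen approach mentioned above, assuming the DP output has been configured as its own X screen in xorg.conf (the display name ":0.1" is an assumption; adjust it to your layout). With override-redirect set, the window manager never decorates or repositions the window, and since no desktop session is extended to that screen, the DP display shows only the bare root window when the process is not running:

```cpp
// Hedged sketch: borderless full-screen Xlib window on a dedicated X screen.
// Build with: g++ gui.cpp -lX11
#include <X11/Xlib.h>

int main() {
    // Assumed display name: DP output set up as X screen 1 in xorg.conf.
    Display *dpy = XOpenDisplay(":0.1");
    if (!dpy)
        return 1;

    int scr = DefaultScreen(dpy);
    unsigned int w = DisplayWidth(dpy, scr);
    unsigned int h = DisplayHeight(dpy, scr);

    // override_redirect makes the window manager ignore this window:
    // no decorations, no repositioning, no desktop visible underneath.
    XSetWindowAttributes attrs = {};
    attrs.override_redirect = True;
    attrs.background_pixel = BlackPixel(dpy, scr);

    Window win = XCreateWindow(dpy, RootWindow(dpy, scr),
                               0, 0, w, h, 0,
                               CopyFromParent, InputOutput, CopyFromParent,
                               CWOverrideRedirect | CWBackPixel, &attrs);
    XMapRaised(dpy, win);
    XFlush(dpy);

    // ... create a GLX/EGL context on `win` and run the GUI loop here ...

    XCloseDisplay(dpy);
    return 0;
}
```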

Related

Activate HDMI output of nVidia GPU for audio without Windows extending the desktop to that display

To get audio output from the HDMI port on an nVidia GPU, the desktop must be extended to that output, because the audio is carried within the video stream. But if my monitor uses DisplayPort, and the AV receiver connected to the HDMI output does not support my monitor's resolution, I cannot use the portion of the desktop that sits on the HDMI output.
Assume that duplicating the desktop on the DisplayPort and HDMI outputs is not an option.
Is there a way to stop Windows from extending the desktop to the secondary display, for example by invoking a system function, without deactivating the video output on the HDMI port? Or is the desktop tied into the graphics driver in such a way that this is not possible?
I've been looking into this as well... I am very surprised to learn that NVIDIA doesn't provide a way to do this.
I've found a few people recommending Dual Monitor Tools to lock your mouse to your primary screen. Not elegant, but it is a solution, I guess. :)

Can't log in to embedded Linux Buildroot

I'm looking to install Linux onto an Intel Galileo Gen 2, following this and this, by installing onto an SD card.
I believe I have done this successfully, since during the boot sequence I am able to select Linux to boot from. However, as soon as it starts booting into Linux, I am unable to interact with the Galileo anymore, for example by typing in my username and password when it comes time to log in.
I'm unsure whether my peripheral setup is wrong, whether I need to install more drivers to support I/O, or something else.
I am viewing the logs from the Galileo via an FTDI cable and currently have a keyboard plugged directly into the Galileo.
Log data
When I boot the Galileo, this is what is logged. Interestingly, the flashing cursor stops flashing and is just steady when I get to the login screen, as if the device is sort of frozen. However, if I then connect a keyboard, it recognizes it and outputs this log data.
Solved! Turns out it was a faulty FTDI cable!

How to move without a VR or AR device?

Most, if not all, MR devices drive the camera automatically. How is this supposed to be achieved on Windows Standalone, for example, where I use the mouse as the input device? I suppose there is some sort of WASD movement, either already available or implementable?
You can use the Input Simulation Service to move the camera on Windows Standalone; the camera will then move using WASD. To enable this, just enable the Input Simulation Service for the Windows Standalone platform.

How to remotely send keyboard events to embedded Qt Quick Application?

I have an embedded Linux 3.10.17 system running a Qt Quick 5.2.1 application. It has a graphical UI that can be controlled by plugging in a USB keyboard. What I would like to do is control the application remotely via a remote desktop connection to a Windows PC sitting next to the embedded system. Currently, STDIO is not passed to the Qt application as keyboard events. Three ideas came to mind:
1. Modify the Qt application to read STDIO data and act on those events (see the sketch after this list). I thought this would be a common thing to do, but so far my searches have not yielded any good solutions.
2. Create a Linux kernel driver that sends any keycodes received through a char device write (pipe) through the input subsystem. Something like this should be available, I'd think...
3. Buy some form of device that plugs into the embedded system via USB and connects to the PC via USB, RS-232, or Ethernet.
I'm not sure which path offers the least resistance. Any expert advice on this would be appreciated.
Thanks,
Otto
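A hedged sketch of idea 1, assuming a Qt 5 QGuiApplication: a small helper that watches STDIN with QSocketNotifier and re-sends each character to the focused window as key events. The class name and the naive character-to-key mapping are illustrative assumptions, not an existing Qt facility:

```cpp
// Sketch only: turn characters arriving on STDIN into Qt key events for the
// focused window. Written against Qt 5 (the system above runs 5.2.1).
#include <QGuiApplication>
#include <QKeyEvent>
#include <QSocketNotifier>
#include <QWindow>
#include <cctype>
#include <unistd.h>

class StdinKeyForwarder : public QObject {
public:
    explicit StdinKeyForwarder(QObject *parent = nullptr)
        : QObject(parent), m_notifier(STDIN_FILENO, QSocketNotifier::Read) {
        connect(&m_notifier, &QSocketNotifier::activated,
                this, &StdinKeyForwarder::readStdin);
    }

private:
    void readStdin() {
        char c;
        if (read(STDIN_FILENO, &c, 1) != 1)
            return;
        QWindow *target = QGuiApplication::focusWindow();
        if (!target)
            return;
        // Qt key codes for letters match their uppercase ASCII values.
        // A real bridge would also map Enter, Backspace, arrows, etc.
        int key = std::toupper(static_cast<unsigned char>(c));
        QKeyEvent press(QEvent::KeyPress, key, Qt::NoModifier, QString(QChar(c)));
        QKeyEvent release(QEvent::KeyRelease, key, Qt::NoModifier, QString(QChar(c)));
        QCoreApplication::sendEvent(target, &press);
        QCoreApplication::sendEvent(target, &release);
    }

    QSocketNotifier m_notifier;
};
```

Instantiating StdinKeyForwarder once after the QGuiApplication is constructed should be enough; anything typed into the remote shell session that launched the application would then reach the UI as key presses.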

Sending touch screen events through X11

I have an embedded application which uses X11 with OpenGL for windowing and rendering of graphics. The device has a touch screen, which is used for application interaction. Currently, the touch screen driver is implemented in our application space and we handle the events accordingly.
However, I want the touch events to go to the application via the X11 interface.
Can anyone help me understand how this can be achieved?
Probably the easiest way is the uinput module. This allows you to create a "virtual device" in userspace that can generate events. Those can be picked up by the X server's evdev driver and sent to your application (or any other window).
See linux uinput: simple example?
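A minimal sketch of that uinput approach, assuming a single-touch screen; the device name and the 800x480 axis ranges are placeholders for your panel's values. Your application-space driver would emit events like these whenever it decodes a touch from the hardware, and the X server's evdev driver picks them up like any other touchscreen:

```cpp
// Sketch: register a virtual single-touch device via uinput, then emit one
// touch-down at (400, 240). Error handling is trimmed for brevity.
#include <fcntl.h>
#include <linux/uinput.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

static void emit_event(int fd, int type, int code, int value) {
    struct input_event ev = {};
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main() {
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0)
        return 1;

    // Declare which event types and codes the virtual device can send.
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_TOUCH);
    ioctl(fd, UI_SET_EVBIT, EV_ABS);
    ioctl(fd, UI_SET_ABSBIT, ABS_X);
    ioctl(fd, UI_SET_ABSBIT, ABS_Y);

    // Legacy setup interface (works on older kernels such as 3.10).
    struct uinput_user_dev uidev = {};
    strcpy(uidev.name, "virtual-touchscreen");
    uidev.id.bustype = BUS_VIRTUAL;
    uidev.absmin[ABS_X] = 0; uidev.absmax[ABS_X] = 799;
    uidev.absmin[ABS_Y] = 0; uidev.absmax[ABS_Y] = 479;
    write(fd, &uidev, sizeof(uidev));
    ioctl(fd, UI_DEV_CREATE);

    sleep(1);  // give the X server a moment to pick up the new evdev device

    // One touch-down: position, press, then a SYN_REPORT to flush the frame.
    emit_event(fd, EV_ABS, ABS_X, 400);
    emit_event(fd, EV_ABS, ABS_Y, 240);
    emit_event(fd, EV_KEY, BTN_TOUCH, 1);
    emit_event(fd, EV_SYN, SYN_REPORT, 0);

    // A touch-up would be BTN_TOUCH 0 followed by another SYN_REPORT.

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}
```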
