Multiple screen images on laptop display - graphics

First, I don't know whether this is the right place to ask such a question.
My laptop displays multiple copies of the screen on booting. I googled this but was unable to find a solution; I have come across this type of dysfunction on many laptop screens.
I can't figure out whether it is a screen problem, a hardware problem, a graphics card problem, or a BIOS problem.
I would really appreciate it if anyone could point me toward the correct solution.

The following steps will help narrow it down:
Enter Safe Mode (F8 on most machines) and see if the problem is still there. If it's gone, the problem is likely with the graphics driver.
Download the latest chipset and graphics drivers from the Intel or AMD site, depending on your graphics card. Intel has a driver detection utility that will pull up the latest drivers for all Intel hardware on the PC.
If you see this only on the desktop (and not in other apps such as a browser), it is because the desktop background image is set to Tiled rather than Centered or Stretched. In that case, change it to Centered.
Uninstall the graphics driver and see if the multiple images disappear. You will not get a high-resolution image without the driver, but it will help you distinguish card failure from an improper driver.

Related

How do I enumerate and use OpenGL on a headless GPU?

Despite days of research, I can't seem to find a reliable way to programmatically create GL[ES] contexts on secondary GPUs. On my laptop, for example, I have the Intel GPU driving the panel, and a secondary NVIDIA card not connected to anything. optirun and primusrun let me run on the NVIDIA card, but then I can't detect the Intel GPU. I also don't want to require changing xorg.conf to add a dummy display.
I have tried a number of extensions, but none seem to work correctly:
glXEnumerateVideoDevicesNV returns 0 devices.
eglQueryDevicesEXT returns 0 devices.
eglGetPlatformDisplay only works with the main panel and gives me an Intel context.
I am fine getting my hands dirty, e.g. rolling my own loader, but I can't seem to find any documentation for where to start. I've looked at the source for optirun but it just seems to do redirection. Obviously something equivalent to Windows' IDXGIFactory::EnumAdapters would be ideal, but I'm fine with anything that works without additional system configuration.
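For what it's worth, here is a minimal sketch of calling eglQueryDevicesEXT from the EGL_EXT_device_enumeration extension, using Python's ctypes so it needs no extra build setup. It assumes a libEGL that exposes the function through eglGetProcAddress; if the library or extension is missing (as in the 0-devices case above), it just reports an empty list rather than failing.

```python
# Sketch: enumerate EGL devices via EGL_EXT_device_enumeration using ctypes.
# Assumption: libEGL is present and exposes eglQueryDevicesEXT through
# eglGetProcAddress; otherwise we fall back to an empty device list.
import ctypes
import ctypes.util

def enumerate_egl_devices(max_devices=16):
    """Return a list of EGLDeviceEXT handles (possibly empty)."""
    libname = ctypes.util.find_library("EGL")
    if libname is None:
        return []  # no EGL library on this system
    egl = ctypes.CDLL(libname)

    egl.eglGetProcAddress.restype = ctypes.c_void_p
    egl.eglGetProcAddress.argtypes = [ctypes.c_char_p]
    addr = egl.eglGetProcAddress(b"eglQueryDevicesEXT")
    if not addr:
        return []  # extension not exposed by this driver

    # EGLBoolean eglQueryDevicesEXT(EGLint max_devices,
    #                               EGLDeviceEXT *devices, EGLint *num_devices)
    proto = ctypes.CFUNCTYPE(ctypes.c_uint, ctypes.c_int,
                             ctypes.POINTER(ctypes.c_void_p),
                             ctypes.POINTER(ctypes.c_int))
    query_devices = proto(addr)

    devices = (ctypes.c_void_p * max_devices)()
    count = ctypes.c_int(0)
    if not query_devices(max_devices, devices, ctypes.byref(count)):
        return []
    return list(devices[:count.value])

if __name__ == "__main__":
    print("EGL devices found:", len(enumerate_egl_devices()))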

elementary OS: one external monitor sometimes goes black on login

I have a problem. I am using elementary OS on a ThinkPad T570 laptop. I've installed the Nvidia drivers by installing cube. I am using 2 external displays: one is connected via a VGA cable to my docking station, and the other goes from the Mini DisplayPort on the monitor to a DisplayPort on the docking station. At first I thought it was a driver issue, since I was using Nouveau, but yesterday I installed the Nvidia driver as mentioned and the result is the same. When I lock the screen, all displays are up, running, and showing. When I unlock, the first thing that bothers me is that it takes about 20 seconds of the displays blinking black and turning off before the actual screen shows. And in 9 out of 10 cases one of the screens stays black (normally the one on Mini DisplayPort); when I get lucky, all 3 displays show up and work perfectly fine.
As I mentioned, I searched online for a solution, but what I found was about drivers; I managed to install cube from Nvidia, which should work fine. I'm still a bit of a newbie with Linux, so if you need any additional information, just say so and I will provide it.
Greetings,
Marko.

Create a Wacom-like Linux uinput device to work with a touchscreen and pen

This is a fairly broad question, so I will try to keep it as focused as I can.
I currently own a Lenovo laptop with Ubuntu installed and touchscreen functionality, as well as a pressure-sensitive Bluetooth pen, and I have been trying to make the two work together as a cheap Cintiq-like tablet.
Unfortunately, the pen only supports specific apps on iOS phones and tablets.
So after lots of research, I've managed to interface with the pen and create a uinput device for it, so I can register button clicks and pressure changes on the pen and even see them routed to GIMP when configuring the device through the Input Controllers menu.
The code I have so far for that interface is available here.
The trouble starts when trying to test it out with GIMP.
From what I gather, this is because GIMP assumes Wacom devices report their own position, treats touchscreen touches as mouse movements, and only allows input from a single device at a time.
My question is: how can I work around this?
More specifically, how can I create a uinput device that would behave as a Wacom tablet and supersede/block the behavior I described?
Or is there a different solution, such as patching GIMP or writing a plugin for it?
Update (2014-06-07)
The code mentioned above now works.
I have written a blog post on the process of getting this to work: http://gerev.github.io/laptop-cintiq
As you said, GIMP expects you to provide ABS_X and ABS_Y along with ABS_PRESSURE in your driver. This is not strange: you are using your virtual device as the input, so it wouldn't make much sense to pick the ABS_X and ABS_Y coordinates from one device and ABS_PRESSURE from another (although they will always be the same in this case). Maybe you can just read the current coordinates of the mouse and copy them as your own device's coordinates.
As an example, the GfxTablet project does something similar to what you are trying: it has an Android application for tablets with a pen and uses uinput to create a virtual device that acts as a pressure-sensitive pen on Linux. I have used it, and it worked like a charm in GIMP and MyPaint on my laptop; I had no problem having a mouse (or the touchpad) active at the same time as the uinput device. (I think Krita recently added support for generic pressure-sensitive devices.) You can take a look at the source code of the driver here (surprisingly simple, to be fair).
Note that this is not faulty behavior in GIMP; it is what is expected from a tablet-like device. Take a look at the event codes page of the kernel documentation: in the last section (Guidelines), it says that tablets must report ABS_X and ABS_Y. Moreover, they should use BTN_STYLUS and BTN_STYLUS2 to report the tool buttons and some BTN_TOOL_* code (e.g. BTN_TOOL_PEN) to report activity (you can find all the available codes in input.h). However, these last do not seem that important, as GfxTablet does not implement them and works without problems.
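The guidelines above boil down to a specific capability set that a Wacom-like uinput device should advertise. Here is a small sketch of that set; the constant values are copied from the kernel's linux/input-event-codes.h, and the dictionary shape (event type to list of codes) is just one convenient way to organize them before handing them to whatever uinput wrapper you use.

```python
# Sketch: the event codes a Wacom-like uinput tablet should enable,
# per the "Guidelines" section of the kernel event-codes documentation.
# Constant values copied from linux/input-event-codes.h.
EV_KEY, EV_ABS = 0x01, 0x03
ABS_X, ABS_Y, ABS_PRESSURE = 0x00, 0x01, 0x18
BTN_TOOL_PEN, BTN_TOUCH, BTN_STYLUS, BTN_STYLUS2 = 0x140, 0x14a, 0x14b, 0x14c

def tablet_capabilities():
    """Event codes a pen-tablet uinput device should advertise."""
    return {
        EV_KEY: [BTN_TOOL_PEN, BTN_TOUCH, BTN_STYLUS, BTN_STYLUS2],
        # Position must come from the same device as pressure,
        # which is exactly what GIMP expects.
        EV_ABS: [ABS_X, ABS_Y, ABS_PRESSURE],
    }

if __name__ == "__main__":
    print(tablet_capabilities())
```

With the third-party python-evdev package, a dictionary along these lines (with ABS ranges attached to each axis) can be passed to its UInput class to create the actual device, though I have not tied this sketch to any particular wrapper.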

No touches after adding touchscreen driver to CE 6 in platform builder

I have added a TSHARC touchscreen driver to my Windows CE project, but touch does not work. The DLL is there, as is the touchscreen calibration executable. I have no visibility into which drivers are loaded and when. Any guidance would be appreciated.
You're going to have to do some debugging, and touchscreen drivers tend to be challenging because they get loaded into GWES and because the electrical characteristics of touch panels change dramatically based on size and manufacturer. It's very rare for a driver to just work right out of the box; you almost always have to adjust sample timings and the like based on panel characteristics, and that's best done using an oscilloscope.
Things to check:
Is the driver getting loaded at all? A RETAILMSG/DEBUGMSG would tell you that.
Are you getting touch interrupts?
After a down interrupt, is your code getting back to a state where it can receive an up?
If you look at the timings from panel signals themselves, are you sampling when the signals are stable (i.e. you're not sampling too soon after the interrupt)?
Turns out it was a conflict between the OHCI driver and another USB driver already installed.

Is the kernel or userspace responsible for rotating the framebuffer to match the screen?

I'm working on an embedded device whose screen is rotated 90 degrees clockwise: the screen controller reports an 800x600 screen, while the device's screen is 600x800 portrait.
What do you think: whose responsibility is it to compensate for this? Should the kernel rotate the framebuffer to provide the 800x600 screen expected by upper-level software, or should applications (X server, bootsplash) adapt and draw to the rotated screen?
Every part of the stack is free software, so there are no non-technical obstacles to modification; the question is more about logical soundness.
It makes most sense for the screen driver to do it: the kernel, after all, is supposed to provide an abstraction of the device for userspace applications to work with. If the screen is a 600x800 portrait-oriented device, then that's what applications should see from the kernel.
Yes, I agree. The display driver should update the display accordingly and keep control.
Not sure exactly how standard your embedded device is, but if it is running a regular Linux kernel, you might check the kernel configurator (make xconfig, when compiling a new kernel). One of the options for kernel 2.6.37.6, in the video card section of the device drivers, enables rotation of the kernel message display so it scrolls 90 degrees left or right while booting up.
I think it also makes your consoles rotate correctly after login.
This was not available in kernels even 6-8 months ago; at least it was not in the kernel that Slackware64 13.37 shipped with around that time.
Note that the BIOS messages are still rotated on a PC motherboard, but that is hard-coded in the BIOS, which may not apply to the embedded system you are working with.
If this kernel feature is not useful to you for whatever reason, how it is done in the Linux kernel might be a good example of where and how to go about it. Once you get the exact name of the option from make xconfig, it should be pretty easy to search the kernel source for that name and dig up some information about it.
Hmm. I just recompiled my kernel today, and I may have been wrong about how new this option is. It looks like it was available in some kernel versions before the Slackware64 ones I referenced. Sorry!
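For reference, in mainline kernels the feature described above is the framebuffer-console rotation support. A sketch of the relevant configuration, with option and parameter names as they appear in mainline (worth verifying against your particular kernel version):

```
# Kernel config (Device Drivers -> Graphics support -> Console display driver support)
CONFIG_FRAMEBUFFER_CONSOLE=y
CONFIG_FRAMEBUFFER_CONSOLE_ROTATION=y

# Boot parameter: rotate the framebuffer console 90 degrees clockwise
fbcon=rotate:1
```

The rotation can also be changed at runtime by writing 0-3 to /sys/class/graphics/fbcon/rotate. Note this rotates the console only; an X server or other userspace drawing directly to the framebuffer still sees the unrotated device.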
