QML application strange behavior with NVIDIA card enabled - Linux

I am using Ubuntu 14.04 with two graphics cards: one from Intel, and an NVIDIA Corporation GF108M [GeForce GT 635M]. On the Intel card every QML application works well, but on the NVIDIA card even the simplest QML application does not. When I resize the window, the window content "flows" faster than the window resizes and disappears outside the window frame, leaving artifacts. Just watch this video. To enable the NVIDIA card I use optirun from the bumblebee package. Please help me solve this problem.
Edit: The problem is the same on an NVIDIA Corporation GF119M [GeForce GT 520M].

I removed bumblebee and installed nvidia-prime (remember to select NVIDIA in nvidia-settings and to remove bumblebee's nvidia blacklists). The problem is gone, so it lies with bumblebee.
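For reference, the switch described above boils down to something like the following on Ubuntu 14.04. This is a sketch, not the poster's exact commands; the package and tool names are the stock Ubuntu ones, so verify them for your release:

```shell
# Sketch of the bumblebee -> nvidia-prime switch (Ubuntu 14.04 era)
sudo apt-get purge bumblebee bumblebee-nvidia
# Find leftover bumblebee blacklist entries for the nvidia module and remove them
grep -r "blacklist nvidia" /etc/modprobe.d/
sudo apt-get install nvidia-prime
sudo prime-select nvidia   # or pick NVIDIA under "PRIME Profiles" in nvidia-settings
```

After switching, log out and back in (or reboot) so the X server picks up the new driver.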

Related

Virtual Monitor on Arch Linux and NVIDIA Proprietary Driver?

I want to add multiple virtual displays to my Linux PC for VNC purposes. I have an NVIDIA GeForce GTX 750 Ti and a Ryzen 3 1200 with no on-board graphics.
I have already tried:
The Option "ConnectedMonitor" in /etc/X11/xorg.conf (works, but only allows for one output at the moment, because xrandr only shows 5 outputs. When I try to add the fifth output, though, nvidia-settings throws some kind of error saying that I can only use 4 monitors, and other programs just straight up crash when they try to change the X screen config...)
The EVDI driver (after the modprobe the X screen crashes as expected, but xrandr --listproviders doesn't show the driver and xrandr --setprovideroutputsource doesn't work either...)
Wayland (I am a KDE Plasma user and KDE's Wayland support is trash with the proprietary NVIDIA driver)
Adding the dummy driver (doesn't work because of NVIDIA's proprietary trash Xinerama [I also have multiple physical displays])
Is it maybe possible to somehow use the nouveau driver at the same time? Or am I just adding the dummy driver wrong? Is it possible to get the EVDI driver working?
I really hope someone can help me soon...
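For reference, the "ConnectedMonitor" approach mentioned above is usually wired up in the Device section of xorg.conf. A minimal sketch, where the identifier, output names, and EDID path are illustrative assumptions rather than the poster's actual config:

```
Section "Device"
    Identifier     "nvidia"
    Driver         "nvidia"
    # Claim outputs even when nothing is physically attached
    Option         "ConnectedMonitor" "DFP-0, DFP-1"
    # Fake an EDID so the driver accepts a mode on the virtual output
    Option         "CustomEDID"       "DFP-1:/etc/X11/edid.bin"
EndSection
```

The "CustomEDID" line matters because the proprietary driver typically refuses to light up a claimed output without plausible EDID data.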

AMD GPU won't work with Blender's "Cycles Render" on Linux

I've been working a lot with Blender and its "Cycles Render" on Fedora lately, but Blender keeps getting a lot slower while rendering. I discovered that my Blender is only capable of rendering with my CPU. I tried running Blender from the terminal so I could see any errors, and if I set "Device" to "GPU Compute" in the render settings, I get this output:
DRM_IOCTL_I915_GEM_APERTURE failed: Invalid argument
Assuming 131072kB available aperture size.
May lead to reduced performance or incorrect rendering.
get chip id failed: -1 [2]
param: 4, val: 0
My machine's specifications are:
Operating system: Fedora GNU/Linux 27
Blender version: 2.79
Graphics card: AMD Radeon RX 480 using "amdgpu" driver (default open-source driver)
So it seems like Blender's Cycles Render won't work with my AMD GPU...
Any ideas?
As far as I've seen in the release docs, the Blender Cycles engine is not yet fully optimized for all AMD graphics cards; currently they only support AMD cards with GCN architecture 2.0 and above. The dev team focuses mostly on NVIDIA cards (Blender is also most optimized for Windows).
However, you might as well try changing the settings. First, make sure you are using OpenCL and not CUDA in your User Preferences, under the System tab, Compute Device(s). Then, if your card is not supported, enable the experimental feature set in the render properties of your workspace (it warns you that everything may become unstable); this usually makes most AMD GPUs selectable as a render device. In the render properties you also select the compute device to use for each scene.
Also, using an official AMD driver would make rendering faster (it is also a requirement by Blender for using AMD cards), but it's not available for Fedora as far as I know. I suggest changing your distro to Ubuntu.
EDIT: You MUST use an official AMD driver for the card. I have checked that your card is on the list of supported cards (https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units); it is simply a requirement to have the official AMD driver rather than the open-source one. According to the Blender documentation, the driver must come from this list: https://support.amd.com/en-us/download/linux.
If that doesn't solve the issue, then it must be a hardware issue or a Blender bug, although you could try running it on Windows to rule out a hardware issue, if you are willing to do a dual-boot or USB-boot test.
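Before reinstalling anything, it is worth confirming whether any OpenCL platform is visible to applications at all, since Cycles can only enumerate what the ICD loader exposes. The clinfo utility (packaged for Fedora, an assumption you can check with your package manager) reports every registered platform and device:

```shell
# If this prints nothing, Cycles has no OpenCL device to enumerate either
# (install clinfo first, e.g. sudo dnf install clinfo)
clinfo | grep -iE 'platform name|device name'
```

An empty result points at the driver/ICD setup rather than at Blender's settings.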

Multiple screen view on laptop display

First, I don't know whether this is the right place to ask such a question.
My laptop displays multiple copies of the screen on booting. I googled this but was unable to find a solution; I have come across this type of dysfunction on many laptop screens.
I can't figure out whether it is a screen problem, a hardware problem, a graphics card problem, or a BIOS problem.
I would really appreciate it if anyone could direct me towards the correct solution.
The following steps will help narrow it down:
Enter Safe Mode (F8 on most machines) and see if the problem is still there. If it's gone, the problem likely lies with the graphics driver.
Try downloading the latest chipset drivers from the Intel or AMD site, depending on your graphics card. Intel has a chipset detection utility that will pull up the latest drivers for all Intel hardware on the PC.
If you see this only on the desktop (and not in other apps like the browser), it is because the desktop background image is tiled rather than centered/stretched. In that case, change it to centered.
Uninstall the graphics driver and see if the multiple images disappear. You will not see a high-resolution image without the driver, but it will rule out card failure versus an improper driver.

ERROR: clGetPlatformIDs -1001 when running OpenCL code (Linux)

After finally managing to get my code to compile with OpenCL, I cannot seem to get the output binary to run! This is on my Linux laptop running Kubuntu 13.10 x64.
The error I get is (Printed from cl::Error):
ERROR: clGetPlatformIDs
-1001
I found this post but there does not seem to be a clear solution.
I added myself to the video group but this does not seem to work.
With regards to the ICD profile... I am not sure what I need to do - shouldn't this be included with the CUDA toolkit? If not, where could I download one?
EDIT: It seems I have an ICD file in my system under /usr/share/nvidia-331/nvidia.icd. It contains the following text:
libnvidia-opencl.so.1
The only file in my system that resembles this is:
/usr/lib/nvidia-331/libnvidia-opencl.so.331.20
Is my ICD profile somehow wrong? Does anyone know a way to fix it?
(Mods: I am not sure if this post should be moved to AskUbuntu seeing as it was an issue related to Linux bumblebee rather than OpenCL itself?)
OK, so I managed to solve the issue after loads of effort.
There are two things that I needed to do:
Getting ICD to work
Create a symbolic link from /usr/share/nvidia-331/nvidia.icd into /etc/OpenCL/vendors:
sudo ln -s /usr/share/nvidia-331/nvidia.icd /etc/OpenCL/vendors
NOTE: In most cases you will need to replace nvidia-331 with whatever driver you are using - most commonly nvidia-current.
I am really curious as to why this isn't done automatically when installing the CUDA toolkit - but I have noticed that OpenCL programs will not work without this step!
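The .icd file itself is just a one-line pointer: the OpenCL ICD loader scans /etc/OpenCL/vendors/*.icd and dlopens whatever library name each file contains. A minimal sketch of that mechanism, using a throwaway directory instead of the real system paths:

```shell
# Demonstrate the ICD registration mechanism with temporary paths
# (on a real system these are /usr/share/nvidia-331 and /etc/OpenCL/vendors)
mkdir -p /tmp/icd-demo/vendors
echo "libnvidia-opencl.so.1" > /tmp/icd-demo/nvidia.icd          # vendor-supplied file
ln -sf /tmp/icd-demo/nvidia.icd /tmp/icd-demo/vendors/nvidia.icd # what the symlink step does
cat /tmp/icd-demo/vendors/nvidia.icd   # the loader reads this name and dlopens that library
```

This is why the symlink fixes error -1001: without a readable .icd file in the vendors directory, clGetPlatformIDs finds zero platforms.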
Nvidia Optimus with Bumblebee
The reason this was so complicated to get working is that I have an NVIDIA Optimus laptop with poor driver support on Linux. To work around this, I have bumblebee installed to allow switching between my NVIDIA card and my Intel card.
However, because I am using bumblebee, the NVIDIA graphics card (and the nvidia driver) stays unloaded unless explicitly told otherwise. In order to use OpenCL, we need to turn the NVIDIA graphics card on.
To do this, we explicitly tell bumblebee to turn the NVIDIA card on using the optirun or primusrun commands:
optirun myopenclprogram
Note, however, that because all that matters is that the NVIDIA card is turned on and the drivers are loaded, you do not need to keep using optirun myprogram (which always incurs the initial delay of waiting for the graphics card to be initialised).
You can run optirun kate, for example, and this turns on the NVIDIA graphics card. Then, in a separate terminal, just run your OpenCL program without optirun and it will work fine, since the graphics card has already been turned on (and will remain on as long as you leave e.g. kate running).
You will notice there is no delay in starting your program this time! This saves a lot of waiting - especially if you are developing the OpenCL program in question.
Once again, as long as you keep the NVIDIA graphics card turned on, your OpenCL program will work.
I will probably contact the bumblebee devs to see if there is an easier way to get this working and report back what they say. Hopefully there is some way to turn the NVIDIA card on and off without keeping a program (like kate in my example) running.
I hope this helps anyone trying to use OpenCL on Linux laptops with bumblebee in the future (I could not find any clear-cut solution myself).
EDIT2: Turning your graphics card on and off can be done as follows for bumblebee users:
Turn graphics card on and load nvidia module
sudo tee /proc/acpi/bbswitch <<< ON
sudo modprobe nvidia
Turn graphics card off (nvidia module is automatically unloaded)
sudo tee /proc/acpi/bbswitch <<< OFF
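The two steps can be wrapped in a pair of shell functions for convenience. This is a sketch; the /proc/acpi/bbswitch path is the standard one for bbswitch, but adjust it if your setup differs:

```shell
# Helper functions around bbswitch; call gpu_on before running OpenCL code
BBSWITCH=/proc/acpi/bbswitch
gpu_on()  { sudo tee "$BBSWITCH" <<< ON  > /dev/null && sudo modprobe nvidia; }
gpu_off() { sudo tee "$BBSWITCH" <<< OFF > /dev/null; }
```

Put these in your ~/.bashrc and you can toggle the card without remembering the tee incantation.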
To share some additional info: I installed the Intel OpenCL version on Ubuntu 13.10 Saucy. The problem was the same: a -1001 error. I solved it with a link, analogously to the previous post:
sudo ln -sf /opt/intel/opencl-1.2-3.2.1.16712/etc/intel64.icd /etc/OpenCL/vendors/nvidia.icd

Completely silent (or custom) boot for Linux appliance

I am working on an appliance that runs Linux (Ubuntu 10.04 for now) on an x86-64 Intel-based PC. I need to completely customize the boot screens - no BIOS messages, and either (a) no screen output until X is launched or (b) custom screen output via VESA/VBE.
(b) looks hard to achieve, because none of the framebuffer drivers (vesafb, uvesafb) seem to support addressing pixels on more than 2 monitors driven by the same graphics card.
So, I'm looking at (a) : no output on screen until X is launched. This should mean no Dell BIOS output, no Ubuntu splash screen, no console output.
Any ideas will be much appreciated.
I don't know much about the framebuffer stuff, but I can tell you for sure that you'll have no control over the BIOS output from Linux.
In the boot process, the BIOS comes first, then the boot loader, then Linux. So by the time Linux appears, the BIOS output has long since finished.
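For the parts Linux does control - the boot loader menu, the kernel console, and the Ubuntu splash - the usual levers on Ubuntu are GRUB's timeout and the kernel command line. A sketch; the flag names are standard GRUB2/kernel parameters, but check them against your GRUB and kernel versions:

```shell
# In /etc/default/grub:
#   GRUB_TIMEOUT=0                                        # skip the menu entirely
#   GRUB_HIDDEN_TIMEOUT=0
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet loglevel=0 vt.handoff=7"
# then regenerate the boot configuration:
sudo update-grub
```

"quiet" and "loglevel=0" suppress kernel console messages, while "vt.handoff=7" is Ubuntu's flicker-free handover to the splash/X terminal. The BIOS output itself, as noted above, can only be reduced via the firmware's own setup options (e.g. a "quiet boot" setting), not from Linux.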
