Shortest OpenGL geometry shader example that will run on Linux?

I am looking for a short OpenGL geometry shader example that will run on Linux, preferably with as few dependencies as possible. Basically I want to use that program as a test to see whether geometry shaders are supported at all on the system it's currently running on.

Just use glxinfo (in the package mesa-utils on Ubuntu/Debian) and check the extension list (GL_EXT/ARB_geometry_shader4) or the OpenGL version (>= 3.2) for geometry shader support.
Extension example (shown here with a different extension, but you would grep for geometry_shader4 the same way):
user@machine:~$ glxinfo | grep "GL_EXT_framebuffer_object"
GL_EXT_framebuffer_multisample, GL_EXT_framebuffer_object,
Version example:
user@machine:~$ glxinfo | grep "OpenGL version"
OpenGL version string: 2.1 Mesa 7.10.2
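If you'd rather have a self-contained runtime test than parse glxinfo output, a small program along these lines should work. This is only a sketch: it assumes GLEW and freeglut purely for convenience (any context-creation library will do), the file name gs_test.c is made up, and it checks the core version as well as both extensions:
#include <GL/glew.h>
#include <GL/glut.h>
#include <stdio.h>

int main( int argc, char** argv )
{
    /* a GL context must be current before GLEW can query anything */
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE );
    glutCreateWindow( "geometry shader test" );

    if ( glewInit() != GLEW_OK )
        return 2;

    /* geometry shaders are core since 3.2; older drivers may expose
       them through one of the two extensions instead */
    if ( GLEW_VERSION_3_2 ||
         glewIsSupported( "GL_EXT_geometry_shader4" ) ||
         glewIsSupported( "GL_ARB_geometry_shader4" ) )
    {
        printf( "geometry shaders: supported\n" );
        return 0;
    }
    printf( "geometry shaders: NOT supported\n" );
    return 1;
}
Build it with something like gcc gs_test.c -lGLEW -lglut -lGL (library names may vary by distribution) and test the exit code.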

Related

Use llvmpipe instead of softpipe in Yocto

I have an old x86 machine. I built a generic intel-core2-i32 machine configuration and installed Yocto on it. The display is very slow.
Ubuntu 12.04 runs smoothly on the same machine, and the graphics driver is the same in both cases: gma500_gfx.
Then I looked into glxinfo and found the difference: Ubuntu 12.04 is using Gallium on llvmpipe,
whereas Yocto is using:
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: softpipe
OpenGL core profile version string: 3.3 (Core Profile) Mesa 19.0.3
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.1 Mesa 19.0.3
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.1 Mesa 19.0.3
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10
OpenGL ES profile extensions:
How can I change the OpenGL renderer string to "llvmpipe" on Yocto, and will this help?
You should start by checking that the llvmpipe driver is actually enabled. The mesa recipe configuration has this suspicious bit (I believe swrast here really means llvmpipe):
GALLIUMDRIVERS = "swrast"
# gallium swrast was found to crash Xorg on startup in x32 qemu
GALLIUMDRIVERS_x86-x32 = ""
Check the configuration logs or the resulting packages: if llvmpipe is not being built, try removing the last line above and see if that improves things.
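If it turns out llvmpipe is disabled for your target, the cleanest experiment is a bbappend in your own layer rather than patching the recipe. A sketch, assuming your release still uses the GALLIUMDRIVERS variable quoted above (mesa_%.bbappend is the usual wildcard file name):
# mesa_%.bbappend -- force the Gallium software rasterizer (llvmpipe)
# back on, overriding the empty x86-x32 assignment in the recipe
GALLIUMDRIVERS = "swrast"
GALLIUMDRIVERS_x86-x32 = "swrast"
After rebuilding mesa and the image, glxinfo should report llvmpipe in the renderer string if the driver is actually picked up.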

Python3 pip install vtk: OpenGL2 Error

I'm trying to install vtk via pip for Python 3.5.1 under CentOS 7.2. It seems to install and work with the system Python 2.7 instead. I get an OpenGL driver incompatibility error with Python 3.
I must use sudo. My attempts:
sudo /path/to/pip3 install vtk
sudo /path/to/python3 -m /path/to/pip3 install vtk
When executing a simple cylinder example, Python 2 works properly. The Python 3 error message:
ERROR: In /work/standalone-x64-build/VTK-source/Rendering/OpenGL2/vtkShaderProgram.cxx, line 446
vtkShaderProgram (0x169d500): 0:31(12): error: extension `GL_EXT_gpu_shader4' unsupported in fragment shader
glxinfo output:
server glx vendor string: SGI
server glx version string: 1.4
client glx vendor string: Mesa Project and SGI
client glx version string: 1.4
GLX version: 1.4
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.6, 128 bits)
OpenGL version string: 2.1 Mesa 10.6.5
OpenGL shading language version string: 1.30
OpenGL ES profile version string: OpenGL ES 2.0 Mesa 10.6.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 1.0.16
I get the same error with Mesa 17.x. Has anyone installed this under Python 3 with success? What version of Mesa and/or LLVM is compatible?
The answer posted here works.
MESA_GL_VERSION_OVERRIDE=3.3 /path/to/executable
I had tried this so many times with a manual install that I gave up on it. I've also seen here that this is a bad idea.
Edit:
I've had to crank up the override version number to 4.5 on other installs.
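Note that Mesa also has a separate MESA_GLSL_VERSION_OVERRIDE variable for the shading language version; if the GL override alone doesn't silence the shader compile error, the two can be combined, e.g.:
MESA_GL_VERSION_OVERRIDE=3.3 MESA_GLSL_VERSION_OVERRIDE=330 /path/to/executable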

Investigating 32 bit MESA driver selection

I used an NVidia card with the proprietary drivers installed on Debian Stretch.
But because I carry my hard drive between different machines (Intel, AMD, but always amd64 architecture), I decided to drop the NVidia card and roll OpenGL back to Mesa in order to get 3D acceleration on every machine. After a lot of struggling I successfully identified and recovered some files that the NVidia installer had badly overwritten (libGL.so, libdrm2.so).
Now that I've recovered the 64-bit libraries, glxgears, the browser's WebGL support, gnuplot, etc. work well.
But 32-bit applications (wine, steam) still don't work: they always fall back to the "Mesa X11" renderer.
I used
$ LIBGL_DEBUG=verbose glxinfo | grep "OpenGL renderer string"
to identify which .so and which DRI driver get selected. It prints the lookup process and the renderer:
libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/tls/r600_dri.so
libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/r600_dri.so
libGL: Using DRI2 for screen 0
OpenGL renderer string: Gallium 0.4 on AMD SUMO (DRM 2.50.0 / 4.12.0-0.bpo.1-amd64, LLVM 3.9.1)
To investigate the 32-bit libraries (we can't have both the 64-bit and 32-bit Mesa packages installed at once), I downloaded the 32-bit version:
$ apt-get download mesa-utils:i386
I unpacked it and tried to figure out why it fails to select the proper DRI driver:
LIBGL_DEBUG=verbose ./glxinfo | grep "OpenGL renderer string"
OpenGL renderer string: Mesa X11
The previous 64-bit glxinfo prints debugging information to stderr, so we can see how the selection happens.
With the 32-bit version I can't get any useful information, even if I specify the
LIBGL_DRIVERS_PATH=/usr/lib/i386-linux-gnu/dri/
environment variable, where Mesa should find the proper 32-bit .so.
$ file /usr/lib/i386-linux-gnu/dri/r600_dri.so
/usr/lib/i386-linux-gnu/dri/r600_dri.so: ELF 32-bit LSB shared object, Intel 80386, version 1 (GNU/Linux), dynamically linked, BuildID[sha1]=d5177f823f11ac8ea7412e517aa6684154de506e, stripped
How can I get more information about the Mesa DRI selection?
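One way to get more information when libGL itself prints nothing is to trace which driver files the 32-bit loader actually tries to open, independent of Mesa's own debug output. A sketch using strace (the grep pattern is just a suggestion):
$ LIBGL_DRIVERS_PATH=/usr/lib/i386-linux-gnu/dri/ \
  strace -f -e trace=open,openat ./glxinfo 2>&1 | grep _dri.so
Every *_dri.so path the loader probes shows up in the trace, together with the errno when an open fails, which usually points at a missing 32-bit dependency.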

Can't run my own OpenGL 3 programs on Ubuntu

I am experimenting with OpenGL 2.x and 3.x tutorials. The programs compile and link but then segfault on seemingly innocent lines such as
glGenBuffers (1, &m_buffer);
My main() starts with glewInit and glutInit. OpenGL 1 programs compile and run fine; it just seems to be the new functions wrapped by GLEW.
One tutorial says I should have this test before trying anything else:
if (false == glewIsSupported ("GL_VERSION_2_0"))
This test always fails, even when I change the version string to GL_VERSION_1_0.
#define GL_VERSION_1_3 1 is the highest such definition in GL/gl.h, and there is no GL/gl3.h or GL/GL3 directory.
apt says I have freeglut3 and freeglut3-dev installed, also mesa-common-dev, libglew-1.6 and libgl1-mesa-dev, but there doesn't seem to be any libgl3* package available.
Here is some driver info (I have no proprietary drivers; integrated Intel Ivy Bridge graphics with an extra Nvidia card, both of which are, I believe, OpenGL 1.4 compatible):
#> glxinfo | grep version
server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
OpenGL version string: 3.0 Mesa 9.0
OpenGL shading language version string: 1.30
All this has left me quite confused.
Are there specific OpenGL2/3/4 packages I should be installing, or in theory is it the same development package for all (for Ubuntu)?
Why is GL_VERSION_1_3 the highest defined version whereas glGenBuffers wasn't introduced until version 1.5?
Why does glewIsSupported fail even for version 1.0?
The impression I get is that I don't have libraries and/or drivers that actually implement the API, but it seems as though I do according to glxinfo, which makes me think there's something wrong with the development libraries; I don't have a coherent picture of what is going on there.
Basically, what do I have to do to get my program to compile/link/run?
I know Ubuntu isn't a great development environment but please don't suggest that I change distro. There must be a way!
My main() starts with glewInit and glutInit
Nope. You don't get a current GL context until glutCreateWindow() returns. You can call glewInit() and glewIsSupported() after that.
Something like this:
#include <GL/glew.h>
#include <GL/glut.h>
...
int main( int argc, char** argv )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE );
    glutInitWindowSize( 300, 300 );
    glutCreateWindow( "OpenGL" );   /* the GL context is current from here on */

    glewInit();                     /* now GLEW can resolve entry points */
    if ( !glewIsSupported( "GL_VERSION_2_0" ) )
        return 1;                   /* the version test now works as expected */
    ...
    return 0;
}
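Linking typically needs both wrapper libraries plus GL itself, e.g. gcc main.c -lGLEW -lglut -lGL (exact library names may vary by distribution).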

Does my computer support OpenGL 2.1?

Based on the image above (it shows a GLX version of 1.4), is the maximum version of OpenGL my computer can support 1.4? Does that mean there's no way I can write code with OpenGL 2.1?
I'm using Ubuntu 12.04
What you see there is the GLX version. GLX is the container protocol that delivers OpenGL to the X11 server. You should look for the OpenGL version string instead, which comes a bit later in that output. Use grep to filter the output; e.g. on my laptop:
datenwolf@narfi ~
%> glxinfo | grep version
server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
OpenGL version string: 2.1 Mesa 8.0.3
OpenGL shading language version string: 1.20
As you can see, my GLX version is 1.4 as well, but I have OpenGL 2.1 and GLSL 1.20 support on my mobile machine.
