Running any of the "Sample" projects that come with MonoGame results in a "MonoGameGLException". This occurs in the Texture2D constructor at the following code:
GL.CompressedTexImage2D(TextureTarget.Texture2D, 0, glInternalFormat,
                        this.width, this.height, 0,
                        imageSize, IntPtr.Zero);
GraphicsExtensions.CheckGLError(); // <-- error is thrown here
The value for "glInternalFormat" is:
OpenTK.Graphics.OpenGL.PixelInternalFormat.CompressedRgbaS3tcDxt3Ext
This occurs on Ubuntu Linux running on an Intel integrated graphics card.
TL;DR
This error is caused by the OpenGL implementation (Mesa) not supporting the requested internal texture format (S3TC). Install the libtxc_dxtn library to add this functionality to Mesa.
The Problem
OpenGL is the graphics API that MonoGame uses to render graphics. Since OpenGL is just an API, you need an implementation of that API for it to work on your system. If you have an NVIDIA or ATI card, you can download their proprietary drivers (which include closed-source implementations of the OpenGL API). If you are not using proprietary drivers (or are using hardware, such as an Intel graphics card, that has open-source drivers), then you are probably running Mesa, an open-source implementation of the OpenGL API standard.
OpenGL has a base standard, but it also has many extensions. One of these is EXT_texture_compression_s3tc, an extension that lets an application specify that the textures being loaded into the graphics card should be compressed with a specific algorithm from the S3TC family. That is what the GL.CompressedTexImage2D line actually does: it requests storage for a texture that will be compressed using the DXT3 algorithm.
Mesa does not implement ALL of the OpenGL API and its extensions. Specifically, it does not implement the S3TC extension, for legal reasons.
"Invalid Enum" is the error Mesa throws because you are telling it to use an extension (via the enumerated value PixelInternalFormat.CompressedRgbaS3tcDxt3Ext) that it does not support. This is the standard way an OpenGL implementation tells you that you are referring to an unsupported value.
The Solution
So, there are two options to fix this:
First, work around it in MonoGame so that it stops using S3TC. This might require changing MonoGame code, or perhaps there is a better way of using the Content Pipeline to specify which texture compression algorithm to use.
Second, install the S3TC algorithm. While Mesa does not include S3TC by default, you can add an external library to Mesa; there are more details on the Mesa Wiki's S3TC page. Mesa needs to have been built with a specific compiler flag, and then the library needs to be installed. On my Ubuntu install (12.04), the Mesa package apparently did have this flag enabled at compile time, since installing the library worked. In Ubuntu, I installed the libtxc-dxtn-s2tc0 package:
sudo apt-get install libtxc-dxtn-s2tc0
After that, all the MonoGame samples were working.
Related
I recently ported an installer from an unmaintainable, complicated batch script to Python and thought it would be nice to have a neat graphical front end for the console installer. I decided to use Kivy because I had used it for some pet projects at home, and UI design with it went quickly.
However, what I did not know until recently is that Kivy only seems to work with OpenGL 2.0. Unfortunately, our company's software is frequently installed on virtual machines, and their virtualized graphics adapters often don't support any OpenGL newer than 1.0 or 1.1 (VirtualBox, for example). This prevents the Kivy app from starting, or, if it does start, it doesn't render correctly.
I searched the internet for a way to get Kivy to work without OpenGL 2.0. Some posts on GitHub, and I think on Reddit, suggested using ANGLE instead of sdl2, or switching to GLEW. I tried the suggested solutions, but without success.
I wonder: is there actually a way to get Kivy apps to work without OpenGL 2.0, e.g. on OpenGL 1.1?
I use Python 3.6.4 and Kivy 1.10.1 on Windows as a dev and target system.
Kivy targets OpenGL ES 2.0 as a minimum version. Note that OpenGL ES is not the same as OpenGL; it's closer to OpenGL 3.0.
This is a minimum required version, anything newer should work fine.
You can use ANGLE on Windows if you want. I think we use it because it's more stable than relying on Windows OpenGL drivers, but I'm not sure.
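For reference, the backend is selected through an environment variable that must be set before Kivy is imported. A minimal sketch (KIVY_GL_BACKEND and the angle_sdl2 value are documented Kivy settings; the questioner reports ANGLE did not help on their VMs, so treat this as illustration, not a guaranteed fix):

```python
import os

# Select the ANGLE backend, which translates Kivy's OpenGL ES 2.0 calls
# into Direct3D calls on Windows. This must happen before `import kivy`,
# because Kivy reads the variable at import time.
os.environ["KIVY_GL_BACKEND"] = "angle_sdl2"

# From here on, import kivy and build the App subclass as usual.
print(os.environ["KIVY_GL_BACKEND"])
```

Other documented values for KIVY_GL_BACKEND include "gl" and "sdl2"; switching between them is a quick way to test which one your target machine's drivers tolerate.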
I have managed to build a set of Mesa libraries that, together with a dummy X server, can run an OpenGL application on a machine with no GPU, using OpenSWR, without relinking or changing any code in the application. That was on a machine with a pre-existing Xorg installation. I'm using Linux on x64 (CentOS 7, to be specific).
The mesa build instructions I used are the following:
http://openswr.org/build-linux.html
Now I'm trying to set up a build process for this on machines without Xorg installed to begin with, for various reasons I won't go into. Since there are back-and-forth dependencies between the Xorg build process and Mesa, it becomes a dance of building the right Xorg and Mesa modules with the right parameters.
I have reached a point where I'm stuck.
It seems like Mesa won't build Gallium if DRI is enabled, i.e. if I remove the --disable-dri line from the Mesa configure command line in the OpenSWR example, it fails with the following error:
configure: error: Xlib-based (Gallium) GLX cannot be built with DRI enabled
On the Xorg side, using a Mesa built without DRI and building xserver with the --disable-dri flag, the configure script succeeds but the build fails with the following error:
glxdriswrast.c:39:10: fatal error: GL/internal/dri_interface.h: No such file or directory
(suggesting there are parts of xserver that still refer to DRI headers even when it is disabled with --disable-dri).
Does anyone have any idea what the right combination of flags is to get all of this working? I'm not married to OpenSWR; llvmpipe is probably good enough for me, if that makes things easier.
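For context, the error above comes from the mutual exclusion in Mesa's autotools configure: a gallium-xlib GLX build must keep DRI disabled. A rough sketch of the Mesa side, based on the flags in the OpenSWR build instructions (exact flag names vary between Mesa versions, so treat this as an assumption to check against your Mesa tree, not a verified recipe):

```shell
# Mesa: software-only, Xlib-based (Gallium) GLX with the swr and
# swrast (llvmpipe/softpipe) rasterizers. --disable-dri is mandatory
# here, since configure refuses gallium-xlib GLX with DRI enabled.
./configure \
    --disable-dri \
    --disable-egl \
    --enable-glx=gallium-xlib \
    --with-gallium-drivers=swrast,swr
make -j"$(nproc)"
```

With this kind of Mesa, the xserver build should not need DRI-capable GL at all, which is consistent with the dri_interface.h failure being an xserver-side configuration problem rather than a Mesa one.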
I am running Qt 5.4.2 on a Tegra 3 with a Yocto-compiled Angstrom image. Unfortunately, the Tegra only has proprietary drivers, and they force me to use X11 for hardware acceleration. I therefore had to build Qt with XCB instead of EGLFS.
My problem is that I am seeing tearing in some QML 2 animations on the device that is not visible on my PC. I am wondering if there is anything I can do to get rid of this tearing. I have read that QT_QPA_EGLFS_FORCEVSYNC might help, but unfortunately I can't use EGLFS and doubt it would be of any use.
EDIT:
I have noticed that "QSGContext::initialize: stencil buffer support missing, expect rendering errors" as well as "Qt Warning: Could not find a location of the system's Compose files. Consider setting the QTCOMPOSE environment variable." come up when the application is started. Could they be related to the problem?
Can someone confirm that WebGL works with node-webkit on Linux distros?
I can make WebGL run in Google Chrome after enabling the "Override software rendering list" flag on chrome://flags/, but I'm getting Three.js-related errors when trying to execute the same application in node-webkit.
To answer my own question:
It just depends on whether the system has a video driver installed that supports:
hardware acceleration and
OpenGL ES 2.0
(the requirements for WebGL).
Node-WebKit or not, WebGL will run smoothly if the above requirements are met.
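The check that WebGL libraries such as Three.js perform internally boils down to asking a canvas for a webgl context, which the browser refuses when the driver requirements above are not met. A minimal sketch of that feature test (supportsWebGL is a hypothetical helper, not Three.js code):

```javascript
// Returns true if `canvas` can produce a WebGL rendering context.
// Browsers return null (or throw) from getContext when the video
// driver lacks hardware acceleration / OpenGL ES 2.0 support.
function supportsWebGL(canvas) {
  try {
    const gl = canvas.getContext("webgl") ||
               canvas.getContext("experimental-webgl");
    return gl !== null && gl !== undefined;
  } catch (e) {
    return false;
  }
}

// In a real page or node-webkit window:
//   supportsWebGL(document.createElement("canvas"))
```

Running this check before initializing Three.js lets the application show a clear "WebGL unavailable" message instead of failing with renderer errors.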
I have done coding in OpenGL; now I want to switch to OpenGL ES 2.0. I know we need an emulator to run OpenGL ES on Linux. I have done some examples using the PVR SDK, but now I want to write the code on my own. Can anyone tell me how I can start coding in OpenGL ES 2.0 on Ubuntu 10.10?
Any tutorial or something like that would be appreciated.
I think you can use the following threads as a starting point:
Recommended practice environment for OpenGL ES 2.0?
Create an OpenGL ES 2.0 context on a "standard" Linux system
I use SDL 1.3 with the PVRSDK. You could also use the Mesa OpenGL ES wrapper libraries from the software repositories.
And in OpenGL 4.1 there is also the GL_ARB_es2_compatibility extension, which should allow you to use the OpenGL ES functionality without the extra libraries.
I prefer to test with the PVRSDK though since that is the one that my Android device uses.