Nvidia "High Performance Processor" setting leads to graphical bug (seizure warning) with current lighting system, computed entirely in shader code

I followed the lighting tutorial on LearnOpenGL, modifying some of the code to work in a 2D game engine. Everything was looking great, my team got our game done, and the lights were quite simple for our designers to use. However, we ran into a rare bug (https://www.youtube.com/watch?v=to0mMP5I0cs). One team member was able to reproduce it by switching his Nvidia settings to use the "High Performance Processor" instead of "Integrated Graphics". Otherwise everything renders properly. The bug doesn't appear when there are no lights and everything is rendered at full colour. We have gone through a lot of ideas already, but none of them worked, and now I am at a loss. Does anyone have any ideas about what is going on?

Always make sure you initialize your variables. Apparently some cards and drivers automatically initialize a vec3 to (0, 0, 0), but others don't, and that was what was going on here: garbage values were producing a different colour at each fragment. Initializing my resulting colour vec3 to (0, 0, 0) at the beginning fixed the problem.
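A minimal GLSL sketch of the fix (the light count, array, and function names here are hypothetical; the point is the explicit initializer on the accumulator):

```glsl
// Hypothetical fragment-shader excerpt. Without the explicit initializer,
// some drivers happen to zero the accumulator and others leave garbage in
// it, which shows up as a different random colour at each fragment.
vec3 result = vec3(0.0, 0.0, 0.0);

for (int i = 0; i < NUM_LIGHTS; i++) {
    result += CalcPointLight(lights[i], norm, fragPos);
}

FragColor = vec4(result, 1.0);
```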

Related

Vector3.ToString "rounds" decimals after build

I'm trying to get a very precise Input.mousePosition in Unity3D with Input.mousePosition.x.ToString("F4");, and in the editor everything works fine: I get the 4 decimal places exactly how I need them. However, after I build the app, I get x.0000 all the time.
What's important is that, in the editor, I only get non-zero decimals when I choose an exact resolution rather than an aspect ratio or Free Aspect. I've tried playing around with the build resolution settings, for example disabling all aspect ratios, setting an exact resolution, and windowed/fullscreen mode. Nothing has helped so far.
It seems like the built app somehow disables in-between pixel positions, and I have no idea why.
I know there are plenty of questions and info on Vector3 decimals, but my problem is mostly about display resolution (in-between pixel coordinates) in Unity. Thanks in advance for any advice.

Zooming out on terrain mesh causes colour distortion: Unity

I'm creating a procedural terrain in Unity with hexagons. When the camera is fairly close, the textures and colours work perfectly fine, but as soon as I zoom out I get this.
Map view from an angle:
As you can see, I get these weird dots that become more exaggerated the further I zoom out.
When I'm up close I don't get this, and the hexes blend together very well.
The textures blend very nicely up close and you can't see any boundaries. But as you go further out, dots start appearing, and even further out the colours get completely distorted.
Does anyone know what's causing it and how it could be fixed? I'm sorry, I'm really new to Unity and this is my first Unity project.
Also, I presume this isn't due to texture blending at the boundaries, because when I set the colour of the hexagons to be the same, I still get the effect (for example the sea in the pic below).
Same-colour hexes still show the problem.
And when the camera gets further out, the problem pops up.
Any help would be greatly appreciated! :)
Hmm, these might be artifacts caused by mipmapping (the lower-resolution versions of a texture that are used as you zoom out). Try disabling mipmaps on your textures; in Unity this is a toggle in each texture's import settings. It will negatively affect your game's performance, but it might fix the problem.

How to read the location of a view whilst it is being animated

I am animating a small spaceship (derived from UIView) and periodically (whilst it is animating) sending it a PointF to check whether that point is near the spaceship's current position.
However, when reading out the Frame position of the view, I keep getting the starting position from before the animation started.
I think this is by design, but it is causing me big problems, since the spaceship(s) should move independently along paths, and doing this by hand is very tricky for me.
Is there another way, and/or does anyone have some sample code?
Not sure of a workaround for your issue, but I have some suggestions on game development for iOS.
Your problem is one of the reasons why using GUI frameworks like UIKit/CoreGraphics for games isn't a good idea: both for performance reasons, and because they simply aren't designed for it.
If you are looking for a simple framework for making games on iOS, have you looked at MonoGame? If you are doing lots of animations, we also use XNA Tweener along with MonoGame to get some lifelike animations.
PS - check out our game here.

Touch Screen Running Windows CE

I'm starting my first project that runs on a 7 inch touch screen running Windows CE 6.0 (and NETCF 3.5).
The touch screen doesn't respond to touch too well when I use my finger. The only way for me to navigate around is by using a stylus (or similar).
Since I've never worked with Windows CE or a resistive touch screen, I'm not sure whether I should expect to be able to use my finger, or whether the stylus is essentially the only way to navigate effectively. Or maybe I just have a touch screen that isn't very good.
If you have experience with WinCE running on a touch screen, do you find that a stylus is the only way to go?
A resistive touchscreen can certainly register a finger - I've even configured them for gloved hands. It sounds like the touchscreen driver is tossing out the data samples it's getting from the panel, and the key for you is going to be figuring out why.
In my experience there are two primary reasons for your samples to be tossed out:
1. The driver has been configured with too tight a tolerance.
Sensitivity is often a configurable item, maybe through a recompile of the OS, maybe through the registry; it depends on how your OEM implemented it. Check with the OEM and see if you can adjust it.
2. The panel has too much noise, causing your samples to be discarded.
This one is easy to check. Drag a selection rectangle on the desktop with a stylus and hold the end point down (don't lift the stylus). Is it steady, or does it "wiggle" a lot around the final point? If so, you have noise. Grounding the panel usually helps, but it could be a hardware issue. I've done rolling-average work in touch panel drivers to help smooth this out (a rough sketch of the idea follows below), but you then have to fight hysteresis.
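For what it's worth, here is a minimal sketch of that rolling-average idea, written in Python for readability (a real Windows CE touch driver would do this in C, and the window size is an arbitrary assumption):

```python
from collections import deque

class TouchSmoother:
    """Rolling average over recent (x, y) touch samples.

    A larger window smooths more noise but adds lag, which is the
    hysteresis trade-off mentioned above.
    """

    def __init__(self, window_size=4):
        self.samples = deque(maxlen=window_size)

    def add_sample(self, x, y):
        """Feed one raw sample; return the smoothed position."""
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(s[0] for s in self.samples) / n,
                sum(s[1] for s in self.samples) / n)

    def pen_up(self):
        """Reset on pen-up so a new touch doesn't average against the old one."""
        self.samples.clear()
```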
Be aware that larger touchscreens have different resistivity properties than smaller ones, so if you just swapped panels from small to large, it's quite possible that the output range difference of the panels is not workable with the current driver settings. Again, some OEMs provide the ability to adjust these settings.
So can it work with a finger? There's nothing in the physics that would prevent using a finger; in fact, if you can't use a finger, something is wrong. Will it work in reality? Check with your OEM.

How to avoid tearing with pygame on Linux/X11

I've been playing with pygame (on Debian/Lenny).
It seems to work nicely, except for annoying tearing of blits (in both fullscreen and windowed mode).
I'm using the default SDL X11 driver. Googling suggests that it's a known issue with SDL that X11 provides no vsync facility (even with a display created with FULLSCREEN|DOUBLEBUF|HWSURFACE flags), and I should use the "dga" driver instead.
However, running
SDL_VIDEODRIVER=dga ./mygame.py
throws an error during pygame initialisation:
pygame.error: No available video device
(despite xdpyinfo showing an XFree86-DGA extension present).
So: what's the trick to getting tear-free vsynced flips? Either by getting this dga thing working, or through some other mechanism?
The best way to keep tearing to a minimum is to keep your frame rate as close to the screen's refresh rate as possible. The SDL library doesn't give you vsync unless you're running OpenGL through it, so the only way is to approximate the frame rate yourself (see the sketch below).
The SDL hardware double buffer isn't guaranteed, although it's nice when it works; I've seldom seen it in action.
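A minimal pygame sketch of that frame-rate approximation, assuming a 60 Hz display (match the number to your own screen):

```python
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    screen.fill((0, 0, 0))
    # ... draw the frame here ...
    pygame.display.flip()

    clock.tick(60)  # sleeps just enough to hold the loop near 60 fps

pygame.quit()
```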
In my experience with SDL you have to use OpenGL to completely eliminate tearing. It's a bit of an adjustment, but drawing simple 2D textures isn't all that complicated, and you get a few added bonuses you can implement, like rotation, scaling, blending and so on.
However, if you still want to use software rendering, I'd recommend dirty rectangle updating. It's also a bit difficult to get used to, but it saves loads of processing, which may make it easier to keep the updates up to pace, and it stops the whole screen from tearing (unless you're scrolling the entire play area or something). It also keeps the time spent drawing to the buffer to a minimum, which helps avoid blitting while the screen is refreshing; that overlap is the cause of the tearing.
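A minimal pygame sketch of dirty-rectangle updating (the moving square and its speed are made up for illustration):

```python
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
background = pygame.Surface(screen.get_size())
background.fill((0, 0, 0))
screen.blit(background, (0, 0))
pygame.display.flip()

ball = pygame.Surface((32, 32))
ball.fill((255, 255, 255))
ball_rect = ball.get_rect(topleft=(0, 224))

clock = pygame.time.Clock()
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    old_rect = ball_rect.copy()
    ball_rect.move_ip(2, 0)                      # move the sprite

    screen.blit(background, old_rect, old_rect)  # erase the old position
    screen.blit(ball, ball_rect)                 # draw the new position

    # Push only the two changed regions to the screen, not the whole frame.
    pygame.display.update([old_rect, ball_rect])
    clock.tick(60)

pygame.quit()
```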
Well, my eventual solution was to switch to Pyglet, which seems to support OpenGL much better than Pygame, and doesn't have any flicker problems.
Use the SCALED flag and vsync=True when calling set_mode and you should be all set, at least on any system that actually supports this (in some scenarios SDL still can't give you a vsync-capable surface, but those are increasingly rare).
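A minimal sketch of that call for pygame 2 (note that vsync is only a request; if the system can't provide it, pygame silently falls back to an unsynced surface):

```python
import pygame

pygame.init()
# SCALED routes rendering through an accelerated backend; vsync=1 asks
# for vertical sync (the vsync=True mentioned above is equivalent).
screen = pygame.display.set_mode((640, 480), pygame.SCALED, vsync=1)

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    # ... draw the frame ...
    pygame.display.flip()  # with vsync active, flip() waits for the refresh

pygame.quit()
```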
