Cocos2d ColorLayer makes half of the screen go black

I have a standard scene with the init method as follows:
@interface CommodoreScene : CCLayerColor
@end

@implementation CommodoreScene
- (id) init {
    if ((self = [super initWithColor:ccc4(255, 255, 255, 255)])) {
    }
    return self;
}
@end
However, when I run the scene on my iPhone 4 or the simulator, half the screen goes black and the other half shows my layer color (white).
I'm currently running the scene with [_director pushScene:[CommodoreScene node]]; and running cocos2d-iphone 2.0.0.

- (id) init {
    if ((self = [super initWithColor:ccc4(255, 255, 255, 255) width:480 height:320])) {
    }
    return self;
}
Try that. The important part is of course the change in the call to super. You might still have an issue with [[CCDirector sharedDirector] winSize] reading properly in the -init method if you're in landscape (which, judging from the bug you're experiencing, you are); at least I did in my tests. As I recall, this is a known bug of some kind: it seems to read the winSize as portrait instead of landscape. You can fix this by doing the following:
CGRect rect = CGRectMake(0, 0, 480, 320);
CGSize size = rect.size;
// position the label on the center of the screen
label.position = ccp( size.width /2 , size.height/2 );
Using that newly created size instead of winSize in your -init method fixes the issue. Notice that if you create and position an element after initialization, such as in onEnter, this problem goes away and the winSize from the director reads properly. Odd bug, it is.

You can change the content size of your color layer:
CGSize boxSize = CGSizeMake(sGame.winWidth, sGame.winHeight);
CCLayerColor *box = [CCLayerColor layerWithColor:ccc4(155,155,200,255)];
box.position = ccp(0.0f, 0.0f);
box.contentSize = boxSize;
[self addChild:box z:-6];

Related

Direct3D Window->Bounds.Width/Height differs from real resolution

I noticed a strange behaviour with Direct3D while doing this tutorial.
The dimensions I am getting from the Window object differ from the configured resolution of Windows. There I set 1920×1080, but the width and height from the Window object are 1371×771.
CoreWindow^ Window = CoreWindow::GetForCurrentThread();
// set the viewport
D3D11_VIEWPORT viewport = { 0 };
viewport.TopLeftX = 0;
viewport.TopLeftY = 0;
viewport.Width = Window->Bounds.Width; //should be 1920, actually is 1371
viewport.Height = Window->Bounds.Height; //should be 1080, actually is 771
I am developing on an Alienware 14; maybe this causes the problem, but I could not find any answers yet.
CoreWindow sizes, pointer locations, etc. are not expressed in pixels. They are expressed in Device Independent Pixels (DIPS). To convert to/from pixels you need to use the Dots Per Inch (DPI) value.
inline int ConvertDipsToPixels(float dips) const
{
    return int(dips * m_DPI / 96.f + 0.5f);
}

inline float ConvertPixelsToDips(int pixels) const
{
    return (float(pixels) * 96.f / m_DPI);
}
m_DPI comes from DisplayInformation::GetForCurrentView()->LogicalDpi and you get the DpiChanged event when and if it changes.
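For example, here is a minimal sketch of those helpers in use, assuming a C++/CX UWP app where m_DPI is cached as described above (the 140% display scaling in the comments is an assumption that would explain the question's numbers, not something stated in the question):
using namespace Windows::Graphics::Display;
using namespace Windows::UI::Core;

// Cache the DPI; handle DisplayInformation::DpiChanged to keep it current.
float m_DPI = DisplayInformation::GetForCurrentView()->LogicalDpi; // ~134.4 at 140% scaling
CoreWindow^ window = CoreWindow::GetForCurrentThread();

// Bounds are in DIPs; convert to physical pixels before sizing the viewport.
int pixelWidth  = ConvertDipsToPixels(window->Bounds.Width);  // 1371 DIPs -> ~1920 px
int pixelHeight = ConvertDipsToPixels(window->Bounds.Height); //  771 DIPs -> ~1080 px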
See DPI and Device-Independent Pixels for more details.
You should take a look at the Direct3D UWP Game templates on GitHub, and check out how this is handled in Main.cpp.

LWJGL Fullscreen while keeping aspect ratio?

I want to have a fullscreen mode that keeps the aspect ratio by adding black bars on either side. I tried just creating a display mode, but I can't make it fullscreen unless it's a pre-approved resolution, and when I use a bigger display than the native resolution the pixels become messed up, and lines appear between all of the tiles in the game for some reason.
I think I need to use FBOs to render the scene to a texture instead of the window, and then use a fullscreen-approved resolution and render the texture properly stretched out in the center of the screen, but I just don't understand how to render to a texture in order to do that, or how to stretch an image. Could someone please help me?
EDIT
I got fullscreen working, but it makes everything look broken. There are random lines on the edges of anything that's written to the window. There are no glitchy lines when it's in native resolution, though. Here's my code:
Display.setTitle("Mega Man");
try {
    Display.setDisplayMode(Display.getDesktopDisplayMode());
    Display.create();
} catch (LWJGLException e) {
    e.printStackTrace();
}
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, WIDTH, HEIGHT, 0, 1, -1);
glMatrixMode(GL_MODELVIEW);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
try { Display.setFullscreen(true); } catch (Exception e) {}
int sh = Display.getHeight();
int sw = WIDTH * sh / HEIGHT;
GL11.glViewport(Display.getWidth() / 2 - sw / 2, 0, sw, sh);
Screenshot of the glitchy fullscreen here: http://sta.sh/021fohgnmxwa
EDIT
Here is the texture rendering code that I use to draw everything:
public static void DrawQuadTex(Texture tex, int x, int y, float width, float height,
        float texWidth, float texHeight, float subx, float suby, float subd, String mirror) {
    if (tex == null) { return; }
    if (mirror == null) { mirror = ""; }
    // subx, suby, and subd grab sprites from a sprite sheet. subd is both the
    // width and the height of the sprite, as only square power-of-two images
    // are displayed properly.
    int xinner = 0;
    int xouter = (int) width;
    int yinner = 0;
    int youter = (int) height;
    if (mirror.indexOf("h") > -1) {
        xinner = xouter;
        xouter = 0;
    }
    if (mirror.indexOf("v") > -1) {
        yinner = youter;
        youter = 0;
    }
    tex.bind();
    glTranslatef(x, y, 0);
    glBegin(GL_QUADS);
    glTexCoord2f(subx / texWidth, suby / texHeight);
    glVertex2f(xinner, yinner);
    glTexCoord2f((subx + subd) / texWidth, suby / texHeight);
    glVertex2f(xouter, yinner);
    glTexCoord2f((subx + subd) / texWidth, (suby + subd) / texHeight);
    glVertex2f(xouter, youter);
    glTexCoord2f(subx / texWidth, (suby + subd) / texHeight);
    glVertex2f(xinner, youter);
    glEnd();
    glLoadIdentity();
}
Just to keep it clean, I'm giving you a real answer and not just a comment.
The aspect ratio problem can be solved with the help of glViewport. Using this method you can decide which area of the surface will be rendered to. The default viewport covers the whole surface.
Since the second problem with the corrupt rendering (also described here: https://stackoverflow.com/questions/28846531/sprite-game-in-full-screen-aliasing-issue) appeared after changing the viewport, I will give my thoughts on it in this answer as well.
Without knowing exactly how the rendering code for the tile background looks, I would guess that the problem is due to a difference in resolution between the glViewport and glOrtho calls.
Example: if the glOrtho resolution is half the viewport resolution, then each OpenGL unit is actually 2 pixels. If you then render a tile between x=0 and x=9 and the next one between x=10 and x=19, you will get an empty space between them.
To solve this you can change the resolution so that the two match (see the sketch below). Or you can render the tiles to overlap: the first one x=0 to x=10, the second one x=10 to x=20, and so on.
Without seeing the tile rendering code I can't verify that this is the problem, though.
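To illustrate the first fix, here is a minimal sketch in C-style GL calls mirroring the question's code. displayWidth and displayHeight are hypothetical names for the fullscreen mode's dimensions; WIDTH and HEIGHT are the game's native resolution as in the question:
/* Letterbox as before, but give glOrtho the same resolution as the
 * viewport so one GL unit is exactly one pixel; the game then scales
 * its own coordinates by sh/HEIGHT instead of letting the projection
 * stretch them across partial pixels. */
int sh = displayHeight;                         /* hypothetical: fullscreen height */
int sw = WIDTH * sh / HEIGHT;                   /* width preserving the aspect ratio */
glViewport((displayWidth - sw) / 2, 0, sw, sh); /* centered, black bars on the sides */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, sw, sh, 0, 1, -1);                   /* matches the viewport, not WIDTH/HEIGHT */
glMatrixMode(GL_MODELVIEW);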

XLib Window background has colours inverted

I'm almost there with my little "window background from PNG image" project in Linux. I use the pure X11 API and the minimal LodePNG to load the image. The problem is that the background is the negative of the original PNG image, and I don't know what could be the problem.
This is basically the code that loads the image then creates the pixmap and applies the background to the window:
// required headers
// global variables
Display *display;
Window window;
int window_width = 600;
int window_height = 400;
// main entry point
// load the image with lodePNG (I didn't modify its code)
vector<unsigned char> image;
unsigned width, height;
//decode
unsigned error = lodepng::decode(image, width, height, "bg.png");
if(!error)
{
// And here is where I apply the image to the background
Screen* screen = NULL;
screen = DefaultScreenOfDisplay(display);
// Creating the pixmap
Pixmap pixmap = XCreatePixmap(
display,
XDefaultRootWindow(display),
width,
height,
DefaultDepth(display, 0)
);
// Creating the graphics context
XGCValues gr_values;
gr_values.function = GXcopy;
gr_values.background = WhitePixelOfScreen(screen); // takes a Screen*, not a Display*
GC gr_context = XCreateGC(display, window, GCFunction | GCBackground, &gr_values); // gr_context is used below but was never created
// Creating the image from the decoded PNG image
XImage *ximage = XCreateImage(
display,
CopyFromParent,
DisplayPlanes(display, 0),
ZPixmap,
0,
(char*)image.data(), // &image is the vector object itself, not the pixel data
width,
height,
32,
4 * width
);
// Place the image into the pixmap
XPutImage(
display,
pixmap,
gr_context,
ximage,
0, 0,
0, 0,
window_width,
window_height
);
// Set the window background
XSetWindowBackgroundPixmap(display, window, pixmap);
// Free up used resources
XFreePixmap(display, pixmap);
XFreeGC(display, gr_context);
}
The image is decoded (and there's the possibility that it's badly decoded) and then applied to the background but, as I said, the image colors are inverted and I don't know why.
MORE INFO
After decoding, I encoded the same image back into a PNG file, which is identical to the original, so it looks like the problem is not related to LodePNG but to the way I use Xlib to place the image on the window.
EVEN MORE INFO
Now I compared the inverted image with the original one and found out that somewhere in my code RGB is converted to BGR. If a pixel in the original image is (95, 102, 119), in the inverted one it is (119, 102, 95).
I found the solution here. I am not sure if it is the best way, but it is certainly the simplest.
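For reference, since the linked fix isn't quoted here: LodePNG decodes to RGBA byte order, while a 32-bit ZPixmap on a little-endian X server effectively expects BGRA in memory, so swapping the red and blue bytes before the XCreateImage call fixes the channel order. A minimal sketch along those lines (std::swap needs <algorithm> or <utility>):
// Swap R and B in the decoded RGBA buffer so it matches the ZPixmap layout.
for (size_t i = 0; i + 3 < image.size(); i += 4)
    std::swap(image[i], image[i + 2]); // R <-> B; G and A stay in place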

Using DirectX 9; Sprite Anti-Aliasing Halo

So I am coding in DirectX 9, and whenever I place a sprite inside of a 2D world, a white-colored "halo" appears around the sprite image. I am using PNGs, and the background behind the sprite is transparent. I have tried using a pink background as well. It seems that the halo only appears on straight lines of pixels, but only on some edges. Any help is greatly appreciated!
m_d3d = Direct3DCreate9(D3D_SDK_VERSION); // create the Direct3D interface
D3DPRESENT_PARAMETERS d3dpp; // create a struct to hold various device information
ZeroMemory(&d3dpp, sizeof(d3dpp)); // clear out the struct for use
d3dpp.Windowed = windowed; // is program fullscreen, not windowed?
d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD; // discard old frames
d3dpp.hDeviceWindow = hWnd; // set the window to be used by Direct3D
d3dpp.BackBufferFormat = D3DFMT_X8R8G8B8; // set the back buffer format to 32-bit
d3dpp.BackBufferWidth = screenWidth; // set the width of the buffer
d3dpp.BackBufferHeight = screenHeight; // set the height of the buffer
d3dpp.EnableAutoDepthStencil = TRUE; // automatically run the z-buffer for us
d3dpp.AutoDepthStencilFormat = D3DFMT_D16; // 16-bit pixel format for the z-buffer
// create a device class using this information and the info from the d3dpp struct
m_d3d->CreateDevice(D3DADAPTER_DEFAULT,
D3DDEVTYPE_HAL,
hWnd,
D3DCREATE_SOFTWARE_VERTEXPROCESSING,
&d3dpp,
&m_d3ddev);
D3DXCreateSprite(m_d3ddev, &m_d3dspt); // create the Direct3D Sprite object
LPDIRECT3DTEXTURE9 texture;
D3DXCreateTextureFromFileEx(m_d3ddev, "DC.png", D3DX_DEFAULT, D3DX_DEFAULT,
D3DX_DEFAULT, NULL, D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, D3DX_DEFAULT,
D3DX_DEFAULT, D3DCOLOR_XRGB(255, 0, 255), NULL, NULL, &texture);
m_d3ddev->BeginScene();
m_d3dspt->Begin(D3DXSPRITE_ALPHABLEND); // begin sprite drawing with transparency
D3DXVECTOR3 center(0.0f, 0.0f, 0.0f), position((appropriate x), (appropriate y), 1);
m_d3dspt->Draw(texture, NULL, &center, &position, D3DCOLOR_ARGB(255, 255, 255, 255));
m_d3dspt->End(); // end sprite drawing
m_d3ddev->EndScene();
m_d3ddev->Present(NULL, NULL, NULL, NULL);
Thanks
Peter
This occurs when you screw up your texture coordinates from sprite atlasing and accidentally run off the texture or onto another texture.
Most commonly, anyway, AFAIK.
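If that is what's happening, the usual cure is to pass ID3DXSprite::Draw a source rectangle that stays inside the sprite's cell, so the bilinear filter can never sample a neighbouring cell or the atlas border. A minimal sketch, where spriteX/spriteY/spriteW/spriteH are hypothetical atlas coordinates (not from the question's code):
// Inset the source RECT by one texel on each side.
RECT src;
src.left   = spriteX + 1;
src.top    = spriteY + 1;
src.right  = spriteX + spriteW - 1;
src.bottom = spriteY + spriteH - 1;
m_d3dspt->Draw(texture, &src, &center, &position, D3DCOLOR_ARGB(255, 255, 255, 255));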

Object disposed exception to be thrown

I have recently integrated a HUD method into my XNA game project, and when the method is called by the main Draw method it throws an object disposed exception. This has something to do with the two DrawString calls used in the program.
The exception is thrown at spriteBatch.End() and says Cannot access a disposed object.
Object name: 'Texture2D'.
// declaration of the sprite batch
private SpriteBatch spriteBatch;

// game draw method
public override void Draw(GameTime gameTime)
{
    ScreenManager.GraphicsDevice.Clear(Color.CornflowerBlue);
    // Our player and enemy are both actually just text strings.
    spriteBatch = ScreenManager.SpriteBatch;
    tileMap.Draw(spriteBatch, camera);
    spriteBatch.Begin(SpriteSortMode.Deferred,
                      BlendState.AlphaBlend,
                      null, null, null, null,
                      camera.TransformMatrix);
    DrawHud();
    level.Draw(gameTime, spriteBatch);
    spriteBatch.End();
    // If the game is transitioning on or off, fade it out to black.
    if (TransitionPosition > 0 || pauseAlpha > 0)
    {
        float alpha = MathHelper.Lerp(1f - TransitionAlpha, 1f, pauseAlpha / 2);
        ScreenManager.FadeBackBufferToBlack(alpha);
    }
    base.Draw(gameTime);
}
the HUD method
private void DrawHud()
{
    Rectangle titleSafeArea = ScreenManager.GraphicsDevice.Viewport.TitleSafeArea;
    Vector2 hudLocation = new Vector2(titleSafeArea.X + camera.Position.X,
                                      titleSafeArea.Y + camera.Position.Y);
    Vector2 center = new Vector2(titleSafeArea.Width + camera.Position.X / 2.0f,
                                 titleSafeArea.Height + camera.Position.Y / 2.0f);
    // Draw time remaining. Uses modulo division to cause blinking when the
    // player is running out of time.
    string timeString = "TIME: " + level.TimeRemaining.Minutes.ToString("00") +
                        ":" + level.TimeRemaining.Seconds.ToString("00");
    Color timeColor;
    if (level.TimeRemaining > WarningTime ||
        level.ReachedExit ||
        (int)level.TimeRemaining.TotalSeconds % 2 == 0)
    {
        timeColor = Color.Yellow;
    }
    else
    {
        timeColor = Color.Red;
    }
    DrawShadowedString(hudFont, timeString, hudLocation, timeColor);
    // Draw score
    float timeHeight = hudFont.MeasureString(timeString).Y;
    DrawShadowedString(hudFont, "SCORE: " + level.Score.ToString(),
                       hudLocation + new Vector2(0.0f, timeHeight * 1.2f), Color.Yellow);
}

// method which draws the score and the time (and is causing the problem)
private void DrawShadowedString(SpriteFont font, string value, Vector2 position, Color color)
{
    spriteBatch.DrawString(font, value, position + new Vector2(1.0f, 1.0f), Color.Black);
    spriteBatch.DrawString(font, value, position, color);
}
As the exception says, the problem exists because one of the Texture2Ds you are using is being disposed before you are using it.
There are two things in the XNA API (that come to mind) that will dispose of a Texture2D: The ContentManager.Unload() method for any textures loaded by that content manager, and the Texture2D.Dispose() method. So check if your own code is calling one of these two functions at any point.
The exception will only be thrown when the Texture2D instance is "used". Because SpriteBatch batches together texture draws, the texture doesn't actually get used until you end the SpriteBatch (at which point it draws everything in one go). If you change to SpriteSortMode.Immediate, SpriteBatch will stop batching sprites and will instead draw them "immediately" when you ask it to. This will cause the texture to be used, and the exception to be thrown, at a Draw call instead of an End call, which should make it easier to identify which texture is being disposed of while still in use.
The code you have posted seems to be fine, I suspect the problem exists elsewhere in your code. The above information should help you identify where the problem is.
My guess is that something is happening in level.Draw that is disposing of a texture somewhere. It doesn't look like the DrawHud method in particular is responsible.
You mention, though, that you are sure it's caused by the DrawString methods... if you comment those two out in particular, does the error go away?
