Problems implementing perspective projection in 3D graphics (Direct3D)

I'm having massive difficulties setting up a perspective projection with DirectX.
I've been stuck on this for weeks; any help would be appreciated.
As far as I can tell my pipeline setup is fine, in that the shader is getting exactly the data I am packing into it, so I'll forgo including that code unless I'm asked for it.
I am using glm for maths (therefore column-major matrices) on the host side, set to use a left-handed coordinate system. However, I couldn't seem to get its projection matrix to work
(I get a blank screen if I use the matrix I get from it), so I wrote this, which I hope is only temporary:
namespace
{
    glm::mat4 perspective()
    {
        DirectX::XMMATRIX dx = DirectX::XMMatrixPerspectiveFovLH(glm::radians(60.0f), 1, 0.01, 1000.0f);
        glm::mat4 glfromdx;
        for (int c = 0; c < 3; c++)
        {
            for (int r = 0; r < 3; r++)
            {
                glfromdx[c][r] = DirectX::XMVectorGetByIndex(dx.r[c], r);
            }
        }
        return glfromdx;
    }
}
This does hand me a projection matrix that works, but I don't really think it is working correctly. I'm not certain whether
it's an actual projection, and my clip space is minuscule: if I move the camera around at all, geometry clips. Photos included.
photo 1: (straight on)
photo 2: (cam tilted up)
photo 3: (cam tilted left)
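For reference, a note on the conversion function above: the copy loops stop at index 3, so the fourth row and fourth column of the DirectX matrix, which is where XMMatrixPerspectiveFovLH places its perspective terms, are never written into glfromdx. A sketch of the full element-by-element copy, for comparison only (same index pattern as above, just covering all sixteen elements):
// Sketch only: same copy as above, extended to all four rows and columns,
// so the perspective terms in the fourth row/column are carried across.
glm::mat4 glfromdx(1.0f);
for (int c = 0; c < 4; c++)
{
    for (int r = 0; r < 4; r++)
    {
        glfromdx[c][r] = DirectX::XMVectorGetByIndex(dx.r[c], r);
    }
}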
Here is my vertex shader
https://gist.github.com/anonymous/31726881a5476e6c8513573909bf4dc6
cbuffer offsets
{
    float4x4 model; // <------ currently just identity
    float4x4 view;
    float4x4 proj;
}

struct idata
{
    float3 position : POSITION;
    float4 colour : COLOR;
};

struct odata
{
    float4 position : SV_POSITION;
    float4 colour : COLOR;
};

void main(in idata dat, out odata odat)
{
    odat.colour = dat.colour;
    odat.position = float4(dat.position, 1.0f);
    // point' = point * model * view * projection
    float4x4 mvp = mul(model, mul(view, proj));
    // column matrices
    odat.position = mul(mvp, odat.position);
}
And here is my fragment shader
https://gist.github.com/anonymous/342a707e603b9034deb65eb2c2101e91
struct idata
{
    float4 position : SV_POSITION;
    float4 colour : COLOR;
};

float4 main(in idata data) : SV_TARGET
{
    return float4(data.colour[0], data.colour[1], data.colour[2], 1.0f);
}
The full camera implementation is here:
https://gist.github.com/anonymous/b15de44f08852cbf4fa4d6b9fafa0d43
https://gist.github.com/anonymous/8702429212c411ddb04f5163b1d885b9
Finally, these are the vertices being passed in (3 position floats followed by 4 colour floats per vertex):
(0.0f, 0.5f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f),
(0.45f, -0.5f, 0.0f, 0.0f, 1.0f, 0.0f, 1.0f),
(-0.45f, -0.5f, 0.0f, 0.0f, 0.0f, 1.0f, 1.0f)
Edit:
I changed the MVP multiplication in the shader to this:
odat.position = mul(mul(mul(proj, view), model), odat.position);
I'm still having the same issue.
Edit 2:
Hmm, combining the change above with a change to how the perspective matrix is generated, now purely using glm:
glm::mat4 GLtoDX_NDC(glm::translate(glm::vec3(0.0, 0.0, 0.5)) *
glm::scale(glm::vec3(1.0, 1.0, 0.5)));
glm::mat4 gl = GLtoDX_NDC * glm::perspectiveFovLH(60.0f, 500.0f, 500.0f, 100.0f, 0.1f);
This works, in so far as objects no longer cull at about a micrometre and are projected to roughly their expected sizes. However, now everything is upside down, and rotating causes a shear...
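For reference, glm can also be configured to produce D3D-style zero-to-one depth clip space directly, which avoids the hand-written NDC remap. A minimal sketch, assuming a glm version that provides these configuration macros (and noting that glm's perspective functions expect the field of view in radians):
// Sketch, not the code above: configure glm for a left-handed,
// zero-to-one depth clip space before including it, then build the
// projection with the field of view given in radians.
#define GLM_FORCE_DEPTH_ZERO_TO_ONE
#define GLM_FORCE_LEFT_HANDED
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::mat4 makeProjection()
{
    // fov in radians, 500x500 viewport, near = 0.1, far = 100 (illustrative values)
    return glm::perspectiveFov(glm::radians(60.0f), 500.0f, 500.0f, 0.1f, 100.0f);
}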

Related

How to do static environment mapping in DirectX 11?

I have the .dds cube map texture loaded in and put on the sphere model. Obviously it doesn't look right, because I would have to map the texture to the right points on the sphere.
How can I map the texture to the right points on the sphere?
Something like this:
TextureCube TEXTURE_REFLECTION;
//...
VS_OUTPUT vs(VS_INPUT IN) {
    //...
    float4 worldPosition = mul(float4(IN.Position.xyz, 1.0f), IN.World);
    OUT.worldpos = worldPosition.xyz;
    //...
}
PS_OUTPUT ps(VS_OUTPUT IN) {
    //...
    float4 ColorTex = TEXTURE_DIFFUSE.Sample(SAMPLER_DEFAULT, IN.TexCoord);
    float3 normalFromMap = mul(2.0f * TEXTURE_NORMAL.Sample(SAMPLER_DEFAULT, IN.TexCoord).xyz - 1.0f, IN.tangentToWorld);
    //...
    float3 incident = -normalize(CAMERA_POSITION - IN.worldpos);
    float3 reflectionVector = reflect(incident, normalFromMap);
    ColorTex.rgb = lerp(ColorTex.rgb, TEXTURE_REFLECTION.Sample(SAMPLER_DEFAULT, reflectionVector).rgb, MATERIAL_REFLECTIVITY);
}
I figured it out.
Pixel shader:
TextureCube CubeMap : register(t0);
SamplerState TexSampler : register(s0);

float4 main(LightingPixelShaderInput input) : SV_Target
{
    float4 cubeTexture = CubeMap.Sample(TexSampler, input.worldNormal);

    // light calculations
    float3 finalColour = (gAmbientColour + diffuseLights) * cubeTexture.rgb +
                         (specularLights) * cubeTexture.a;

    return float4(finalColour, 1.0f);
}
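As a side note on the first attempt above: the reflect() intrinsic is just the standard reflection formula R = I - 2(N.I)N. A small host-side sketch of the same math, for illustration only (the Float3 type here is made up for the example):
// Illustration only: what reflect(incident, normal) computes,
// written out with a simple 3-component vector type.
struct Float3 { float x, y, z; };

static float Dot(const Float3& a, const Float3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// R = I - 2 * dot(N, I) * N  (the normal is assumed to be normalised)
static Float3 Reflect(const Float3& i, const Float3& n)
{
    const float d = 2.0f * Dot(n, i);
    return { i.x - d * n.x, i.y - d * n.y, i.z - d * n.z };
}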

hlsl nointerpolation behaviour

For an experiment I want to pass all three vertex positions of a triangle to the pixel shader and manually reconstruct the pixel position from them and the barycentric coordinates.
In order to do this I wrote a geometry shader that passes all three positions of my triangle to the pixel shader:
struct GeoIn
{
    float4 projPos : SV_POSITION;
    float3 position : POSITION;
};

struct GeoOut
{
    float4 projPos : SV_POSITION;
    float3 position : POSITION;
    float3 p[3] : TRIPOS;
};

[maxvertexcount(3)]
void main(triangle GeoIn i[3], inout TriangleStream<GeoOut> OutputStream)
{
    GeoOut o;
    // add triangle data
    for (uint idx = 0; idx < 3; ++idx)
    {
        o.p[idx] = i[idx].position;
    }
    // generate vertices
    for (idx = 0; idx < 3; ++idx)
    {
        o.projPos = i[idx].projPos;
        o.position = i[idx].position;
        OutputStream.Append(o);
    }
    OutputStream.RestartStrip();
}
The pixel shader outputs the manually reconstructed position:
struct PixelIn
{
    float4 projPos : SV_POSITION;
    float3 position : POSITION;
    float3 p[3] : TRIPOS;
    float3 bayr : SV_Barycentrics;
};

float4 main(PixelIn i) : SV_TARGET
{
    float3 pos = i.bayr.x * i.p[0] + i.bayr.y * i.p[1] + i.bayr.z * i.p[2];
    return float4(abs(pos), 1.0);
}
And I get the following (expected) result:
However, when I modify my PixelIn struct by adding nointerpolation to p[3]:
struct PixelIn
{
    ...
    nointerpolation float3 p[3] : TRIPOS;
};
I get:
I did not expect a different result, because I am not changing the values of p[] within a single triangle in the geometry shader. I tried debugging it by changing the output to float4(abs(i.p[0]), 1.0); with and without the modifier. Without nointerpolation the values of p[] do not vary within a triangle (which makes sense, because all three vertices carry the same value). With nointerpolation the values of p[] do change slightly. Why is that the case? I thought nointerpolation was not supposed to interpolate anything.
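To make the expected arithmetic explicit, the reconstruction in the pixel shader is a plain barycentric combination of the three stored positions. A CPU-side sketch of the same computation, for illustration only (Vec3 is a made-up type for the example):
// Illustration only: the same combination the pixel shader performs.
// When p0 == p1 == p2, the result equals that shared value scaled by
// (b.x + b.y + b.z), so it reproduces the input only up to
// floating-point rounding.
struct Vec3 { float x, y, z; };

static Vec3 BarycentricCombine(const Vec3& b,
                               const Vec3& p0, const Vec3& p1, const Vec3& p2)
{
    return { b.x * p0.x + b.y * p1.x + b.z * p2.x,
             b.x * p0.y + b.y * p1.y + b.z * p2.y,
             b.x * p0.z + b.y * p1.z + b.z * p2.z };
}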
Edit:
This is the wireframe of my geometry:

3d model not rendered (DirectX 12)

I am developing a small program that loads 3D models using Assimp, but it does not render the model. At first I thought that the vertices and indices were not loaded correctly, but this is not the case (I printed the vertices and indices to a txt file). I think the problem might be with the position of the model and camera. The application does not return any error; it runs properly.
Vertex Struct:
struct Vertex {
    XMFLOAT3 position;
    XMFLOAT2 texture;
    XMFLOAT3 normal;
};
Input layout:
D3D12_INPUT_ELEMENT_DESC inputLayout[] =
{
{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D12_INPUT_CLASSIFICATION_PER_VERTEX_DATA, 0 },
{ "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0, 12, D3D12_INPUT_CLASSIFICATION_PER_VERTEX_DATA, 0 },
{ "NORMAL", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D12_APPEND_ALIGNED_ELEMENT, D3D12_INPUT_CLASSIFICATION_PER_VERTEX_DATA, 0 }
};
Vertices, texcoords, normals and indices loader:
model = new ModelMesh();
std::vector<XMFLOAT3> positions;
std::vector<XMFLOAT3> normals;
std::vector<XMFLOAT2> texCoords;
std::vector<unsigned int> indices;
model->LoadMesh("beast.x", positions, normals, texCoords, indices);

// Create vertex buffer
if (positions.size() == 0)
{
    MessageBox(0, L"Vertices vector is empty.", L"Error", MB_OK);
}

Vertex* vList = new Vertex[positions.size()];
for (size_t i = 0; i < positions.size(); i++)
{
    Vertex vert;
    XMFLOAT3 pos = positions[i];
    vert.position = XMFLOAT3(pos.x, pos.y, pos.z);
    XMFLOAT3 norm = normals[i];
    vert.normal = XMFLOAT3(norm.x, norm.y, norm.z);
    XMFLOAT2 tex = texCoords[i];
    vert.texture = XMFLOAT2(tex.x, tex.y);
    vList[i] = vert;
}
int vBufferSize = sizeof(vList);
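An aside on the snippet above (an observation, not necessarily the cause of the missing model): vList is a Vertex*, so sizeof(vList) yields the size of the pointer rather than the size of the vertex data. A sketch of a size that covers the whole array:
// Sketch only: total byte size of the vertex data, assuming vList holds
// positions.size() Vertex elements as filled in the loop above.
const int vBufferSize = static_cast<int>(positions.size() * sizeof(Vertex));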
Build of the camera and views:
XMMATRIX tmpMat = XMMatrixPerspectiveFovLH(45.0f*(3.14f/180.0f), (float)Width / (float)Height, 0.1f, 1000.0f);
XMStoreFloat4x4(&cameraProjMat, tmpMat);
// set starting camera state
cameraPosition = XMFLOAT4(0.0f, 2.0f, -4.0f, 0.0f);
cameraTarget = XMFLOAT4(0.0f, 0.0f, 0.0f, 0.0f);
cameraUp = XMFLOAT4(0.0f, 1.0f, 0.0f, 0.0f);
// build view matrix
XMVECTOR cPos = XMLoadFloat4(&cameraPosition);
XMVECTOR cTarg = XMLoadFloat4(&cameraTarget);
XMVECTOR cUp = XMLoadFloat4(&cameraUp);
tmpMat = XMMatrixLookAtLH(cPos, cTarg, cUp);
XMStoreFloat4x4(&cameraViewMat, tmpMat);
cube1Position = XMFLOAT4(0.0f, 0.0f, 0.0f, 0.0f);
XMVECTOR posVec = XMLoadFloat4(&cube1Position);
tmpMat = XMMatrixTranslationFromVector(posVec);
XMStoreFloat4x4(&cube1RotMat, XMMatrixIdentity());
XMStoreFloat4x4(&cube1WorldMat, tmpMat);
Update function:
XMStoreFloat4x4(&cube1WorldMat, worldMat);
XMMATRIX viewMat = XMLoadFloat4x4(&cameraViewMat); // load view matrix
XMMATRIX projMat = XMLoadFloat4x4(&cameraProjMat); // load projection matrix
XMMATRIX wvpMat = XMLoadFloat4x4(&cube1WorldMat) * viewMat * projMat; // create wvp matrix
XMMATRIX transposed = XMMatrixTranspose(wvpMat); // must transpose wvp matrix for the gpu
XMStoreFloat4x4(&cbPerObject.wvpMat, transposed); // store transposed wvp matrix in constant buffer
memcpy(cbvGPUAddress[frameIndex], &cbPerObject, sizeof(cbPerObject));
VERTEX SHADER:
struct VS_INPUT
{
    float4 pos : POSITION;
    float2 tex : TEXCOORD;
    float3 normal : NORMAL;
};

struct VS_OUTPUT
{
    float4 pos : SV_POSITION;
    float2 tex : TEXCOORD;
    float3 normal : NORMAL;
};

cbuffer ConstantBuffer : register(b0)
{
    float4x4 wvpMat;
};

VS_OUTPUT main(VS_INPUT input)
{
    VS_OUTPUT output;
    output.pos = mul(input.pos, wvpMat);
    return output;
}
I know it is a lot of code to read, but I don't understand what is going wrong here. I hope somebody can help me.
A few things to try/check:
Make your background clear color grey. That way, if you are drawing black triangles you will see them.
Turn backface culling off in the rendering state, in case your triangles are back to front.
Turn depth test off in the rendering state.
Turn off alpha blending.
You don't show your pixel shader, but try writing a constant color to see if your lighting calculation is broken.
Use NVIDIA's nSight tool, or the Visual Studio Graphics debugger to see what your graphics pipeline is doing.
Those are usually the things I try first...
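If it helps, here is a rough D3D12 sketch of those state overrides, assuming the d3dx12.h helper structs; the function names and parameters are placeholders rather than the asker's actual code:
#include <d3d12.h>
#include "d3dx12.h"   // CD3DX12_* helper structs

// Sketch only: apply the debugging overrides from the checklist above to a
// pipeline state description that is being filled in elsewhere.
void ApplyDebugStates(D3D12_GRAPHICS_PIPELINE_STATE_DESC& psoDesc)
{
    psoDesc.RasterizerState = CD3DX12_RASTERIZER_DESC(D3D12_DEFAULT);
    psoDesc.RasterizerState.CullMode = D3D12_CULL_MODE_NONE;   // no backface culling

    psoDesc.DepthStencilState = CD3DX12_DEPTH_STENCIL_DESC(D3D12_DEFAULT);
    psoDesc.DepthStencilState.DepthEnable = FALSE;             // no depth test

    psoDesc.BlendState = CD3DX12_BLEND_DESC(D3D12_DEFAULT);
    psoDesc.BlendState.RenderTarget[0].BlendEnable = FALSE;    // no alpha blending
}

// Grey clear colour so black geometry stays visible against the background.
void ClearToGrey(ID3D12GraphicsCommandList* commandList,
                 D3D12_CPU_DESCRIPTOR_HANDLE rtvHandle)
{
    const float grey[4] = { 0.5f, 0.5f, 0.5f, 1.0f };
    commandList->ClearRenderTargetView(rtvHandle, grey, 0, nullptr);
}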

How Do I Draw 3D Primitives in WP

I'm a greenhorn with XNA and I've been trying to get past this issue for several days, but every time I try I either get an exception or the application just exits, and it has been very irritating.
I want to be able to draw a 3D primitive without having to have a model premade for it. I had this code first:
VertexPositionColor[] primitiveList =
{
    new VertexPositionColor(new Vector3(1, 1, 0), Color.White),
    new VertexPositionColor(new Vector3(0, 1, -1), Color.White)
};
short[] lineListIndices = { 0, 1 };

Controller.CurrentGame.GraphicsDevice.DrawUserIndexedPrimitives<VertexPositionColor>(
    PrimitiveType.LineList,
    primitiveList,
    0,               // vertex buffer offset to add to each element of the index buffer
    2,               // number of vertices in primitiveList
    lineListIndices, // the index buffer
    0,               // first index element to read
    1                // number of primitives to draw
);
And I get an InvalidOperationException with the following message:
"Both a vertex shader and pixel shader must be set on the device before any draw operations may be performed."
So then I try following the instructions on http://msdn.microsoft.com/en-us/library/bb203926.aspx and end up with the following code:
BasicEffect basicEffect = new BasicEffect(Controller.Graphics.GraphicsDevice);
basicEffect.World = Matrix.Identity;
basicEffect.View = Controller.Cam.view;
basicEffect.Projection = Controller.Cam.projection;
// primitive color
basicEffect.AmbientLightColor = new Vector3(0.1f, 0.1f, 0.1f);
basicEffect.DiffuseColor = new Vector3(1.0f, 1.0f, 1.0f);
basicEffect.SpecularColor = new Vector3(0.25f, 0.25f, 0.25f);
basicEffect.SpecularPower = 5.0f;
basicEffect.Alpha = 1.0f;
basicEffect.LightingEnabled = true;
if (basicEffect.LightingEnabled)
{
    basicEffect.DirectionalLight0.Enabled = true; // enable each light individually
    if (basicEffect.DirectionalLight0.Enabled)
    {
        // x direction
        basicEffect.DirectionalLight0.DiffuseColor = new Vector3(1, 0, 0); // range is 0 to 1
        basicEffect.DirectionalLight0.Direction = Vector3.Normalize(new Vector3(-1, 0, 0));
        // points from the light to the origin of the scene
        basicEffect.DirectionalLight0.SpecularColor = Vector3.One;
    }
    basicEffect.DirectionalLight1.Enabled = true;
    if (basicEffect.DirectionalLight1.Enabled)
    {
        // y direction
        basicEffect.DirectionalLight1.DiffuseColor = new Vector3(0, 0.75f, 0);
        basicEffect.DirectionalLight1.Direction = Vector3.Normalize(new Vector3(0, -1, 0));
        basicEffect.DirectionalLight1.SpecularColor = Vector3.One;
    }
    basicEffect.DirectionalLight2.Enabled = true;
    if (basicEffect.DirectionalLight2.Enabled)
    {
        // z direction
        basicEffect.DirectionalLight2.DiffuseColor = new Vector3(0, 0, 0.5f);
        basicEffect.DirectionalLight2.Direction = Vector3.Normalize(new Vector3(0, 0, -1));
        basicEffect.DirectionalLight2.SpecularColor = Vector3.One;
    }
}
VertexDeclaration vertexDeclaration = new VertexDeclaration(new VertexElement[]
{
    new VertexElement(0, VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
    new VertexElement(12, VertexElementFormat.Vector3, VertexElementUsage.Normal, 0),
    new VertexElement(24, VertexElementFormat.Vector2, VertexElementUsage.TextureCoordinate, 0)
});
Vector3 topLeftFront = new Vector3(-1.0f, 1.0f, 1.0f);
Vector3 bottomLeftFront = new Vector3(-1.0f, -1.0f, 1.0f);
Vector3 topRightFront = new Vector3(1.0f, 1.0f, 1.0f);
Vector3 bottomRightFront = new Vector3(1.0f, -1.0f, 1.0f);
Vector3 topLeftBack = new Vector3(-1.0f, 1.0f, -1.0f);
Vector3 topRightBack = new Vector3(1.0f, 1.0f, -1.0f);
Vector3 bottomLeftBack = new Vector3(-1.0f, -1.0f, -1.0f);
Vector3 bottomRightBack = new Vector3(1.0f, -1.0f, -1.0f);
Vector2 textureTopLeft = new Vector2(0.0f, 0.0f);
Vector2 textureTopRight = new Vector2(1.0f, 0.0f);
Vector2 textureBottomLeft = new Vector2(0.0f, 1.0f);
Vector2 textureBottomRight = new Vector2(1.0f, 1.0f);
Vector3 frontNormal = new Vector3(0.0f, 0.0f, 1.0f);
Vector3 backNormal = new Vector3(0.0f, 0.0f, -1.0f);
Vector3 topNormal = new Vector3(0.0f, 1.0f, 0.0f);
Vector3 bottomNormal = new Vector3(0.0f, -1.0f, 0.0f);
Vector3 leftNormal = new Vector3(-1.0f, 0.0f, 0.0f);
Vector3 rightNormal = new Vector3(1.0f, 0.0f, 0.0f);
VertexPositionNormalTexture[] cubeVertices = new VertexPositionNormalTexture[6];
// Front face.
cubeVertices[0] = new VertexPositionNormalTexture(topLeftFront, frontNormal, textureTopLeft);
cubeVertices[1] = new VertexPositionNormalTexture(bottomLeftFront, frontNormal, textureBottomLeft);
cubeVertices[2] = new VertexPositionNormalTexture(topRightFront, frontNormal, textureTopRight);
cubeVertices[3] = new VertexPositionNormalTexture(bottomLeftFront, frontNormal, textureBottomLeft);
cubeVertices[4] = new VertexPositionNormalTexture(bottomRightFront, frontNormal, textureBottomRight);
cubeVertices[5] = new VertexPositionNormalTexture(topRightFront, frontNormal, textureTopRight);

RasterizerState rasterizerState1 = new RasterizerState();
rasterizerState1.CullMode = CullMode.None;
Controller.Graphics.GraphicsDevice.RasterizerState = rasterizerState1;

foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    Controller.Graphics.GraphicsDevice.DrawPrimitives(
        PrimitiveType.TriangleList,
        0,
        12
    );
}
And I get the following exception message:
"A valid vertex buffer (and a valid index buffer if you are using indexed primitives) must be set on the device before any draw operations may be performed."
I know I shouldn't have all of this code in the same place, but I just want to get something to draw so I can see how it works; nothing works, and I can't seem to load any examples into VS.
As I've said, I'm rather new, and beyond this question I would also greatly appreciate any reading material that points in the right direction for 3D in XNA. Thank you for reading.
In your first example you are properly drawing a list of vertices, but as the error says, no shader was set. A quick note about shaders: since there are no custom shaders on WP7, this means you're not applying any Effect; WP7 is limited to the Effect subclasses that can be found here. They essentially contain built-in shaders.
In the second example you have a shader set up, but now you are not drawing from any list or buffer. DrawUserIndexedPrimitives takes an array of indices and vertices and uses those to draw. DrawPrimitives doesn't; it relies on a vertex buffer being bound to the graphics device.
Either change that second example to DrawUserIndexedPrimitives and pass in your arrays (cubeVertices and... wait, you don't have an indices array, make one), or set up a VertexBuffer with the same information and bind it as follows:
Controller.Graphics.GraphicsDevice.SetVertexBuffer(vb);

Passing colors through a pixel shader in HLSL

I have a pixel shader that should simply pass the input color through, but instead I am getting a constant result. I think my syntax might be the problem. Here is the shader:
struct PixelShaderInput
{
    float3 color : COLOR;
};

struct PixelShaderOutput
{
    float4 color : SV_TARGET0;
};

PixelShaderOutput main(PixelShaderInput input)
{
    PixelShaderOutput output;
    output.color.rgba = float4(input.color, 1.0f); // input.color is 0.5, 0.5, 0.5; output is black
    // output.color.rgba = float4(0.5f, 0.5f, 0.5f, 1); // output is gray
    return output;
}
For testing, the vertex shader that precedes this in the pipeline passes a COLOR parameter of 0.5, 0.5, 0.5. Stepping through the pixel shader in Visual Studio, input.color has the correct values, and these are being assigned to output.color correctly. However, when rendered, the vertices that use this shader are all black.
Here is the vertex shader element description:
const D3D11_INPUT_ELEMENT_DESC vertexDesc[] =
{
{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
{ "COLOR", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
{ "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
I'm not sure if it's important that the vertex shader takes colors as RGB and outputs the same, while the pixel shader outputs RGBA. The alpha channel is working correctly, at least.
If I comment out that first assignment (the one using input.color) and uncomment the other assignment with the explicit values, then the rendered pixels are gray (as expected).
Any ideas on what I'm doing wrong here?
I'm using shader model 4 level 9_1, with optimizations disabled and debug info enabled.
output.color.rgba = float4(input.color, 1.0f);
Your input.color is a float4 and you are passing it into another float4; I think this should work:
output.color.rgba = float4(input.color.rgb, 1.0f);
This is all you need to simply pass it through:
return input.color;
If you want to change the colour to red, then do something like:
input.color = float4(1.0f, 0.0f, 0.0f, 1.0f);
return input.color;
Are you sure that your vertices are in the place they are supposed to be? You are starting to make me doubt my D3D knowledge. :P
I believe your problem is that you are only passing a color; BOTH parts of the shader NEED a position in order to work.
Your PixelShaderInput layout should be:
struct PixelShaderInput
{
    float4 position : SV_POSITION;
    float3 color : COLOR;
};
Could you maybe try this as your pixel shader?
float4 main(float3 color : COLOR) : SV_TARGET
{
    return float4(color, 1.0f);
}
I have never seen this kind of constructor:
float4(input.color, 1.0f);
This might be the problem, but I could be wrong. Try passing the float values one by one, like this:
float4(input.color[0], input.color[1], input.color[2], 1.0f);
Edit:
Actually, you might have to use float4 as the type for COLOR (http://msdn.microsoft.com/en-us/library/windows/desktop/bb509647(v=vs.85).aspx).
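For reference on that last point: if COLOR were changed to a float4 in the shaders, the matching D3D11 input element would use a four-component format. A sketch along those lines (not the asker's actual layout):
// Sketch only: an input layout where COLOR is a four-component float,
// matching a float4 COLOR input in the vertex shader. The CPU-side vertex
// struct would need a corresponding fourth colour component.
const D3D11_INPUT_ELEMENT_DESC vertexDescFloat4Colour[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,       0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};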
