HLSL: Getting texture dimensions in a pixel shader

I have a texture and I need to know its dimensions within a pixel shader. This seems like a job for GetDimensions. Here's the code:
Texture2D t: register(t4);
...
float w;
float h;
t.GetDimensions(w, h);
However, this results in an error:
X4532: cannot map expression to pixel shader instruction set
This error doesn't seem to be documented anywhere. Am I using the function incorrectly? Is there a different technique that I should use?
I'm working in shader model 4.0 level 9_1, via DirectX.

This error usually occurs when a function is not available in the calling shader stage or target profile. In this case, GetDimensions is a shader model 4.0 instruction, and the 4_0_level_9_1 profile compiles down to the Direct3D 9 instruction set, which has no equivalent, so the compiler cannot map the call.
Is there a different technique that I should use?
Pass the texture width and height to the shader as constants instead. This also saves instructions in the shader, which may be better performance-wise.
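For example, a minimal sketch of that approach (the constant buffer layout and register assignments are illustrative, not taken from the question): fill a constant buffer with the size from the application side and read it in the pixel shader.
Texture2D t : register(t4);
SamplerState s : register(s0);
// Filled from the application with the texture's width and height in texels.
cbuffer TextureInfo : register(b0)
{
    float2 gTextureSize;
};
float4 PSMain(float2 uv : TEXCOORD0) : SV_Target
{
    float w = gTextureSize.x;
    float h = gTextureSize.y;
    // e.g. sample one texel diagonally over; no GetDimensions() call needed
    return t.Sample(s, uv + float2(1.0f / w, 1.0f / h));
}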

Related

How to combine world environment post processing with custom post processing shader in a 3D world, Godot 4.0

I am trying to use the in-built post processing effects attached to a Camera3D while also applying a custom post processing effect to run in combination with the other effects.
I have read tutorials on how to create custom post processing effects, like the one found on the official docs. It tells me to create a MeshInstance with a QuadMesh (well, in Godot 4.0, it is actually now a PlaneMesh) and transform it into clip space.
For one, the transformation explained in the docs did not work: the quad just disappeared when I applied the following vertex shader, even with a large value set for extra_cull_margin:
shader_type spatial;
render_mode cull_disabled, unshaded;
void vertex() {
    POSITION = vec4(VERTEX, 1.0);
}
I managed to work around this by manually rotating the plane so that it faces the camera, with a Z offset slightly larger than the camera's near distance.
The issue is that with this plane in front, none of the world environment post-processing effects work. I think it would work better if I got the quad-to-clip-space transform right, but it does not work for me.
Has anyone tried this yet for Godot 4.0 beta 1?
Okay, so reading up on how to do this in general, I stumbled upon this question.
Based on the answer from derhass, I wrote the following vertex shader code:
shader_type spatial;
render_mode cull_disabled, unshaded;
const vec2 vertices[3] = {vec2(-1,-1), vec2(3,-1), vec2(-1, 3)};
void vertex() {
    POSITION = vec4(vertices[VERTEX_ID], 0.0, 1.0);
}
This draws a triangle and transforms it into clip space successfully. Now the world environment effects work together with the custom post-processing shader:
(Screenshots: with the custom shader, and without.)

How to write values to depth buffer in godot fragment shader?

How do you specify the depth value in the fragment shader if, for example, you would like to render a texture of a sphere that also affects the depth buffer along the camera's z-direction?
In OpenGL you can use gl_FragDepth. Is there a similar built-in variable in Godot?
Edit:
After posting the question I found that there is a DEPTH variable that appears to have been merged. I have not had time to try it yet. If you have experience using it successfully, I would accept that answer.
Yes, you can write to DEPTH from the fragment() function of a spatial material's shader.
Godot will, of course, also draw depth by default. You can control that with the render modes depth_draw_*, see Depth Draw Mode.
And if you want to read depth, you can use DEPTH_TEXTURE. The article Screen Reading Shaders has an example.
Refer to Spatial Shader for the list of available variables and options in spatial shaders.
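As a minimal illustration (a sketch only, not taken from the question), a spatial shader can override the depth written for each fragment from fragment():
shader_type spatial;

void fragment() {
    ALBEDO = vec3(1.0, 0.5, 0.2);
    // Godot writes the rasterized depth by default; assigning DEPTH overrides it,
    // much like gl_FragDepth in OpenGL. FRAGCOORD.z holds the unmodified value.
    DEPTH = FRAGCOORD.z;
}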

wgpu Compute Write Direct to Surface Texture View

I am relatively new to using GPU APIs, even newer to wgpu, and wanted to mess around with compute shaders drawing to a surface.
However, it seems that this is not allowed directly?
At run time, attempting to create a binding to the surface's texture view produces an error stating that the STORAGE_BINDING usage bit is required; however, that usage is not allowed in the surface configuration. I have also attempted to bind the texture as a regular (sampled) texture rather than a storage texture, but that produced its own error about the binding being invalid.
Is there a good way to write directly to the surface texture, or is it necessary to create a separate storage texture? Does the render pipeline under the hood not write directly to the surface's texture view?
If a separate texture is needed (which I am guessing it is), is there a recommended method to follow?
A compute shader cannot write to the surface texture directly; that is the responsibility of the fragment shader.
Since the swapchain uses double or multi buffering, the surface texture changes from frame to frame. Also, the usage of the surface texture is RENDER_ATTACHMENT, which means it can only be used as a render pass's color attachment.
A compute shader can only output to a storage buffer or a storage texture; either of those can then be bound to a fragment shader that writes the result to the surface.
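As a rough sketch of that setup (untested, assuming a recent wgpu version and existing device and config variables; the names and the Rgba8Unorm format are illustrative): create an intermediate texture that the compute pass writes as a storage texture and a fullscreen fragment pass then samples.
let storage_tex = device.create_texture(&wgpu::TextureDescriptor {
    label: Some("compute output"),
    size: wgpu::Extent3d {
        width: config.width,
        height: config.height,
        depth_or_array_layers: 1,
    },
    mip_level_count: 1,
    sample_count: 1,
    dimension: wgpu::TextureDimension::D2,
    // Rgba8Unorm supports STORAGE_BINDING by default; typical surface formats like
    // Bgra8UnormSrgb do not, which is why the surface view cannot be bound directly.
    format: wgpu::TextureFormat::Rgba8Unorm,
    usage: wgpu::TextureUsages::STORAGE_BINDING   // written by the compute pass
        | wgpu::TextureUsages::TEXTURE_BINDING,   // sampled by the fragment pass
    view_formats: &[],
});
let storage_view = storage_tex.create_view(&wgpu::TextureViewDescriptor::default());
The compute pipeline binds storage_view as a write-only storage texture, and the render pipeline binds it as a sampled texture while its color attachment is the surface's texture view.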

DirectX: Vertex Shader using textures

I am a beginner in graphics programming. I came across a case where a shader resource view is created from a texture and then set as a vertex shader resource. To summarize:
CreateTexture2D( D3D10_TEXTURE2D_DESC{ 640, 512, .... ID3D10Texture2D_0c2c0f30 )
CreateShaderResourceView( ID3D10Texture2D_0c2c0f30, ..., ID3D10ShaderResourceView_01742c80 )
VSSetShaderResources( 0, 1, [0x01742c80] )
When and in what cases do we use textures in vertex shaders? Can anyone help?
Thanks.
That completely depends on the effect you are trying to achieve.
If you want to color your vertices individually you would usually use a vertex color component. But nothing is stopping you from sampling the color from a texture. (Except that it is probably slower.)
Also, don't let the name fool you. Textures can be used for a lot more than just coloring. They are basically precomputed functions. For example, you could use a Texture1D to submit a wave function to animate clothing or swaying grass/foliage. And since it is a texture, you can use a different wave for every object you draw, without switching shaders.
The Direct3D developers just want to provide you with a maximum of flexibility. And that includes using texture resources in all shader stages.
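To make that concrete, here is a hypothetical sketch of vertex texture fetch (resource names, the cbuffer layout, and the displacement math are all illustrative): a wave stored in a Texture1D displaces each vertex, sampled with SampleLevel because automatic mip selection is only available in pixel shaders.
Texture1D gWaveTex : register(t0);
SamplerState gLinearSampler : register(s0);

cbuffer PerFrame : register(b0)
{
    float4x4 gWVP;   // world-view-projection matrix
    float    gTime;  // animation time in seconds
};

struct VSInput
{
    float3 pos   : POSITION;
    float  phase : TEXCOORD0;   // per-vertex offset into the wave texture
};

float4 VSMain(VSInput input) : SV_Position
{
    // SampleLevel (not Sample) is required outside the pixel shader.
    float wave = gWaveTex.SampleLevel(gLinearSampler, frac(input.phase + gTime), 0).r;
    float3 displaced = input.pos + float3(0.0f, wave, 0.0f);
    return mul(float4(displaced, 1.0f), gWVP);
}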

HLSL geometry shader texture look up

I'm trying to implement the marching cubes algorithm in my geometry shader, so I place my data grid into a Texture3D. Now I want to look up the data in the geometry shader, and this throws the error "cannot map expression to gs_4_0 instruction set".
This is the line of code that throws the error:
cubeVale[0] = dataFieldTex.Sample( samPoint, float3(k, j, i)).a;
I hope someone can help me out here.
ty
Sample() only works in pixel shaders, since it automatically computes the mipmap lod to use by taking derivatives of the texture coordinates, and derivatives are only available in pixel shaders.
MSDN has a list of texture object methods and the shader profiles they work in. For the gs_4_0 profile your choices are Load(), SampleLevel() or SampleGrad(). You probably want SampleLevel(), especially if your Texture3D only has one mip level.
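Applied to the line from the question, and assuming the data grid only needs its base mip, the lookup could be rewritten along these lines:
// Explicit-LOD lookup that is legal in gs_4_0; mip level 0 is the full-resolution grid.
cubeVale[0] = dataFieldTex.SampleLevel(samPoint, float3(k, j, i), 0).a;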
