How do I apply different textures to multiple primitives? (Direct3D 9) - graphics

I am creating a game in which every primitive needs its own texture, but I can't seem to figure out how. I searched through Google but it only displays results about texture blending. Can you please tell me how to apply multiple textures on multiple non-indexed primitives? Or do they have to be indexed?

You can change textures by calling SetTexture before each DrawPrimitive call.
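A minimal sketch of that pattern in Direct3D 9 (assuming an already-created device, a filled vertex buffer, and two loaded textures; the variable names are placeholders):

// Hypothetical vertex layout matching the FVF below: position plus one UV pair.
struct Vertex { float x, y, z, u, v; };

// One SetTexture call per draw call lets each batch of primitives use its own texture.
device->SetStreamSource(0, vertexBuffer, 0, sizeof(Vertex));
device->SetFVF(D3DFVF_XYZ | D3DFVF_TEX1);

device->SetTexture(0, textureA);                    // stage 0 texture for the first quad
device->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 2);    // 2 triangles starting at vertex 0

device->SetTexture(0, textureB);                    // switch textures
device->DrawPrimitive(D3DPT_TRIANGLELIST, 6, 2);    // next 2 triangles from the same buffer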

I think using a UV atlas can solve your problem. An atlas is basically a large texture made up of smaller textures, like a photo collage. The UV coordinates of your vertices then refer to the large texture, but if you know where each "small" texture sits inside it, the remapping is easy to calculate.
Of course you have to create that atlas texture first.
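For illustration, a minimal sketch of that remapping (the struct and function names are hypothetical): a local [0, 1] UV is offset and scaled into the sub-texture's rectangle, then normalized by the atlas size.

// Hypothetical helper: map a local [0,1] UV into the atlas region of one sub-texture.
struct AtlasRegion { float x, y, w, h; };   // sub-texture rectangle inside the atlas, in pixels

void RemapToAtlas(float localU, float localV, const AtlasRegion& r,
                  float atlasWidth, float atlasHeight,
                  float& outU, float& outV)
{
    outU = (r.x + localU * r.w) / atlasWidth;    // offset + scale, then normalize
    outV = (r.y + localV * r.h) / atlasHeight;
}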

Related

How To Texture A Quadtree

I am attempting to apply one solid texture to a quadtree, but I am having a problem. My quadtree works by creating a new mesh each time there is a subdivision. So the tree starts as one mesh; when it splits, it becomes 4 meshes, and so on.
Now I am trying to apply a consistent texture to the quadtree so that each split still draws the same texture fully. The pictures below give a good example.
Before split: (image)
After split: (image)
What I want is the texture to look like the before split picture even after the split. I can't seem to figure out the UV-mapping for it though. Is there a simple way to do this?
I have tried taking the location and modifying its value based on the scale of the new mesh. This has proven unfruitful, though, and I'm really not sure what to do.
Any help or advice is greatly appreciated, thanks.
Stumbled on this...so it might be too late to help you. But if you are still thinking about this:
I think your problem is that you are getting a little confused about what a quadtree is. A quadtree is a spatial partition of a space; think of it as a two-dimensional B-tree. You don't texture a quadtree, you just use it to quickly figure out what lies within an arbitrary bounded region.
I suppose that you could use it to determine texture offsets for texture alignment, but that sounds like an odd use of a quadtree, and I suspect that there is probably a much easier way to solve your problem. (Perhaps use the world-space coordinates modulo the texture size to get the offset needed to seamlessly render the texture across multiple triangles, as sketched below.)
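A minimal sketch of that idea (the vertex layout and names are assumptions, not from the original post): derive each vertex's UVs from its world position so every sub-mesh samples the same continuous texture, and let a WRAP sampler handle the repetition.

struct Vertex { float x, y, z, u, v; };

// textureWorldSize = how many world units one repeat of the texture covers.
void AssignWorldSpaceUVs(Vertex* verts, int count, float textureWorldSize)
{
    for (int i = 0; i < count; ++i)
    {
        verts[i].u = verts[i].x / textureWorldSize;   // tiles seamlessly with a WRAP address mode
        verts[i].v = verts[i].z / textureWorldSize;
    }
}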

Create a polygon from a texture

Let's say I've got an RGBA texture and a polygon class whose constructor takes an array of vertex coordinates.
Is there some way to create a polygon from this texture, for example by using the alpha channel of the texture?
(In 2D.)
Absolutely, yes, it can be done. Is it easy? No. I haven't seen any game/geometry engines that would help you out much either. Doing it yourself, the biggest problem you're going to have is generating a simplified mesh: one quad per pixel is going to generate a lot of geometry very quickly. Holes in the geometry may be an issue if you're tracing the edges and triangulating afterwards. Then there's the issue of determining what's in and what's out. Alpha is the obvious candidate, but unless you're looking at either fully on or fully off, you may be hoping for nice smooth edges. Those are hard to get right and would probably involve some kind of marching squares over the interpolated alpha. So while it's not impossible, it's a lot of work.
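To make the "one quad per pixel" point concrete, here is a minimal sketch of that brute-force starting point (the pixel layout and threshold are assumptions): every pixel whose alpha exceeds a threshold contributes one unit quad, which would still need merging and simplification afterwards.

#include <cstdint>
#include <vector>

struct Quad { float x, y, w, h; };

// Assumes tightly packed 8-bit RGBA pixels; alphaThreshold is a hypothetical cutoff.
std::vector<Quad> QuadsFromAlpha(const uint8_t* rgba, int width, int height,
                                 uint8_t alphaThreshold = 127)
{
    std::vector<Quad> quads;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            if (rgba[(y * width + x) * 4 + 3] > alphaThreshold)   // alpha channel
                quads.push_back({ (float)x, (float)y, 1.0f, 1.0f });
    return quads;
}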
Edit: As pointed out below, Unity does provide a method of generating a polygon from the alpha of a sprite: PolygonCollider2D. Its script reference mentions the pathCount variable, which gives the number of polygons the collider contains and, in turn, which indices are valid for the GetPath method. So this component could be used to generate polygons from alpha. It does rely on using Unity, however. But the combination of the sprite alpha controlling what is drawn and the collider controlling intersections with other objects covers a lot of use cases. That doesn't mean it's appropriate for your application.

Conservatively cover bitmap with small number of primitives?

I'm researching the possibility of performing occlusion culling in voxel/cube-based games like Minecraft, and I've come across a challenging sub-problem. I'll give the 2D version of it.
I have a bitmap, which infrequently has pixels get either added to or removed from it.
What I want to do is maintain some arbitrarily small set of geometry primitives that cover an arbitrarily large area, such that the area covered by all the primitives is within the colored part of the bitmap.
Is there a smart way to maintain these sets? Please note that this is different from typical image tracing in that the primitives cannot go outside the lines. If it helps, I already have the bitmap organized into a quadtree.

DirectX: Vertex Shader using textures

I am a beginner in graphics programming. I came across a case where a "ResourceView" is created from a texture and then this resource view is set as a VS resource. To summarize:
CreateTexture2D( D3D10_TEXTURE2D_DESC{ 640, 512, ... }, ..., ID3D10Texture2D_0c2c0f30 )
CreateShaderResourceView( ID3D10Texture2D_0c2c0f30, ..., ID3D10ShaderResourceView_01742c80 )
VSSetShaderResources( 0, 1, [ 0x01742c80 ] )
When and in what cases do we use textures in vertex shaders? Can anyone help?
Thanks.
That completely depends on the effect you are trying to achieve.
If you want to color your vertices individually, you would usually use a vertex color component, but nothing is stopping you from sampling the color from a texture (except that it is probably slower).
Also, don't let the name fool you. Textures can be used for a lot more than just coloring; they are basically precomputed functions. For example, you could use a Texture1D to submit a wave function to animate clothing or swaying grass/foliage. And since it is a texture, you can use a different wave for every object you draw without switching shaders.
The Direct3D developers just want to provide you with a maximum of flexibility. And that includes using texture resources in all shader stages.
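As a minimal sketch of that wave-function idea in D3D10 (the helper function is an assumption and error handling is omitted): fill a one-channel Texture1D with a precomputed sine wave and bind it to the vertex shader stage, where the shader can sample it to displace vertices.

#include <d3d10.h>
#include <cmath>

// Hypothetical helper: build a 1D "wave" texture and bind it as a vertex shader resource.
ID3D10ShaderResourceView* CreateWaveTextureSRV(ID3D10Device* device)
{
    const int samples = 256;
    float wave[samples];
    for (int i = 0; i < samples; ++i)
        wave[i] = std::sin(i * 6.2831853f / samples);    // one full sine period

    D3D10_TEXTURE1D_DESC desc = {};
    desc.Width     = samples;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format    = DXGI_FORMAT_R32_FLOAT;
    desc.Usage     = D3D10_USAGE_IMMUTABLE;
    desc.BindFlags = D3D10_BIND_SHADER_RESOURCE;

    D3D10_SUBRESOURCE_DATA init = {};
    init.pSysMem = wave;

    ID3D10Texture1D* tex = nullptr;
    device->CreateTexture1D(&desc, &init, &tex);

    ID3D10ShaderResourceView* srv = nullptr;
    device->CreateShaderResourceView(tex, nullptr, &srv);
    tex->Release();                                      // the view keeps its own reference

    device->VSSetShaderResources(0, 1, &srv);            // slot 0 of the vertex shader stage
    return srv;
}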

As relates to texture atlases, what is a "quad"?

It is my understanding that a texture atlas is basically a single texture that contains many smaller textures and that they are useful for making games or animations faster because they allow you to access many animation frames by loading a single file rather than files for each and every frame.
So, in discussions of texture atlases, I see the term "quad" mentioned everywhere. Is a quad simply the x, y, width and height of an individual texture from a texture atlas, or am I missing something?
Quadrilateral - not necessarily a rectangle.
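For context, a minimal sketch (all names hypothetical) of how one atlas entry is usually mapped onto a quad's four vertices; the region from (u0, v0) to (u1, v1) corresponds to the x, y, width and height you mention, normalized by the atlas size.

struct Vertex { float x, y, u, v; };

// Build a screen-space quad whose UVs point at one sub-rectangle of an atlas.
void BuildQuad(Vertex out[4], float x, float y, float w, float h,
               float u0, float v0, float u1, float v1)
{
    out[0] = { x,     y,     u0, v0 };   // top-left
    out[1] = { x + w, y,     u1, v0 };   // top-right
    out[2] = { x + w, y + h, u1, v1 };   // bottom-right
    out[3] = { x,     y + h, u0, v1 };   // bottom-left
}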
