Preferred Method for Texture Blending in a Shader?

I have a series of textures to blend over a terrain, where each terrain tile reuses the same geometry and uses a heightmap to offset the Y in the vertex shader. I pass an array of Texture2D to my shader and am looking for a recommendation on how best to handle the texture blending.

Because I reuse the geometry for each tile, I am unable to pass texture weights through the vertex buffer, so I intend to pass a Texture2D containing blend weights in the RGB channels and do the blending with those weights in the pixel shader. However, since there are only three channels (four if you count A) available, blending more than three textures would mean passing multiple blend maps to the shader, which seems very profligate.
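For concreteness, a minimal sketch of the kind of pixel-shader blend described above, in XNA-era HLSL (the sampler names, register assignments and the renormalisation step are illustrative assumptions, not part of the original setup):

```hlsl
// Minimal sketch: an RGBA weight map sampled in the pixel shader drives a
// four-way blend between terrain textures.
sampler2D WeightSampler : register(s0);   // per-tile blend weights (RGBA)
sampler2D TexSampler0   : register(s1);   // e.g. grass
sampler2D TexSampler1   : register(s2);   // e.g. rock
sampler2D TexSampler2   : register(s3);   // e.g. dirt
sampler2D TexSampler3   : register(s4);   // e.g. sand

float4 BlendPS(float2 tileUV : TEXCOORD0, float2 detailUV : TEXCOORD1) : COLOR0
{
    float4 w = tex2D(WeightSampler, tileUV);
    // Renormalise so the four weights always sum to 1.
    w /= max(dot(w, float4(1, 1, 1, 1)), 0.0001);

    return tex2D(TexSampler0, detailUV) * w.r
         + tex2D(TexSampler1, detailUV) * w.g
         + tex2D(TexSampler2, detailUV) * w.b
         + tex2D(TexSampler3, detailUV) * w.a;
}
```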

What general approach is used to do this?

1) Assign one weight per color channel (do I really need 8 bits of precision for a texture weight?)
2) Manipulate the 24 bits of color in the blend map as discrete 2-bit chunks, accepting lower precision in exchange for handling 12 possible textures?
3) Use some other method, possibly hardware instancing, to pass the fixed geometry plus a second VertexBuffer of texture weights down to a single shader as a 'combined' vertex buffer?

If the recommendation is (3), should I also use that route in place of the heightmap lookup in my vertex shader?

Thanks in advance for your advice.
Are you really going to need more than 4 textures per terrain tile? (I'm not sure how big a "tile" is in relation to player view and interaction). An alternative would be to set up your array of textures in memory, and then define which 4 you want to blend between per tile. This would let you have a terrain with many texture types without sacrificing the range of the blend weights.

I'd steer away from the 2-bit blending, as that's not going to give you any kind of smooth transitions.


"four if you count A"
You should count A; there are no 24-bit GPU texture formats.
"I am unable to pass texture weights through the vertex buffer, so I intend to pass a Texture2D containing blend weights in the RGB channels"
You don't mention whether you're going to sample this weight texture in the pixel or the vertex shader, so in case you've not considered it: you can perform the fetch in the vertex shader, and it's almost equivalent to having the weights as part of the vertex stream.
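A rough sketch of that vertex-shader fetch, assuming shader-model-3 style HLSL (constant names and the tiling factor are placeholders; tex2Dlod is used because plain tex2D is not available in a vertex shader):

```hlsl
// Sketch only: sample the weight map once per vertex and pass the result down
// to the pixel shader, which is roughly equivalent to per-vertex weights.
float4x4 WorldViewProjection;   // assumed constants
float    DetailTiling;

sampler2D WeightSampler : register(s0);

struct VSOutput
{
    float4 Position : POSITION0;
    float2 DetailUV : TEXCOORD0;
    float4 Weights  : TEXCOORD1;
};

VSOutput TerrainVS(float4 position : POSITION0, float2 tileUV : TEXCOORD0)
{
    VSOutput o;
    o.Position = mul(position, WorldViewProjection);
    o.DetailUV = tileUV * DetailTiling;
    // Vertex texture fetch of the blend weights; interpolated to the pixel shader.
    o.Weights  = tex2Dlod(WeightSampler, float4(tileUV, 0, 0));
    return o;
}
```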
"do I really need 8 bits of precision for a texture weight?"
Do you plan on having smooth transitions between textures with vertex granularity (e.g. a vertex that is 50/50 between two textures)?
N.B. with your 2-bit idea you'd basically have to perform the weight-texture lookup in the vertex shader, with nearest-neighbour sampling, because performing the fetch at the pixel level with linear filtering will destroy your packed data.
"blending more than three textures would mean passing multiple blend maps to the shader"
Instead of R being the weight for tex0, G the weight for tex1, and so on, you can also encode the texture index in the texture, i.e. store four weights and four indices.
Revisiting your packing idea, you could store four 4-bit weights and four 4-bit indices, which allows you to transition between 16 textures.
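A sketch of what the unpack side of that packing could look like, assuming an RGBA8 map sampled with point filtering and the 16 terrain textures laid out in a 4x4 atlas (the layout and names are assumptions, and mip-related atlas seams are ignored for brevity):

```hlsl
// Each channel of the packed map holds a 4-bit weight (high nibble) and a
// 4-bit texture index (low nibble). The map must be point-filtered, otherwise
// filtering mangles the packed bits.
sampler2D PackedSampler : register(s0);   // point-filtered packed weights/indices
sampler2D AtlasSampler  : register(s1);   // 16 terrain textures in a 4x4 atlas

float2 AtlasUV(float index, float2 detailUV)
{
    float col = fmod(index, 4.0);
    float row = floor(index / 4.0);
    return (float2(col, row) + frac(detailUV)) * 0.25;   // map into one atlas cell
}

float4 PackedBlendPS(float2 tileUV : TEXCOORD0, float2 detailUV : TEXCOORD1) : COLOR0
{
    float4 bytes   = floor(tex2D(PackedSampler, tileUV) * 255.0 + 0.5);
    float4 indices = fmod(bytes, 16.0);            // low nibble: which texture
    float4 weights = floor(bytes / 16.0) / 15.0;   // high nibble: its weight
    weights /= max(dot(weights, float4(1, 1, 1, 1)), 0.0001);

    return tex2D(AtlasSampler, AtlasUV(indices.x, detailUV)) * weights.x
         + tex2D(AtlasSampler, AtlasUV(indices.y, detailUV)) * weights.y
         + tex2D(AtlasSampler, AtlasUV(indices.z, detailUV)) * weights.z
         + tex2D(AtlasSampler, AtlasUV(indices.w, detailUV)) * weights.w;
}
```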

Thanks for the reply. I find it difficult to restrict myself to only 3-4 textures, but maybe I'm looking at the problem the wrong way; I currently have little in the way of "rocks", "boulders", etc. to make the otherwise uniform terrain interesting.


Thanks for the detailed reply.

In terms of 24-bit textures, it's a complex situation: the default for SpriteBatch is premultiplied alpha, which essentially robs me of the alpha channel, and because I am sampling the texture weights in the vertex shader I cannot use COLOR semantics but must use VECTOR4, and therefore cannot instruct the sprite batch to render as non-premultiplied. (As you've guessed, I am loading my texture map at runtime and rendering to a Texture2D rather than using the content pipeline, essentially to keep the asset size low.)

I agree that sampling in the vertex shader can be done with nearest-neighbour filtering; I already do this with my heightmap, and it's where I currently sample my texture weights before passing them on to the pixel shader. I do use blends of varying weights, but rarely more than two textures per voxel. I understand your idea of encoding index and weight in the texture and I'll experiment with it.

Have you used hardware instancing to combine fixed and variable geometry into a single shader call? It seems to me that instead of mucking about with multiple samples in the vertex shader, I might just load the weightings up as a second VertexBuffer (with the geometry being constant between tiles).

Thanks again for your interesting reply.
Hmm, is your main concern texture bandwidth? If textures worry you more than instructions, you could pass in multiple blend textures at a lower resolution, then interpolate them up to full-size blend maps on the GPU. Another possibility comes from the unlikelihood of blending 16 textures in a single triangle: you could simply pass in only the four textures required, or 16 bits indicating which 4 of the 16 textures you are blending in this pass. It would, however, make creating blend maps a pain.

I definitely suggest mixing it up to make your terrain less repetitive. A few possibilities:

  • For simple self-similar textures like rock you can blend the same texture with itself at multiple scales (see the sketch after this list).
  • Add details like rocks etc., which you can place manually, based on a position hash, and so on.
  • Create two versions of the same texture with minor differences. Put mask values in the alpha channel so your shader can mask a whole region in or out as desired, perhaps based on a position hash.
  • Create two similar textures and blend them together in your editor with Perlin/Simplex noise.
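A rough sketch of the first and third suggestions, with an arbitrary hash function and made-up scales and thresholds:

```hlsl
sampler2D RockSampler        : register(s0);
sampler2D RockVariantSampler : register(s1);   // same texture with minor differences

float Hash2D(float2 p)
{
    // Cheap pseudo-random value derived from world position.
    return frac(sin(dot(p, float2(12.9898, 78.233))) * 43758.5453);
}

float4 DetailPS(float2 worldXZ : TEXCOORD0) : COLOR0
{
    // Blending the same texture at two very different scales hides the tiling pattern.
    float4 nearRock = tex2D(RockSampler, worldXZ * 0.5);
    float4 farRock  = tex2D(RockSampler, worldXZ * 0.031);
    float4 rock     = lerp(nearRock, farRock, 0.5);

    // Swap in the variant texture over large, irregular patches.
    float  patch   = Hash2D(floor(worldXZ * 0.05));
    float4 variant = tex2D(RockVariantSampler, worldXZ * 0.5);
    return lerp(rock, variant, step(0.5, patch));
}
```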

I'm currently working on terrain texture blending too, although currently with a focus on quality rather than repetitiveness. Feel free to drop by:
http://www.gamedev.net/topic/628591-shader-limitations-and-best-practices/
I use a technique described in "Large-Scale Terrain Rendering for Outdoor Games" in GPU Pro 2. It uses a single-channel tile index map that is sampled in the pixel shader (with linear filtering), and the two textures to blend between are determined from that value.

http://mtnphil.wordp...terrain-engine/

The big downside is that a particular tile texture always needs to appear next to a specific other one, i.e. there is a distinct ordering. In practice this turns out not to be a huge deal, but it might be for your scenario.
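For reference, a sketch of how such a single-channel index map can drive the blend; the 4x4 atlas, the 0-15 index range and the function names are assumptions rather than details taken from the chapter or the linked article:

```hlsl
// The index map stores index/15 in one channel and is sampled with LINEAR
// filtering, so between two texels the value slides smoothly from one index
// to the next; floor/ceil recover the two textures and frac gives the blend
// weight. This is why adjacent indices must be textures that blend well.
sampler2D IndexSampler : register(s0);   // single channel, linear filtered
sampler2D AtlasSampler : register(s1);   // 16 tile textures in a 4x4 atlas

float2 AtlasUV(float index, float2 detailUV)
{
    return (float2(fmod(index, 4.0), floor(index / 4.0)) + frac(detailUV)) * 0.25;
}

float4 IndexBlendPS(float2 tileUV : TEXCOORD0, float2 detailUV : TEXCOORD1) : COLOR0
{
    float scaled = tex2D(IndexSampler, tileUV).r * 15.0;   // fractional between tiles
    float loIdx  = floor(scaled);
    float hiIdx  = min(loIdx + 1.0, 15.0);
    float blend  = frac(scaled);

    return lerp(tex2D(AtlasSampler, AtlasUV(loIdx, detailUV)),
                tex2D(AtlasSampler, AtlasUV(hiIdx, detailUV)),
                blend);
}
```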

(attached image: terraintexture.jpg)
Thanks for all the suggestions.

My "take home" message from this is;
1) The texture map can be a different size from my Heightmap texture, having more or less voxels, and use LERP in the VertexShader or PixelShader to get a set of blend weights.
2) I can slice+dice the actual colors in the texture map to mean anything I want - there is no need to consider that each channel contains only one texture's worth of blend information - I can use all 24 bits if I want in any combination, bearing in mind the complexities in LERPing the result.
3) Four textures per tile as base textures seems to be a reasonable limit, but for extra "realism" I need to look beyond bump mapped textures and put in some more geometry - rocks, etc. based on a frustum clipping or other tree to limit their number.

Thanks all. I'll look into the recommended links - I'm not building a game, I just love experimenting with real-time landscaping techniques as a hobby.

Phil

