ddengster

DX11 Triangle strip texturing UV distortion



Reference to a similar problem:

http://stackoverflow.com/questions/19211420/mapping-uv-to-a-bulged-triangle-strip

 

So I was working on ribbon effects and needed functionality to manipulate the extrusion values of ribbon points. I then ran into a problem similar to the one in the link above.

Instead of the texture scaling down toward the smaller edge, I get a zigzag texture pattern, which is not what I want. Thinking it was an interpolation problem, I tried HLSL keywords like 'noperspective' on my texture coordinates, but to no avail.
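For reference, the modifier was applied to the vertex-shader output roughly like this (the struct and semantic names here are only illustrative, not my actual code):

struct VSOutput
{
    float4 pos : SV_Position;
    // interpolation modifier on the texcoord; it did not fix the zigzag
    noperspective float2 uv : TEXCOORD0;
};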

 

Does anyone know a way to get the correct interpolation? Do I need to switch to quad strips (assuming DX11 even provides them) or something else?


This happens a lot; the mistake you made was thinking that the image would change the way it does in 2D software.

[Image: Edy6SYJ.png — comparison grid showing Error, Result, Poly-count, and UV for each approach]

 

'Error' shows all the mistakes made. The thing to remember is that UVs are interpolated per triangle even if you model with quads, so scaling one side is the same as moving a single vertex of that side around.

'Result' shows how a fixed outcome looks, 'Poly-count' shows how many polygons it takes to get that result, and 'UV' shows how it is UV mapped.

 

The first row is made by dividing the plane into as many polygons as there are pixels. This example is a 512*128 texture, so 65,536 polygons. Because every pixel is mapped to its own polygon, you can now change it as if you were using 2D pixel software instead of vectors.

 

The second row is smarter mapping. It is almost the same as above: I used some self-made code that lays a grid over the texture and lets me scale it, then used Blender's Decimate modifier to lower the poly count.

The key, as you will notice on the UV, is that the larger side has more polygons on the UV map. Also, if you zoom in on the mesh you will notice the same error; it is just smaller, not gone.

 

Last is how it's done in most games. The polygon does not change; instead, the image drawn on the polygon is altered. Perspective correction does the same thing: no change is made to the mesh, only to the texture.

The math for doing this in real time is difficult; it would be easier just to draw the laser with the ends already in the texture, use animations, or use a shader that turns a single-color mesh into a laser.
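For what it's worth, one well-known way to get that "alter the texture, not the mesh" behaviour on a trapezoid-shaped strip segment is projective (homogeneous) texture coordinates: scale each vertex's UV by the strip width at that vertex, pass that width along, and divide per pixel. This is only a sketch of the general technique, and every name in it is illustrative:

struct VSOutput
{
    float4 pos : SV_Position;
    float3 uvq : TEXCOORD0;   // (u * q, v * q, q), where q = strip width at this vertex
};

// In the vertex shader: output.uvq = float3(input.uv * localWidth, localWidth);

Texture2D    RibbonTexture : register(t0);
SamplerState LinearClamp   : register(s0);

float4 PSMain(VSOutput input) : SV_Target
{
    // The per-pixel divide turns the linear per-triangle interpolation into a
    // projective mapping, so the texture tapers smoothly across both triangles
    // instead of kinking along the shared diagonal.
    float2 uv = input.uvq.xy / input.uvq.z;
    return RibbonTexture.Sample(LinearClamp, uv);
}

This is essentially what perspective-correct interpolation already does with the position's w, just driven by the strip width instead.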

 

 

Altering a mesh after it has been UV mapped will always cause artifacts; there is no easy way around this.

Edited by Scouting Ninja


Thanks, I figured out a solution similar to the last method: I remap UV coordinates based on the extrusion values instead of changing vertex positions (hence the quads are all the same size).

 

The math involves remapping the original 'y' texture coordinate in the range [0, 1] to [x, y] and discarding pixels that end up outside [0, 1]. x and y are determined by the percentage of the extrusion value you provide. For example, a 50% downsize of the extrusion value places 0 and 1 at the first-quarter and third-quarter points of the line respectively, giving x = -0.5 and y = 1.5 and effectively doubling the range.
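A minimal HLSL pixel-shader sketch of that remap (the texture, sampler, and parameter names are placeholders rather than the real code):

Texture2D    RibbonTexture : register(t0);
SamplerState LinearClamp   : register(s0);

float4 PSMain(float4 pos       : SV_Position,
              float2 uv        : TEXCOORD0,
              float  extrusion : TEXCOORD1) : SV_Target   // e.g. 0.5 for a 50% downsize
{
    // Expand v about the center line: for extrusion = 0.5 this maps
    // [0.25, 0.75] onto [0, 1], i.e. v' = -0.5 + 2 * v.
    float v = (uv.y - 0.5f) / extrusion + 0.5f;

    // Discard pixels whose remapped v falls outside [0, 1], which narrows the
    // visible ribbon without moving any vertices.
    clip(float2(v, 1.0f - v));

    return RibbonTexture.Sample(LinearClamp, float2(uv.x, v));
}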
