Generating screen-aligned texture coordinates

I'm a bit puzzled by this. I have a ground-plane polygon that I draw as follows:

const float Extent = 128.0f;
// Attribute 0 aliases the fixed-function vertex position, so each call emits one vertex.
glBegin(GL_QUADS);
glVertexAttrib4f(0, -Extent, 0,  Extent, 1);
glVertexAttrib4f(0,  Extent, 0,  Extent, 1);
glVertexAttrib4f(0,  Extent, 0, -Extent, 1);
glVertexAttrib4f(0, -Extent, 0, -Extent, 1);
glEnd();

It's drawn with the following vertex shader, which aims to generate screen-aligned texture coordinates in the range [0, 1]:

float2 pack(float2 In)
{
    // Remap from [-1, 1] to [0, 1]
    return (In + 1.0) / 2.0;
}

float4 main(in float4 Position : ATTR0,
            uniform float4x4 WorldMatrix,
            uniform float4x4 WorldViewProjectionMatrix,
            out float2 oScreenSpaceTexCoord : TEXCOORD3) : HPOS
{
    // Transform to clip space, then perspective-divide and remap xy to [0, 1]
    float4 ProjectedCoord = mul(WorldViewProjectionMatrix, Position);
    oScreenSpaceTexCoord = pack(ProjectedCoord.xy / ProjectedCoord.ww);
    return ProjectedCoord;
}

and using the following visualization fragment shader:

float4 main(in float2 ScreenSpaceTexCoord : TEXCOORD3) : COLOR
{
    return float4(ScreenSpaceTexCoord, 0, 1);
}

The results tend to be in the range [0, 1], but they're messed up and they change depending on the camera. That is, the pixel at screen coordinates (x, y) can end up with a different texcoord depending on the camera position, which seems wrong: a pixel's texcoord should depend only on its normalized device coordinates and, by extension, its screen coordinates. Any ideas?
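
For comparison, below is a minimal sketch of a fragment shader whose UV by construction depends only on the pixel: it reads the window-space position through Cg's WPOS semantic and divides by the viewport size. The ViewportSize uniform is assumed for illustration and isn't part of my current setup.

// Sketch only: derive the UV from the window-space fragment position
// rather than from an interpolated vertex output.
// WPOS is Cg's window-position fragment semantic; ViewportSize is an
// assumed uniform holding the render target size in pixels.
float4 main(in float2 WindowPos : WPOS,
            uniform float2 ViewportSize) : COLOR
{
    float2 ScreenSpaceTexCoord = WindowPos / ViewportSize;
    return float4(ScreenSpaceTexCoord, 0, 1);
}

If that variant produces stable per-pixel coordinates while the vertex-shader version doesn't, the problem is presumably in how the vertex output is generated or interpolated rather than in the visualization.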
