Archived

This topic is now archived and is closed to further replies.

MigPosada

Interpolating Vectors (from Vertex to Pixel Shaders)


Still trying to figure out how to implement good-quality "per-pixel" lighting even with a low polycount. I'm sure my problems come from the interpolation of the vectors across the triangle (for example, the light direction). It's not just denormalization (I'm renormalizing in the pixel shader). To fix those problems, I want to ask a few things:

1. Is there a difference between passing vectors as texture coordinates or as color values?
2. Which method is best for which case?
3. Are there any limitations on the vectors I can interpolate?

What I already know:

- If passing the vectors as color values, they are normalized and scaled to the 0.0 - 1.0 range.
- Gouraud shading must be used.
- Be careful with the wrapping of texture coordinates.
- Vectors should be renormalized in the pixel shader (normalization cube map in play).

Any thoughts about this topic are highly appreciated.

Computer Programming is Magic! Hold the Power!

[edited by - MigPosada on March 18, 2004 4:20:32 PM]
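The denormalization in question is easy to reproduce off-GPU. A minimal Python sketch (hypothetical vectors; this just mimics what the interpolator does between two vertices):

```python
# Sketch: linearly interpolating two unit vectors denormalizes the result,
# which is what the rasterizer's interpolators do between vertices.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lerp(a, b, t):
    return tuple((1 - t) * x + t * y for x, y in zip(a, b))

# Two unit-length light directions at adjacent vertices.
l0 = normalize((1.0, 0.0, 0.0))
l1 = normalize((0.0, 1.0, 0.0))

mid = lerp(l0, l1, 0.5)  # what the pixel shader receives halfway across
length = math.sqrt(sum(c * c for c in mid))
print(round(length, 4))  # 0.7071, noticeably shorter than 1
```

Renormalizing restores the length, but not a direction that was already bent before interpolation.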

Color values are clamped to [0,1] and low precision; texture-coordinate values are arbitrary floating point.

You want to use Phong or Blinn shading, not Gouraud (per-vertex Gouraud evaluation is exactly what you want to avoid).

Renormalize vectors with a normalization cube map of adequate resolution. 16x16 (or 32x32) per face is good; much lower and you will get quantization artifacts.

The Cg toolkit has an example.
If you use asm, just compile the Cg code with cgc.exe and read the asm output.
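The precision gap between the two interpolator paths can be sketched numerically. Assuming 8-bit colour channels (typical for the hardware of the time), biasing a direction into [0,1] and quantizing costs a visible amount of accuracy:

```python
# Sketch: round-tripping a direction through an 8-bit colour interpolator
# (0.5*v + 0.5 scale-and-bias, quantized to 256 levels), versus a float
# texture coordinate, which would carry it unmodified.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def to_color8(v):
    # bias into [0,1], then quantize as an 8-bit colour channel would
    return tuple(round((0.5 * c + 0.5) * 255) / 255 for c in v)

def from_color8(c):
    return tuple(2.0 * x - 1.0 for x in c)

l = normalize((0.3, 0.59, 0.74))          # hypothetical light direction
err = max(abs(a - b) for a, b in zip(l, from_color8(to_color8(l))))
print(err)  # a few thousandths: the banding the colour path adds
```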

But for pre-2.0 shaders, texture coordinates are clamped too, right?

I got very good results with the vertex's world position as the output of the vertex shader instead of the light direction (now computed on the pixel-shader side). But this only works right with ps 2.0.

And I was using a 256x256 normalization cube map, hehe.

Last question: for the cube map, nearest-point or linear filtering?
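The world-position trick works because position, unlike a pre-normalized direction, interpolates exactly linearly across the triangle. A CPU-side sketch with hypothetical values:

```python
# Sketch: interpolate the world position (linear, hence exact) and derive
# the light direction per pixel, instead of interpolating a direction.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

light_pos = (0.0, 10.0, 0.0)                     # hypothetical point light
p0, p1 = (-5.0, 0.0, 0.0), (5.0, 0.0, 0.0)       # two vertex positions

# The interpolated position at the pixel (t = 0.5 here) is exact:
pix = tuple(0.5 * a + 0.5 * b for a, b in zip(p0, p1))

# Per-pixel light direction, then a single normalize (ps 2.0 arithmetic):
l = normalize(tuple(lp - pp for lp, pp in zip(light_pos, pix)))
print(l)  # (0.0, 1.0, 0.0): exact, no interpolation error
```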

quote:


typedef enum _D3DSHADEMODE {
    D3DSHADE_FLAT = 1,
    D3DSHADE_GOURAUD = 2,
    D3DSHADE_PHONG = 3,
    D3DSHADE_FORCE_DWORD = 0x7fffffff
} D3DSHADEMODE;

Constants

.......

D3DSHADE_PHONG
Not supported.

And the 256x256 cube map is nicer (with any filtering type)
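For a rough feel of how face resolution affects a normalization cube map, here is a simplified nearest-texel lookup on the +Z face only (filtering and the 8-bit quantization of stored texels are ignored; numbers are illustrative):

```python
# Sketch: angular error of a normalization cube map at two face resolutions.
# Simplified: snap the direction to the centre of the texel it hits on the
# +Z face, renormalize, and measure the angle to the true direction.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cube_lookup(v, size):
    # assume the vector hits the +Z face: face coords are x/z, y/z in [-1,1]
    snap = lambda u: ((math.floor((u * 0.5 + 0.5) * size) + 0.5) / size) * 2 - 1
    return normalize((snap(v[0] / v[2]), snap(v[1] / v[2]), 1.0))

def angle_deg(a, b):
    d = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(d))

v = normalize((0.31, 0.42, 0.85))        # hypothetical direction
err16 = angle_deg(v, cube_lookup(v, 16))
err256 = angle_deg(v, cube_lookup(v, 256))
print(err16, err256)  # a few degrees vs a fraction of a degree
```

Real cube maps also quantize the stored texels to 8 bits per channel, which this sketch ignores; bilinear filtering would reduce the error further.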

quote:
Original post by MigPosada
But for pre-2.0 shaders, texture coordinates are clamped too, right?


No, not if you sample directly from them. Yes, if you use them as colours or other vectors; then you use texcoord passthrough, which clamps them per pixel. Still much better than the per-vertex clamping of the colour interpolators.
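The order-of-operations point can be shown with one number; a minimal sketch, assuming a signed value at the two vertices:

```python
# Sketch: clamping per vertex and then interpolating (colour interpolator)
# versus interpolating and then clamping per pixel (texcoord passthrough).
def clamp01(x):
    return min(1.0, max(0.0, x))

a, b, t = -1.0, 1.0, 0.5   # signed vertex values, halfway across the edge

per_vertex = (1 - t) * clamp01(a) + t * clamp01(b)  # clamp first: wrong
per_pixel = clamp01((1 - t) * a + t * b)            # clamp last: right

print(per_vertex, per_pixel)  # 0.5 vs 0.0; only the second is correct
```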




If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

davepermen.net

Mission Accomplished!

What I was doing:

- VS: Normalized light vector as color output (0.5 * LightDir + 0.5).
- PS: Get light vector (2.0 * Color - 1.0) and renormalize it with the cube map.

What I do now:

- VS: Non-normalized light vector as texcoord output. (If I normalize it, the artifacts come back).
- PS: Get light vector directly and normalize it with the cube map.
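A CPU-side sketch of why this works (hypothetical positions): interpolating the unnormalized vector light_pos - vertex_pos is exact, while pre-normalizing at the vertices bends the interpolated direction in a way no renormalization can undo:

```python
# Sketch: interpolating an UNnormalized light vector is exact (it is just
# light_pos minus the linearly interpolated position); pre-normalizing at
# the vertices bends the direction before the pixel shader ever sees it.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lerp(a, b, t):
    return tuple((1 - t) * x + t * y for x, y in zip(a, b))

light = (0.0, 5.0, 0.0)                          # hypothetical point light
v0, v1 = (-10.0, 0.0, 0.0), (1.0, 0.0, 0.0)      # two vertex positions

d0 = tuple(l - p for l, p in zip(light, v0))     # (10.0, 5.0, 0.0)
d1 = tuple(l - p for l, p in zip(light, v1))     # (-1.0, 5.0, 0.0)

t = 0.5
exact = normalize(tuple(l - p for l, p in zip(light, lerp(v0, v1, t))))
via_raw = normalize(lerp(d0, d1, t))                        # matches exact
via_pre = normalize(lerp(normalize(d0), normalize(d1), t))  # visibly off

print(exact, via_raw, via_pre)
```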

In this image the color output of the PS is the light vector:

http://www.sicosys.org/~migposada/vectornightmare.jpg


--------------------------------------------------------------

Computer Programming is Magic! Hold the Power!
