Simple HLSL cubemap reflection


I recently started to play around with HLSL shaders. I'm currently trying to make really simple water, with bump mapping and cubemap reflection. My problem is with the actual cubemap reflection. The code is pretty simple, and for once I understand it all. However, it seems to produce strange results. I find it quite annoying because nobody else seems to have problems with this and I do, and I just can't pinpoint the source. So I wondered if any of you pros could help me ;).

I've put up two screenshots to show the results. I apply the shader to a simple square made from 4 vertices with their normals pointing upward (0, 1, 0).

This screenshot shows how the cubemap projection on the plane messes up when the camera sits at particular angles: http://img21.imageshack.us/img21/6876/screen1tso.th.jpg

The second one shows the main problem: something seems wrong with the coordinate interpolation between the opposite corners. There is clearly a seam between the two triangles (easier to see when it's moving): http://img21.imageshack.us/img21/238/screen2uhk.th.jpg

I've put the HLSL code below; you'll see that it's quite simple.
float4x4 WorldViewProj : WORLDVIEWPROJ;
float4x4 World : WORLD;
float ElapsedTime;
float3 Eye;

texture ReflectionTexture;

samplerCUBE ReflectionSampler : TEXUNIT3 = sampler_state
{
	Texture = (ReflectionTexture);
	MIPFILTER = LINEAR;
	MINFILTER = LINEAR;
	MAGFILTER = LINEAR;
};


struct VS_IN
{
	float4 Position : POSITION;
	float3 Normal : NORMAL;
	float2 Texture : TEXCOORD0;
};

struct VS_OUT
{
	float4 Position : POSITION;
	float3 Texture0 : TEXCOORD0;
};

struct PS_OUT
{
	float4 Color : COLOR0;
};

VS_OUT vs_func(in VS_IN In)
{
	VS_OUT Out;

	Out.Position = mul(In.Position, WorldViewProj);

	float3 WorldPos = mul(In.Position, World).xyz;

	float3 WEye = mul(World, Eye);

	float3 V = normalize(WorldPos - Eye);

	float3 Normal = normalize(mul(World, In.Normal));
	
	Out.Texture0 = reflect(V, Normal);

	return Out;
}

PS_OUT ps_func(in VS_OUT In)
{
	PS_OUT Out;
	
	float4 color = texCUBE(ReflectionSampler, In.Texture0);

	Out.Color = color;
	
	return Out;
}

technique Technique0
{
	pass p0
	{
		FogEnable = FALSE;
		Lighting = FALSE;

		Sampler[0]= (ReflectionSampler);
		
		VertexShader = compile vs_1_1 vs_func();
		PixelShader  = compile ps_2_0 ps_func();
	}
}

Basically, I compute the view direction by subtracting the eye position (camera position) from the vertex position, both in world space. I use this view direction along with the normal (which, I'll recall, is always (0, 1, 0)) to get a reflection vector, which is then used as my 3D coordinates for the cubemap lookup. So, any idea why that wouldn't work?

P.S.: The cubemap has nothing to do with the skybox, that's normal :P

[Edited by - GLForce on April 17, 2009 4:54:05 PM]

You're doing mul(World, Normal) which is actually transforming your normal by the transpose of your world matrix. Try swapping those two.
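
A minimal sketch of the two argument orders under the usual D3D row-vector convention (illustrative only; the names a and b are hypothetical, not from the post above):

float3 a = mul(float4(In.Normal, 0.0f), World).xyz;   // row vector * matrix: transforms by World
float3 b = mul(World, float4(In.Normal, 0.0f)).xyz;   // matrix * column vector: transforms by transpose(World)
// a == b only when World is symmetric (e.g. the identity), which is why swapping
// the arguments can appear to change nothing on an untransformed mesh.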

Also, if you want to post code here you can use the "code" and "source" tags. See the FAQ.

Well, I forgot to say, but for those kinds of mul operations I've tried every possible way of doing them: this way, the other way you proposed, with an implicit float4, with the explicit way (mul(float4(In.Normal, 1), World)), etc. I'll try anyway, who knows. Thanks for the reply. Oh, and thanks for the code tag; I should have checked the FAQ more.

EDIT: Tried it. Same result. I really wonder, because none of these changes makes any difference; that's odd.

Hi,

Here are a few mistakes I noticed:

1 - Why are you multiplying your eye position vector by the world matrix? Is that vector in local space? I don't think so, so you shouldn't do it.

2 - Why are you multiplying your normal vector by the world matrix like this:
float3 Normal = normalize (mul (World, In.Normal));

HLSL matrix structures are not column-major (you're not using Cg, CgFX or GLSL), they are row-major. So the first parameter must be the vector, and the second one must be the matrix. Also, your normal vector isn't a float4, it's a float3. One more thing: it would be better to use the inverse transpose of the world matrix instead of the world matrix itself. So that line should be changed to this:

float3 Normal = normalize (mul (float4(In.Normal, 0), WorldInvTranspose));
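
Taken together with point 1, the relevant vertex-shader lines would then look roughly like this (a sketch only; it assumes the application supplies a WorldInvTranspose constant and that Eye is already in world space):

float3 WorldPos = mul(In.Position, World).xyz;
float3 Normal   = normalize(mul(float4(In.Normal, 0.0f), WorldInvTranspose).xyz); // w = 0 so translation does not affect the normal
float3 V        = normalize(WorldPos - Eye);   // Eye used directly, no world transform
Out.Texture0    = reflect(V, Normal);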

The rest of your code seems correct.
Hope this helps.

Regards,
Rohat.

Quote:
Original post by GLForce
Well, I forgot to say, but for those kinds of mul operations I've tried every possible way of doing them: this way, the other way you proposed, with an implicit float4, with the explicit way (mul(float4(In.Normal, 1), World)), etc. I'll try anyway, who knows. Thanks for the reply. Oh, and thanks for the code tag; I should have checked the FAQ more.

EDIT: Tried it. Same result. I really wonder, because none of these changes makes any difference; that's odd.
That is most likely because your world matrix is an identity matrix, i.e. the object is at the origin with no rotation applied. In that case the transpose of the matrix is the same matrix, making it look like nothing changed. Get in the habit of always putting the point being transformed first and the transform second - unless you explicitly want to multiply by the transpose!

As others have pointed out, the view position is already in world space, so you should not transform it. If your world matrix is identity (which I think it is) then making this change won't appear to do anything either, but you still need to make it for when you have a non-identity world matrix.

I can't really see problem #2 in the image - it's showing up really small for some reason. Try changing your mul statements and the view calculation, then repost your screens and code and we'll take it from there.

Here's the updated code. Not much has changed. For the eye position, I had already made the change; I did wonder why it had to be multiplied by the world matrix. I probably misunderstood the sample I was following. As for the results, still no change. I'm posting the second picture again (it was actually a thumbnail and not the picture itself), even if it still doesn't show well what I want to show.


float4x4 WorldViewProj : WORLDVIEWPROJ;
float4x4 World : WORLD;
float4x4 WorldIT : WORLDIT;
float3 Eye;

texture ReflectionTexture;

samplerCUBE ReflectionSampler : TEXUNIT3 = sampler_state
{
    Texture = (ReflectionTexture);
    MIPFILTER = LINEAR;
    MINFILTER = LINEAR;
    MAGFILTER = LINEAR;
};


struct VS_IN
{
    float4 Position : POSITION;
    float3 Normal : NORMAL;
    float2 Texture : TEXCOORD0;
};

struct VS_OUT
{
    float4 Position : POSITION;
    float3 Texture0 : TEXCOORD0;
};

struct PS_OUT
{
    float4 Color : COLOR0;
};

VS_OUT vs_func(in VS_IN In)
{
    VS_OUT Out;

    Out.Position = mul(In.Position, WorldViewProj);

    float3 WorldPos = mul(In.Position, World).xyz;

    float3 V = normalize(WorldPos - Eye);

    float3 Normal = normalize(mul(float4(In.Normal, 0), WorldIT).xyz);

    Out.Texture0 = reflect(V, Normal);

    return Out;
}

PS_OUT ps_func(in VS_OUT In)
{
    PS_OUT Out;

    float4 color = texCUBE(ReflectionSampler, In.Texture0);

    Out.Color = color;

    return Out;
}

technique Technique0
{
    pass p0
    {
        FogEnable = FALSE;
        Lighting = FALSE;

        Sampler[0] = (ReflectionSampler);

        VertexShader = compile vs_1_1 vs_func();
        PixelShader  = compile ps_2_0 ps_func();
    }
}



Pic 2
http://img21.imageshack.us/img21/238/screen2uhk.jpg

I seriously think, after seeing it in movement, that the problem comes from the geometry. However, the thing is quite simple: as I said, it's a simple quad going from -10 to 10 on both the x and z axes, positioned at y = 0, and with an upward normal (0, 1, 0). But from the look of it - and I have a hard time explaining this because English isn't my first language and the idea itself is quite complex - the interpolation seems wrong. I know it's hard to see in the screenshot, but try to visualize the polygon separation line, going from the lower-left corner to the upper-right, and notice that on each side of that line the cubemap projection takes a different angle (again, hard to explain). I'll try with a little MS Paint schema.

http://img407.imageshack.us/img407/4994/shemam.jpg

Sorry, I did my best.

Oh and thanks again for all your time and help.

I thought about this one too. The thing is, backface culling is activated, so if any of the polygons were not wound correctly, they would not be rendered. However, I'll follow your idea and simply try winding it in different ways, but always in the right direction. Logically, it shouldn't make any difference, but you know, sometimes... And I finally tested my shader in an existing engine (Virtools), and, as I thought, the shader is all right. It worked on a plane, a cube, even a teapot. So my conclusion is that the geometry really is the problem. That's why I'll try to play with the winding. Thanks!

Are you certain that the cube map is properly loaded as a cube map, and not as a 2D texture? Check which D3DX function you are using to load the map and ensure it is the 'Cube' version. I have made this mistake in the past, and D3D9 will happily continue on with the 2D interface bound as a cube texture!

You might also try to rotate your quad along the z axis to see if any of the other faces of the cube map show up - this could also indicate that the cube map interface is somehow incorrect.

Ahhh, very good point. And I didn't even think about it. I've implemented a resource manager in my engine responsible for loading stuff like meshes, textures, etc. And guess what, I just didn't do anything about cube textures. As a result, I was using the same function to load 2D textures and cube textures. I can't wait to try this as soon as I can (right now I'm supposed to be studying for my final exams, what a pain in the...). Thanks a lot, again; you give me confidence in humankind :P

By the way, I've looked quickly at your articles and the online book, really interesting. I'll make sure to take a deeper look at them (fav'd).

Yeah, I'll look for that too. The cubemap stuff solved one of my problems (one I didn't clearly mention), but the one you just pointed out is still present. Thanks for the tip.

EDIT: Well, I've done some searching and it seems that perspective correction is not a subject that gets talked about much. Could you help me? Send me a few pointers maybe? (No bad C++ jokes please :P)

I've managed to record a little video showing what's wrong. At first, I'm at position (0, 0, 0) and everything seems fine. But watch as I move further from the plane: the deformation appears. And then, though I don't think it's visible in the video, when I'm really far away it seems right again. See for yourself.

Quote:
Original post by GLForce
I've managed to record a little video showing what's wrong. At first, I'm at position (0, 0, 0) and everything seems fine. But watch as I move further from the plane: the deformation appears. And then, though I don't think it's visible in the video, when I'm really far away it seems right again. See for yourself.

Such a tough nut to crack :)

I think at this point you should take a frame snapshot with PIX and view what your geometry looks like before and after the vertex shader. This will show you if anything looks out of the ordinary. Since you already tested the shader elsewhere and it worked, the only possible sources of error are the geometry, texture or perhaps a sampler error. I'm looking forward to seeing this thing work!

P.S. Thanks for the diggs on the book and articles - if you have any comments or questions feel free to PM me and we can talk about it!

Well, I'll try this as soon as I can. The thing is, "soon" may come late. I'm still on VC++ 6, which means I've never been able to use an SDK newer than Summer 2003 (ouch), which means no PIX. I've got to kick my butt and finally install VC++ 2008 and a decent version of the DirectX 9 SDK. Thanks again.

Hi,
I said "it looks like" there is no perspective correction. I thought that Texture0 was being interpolated without perspective correction. Taking a closer look at your picture made me realize that the perspective correction is there. As pointed out by Jason Z, the problem seems to be in your main application. I suspect there is something wrong with the normal vectors. As you know, you can check that quickly in your shader by setting the color equal to the normal vector.
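
For instance, a quick sketch of that check (debug-only, hypothetical changes), reusing the existing TEXCOORD0 output to carry the normal instead of the reflection vector:

// Vertex shader: temporarily route the world-space normal through Texture0.
Out.Texture0 = Normal;

// Pixel shader: output it as a color; a correct upward normal should render as pure green (0, 1, 0).
Out.Color = float4(In.Texture0, 1.0f);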

That's another thing I tested before. In fact, I used colors a lot to test many different vectors. The normal vectors are right (the shape appears green, which means (0, 1, 0)), so normals are not the problem. I've now managed to switch my project to VC++ 2008 and installed the November 2008 SDK, so PIX is mine. I'll try it and see what it gives me.

I'm asking a lot from you, but does anyone think they could help me draw a conclusion from this PIX run file? I mean, I guess everything I see is right, but how would I know? I'm particularly confused by the 4-float coordinates (what is the w for?). So I've uploaded the trace file and put up two screenshots in case you can't get the file itself.

file :
http://www.2shared.com/file/5422200/310c1d13/cubicreflection.html

preVS:
http://img106.imageshack.us/img106/4861/prevs.jpg

postVS:
http://img106.imageshack.us/img106/7189/postvs.jpg

To make things clear, only Position and Texture0 count in the postVS output. The other texture coordinates are used for other applications (bump mapping and stuff).

Thanks a lot!

Ok, I've got some results now. I tested a little more in the engine I talked about earlier (Virtools) and found that the only reason it worked there was that the primitives I was using had a good number of polygons. When I tested with a simple two-polygon plane like mine, same problem. I found my solution; however, I'm not very satisfied. Is it normal that with two polygons the interpolation is too poor to make a good reflection? Is there a way to make it good enough? Because I was planning to use this on a water plane, and I was happy that I could use a minimum number of polygons. Now I've got to go over 2500 polygons for a 100 by 100 plane, which is kind of too much. And at that count there is still a little deformation, but I just ignored it because it's water (which will be perturbed anyway). Any thoughts? Thanks.

If interpolation is the problem, then you have two options:

- Move some or all of the calculation from the vertex shader to the pixel shader (see the sketch below).

- Subdivide the mesh into smaller polys.

Subdivision is usually the better-performing option, as there should still be far fewer vertices than pixels. However, subdivision doesn't work so well if you can get close enough to the object that you'd need thousands of polys for it to look correct.
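
For the first option, here is a rough sketch of moving the reflection into the pixel shader, reusing the constants and structs from the posted effect and assuming the world-space position and normal are passed down in two TEXCOORD interpolators (vs_perpixel and ps_perpixel are hypothetical names):

struct VS_OUT_PP
{
    float4 Position : POSITION;
    float3 WorldPos : TEXCOORD0;
    float3 Normal   : TEXCOORD1;
};

VS_OUT_PP vs_perpixel(in VS_IN In)
{
    VS_OUT_PP Out;
    Out.Position = mul(In.Position, WorldViewProj);
    Out.WorldPos = mul(In.Position, World).xyz;
    Out.Normal   = mul(float4(In.Normal, 0.0f), WorldIT).xyz;
    return Out;
}

PS_OUT ps_perpixel(in VS_OUT_PP In)
{
    PS_OUT Out;
    // Re-normalize after interpolation, then build the reflection vector per pixel.
    float3 V = normalize(In.WorldPos - Eye);
    float3 R = reflect(V, normalize(In.Normal));
    Out.Color = texCUBE(ReflectionSampler, R);
    return Out;
}

The trade-off is a normalize and a reflect per pixel instead of per vertex, which should still fit comfortably in a ps_2_0 pixel shader.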

I'll go with the multiple vertices. I just figured it could help me add some effects to the water (waves), and even if there is deformation up close, it's explained by the fact that it is water, so... That's better than nothing, I guess. Thanks for all your help everybody, you really are great people. See ya!

Don't give up yet! That's not an interpolation error - after looking at your input geometry, all of it is coplanar in the xz plane. There is no reason that the image should distort the way that you are seeing there. Since it is all coplanar, it should look like a mirror image regardless of where the calculations are performed (i.e. vertex shader or pixel shader). I would suspect that the normal vector transformation that you are doing is somehow incorrect. Are you applying a non-uniform scaling in your world matrix? I usually cast the world matrix to 3x3 then apply the transform if working with vectors - but your way seems like it would be the same effect. Does anyone else use his way?
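
For reference, a sketch of that cast; it should be equivalent to the float4(normal, 0) form as long as the world matrix contains no non-uniform scale:

float3 Normal = normalize(mul(In.Normal, (float3x3)World));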

I haven't had a chance to download the PIX file, but perhaps I can get it tonight after work and take a detailed look. Also, your vertex output structure specifies only the reflected vector, but your screenshot of the output shows two additional texture coordinate sets. Any ideas about why that is?

Yeah, that's what I said in the post where I put those images. The other vectors (using the TEXCOORDx semantics) are for other uses not related to the reflection. Basically they are a light vector and an attenuation factor. I'll try to work on the normal transformation; we'll see. At least now we're back to a shader problem. Really, this problem seems like a ping-pong game to me. :P

EDIT: I've just tried something. Instead of using the transformed normal, I directly used the one from the input. As my transformation matrix, in this particular case, is the identity matrix, it shouldn't make a difference. It gave me the same result, which means either the transformation is not the source of the problem or, on the contrary, that the transformation is not being applied at all (even if it's supposed to be an identity matrix, who knows). Well, on that note, back to my studies. Have a nice day!

Well, I've made some more tests (I know, I should be studying) and here are my conclusions: I transferred the transformed vertex position and the transformed normal to the pixel shader through float3 texcoords, and I do the reflect directly in the PS, which makes it all work correctly. Of course, my first answer to this would be: interpolation problem! But wait a minute: I pass both the position and the normal, which should be interpolated exactly the same way as the precalculated cube coords (like in the original shader). So what could it be? Do the reflected coords need more precision than the position and the normal, and is that precision lost in the transfer? I'm really confused. Of course, this new solution works, but I don't think I like the "do 90% of the calculation in the pixel shader" thing. Any thoughts? Thanks.

Something goofy is going on - from your PIX screenshots you only have four input vertices, but the post transform positions and normal vectors have five unique output vertices! Since the VS is supposed to perform the same on every vertex, I don't understand how you could generate more outputs than inputs... Have you checked the D3D diagnostic output (enabled through the D3D ctrl panel)?

Something isn't right here... I can't download the PIX file either to look closer - do you have a different hosting method?
