Calculating Normals From Displacement

Would it be feasible to cut down on texture memory by using only displacement maps and then calculating the normals in your shader?
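
Something like this is what I have in mind, written here as an untested C++ sketch of what the shader would do (heightAt() and the scale factor are just placeholders, not a real API):

[code]
// Sketch: reconstructing a tangent-space normal from a height map with
// three samples (center, right, up), i.e. forward differences.
#include <cmath>

struct Vec3 { float x, y, z; };

// Placeholder texel fetch with wrap addressing.
float heightAt(const float* heights, int w, int h, int x, int y)
{
    x = (x % w + w) % w;
    y = (y % h + h) % h;
    return heights[y * w + x];
}

Vec3 normalFromHeight(const float* heights, int w, int h,
                      int x, int y, float scale)
{
    float c = heightAt(heights, w, h, x,     y);
    float r = heightAt(heights, w, h, x + 1, y);
    float u = heightAt(heights, w, h, x,     y + 1);

    // The height gradient gives an unnormalized normal (-dh/dx, -dh/dy, 1).
    float dx = (r - c) * scale;
    float dy = (u - c) * scale;
    float len = std::sqrt(dx * dx + dy * dy + 1.0f);
    return { -dx / len, -dy / len, 1.0f / len };
}
[/code]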

You would cut down on texture memory, but you would increase the number of samples needed (three heights are required at a minimum) and the amount of calculation required just to get your normal vector. You would probably be better off converting your normal vectors to a spherical coordinate system and then just converting back to a Cartesian coordinate system after loading the two parameters.

Depending on your requirements, you might even be able to pack both variables into a single component, minimizing the memory and bandwidth required...
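
Roughly like this, as an untested C++ sketch (the [0,1] remapping of the two angles is just one possible storage convention):

[code]
// Sketch: store a unit normal as two spherical angles, reconstruct
// Cartesian (x, y, z) after loading the two parameters.
#include <cmath>

const float PI = 3.14159265358979f;

// Encode: azimuth and inclination, each remapped to [0,1] for storage.
void encodeSpherical(float x, float y, float z, float& u, float& v)
{
    u = (std::atan2(y, x) / PI) * 0.5f + 0.5f; // [-pi,pi] -> [0,1]
    v = std::acos(z) / PI;                     // [0,pi]   -> [0,1]
}

// Decode: back to a Cartesian unit normal.
void decodeSpherical(float u, float v, float& x, float& y, float& z)
{
    float theta = (u * 2.0f - 1.0f) * PI;
    float phi   = v * PI;
    x = std::cos(theta) * std::sin(phi);
    y = std::sin(theta) * std::sin(phi);
    z = std::cos(phi);
}
[/code]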

Do spherical coordinates actually work for normal mapping? Do you have to use point sampling?

I'm not saying it doesn't work, because I haven't tried it, but wouldn't bilinear interpolation cause some odd artifacts? It would take the long route around the sphere if, say, you had coordinate -1 next to coordinate 1.
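
Here's a toy example of what I mean, with the azimuth stored as a [0,1] value:

[code]
// Two texels whose azimuths point in nearly the same direction, but
// sit just across the [0,1] seam, filter to the opposite direction.
#include <cstdio>

int main()
{
    float a = 0.99f; // azimuth just below the seam (~356 degrees)
    float b = 0.01f; // azimuth just above the seam (~4 degrees)

    // Bilinear filtering blends the raw stored values...
    float filtered = 0.5f * (a + b); // = 0.5, i.e. ~180 degrees

    // ...instead of taking the short route across the seam (~0 degrees).
    std::printf("filtered azimuth: %f (expected ~0.0 or ~1.0)\n", filtered);
}
[/code]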

[quote name='Jason Z' timestamp='1326528818' post='4902596']
You would probably be better off converting your normal vectors to a spherical coordinate system and then just converting back to a Cartesian coordinate system after loading the two parameters.
[/quote]

I'm not certain, but I think if you stored spherical coordinates in your texture, filtering would produce incorrect results. I could already store just X and Y for normals if I wanted to cut it down to two channels. I was just curious if it could be cut down to just a single channel while still giving good results.
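
For reference, the two-channel reconstruction I'm talking about is just this (a sketch; the clamp guards against filtered samples drifting slightly off the unit disc):

[code]
// Store only X and Y, reconstruct Z in the shader. Assumes a
// tangent-space normal pointing out of the surface, so Z >= 0.
#include <algorithm>
#include <cmath>

void decodeXY(float x, float y, float& z)
{
    z = std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
}
[/code]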

[quote name='Triangles-PCT' timestamp='1326541785' post='4902624']
Do spherical coordinates actually work for normal mapping? Do you have to use point sampling?

I'm not saying it doesn't work, because I haven't tried it, but wouldn't bilinear interpolation cause some odd artifacts? It would take the long route around the sphere if, say, you had coordinate -1 next to coordinate 1.
[/quote]

That is actually a good point: the interpolation wouldn't be correct in some situations (as you mentioned, since the angle only increases in one direction). However, how often do you have texture data with wildly swinging texels next to one another? In general, I think it would still work as an approximation, even if it isn't an exact one-to-one mapping...

It's possible, but as Jason mentioned, you would need more texture samples in the shader to compute a normal from a height map. You would probably also end up with lower-quality normals, since normal maps are often generated with a wide filtering kernel.

I think that storing normal maps in spherical coordinates could lead to errors, and that storing just X and Y leads to poorer quality.

Has anyone used a spheremap transform method for storing normal maps in two-channel textures?

ex: [url="http://aras-p.info/texts/CompactNormalStorage.html#method04spheremap"]http://aras-p.info/texts/CompactNormalStorage.html#method04spheremap[/url]

To me, it looks like the linear interpolation of texture filtering wouldn't cause errors. It also seems to be more accurate, and the instructions for the transformation are cheaper than spherical coordinates.
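
Transcribed into plain C++ from the shader code in that article ("Method #4: Spheremap Transform"), the encode/decode pair looks roughly like this:

[code]
// Spheremap transform: unit normal <-> two [0,1] channels.
#include <cmath>

void encodeSpheremap(float x, float y, float z, float& u, float& v)
{
    float p = std::sqrt(z * 8.0f + 8.0f);
    u = x / p + 0.5f;
    v = y / p + 0.5f;
}

void decodeSpheremap(float u, float v, float& x, float& y, float& z)
{
    float fx = u * 4.0f - 2.0f;
    float fy = v * 4.0f - 2.0f;
    float f  = fx * fx + fy * fy; // f <= 4 for valid encodings
    float g  = std::sqrt(1.0f - f / 4.0f);
    x = fx * g;
    y = fy * g;
    z = 1.0f - f / 2.0f;
}
[/code]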

Storing just XY is very common when using compressed texture formats. I haven't tried the spheremap transform myself for storing normal maps, but it's possible that it might result in better quality. However, the spheremap transform is not linear, so linear interpolation definitely will not produce correct results. But as long as you generate the mip levels individually before encoding, the error from interpolation might be less than the error from storing XY and reconstructing Z. You'd have to do the math or run some experiments to know for sure.
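
By generating the mips individually, I mean something like this sketch: build the whole chain from the full XYZ normals, renormalizing as you go, and only run the non-linear encode on each finished level (downsampleBox() here is just an illustrative box filter, not a real API):

[code]
// Build each mip level from full XYZ normals, renormalize, and only
// then encode to two channels, so nothing is ever filtered across
// the non-linear transform.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Illustrative 2x2 box downsample of a square normal map ('side' even).
std::vector<Vec3> downsampleBox(const std::vector<Vec3>& src, int side)
{
    int half = side / 2;
    std::vector<Vec3> dst(half * half);
    for (int y = 0; y < half; ++y)
    for (int x = 0; x < half; ++x)
    {
        Vec3 s = { 0.0f, 0.0f, 0.0f };
        for (int j = 0; j < 2; ++j)
        for (int i = 0; i < 2; ++i)
        {
            const Vec3& n = src[(y * 2 + j) * side + (x * 2 + i)];
            s.x += n.x; s.y += n.y; s.z += n.z;
        }
        float len = std::sqrt(s.x * s.x + s.y * s.y + s.z * s.z);
        // Guard against degenerate (cancelling) normals.
        dst[y * half + x] = (len > 1e-6f)
            ? Vec3{ s.x / len, s.y / len, s.z / len }
            : Vec3{ 0.0f, 0.0f, 1.0f };
    }
    return dst;
}

// Usage: encode level 0; level1 = downsampleBox(level0, side);
// encode level 1; and so on down the chain.
[/code]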

I gave it a shot. There is no noticeable error in areas of smooth normals, but there is noticeable error in places with harsh changes in normal direction.

[img]http://img838.imageshack.us/img838/4346/normals00xyz.jpg[/img]

I used the reference image from [url="http://aras-p.info/texts/CompactNormalStorage.html"]http://aras-p.info/texts/CompactNormalStorage.html[/url]

Then again, I'm getting a lot of errors in the Z component when storing just X and Y. I'm not really fond of either technique.

[b]Edit:[/b] Actually, experimenting with it some more, it might not be so bad. This time I used some real tangent-space normal maps and more conservative interpolation (1.5X), and the results are almost identical to storing all three components. My tests seem to show that the spheremap transform results in slightly less error than the X&Y approach, despite being non-linear.
