
# Calculating Normals From Displacement

Old topic!

Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

8 replies to this topic

### #1Chris_F  Members

Posted 13 January 2012 - 08:02 PM

Would it be feasible to cut down on texture memory by using only displacement maps and then calculating the normals in your shader?

### #2Jason Z  Members

Posted 14 January 2012 - 02:13 AM

You would cut down on texture memory, but you would increase the number of samples needed (three heights are required at a minimum) and the amount of calculation required just to get your normal vector. You would probably be better off converting your normal vectors to a spherical coordinate system and then just converting back to a Cartesian coordinate system after loading the two parameters.
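The "three heights at a minimum" refers to finite differencing the height field. A minimal sketch of the idea in plain Python rather than shader code (using central differences, which take four neighbour samples; `texel_size` and `height_scale` are illustrative parameters, not names from the thread):

```python
import math

def height_normal(h, x, y, texel_size=1.0, height_scale=1.0):
    """Approximate a surface normal at (x, y) from a height map `h`
    (a 2D list of floats) using central differences."""
    w, rows = len(h[0]), len(h)
    # Clamp sampling at the borders.
    left  = h[y][max(x - 1, 0)]
    right = h[y][min(x + 1, w - 1)]
    down  = h[max(y - 1, 0)][x]
    up    = h[min(y + 1, rows - 1)][x]
    # Gradient of the height field; the unnormalised normal is
    # (-dh/dx, -dh/dy, 1) for a height field z = h(x, y).
    dx = (right - left) * height_scale / (2.0 * texel_size)
    dy = (up - down) * height_scale / (2.0 * texel_size)
    n = (-dx, -dy, 1.0)
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return tuple(c / length for c in n)
```

Every texel needs these extra fetches and the normalize, which is the per-pixel cost being traded against the memory saving.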

Depending on your requirements, you might even be able to pack both variables into a single component for a minimum amount of memory and bandwidth required...
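A sketch of the two-parameter spherical encoding being suggested, in plain Python for illustration (the mapping of both angles into [0, 1] is an assumption, not something specified in the thread):

```python
import math

def encode_spherical(n):
    """Pack a unit normal into two angles, each mapped to [0, 1]."""
    theta = math.acos(max(-1.0, min(1.0, n[2])))  # polar angle, [0, pi]
    phi = math.atan2(n[1], n[0])                  # azimuth, (-pi, pi]
    return (theta / math.pi, phi / (2.0 * math.pi) + 0.5)

def decode_spherical(enc):
    """Rebuild the Cartesian normal from the two stored angles."""
    theta = enc[0] * math.pi
    phi = (enc[1] - 0.5) * 2.0 * math.pi
    st = math.sin(theta)
    return (st * math.cos(phi), st * math.sin(phi), math.cos(theta))
```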

Jason Zink :: DirectX MVP

Direct3D 11 engine on CodePlex: Hieroglyph 3

Games: Lunar Rift

### #3Triangles-PCT  Members

Posted 14 January 2012 - 05:49 AM

Do spherical coordinates work for normal mapping? Do you have to use point sampling?

Not saying it doesn't work, because I haven't tried it, but wouldn't bilinear interpolation cause some odd artifacts, since it would take the long route around the sphere if, say, you had coordinate -1 next to coordinate 1?

### #4Chris_F  Members

Posted 14 January 2012 - 10:15 AM

> You would probably be better off converting to a spherical coordinate system for your normal vectors and then just converting to a cartesian coordinate system after loading the two parameters.

I'm not certain, but I think if you stored spherical coordinates in your texture, filtering would produce incorrect results. I could already store just X and Y for normals if I wanted to cut it down to two channels. I was just curious if it could be cut down to just a single channel while still giving good results.
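For reference, a sketch of the two-channel XY storage mentioned here, with Z rebuilt at decode time (plain Python for illustration; it assumes a tangent-space normal with z >= 0, which is why this scheme works for tangent-space maps but not arbitrary directions):

```python
import math

def reconstruct_z(x, y):
    """Rebuild z from a two-channel normal, assuming a unit-length
    tangent-space normal with z >= 0. The max() guards against
    filtering/quantization pushing x*x + y*y slightly above 1."""
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))
```

The quality problem discussed later in the thread is that when x*x + y*y is close to 1 (a steeply tilted normal), small errors in the stored X and Y produce large errors in the reconstructed Z.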

### #5Jason Z  Members

Posted 14 January 2012 - 10:44 AM

> Do spherical coordinates work for normal mapping? Do you have to use point sampling?
>
> Not saying it doesn't work, because I haven't tried it, but wouldn't bilinear interpolation cause some odd artifacts, since it would take the long route around the sphere if, say, you had coordinate -1 next to coordinate 1?

That is actually a good point - the interpolation wouldn't be correct in some situations (as you mentioned, since the angle only increases in one direction). However, how often do you have texture data with wildly swinging texels next to one another? In general, I think it would still work as an approximation, even if it wasn't an exact one-to-one mapping...
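The wrap-around artifact can be shown numerically. Assuming an azimuth in (-pi, pi] mapped to [0, 1] (an illustrative encoding, not one specified in the thread), two texels just either side of the seam average to the *opposite* direction:

```python
import math

# Encoded azimuth in [0, 1]: enc = phi / (2*pi) + 0.5,
# so phi = -pi maps near 0.0 and phi = +pi maps near 1.0.
a = 0.01  # azimuth just above -pi
b = 0.99  # azimuth just below +pi

# Bilinear filtering averages the encoded values...
filtered = (a + b) / 2.0  # 0.5, the midpoint of the [0, 1] range

# ...which decodes to an azimuth near 0, pointing the opposite way
# from both source texels (which sit near the +/-pi seam).
phi = (filtered - 0.5) * 2.0 * math.pi
```

As the post says, this only bites where adjacent texels straddle the seam, which is rare in smoothly varying normal maps.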

Jason Zink :: DirectX MVP

Direct3D 11 engine on CodePlex: Hieroglyph 3

Games: Lunar Rift

### #6MJP  Moderators

Posted 14 January 2012 - 12:52 PM

It's possible, but as Jason mentioned you would need more texture samples in the shader to compute a normal from a height map. You would probably also end up with lower-quality normals, as normal maps are often generated with a wide filtering kernel.

### #7Chris_F  Members

Posted 14 January 2012 - 07:38 PM

I think that storing normal maps in spherical coordinates could lead to errors, and I think storing just X and Y leads to poorer quality.

Has anyone used a spheremap transform method for storing normal maps into two channel textures?

ex: http://aras-p.info/t...thod04spheremap

To me, it looks like the linear interpolation of texture filtering wouldn't cause errors. In addition, it seems to be more accurate and the instructions for the transformation are cheaper than spherical coordinates.
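A Python transcription of the spheremap encode/decode from the linked article (its method #4), for illustration; note the encode is undefined for exactly n = (0, 0, -1), where the divisor goes to zero:

```python
import math

def encode_spheremap(n):
    """Spheremap transform: pack a unit normal into two [0, 1] values."""
    f = math.sqrt(8.0 * n[2] + 8.0)  # zero only when n.z == -1
    return (n[0] / f + 0.5, n[1] / f + 0.5)

def decode_spheremap(enc):
    """Rebuild the unit normal from the two stored values."""
    fx = enc[0] * 4.0 - 2.0
    fy = enc[1] * 4.0 - 2.0
    f = fx * fx + fy * fy
    g = math.sqrt(1.0 - f / 4.0)
    return (fx * g, fy * g, 1.0 - f / 2.0)
```

Unlike XY-with-reconstructed-Z, this handles normals with negative Z (except the single degenerate point), and the decode uses no transcendentals.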

### #8MJP  Moderators

Posted 15 January 2012 - 12:55 AM

Storing just XY is very common when using compressed texture formats. I haven't tried the spheremap transform myself for storing normal maps, but it's possible that it might result in better quality. However, the spheremap transform is not linear, so linear interpolation definitely will not produce correct results. But as long as you generate the mip levels individually before encoding, the error from interpolation might be less than the error resulting from storing XY and reconstructing Z. You'd have to do the math or run some experiments to know for sure.

### #9Chris_F  Members

Posted 15 January 2012 - 02:11 PM

I gave it a shot. There is no noticeable error in areas of smooth normals, but there is noticeable error in places with harsh changes in normal direction.

I used the reference image from http://aras-p.info/t...malStorage.html

Then again, I'm getting a lot of errors in the Z component when storing just X and Y. I'm not really fond of either technique.

Edit: Actually, experimenting with it some more, it might not be so bad. This time I used some real tangent normal maps, and I used more conservative interpolation (1.5X) and the results are almost identical to storing all three components. My tests seem to show that spheremap transform results in slightly less error than the X&Y approach, despite being non-linear.
