Calculate normal knowing only point coordinates ...

Hi, I don't know if there is a possible solution to this: I have a routine that gathers point coordinates using the depth buffer (via OpenGL's glReadPixels function).
So, when the user clicks an object in the 3D window, the coordinates of that point are known. How would I calculate normals for that point, if possible?
There are two principal ways:

1. During rendering, not only color and depth but also the normal is computed and stored in its own buffer. The normal belonging to a pixel can then simply be read from that normal buffer. Inaccuracies will occur at object margins where pixels are only partially covered, of course. (See further down for how such a buffer can be set up.)

2. The co-ordinates of the pointer position are "unprojected", i.e. the screen co-ordinates are mapped back into view space and further transformed into world space. You can either use the 2D co-ordinates to compute a ray in world space and search for the frontmost hit on an object, or else also use the depth to compute a point in world space and search for the object whose surface is closest to that point. In either case, once the hit surface is known, you can compute the normal from its geometry, as sketched below.
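
A minimal sketch of that unprojection in C++ with GLU, assuming a fixed-function setup like the one described in the question. glReadPixels and gluUnProject are the real GL/GLU calls; unprojectClick is just a hypothetical helper name:

#include <GL/glu.h>

// Reads the depth under the cursor and unprojects the window position back
// into the space the current modelview matrix maps from (world/object space).
// Returns false if the click hit the far plane, i.e. no geometry.
bool unprojectClick(int mouseX, int mouseY, double world[3])
{
    GLdouble model[16], proj[16];
    GLint view[4];
    glGetDoublev(GL_MODELVIEW_MATRIX, model);   // must match the matrices
    glGetDoublev(GL_PROJECTION_MATRIX, proj);   // used when rendering
    glGetIntegerv(GL_VIEWPORT, view);

    // OpenGL window y runs bottom-up, mouse y usually top-down.
    const GLint winY = view[3] - mouseY;

    GLfloat depth;
    glReadPixels(mouseX, winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);
    if (depth >= 1.0f)
        return false;                           // nothing under the cursor

    return gluUnProject(mouseX, winY, depth, model, proj, view,
                        &world[0], &world[1], &world[2]) == GL_TRUE;
}

Once you have located the hit triangle (a, b, c) in your own mesh data, its face normal is the normalized cross product of two edges, n = normalize(cross(b - a, c - a)); for smooth surfaces you would instead interpolate the stored vertex normals at the hit point.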
Thanks haegarr. About solution 1, how do I access the normal buffer? I couldn't find any info about it. Do I need to render the geometry as VBOs in order to be able to access the normal buffer (currently I use display lists)?

There is no normal buffer by default. You have to use an appropriate fragment shader and store the fragment's normal either in a frame buffer image or else in a texture (render to texture, RTT). Display lists are fine for that; the shader changes how fragments are shaded, not how the geometry is submitted.
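
To make that concrete, here is a minimal sketch in C++ with GLSL 1.20 shaders. It assumes an extension loader such as GLEW provides the FBO entry points (GL 3.0 or EXT_framebuffer_object), shader compilation/linking and error checks are omitted for brevity, and createNormalFbo/readNormal are hypothetical helper names:

#include <GL/glew.h>   // assumption: a loader exposes the FBO functions

// Shaders: output the eye-space normal, packed from [-1,1] into [0,1]
// so it survives an ordinary RGBA8 color target.
static const char* vsSrc =
    "varying vec3 vNormal;\n"
    "void main() {\n"
    "    vNormal = gl_NormalMatrix * gl_Normal;\n"
    "    gl_Position = ftransform();\n"
    "}\n";
static const char* fsSrc =
    "varying vec3 vNormal;\n"
    "void main() {\n"
    "    gl_FragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);\n"
    "}\n";

// One-time setup: an FBO with an RGBA8 texture as color target for the
// packed normals, plus a depth renderbuffer for the depth test.
GLuint createNormalFbo(GLsizei width, GLsizei height)
{
    GLuint normalTex, depthRb, fbo;
    glGenTextures(1, &normalTex);
    glBindTexture(GL_TEXTURE_2D, normalTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGenRenderbuffers(1, &depthRb);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, normalTex, 0);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRb);
    return fbo;
}

// Per click: bind the FBO and the normal shader, redraw the scene
// (glCallList works unchanged), then read the pixel back and unpack it.
void readNormal(int x, int winY, float n[3])
{
    unsigned char rgba[4];
    glReadPixels(x, winY, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    for (int i = 0; i < 3; ++i)
        n[i] = rgba[i] / 255.0f * 2.0f - 1.0f;   // [0,1] back to [-1,1]
}

The packed normal is quantized to 8 bits per component here; a GL_RGBA16F color texture would avoid both the packing and the quantization where float render targets are available.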
