Lambert Azimuthal Equal-Area Projection


Hi,

So, I'm writing a radiosity solver, and it works somewhat. Unfortunately, OpenGL does not allow 180° fields of view, and certainly not with an area-preserving projection.

I'm trying to write a vertex shader (in GLSL) that will transform arbitrary vertices into a lambertian map. The map should be centered along a patch's normal.

I found the Wikipedia article, which looked good, but it didn't work in practice (maybe I did it wrong); in any case, it doesn't specify how to center the map on arbitrary points (normals).

Imagine a unit sphere (globe). I want an equal-area projection of it. I have an arbitrary point on the sphere that I want to center the map on. Given other points on the sphere, what are their 2D locations on the map?

As an aside, I really don't care about the orientation of the map around its center; the pixels' colors are simply being counted in a circle around the center. The map can be of any orientation so long as it centers on the arbitrary point.

Can someone give me a decent transform that maps points (x,y,z) on a sphere to a 2D map (X,Y), centered on the 3D point (xc,yc,zc)?

Thanks,
G

I know this doesn't exactly answer your question, but you don't really need to have an equal area projection; you just need to weight your pixels appropriately -- by the determinant of the Jacobian of your map at each point...
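For instance, with a standard perspective render onto the plane z = 1, the weight of each pixel is the solid angle it subtends. A small Python sketch of that Jacobian weight (function name mine, not from any library):

```python
import math

def pixel_solid_angle(x, y, pixel_area):
    # For a pixel of area `pixel_area` centred at (x, y) on the image
    # plane z = 1, the subtended solid angle is approximately
    # dA * cos(theta) / r^2 = dA / (x^2 + y^2 + 1)^(3/2) -- the
    # determinant-of-the-Jacobian weight described above.
    return pixel_area / (x * x + y * y + 1.0) ** 1.5

# Sanity check: summing the weights over a 90-degree square frustum
# (x, y in [-1, 1]) should approach that frustum's solid angle,
# 2*pi/3 steradians -- notably less than the hemisphere's 2*pi, which
# is why a single 90-degree view cannot cover a patch's hemisphere.
n = 400
step = 2.0 / n
total = sum(pixel_solid_angle(-1.0 + (i + 0.5) * step,
                              -1.0 + (j + 0.5) * step,
                              step * step)
            for i in range(n) for j in range(n))
print(round(total, 3))  # close to 2*pi/3 ~ 2.094
```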

Part of the reason for a transformation is to make the areas equal, but perhaps the more important reason is that OpenGL does not provide capability for a 180° frustum. It must be emulated in a vertex shader.

Won't you need some kind of orientation, seeing as you're rendering a hemisphere, not a sphere? I.e., different patches will be pointing in different directions, right?

As for centering the view on an arbitrary point -- the traditional perspective projection done in a vertex shader doesn't support this either -- that's what the view matrix is for.
Before projecting your points (with your new projection, or with a traditional perspective projection), you've got to transform them to be relative to the camera's position/orientation.
Usually you do this by multiplying world-space positions with the view-matrix, or model-space positions with the world-view-matrix.

However, if you really don't need orientation, all you need to do is subtract the camera position from the vertex position.

Then, once your data is in view-space (relative to the camera), you can project it to 2D.
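That view-space step can be sketched on the CPU as follows (Python, helper names mine): build an arbitrary-roll orthonormal basis from the camera direction, then express points in that basis.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def to_view_space(point, cam_pos, cam_dir):
    # Express `point` relative to a camera at `cam_pos` facing along
    # `cam_dir` (OpenGL convention: the camera looks down -z in view
    # space).  The roll around cam_dir is arbitrary, as noted above.
    forward = normalize(cam_dir)
    helper = (0.0, 1.0, 0.0) if abs(forward[1]) < 0.9 else (1.0, 0.0, 0.0)
    right = normalize(cross(helper, forward))  # arbitrary-roll side axis
    up = cross(forward, right)
    rel = tuple(p - c for p, c in zip(point, cam_pos))
    return (dot(rel, right), dot(rel, up), -dot(rel, forward))

# A point two units in front of the camera lands on the -z axis:
print(to_view_space((3.0, 2.0, 3.0), (1.0, 2.0, 3.0), (1.0, 0.0, 0.0)))
# -> (0.0, 0.0, -2.0)
```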
Quote:
 Original post by Geometrian: It must be emulated in a vertex shader.
All projections are 'emulated in a vertex shader', unless you're using the fixed-function pipeline, which is just emulated in shaders behind the scenes.

Quote:
 Won't you need some kind of orientation, seeing as you're rendering a hemisphere, not a sphere? I.e., different patches will be pointing in different directions, right?
Right; that's the arbitrary point to be centered on: it's the normal of the patch. The "orientation" of the map around that point is what doesn't matter. For example, if the map were rendered rotated 147° around its center, the resultant texture would still be just as useful.
Quote:
 Then, once your data is in view-space (relative to the camera), you can project it to 2D.
That's really the part I need help with. I've got a vector from a vertex to the camera position. These vectors, when normalized, can be treated as points on a sphere. The vertices then need to be projected onto the 2D map somehow, with the normal vector (another point on the sphere) being the center of the map.
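Concretely, that projection can be written as the following CPU-side Python sketch (function names mine): rotate the direction so the chosen center lands on (0,0,-1), then apply the formula from the Wikipedia article.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert_map(direction, center):
    # Project the unit vector `direction` onto a 2D Lambert azimuthal
    # equal-area map centred on the unit vector `center`.  The roll of
    # the map around `center` is arbitrary, as discussed above.
    f = normalize(center)
    helper = (0.0, 1.0, 0.0) if abs(f[1]) < 0.9 else (1.0, 0.0, 0.0)
    right = normalize(cross(helper, f))
    up = cross(f, right)
    d = normalize(direction)
    x, y = dot(d, right), dot(d, up)
    z = -dot(d, f)  # `center` ends up at local (0, 0, -1)
    # (X, Y) = sqrt(2 / (1 - z)) * (x, y); singular only at z = +1,
    # i.e. for the direction exactly opposite `center`.
    scale = math.sqrt(2.0 / (1.0 - z))
    return (scale * x, scale * y)

# The centre maps to the origin; a perpendicular direction maps to the
# hemisphere boundary at radius sqrt(2):
print(lambert_map((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # -> (0.0, 0.0)
X, Y = lambert_map((1.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(round(math.hypot(X, Y), 6))  # -> 1.414214
```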
Quote:
Quote:
 It must be emulated in a vertex shader.
All projections are 'emulated in a vertex shader', unless you're using the fixed-function pipeline, which is just emulated in shaders behind the scenes.
Ah yes, pardon; I meant "hand-tailored" vertex shader.
-G

Quote:
 Original post by Geometrian: Right; that's the arbitrary point to be centered on: it's the normal of the patch. The "orientation" of the map around that point is what doesn't matter. For example, if the map were rendered rotated 147° around its center, the resultant texture would still be just as useful.
The camera is centered at a point, and is oriented to face a direction. A 'normal' is a direction, not a point, so it only gives you the orientation (with an undefined/inconsequential "roll" rotation as you mention). Assuming your "patch normals" aren't all (0,0,-1), then you're going to need to rotate the world into view-space, just like you do with a normal projection, and you're going to need to translate the world to view-space using the patch position (not normal).

Quote:
 Original post by Geometrian: The vertices then need to be projected onto the 2D map somehow, with the normal vector (another point on the sphere) being the center of the map.
Ahh, I misunderstood this. You're visualising the point on the unit sphere that corresponds to the patch's normal, and trying to get that to end up in the center when you project to 2D, right? So when you're talking about centering on a point, you're thinking in the 2D space?

You achieve this by pre-rotating all the points into view-space (pre-projection -- still in 3D). In view-space, the camera is always pointing in some defined direction (e.g. [0,0,-1]), so you don't have to worry about "where the center is" when you go to 2D.

The way projection usually works in a vertex shader is that the camera is assumed to be at [0,0,0], facing [0,0,1] or [0,0,-1] (DirectX or OpenGL). The world-view matrix moves all of your input data to be relative to the camera, which makes these assumptions true. Then, once you've got all the data in place relative to this camera at the origin, you perform your projection.

Well, looks like we're back to code again:
vec3 axis_vector = normalize( cross(patch_normal, vec3(0.0,0.0,-1.0)) );
float angle = acos( dot(patch_normal, vec3(0.0,0.0,-1.0)) );
mat3 rot_matrix = rotation_matrix_arbitrary(axis_vector, angle);
//rotation_matrix_arbitrary constructs a rotation matrix around a
//given axis for a given angle.  From experience, I know it works.
//
//    mat3 rotation_matrix_arbitrary(vec3 vec, float angle) {
//        float c = cos(angle); float s = sin(angle);
//        float C = 1.0 - c;
//        float  xs = vec.x* s; float  ys = vec.y* s; float  zs = vec.z* s;
//        float  xC = vec.x* C; float  yC = vec.y* C; float  zC = vec.z* C;
//        float xyC = vec.x*yC; float yzC = vec.y*zC; float zxC = vec.z*xC;
//        return mat3(vec3(vec.x*xC+c,    xyC-zs,    zxC+ys),
//                    vec3(    xyC+zs,vec.y*yC+c,    yzC-xs),
//                    vec3(    zxC-ys,    yzC+xs,vec.z*zC+c));
//    }

vertex.xyz = rot_matrix*vertex.xyz; //vec4 vertex = gl_Vertex
//This ought to rotate the vertex to be in the proper
//space, as if the normal had been facing (0.0,0.0,-1.0)

vec3 vertex_vector = vertex.xyz - patch_center;
vertex_vector = normalize(vertex_vector);
//Vector from vertex to the camera position

//http://en.wikipedia.org/wiki/Lambert_azimuthal_equal-area_projection
float z_term = pow( 2.0/(1.0-vertex_vector.z), 0.5 );
vec2 coord = z_term*vertex_vector.xy; //in range (-2,2)
//coord should now be the same as ("X","Y") in the Wikipedia article
coord = coord*0.5; //in range (-1,1), for the sake of gl_Position

vertex.xy = coord;
vertex.z = 0.0;
gl_Position = vertex;
For a Cornell box, observed from the top:

A standard 90° view looks like:

With the custom vertex shader above:
[EDIT: a black screen at first; updated screenshots were attached in later edits]

Doesn't look quite right . . . but what am I doing wrong?

Thanks,
-G

[Edited by - Geometrian on July 22, 2010 2:07:06 AM]

Nothing! Your scene just isn't highly tessellated enough. Same issue with dual paraboloid setups-- if you'll notice, all the vertices are in the right place, but the edges are screwing you over.

I would think that, except that the first view is from the top, while the second is from the bottom -- you can actually see the little white patches where the first view was taken from!

Ok, so the major problem was that "vertex" was being rotated; I need to rotate "vertex_vector". Here's the part of the shader so far:
vec3 axis_vector = normalize( cross(patch_normal, vec3(0.0,0.0,-1.0)) );
float angle = acos( dot(patch_normal, vec3(0.0,0.0,-1.0)) );
mat3 rot_matrix = rotation_matrix_arbitrary(axis_vector, -angle);

vertex_vector = vertex.xyz - patch_center;
vertex_vector = rot_matrix*vertex_vector;
depth = length(vertex_vector)/1000.0;
vertex_vector = normalize(vertex_vector);

//http://en.wikipedia.org/wiki/Lambert_azimuthal_equal-area_projection
float z_term = pow( 2.0/(1.0-vertex_vector.z), 0.5 );
//in range (-2,2); however, we discard vertex_vector.z > -0.01 to get a
//hemisphere, so in range [-sqrt(2),sqrt(2)]
vec2 coord = z_term*vertex_vector.xy;
coord = coord/pow(2.0,0.5); //in range (-1,1)

vertex.xy = coord;
vertex.z = 0.0;
vertex.w = 1.0;
gl_Position = vertex;
The result:

Seems to be working excellently. Thanks for the help, everyone!
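As a sanity check that the mapping really is area-preserving, here is a quick CPU-side Monte Carlo test in Python (not part of the shader): directions sampled uniformly over the hemisphere should fill the projected disc uniformly, so half of them should land within radius 1 (the full hemisphere fills the disc of radius sqrt(2)).

```python
import math
import random

def project(v):
    # The projection step from the shader above, for a map centred on
    # (0, 0, -1) after the rotation (before the division by sqrt(2)).
    s = math.sqrt(2.0 / (1.0 - v[2]))
    return (s * v[0], s * v[1])

random.seed(1)
inside = 0
n = 100000
for _ in range(n):
    # Uniform sampling over the hemisphere facing (0, 0, -1):
    # z uniform on (-1, 0] plus a uniform azimuth (Archimedes' theorem).
    z = -random.random()
    r = math.sqrt(1.0 - z * z)
    phi = random.uniform(0.0, 2.0 * math.pi)
    X, Y = project((r * math.cos(phi), r * math.sin(phi), z))
    if X * X + Y * Y <= 1.0:
        inside += 1

# Equal-area: half of the hemisphere's solid angle should fall inside
# radius 1 of the map (area pi out of the disc's total 2*pi).
print(round(inside / n, 1))  # -> 0.5
```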
-G
