spherical billboards

4 comments, last by willvarfar 12 years, 7 months ago
I want to draw *lots* of balls, so I've billboarded them, with the rotation to face the camera done in a vertex shader:

void main() {
    // the ball's centre comes in as the vertex position
    vec4 wpos = vec4(gl_Vertex.xyz, 1.0);
    vec4 epos = gl_ModelViewMatrix * wpos;
    // offset the corner in eye space so the quad always faces the camera
    epos.xy += gl_MultiTexCoord0.xy;
    gl_Position = gl_ProjectionMatrix * epos;
    // the corner offset doubles as the radius and the texture coordinate
    float radius = abs(gl_MultiTexCoord0.x);
    vec2 tex = -gl_MultiTexCoord0.xy / radius; // -1..1 across the quad
    tex *= 0.5;
    tex += 0.5;                                // remap to 0..1
    gl_TexCoord[0].xy = tex;
}


For each ball, I put the center in the vertex position (duplicated, so each of the quad's four vertices gets a copy) and the radius and corner offset in the texture coordinate:

for(boards_t::const_iterator i=boards.begin(); i!=boards.end(); ++i) {
    // four corners per ball; the centre is duplicated into each vertex and the
    // texture coordinate carries the signed radius (the corner offset)
    vbo.push_back(tex_vec_t(vec2_t(-i->radius, i->radius), i->center));
    vbo.push_back(tex_vec_t(vec2_t(-i->radius,-i->radius), i->center));
    vbo.push_back(tex_vec_t(vec2_t( i->radius,-i->radius), i->center));
    vbo.push_back(tex_vec_t(vec2_t( i->radius, i->radius), i->center));
}


and the vbo is then an interleaved array with GL_T2F_V3F.

Is there a more speed-efficient or space-efficient-yet-not-massively-slower approach?

My vertex shader knows the center and radius of the sphere, so presumably I can pass those to the fragment shader as varyings; the interpolation won't alter the values, since all four vertices in each ball's quad carry the same ones.

Can I somehow use this to produce proper spherical depth and normals in the fragment shader? Should I have separate normal and depth textures, or is it cheaper to compute them in the shader? And what would the math be?
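
(I imagine the in-shader version would look roughly like this - just a sketch, where `local` would be a new varying carrying the quad-local offset in -1..1:)

varying vec2 local; // assumed: quad-local offset in -1..1, written by the vertex shader
void main() {
    float d2 = dot(local, local);
    if(d2 > 1.0) discard;                         // outside the circular silhouette
    vec3 normal = vec3(local, sqrt(1.0 - d2));    // view-aligned normal of the sphere surface
    gl_FragColor = vec4(normal * 0.5 + 0.5, 1.0); // visualise the normal as colour
}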
If you render them as point sprites you can avoid dealing with quads at all - you can just upload your per-particle centres/radii to the VBO, render as point sprites, and have OpenGL generate the quads (and texture coords) for you. As for normals/depths, I'd use textures for simplicity/flexibility, but in theory you could generate these in the fragment shader.

Here's a video of mine, rendering lots of red balls using point sprites with normal maps:

http://www.youtube.c...h?v=sL3BMLZ7Ap8
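
Roughly, the vertex-shader side looks like this (a sketch only - `viewport_height` and the per-ball `radius` attribute are just illustrative names, not the code from the video):

// requires GL_POINT_SPRITE, GL_VERTEX_PROGRAM_POINT_SIZE and GL_COORD_REPLACE on the
// texture unit; the fragment shader then reads gl_PointCoord (or gl_TexCoord[0] via the
// fixed-function texcoord replace) instead of a hand-made quad UV
uniform float viewport_height; // illustrative: viewport height in pixels, set by the app
attribute float radius;        // illustrative: per-ball radius as a vertex attribute
void main() {
    vec4 epos = gl_ModelViewMatrix * vec4(gl_Vertex.xyz, 1.0);
    gl_Position = gl_ProjectionMatrix * epos;
    // scale the world-space radius to a point size in pixels
    gl_PointSize = viewport_height * gl_ProjectionMatrix[1][1] * radius / -epos.z;
}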


Thank you, that's very encouraging and a great looking demo!

The GL_POINT_SPRITE route will be well worth investigating once I've got the depth and normals right. I'll hold fire on moving to GL_POINT_SPRITE(S) because my quads are working at this point.

So now I'm focused on my normal map. I have created a texture where the normals are in the RGB and the A holds a coverage value: if the (interpolated) A is 0, the fragment is discarded; if it is 1, it should be shaded solid and the z set. If it's in between, it's on the edge, the z should be the same as the billboard's, and the pixel should be blended by A. The code to generate it is:


const int R = 31, SZ = power_of_two(R*2);
std::vector<vec4_t> p;
for(int y=0; y<SZ; y++) {
    for(int x=0; x<SZ; x++) {
        const float rx = (float)(x-R)/R, ry = (float)(y-R)/R, d = rx*rx+ry*ry;
        if(d >= 1) { // outside sphere
            p.push_back(vec4_t(1,0,0,0)); // alpha 0: no coverage, fragment will be discarded
        } else {
            vec3_t normal(rx,sqrt(1.-d),ry); // normal of the sphere surface at this texel
            p.push_back(vec4_t(normal,1));   // alpha 1: fully covered
        }
    }
}


My focus is on getting the normals and the right z set; what I have written so far is not working:

[[vertex shader]]

varying vec3 light_dir;
varying float radius;
varying float z;
void main() {
    vec4 wpos = vec4(gl_Vertex.xyz, 1.0);
    vec4 epos = gl_ModelViewMatrix * wpos;
    z = epos.z;                                 // eye-space depth of the ball's centre
    light_dir = gl_LightSource[0].position.xyz; // light position (eye space) used as a direction
    epos.xy += gl_MultiTexCoord0.xy;            // billboard the corner in eye space
    gl_Position = gl_ProjectionMatrix * epos;
    radius = abs(gl_MultiTexCoord0.x);
    vec2 tex = gl_MultiTexCoord0.xy / radius;
    tex *= 0.5;
    tex += 0.5;
    gl_TexCoord[0].xy = tex;
}

[[fragment shader]]

varying vec3 light_dir;
varying float radius;
varying float z;
uniform sampler2D tex;
uniform float z_near;
uniform float z_far;
void main() {
    vec4 texel = texture2D(tex, gl_TexCoord[0].xy);
    if(0.0 == texel.a) discard;             // outside the sphere's silhouette
    vec3 colour = vec3(1.0, 1.0, 0.5);
    float intensity = max(dot(-normalize(light_dir), texel.rgb), 0.0);
    gl_FragColor = vec4(colour * intensity, texel.a);
    float depth = z - texel.z * radius;     // attempt to offset the depth towards the sphere surface
    gl_FragDepth = (z_near * z_far) / (z_far - depth * (z_far - z_near));
}


I'm not convinced my normals are coloured right.

The depth is wrong; they are generally drawn in roughly the right place, but there is popping between intersecting spheres when the camera moves.

I've found several competing definitions of gl_FragDepth.
It seems strange to me that you have to do anything to the texture coordinates in the vertex shader, other than just passing them straight through.

I'd recommend outputting the normal map as the colour from the fragment shader until that looks correct, before trying to do anything with lighting / z / alpha.

[quote]
It seems strange to me that you have to do anything to the texture coordinates in the vertex shader, other than just passing them straight through.
[/quote]


I am encoding the radius in the texture coords rather than computing the vertex positions - rotated to face the camera - in the calling code. I think this is a fairly standard approach for billboards? (At least, I adapted it from billboard code I absorbed on the 'net.)


[quote]
I'd recommend outputting the normal map as the colour from the fragment shader until that looks correct, before trying to do anything with lighting / z / alpha.
[/quote]

Yes, if I paint the normal map as colour without lighting, it looks great.

So I have a normal map that has convincing (but I am easily convinced) values in it, and a single light.
My issue with the normals was that when I uploaded my normal map I made two elementary mistakes:

  1. I had swapped the y and z components
  2. I had not moved from -1:1 to 0:1 - I just had to add 1 and divide by 2 to get them into the right range

My normal mapping is now working.
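
(For completeness, with the normals packed into 0..1 the fragment shader unpacks them again before lighting - roughly along these lines:)

// sketch: undo the 0..1 packing before using the normal
vec3 normal = texture2D(tex, gl_TexCoord[0].xy).rgb * 2.0 - 1.0;
float intensity = max(dot(-normalize(light_dir), normal), 0.0);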

This leaves me struggling to set the z-depth correctly. I don't yet think I have a problem with z-fighting - that is, I'm not yet worried that my near and far planes are too far apart.

But I do have them at the wrong z.

My current math seems wrong; I have popping - it's as though they are on the wrong plane, or all on the same plane, or something. I think they might be very near z_near, since they draw in front of my terrain even when they should be behind it.


float depth = z - (texel.z - 0.5) * radius;
gl_FragDepth = (z_near * z_far) / (z_far - depth * (z_far - z_near));


(where z is the eye-space z written by the vertex shader - epos.z - and not gl_FragCoord.z)

What is the proper way to set the true frag z?
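
(For reference, the usual way to get a correct per-fragment depth for an impostor is to take the eye-space z of the sphere surface and push it through the projection matrix, rather than building the window-space mapping by hand. A sketch, untested against the code above, where `offset` is the unpacked out-of-plane component of the normal in 0..1:)

float eye_z = z + offset * radius;                 // eye-space z of the sphere surface (negative in front of the camera)
vec4 clip = gl_ProjectionMatrix * vec4(0.0, 0.0, eye_z, 1.0);
float ndc_z = clip.z / clip.w;                     // perspective divide
gl_FragDepth = 0.5 * ndc_z + 0.5;                  // assumes the default glDepthRange(0, 1)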

This topic is closed to new replies.
