Geometry shader, point sprites to spheres

I'm working on transforming a set of point sprites into spheres using a geometry shader (I found that it could be a good idea).
Right now the vertex and fragment shaders are working: they make the point sprites look like spheres, but in 2D (the texture of the sphere is always facing you). When I try to use the geometry shader below, nothing is rendered. HELP!!! lol!
Here's the code:
// vertex shader
uniform float pointRadius;  // point size in world space
uniform float pointScale;   // scale to calculate size in pixels
uniform vec4 eyePos;

void main()
{
    vec4 wpos = vec4(gl_Vertex.xyz, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * wpos;

    // calculate window-space point size
    vec4 eyeSpacePos = gl_ModelViewMatrix * wpos;
    float dist = length(eyeSpacePos.xyz);
    gl_PointSize = pointRadius * (pointScale / dist);

    gl_TexCoord[0] = gl_MultiTexCoord0; // sprite texcoord
    gl_TexCoord[1] = eyeSpacePos;

    gl_FrontColor = gl_Color;
}

// pixel shader for rendering points as shaded spheres
uniform float pointRadius;
uniform vec3 lightDir = vec3(0.577, 0.577, 0.577);

void main()
{
    // calculate eye-space sphere normal from texture coordinates
    vec3 N;
    N.xy = gl_TexCoord[0].xy*vec2(2.0, -2.0) + vec2(-1.0, 1.0);
    float r2 = dot(N.xy, N.xy);
    if (r2 > 1.0) discard;   // kill pixels outside circle
    N.z = sqrt(1.0-r2);

    // calculate depth
    vec4 eyeSpacePos = vec4(gl_TexCoord[1].xyz + N*pointRadius, 1.0);   // position of this pixel on sphere in eye space
    vec4 clipSpacePos = gl_ProjectionMatrix * eyeSpacePos;
    gl_FragDepth = (clipSpacePos.z / clipSpacePos.w)*0.5+0.5;

    float diffuse = max(0.0, dot(N, lightDir));
    gl_FragColor = diffuse*gl_Color;
}

// geometry shader
//#version 120
//#extension GL_EXT_geometry_shader4 : enable
const float radius = 0.5;
varying out vec2 coord;

void main()
{
    for (int i = 0; i < gl_VerticesIn; ++i)
    {
        gl_FrontColor = gl_FrontColorIn[i];
        gl_Position = gl_PositionIn[i];
        EmitVertex();
    }

    gl_FrontColor = gl_FrontColorIn[0];

    coord = vec2( -1, -1 );
    gl_Position = (gl_PositionIn[0] + gl_ProjectionMatrix * vec4(-radius, -radius, 0, 0));
    EmitVertex();

    coord = vec2( -1, 1 );
    gl_Position = (gl_PositionIn[0] + gl_ProjectionMatrix * vec4(-radius, radius, 0, 0));
    EmitVertex();

    coord = vec2( 1, -1 );
    gl_Position = (gl_PositionIn[0] + gl_ProjectionMatrix * vec4(radius, -radius, 0, 0));
    EmitVertex();

    coord = vec2( 1, 1 );
    gl_Position = (gl_PositionIn[0] + gl_ProjectionMatrix * vec4(radius, radius, 0, 0));
    EmitVertex();

    EndPrimitive();
}
This stuff is way beyond me, but my first thought is: do you really need a geometry shader for this? Wouldn't a pixel shader that alters the depth values, with some nice spherical texture coordinate lookup thingy, suffice?
Quote: Original post by szecs
This stuff is way beyond me, but my first thought is: do you really need a geometry shader for this? Wouldn't a pixel shader that alters the depth values, with some nice spherical texture coordinate lookup thingy, suffice?


Well, when I started, I wanted to use the pixel shader to do this, but I was unable to get a good result. I also didn't find anyone who could help me do this. If you have an idea, or example code using this technique with a pixel shader, let me know.
Instead of calculating the normal and depth, what about using a lookup texture?

Render a texture of a sphere with the normal in RGB and the depth in the alpha channel.
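Something like this in the fragment shader (just a rough sketch, untested; sphereTex is whatever you call the lookup sampler, and how the depth is packed in alpha is up to you):

// fragment shader sketch: read normal + depth from a pre-rendered lookup texture
uniform sampler2D sphereTex;   // RGB = packed eye-space normal, A = sphere surface depth (0 outside)
uniform float pointRadius;
uniform vec3 lightDir;         // light direction, already in eye space

void main()
{
    vec4 texel = texture2D(sphereTex, gl_TexCoord[0].xy);
    if (texel.a <= 0.0) discard;              // alpha 0 = outside the sphere silhouette

    vec3 N = texel.rgb * 2.0 - 1.0;           // unpack normal from [0,1] back to [-1,1]

    // push the fragment depth out to the sphere surface (eye-space center in gl_TexCoord[1])
    vec4 eyeSpacePos = vec4(gl_TexCoord[1].xyz, 1.0);
    eyeSpacePos.z += texel.a * pointRadius;
    vec4 clipSpacePos = gl_ProjectionMatrix * eyeSpacePos;
    gl_FragDepth = (clipSpacePos.z / clipSpacePos.w) * 0.5 + 0.5;

    float diffuse = max(0.0, dot(N, lightDir));
    gl_FragColor = diffuse * gl_Color;
}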
Quote: Original post by Danny02
Instead of calculating the normal and depth, what about using a lookup texture?

Render a texture of a sphere with the normal in RGB and the depth in the alpha channel.


But it won't help with the fact that I'm unable to rotate around the sphere. Since I will be using light, I want to be able to rotate around the sphere and see the lighting change.

When you calculate the whole lighting in view space it shouldn't be a problem, because the normals of the sphere don't change in view space.

Multiply the light direction with the view matrix and you will get your desired result.
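In the vertex shader that would look roughly like this (sketch only; viewMatrix is an extra uniform holding just the camera transform, which you have to supply yourself):

// vertex shader sketch: transform the light direction into view space once per vertex
uniform mat4 viewMatrix;                          // camera (view) matrix, uploaded by the application
const vec3 worldLightDir = vec3(0.577, 0.577, 0.577);
varying vec3 lightDir;                            // view-space light dir, read by the fragment shader

void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * vec4(gl_Vertex.xyz, 1.0);
    // w = 0: directions ignore the translation part of the matrix
    lightDir = normalize((viewMatrix * vec4(worldLightDir, 0.0)).xyz);
}

The fragment shader then keeps float diffuse = max(0.0, dot(N, lightDir)); but with the varying instead of the constant.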
Quote: Original post by Danny02
When you calculate the whole lighting in view space it shouldn't be a problem, because the normals of the sphere don't change in view space.

Multiply the light direction with the view matrix and you will get your desired result.


You mean I should change the line
uniform vec3 lightDir = vec3(0.577, 0.577, 0.577);

to

varying vec3 lightDir = vec3(0.577, 0.577, 0.577);
and that I should use

float diffuse = max(0.0, dot(N, gl_ModelViewMatrix * lightDir));

or something like that? Tried it and it didn't work. Here's a picture of my actual project, with the sphere always facing you.

[screenshot of the project: point sprites rendered as spheres that always face the camera]
Directional light source:

Multiply the light direction with the view (camera) matrix only, not the modelview, in the vertex shader.
Use the resulting vector in the dot product with the normal vector in the fragment shader.

Point light source:
You already have the light position in world space, so you likewise only have to multiply it with the view matrix. Then you have to calculate the light direction for every pixel, either in the vertex or the fragment shader (I don't know atm whether the vertex shader approach gives the same result as calculating it in the fragment shader).


I think your problem is that the old deprecated OpenGL functionality doesn't give you direct access to only the view matrix. So if you want to do this technique, you have to create and upload the matrix yourself as a uniform.
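A rough sketch of the point light version (untested; viewMatrix and worldLightPos are uniforms you would upload yourself):

// vertex shader sketch: bring the point light position into eye space
uniform mat4 viewMatrix;        // camera/view matrix, uploaded as a uniform
uniform vec4 worldLightPos;     // light position in world space, w = 1
varying vec3 eyeLightPos;       // light position in eye space
varying vec4 eyeSpacePos;       // vertex position in eye space

void main()
{
    vec4 wpos = vec4(gl_Vertex.xyz, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * wpos;
    eyeSpacePos = gl_ModelViewMatrix * wpos;
    eyeLightPos = (viewMatrix * worldLightPos).xyz;   // w = 1 keeps the translation
    gl_FrontColor = gl_Color;
}

// in the fragment shader, per pixel:
//   vec3 L = normalize(eyeLightPos - eyeSpacePos.xyz);
//   float diffuse = max(0.0, dot(N, L));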
A few comments / suggestions:

You're outputting a varying "coord" which isn't picked up at all by the fragment shader.

The calculation of the last positions in the geometry shader is bizarre - what is this meant to do? Also, I'm not sure how many vertices your primitive is meant to have; have you missed an EndPrimitive() call after the loop?

For starters I suggest getting a simple pass-through geometry shader to work before you try to do anything with it (see the sketch at the end of this post).

I've only played about with #version 400 geometry shaders, but there's a max_vertices layout specifier that needs setting; I'm not sure if there's an equivalent pre-400?

You might need to pass through texture coordinates as well as front color and position?
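For reference, something like this should work as a pass-through under GL_EXT_geometry_shader4, which is what the original code targets (untested sketch; with that extension the input/output primitive types and GL_GEOMETRY_VERTICES_OUT_EXT are set from the application with glProgramParameteriEXT before linking, since there's no max_vertices layout qualifier until GLSL 1.50):

// minimal pass-through geometry shader (GLSL 1.20 + EXT_geometry_shader4)
#version 120
#extension GL_EXT_geometry_shader4 : enable

void main()
{
    for (int i = 0; i < gl_VerticesIn; ++i)
    {
        gl_FrontColor  = gl_FrontColorIn[i];
        gl_TexCoord[0] = gl_TexCoordIn[i][0];   // forward the sprite texcoord as well
        gl_Position    = gl_PositionIn[i];
        EmitVertex();
    }
    EndPrimitive();
}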
Quote: Original post by Danny02
Directional light source:

Multiply the light direction with the view (camera) matrix only, not the modelview, in the vertex shader.
Use the resulting vector in the dot product with the normal vector in the fragment shader.

Point light source:
You already have the light position in world space, so you likewise only have to multiply it with the view matrix. Then you have to calculate the light direction for every pixel, either in the vertex or the fragment shader (I don't know atm whether the vertex shader approach gives the same result as calculating it in the fragment shader).


I think your problem is that the old deprecated OpenGL functionality doesn't give you direct access to only the view matrix. So if you want to do this technique, you have to create and upload the matrix yourself as a uniform.


Directional light source:

Should it be something like this?
varying vec3 lightDir;

void main()
{
    vec4 wpos = vec4(gl_Vertex.xyz, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * wpos;
    //lightDir = gl_Position * vec3(0.577, 0.577, 0.577);

    // calculate window-space point size
    vec4 eyeSpacePos = gl_ModelViewMatrix * wpos;
    lightDir = eyeSpacePos.xyz * vec3(0.1, 0.1, 0.1);

