
GLSL Geometry Shader does not show anything


evillgames


  // one grow point at the origin, growing straight up (+Z)
  GPUGrassVertex growPoint[1];
  Float3Set(growPoint[0].origin, 0, 0, 0);
  PackNormalizedVector(growPoint[0].growDir, CVector3f(0, 0, 1));
  growPoint[0].grassTileID[0] = 0;

  // bind the program and set geometry shader parameters:
  // one input point expands into a 4-vertex triangle strip
  grassGeometryBasedAmbient.Enable();
  grassGeometryBasedAmbient.SetGeometryShaderParameter(GL_GEOMETRY_VERTICES_OUT_ARB, 4);
  grassGeometryBasedAmbient.SetGeometryShaderParameter(GL_GEOMETRY_OUTPUT_TYPE_ARB, GL_TRIANGLE_STRIP);
  grassGeometryBasedAmbient.SetGeometryShaderParameter(GL_GEOMETRY_INPUT_TYPE_ARB, GL_POINTS);

  glDisable(GL_CULL_FACE);
  glDisable(GL_TEXTURE_2D);
  glDisable(GL_BLEND);

  glPointSize(12);
  glColor3f(1, 1, 1);

  // only the vertex array is used
  glEnableClientState(GL_VERTEX_ARRAY);
  glDisableClientState(GL_COLOR_ARRAY);
  glDisableClientState(GL_NORMAL_ARRAY);
  glDisableClientState(GL_SECONDARY_COLOR_ARRAY_EXT);
  glDisableClientState(GL_TEXTURE_COORD_ARRAY);

  size_t vbsize = sizeof(GPUGrassVertex);
  glVertexPointer(3, GL_FLOAT, vbsize, &growPoint[0].origin);

  // draw the single point
  int numIndexes = 1;
  unsigned short ids[1] = { 0 };
  glDrawElements(GL_POINTS, numIndexes, GL_UNSIGNED_SHORT, ids);

  // restore client state (don't allow this to be cached!)
  glEnableClientState(GL_TEXTURE_COORD_ARRAY);
  glEnableClientState(GL_VERTEX_ARRAY);
  glEnableClientState(GL_COLOR_ARRAY);
  glDisableClientState(GL_NORMAL_ARRAY);
  glDisableClientState(GL_SECONDARY_COLOR_ARRAY_EXT);
  
  glEnable(GL_TEXTURE_2D);
  glEnable(GL_CULL_FACE);

  grassGeometryBasedAmbient.Disable();




[vertex]

void main()
{
	gl_Position.xyz = gl_Vertex.xyz;
	gl_FrontColor = gl_Color;
}

[geometry]

void main()
{
	float size_scale = 1.0;
	float GRASS_HEIGHT = 50.0 * size_scale;
	float GRASS_WIDTH = 30.0 * size_scale;
	vec3 origin = vec3(-17.000000, 15.000000, -136.000000); //gl_PositionIn[0].xyz;

	// construct billboard
	vec3 verts[4];
	verts[0]=vec3(-GRASS_WIDTH,0,0);
	verts[1]=vec3(GRASS_WIDTH,0,0);
	verts[2]=vec3(GRASS_WIDTH,0,GRASS_HEIGHT);
	verts[3]=vec3(-GRASS_WIDTH,0,GRASS_HEIGHT);
	//
	
	
	gl_Position = gl_ModelViewProjectionMatrix * vec4(origin + verts[0],1);
	//gl_FrontColor = 1; // gl_FrontColorIn[0];	
	EmitVertex();

	gl_Position = gl_ModelViewProjectionMatrix * vec4(origin + verts[1],1);
	//gl_FrontColor = 1; //gl_FrontColorIn[0];	
	EmitVertex();

	gl_Position = gl_ModelViewProjectionMatrix * vec4(origin + verts[2],1);
	//gl_FrontColor = 1; // gl_FrontColorIn[0];
	EmitVertex();

	gl_Position = gl_ModelViewProjectionMatrix * vec4(origin + verts[3],1);
	//gl_FrontColor = 1; //gl_FrontColorIn[0];
	EmitVertex();
	
	EndPrimitive();
}

Hey guys, it just does not work, but I get no errors.

Here is my debug code; some lines are commented out.

I expect to see a quad, at least.

GS is supported on my hardware.

mhagain

It's difficult to say without seeing the code behind your various classes, but I have a hunch.

Your use of "_ARB" and "_EXT" suffixes throughout your code suggests that you may also be using the ancient GL_ARB_shader_objects extension. It's not a great idea to mix this extension with more modern functionality: the extension was never actually taken up into core OpenGL (GL 2.0 shaders use a slightly different API that was never even available as an extension), and as such more modern functionality is not even guaranteed to work with it.

If that's the case, you should rip out all use of this extension and just use the core GL 2.0 entry points and GLenums instead.
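
For reference, the core GL 2.0 route for building a program with a geometry shader looks roughly like this. This is only a sketch, assuming GL_ARB_geometry_shader4 for the geometry-specific parts; BuildProgram is a made-up helper name and all compile/link log checking is omitted:

GLuint BuildProgram(const char* vsSrc, const char* gsSrc)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);        // not glCreateShaderObjectARB
    glShaderSource(vs, 1, &vsSrc, NULL);
    glCompileShader(vs);

    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER_ARB);  // shader type from ARB_geometry_shader4
    glShaderSource(gs, 1, &gsSrc, NULL);
    glCompileShader(gs);

    GLuint prog = glCreateProgram();                      // not glCreateProgramObjectARB
    glAttachShader(prog, vs);                             // not glAttachObjectARB
    glAttachShader(prog, gs);

    // geometry shader input/output layout must be in place when the program is linked
    glProgramParameteriARB(prog, GL_GEOMETRY_INPUT_TYPE_ARB, GL_POINTS);
    glProgramParameteriARB(prog, GL_GEOMETRY_OUTPUT_TYPE_ARB, GL_TRIANGLE_STRIP);
    glProgramParameteriARB(prog, GL_GEOMETRY_VERTICES_OUT_ARB, 4);

    glLinkProgram(prog);                                  // not glLinkProgramARB
    return prog;                                          // bind with glUseProgram(prog)
}

Note that the program and shader handles are plain GLuint here, not GLhandleARB.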

mhagain

The critical question is: are you using GL_ARB_shader_objects?

 

You need to answer this, because "sometimes working, sometimes not, weird stuff happening" is exactly the kind of symptom I would expect if you were (I observed it myself once when I accidentally did the same thing).

evillgames

Yes, I do use GL_ARB_shader_objects.

 

So:

bool CGlslShader::Enable()
{
  if (!m_loaded) return false; // not loaded / not supported

  glUseProgramObjectARB(programObject);
  return true;
}

void CGlslShader::Disable()
{
  if (!m_loaded) return; // not loaded / not supported

  glUseProgramObjectARB(0);
}

glCreateShaderObjectARB
glCompileShaderARB
glAttachObjectARB
glLinkProgramARB

etc.
 

mhagain

So that's your problem then.

 

Because GL_ARB_shader_objects is old and because it was never taken up into core OpenGL, it's not even guaranteed to know what a geometry shader is.

 

I'm going to guess here that you're also using GL_ARB_geometry_shader4, so it's important to note the specification of that extension: http://www.opengl.org/registry/specs/ARB/geometry_shader4.txt

 

"This extension is written against the OpenGL 2.0 specification."

 

As I said before, GL_ARB_shader_objects is not the same GLSL API that exists in GL 2.0: GL_ARB_shader_objects was never promoted to core GL, and the GL 2.0 shader API never previously existed as an extension. So while it may work sometimes, you shouldn't expect it to work consistently or robustly all of the time.

 

So if you want to use a geometry shader, you should dump GL_ARB_shader_objects and use the core GL2.0 API instead.

BitMaster
Extensions don't happen for something that is already in core; extensions happen for something that does not exist (yet). In many cases useful extensions get promoted to core with little or no change. Sometimes they don't, for a variety of reasons. In all cases it is a bad idea to just mix extensions and core, for example by passing core constants to ARB functions.

mhagain

Ah, I see.

 

The idea that all functionality in GL versions above 1.1 is provided by extensions is actually a misconception; that's not the way things work, and maybe some of the terminology used by GL documentation and tutorials is to blame for fostering it.

 

A new GL version is not just 1.1 plus a bunch of extensions. Sometimes extensions are promoted, which means that the "ARB", "EXT", etc. suffixes get removed. Sometimes they're unchanged when promoted, sometimes they're radically changed. Sometimes a new GL version gets new functionality that was never available as an extension, and sometimes an extension is never promoted.

 

The potentially confusing terminology I mentioned above is that the GL docs tell you to use the extension-loading mechanism for accessing new functionality. That's not the same thing as using extensions: you use the loading mechanism, not the extensions themselves.

 

So in the case of the GL 2.0 shader API, it doesn't mean that you need to use GL_ARB_shader_objects. What it means is that you check whether your GL_VERSION is 2.0 or higher, then (e.g. on Windows) do a bunch of wglGetProcAddress calls for glCreateProgram, glCreateShader, glAttachShader, and friends. Then you check that they're all non-NULL, and if so you can use the core GL 2.0 shader API. Note that there are no -ARB suffixes, and note that it's glCreateProgram, not glCreateProgramObjectARB; it's a different API.
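
As a rough sketch of that on Windows (the function-pointer typedefs come from the stock <GL/glext.h>, only three entry points are shown, and the version check is kept deliberately naive):

#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>

PFNGLCREATEPROGRAMPROC glCreateProgram = NULL;
PFNGLCREATESHADERPROC  glCreateShader  = NULL;
PFNGLATTACHSHADERPROC  glAttachShader  = NULL;
// ...and so on for the rest of the GL 2.0 shader API

bool LoadGL20ShaderAPI()
{
    // requires a current GL context; GL_VERSION is e.g. "2.1.2 NVIDIA ..."
    const char* version = (const char*)glGetString(GL_VERSION);
    if (!version || version[0] < '2')
        return false;

    glCreateProgram = (PFNGLCREATEPROGRAMPROC)wglGetProcAddress("glCreateProgram");
    glCreateShader  = (PFNGLCREATESHADERPROC)wglGetProcAddress("glCreateShader");
    glAttachShader  = (PFNGLATTACHSHADERPROC)wglGetProcAddress("glAttachShader");

    // only usable if every pointer came back non-NULL
    return glCreateProgram && glCreateShader && glAttachShader;
}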

 

So why was this allowed to happen with shaders? One probable reason is simply this: somebody cocked up. The ARB made plenty of cock-ups in the GL 2.0/2.1/3.0 timeframe, so we shouldn't be surprised to find another one here.

evillgames

In other words, when I switch to GL3, I will:

- create a context with version 3.0

- check whether the functionality is supported

- then call GetProcAddress for glCreateProgram, NOT glCreateProgramObjectARB

- use it just like an extension function

Am I right?

BitMaster
Unless you have very specific constraints, I would not call wglGetProcAddress myself. Use something like GLEW. Getting all the extension information you need for a non-trivial project is a lot of work, and there is quite a bit of potential for annoying screwups when doing it by hand.
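
With GLEW it boils down to something like this (just a sketch; call it once, right after the GL context is created):

#include <GL/glew.h>

bool InitGLFunctions()
{
    if (glewInit() != GLEW_OK)
        return false;

    // core GL 2.0 shader API (glCreateProgram & co.) plus the geometry shader extension
    return GLEW_VERSION_2_0 && GLEW_ARB_geometry_shader4;
}

GLEW loads all the entry points for you; the GLEW_VERSION_x_y and GLEW_<extension> booleans just tell you what you are allowed to call.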

evillgames

In other words:

I can use the extension route (GetProcAddress with the "ARB" names), which can be screwed up, as I see; or I can create a 4.0 context and GetProcAddress with no ARB suffix; or, if I use GLEW, there is GL_VERSION_2_0 .... GL_VERSION_3_0 .... GL_VERSION_4_0 and all the other versions that OpenGL 4 consists of.

GLEW is cool, but my project cannot be GPL (AFAIK GLEW goes under the GPL), so, well, copy/paste then )))

And what about mobile and GL ES?

BitMaster
GLEW is available under the modified BSD license. You can link it completely statically, and even that does not force your own code into a specific license.

OpenGL ES 2 is not identical to OpenGL 2, but in most cases a program written for OpenGL 2 core can be ported to OpenGL ES 2 with a bit of effort, even if the original program was not written with GL ES in mind.

A lot of games simply require at least OpenGL 2/2.1 core functionality and then add additional features depending on which further core functions (for higher versions) and extensions are available; see the sketch below.
There is always a risk that features are broken in specific drivers. It happens every now and again, even to AAA games. The risk is much lower while you are using common functionality. Extremely new extensions, or completely obsolete ones (like the shader ARB), run a much higher risk.
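
The skeleton of that "required baseline plus optional extras" approach might look something like this (a sketch using GLEW; the capability struct and its fields are made up for illustration):

struct GpuCaps
{
    bool geometryShaders;
    bool floatTextures;
};

bool DetectCaps(GpuCaps& caps)
{
    // hard requirement: refuse to run below GL 2.1
    if (!GLEW_VERSION_2_1)
        return false;

    // optional features: prefer the core version, fall back to the extension, or do without
    caps.geometryShaders = GLEW_VERSION_3_2 || GLEW_ARB_geometry_shader4;
    caps.floatTextures   = GLEW_VERSION_3_0 || GLEW_ARB_texture_float;
    return true;
}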

