Point sprites on Windows/ATI

Started by swiftcoder; 17 comments, last by RoadToRuin 15 years, 1 month ago
I have a program using point sprites for particles. These render fine on my Mac, but having moved to Windows (Vista x64, and an ATI Radeon 4870 with Catalyst 8.5), the texture coordinates are no longer generated for the point sprites - i.e. I get a single texture coordinate across the entire point sprite. Here is the relevant code which draws the point sprites; the only complexity is that I use the second set of texture coordinates to send the particle life to the shader:
		glEnable(GL_POINT_SPRITE)
		glEnable(GL_BLEND)
		glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
		glDepthMask(GL_FALSE)
		
		glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE)
		
		glBindBuffer(GL_ARRAY_BUFFER, self.vbo)
		
		glEnableClientState(GL_VERTEX_ARRAY)
		glVertexPointer(3, GL_FLOAT, sizeof(Particle), 0)
		
		glClientActiveTexture(GL_TEXTURE1)
		glEnableClientState(GL_TEXTURE_COORD_ARRAY)
		glTexCoordPointer(1, GL_FLOAT, sizeof(Particle), sizeof(c_float)*6)
		
		glDrawArrays(GL_POINTS, 0, self.count)

		glDisableClientState(GL_TEXTURE_COORD_ARRAY)
		glClientActiveTexture(GL_TEXTURE0)
		
		glDisableClientState(GL_VERTEX_ARRAY)
		
		glBindBuffer(GL_ARRAY_BUFFER, 0)
				
		glDepthMask(GL_TRUE)
		glDisable(GL_BLEND)
		glDisable(GL_POINT_SPRITE)


As I said, it runs perfectly on the Mac, so hopefully there is a tiny issue I have missed which only shows up on the Radeon. Thanks in advance for any help or ideas.
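For reference, the shader side is roughly along these lines (a simplified sketch rather than my exact shaders; the life value arrives on texture unit 1, and unit 0 should carry the COORD_REPLACE-generated sprite coordinates):

		# Simplified sketch of the shader pair (compiled elsewhere with the usual
		# glCreateShader/glShaderSource calls); names here are illustrative.
		PARTICLE_VS = """
		uniform mat4 modelViewProjection;              // supplied by the renderer
		void main() {
		    gl_TexCoord[1] = gl_MultiTexCoord1;        // per-particle life, from texture unit 1
		    gl_Position    = modelViewProjection * gl_Vertex;
		}
		"""
		PARTICLE_FS = """
		uniform sampler2D particleTexture;
		void main() {
		    float life   = gl_TexCoord[1].x;                               // life passed through from the VS
		    vec4  colour = texture2D(particleTexture, gl_TexCoord[0].st);  // sprite coords via COORD_REPLACE
		    gl_FragColor = vec4(colour.rgb, colour.a * life);              // fade out over the lifetime
		}
		"""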

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

We're currently stuck with the same problem at the studio where I work. Our game runs perfectly on all Nvidia hardware, but sprites cause issues on ATI hardware. We were also having depth issues with point sprites on ATI; we fixed those by switching the polygon mode to GL_POINT for sprite rendering, as opposed to drawing with glDrawElements(GL_POINTS, ...), and that cured the depth issue. But like I say, we still can't get sprite texture coords on ATI.
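Roughly, that workaround looks like this (a sketch in the PyOpenGL style of your snippet rather than our actual Cg/C++ code; the primitive type and index count are illustrative):

		# Rasterise each vertex of the indexed particle geometry as a point sprite,
		# instead of submitting the particles directly with GL_POINTS.
		glEnable(GL_POINT_SPRITE_ARB)
		glPolygonMode(GL_FRONT_AND_BACK, GL_POINT)
		glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_INT, None)
		glPolygonMode(GL_FRONT_AND_BACK, GL_FILL)   # restore the normal fill mode
		glDisable(GL_POINT_SPRITE_ARB)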

The strangest thing which has us all completely stumped is that we have an old build of the game in which ATI point sprite errors are not present.

I don't think any of the above will really be all that helpful for you, but we are just as interested in finding a fix as you are.

Just to note, we are using Cg as our shading language, and our primary ATI testing cards are HD 3450s, but we have observed the errors on the HD 3800 series as well.

Good luck mate, I'll post up if we find a solution here.
A few months ago I had the same trouble - there was no way to make automatic UV generation for point sprites work on ATI.

I dropped point sprites anyway, because I needed big sprites and the current "clip at the center of the particle" behaviour (NV and ATI both - it's a specification issue) was unacceptable for my application.
It would be nice to write it off as impossible, but like I said, we have an old build with sprites working on ATI. External tutorials, such as those supplied with the OpenGL SuperBible (the Blue Book), have working sprite programs on ATI as well.
It is possible, but there must be a conflicting variable somewhere in the render state, the memory management systems or the actual compile options.

We attempted to use billboards to overcome the ATI issue here, but due to the sheer number of sprites/billboards we need to draw, the extra matrix calculations caused a severe drop in performance.

So billboards are not an option here, and sprites can work on ATI.

You can find the Blue Book source code at http://www.opengl.org/sdk/docs/books/SuperBible/ - "SB-AllSource.zip" is the one to go for on Win32. Compile the "Chapter 9: PointSprites" program and you will be able to see working sprites on ATI. (P.S. you may have to tell your linker to ignore the outdated LIBCMT.lib and LIBC.lib libraries.)

Hope this can help in some way.
Well, this pretty much sucks - it is the only instance where my Mac with X3100 and Apple-nerfed OpenGL drivers works better than the Windows/ATI setup.

In the short term, I am pushing few enough particles that I can probably implement rendering them as billboarded quads, but that isn't an ideal solution for all cases.
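Something along these lines, presumably (a rough sketch with illustrative names; in my case the view matrix would come from my own renderer rather than the GL matrix stack, and the quads would go into a VBO rather than immediate mode):

		# Extract the camera's right/up axes from the modelview matrix and expand
		# each particle into a camera-facing quad.
		m = glGetFloatv(GL_MODELVIEW_MATRIX).flatten()   # column-major, as GL stores it
		right = (m[0], m[4], m[8])                       # camera right axis in world space
		up    = (m[1], m[5], m[9])                       # camera up axis in world space
		glBegin(GL_QUADS)
		for p in particles:                              # 'particles' and 'p.size' are illustrative
		    for s, t in ((-1, -1), (1, -1), (1, 1), (-1, 1)):
		        glTexCoord2f((s + 1) * 0.5, (t + 1) * 0.5)
		        glVertex3f(p.x + (right[0] * s + up[0] * t) * p.size,
		                   p.y + (right[1] * s + up[1] * t) * p.size,
		                   p.z + (right[2] * s + up[2] * t) * p.size)
		glEnd()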

Does anyone know if ATI is aware of this issue? I don't see any mention of it on the ATI developer forums, or even on the web (via google).

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

This code works equally well on ATI and Nvidia under Windows. The Mac also runs it fine.

With moDevice
    'And now render them... first set up our device for renders...
    .Transform.World = Matrix.Identity
    .SetTextureStageState(0, TextureStageStates.AlphaArgument1, TextureArgument.TextureColor)
    .SetTextureStageState(0, TextureStageStates.AlphaArgument2, TextureArgument.Diffuse)
    .SetTextureStageState(0, TextureStageStates.AlphaOperation, TextureOperation.Modulate)
    'Ok, if our device was created with mixed...
    If moDevice.CreationParameters.Behavior.MixedVertexProcessing = True Then
        'Set us up for software vertex processing as point sprites always work in software
        moDevice.SoftwareVertexProcessing = True
    End If
    .RenderState.PointSpriteEnable = True
    If .DeviceCaps.VertexFormatCaps.SupportsPointSize = True Then
        .RenderState.PointScaleEnable = True
        If .DeviceCaps.MaxPointSize > mfParticleSize Then
            .RenderState.PointSize = mfParticleSize
        Else
            .RenderState.PointSize = .DeviceCaps.MaxPointSize
        End If
        .RenderState.PointScaleA = 0
        .RenderState.PointScaleB = 0
        .RenderState.PointScaleC = 1
    End If
    .RenderState.SourceBlend = Blend.SourceAlpha
    .RenderState.DestinationBlend = Blend.One
    .RenderState.AlphaBlendEnable = True
    .RenderState.ZBufferWriteEnable = False
    .RenderState.Lighting = False
    .VertexFormat = CustomVertex.PositionColoredTextured.Format
    '.VertexFormat = moPoints(0).Format
    .SetTexture(0, moTex)
    .DrawUserPrimitives(PrimitiveType.PointList, oEmitter.lPCnt, oEmitter.moPoints)
    'Then, reset our device...
    .RenderState.ZBufferWriteEnable = True
    .RenderState.Lighting = True
    .RenderState.PointSpriteEnable = False
    'Ok, if our device was created with mixed...
    If moDevice.CreationParameters.Behavior.MixedVertexProcessing = True Then
        'Clear our SoftwareVertexProcessing for performance
        moDevice.SoftwareVertexProcessing = False
    End If
    .RenderState.SourceBlend = Blend.SourceAlpha
    .RenderState.DestinationBlend = Blend.InvSourceAlpha
    .RenderState.AlphaBlendEnable = True
    .SetTextureStageState(0, TextureStageStates.AlphaArgument1, TextureArgument.TextureColor)
    .SetTextureStageState(0, TextureStageStates.AlphaArgument2, TextureArgument.Current)
    .SetTextureStageState(0, TextureStageStates.AlphaOperation, TextureOperation.SelectArg1)
End With
Enoch Dagor - Lead Developer, Dark Sky Entertainment - Beyond Protocol
Quote:Original post by swiftcoder
Does anyone know if ATI is aware of this issue? I don't see any mention of it on the ATI developer forums, or even on the web (via google).


People have been upset with ATI's implementation of OpenGL point sprites for years. Googling "ATI" and "GL point sprite" is as hilarious as it is annoying: you can look back over the years and see just how many people have been upset for how long.

The thing which I keep coming back to (and which has me eternally floored) is that point sprites can work on ATI hardware, and can be observed doing so - so why aren't they working in our cases?

I'm just wondering, swiftcoder, are you using GLSL or Cg? And which variables do you have enabled in your render state?

The problem we have in the studio here arose when we rewrote the render systems to incorporate the ability to fall back to fixed functionality, so that older hardware could play our games. But in our attempts to fix it we have gone as far as culling everything from the engine until it was only capable of rendering a single point sprite, and the issue still persisted.

Below is our render state initialisation (with some pseudo code for copyright reasons - sounds lame I know, but I don't own it):
// Initialise everything (if possible)
int RenderEngineGL_CG::Render_InitialiseAll()
{
	// Store the device context
	hDc = GetDC(hWnd);
	if(hDc == NULL)
		{return CONTEXT_ERROR;}
	// Set a pixel format
	SetPixelFormat(hDc, iPixelFormat, NULL);
	// Get the OpenGL rendering context
	hRc = wglCreateContext(hDc);
	if(hRc == NULL)
		{return RENDER_CONTEXT_ERROR;}
	// Check/set the current context
	if(wglMakeCurrent(hDc, hRc) == FALSE)
		{return CURRENT_CONTEXT_ERROR;}
	// Initialise GLee
	GLeeInit();
	// Get the graphics vendor
	iGLVendor = CurrentVendor;
	// Get the GL information
	fGLVersion = OpenGlVersion;
	// Check multisampling
	if(UseMultisampling == TRUE)
	{
		// Enable multisampling
		glEnable(GL_MULTISAMPLE_EXT);
		// Check the version is above 1.3
		if(fGLVersion >= 1.3)
		{
			// Use the alpha value
			glEnable(GL_SAMPLE_COVERAGE_ARB);
			// Set the sample coverage
			glSampleCoverage(1.0, FALSE);
		}
	}
	// Test the OpenGL version; if it is less than 1.2, error and quit the game as this computer cannot run it
	if(fGLVersion < 1.2)
		{return GL_VERSION_ERROR;}
	// Test for GL 1.3 or better
	if(fGLVersion < 1.3)
	{
		// Disable cloud sprites, texture filtering and mipmaps
	}
	// Test for GL 1.4 or better
	if(fGLVersion >= 1.4)
	{
		SetUpThePointAttributes();
	}
	// If the OpenGL version is less than 1.5, hard-set the fixed functionality pipeline
	if(fGLVersion < 1.5)
	{
		// Set fixed functionality
		bFixedFunctionality = TRUE;
		// Turn off all framebuffers
		UseFrameBuffers = FALSE;
		UseBrightPass   = FALSE;
		UseBlurPass     = FALSE;
	}
	// Check for GL version 1.5
	if(fGLVersion >= 1.5)
	{
		// Enable point sprites in vertex programs
		glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_ARB);
	}
	// Check if the OpenGL version is less than 2.0
	if(fGLVersion < 2.0)
	{
		// Hard-set points over sprites
		bPointsOverSprites = TRUE;
	}
	// Check the points-over-sprites value
	if((bPointsOverSprites == FALSE) && (bFixedFunctionality == FALSE))
	{
		// Enable point sprites
		glEnable(GL_POINT_SPRITE_ARB);
	}
	else
	{
		// Set up smooth points
		glEnable(GL_POINT_SMOOTH);
		// Set the hint
		glHint(GL_POINT_SMOOTH_HINT, GL_FASTEST);
	}
	// Set default point size
	glPointSize(63.0f);
	// Set the GL clear colours
	glClearColor(0.75f, 0.75f, 0.75f, 0.75f);
	glClearDepth(1.0f);
	// Clear the colour and depth buffers
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	// Set the polygon mode
	glPolygonMode(GL_FRONT, GL_FILL);
	// Disable polygon smoothing
	glDisable(GL_POLYGON_SMOOTH);
	// Set up back face culling
	glEnable(GL_CULL_FACE);
	glCullFace(GL_BACK);
	// Set up the alpha testing
	glEnable(GL_ALPHA_TEST);
	glAlphaFunc(GL_GREATER, 0.05);
	// Set up the blending function
	glEnable(GL_BLEND);
	glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
	// Set the shade model, and the winding to counter-clockwise
	glShadeModel(GL_FLAT);
	glFrontFace(GL_CCW);
	// Set up depth testing
	glEnable(GL_DEPTH_TEST);
	glDepthMask(GL_TRUE);
	glDepthFunc(GL_LESS);
	// Enable materials
	glEnable(GL_COLOR_MATERIAL);
	glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);
	// The render engine has initialised
	bInitialised = TRUE;
	// Check the fixed function flag
	if(bFixedFunctionality == TRUE)
	{
		SetUpFixedFunctionalityPipeline();
		return NO_ERRORS;
	}
	else
	{
		// Turn on framebuffers
		UseFrameBuffers = TRUE;
		InitialiseTheShaders();
		InitialisetheFramebuffers();
		return NO_ERRORS;
	}
	// If we are here then something went wrong!
	return UNDEFINED_ERROR;
}


Similarly to you, swiftcoder, we enable and disable point sprites per sprite object; however, we do not turn off the depth mask.
And just to note, our render state now is virtually identical to how it was when sprites worked for us on ATI hardware, the only real additions are the fixed functionality checks.

We have also tried using ATI's own extension headers and libraries, but all have failed to make any change.
Quote:Original post by RoadToRuin
I'm just wondering, swiftcoder, are you using GLSL or Cg? And which variables do you have enabled in your render state?
GLSL, and I don't have any support for fixed function (besides using the builtin gl_Vertex attribute). One possible difference is that I don't use the GL matrix stack - matrices are managed manually by the renderer.

As for render state, all the fixed function stuff (lighting, fog, etc.) is disabled, so all that is enabled is blending and the depth test. I checked with the depth mask on and off, with no improvement.
Quote:The problem we have in the studio here arose when we rewrote the render systems to incorporate the ability to fall back to fixed functionality, so that older hardware could play our games. But in our attempts to fix it we have gone as far as culling everything from the engine until it was only capable of rendering a single point sprite, and the issue still persisted.
The points themselves render fine, and at the correct size, so all that isn't working is the texture coordinate interpolation.

The most confusing item is that this call has absolutely no effect on either machine: glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE). You can set it to GL_TRUE or GL_FALSE, and the PC will always behave as if it is false, while the Mac (OS X 10.5, Intel integrated X3100) will always behave as if it is true. I don't know if this function has been deprecated along the way, or if both drivers just happen to have different bugs regarding it?
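A sanity check along these lines (just a sketch, using PyOpenGL as in my earlier snippet) should at least confirm whether the driver records the state at all:

		# Query the texture environment straight back after setting it, for the
		# currently active texture unit (unit 0 here).
		glActiveTexture(GL_TEXTURE0)
		glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE)
		print("COORD_REPLACE on unit 0:", glGetTexEnviv(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB))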

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Quote:Original post by swiftcoder
The most confusing item is that this call has absolutely no effect on either machine: glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE). You can set it to GL_TRUE or GL_FALSE, and the PC will always behave as if it is false, while the Mac (OS X 10.5, Intel integrated X3100) will always behave as if it is true. I don't know if this function has been deprecated along the way, or if both drivers just happen to have different bugs regarding it?


It is definitely not deprecated, as it is vital to setting sprite texture coordinates and can be observed working correctly. On the Nvidia hardware here (mostly 8800 GTXs and 9500 GTs), setting glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, ...) to GL_TRUE has the effect of generating texture coordinates for the sprites, while GL_FALSE has the sprites using the original texture coordinates; exactly the behaviour that should be expected.

It is odd that your Mac forces GL_TRUE for coordinate replacement, especially as, according to the sprite spec, the default value is GL_FALSE - but then again, it is the behaviour that is usually desired.

With coordinate replacement enabled, coordinates fed to the pipeline via glTexCoordPointer and similar functions are ignored; the original coords are still present in the vertex shader, but are overwritten/hijacked at the fragment stage. Removing the glTexCoordPointer call while sprite coordinate replacement is enabled has zero effect on Nvidia, as the sprites generate their own coords, so they render correctly; on ATI hardware, however, the sprites render black, as they no longer have texcoords to map the texture.

Is it possible we're both missing something in the shader initialisation? For us, the shader init was completely rewritten during our render system rewrite, while the shader code itself remained largely untouched. However, I have read through the GL_ARB_point_sprite spec many times and I cannot spot any mistakes in our code, and the code you posted looks correct.

The spec can be found here: http://icps.u-strasbg.fr/~marchesin/perso/extensions/ARB/point_sprite.html
The official .txt can be found on the OpenGL website, which appears to be down right now; nevertheless, here's the address: www.opengl.org/registry/specs/ARB/point_sprite.txt
SwiftCoder, I have noticed an issue with your code:
		glClientActiveTexture(GL_TEXTURE1)
		glEnableClientState(GL_TEXTURE_COORD_ARRAY)
		glTexCoordPointer(1, GL_FLOAT, sizeof(Particle), sizeof(c_float)*6)

		glDrawArrays(GL_POINTS, 0, self.count)

		glDisableClientState(GL_TEXTURE_COORD_ARRAY)
		glClientActiveTexture(GL_TEXTURE0)


You enable texture unit 1 and then send your texture coordinates into it, and you also disable it through that same unit.

Your code should probably look more like this
		glClientActiveTexture(GL_TEXTURE0_ARB)
		glEnableClientState(GL_TEXTURE_COORD_ARRAY)
		glTexCoordPointer(1, GL_FLOAT, sizeof(Particle), sizeof(c_float)*6)

		glDrawArrays(GL_POINTS, 0, self.count)

		glClientActiveTexture(GL_TEXTURE0_ARB)
		glDisableClientState(GL_TEXTURE_COORD_ARRAY)


Texture coordinates should also be sent down unit 0: when GL_COORD_REPLACE_ARB is enabled, it will always attempt to send texture coordinates down unit 0.
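One related detail worth checking (not something covered above): GL_COORD_REPLACE_ARB is a per-texture-unit environment setting, so it needs to be set while unit 0 is the active server-side unit, for example:

		# Make sure the COORD_REPLACE state lands on unit 0, the unit the sprite
		# coordinates are generated for.
		glActiveTexture(GL_TEXTURE0)
		glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE)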

Hope this helps in some way

