WTF? Can't vid cards blend properly?



Maybe it's my fault, I don't know. What I do know is that when I render this:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glDisable(GL_DEPTH_TEST);
ViewOrthogonal();

glDisable(GL_BLEND);

glBegin(GL_QUADS);
	glColor4f(0, 1, 1, 0);
	glVertex2f( 1,  1);
	glVertex2f( 1, -1);
	glVertex2f(-1, -1);
	glVertex2f(-1,  1);
glEnd();

glEnable(GL_BLEND);
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_ONE);

glBegin(GL_QUADS);
	glColor4f(1, 1, 1, 1);
	glVertex2f( 1,  1);
	glVertex2f( 1, -1);
	glVertex2f(-1, -1);
	glVertex2f(-1,  1);
glEnd();

ViewPerspective();

All I get is blue-green, not white as expected. It seems that despite setting an alpha of 0 on the first quad, the vid card always reads a DST alpha of 1. Obviously, something is wrong here; what is it?

[edited by - DudeMiester on June 4, 2004 8:29:57 PM]

glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_ONE);

I believe this means that the final color will have an alpha of one, i.e. the full alpha value...

And another thing: why would you want a fully transparent quad? For blending to work, it must have some color to add on to...

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glDisable(GL_DEPTH_TEST);
ViewOrthogonal();

glDisable(GL_BLEND);

glBegin(GL_QUADS);
	glColor4f(0, 1, 1, 1);	// note: alpha change
	glVertex2f( 1,  1);
	glVertex2f( 1, -1);
	glVertex2f(-1, -1);
	glVertex2f(-1,  1);
glEnd();

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // func change

glBegin(GL_QUADS);
	glColor4f(1, 1, 1, TheLevelOfTransparencyNeeded); // alpha change
	glVertex2f( 1,  1);
	glVertex2f( 1, -1);
	glVertex2f(-1, -1);
	glVertex2f(-1,  1);
glEnd();

ViewPerspective();

Hope it helps! =)

[edited by - Rasmadrak on June 4, 2004 8:39:39 PM]

This is an experiment, and I happened upon an apparent bug in the alpha blending system.

Also, from what I can tell from the OpenGL 1.5 spec, the line:

glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_ONE);

means that OpenGL will create the final framebuffer color via the following formula:

(1-DST_ALPHA)*SRC_COLOR+DST_COLOR

Where the DST values are those already in the framebuffer at the current pixel, and SRC values are those about to be written/blended into the framebuffer.

Oh and I also set glBlendEquation(GL_FUNC_ADD) just to be sure.

[edited by - DudeMiester on June 4, 2004 8:52:43 PM]

Errm, I think you'll find you are wrong and blending works perfectly (which, let's be honest, shouldn't be a shock to anyone).

I've just knocked up my own framework, copied that snippet of rendering code (without the ViewOrthogonal() call), ran it, and lo! I get a white square...

int _tmain(int argc, _TCHAR* argv[])
{
	boost::shared_ptr<OpenGLWFW::WindowManager> WinMgr(new OpenGLWFW::WindowManager());

	try
	{
		WinMgr->Init();
		if(!WinMgr->FindCompatilbeOGLMode(32,24,true,8))
			return -1;
		if(!WinMgr->FindCompatibleDisplayMode(1024,768,32,75)) // 1024*768*32*75Hz
			return -1;
		int winone = WinMgr->CreateWin(false);
		WinMgr->Show();
	}
	catch (...)
	{
		return -1;
	}

	// Setup OGL
	glClearColor(0.0, 0.0, 0.0, 0.0);
	glViewport(0, 0, 1024, 768);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(0,5,0,5,1,-1);

	while (WinMgr->Dispatch())
	{
		glMatrixMode(GL_MODELVIEW);
		glLoadIdentity();
		glTranslatef(2,2,0);
		glClear(GL_COLOR_BUFFER_BIT);
		glDisable(GL_DEPTH_TEST);
		glDisable(GL_BLEND);

		glBegin(GL_QUADS);
			glColor4f(0, 1, 1, 0);
			glVertex2f( 1,  1);
			glVertex2f( 1, -1);
			glVertex2f(-1, -1);
			glVertex2f(-1,  1);
		glEnd();

		glEnable(GL_BLEND);
		glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_ONE);
		glTranslatef(0.5f,0.5f,0);

		glBegin(GL_QUADS);
			glColor4f(1, 1, 1, 1);
			glVertex2f( 1,  1);
			glVertex2f( 1, -1);
			glVertex2f(-1, -1);
			glVertex2f(-1,  1);
		glEnd();

		WinMgr->SwapBuffers();
	}
	return 0;
}

(Note: the WinMgr/OpenGLWFW stuff is my windowing system, so basically ignore that; the OpenGL code is the bit you're interested in.)

So, the problem is elsewhere and not with OGL blending at all.

For completeness this was run on a WinXP machine, ATI9800xt, Cat4.4 drivers.

edit: cleaned up some code and changed it so the two squares overlapped a bit


[edited by - _the_phantom_ on June 4, 2004 10:52:59 PM]

lol, I think I just found out why. In the PFD I didn't set the number of alpha bits. Once I set it to 8, it worked fine. On a similar note, I've heard that the GeForce 4 Ti doesn't have a hardware accumulation buffer; is this true? It's obviously very slow when I've tried using it, whether I set the number of bits in the PFD or not.
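For reference, the field in question is cAlphaBits in the Win32 PIXELFORMATDESCRIPTOR. A sketch of the relevant setup, with illustrative values (field names are from the Win32 API; the rest of the window creation is assumed):

```c
#include <windows.h>

/* Requesting destination alpha: without cAlphaBits = 8, the framebuffer
   may have no alpha channel, and GL_ONE_MINUS_DST_ALPHA reads alpha as 1. */
PIXELFORMATDESCRIPTOR pfd = {
    sizeof(PIXELFORMATDESCRIPTOR), 1,
    PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
    PFD_TYPE_RGBA,
    32,               /* cColorBits  */
    0, 0, 0, 0, 0, 0, /* R/G/B bits and shifts (let the driver pick) */
    8,                /* cAlphaBits: the fix */
    0,                /* cAlphaShift */
    0, 0, 0, 0, 0,    /* accumulation buffer bits */
    24,               /* cDepthBits  */
    8,                /* cStencilBits */
    0,                /* cAuxBuffers */
    PFD_MAIN_PLANE, 0, 0, 0, 0
};
```

This is a config fragment only; it would be passed to ChoosePixelFormat/SetPixelFormat as usual.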

Yes, that's true. AFAIK only the GFFX and R300 series (and anything after them) have a hardware accum buffer.

That sucks. Kinda weird, considering it has been part of the OGL spec for ages.

Yeah, but the spec doesn't say that anything has to be in hardware; you've been able to use a software accum buffer forever. It's just that only recently have they decided to add hardware accum buffers to consumer hardware.
