Shader blending ( Get previous pixel )



#1 Gyiove   Members   -  Reputation: 141


Posted 09 March 2014 - 08:39 AM

Hello.

 

I want to control the way OpenGL blends transparent surfaces.

 

Vertex shader

#version 120
varying vec2 texcoord;
 
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
    texcoord = gl_MultiTexCoord0.xy;
}

Fragment shader

#version 120
uniform sampler2D img;
varying vec2 texcoord;

void main() 
{
	vec4 tex = texture2D ( img, texcoord );
	gl_FragColor = tex;
}

Those are my basic texture-rendering shaders.
What I need is a way to read, inside the shader, the pixel that is already behind the surface I am currently drawing.

It would be great to have control over that.
I have also read the basics of shaders, but did not get any ideas on how to do it.

Reading on the internet, I saw that OpenGL offers some blend functions, and I have already tried them.
There is nothing wrong with them, but the way they behave does not suit me. I really need to control how the pixels are blended.
Does anyone have a solution to my problem?
Thanks!




#2 Juliean   GDNet+   -  Reputation: 2605


Posted 09 March 2014 - 08:52 AM

http://wiki.delphigl.com/index.php/glBlendFunc

 

You need to set GL_ONE for dfactor, and depending on how you want to apply the new pixel it will probably also be GL_ONE for sfactor (the first parameter).
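For illustration, a purely additive blend (the new pixel simply added onto whatever is already in the framebuffer) would be set up roughly like this; a minimal sketch, assuming the blending state is not touched anywhere else:

/* Additive blending: result = src * 1 + dst * 1, i.e. the incoming
   fragment is added to the colour already in the framebuffer. */
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);

/* ... draw the transparent geometry ... */

glDisable(GL_BLEND);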



#3 Gyiove   Members   -  Reputation: 141


Posted 09 March 2014 - 09:02 AM

That is a really good resource, yet not exactly what I wanted.
I'll give you a code example:

#version 120
uniform sampler2D img;
varying vec2 texcoord;

void main() 
{
	vec4 tex = texture2D ( img, texcoord );
	vec4 behindpixel = ?; // <-- how do I read the pixel that is already behind this fragment?
	gl_FragColor = (behindpixel * 0.5 ) + (tex * 0.5 );
}

In theory I could then blend pixels without using glBlendFunc.
Getting the pixel that is behind would give me so much control; I could make many new, nice blending types.

If I could somehow know where the pixel will end up on screen (its x, y screen coordinates),
then I could store those values and use them. The order would not matter either (whether the back pixel or the front one is rendered first);
I could figure it out somehow.


Edited by Gyiove, 09 March 2014 - 09:37 AM.


#4 haegarr   Crossbones+   -  Reputation: 4309


Posted 09 March 2014 - 09:29 AM

AFAIK fragment shaders cannot read framebuffers directly. However, you can use an FBO with a texture attached as its color target in a first pass, and bind that texture for sampling in a second pass. That should give you the access you need.
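A rough sketch of that two-pass idea, assuming a window of size w x h; the names colorTex and fbo are placeholders for illustration:

/* Setup: a texture that will receive the first pass, attached to an FBO. */
GLuint colorTex, fbo;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);

/* First pass: render everything that is "behind" into the FBO. */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
/* ... draw the opaque scene ... */

/* Second pass: back to the default framebuffer; the texture now holds the
   background pixels and can be sampled by the fragment shader. */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, colorTex);
/* ... draw the transparent surface with a shader that samples colorTex ... */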

 

Just out of curiosity, what kinds of blending are you not able to create with the standard blend functions?


Edited by haegarr, 09 March 2014 - 09:30 AM.


#5 theagentd   Members   -  Reputation: 587


Posted 09 March 2014 - 09:29 AM

Nope, that can't be done. You can't read from and write to the same texture, due to how desktop GPUs work. It can, however, be done on tile-based mobile GPUs.



#6 Gyiove   Members   -  Reputation: 141


Posted 09 March 2014 - 09:36 AM

AFAIK fragment shaders cannot read framebuffers directly. However, you can use an FBO with a texture attached as its color target in a first pass, and bind that texture for sampling in a second pass. That should give you the access you need.

I'm familiar with that approach, and I'm afraid it will be inefficient.
I'm not sure how fast it is, but I still believe there must be some other way.

 

 

Just out of curiosity, what kinds of blending are you not able to create with the standard blend functions?

 

For example: if you are looking out of a window, the pixels seen through it should come out a different color, even though the window texture itself is, say, plain white.



#7 Juliean   GDNet+   -  Reputation: 2605


Posted 09 March 2014 - 09:39 AM

You can't read from the same target that you are currently rendering to. To make what you want work, you would need two render targets and ping-pong between them: first write to #1, then in the next pass bind #1 as a uniform sampler and write to #2, and so on.
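A minimal sketch of that ping-pong loop, assuming two FBOs with colour textures (fboA/texA and fboB/texB, placeholder names) have already been created:

/* Read from one target while writing to the other, then swap roles. */
GLuint readTex  = texA;
GLuint writeFbo = fboB;
for (int pass = 0; pass < numPasses; ++pass)
{
    glBindFramebuffer(GL_FRAMEBUFFER, writeFbo);   /* write into the other target */
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, readTex);         /* sample the previous result */

    /* ... draw with a shader that reads the bound texture ... */

    /* swap roles for the next pass */
    if (readTex == texA) { readTex = texB; writeFbo = fboA; }
    else                 { readTex = texA; writeFbo = fboB; }
}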

 

However, are you sure you really need this? What you wrote here:

gl_FragColor = (behindpixel * 0.5 ) + (tex * 0.5 );

can be realised using glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) and outputting

void main() 
{
	vec3 tex = texture2D ( img, texcoord ).rgb;
	gl_FragColor = vec4(tex, 0.5f);
}

I believe that for effects that only need the pixel underneath the current operation, and only want to modify it in the way you posted, glBlendFunc is all you need. Really look closely at the page I linked and at the available options; there is almost everything you could need. Otherwise, you can go with my first suggestion.
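The matching state setup would look something like this (a sketch, assuming the shader above is bound and the geometry is drawn in between):

/* Standard alpha blending: result = src.rgb * src.a + dst.rgb * (1 - src.a).
   With the shader writing alpha = 0.5 this gives the 50/50 mix from above. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
/* ... draw the transparent geometry ... */
glDisable(GL_BLEND);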

 

EDIT:

 

 

I'm familiar with that approach, and I'm afraid it will be inefficient.
I'm not sure how fast it is, but I still believe there must be some other way.

 

It's as fast as switching input textures can be. FBOs are standard and used so widely for advanced graphical effects that the overhead is negligible. Handling the swapping can be a little awkward, but that's all there is to it.

 

 

For example: if you are looking out of a window, the pixels seen through it should come out a different color, even though the window texture itself is, say, plain white.

That looks like standard alpha blending; I now believe even more that you will be fine if you just use glBlendFunc correctly.


Edited by Juliean, 09 March 2014 - 09:44 AM.


#8 Gyiove   Members   -  Reputation: 141


Posted 09 March 2014 - 10:09 AM

[attached image: Untitled-1.png]

That is the thing that makes the wall yellow.
The color depends on the texture that is currently in the fragment shader.
So far you guys have been really helpful, and I'm glad you want to help me. Thank you so much!
Anyway, I'm really a beginner and don't yet know all the techniques and functions.
I've tried all the options of glBlendFunc, yet did not succeed in creating the effect you see in the picture, up in the right corner.



#9 Juliean   GDNet+   -  Reputation: 2605


Posted 09 March 2014 - 10:19 AM


I've tried all the options of glBlendFunc, yet did not succeed in creating the effect you see in the picture, up in the right corner.

 

Did you call glEnable(GL_BLEND) anywhere? This is definitely doable with standard blending; people would have gone insane if they had to ping-pong render targets for simple glass effects.



#10 Gyiove   Members   -  Reputation: 141


Posted 09 March 2014 - 11:41 AM

The other problem is this:

[attached image: 2014-03-09_193421.png]

 

// vertex shader

#version 120
varying vec2 texcoord;
 
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
    texcoord = gl_MultiTexCoord0.xy;
}
// fragment shader

#version 120
uniform sampler2D img;
varying vec2 texcoord;

void main() 
{
	vec4 tex = texture2D ( img, texcoord );
	if( tex.r < 0.3 && tex.g < 0.3 && tex.b > 0.6 ) tex *= 0.0; // treat blue-ish pixels as fully transparent
	else tex.a = 1.0;                                           // everything else is fully opaque
	gl_FragColor = tex;
}

		glActiveTexture(GL_TEXTURE0);                       // select texture unit 0 first...
		glBindTexture(GL_TEXTURE_2D, triangles[a].texture); // ...then bind the texture to it
		glUniform1i(glGetUniformLocation(program,"img"),0); // point the sampler at unit 0
		if(triangles[a].texture == 1) // The first texture is not supposed to be transparent
		{
			glDisable(GL_BLEND);
		}
		else
		{
			glEnable (GL_BLEND);
			glBlendFunc(GL_ONE, GL_ONE);
		}

	
		glBegin(GL_TRIANGLES);
		for( c = 0; c < 3; c++ )
		{
			glTexCoord2d(triangles[a].uvmap[c][0], -triangles[a].uvmap[c][1]);
			glNormal3d(triangles[a].normals[c][0], triangles[a].normals[c][1], triangles[a].normals[c][2]);
			glVertex3d(-triangles[a].vertex[c][1], triangles[a].vertex[c][2],triangles[a].vertex[c][0]);
		}
		glEnd();

This is one of the reasons why I really wanted to get the pixel that is behind.
I tried all the blending flags on the page you gave me.



#11 phil_t   Crossbones+   -  Reputation: 3913


Posted 09 March 2014 - 12:02 PM


gl_FragColor = (behindpixel * 0.5 ) + (tex * 0.5 );

 

What you're trying to do is pretty much standard alpha blending. GL_SRC_ALPHA and GL_ONE_MINUS_SRC_ALPHA, then output 0.5 for your alpha component.

 

I'm not sure exactly what you're trying to explain with your picture above, but it kind of seems like you want the areas around the tree leaves to be completely transparent? The same alpha blending flags will work for that too - except that you'll still be writing to the depth buffer for those completely transparent pixels, so they'll block stuff from being drawn behind them.

 

In that case, what you're looking for is alpha testing, not alpha blending.
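Since you are on the old fixed-function pipeline (GLSL 1.20, immediate mode), alpha testing can also be enabled directly in GL state; a minimal sketch, assuming the texture carries an alpha channel:

/* Fixed-function alpha test: fragments with alpha <= 0.5 are rejected
   entirely, so they never write colour or depth. */
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);
/* ... draw the foliage ... */
glDisable(GL_ALPHA_TEST);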



#12 Juliean   GDNet+   -  Reputation: 2605


Posted 09 March 2014 - 02:01 PM


This is one of the reasons why I really wanted to get the pixel that is behind.
I tried all the blending flags on the page you gave me.
 

 

You have to disable the depth buffer for this:


GL_CHECK(glDisable(GL_DEPTH_TEST));
GL_CHECK(glDepthMask(false));

and afterwards reenable


GL_CHECK(glEnable(GL_DEPTH_TEST));
GL_CHECK(glDepthMask(true));

Never use the zbuffer with alpha-blended stuff, it just doesn't work.

 

EDIT:

 

You can actually leave GL_DEPTH_TEST on and just set glDepthMask to false; otherwise your alpha-blended stuff will probably overlap your opaque geometry. Also, if you have a texture whose pixels are either fully opaque or fully transparent, you can leave z-buffering as normal and just use alpha testing, by adding this line to your shader after the texture read:

if(tex.a <= 0.0001f)
discard;
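Putting that together, a typical frame would be structured roughly like this (a sketch; the draw calls are placeholders):

/* Opaque pass: depth test and depth writes on, no blending. */
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);
/* ... draw opaque geometry ... */

/* Transparent pass: keep the depth test so opaque geometry still occludes,
   but stop writing depth, and enable blending. */
glDepthMask(GL_FALSE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
/* ... draw transparent geometry, ideally sorted back to front ... */

/* Restore state for the next frame. */
glDepthMask(GL_TRUE);
glDisable(GL_BLEND);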

Edited by Juliean, 10 March 2014 - 01:41 PM.


#13 Gyiove   Members   -  Reputation: 141


Posted 13 March 2014 - 08:52 PM

Thank you all, it is working!
I still have one more question: does DirectX allow me to do what I asked here?

(reading the pixel behind the surface...)

I've never worked with DirectX, but after reading some source code and information on the internet, I did not find a clear yes or no.
It seems more like no, perhaps because what I'm looking for does not exist, or I just don't know how to search for it because I don't know what it's called.
 


Edited by Gyiove, 13 March 2014 - 09:04 PM.


#14 phil_t   Crossbones+   -  Reputation: 3913


Posted 13 March 2014 - 11:22 PM

It's still unclear exactly what feature you're talking about, but all the things talked about on this thread (alpha blending, alpha test, render targets) are supported in DirectX, so I'm sure you'll be able to do pretty much the same thing.



#15 Gyiove   Members   -  Reputation: 141


Posted 23 March 2014 - 12:02 AM

What if I render all the opaque triangles and create texture A from the rendered screen,

then render a transparent triangle, create texture B from the screen where that one triangle was rendered, and then

render a quad with texture A blended with texture B?

(That way I have control over the pixels behind the transparent object; I can control everything.)
I would continue like this with the other transparent triangles, one by one.
I still need to find a way to know whether a transparent triangle is or should be visible, but that is a small problem right now; I have some ideas, though they are not usable yet.

How bad is this idea?
Should I do it, or find another way?
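For reference, the "create a texture from the rendered screen" step can be done without an FBO via glCopyTexSubImage2D; a rough sketch, assuming backTex is a screen-sized RGBA texture created elsewhere:

/* Copy what has been rendered so far into backTex, so the next draw call's
   shader can sample the pixels behind the transparent triangle from it. */
glBindTexture(GL_TEXTURE_2D, backTex);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, screenWidth, screenHeight);
/* ... draw the transparent triangle with backTex bound as "img" ... */

Doing that copy once per transparent triangle would be very expensive, though, which is part of what the reply below warns about.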

 

 

 



#16 C0lumbo   Crossbones+   -  Reputation: 2263


Posted 23 March 2014 - 12:35 AM

Thank you all, it is working!
I still have one more question: does DirectX allow me to do what I asked here?

(reading the pixel behind the surface...)

I've never worked with DirectX, but after reading some source code and information on the internet, I did not find a clear yes or no.
It seems more like no, perhaps because what I'm looking for does not exist, or I just don't know how to search for it because I don't know what it's called.
 

 

If I understand correctly, you're asking if there's any way to do a fully programmable blend. Sadly, the answer is no. There's a very full explanation here: http://fgiesen.wordpress.com/2011/07/12/a-trip-through-the-graphics-pipeline-2011-part-9/ (halfway down, under "Aside: Why no programmable blend?"), but basically it's a restriction of the GPU design rather than one imposed by OpenGL/D3D. In fact, the entire series at http://fgiesen.wordpress.com/2011/07/09/a-trip-through-the-graphics-pipeline-2011-index/ is excellent and recommended reading for everyone.

 

Actually, I believe fully programmable blends are possible on certain PowerVR chips, for instance the one used by the Vita, but that uses a very different architecture to your desktop GPU.

 

Ultimately, the standard fixed-function blend options, while limited, are really very powerful, and you should make every effort to fully understand what can be achieved with them before thinking about ping-ponging between buffers as suggested in your last post.



#17 haegarr   Crossbones+   -  Reputation: 4309


Posted 24 March 2014 - 06:00 AM


AFAIK fragment shaders cannot read framebuffers directly

Oh well, today I stumbled over an extension for OpenGL ES, originally from Apple but nowadays an EXT extension, that allows the fragment shader to read the active framebuffer at the current fragment location:

    EXT_shader_framebuffer_fetch

 

Perhaps not an option for the OP (and obviously not needed anyhow), but it shows that programmed blending is actually possible on some platforms.
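With that extension the "behind" pixel is simply available in the fragment shader; a minimal OpenGL ES GLSL sketch, assuming the extension is supported (gl_LastFragData[0] holds the colour currently in the framebuffer):

#extension GL_EXT_shader_framebuffer_fetch : require
precision mediump float;

uniform sampler2D img;
varying vec2 texcoord;

void main()
{
    vec4 tex    = texture2D(img, texcoord);
    vec4 behind = gl_LastFragData[0];          // colour already in the framebuffer
    gl_FragColor = 0.5 * behind + 0.5 * tex;   // fully programmed blend
}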





