
OpenGL Z depth test and transparency don't work together


Hello. I'm trying to make an Android game and I have come across a problem. I want to draw the different map layers at different Z depths so that some tiles are drawn above the player while others are drawn under him. But there's an issue with the alpha pixels in the tiles drawn above the player. This is the code I'm using:

void setup() {
    GLES20.glEnable(GLES20.GL_DEPTH_TEST);
    GLES20.glEnable(GL10.GL_ALPHA_TEST);
    GLES20.glEnable(GLES20.GL_TEXTURE_2D);
}

void render() {
    GLES20.glClearColor(0, 0, 0, 0);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);

    // bind textures and draw the vertices
}

My vertex shader:

uniform mat4 MVPMatrix; // model-view-projection matrix
uniform mat4 projectionMatrix;

attribute vec4 position;
attribute vec2 textureCoords;
attribute vec4 color;
attribute vec3 normal;

varying vec4 outColor;
varying vec2 outTexCoords;
varying vec3 outNormal;

void main()
{
    outNormal = normal;
    outTexCoords = textureCoords;
    outColor = color;
    gl_Position = MVPMatrix * position;
}

My fragment shader:

precision highp float;

uniform sampler2D texture;

varying vec4 outColor;
varying vec2 outTexCoords;
varying vec3 outNormal;

void main()
{
    vec4 color = texture2D(texture, outTexCoords) * outColor;
    gl_FragColor = color;
}

I have attached a picture of how it looks. You can see the black squares near the tree; these squares should be transparent, as they are in the PNG image:

2018-01-22-15-58-11.png

It's strange that in this picture, instead of transparency or even plain black, it displays the grass texture beneath the player and the tree:

2018-01-22-15-58-11.png

Any ideas on how to fix this?

 

Thanks in advance :)

 

 

Edited by EddieK


This is a classic problem with transparency in computer graphics. For true transparency you cannot rely on the depth buffer, because the transparent parts of your geometry still write depth. Instead you must sort the transparent geometry back to front and draw it with depth writes disabled. But in your case, since this looks like a pixel-art style game, you can keep using the depth buffer and use alpha testing instead of blending. Alpha testing means discarding transparent pixels: the fragment shader throws away any fragment whose alpha is below a threshold, so it never writes color or depth. This works well with depth-buffer sorting, but every pixel ends up either fully transparent or fully opaque. For your art style it might work out perfectly.

To enable alpha testing, you could write something like this in the fragment shader:

vec4 color = texture2D(texture, outTexCoords) * outColor;

// Alpha test: throw away nearly transparent fragments so they never
// write color or depth
if (color.a < 0.5)
{
    discard;
}

gl_FragColor = vec4(color.r, color.g, color.b, 1.0);

That code will discard any fragment which has less than 0.5 alpha.
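
For completeness, if you ever need real partial transparency (soft edges, smoke, glass) rather than a hard cutout, the sorted approach mentioned above would look roughly like this on the Java side: draw the opaque geometry first with the depth test and depth writes on, then draw the blended geometry sorted back to front with depth writes turned off. This is only a minimal sketch; drawOpaqueLayers() and drawTransparentTilesBackToFront() are hypothetical helpers standing in for your own draw calls, and it assumes textures with straight (non-premultiplied) alpha.

void render() {
    GLES20.glClearColor(0, 0, 0, 0);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

    // Pass 1: opaque geometry with normal depth testing and depth writes
    GLES20.glEnable(GLES20.GL_DEPTH_TEST);
    GLES20.glDepthMask(true);
    GLES20.glDisable(GLES20.GL_BLEND);
    drawOpaqueLayers();

    // Pass 2: blended geometry, sorted back to front; keep the depth test
    // so opaque geometry still occludes it, but disable depth writes so
    // transparent texels don't block anything drawn after them
    GLES20.glEnable(GLES20.GL_BLEND);
    GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    GLES20.glDepthMask(false);
    drawTransparentTilesBackToFront();

    // Restore depth writes so the next frame's depth clear works
    GLES20.glDepthMask(true);
}

If your textures use premultiplied alpha (as your original GL_ONE / GL_ONE_MINUS_SRC_ALPHA blend func suggests), keep that blend func instead of GL_SRC_ALPHA.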

Share this post


Link to post
Share on other sites
9 minutes ago, turanszkij said:


Big thanks, that did the trick :)


