
OpenGL ES Problems with directional lighting shader


Hi Guys,

I have been struggling for a number of hours trying to make a directional (per fragment) lighting shader. 

I have been following this tutorial in the 'per fragment' section of the page - http://www.learnopengles.com/tag/per-vertex-lighting/ along with tutorials from other sites.

This is what I have at this point.

// Vertex shader

attribute vec3 in_Position;
attribute vec3 in_Normal;
attribute vec4 in_Colour;

varying vec3 v_Position;
varying vec3 v_Normal;
varying vec4 v_Colour;
varying vec3 v_LightPos;

uniform vec3 u_LightPos;
uniform mat4 worldMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;

void main()
{
    vec4 object_space_pos = vec4(in_Position, 1.0);
    gl_Position = worldMatrix * vec4(in_Position, 1.0);
    gl_Position = viewMatrix * gl_Position;     // WV
    gl_Position = projectionMatrix * gl_Position;

    mat4 WV = worldMatrix * viewMatrix;
    
    v_Position = vec3(WV * object_space_pos);
    v_Normal = vec3(WV * vec4(in_Normal, 0.0));
    v_Colour = in_Colour;
    v_LightPos = u_LightPos;
}

And the fragment shader:

// Fragment

varying vec3 v_Position;
varying vec3 v_Normal;
varying vec4 v_Colour;
varying vec3 v_LightPos;

void main()
{
    float dist = length(v_LightPos - v_Position);
    vec3 lightVector = normalize(v_LightPos - v_Position);
    float diffuse_light = max(dot(v_Normal, lightVector), 0.1);
    diffuse_light = diffuse_light * (1.0 / (1.0 + (0.25 * dist * dist)));
    
    gl_FragColor = v_Colour * diffuse_light;
}

If I change the last line of the fragment shader to 'gl_FragColor = v_Colour;' the model (a white sphere) will render to the screen in solid white, as expected.

But if I leave the shader as is above, the object is invisible.

I suspect that it is something to do with this line in the vertex shader, but I am at a loss as to what is wrong.

v_Position = vec3(WV * object_space_pos);

If I comment the above line out, I get some sort of shading going on which looks like it is trying to light the subject (the normals appear to be taken into account, etc.)

Any help would be hugely appreciated.

Thanks in advance :)


Hi, lonewolff!

My initial guess would be that your v_Position and v_LightPos are in different coordinate spaces, making the distance and light direction computation meaningless. v_Position is in view space:

v_Position = vec3(WV * object_space_pos);

Can you make sure that you also transform v_LightPos into view space somewhere in your app?
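For example, something like this in the vertex shader would do it (just a sketch; you could equally do the multiplication on the CPU and upload the result, assuming u_LightPos is given in world space):

    // Bring the world-space light position into view space,
    // so it lives in the same space as v_Position.
    v_LightPos = vec3(viewMatrix * vec4(u_LightPos, 1.0));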

Does it help if you comment out attenuation?

diffuse_light = diffuse_light * (1.0 / (1.0 + (0.25 * dist * dist)));

Also, probably just a formality, but directional lighting doesn't use a light position or attenuation; it's an approximation for the case when the light source is very far away relative to the scale of the scene, e.g. the sun illuminating a building. In such a case we assume that the light rays all travel in the same direction and that the light's intensity falloff with distance is negligible. Point lighting would be a better name in this case.
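For reference, a purely directional diffuse term would look roughly like this (only a sketch; u_LightDir is an assumed uniform holding the normalized direction the light travels in, expressed in the same space as v_Normal):

    uniform vec3 u_LightDir;   // assumed: normalized light direction

    void main()
    {
        vec3 N = normalize(v_Normal);
        float diffuse_light = max(dot(N, -u_LightDir), 0.1);   // no position, no attenuation
        gl_FragColor = vec4(v_Colour.rgb * diffuse_light, v_Colour.a);
    }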


So basically the tutorial shows how to do per-pixel lighting, which is really easy to implement.

You pass the vertices to the shader and test the distance between the light and each pixel fragment; whenever a fragment is within the light's radius, you colour it with the diffuse light colour.

 

Let me show the code first.

Vertex shader:

attribute vec3 Vpos;

uniform vec4 MVP1;
uniform vec4 MVP2;
uniform vec4 MVP3;
uniform vec4 MVP4;

uniform vec4 WM1;
uniform vec4 WM2;
uniform vec4 WM3;
uniform vec4 WM4;

vec4 vertexClip;

varying highp vec3 vertex_pos;

// Dot product of a matrix row with a point (w assumed to be 1).
float dp43(vec4 matrow, vec3 p)
{
    return ( (matrow.x*p.x) + (matrow.y*p.y) + (matrow.z*p.z) + matrow.w );
}

void main()
{
    // Clip-space position: multiply by the MVP matrix, one row at a time.
    vertexClip.x = dp43(MVP1, Vpos);
    vertexClip.y = dp43(MVP2, Vpos);
    vertexClip.z = dp43(MVP3, Vpos);
    vertexClip.w = dp43(MVP4, Vpos);

    // World-space position, passed on to the fragment shader.
    vertex_pos.x = dp43(WM1, Vpos);
    vertex_pos.y = dp43(WM2, Vpos);
    vertex_pos.z = dp43(WM3, Vpos);

    gl_Position = vertexClip;
}


Fragment shader:

varying highp vec3 vertex_pos;

uniform highp vec3 LPOS;        // light position (world space)
uniform highp vec3 LDIFF;       // light diffuse colour
uniform highp float LRadius;    // light radius

// Euclidean distance between two points.
highp float n3ddistance(highp vec3 first_point, highp vec3 second_point)
{
    highp float x = first_point.x - second_point.x;
    highp float y = first_point.y - second_point.y;
    highp float z = first_point.z - second_point.z;
    highp float val = x*x + y*y + z*z;
    return sqrt(val);
}

void main()
{
    highp float dst = n3ddistance(LPOS, vertex_pos);
    highp float intensity = clamp(1.0 - dst / LRadius, 0.0, 1.0);
    highp vec4 color = vec4(LDIFF.x, LDIFF.y, LDIFF.z, 1.0) * intensity;
    gl_FragColor = color;
}

 

 

Now a short explanation: you pass the vertex's world coordinate to the fragment shader, then you can test it against the light position. That means your object could be centred at position 0,0,0 and then you could apply this:

Matrix44<float> wrld;
wrld.TranslateP(ship[i]->pos);                   // make translation matrix
wrld = ship[i]->ROTATION_MAT * wrld;
MVP = (wrld * ACTUAL_VIEW) * ACTUAL_PROJECTION;

Then we are ready to make a directional light. Whenever you pass a normal, it is either transformed by the object's rotation matrix or you already have it fixed in the buffer.

Then take dot(normal, lightdir): whenever it is 0 or positive it gives you the ambient lighting colour, otherwise we can get diffuse lighting.
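In shader terms that boils down to something like this (a rough sketch following the convention above; color, ambientColor and diffuseColor are just example names):

    highp float ndotl = dot(normal, lightdir);
    if (ndotl >= 0.0)
        color = ambientColor;                 // surface faces away from the light
    else
        color = diffuseColor * (-ndotl);      // facing the light, scaled by the angle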

The idea is to find which fragments are inside the cone. To do that you can simply define the light radius and the radius of the base of this cone, bR.

 

Since we have the base radius, the light radius and the light direction, we can simply check whether a fragment is inside the spotlight. To do that you need a closest-point-on-line function.

First define the two line ends: the first is your light position, the second one is lightpos + lightdir * radius.

Now you test the fragment position against this line and get the distance between the fragment and the closest point on the line.

Having this, we need to test whether the fragment is, so to speak, inside the triangle.

Given the closest point on the line, we can find the distance from the light position to that point on the cone's axis; let it be cvX. Divide it by the light radius and we get the 'percentage of the distance'; multiply that percentage by the base radius. If the fragment's distance to the closest point on the cone's axis is less than this, you can colour the fragment with the diffuse colour.
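Roughly in shader terms the test could look like this (only a sketch of the idea; lightDir, bR and the closestPointOnLine helper are assumptions based on the description above, not code from an actual engine):

    uniform highp vec3 lightDir;    // assumed: normalized spotlight direction
    uniform highp float bR;         // assumed: radius of the cone's base

    // Assumed helper: closest point to p on the segment a..b.
    highp vec3 closestPointOnLine(highp vec3 a, highp vec3 b, highp vec3 p)
    {
        highp vec3 ab = b - a;
        highp float t = clamp(dot(p - a, ab) / dot(ab, ab), 0.0, 1.0);
        return a + ab * t;
    }

    // Inside main(), after computing vertex_pos:
    highp vec3 coneEnd = LPOS + lightDir * LRadius;                    // far end of the cone axis
    highp vec3 onAxis  = closestPointOnLine(LPOS, coneEnd, vertex_pos);
    highp float cvX    = distance(LPOS, onAxis);                       // distance along the axis
    highp float maxR   = (cvX / LRadius) * bR;                         // allowed radius at that distance
    if (distance(vertex_pos, onAxis) < maxR)
        gl_FragColor = vec4(LDIFF, 1.0);                               // fragment is inside the cone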

 

end of story


Thanks for the replies guys. :)

@dietrich - Right you are! I have moved the light into view space and taken away attenuation for the time being, to turn it into a directional light only.

Things are looking better, but still some anomalies.

[screenshot: NqeIbs1.png]

The only thing I can see is that the light position is off on the z-axis.

Sphere is at 0,0,0
light is at -5, 2, 5
camera is at 0, 0, -5

Going by this, the light should be behind the object, not in front.

Does this give a hint at any calculation I may have missed?

Here is the shader in its current form.

attribute vec3 in_Position;
attribute vec3 in_Normal;
attribute vec4 in_Colour;

varying vec3 v_Position;
varying vec3 v_Normal;
varying vec4 v_Colour;
varying vec3 v_LightPos;

uniform vec3 u_LightPos;
uniform mat4 worldMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;

void main()
{
    vec4 object_space_pos = vec4(in_Position, 1.0);
    gl_Position = worldMatrix * vec4(in_Position, 1.0);
    gl_Position = viewMatrix * gl_Position;
    gl_Position = projectionMatrix * gl_Position;

    mat4 WV = worldMatrix * viewMatrix;
    
    v_Position = vec3(WV * object_space_pos);
    v_Normal = vec3(WV * vec4(in_Normal, 0.0));
    v_Colour = in_Colour;
    
    vec4 object_space_light = vec4(u_LightPos, 1.0);
    v_LightPos = vec3(WV * object_space_light);
}

and

varying vec3 v_Position;
varying vec3 v_Normal;
varying vec4 v_Colour;
varying vec3 v_LightPos;

void main()
{
    float dist = length(v_LightPos - v_Position);
    vec3 lightVector = normalize(v_LightPos - v_Position);
    float diffuse_light = max(dot(v_Normal, lightVector), 0.1);
//  diffuse_light = diffuse_light * (1.0 / (1.0 + (0.0000001 * dist * dist)));
    
//    gl_FragColor = v_Colour * diffuse_light;
    gl_FragColor = vec4(v_Colour.rgb * diffuse_light, v_Colour.a);
}

Thanks again, this is hugely appreciated :)


Then I'd suggest simplifying things a bit and moving the lighting computations into world space, to see if that changes anything.

v_Position will then become

v_Position = vec3(worldMatrix * object_space_pos);

and light position will remain unchanged,

v_LightPos = u_LightPos;
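If you go that way, the normal should be transformed by the world matrix only as well, something like this (assuming worldMatrix contains no non-uniform scaling):

    v_Normal = vec3(worldMatrix * vec4(in_Normal, 0.0));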

 


@dietrich

Arggh! Turns out I was working with a screwy model. I have enabled backface culling and all is now great.

A huge thanks to you though, I have learned a lot today! :D


Ah, well, that happens too, glad to hear it's fixed :)

One more thing: you're currently using v_Normal as is, but you really want to normalize it again in the fragment shader before doing any computations with it (the tutorial seems to be missing this step). Each fragment receives an interpolated normal, and a linear interpolation of unit vectors is not necessarily a unit vector itself; here's a nice illustration (image via Google):

[illustration: Figure1.png]


Cool, thanks for that.

How would you go about re-normalising as above? From the diagram, how would you work out where the height of the red area should be?

 



  • Advertisement
  • Advertisement
  • Popular Tags

  • Advertisement
  • Popular Now

  • Similar Content

    • By EddieK
      Hi I am having this problem where I am drawing 4000 squares on screen, using VBO's and IBO's but the framerate on my Huawei P9 is only 24 FPS. Considering it has 8-core CPU and a pretty powerful GPU, I don't think it is not capable of drawing 4000 textured squares at 60FPS.
      I checked the DMMS and found out that most of the time spent was by the put() method of the FloatBuffer, but the strange thing is that if I'm drawing these squares outside of the view frustum, the FPS increases. And I'm not using frustum culling. 
      If you have any ideas what could be causing this, please share them with me. Thank you in advance.
    • By EddieK
      Hi, so I am trying to implement packed VBO's with indexing on OpenGL but I have run across problems. It worked fine when I had separate buffers for vertex positions, colors and texture coordinates. But when I tried to put everything into a single packed buffer, it completely glitched out. Here's the code which I am using:
      this.vertexData.position(0); this.indexData.position(0); int stride = (3 + 4 + 2) * 4; GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[0]); GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, vertexData.capacity()*4, vertexData, GLES20.GL_STATIC_DRAW); ShaderAttributes attributes = graphicsSystem.getShader().getAttributes(); GLES20.glEnableVertexAttribArray(positionAttrID); GLES20.glVertexAttribPointer(positionAttrID, dimensions, GLES20.GL_FLOAT, false, stride, 0); GLES20.glEnableVertexAttribArray(colorAttrID); GLES20.glVertexAttribPointer(colorAttrID, 4, GLES20.GL_FLOAT, false, stride, dimensions * 4); GLES20.glEnableVertexAttribArray(texCoordAttrID); GLES20.glVertexAttribPointer(texCoordAttrID, 2, GLES20.GL_FLOAT, false, stride, (dimensions + 4) * 4); GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, buffers[3]); GLES20.glBufferData(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexData.capacity()*2, indexData, GLES20.GL_STATIC_DRAW); GLES20.glDrawElements(mode, count, GLES20.GL_UNSIGNED_SHORT, 0); The data in vertex buffer is ordered like this:
      Vertex X, vertex Y, vertex Z, Color r, color g, color b, color a, Tex coord x, tex coord z and so on... (And I am pretty certain that the buffer I'm using is in this order)
      This is the version of the code which worked fine:
      this.vertexData.position(0); this.vertexColorData.position(0); this.vertexTexCoordData.position(0); this.indexData.position(0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[0]); GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, vertexPositionData.capacity()*4, vertexPositionData, GLES20.GL_STATIC_DRAW); ShaderAttributes attributes = graphicsSystem.getShader().getAttributes(); GLES20.glEnableVertexAttribArray(positionAttrID); GLES20.glVertexAttribPointer(positionAttrID, 4, GLES20.GL_FLOAT, false, 0, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[1]); GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, vertexColorData.capacity()*4, vertexColorData, GLES20.GL_STATIC_DRAW); GLES20.glEnableVertexAttribArray(colorAttrID); GLES20.glVertexAttribPointer(colorAttrID, 4, GLES20.GL_FLOAT, false, 0, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[2]); GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, vertexTexCoordData.capacity()*4, vertexTexCoordData, GLES20.GL_STATIC_DRAW); GLES20.glEnableVertexAttribArray(textCoordAttrID); GLES20.glVertexAttribPointer(textCoordAttrID, 4, GLES20.GL_FLOAT, false, 0, 0); */ GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, buffers[3]); GLES20.glBufferData(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexData.capacity()*2, indexData, GLES20.GL_STATIC_DRAW); GLES20.glDrawElements(mode, count, GLES20.GL_UNSIGNED_SHORT, 0); This is the output of the non working code:

      From this picture I can see that some of the vertex positions are good, but for some reason every renderable object from the game has a at least one vertex position of value 0
      Thank in advance,
      Ed
    • By AhmedSaleh
      I'm trying to write a leather material shader. I have a normal map, bump map (grayscaled), specular map, diffuse map, cube maps.
      I have done the following

       
      #version 100 precision highp int; precision highp float; uniform sampler2D diffuseColorMap; uniform sampler2D ambientOcclusionMap; uniform sampler2D normalMap; uniform sampler2D specularMap; uniform sampler2D bumpMap; uniform samplerCube envMap; varying vec2 texCoord[2]; varying vec3 viewWorld; uniform float reflectionFactor; uniform float diffuseFactor; uniform float opacity; varying vec3 eyeVector; varying mat3 world2Tangent; varying vec3 lightVec; varying vec3 halfVec; varying vec3 eyeVec; void main() {   vec3 normalTangent = 2.0 * texture2D (normalMap, texCoord[0]).rgb - 1.0;        vec4 x_forw = texture2D( bumpMap, texCoord[0]+vec2(1.0/2048.0, 0.0));     vec4 x_back = texture2D( bumpMap, texCoord[0]-vec2(1.0/2048.0, 0.0));     vec4 y_forw = texture2D( bumpMap, texCoord[0]+vec2(0.0, 1.0/2048.0));     vec4 y_back = texture2D( bumpMap, texCoord[0]-vec2(0.0, 1.0/2048.0));     vec3 tangX = vec3(1.0, 0.0, 3.0*(x_forw.x-x_back.x));     vec3 tangY = vec3(0.0, 1.0, 3.0*(y_forw.x-y_back.x));     vec3 heightNormal = normalize(cross(tangX, tangY));     heightNormal = heightNormal*0.5 + 0.5;            float bumpAngle = max(0.0, dot(vec3(0.0,0.0,1.0),heightNormal ));       vec3 normalWorld = normalize(world2Tangent *heightNormal);      vec3 refDir = viewWorld - 2.0 * dot(viewWorld,normalWorld) * normalWorld;     // compute diffuse lighting                 vec4 diffuseMaterial = texture2D (diffuseColorMap, texCoord[0]);         vec4 diffuseLight  =  vec4(1.0,1.0,1.0,1.0);                  // In doom3, specular value comes from a texture           vec4 specularMaterial =  texture2D (specularMap, texCoord[0])  ;         vec4 specularLight = vec4(1.0,1.0,1.0,1.0);         float shininess = pow (max (dot (halfVec,heightNormal), 0.0), 2.0)  ;         vec4 reflection = textureCube(envMap, refDir);         //gl_FragColor=diffuseMaterial * diffuseLight * lamberFactor ;         //gl_FragColor+=specularMaterial * specularLight * shininess ;         //gl_FragColor+= reflection*0.3;     gl_FragColor = diffuseMaterial*bumpAngle ;  }  
      My question is how would I use the bump map (Grayscale) to the result of the reflection or what's wrong in my shader ?
       
       
       

    • By radek spam
      Hi,
      I would like to create a province map, something like in attached example of Age Of Conquest. I would like to use Libgdx.
      After some research i learnt that it can be done by using two images, one with graphics and second invisible with distinct colors to handle clicks.
      I have some doubts about this method:
      how to deal with memory, i have created sample map with size of 960x540 and it weighs 600kb, i would need 10 times bigger map. I could cut it in some smaller pieces and render them but im afraid that it can cause lags when scrolling the map how to deal with highlighting the provinces. I managed to implement simple highlight limited to one province creating filter in OpenGl fragment shader. But what if i want to highlight multiple provinces (eg. highlight all provinces of some country). I guess It can be done by shader too but it may be much complicated i would like to also implement Fog of War over the undiscovered provinces. How one could do that? I would really appreciate your guidance. Perhaps to create the above map i need some other method?

    • By lawnjelly
      I'm interested in rendering a grayscale output from a shader, to save into a texture for later use. I only want an 1 channel 8 bit texture rather than RGBA, to save memory etc.
      I can think of a number of possible ways of doing this in OpenGL off the top of my head, just wondering what you guys think is the best / easiest / most compatible way, before I dive into coding? This has to work on old android OpenGL ES2 phones / tablets etc, so nothing too funky.
      Is there some way of rendering to a normal RGBA frame buffer, then using glCopyTexSubImage2D or similar to copy + translate the RGBA to a grayscale texture? This would seem the most obvious, and the docs kind of suggest it might work. Creating an 8 bit framebuffer. If this is possible / a good option? Rendering out RGBA, using glReadPixels, translating on the CPU to grayscale then reuploading as a fresh texture. Slow and horrible but this is a preprocess, and would be a good option is this is more guaranteed to work than other methods.
  • Advertisement