babaliaris

Member
  • Content Count: 101
  • Community Reputation: 126 Neutral

About babaliaris

  • Rank: Member

Social

  • Github: babaliaris
  • Steam: nbampaliaris

Recent Profile Visitors

1619 profile views
  1. Hello! I'm currently learning how depth testing works in OpenGL from these tutorials, and the tutorial says: "By default the depth function GL_LESS is used, which discards all fragments that have a depth value higher than or equal to the current depth buffer's value." If the depth value were simply the z coordinate I pass through the vertex data, that statement shouldn't be true: depth increases towards the -Z axis, so fragments with smaller z values (farther away) should be the ones discarded, not fragments with a higher z value. Are the depth values derived from the fragment's z coordinate in some other way? So if the depth value is a number from 0...N, and a fragment has a depth value of 5 while the one behind it has 10, the 5 will pass the test?
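     For reference, the value the test compares is not the raw view-space z from the vertex data: after the projection matrix and the perspective divide, z is remapped to a window-space depth in [0, 1], where 0 is the near plane by default. A closer fragment therefore ends up with a *smaller* depth value, so GL_LESS keeps it even though the camera looks down -Z. A minimal sketch of the default setup (these are standard OpenGL calls; the surrounding application is assumed):

     // Enable depth testing once during initialization.
     glEnable(GL_DEPTH_TEST);

     // GL_LESS is already the default: an incoming fragment passes only if
     // its window-space depth (0 = near, 1 = far by default) is strictly
     // less than the value currently stored in the depth buffer.
     glDepthFunc(GL_LESS);

     // Each frame, clear the depth buffer along with the color buffer,
     // otherwise last frame's depth values keep rejecting new fragments.
     glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);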
  2. babaliaris

    Lighting: Inside faces are getting lighted too?

    I have one more question. My lighting works great now, but it has no notion of occlusion: if one object sits between another object and the light source, the blocked object should not receive any light. I understand why this happens. In my diffuse calculation, for example, you can see that I use the direction from the fragment towards the light source and the fragment's normal to get the angle between them, and from that angle I derive the factor that increases or decreases the fragment's lighting (as sketched below). But this considers only the current fragment and the light source, not the fragments of other objects, so if an object is behind another one and faces the light source, that face gets lit anyway, no matter how many objects are in front of it blocking the light. Is this an advanced lighting topic? Should I wait? I'm following these tutorials; I just finished Model Loading and I'm heading to the Advanced OpenGL section.
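     For reference, the diffuse factor described above, as a minimal GLSL sketch (the variable names are illustrative): it only ever sees this one fragment and the light, which is exactly why occluders have no effect. Handling them is indeed a separate, more advanced technique (commonly shadow mapping).

     // Lambert diffuse: a function of this fragment and the light only.
     vec3 lightDirection = normalize(light.position - fragPosition); // fragment -> light
     vec3 normal         = normalize(fragNormal);
     float diffuseFactor = max(dot(normal, lightDirection), 0.0);
     // No other geometry appears anywhere in this computation, so a
     // surface facing the light is lit even when another object blocks it.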
  3. babaliaris

    Lighting: Inside faces are getting lighted too?

    Oh, I understand now. So the logic is to build the house out of multiple cubes, right? Not to use one big cube and look at it from the inside.
  4. babaliaris

    Lighting: Inside faces are getting lighted too?

    And when building a house, don't you want to see inside it? How does this work?
  5. babaliaris

    Lighting: Inside faces are getting lighted too?

    As I said, I tried to do this and it works:

    //Positions            //Normals             //Texels
    //Front face (on the +Z axis)
    -0.5f, -0.5f, 0.5f,    0.0f, 0.0f,  1.0f,    0.0f, 0.0f,
     0.5f, -0.5f, 0.5f,    0.0f, 0.0f,  1.0f,    1.0f, 0.0f,
     0.5f,  0.5f, 0.5f,    0.0f, 0.0f,  1.0f,    1.0f, 1.0f,
     0.5f,  0.5f, 0.5f,    0.0f, 0.0f,  1.0f,    1.0f, 1.0f,
    -0.5f,  0.5f, 0.5f,    0.0f, 0.0f,  1.0f,    0.0f, 1.0f,
    -0.5f, -0.5f, 0.5f,    0.0f, 0.0f,  1.0f,    0.0f, 0.0f,

    //Front face with reversed normals. Drawn slightly farther in so the depth test will pass.
    -0.5f, -0.5f, 0.49f,   0.0f, 0.0f, -1.0f,    0.0f, 0.0f,
     0.5f, -0.5f, 0.49f,   0.0f, 0.0f, -1.0f,    1.0f, 0.0f,
     0.5f,  0.5f, 0.49f,   0.0f, 0.0f, -1.0f,    1.0f, 1.0f,
     0.5f,  0.5f, 0.49f,   0.0f, 0.0f, -1.0f,    1.0f, 1.0f,
    -0.5f,  0.5f, 0.49f,   0.0f, 0.0f, -1.0f,    0.0f, 1.0f,
    -0.5f, -0.5f, 0.49f,   0.0f, 0.0f, -1.0f,    0.0f, 0.0f,

    But I think this method destroys performance, if for each face you need duplicate data just to flip the normal direction. (A possible alternative is sketched below.)
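     One way to avoid the duplicated vertices entirely (a sketch of a different technique, not something used in the thread): keep a single set of vertices, disable back-face culling with glDisable(GL_CULL_FACE) so the inside triangles are rasterized at all, and flip the normal in the fragment shader using the GLSL built-in gl_FrontFacing:

     // Light whichever side of the triangle we are actually seeing.
     vec3 normal = normalize(fragNormal);
     if (!gl_FrontFacing)   // built-in bool: false when viewing the back side
         normal = -normal;  // mirror the normal for inside faces
     // ...continue the lighting calculations with this normal.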
  6. Hello again! I have created a basic Phong model, but I noticed that inside faces are getting lighted too. It seems to me like the normals (their direction) are the same outside and inside, and this is why it happens. Take a look at the following video. From the outside it seems alright, but when I look inside the cube, the front face, which is lit from the outside, is also lit from the inside. I believe this is 99% because the inside face is actually using the same normals as the outside one, since in my vertex data normals are initialised per face (6 in total), not for 12. Do I have to create 72 vertices, 36 for the 6 outside faces and 36 for the inside ones with different normals?

     In the specular calculation, don't be surprised by this:

     vec3 viewDirection = normalize(fragPosition - viewPos);

     The viewPos is actually the Front vector of the camera, not its position, and it is relative to the view coordinate system, not the world (I'm doing the lighting calculations in world coordinates). Instead of transforming viewPos (the camera's Front) into world coordinates, I just moved it in my head from view space to world space (like we learned in maths) and came up with the above calculation, which gives me the appropriate vector for the correct angle between the view direction and the reflection of the light.

     This is my vertex data:

     //Vertex Data.
     float vertices[] = {
         // positions          // normals           // texture coords
         // back face (-Z)
         -0.5f, -0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  0.0f, 0.0f,
          0.5f, -0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  1.0f, 0.0f,
          0.5f,  0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  1.0f, 1.0f,
          0.5f,  0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  1.0f, 1.0f,
         -0.5f,  0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  0.0f, 1.0f,
         -0.5f, -0.5f, -0.5f,  0.0f,  0.0f, -1.0f,  0.0f, 0.0f,
         // front face (+Z)
         -0.5f, -0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  0.0f, 0.0f,
          0.5f, -0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  1.0f, 0.0f,
          0.5f,  0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  1.0f, 1.0f,
          0.5f,  0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  1.0f, 1.0f,
         -0.5f,  0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  0.0f, 1.0f,
         -0.5f, -0.5f,  0.5f,  0.0f,  0.0f,  1.0f,  0.0f, 0.0f,
         // left face (-X)
         -0.5f,  0.5f,  0.5f, -1.0f,  0.0f,  0.0f,  1.0f, 0.0f,
         -0.5f,  0.5f, -0.5f, -1.0f,  0.0f,  0.0f,  1.0f, 1.0f,
         -0.5f, -0.5f, -0.5f, -1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
         -0.5f, -0.5f, -0.5f, -1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
         -0.5f, -0.5f,  0.5f, -1.0f,  0.0f,  0.0f,  0.0f, 0.0f,
         -0.5f,  0.5f,  0.5f, -1.0f,  0.0f,  0.0f,  1.0f, 0.0f,
         // right face (+X)
          0.5f,  0.5f,  0.5f,  1.0f,  0.0f,  0.0f,  1.0f, 0.0f,
          0.5f,  0.5f, -0.5f,  1.0f,  0.0f,  0.0f,  1.0f, 1.0f,
          0.5f, -0.5f, -0.5f,  1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
          0.5f, -0.5f, -0.5f,  1.0f,  0.0f,  0.0f,  0.0f, 1.0f,
          0.5f, -0.5f,  0.5f,  1.0f,  0.0f,  0.0f,  0.0f, 0.0f,
          0.5f,  0.5f,  0.5f,  1.0f,  0.0f,  0.0f,  1.0f, 0.0f,
         // bottom face (-Y)
         -0.5f, -0.5f, -0.5f,  0.0f, -1.0f,  0.0f,  0.0f, 1.0f,
          0.5f, -0.5f, -0.5f,  0.0f, -1.0f,  0.0f,  1.0f, 1.0f,
          0.5f, -0.5f,  0.5f,  0.0f, -1.0f,  0.0f,  1.0f, 0.0f,
          0.5f, -0.5f,  0.5f,  0.0f, -1.0f,  0.0f,  1.0f, 0.0f,
         -0.5f, -0.5f,  0.5f,  0.0f, -1.0f,  0.0f,  0.0f, 0.0f,
         -0.5f, -0.5f, -0.5f,  0.0f, -1.0f,  0.0f,  0.0f, 1.0f,
         // top face (+Y)
         -0.5f,  0.5f, -0.5f,  0.0f,  1.0f,  0.0f,  0.0f, 1.0f,
          0.5f,  0.5f, -0.5f,  0.0f,  1.0f,  0.0f,  1.0f, 1.0f,
          0.5f,  0.5f,  0.5f,  0.0f,  1.0f,  0.0f,  1.0f, 0.0f,
          0.5f,  0.5f,  0.5f,  0.0f,  1.0f,  0.0f,  1.0f, 0.0f,
         -0.5f,  0.5f,  0.5f,  0.0f,  1.0f,  0.0f,  0.0f, 0.0f,
         -0.5f,  0.5f, -0.5f,  0.0f,  1.0f,  0.0f,  0.0f, 1.0f
     };

     This is my fragment shader:

     #version 330 core

     //Fragment output.
     out vec4 aPixelColor;

     //Inputs from the vertex shader.
     in vec3 fragNormal;
     in vec2 fragTexCoord;
     in vec3 fragPosition;

     //Light source.
     struct LightSource
     {
         vec3 position;
         vec3 ambient;
         vec3 color;
     };

     //Material.
     struct Material
     {
         sampler2D diffuse;
         sampler2D specular;
         int shininess;
     };

     //Uniforms.
     uniform LightSource light;
     uniform Material material;
     uniform vec3 viewPos;

     //Declare functions.
     vec3 GetAmbientColor();
     vec3 GetDiffuseColor();
     vec3 GetSpecularColor();

     void main()
     {
         float alpha_value   = texture(material.diffuse, fragTexCoord).w;
         vec3 ambient_color  = GetAmbientColor();
         vec3 diffuse_color  = GetDiffuseColor();
         vec3 specular_color = GetSpecularColor();

         vec3 final_color = ambient_color + diffuse_color + specular_color;

         //Set the final color.
         aPixelColor = vec4(final_color, alpha_value);
     }

     vec3 GetAmbientColor()
     {
         return light.ambient * vec3(texture(material.diffuse, fragTexCoord));
     }

     vec3 GetDiffuseColor()
     {
         vec3 light_direction = normalize(light.position - fragPosition);
         vec3 normal          = normalize(fragNormal);
         float diffuse_factor = max(dot(light_direction, normal), 0.0);
         return (light.color * diffuse_factor) * vec3(texture(material.diffuse, fragTexCoord));
     }

     vec3 GetSpecularColor()
     {
         vec3 light_direction = normalize(fragPosition - light.position);
         vec3 normal          = normalize(fragNormal);
         vec3 viewDirection   = normalize(fragPosition - viewPos);
         vec3 reflection      = normalize(reflect(light_direction, normal));
         float spec_factor    = pow(max(dot(reflection, viewDirection), 0.0), material.shininess);
         return light.color * spec_factor * vec3(texture(material.specular, fragTexCoord));
     }

     This is the vertex shader:

     #version 330 core

     layout(location = 0) in vec3 aPos;
     layout(location = 1) in vec3 aNormal;
     layout(location = 2) in vec2 aTexel;

     uniform mat4 model;
     uniform mat4 view;
     uniform mat4 proj;

     out vec3 fragNormal;
     out vec2 fragTexCoord;
     out vec3 fragPosition;

     void main()
     {
         gl_Position = proj * view * model * vec4(aPos, 1.0f);

         //Normal matrix: keeps normals correct under non-uniform scaling.
         fragNormal   = mat3(transpose(inverse(model))) * aNormal;
         fragTexCoord = aTexel;
         fragPosition = vec3(model * vec4(aPos, 1.0f));
     }
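     A note on the specular hack described above: the conventional world-space Phong formulation computes the view direction from the camera's world-space position instead. A minimal sketch, assuming a viewPos uniform that actually holds the camera position (which is not how the uniform is used in this post):

     // View direction pointing from the fragment towards the camera.
     vec3 light_direction = normalize(fragPosition - light.position); // incident ray
     vec3 normal          = normalize(fragNormal);
     vec3 viewDirection   = normalize(viewPos - fragPosition);
     vec3 reflection      = normalize(reflect(light_direction, normal));
     // Both vectors now point away from the surface, so their dot
     // product directly measures view/reflection alignment.
     float spec_factor    = pow(max(dot(viewDirection, reflection), 0.0), material.shininess);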
  7. Well, if you read my post you will see that I checked the memory alignment and the max texture size. These weren't the problem. Anyway, problem solved.
  8. Thank you very much @CrazyCdn!!! You saved me! Now that I know about this power-of-two issue, I'll be fine!!! I bought courses on Udemy, watched YouTube videos, even read tutorials and other stuff on the internet, but it's in this forum that I learned how to properly do things.
  9. LOL, wait. I resized my images so the resolutions are powers of 2 (in each direction), and it works!!! So what do you think the problem in my code is?
  10. But still, I checked this out: the resolutions of the images when this problem occurs are, in upload order, 2048x2048, 512x512, 948x948, 736x736. And still the problem appears...
  11. If I remember correctly, I read somewhere in OpenGL recommendations that you should use images whose resolution is a perfect square, with width and height a multiple of 4 (for example 2x2, 4x4, 1024x1024, 2048x2048, etc.), so what you said makes sense. But I would still love to know what's causing my current problem, just to be a good graphics programmer and be aware of everything.
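     For what it's worth, a classic cause of skewed or garbage textures with otherwise correct code (a hedged guess, not a confirmed diagnosis of this thread's bug) is OpenGL's row-alignment assumption when uploading pixels: GL_UNPACK_ALIGNMENT defaults to 4, so each source row is assumed to start on a 4-byte boundary. A tightly packed RGB image whose width times 3 bytes is not a multiple of 4 (e.g. a 511-pixel-wide image) then gets read with the wrong stride. Here width, height, and pixels stand for whatever the image loader returned:

     // Tell OpenGL the source rows are tightly packed (1-byte aligned).
     // Without this, an RGB image whose row size isn't a multiple of 4
     // bytes is read with a skewed stride, shearing the texture.
     glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

     glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                  GL_RGB, GL_UNSIGNED_BYTE, pixels);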
  12. Guys, just give me some hints. If glTexImage2D() is configured correctly, the width of the source images is a multiple of 4, and my shaders and vertex data are fine, what else could go wrong?? Does nobody have the experience to tell me? What should I check first?
  13. Yes, you are right (copy-paste mistake). For some reason I can't edit the original post, so I'm going to post the code here:

     Fragment Shader

     #version 330 core

     out vec4 Color;
     in vec2 TexCoord;

     uniform sampler2D diffuse;

     void main()
     {
         Color = texture(diffuse, TexCoord);
     }
  14. If you mean the vertex specification, you can check it in the constructor of the Renderer class. I don't think any of the vertex data in there is wrong. I would also like to mention that A LOT of people running Nvidia GPUs ran my code, and everything rendered fine; but on my computer (AMD GPU) it does this. Someone in an older post told me that "some OpenGL implementations are more forgiving than others, and 99% of the time when a glitch like this appears it's your fault, not the driver's." As much as I want to believe him, I can't find what else I might be doing wrong.
  15. LoL. Usually when I use enums I do it old-style (a bunch of #define statements, not the C++ enum type); I bet you would scream even more seeing that 😂