I CAN DO

Members
  • Content count: 65

Community Reputation

164 Neutral

About I CAN DO

  • Rank: Member
  1. OpenGL

    Come on guys.
  2. I have a problem implementing bump mapping with OpenGL and GLSL.

     Vertex shader:

     #version 330 core

     layout (std140) uniform Matrices {
         mat4 projViewModelMatrix;
         mat3 normalMatrix;
         mat4 modelViewMatrix;
     };

     uniform mat4 View;
     uniform vec3 LightPosition;
     uniform vec3 CameraPosition;

     in vec3 position;
     in vec3 normal;
     in vec2 texCoord;
     // Bump mapping
     in vec3 vTangent;
     in vec3 bTangent;

     out vec2 TexCoord;
     out vec3 Normal;
     // Bump mapping: light and eye vectors in tangent space
     out vec3 lightVec;
     out vec3 eyeVec;

     void main()
     {
         Normal = normalize(normalMatrix * normal);
         gl_Position = projViewModelMatrix * vec4(position, 1.0);
         TexCoord = texCoord;

         // Bump mapping: build the TBN basis in view space
         vec3 n = normalize(normalMatrix * normal);
         vec3 t = normalize(normalMatrix * vTangent);
         vec3 b = normalize(normalMatrix * bTangent);

         vec3 vVertex = vec3(modelViewMatrix * vec4(position, 1.0));

         // Light direction in view space, projected onto the TBN basis
         // vec3 tmpVec = LightPosition.xyz - vVertex.xyz;
         vec3 tmpVec = (View * vec4(LightPosition, 1.0)).xyz - vVertex.xyz;
         lightVec.x = dot(tmpVec, t);
         lightVec.y = dot(tmpVec, b);
         lightVec.z = dot(tmpVec, n);

         // Eye direction in view space, projected onto the TBN basis
         tmpVec = -vVertex;
         eyeVec.x = dot(tmpVec, t);
         eyeVec.y = dot(tmpVec, b);
         eyeVec.z = dot(tmpVec, n);
     }

     Fragment shader:

     #version 330

     layout (std140) uniform Material {
         vec4 diffuse;
         vec4 ambient;
         vec4 specular;
         vec4 emissive;
         float shininess;
         int texCount;
     };

     uniform sampler2D texUnit;

     // Bump mapping
     uniform sampler2D normalMap;
     uniform vec4 diffuseLight;
     uniform vec4 ambientLight;
     uniform vec4 specularLight;

     in vec3 Normal;
     in vec2 TexCoord;
     in vec3 lightVec;
     in vec3 eyeVec;

     out vec4 outputF;

     void main()
     {
         // Unpack the tangent-space normal from the normal map
         vec3 N2 = normalize(texture(normalMap, TexCoord).xyz * 2.0 - 1.0);
         vec3 L2 = normalize(lightVec);
         vec3 V2 = normalize(eyeVec);
         vec3 R2 = normalize(-reflect(L2, N2));

         float NdotL = max(0.0, dot(N2, L2));
         float RdotV = max(0.0, dot(R2, V2));

         // Compute final colours
         vec4 ambient = ambientLight * ambient;
         vec4 diffuse = diffuseLight * diffuse * NdotL;
         vec4 specular2 = specularLight * specular * pow(RdotV, shininess);

         vec4 base = texture(texUnit, TexCoord);

         // Final colour
         outputF = base * ambient + base * diffuse + specular2;
     }

     Is there an error somewhere? I feel that the calculation of the light direction is wrong. Could that be it?

     Is the line

     vec3 tmpVec = (View * vec4(LightPosition, 1.0)).xyz - vVertex.xyz;

     OK? And the line

     tmpVec = -vVertex;

     is it OK, or do I need to change it to

     tmpVec = (View * vec4(CameraPosition, 1.0)).xyz - vVertex.xyz;

     or

     tmpVec = CameraPosition.xyz - vVertex.xyz;

     Please help me. Thank you, best regards.
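     For what it's worth, here is a minimal CPU-side sketch of the same math, assuming GLM and that LightPosition is given in world space; every variable name in it is hypothetical. It illustrates why the two lines asked about are consistent with each other: once vVertex is in view space, the light position must also be brought into view space with the View matrix, and the eye vector is simply -vVertex because the camera sits at the origin of view space, so no CameraPosition term is needed there.

     #include <glm/glm.hpp>

     // Sketch only: the same view-space math as the vertex shader, done with GLM.
     // All names here are hypothetical, not taken from the post.
     struct ViewSpaceVectors {
         glm::vec3 toLight; // view-space vector from the vertex towards the light
         glm::vec3 toEye;   // view-space vector from the vertex towards the camera
     };

     ViewSpaceVectors computeViewSpaceVectors(const glm::mat4& view,
                                              const glm::mat4& model,
                                              const glm::vec3& positionObj,   // object-space vertex position
                                              const glm::vec3& lightPosWorld) // world-space light position
     {
         // Vertex brought into view space (matches vVertex = modelViewMatrix * position)
         glm::vec3 vVertex   = glm::vec3(view * model * glm::vec4(positionObj, 1.0f));
         // Light brought into the same space (matches View * vec4(LightPosition, 1.0))
         glm::vec3 lightView = glm::vec3(view * glm::vec4(lightPosWorld, 1.0f));

         ViewSpaceVectors out;
         out.toLight = lightView - vVertex; // both points are in view space, so the difference is consistent
         out.toEye   = -vVertex;            // the camera sits at the origin of view space
         return out;
     }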
  3. Everything is OK with the shadow map texture in the shader, but it is scaled (too big). The texture coordinate:

     mat4 bias = mat4(0.5, 0.0, 0.0, 0.0,
                      0.0, 0.5, 0.0, 0.0,
                      0.0, 0.0, 0.5, 0.0,
                      0.5, 0.5, 0.5, 1.0);
     ShadowCoord = bias * projMatrix * viewModelMatrix * vec4(position, 1);

     Why does another example compute the shadow map coordinate as:

     ShadowCoord = bias * projMatrix * viewModelMatrix * ViewMatrixInverse;

     And another as:

     ShadowCoord = bias * projMatrix * viewModelMatrix * ModelMatrixRenderObject * vec4(position, 1);

     Please, can anyone explain the differences, and why my shadow map is scaled? Thank you, best regards.
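     The three variants usually differ only in which space the input position lives in. Below is a minimal sketch, assuming GLM and hypothetical matrix names (lightProj, lightView, model, cameraView): the bias and the projection/view inside ShadowCoord must be the light's, applied to a world-space position; if the shader only has the object-space position you append the object's model matrix, and if it only has the camera eye-space position you first undo the camera view with its inverse. Feeding a position from the wrong space through this chain is one common way to end up with a stretched or wrongly scaled shadow.

     #include <glm/glm.hpp>

     // Sketch only: building the shadow texture matrix with GLM.
     // lightProj, lightView, model and cameraView are hypothetical names.
     glm::mat4 shadowMatrixFromWorld(const glm::mat4& lightProj,
                                     const glm::mat4& lightView)
     {
         // Remaps the light's clip space [-1, 1] to texture space [0, 1]
         const glm::mat4 bias(0.5f, 0.0f, 0.0f, 0.0f,
                              0.0f, 0.5f, 0.0f, 0.0f,
                              0.0f, 0.0f, 0.5f, 0.0f,
                              0.5f, 0.5f, 0.5f, 1.0f);
         // In the shader: ShadowCoord = shadowMatrix * vec4(worldPosition, 1.0);
         return bias * lightProj * lightView;
     }

     // Object-space input:     shadowMatrixFromWorld(...) * model                  -> * vec4(position, 1.0)
     // Camera eye-space input: shadowMatrixFromWorld(...) * glm::inverse(cameraView) -> * eyeSpacePosition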
  4. Thank you Sponji, that was the problem.
  5. OK, thank you! Like this?  glDrawArrays(GL_POINTS, 0, MAX_PARTICLES);
  6. I'm working on a particle system class from this tutorial: Particles - Anton's OpenGL 4 Wiki - Dr Anton Gerdelan. Code:

     // Vertex shader
     // shader to update a particle system based on a simple kinematics function
     #version 330

     in vec3 v;       // initial velocity
     in float tZero;  // start time

     uniform mat4 projViewModelMatrix;
     uniform vec3 emitterPos_wor; // emitter position in world coordinates
     uniform float T;             // system time T in seconds

     out float opacity;

     void main()
     {
         // work out how many seconds into our particle's life-time we are (after its starting time)
         float t = T - tZero;
         vec3 p;
         // gradually make particle more transparent over its life-time
         opacity = 1 - (t / 3) - 0.2;
         // particle stays put until it has reached its birth second
         if (t > 0) {
             // gravity
             vec3 a = vec3(0, -10, 0);
             // this is a standard kinematics equation of motion with velocity (from VBO) and acceleration (gravity)
             p = emitterPos_wor + v * t + 0.5 * a * t * t;
         } else {
             p = emitterPos_wor;
         }
         gl_Position = projViewModelMatrix * vec4(p, 1);
     }

     // Fragment shader
     // shader to render simple particle system's points
     #version 330

     uniform sampler2D textureMap; // I used a texture for my particles
     uniform vec4 Color;

     in float opacity;
     out vec4 fragColour;

     void main()
     {
         // using point uv coordinates which are pre-defined over the point
         vec4 texcol = texture(textureMap, gl_PointCoord);
         fragColour = vec4(1 - opacity, 1 - opacity, 1 - opacity, 1 - opacity) * texcol * Color; // bright blue!
     }

     // CPU side
     bool ParticleSystem::init(vec3 Position)
     {
         std::vector<vec3> Velocidad;
         std::vector<float> Life;

         for (int i = 0; i < MAX_PARTICLES; i++) {
             Velocidad.push_back(vec3(0, -1, 0));
         }
         for (int i = 0; i < MAX_PARTICLES; i++) {
             Life.push_back(0.001f * (float)(i));
         }

         glGenVertexArrays(1, &m_VAO);
         glBindVertexArray(m_VAO);

         glGenBuffers(ARRAY_SIZE_IN_ELEMENTS(m_Buffers), m_Buffers);

         glBindBuffer(GL_ARRAY_BUFFER, m_Buffers[VEL_VB]);
         glBufferData(GL_ARRAY_BUFFER, sizeof(Velocidad[0]) * Velocidad.size(), &Velocidad[0], GL_STATIC_DRAW);
         glEnableVertexAttribArray(0);
         glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

         glBindBuffer(GL_ARRAY_BUFFER, m_Buffers[LIF_VB]);
         glBufferData(GL_ARRAY_BUFFER, sizeof(Life[0]) * Life.size(), &Life[0], GL_STATIC_DRAW);
         glEnableVertexAttribArray(1);
         glVertexAttribPointer(1, 1, GL_FLOAT, GL_FALSE, 0, 0);

         glBindVertexArray(0);
         return true;
     }

     // Final render
     bool ParticleSystem::render()
     {
         glActiveTexture(GL_TEXTURE0);
         glBindTexture(GL_TEXTURE_2D, ModeloOrtho.getTextureFromID(24));
         glTexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, GL_TRUE);
         glEnable(GL_POINT_SPRITE);
         glPointSize(150.0f);

         shaderParticle.setUniform("projViewModelMatrix", vsml->get(VSMathLib::PROJ_VIEW_MODEL));
         shaderParticle.setUniform("emitterPos_wor", ParticleInit);
         shaderParticle.setUniform("T", ParticleTime);
         shaderParticle.setUniform("Color", ColorParticle);

         glUseProgram(shaderParticle.getProgramIndex());
         glBindVertexArray(m_VAO);
         glDrawElements(GL_POINTS, 0, GL_UNSIGNED_INT, 0);
         glBindVertexArray(0);

         glActiveTexture(GL_TEXTURE0);
         glBindTexture(GL_TEXTURE_2D, 0);
         return true;
     }

     The problem is that nothing happens. And if I change this line:

     glDrawElements(GL_POINTS, MAX_PARTICLES, GL_UNSIGNED_INT, 0);

     the system crashes. What am I doing wrong? Thank you, best regards.
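     For reference, a minimal sketch of the corrected draw path that the replies above settle on: since no index buffer is ever created for the particles, the VAO has to be drawn with glDrawArrays rather than glDrawElements. The function and parameter names here are made up; in the post the corresponding values are shaderParticle.getProgramIndex(), m_VAO and MAX_PARTICLES.

     #include <GL/glew.h> // assuming GLEW (or any other loader) provides the GL 3.x entry points

     // Sketch only: non-indexed point rendering for the particle VAO.
     void renderParticles(GLuint program, GLuint vao, GLsizei particleCount)
     {
         glUseProgram(program);
         glBindVertexArray(vao);
         // No GL_ELEMENT_ARRAY_BUFFER exists for these particles, so glDrawElements
         // has nothing to read from; the non-indexed call is the right one.
         glDrawArrays(GL_POINTS, 0, particleCount);
         glBindVertexArray(0);
         glUseProgram(0);
     }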
  7. OK, here is the solution. The line:

     lib3ds_matrix_copy(MatA, M[j]->matrix); // Here is the problem!

     was the problem, because that matrix is the model matrix, not the animation matrix. The animation matrix is here:

     Lib3dsNode* Node = lib3ds_file_node_by_name(f, "Model", LIB3DS_NODE_MESH_INSTANCE);
     lib3ds_matrix_copy(MatA, Node->matrix); // Here is the solution.

     Thank you.
  8. OK, now I see my question was understood. Thank you very much. I have already been researching this technology and have developed small demo applications on Windows and Mac, but I cannot find a way to develop OpenGL applications for web pages other than with FLARtoolkit in Flash. If you have any knowledge of this, please point me in the right direction on how to develop such applications with ARToolKit and OpenGL in a web page. Again, thank you very much.
  9. Moderator, you disrespected me with those words. I have been a member of this community for many years, and I think you have been unfair in referring to my question that way. I asked it briefly because I only need the name of the tool for augmented reality applications on web pages, so that I can investigate it and then put together a budget for a customer who needs it. The people on this site are very intelligent, with varied knowledge of all kinds of technology; it is undoubtedly the best place where my questions have been answered and clarified, thanks to the intellectual level of its members, so much so that when I receive an answer I accept it, and part of that respect is that I heed their instructions. I send you my greetings and I hope you will retract.
  10. If, for example, I increase it to 60.0f, the shadow of the object at the centre deforms and is not as accurate as it is now, but the error disappears. Is there some way to filter the pixels in the fragment shader to eliminate those that have not been rendered?
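      One standard option for smoothing those edge pixels, sketched below as an assumption rather than as a fix specific to this thread: if the shadow map is a depth texture sampled through a sampler2DShadow in the fragment shader, enabling the depth-compare mode together with linear filtering gives 2x2 hardware percentage-closer filtering, which averages away isolated mis-compared texels. The texture handle name is hypothetical.

      #include <GL/glew.h> // assuming GLEW (or another loader) for the GL 3.x tokens

      // Sketch only: configure a depth texture so that sampling it through a
      // sampler2DShadow performs 2x2 hardware PCF. shadowMapTex is hypothetical.
      void enableHardwarePCF(GLuint shadowMapTex)
      {
          glBindTexture(GL_TEXTURE_2D, shadowMapTex);
          // Compare the stored depth against the reference coordinate...
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_REF_TO_TEXTURE); // GL_COMPARE_R_TO_TEXTURE on pre-3.0 contexts
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
          // ...and let GL_LINEAR average the four nearest comparison results.
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
          glBindTexture(GL_TEXTURE_2D, 0);
      }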
  11. What should be modified to increase the size of the view volume?

      1 - gluLookAt(L[0]->position[0], L[0]->position[1], L[0]->position[2], 0, 0, 1, 0, 0, 1);

      2 - gluPerspective(45.0f, SHADOW_WIDTH/SHADOW_HEIGHT, 0.1f, 1000.0f);

      3 - const float mBias[] = { 0.5, 0.0, 0.0, 0.0,
                                  0.0, 0.5, 0.0, 0.0,
                                  0.0, 0.0, 0.5, 0.0,
                                  0.5, 0.5, 0.5, 1.0 };
          glMatrixMode(GL_TEXTURE);
          glLoadMatrixf(mBias);
          glMultMatrixf(g_mProjection);
          glMultMatrixf(g_mModelView);
          glMultMatrixf(g_mCameraInverse);
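      For reference, a minimal sketch of the part that actually defines the size of the view volume: it is the light's projection in line 2. Widening the field of view and pushing the far plane out enlarges a perspective volume, and for a directional-style light an orthographic projection with larger extents does the same; line 1 (gluLookAt) only places the light, and line 3 only remaps clip space to texture space. Note also that if SHADOW_WIDTH and SHADOW_HEIGHT are integers, the aspect ratio in line 2 is computed with integer division and is worth casting. The numeric extents below are placeholders, not values from the post.

      #include <GL/gl.h>
      #include <GL/glu.h>

      // Sketch only: two ways to enlarge the light's view volume for the shadow pass.
      void setupLightProjection(bool useOrtho, double aspect)
      {
          glMatrixMode(GL_PROJECTION);
          glLoadIdentity();
          if (useOrtho) {
              // Directional-style light: an orthographic volume big enough to cover the scene.
              glOrtho(-500.0, 500.0, -500.0, 500.0, 0.1, 4000.0);
          } else {
              // Perspective light: wider field of view, farther far plane.
              gluPerspective(90.0, aspect, 0.1, 4000.0);
          }
          glMatrixMode(GL_MODELVIEW);
      }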
  12. The border line? And how could I remove it?
  13. Thanks, that was very helpful. Do you have a reference page or source code for implementing this correction algorithm?
  14. Yes, Krypt0n! Thank you! Now, another question: is there another way to choose the correct normal vector, given the three points of the triangle, without needing a dot product? The object is very irregular and not all faces have the same orientation with respect to the origin of coordinates.
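      A minimal sketch, assuming GLM and that the triangle's vertices are stored with a consistent counter-clockwise winding when seen from outside the object: with such a winding, the cross product of the two edge vectors already points outward, so no dot product against a reference direction is needed. If the winding is not consistent, some reference test is still unavoidable.

      #include <glm/glm.hpp>

      // Sketch only: face normal from three vertices given in counter-clockwise
      // order as seen from outside; the winding alone fixes the direction.
      glm::vec3 faceNormal(const glm::vec3& p0, const glm::vec3& p1, const glm::vec3& p2)
      {
          return glm::normalize(glm::cross(p1 - p0, p2 - p0));
      }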
  15. Thanks VanillaSnake21. I need to develop a PC game with two different cameras, each on a different monitor. How can I do that?