
too_many_stars

Member
  • Content count

    144
  • Joined

  • Last visited

Community Reputation

336 Neutral

About too_many_stars

  • Rank
    Member

Personal Information

  • Interests
    Art
    Programming
  1. OpenGL Shader class instancing question

    Thanks for the reply. Maybe a singleton registry holding a std::map<std::string, GLSL>? That would give a quick lookup table for a particular shader and would make it available in any render method. Thanks Mike
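    For the registry idea above, a minimal sketch, assuming the GLSL class with the compileAndLinkShaders()/use()/unUse() interface from the question below; the ShaderRegistry name and its methods are hypothetical, and GLSL is assumed to be movable. Each program is compiled and linked once on first request, and any render method just looks it up by name:

        #include <map>
        #include <string>
        #include <utility>

        // Hypothetical singleton: one compiled GLSL program per name, shared by every model.
        class ShaderRegistry {
        public:
            static ShaderRegistry& instance(){
                static ShaderRegistry registry;   // constructed once, on first use
                return registry;
            }

            // Compile and link on the first request, then reuse the cached program.
            GLSL& get( const std::string& name ){
                auto it = programs.find( name );
                if( it == programs.end() ){
                    GLSL program;
                    program.compileAndLinkShaders();   // in practice this would take the shader sources for 'name'
                    it = programs.emplace( name , std::move( program ) ).first;
                }
                return it->second;
            }

        private:
            ShaderRegistry() = default;
            std::map< std::string , GLSL > programs;
        };

    A render method would then do something like ShaderRegistry::instance().get( "particle" ).use(), draw, and unUse(), with no per-object compilation.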
  2. Hello Everyone, I have been going over a number of books and examples that deal with GLSL. After viewing the source code, it's common to see something like this...

        class Model{
        public:
            Model();
            void render();
        private:
            GLSL glsl_program;
        };

        ////// .cpp
        Model::Model(){
            glsl_program.compileAndLinkShaders();
        }

        void Model::render(){
            glsl_program.use();
            //render something
            glsl_program.unUse();
        }

     Is this how a shader program should be used in real-time applications? For example, if I have a particle class, do I want to be compiling and linking a vertex and fragment shader for every particle that's created? It seems to a noob such as myself that this might not be the best approach for real-time applications. If I am correct, what is the best workaround? Thanks so much for all the help, Mike
  3. Help with GL_ELEMENT_ARRAY_BUFFER

    That makes sense, thanks so much!
  4. Help with GL_ELEMENT_ARRAY_BUFFER

    Thanks so much again swiftcoder, that did the trick. I think I completely confused myself going back and forth. One more question if I may: is there a performance increase to using indices? Thanks, Mike
  5. Help with GL_ELEMENT_ARRAY_BUFFER

    Thanks for the response swiftcoder, I already tried that with the following code...

        GLfloat verts[] = {  1.0f ,  1.0f , 1.0f ,    //[ 0 ]
                            -1.0f ,  1.0f , 1.0f ,    //[ 1 ]
                            -1.0f , -1.0f , 1.0f ,    //[ 2 ]
                             1.0f , -1.0f , 1.0f };   //[ 3 ]

        GLushort indices[] = { 0 , 1 , 2 , 2 , 1 , 3 };

        GLfloat uv[] = { 1.0f , 1.0f ,
                         0.0f , 1.0f ,
                         0.0f , 0.0f ,
                         1.0f , 0.0f };

    And the result is a quad with a slice taken out of it on the left hand side. I should also note that when I use glDrawArrays (after changing the number of verts and uvs) I don't have any issues.
  6. VS2012 and Additional Include Directories

    Thank you very much for the detailed response. I will have a look at both links. Mike
  7. Hello everyone, I am having issues understanding OpenGL index buffers. Here is what I have, a simple quad...

        GLfloat verts[] = {  1.0f ,  1.0f , 1.0f ,    //[ 0 ]
                            -1.0f ,  1.0f , 1.0f ,    //[ 1 ]
                            -1.0f , -1.0f , 1.0f ,    //[ 2 ]
                             1.0f , -1.0f , 1.0f };   //[ 3 ]

        GLushort indices[] = { 0 , 1 , 2 , 2 , 1 , 3 };

        GLfloat uv[] = { 1.0f , 1.0f ,   //front face
                         0.0f , 1.0f ,
                         1.0f , 0.0f ,
                         1.0f , 0.0f ,
                         0.0f , 1.0f ,
                         0.0f , 0.0f };

     Bound, filled, and passed into my shaders with a texture. However, when I make the call

        glBindBuffer( GL_ELEMENT_ARRAY_BUFFER , index_id );
        glDrawElements( GL_TRIANGLES , 3 * 2 , GL_UNSIGNED_SHORT , (void*) NULL );

     the texture's uv coordinates are all screwed up. My winding is CCW for both the verts and the indices. Every index has a uv coordinate so I am at a loss. Mike
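     For the indexed quad above, a minimal sketch of one workable layout, assuming a VAO and shader are already bound; the handle names vbo, uv_vbo, and ibo are hypothetical. With glDrawElements every attribute array is read through the same element list, so positions and uvs each hold exactly one entry per unique vertex (four here), and the index list alone decides which vertices form the two triangles.

        // 4 unique vertices, one uv per vertex; the index list reuses them.
        GLfloat quad_verts[] = {  1.0f ,  1.0f , 1.0f ,    //[ 0 ]
                                 -1.0f ,  1.0f , 1.0f ,    //[ 1 ]
                                 -1.0f , -1.0f , 1.0f ,    //[ 2 ]
                                  1.0f , -1.0f , 1.0f };   //[ 3 ]
        GLfloat quad_uv[] = { 1.0f , 1.0f ,
                              0.0f , 1.0f ,
                              0.0f , 0.0f ,
                              1.0f , 0.0f };
        GLushort quad_indices[] = { 0 , 1 , 2 ,  2 , 3 , 0 };   // two CCW triangles covering the quad

        GLuint vbo , uv_vbo , ibo;   // hypothetical handle names
        glGenBuffers( 1 , &vbo );
        glBindBuffer( GL_ARRAY_BUFFER , vbo );
        glBufferData( GL_ARRAY_BUFFER , sizeof( quad_verts ) , quad_verts , GL_STATIC_DRAW );
        glEnableVertexAttribArray( 0 );
        glVertexAttribPointer( 0 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) 0 );

        glGenBuffers( 1 , &uv_vbo );
        glBindBuffer( GL_ARRAY_BUFFER , uv_vbo );
        glBufferData( GL_ARRAY_BUFFER , sizeof( quad_uv ) , quad_uv , GL_STATIC_DRAW );
        glEnableVertexAttribArray( 1 );
        glVertexAttribPointer( 1 , 2 , GL_FLOAT , GL_FALSE , 0 , (void*) 0 );

        glGenBuffers( 1 , &ibo );
        glBindBuffer( GL_ELEMENT_ARRAY_BUFFER , ibo );
        glBufferData( GL_ELEMENT_ARRAY_BUFFER , sizeof( quad_indices ) , quad_indices , GL_STATIC_DRAW );

        glDrawElements( GL_TRIANGLES , 6 , GL_UNSIGNED_SHORT , (void*) 0 );

     The index order { 0, 1, 2,  2, 3, 0 } is what covers both halves; with { 0, 1, 2,  2, 1, 3 } as in the posts above, the second triangle overlaps the first and leaves one slice of the quad unfilled, which matches the missing-slice symptom reported earlier in this thread.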
  8. VS2012 and Additional Include Directories

    @desiado - Thanks, I think I understand. So every solution on my PC will be able to see what's in VC++ Directories -> Include Directories, as opposed to project level only. @ChaosEngine - It's an old laptop and I have not bothered updating. However, as soon as I get something newer, that will be the first thing I do. Thanks so much guys!
  9. Hello Everyone, A quick question regarding Visual Studio 2012. Under Configuration Properties we have VC++ Directories -> Include Directories, and under C/C++ we have General -> Additional Include Directories. What exactly is the difference, and when should I be using the C/C++ Additional Include Directories? Thanks, Mike
  10. Shader Attribute Help

    Bah! You are absolutely correct, that was an embarrassing mistake. I am still trying to get used to all the variables and how information flows between the OpenGL program and the shaders themselves. And yes, I am printing out the error logs, but nothing showed up about the assignment. Thank you again so much! Mike
  11. Shader Attribute Help

    Thanks for the reply swiftcoder. I did as you said, but all I get is a black triangle, still with only the positional attribute. Please note that in the vertex shader I have the normals coming in via "in vec3 vertex_normal", and I am using them in the calculation of "normal_in_camera" on the second-last line, which is going to the fragment shader.

    New vertex shader:

        in vec3 vertex_position;   //in local space
        in vec3 vertex_normal;     //in local space

        out vec3 light_dir_camera;
        out vec3 normal_in_camera;

        uniform mat4 mvp_matrix;        //model * view * projection matrix
        uniform mat4 model_matrix;      //local to world space model matrix
        uniform mat4 view_matrix;       //camera matrix
        uniform vec3 light_pos_world;   //light position in world space

        void main(){
            vec3 world_vert_pos = (model_matrix * vec4( vertex_position , 1 )).xyz;    //get the vertex in world position
            vec3 vert_pos_camera = (view_matrix * vec4( vertex_position , 1 )).xyz;    //get the vertex in camera space
            vec3 to_camera = vec3( 0 , 0 , 0 ) - vert_pos_camera;                      //vector from vertex in camera space to the eye (which is located at the origin)
            vec3 light_pos_camera = (view_matrix * vec4( light_pos_world , 1 )).xyz;   //get the light position in camera space
            vec3 light_dir_camera = light_pos_camera + to_camera;                      //not sure about this
            vec3 normal_in_camera = (view_matrix * model_matrix * vec4( vertex_normal , 0 )).xyz;   //get the normal in camera space (note that w = 0 to signify it's a direction, not a position!)
            gl_Position = mvp_matrix * vec4( vertex_position , 1 );
        }

    Fragment shader:

        in vec3 normal_in_camera;
        in vec3 light_dir_camera;
        out vec3 frag_color;

        void main(){
            vec3 light_color = vec3( 1 , 1 , 1 );
            vec3 material_color = vec3( 0.2 , 0 , 0.2 );
            vec3 n = normalize( normal_in_camera );
            vec3 l = normalize( light_dir_camera );
            float angle = clamp( dot( n , l ) , 0 , 1 );
            frag_color = material_color * light_color * angle;
        }

    Mike
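    The black triangle here is consistent with the mistake acknowledged in the reply above: writing vec3 light_dir_camera = ... and vec3 normal_in_camera = ... inside main() declares new locals that shadow the out variables, so the values reaching the fragment shader are never written. A minimal sketch of the corrected assignments in that same vertex shader (only the redeclarations change):

        light_dir_camera = light_pos_camera + to_camera;                                    //assign to the out variable
        normal_in_camera = (view_matrix * model_matrix * vec4( vertex_normal , 0 )).xyz;    //no 'vec3' here, so the varying is actually written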
  12. Shader Attribute Help

    Hello everyone, I am doing some tutorials from this site: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-8-basic-shading/ and am having some issues with my normals and passing them on to the vertex shader to get some diffuse lighting to work. Here's a simple triangle:

        Triangle::Triangle( const glm::vec3& a , const glm::vec3& b , const glm::vec3& c ):
            angle(),
            vert_handle( 0 ),
            vao_handle( 0 ),
            normal_handle( 0 ),
            pos(){

            GLfloat verts[] = { a.x , a.y , a.z ,
                                b.x , b.y , b.z ,
                                c.x , c.y , c.z };

            GLfloat vert_normals[] = { 0.0f , 0.0f , 1.0f ,
                                       0.0f , 0.0f , 1.0f ,
                                       0.0f , 0.0f , 1.0f };

            glGenBuffers( 1 , &vert_handle );
            glBindBuffer( GL_ARRAY_BUFFER , vert_handle );
            glBufferData( GL_ARRAY_BUFFER , sizeof( verts ) , verts , GL_STATIC_DRAW );

            glGenBuffers( 1 , &normal_handle );
            glBindBuffer( GL_ARRAY_BUFFER , normal_handle );
            glBufferData( GL_ARRAY_BUFFER , sizeof( vert_normals ) , vert_normals , GL_STATIC_DRAW );

            glGenVertexArrays( 1 , &vao_handle );
            glBindVertexArray( vao_handle );

            glBindBuffer( GL_ARRAY_BUFFER , vert_handle );
            glVertexAttribPointer( 0 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) NULL );

            glBindBuffer( GL_ARRAY_BUFFER , normal_handle );
            glVertexAttribPointer( 1 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) NULL );
        }

    Creation of my glsl program:

        glsl = GLSLProgram( new GLSL() );
        glsl->compileShaders( "shaders/vert_shader.txt" , "shaders/frag_shader.txt" );
        glsl->addAttribute( "vertex_position" );
        glsl->addAttribute( "vertex_normal" );
        glsl->linkShaders();
        glsl->printActiveAttributes();

    And the vertex shader:

        #version 400

        in vec3 vertex_position;   //in local space
        in vec3 vertex_normal;     //in local space
        out vec3 frag_color;

        uniform mat4 mvp_matrix;        //model * view * projection matrix
        uniform mat4 model_matrix;      //local to world space model matrix
        uniform mat4 view_matrix;       //camera matrix
        uniform vec3 light_pos_world;   //light position in world space

        void main(){
            vec3 world_vert_pos = (model_matrix * vec4( vertex_position , 1 )).xyz;    //get the vertex in world position
            vec3 vert_pos_camera = (view_matrix * vec4( vertex_position , 1 )).xyz;    //get the vertex in camera space
            vec3 to_camera = vec3( 0 , 0 , 0 ) - vert_pos_camera;                      //vector from vertex in camera space to the eye (which is located at the origin)
            vec3 light_pos_camera = (view_matrix * vec4( light_pos_world , 1 )).xyz;   //get the light position in camera space
            vec3 light_dir_camera = light_pos_camera + to_camera;
            vec3 normal_in_camera = (view_matrix * model_matrix * vec4( vertex_normal , 0 )).xyz;   //get the normal in camera space (note that w = 0 to signify it's a direction, not a position!)
            gl_Position = mvp_matrix * vec4( vertex_position , 1 );
        }

    My issue is that the glsl program is telling me that my triangle normals are not active. However, per my understanding of the documentation, since I am using "vertex_normal" in my vertex shader, that attribute should be active, but it's not. Of course, without the normals, I can't get the light to work. Could someone please tell me what the issue is? Thanks, Mike
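    A note on the "not active" report, offered as a likely explanation rather than a confirmed diagnosis: GLSL compilers are free to optimize away attributes that never influence any shader output, and in this version of the vertex shader normal_in_camera is a local that is never used after it is computed, so vertex_normal can legitimately be reported as inactive even though it appears in the source. Separately, the constructor above never calls glEnableVertexAttribArray, so even an active attribute would not be fed from its buffer unless that happens elsewhere. A minimal sketch of the VAO setup with both arrays enabled, reusing the handles and locations from the post above:

        glBindVertexArray( vao_handle );

        glBindBuffer( GL_ARRAY_BUFFER , vert_handle );
        glEnableVertexAttribArray( 0 );                                            //position, location 0
        glVertexAttribPointer( 0 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) NULL );

        glBindBuffer( GL_ARRAY_BUFFER , normal_handle );
        glEnableVertexAttribArray( 1 );                                            //normal, location 1
        glVertexAttribPointer( 1 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) NULL );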
  13. plane point distance help

    Thanks for the replies guys, but I must be missing something. Given the correct formula that was quoted:

        dist = dot( plane_normal , point ) - plane_distance

    as opposed to the one I used:

        dist = dot( plane_normal , point ) + plane_distance

    and the same planes I defined above:

        Plane plane1( Vec2(  1 , 0 ) , 0 )
        Plane plane2( Vec2(  1 , 0 ) , 50 )
        Plane plane3( Vec2( -1 , 0 ) , 350 )
        Plane plane4( Vec2( -1 , 0 ) , 400 )

    and a Point at (200, 0), we get the following distances:

        point to plane1 = dot( Vec2(  1 , 0 ) , Vec2( 200 , 0 ) ) - 0   =  200               //correct distance
        point to plane2 = dot( Vec2(  1 , 0 ) , Vec2( 200 , 0 ) ) - 50  =  150               //correct distance
        point to plane3 = dot( Vec2( -1 , 0 ) , Vec2( 200 , 0 ) ) - 350 = -200 - 350 = -550  //wrong distance, s/b 150
        point to plane4 = dot( Vec2( -1 , 0 ) , Vec2( 200 , 0 ) ) - 400 = -200 - 400 = -600  //wrong distance, s/b 200

    Whether you add or subtract the plane distance from the dot product, you will still get the wrong numbers (as far as I can tell). The only way I can think of getting around this is using negative plane distances in some cases, which is not very intuitive. Thanks, Mike
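    The numbers work out once the stored plane value is treated as the signed quantity d = dot( normal , any point on the plane ), rather than as an unsigned distance from the origin. For plane3, the plane x = 350 with normal (-1, 0) has d = dot( (-1, 0) , (350, 0) ) = -350, and then dot( n , p ) - d = -200 - (-350) = 150 as expected; the "negative plane distances" mentioned above are exactly the usual convention. A minimal sketch of that convention follows; Vec2, dot, and the construct-from-a-point helper are assumptions for illustration, not taken from the posts:

        struct Vec2 { float x , y; };

        float dot( const Vec2& a , const Vec2& b ){ return a.x * b.x + a.y * b.y; }

        struct Plane {
            Vec2  n;   //unit normal
            float d;   //signed: d = dot( n , point_on_plane )

            Plane( const Vec2& normal , const Vec2& point_on_plane )
                : n( normal ) , d( dot( normal , point_on_plane ) ) {}
        };

        //signed distance: positive on the side the normal points toward
        float signedDistance( const Plane& pl , const Vec2& p ){
            return dot( pl.n , p ) - pl.d;
        }

        //e.g. Plane plane3( Vec2{ -1 , 0 } , Vec2{ 350 , 0 } );   //stores d = -350
        //     signedDistance( plane3 , Vec2{ 200 , 0 } );         //= -200 - (-350) = 150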
  14. plane point distance help

    Hello Everyone, I am having an issue with point-plane distances, and was wondering if anyone could help me. To start, a plane is defined by a unit normal and a distance from the origin, Plane(Vec2, float), while a point is simply a point in space, Vec2. To get the distance between a point and a plane, usually something like this is done:

        float distance = dot( plane_normal , point ) + plane_distance_from_origin;

    So far so good. The issue lies with the following example (this also assumes that top left is the origin). Say we have:

        Plane plane1( Vec2(  1 , 0 ) , 0 )
        Plane plane2( Vec2(  1 , 0 ) , 50 )
        Plane plane3( Vec2( -1 , 0 ) , 350 )
        Plane plane4( Vec2( -1 , 0 ) , 400 )

    and a Point at (200, 0). The following would be the distances:

        point to plane1 = dot( Vec2(  1 , 0 ) , Vec2( 200 , 0 ) ) + 0   = 200               //correct distance
        point to plane2 = dot( Vec2(  1 , 0 ) , Vec2( 200 , 0 ) ) + 50  = 250               //WRONG!! incorrect distance, s/b 150
        point to plane3 = dot( Vec2( -1 , 0 ) , Vec2( 200 , 0 ) ) + 350 = -200 + 350 = 150  //correct distance
        point to plane4 = dot( Vec2( -1 , 0 ) , Vec2( 200 , 0 ) ) + 400 = -200 + 400 = 200  //correct distance

    My issue is with the "point to plane2" distance. I am not sure how to deal with this problem. Could someone please suggest a solution, or give me a source? Thank you very much. Mike
  15. C++ Shader help

    Hello Everyone, I am having an issue getting my shaders to work. Here is what I have. This is the rendering function:

        glUseProgram( program_id );

        GLfloat v[6] = { 0.0f , 0.0f ,
                         1.0f , 1.0f ,
                         1.0f , 0.0f };

        GLuint vbo_id = 0;
        glGenBuffers( 1 , &vbo_id );
        glBindBuffer( GL_ARRAY_BUFFER , vbo_id );
        glBufferData( GL_ARRAY_BUFFER , sizeof( v ) , v , GL_STATIC_DRAW );

        glEnableVertexAttribArray( 0 );
        glBindBuffer( GL_ARRAY_BUFFER , vbo_id );
        glVertexAttribPointer( 0 , 2 , GL_FLOAT , GL_FALSE , 0 , 0 );
        glDrawArrays( GL_TRIANGLES , 0 , 3 );

        glBindBuffer( GL_ARRAY_BUFFER , 0 );
        glDisableVertexAttribArray( 0 );

    And the vertex shader:

        #version 330

        in vec2 vertex_position;
        //layout( location = 0 ) in vec2 vertex_position;

        void main(){
            gl_Position.xy = vertex_position * 1.2;
            gl_Position.z = 0;
            gl_Position.w = 1;
        }

    The fragment shader:

        #version 330

        void main(){
            gl_FragColor = vec4( 1.0 , 0.0 , 0.0 , 1.0 );
        }

    I did not post the compiling, linking, and attribute-adding code because it appears to work; for example, if I make a syntax error in one of the shaders, or if there's a linker problem, I get a warning. My issue is that I do get a triangle to show up on the screen, but no matter what I do to the vertex or fragment shaders (for example changing the color or size), it has no effect on what I see on the screen. In essence, there appears to be no communication between the shaders and GLEW. My frustration is further compounded by the fact that I was able to get this working before on the same laptop. If anyone has any ideas on what could be wrong with my incredibly simple program, please let me know. Thanks, Mike
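    A hedged debugging sketch for the symptom above (the triangle draws but shader edits change nothing), assuming program_id is the handle produced by the linking code that isn't shown and that <cstdio> is available: if the currently bound program is 0 or differs from program_id, the triangle is being drawn by the fixed-function pipeline (possible in a compatibility or default context), in which case shader edits would indeed have no visible effect. It is also worth noting that gl_FragColor is not available in core-profile GLSL 330, so the fact that this fragment shader compiles suggests a compatibility/default context.

        //Possible sanity checks right after glDrawArrays: is the linked program really the one in use?
        GLint current_program = 0;
        glGetIntegerv( GL_CURRENT_PROGRAM , &current_program );
        printf( "bound program: %d, expected: %u\n" , current_program , program_id );

        GLint link_status = GL_FALSE;
        glGetProgramiv( program_id , GL_LINK_STATUS , &link_status );
        printf( "link status: %d\n" , link_status );

        GLenum err = glGetError();
        if( err != GL_NO_ERROR ){
            printf( "GL error after draw: 0x%x\n" , (unsigned) err );
        }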