addiem7c5

Members
  • Content count
    6
  • Joined
  • Last visited

Community Reputation

103 Neutral

About addiem7c5

  • Rank
    Newbie
  1. Visual Studio wasn't updating my .dll file; that was the cause of this problem. It is fixed now.
  2. I am having trouble getting texturing to work in my OpenTK/OpenGL 3.3 code. I am sure it is something simple that I am missing, but I just can't seem to get a texture onto the square I am rendering. (I'm writing a rendering framework for a game.) The relevant code pieces are below; a short illustrative texture-setup sketch also follows after the last post in this list.
    [code]
    //setup Texcoord array.
    GL.BindBuffer(BufferTarget.ArrayBuffer, VertexBufferObject[2]);
    GL.BufferData<float>(BufferTarget.ArrayBuffer, (IntPtr)(sizeof(float) * texcoord_information.Count), texcoord_information.ToArray(), BufferUsageHint.StaticDraw);
    GL.EnableVertexAttribArray(2);
    GL.VertexAttribPointer(2, 2, VertexAttribPointerType.Float, false, 0, 0);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);

    //Set up our textures
    for (int i = 0; i < Textures.Count; i++)
    {
        Console.WriteLine("Preparing texture.");
        Uniform_To_Texture[GL.GetUniformLocation(Shader.GetProgram(), "diffuse")] = Textures[i];
    }

    //Render
    int texture_unit = 0;
    foreach (KeyValuePair<int, ITexture> kvp in Uniform_To_Texture)
    {
        GL.Uniform1(kvp.Key, texture_unit);
        GL.ActiveTexture(TextureUnit.Texture0 + texture_unit);
        GL.BindTexture(TextureTarget.Texture2D, kvp.Value.GetTextureHandle());
        texture_unit++;
    }

    if (!GL.IsVertexArray(VertexArrayObject))
        throw new OpenGLException("Vertex Array Object not set up correctly, cannot render!");

    GL.BindVertexArray(VertexArrayObject);
    GL.DrawArrays(BeginMode.Triangles, 0, Vertices.Count);
    GL.BindVertexArray(0);
    [/code]
    Then the vertex and fragment shaders are:
    [code]
    //Vertex shader
    #version 330
    uniform mat4 viewmatrix, projmatrix, transformmatrix;
    in vec3 position;
    in vec3 normal;
    in vec2 texcoord;
    varying vec2 texturecoord;

    void main()
    {
        texturecoord = texcoord;
        gl_Position = projmatrix * viewmatrix * transformmatrix * vec4(position, 1.0);
    }

    //Fragment Shader
    #version 330
    uniform sampler2D diffuse;
    varying vec2 texturecoord;
    out vec4 color;

    void main(void)
    {
        color = texture2D(diffuse, texturecoord);
    }
    [/code]
    The square I am rendering is just coming up white, if that helps any.
  3. OpenGL Problem With VBOs

    [quote name='karwosts' timestamp='1297362056' post='4772467'] Have you stepped through line by line and verified that all your values are good and valid? Didn't forget to initialize numberofvertices or anything stupid like that? [/quote] Hah. Hah. It would have been that, of course. I must've been tired or something last night; I forgot to copy the number of vertices and the number of indices over. That did the trick. It's a good feeling that I knew how to do VBOs but didn't know how to pass arguments into a function XD Thanks for your help!
  4. OpenGL Problem With VBOs

    [quote name='JimmyDeemo' timestamp='1297339661' post='4772313'] Ok i might not know much about this but here goes. Doesn't the index data need to be stored in sequentially in memory? Does boost::shared_array do this? Try your vertex and index data in a std::vector<GLfloat> and std::vector<GLuint> respectively and then filter that type through the creation of the VBOs. See if that renders what you expect. From what i am aware the latest spec of std::vector make sure that the data is in sequential order. [/quote] I am fairly sure boost::shared_array keeps its memory contiguous, since you new the array yourself and give the pointer to shared_array to manage as a smart pointer. Just for the sake of checking, though, I did change them to vectors, and it still doesn't render. Thanks for the idea, though.
  5. OpenGL Problem With VBOs

    Oh yeah, I forgot to add that. I was checking glGetError(), and it was returning 0. I'm very baffled. Edit: I also threw in a couple of fixed-function-pipeline calls just to see if anything was rendering. A red triangle showed up, just as it should have. Edit 2: I am actually using the same data that I'm putting in the VBO for the fixed-function rendering, and it renders correctly.
  6. Hello everyone, it's been a long while since I last posted here, so long that I forgot my old username/password. Ah well. I am very far behind on the OpenGL spec and am getting off the fixed-function pipeline. I've been trying to get Vertex Buffer Objects to render, to no avail. Could someone tell me what is wrong with this code? (A self-contained buffer-upload sketch also follows right after this post.)
    [code]
    //m_indices is a boost::shared_array<unsigned int>
    //m_vertices is a boost::shared_array<vertexinformation>, where vertexinformation is:
    struct vertexinformation
    {
        float px, py, pz;
        float nx, ny, nz;
        float t1x, t1y;
        float t2x, t2y;
        float t3x, t3y;
        float t4x, t4y;
        float t5x, t5y;
    };

    //This is in my initialization
    glGenBuffers(1, &m_vertexbufferobject);
    glGenBuffers(1, &m_indexbufferobject);

    glBindBuffer(GL_ARRAY_BUFFER, m_vertexbufferobject);
    glBufferData(GL_ARRAY_BUFFER, m_numberofvertices * 16 * sizeof(float), m_vertices.get(), GL_STATIC_DRAW);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m_indexbufferobject);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, m_numberofindices * sizeof(unsigned int), m_indices.get(), GL_STATIC_DRAW);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

    //Render Code
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBindBuffer(GL_ARRAY_BUFFER, m_vertexbufferobject);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 64, 0);
    glEnableClientState(GL_NORMAL_ARRAY);
    glNormalPointer(GL_FLOAT, 64, BUFFER_OFFSET(3 * sizeof(float)));

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m_indexbufferobject);
    glDrawElements(GL_TRIANGLES, m_numberofindices, GL_UNSIGNED_INT, 0);

    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    glfwSwapBuffers();
    [/code]
    What it should be drawing is a quad consisting of two triangles. The vertices specified are the four corners, and the indices specified are the two triplets of vertices that make up the two triangles. Can anyone see anything wrong with any of this? Thanks -Addie
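
For reference, here is the buffer-upload sketch mentioned in post 6 above: a minimal, self-contained take on the same interleaved-VBO plus index-buffer pattern, assuming a GL context and extension loader (GLFW/GLEW in the post) are already set up. The Mesh wrapper and the uploadMesh/drawMesh helpers are hypothetical names, not part of the original code. Storing the index count next to the buffer handles is one way to guard against the forgotten-count bug the thread eventually tracked down.
[code]
#include <GL/glew.h>
#include <cstddef>   // offsetof

// Same 16-float interleaved layout as the post (64-byte stride).
struct vertexinformation {
    float px, py, pz;
    float nx, ny, nz;
    float t1x, t1y, t2x, t2y, t3x, t3y, t4x, t4y, t5x, t5y;
};

// Hypothetical wrapper: keeps the handles and the index count together.
struct Mesh {
    GLuint vbo = 0, ibo = 0;
    GLsizei indexCount = 0;
};

Mesh uploadMesh(const vertexinformation* verts, GLsizei vertCount,
                const unsigned int* indices, GLsizei indexCount)
{
    Mesh m;
    m.indexCount = indexCount;   // stored here so the draw call cannot forget it

    glGenBuffers(1, &m.vbo);
    glBindBuffer(GL_ARRAY_BUFFER, m.vbo);
    glBufferData(GL_ARRAY_BUFFER, vertCount * sizeof(vertexinformation),
                 verts, GL_STATIC_DRAW);

    glGenBuffers(1, &m.ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m.ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, indexCount * sizeof(unsigned int),
                 indices, GL_STATIC_DRAW);
    return m;
}

void drawMesh(const Mesh& m)
{
    glBindBuffer(GL_ARRAY_BUFFER, m.vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    // offsetof() ties the stride and attribute offsets to the struct layout
    // instead of hand-counted byte constants.
    glVertexPointer(3, GL_FLOAT, sizeof(vertexinformation),
                    (const void*)offsetof(vertexinformation, px));
    glEnableClientState(GL_NORMAL_ARRAY);
    glNormalPointer(GL_FLOAT, sizeof(vertexinformation),
                    (const void*)offsetof(vertexinformation, nx));

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m.ibo);
    glDrawElements(GL_TRIANGLES, m.indexCount, GL_UNSIGNED_INT, nullptr);

    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
[/code]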
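
And here is the texture-setup sketch mentioned in post 2: the usual texture-unit / sampler-uniform wiring, written with raw C++ GL calls (OpenTK's GL.* methods map onto these one-to-one). The texture-creation side, filter parameters included, is an assumption, since that part of the framework isn't shown in the post; makeCheckerTexture and bindDiffuse are hypothetical helper names.
[code]
#include <GL/glew.h>

GLuint makeCheckerTexture()                    // hypothetical helper name
{
    static const unsigned char pixels[] = {    // 2x2 RGBA checkerboard
        255, 255, 255, 255,   0,   0,   0, 255,
          0,   0,   0, 255, 255, 255, 255, 255,
    };
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // Without mipmaps, the default min filter (GL_NEAREST_MIPMAP_LINEAR)
    // leaves the texture incomplete, so set a non-mipmap filter explicitly.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2, 2, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return tex;
}

void bindDiffuse(GLuint program, GLuint tex)
{
    glUseProgram(program);
    // The sampler uniform holds a texture *unit index*, and the texture is
    // bound to that same unit -- the same pattern as the post's render loop.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, tex);
    glUniform1i(glGetUniformLocation(program, "diffuse"), 0);
}
[/code]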