Member Since: 13 Aug 2011
Last Active: Nov 13 2014 12:13 AM

#4949218 OpenGL ES 2.0: How to specify normals when using an index?

Posted by on 14 June 2012 - 11:57 AM

Okay, here's what's happening now. I've removed the index buffer altogether and am now just using glDrawArrays(). I accomplished this by building a list of vertices myself from the COLLADA file's index, then passing the result to glBufferData(). It works if I don't use the normals. Here's what the output looks like if I use the vertex positions as normals in the shader:
Obviously, the vertices are being read correctly. However, here's what I get when I add the normals:
What gives? I suspect this is a problem with my shader(s). Hopefully this "globe" shape is a tell-tale sign of something I've done wrong.

Here are the shaders I'm using:

//vertex shader

attribute vec3 v_position;
attribute vec3 v_normal;
varying float lightIntensity;

uniform mat4 model;
//uniform mat4 view;
uniform mat4 proj;

void main(void)
{
  vec4 newPosition = proj * model * vec4(v_position, 1.0);
  gl_Position = newPosition;

  //specify direction of light
  vec3 light_dir = vec3(0.9, 0.8, -3.0);

  //if I use this, it looks like a messed up sphere
  vec4 newNormal = proj * model * vec4(v_normal, 0.0);
  //if I use this, I can make out the shape of the model,
  //but the lighting over the model is wrong.
  //vec4 newNormal = proj * model * vec4(v_position, 0.0);
  lightIntensity = max(0.0, dot(newNormal.xyz, light_dir));
}

//fragment shader

varying float lightIntensity;
void main(void)
{
  vec4 yellow = vec4(1.0, 1.0, 0.0, 1.0);
  gl_FragColor = vec4((yellow * lightIntensity * 0.2).rgb, 1.0);
}

#4949005 OpenGL ES 2.0: How to specify normals when using an index?

Posted by on 13 June 2012 - 07:19 PM

As I said in the first quote that you acknowledged: an index references all enabled attribute arrays. You can only have one index array, and each index in that array references all enabled attributes to construct one vertex. Thus, if you have the index array [1, 4, 5], you have three vertices: the first vertex is constructed from the second position and the second normal; the second vertex from the fifth position and the fifth normal; and the third vertex from the sixth position and the sixth normal.

Okay, I understand. This explains why the model renders properly when I remove the normal entries from the index, because, as I mentioned, a COLLADA file contains an interleaved index of both vertices and normals: VNVNVNVNVNVN

So my predicament is that I have two arrays, one for vertices and one for normals, but I am allowed only one index. The COLLADA file interleaves the indices for both in a single integer array, and I don't think there is any relation between the two sets of indices.

Should I just not use the index at all and assemble the data manually?

Any ideas?

#4948991 OpenGL ES 2.0: How to specify normals when using an index?

Posted by on 13 June 2012 - 06:16 PM

You are incorrect that the index only references the vertices. An index references all enabled attributes. Since you are using generic attributes, there is no notion of "vertex" or "normal", or anything else for that matter, as far as OpenGL is concerned, since the attribute could be anything.

Ah, got it. Thanks for clearing that up.

So, do you have exactly one normal for every position in the corresponding attribute array? If not, the normal array is not correct since you have normals without corresponding position, or vice versa.

I should have an equal number of vertices and normals, per the COLLADA file. I made sure it was exported properly and the model in question works fine on an OpenGL (non-ES) 2.0 implementation. I am just trying to get everything working on ES 2.0 for use on phones.

I can build either one or two arrays. One with vertices and normals interleaved, or two separate arrays that each correspond to a distinct attribute variable. Assuming I have two GL_ARRAY_BUFFERS, each attached to its own attribute variable in the vertex shader, how does OpenGL know which index values correspond to each array/attribute?

FYI, my index is from the COLLADA file, and looks like this (where V is a vertex position, and N is a normal):


#4948979 OpenGL ES 2.0: How to specify normals when using an index?

Posted by on 13 June 2012 - 05:28 PM

I have an OpenGL ES 2.0-based app I'm working on, and am having trouble with lighting. I would like to send both vertices and normals to the vertex shader, but I don't know of a way to do this. I can send the vertices to the shader and it renders the model properly using my index specified with GL_ELEMENT_ARRAY_BUFFER, but the lighting isn't rendered properly without the normals.

Here's the code I'm working with:

glEnableVertexAttribArray(0); //vertices
glEnableVertexAttribArray(1); //normals
//note: glBindAttribLocation only takes effect at the next glLinkProgram call
glBindAttribLocation(prog, 0, "v_position"); //bind this var to 0
glBindAttribLocation(prog, 1, "v_normal"); //bind this var to 1

//generate buffers
GLuint vbuf_id[2],ibuf_id[2];
glGenBuffers(2, vbuf_id); //vertex buffers
glGenBuffers(2, ibuf_id); //index buffers

//create and bind buffer for vertices
glBindBuffer(GL_ARRAY_BUFFER, vbuf_id[0]); //vertex buffer
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
glBindBuffer(GL_ARRAY_BUFFER,0); //unbind

//create and bind another buffer for normals
glBindBuffer(GL_ARRAY_BUFFER, vbuf_id[1]); //normal buffer
glBufferData(GL_ARRAY_BUFFER, sizeof(normals), normals, GL_STATIC_DRAW);
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
glBindBuffer(GL_ARRAY_BUFFER,0); //unbind

//create IBO (index buffer)
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibuf_id[0]);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(index), index, GL_STATIC_DRAW);

Which works great for vertices, using the supplied index. However, since the index only references the vertices, how do I include the normals? I have a list of normals from a COLLADA file, but I can't seem to get the vertex shader to process them correctly. I also had to remove the normals entries from the index because OpenGL wouldn't work if I included them. Apparently, the GL_ELEMENT_ARRAY_BUFFER wants packed entries for vertices ONLY.

What am I doing wrong? I'm happy to supply any additional information that may help.

#4883403 How to squeeze rectangular bitmap into GL_TEXTURE_2D?

Posted by on 13 November 2011 - 12:09 AM

On nvidia cards (I've never used amd, but those probably work as well; maybe not intel, though), you can texImage a rectangle and not have to do anything different between a rectangle and a square texture. The texture coords are still the same as well (0 to 1). It will help with what you are doing.

Well, the documentation says that the texture size for GL_TEXTURE_2D must be a power of 2. That's why I passed "width" as both the width and height. Is it possible to pass rectangular dimensions when the texture target is GL_TEXTURE_2D on some systems? I'd been using GL_TEXTURE_RECTANGLE_ARB because I thought I had to for non-power-of-2 textures. But then I have the problem that the texture coords aren't normalized.

I see that your destination is a rectangle, but your source is square? Why put a square texture into a rectangle?

Other way around: I have an image (1024x768) that I need to put into a square texture. I thought padding the pixel data to make it square would work, but as you pointed out, it didn't.