

Started by arithma, Jun 26 2012 04:09 AM



7 replies to this topic

Posted 26 June 2012 - 04:09 AM

I'm having trouble sending both normals and a u,v pair to my shaders. If I remove the normal, things work as expected.

It appears that v_normal is receiving the values intended for v_coord, but I still have no idea why.

I've spent a lot of time trying to figure this out, and I can no longer come up with a sane set of debugging experiments.

This is my vertex structure:

[source lang="cpp"]
struct Vertex
{
    Vertex(vec3 const & v) : pos(v) {}
    vec3 pos;
    vec3 normal;
    real u, v;
};
[/source]

[source lang="cpp"]
const int VERTEX_POS_INDX      = 0;
const int VERTEX_NORMAL_INDX   = 1;
const int VERTEX_TEXCOORD_INDX = 2;
const int VERTEX_POS_SIZE      = 3;
const int VERTEX_NORMAL_SIZE   = 3;
const int VERTEX_TEXCOORD_SIZE = 2;

GLuint vbo, ibo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sphere->vertices.size() * sizeof(Vertex), &sphere->vertices[0], GL_STATIC_DRAW);
glGenBuffers(1, &ibo);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sphere->indices.size() * sizeof(unsigned short), &sphere->indices[0], GL_STATIC_DRAW);

glEnableVertexAttribArray(VERTEX_POS_INDX);
glEnableVertexAttribArray(VERTEX_NORMAL_INDX);
glEnableVertexAttribArray(VERTEX_TEXCOORD_INDX);

int offset = 0;
glVertexAttribPointer(VERTEX_POS_INDX, VERTEX_POS_SIZE, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offset);
offset += VERTEX_POS_SIZE * sizeof(real);
glVertexAttribPointer(VERTEX_NORMAL_INDX, VERTEX_NORMAL_SIZE, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offset);
offset += VERTEX_NORMAL_SIZE * sizeof(real);
// Note: the original passed VERTEX_TEXCOORD_INDX as the size argument here;
// it only works because both constants happen to equal 2.
glVertexAttribPointer(VERTEX_TEXCOORD_INDX, VERTEX_TEXCOORD_SIZE, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offset);

glBindAttribLocation(programObject, VERTEX_POS_INDX, "a_position");
glBindAttribLocation(programObject, VERTEX_NORMAL_INDX, "a_normal");
glBindAttribLocation(programObject, VERTEX_TEXCOORD_INDX, "a_coord");
[/source]

[source lang="cpp"]
precision highp float;

uniform mat4 u_mv;
uniform mat4 u_mvp;
uniform vec3 u_light;
uniform vec3 u_up;

attribute vec3 a_position;
attribute vec2 a_coord;
attribute vec3 a_normal;

varying vec2 v_coord;
varying vec3 v_normal;

void main()
{
    v_coord = a_coord;
    v_normal = a_normal;
    gl_Position = u_mvp * vec4(a_position, 1);
}
[/source]

[source lang="cpp"]
precision highp float;

uniform vec3 u_up;

varying vec3 v_normal;
varying vec2 v_coord;

void main()
{
    vec2 coord = v_coord;
    vec3 normal = v_normal;
    coord.x = mod(v_coord.x * 5.0, 1.0);
    coord.y = mod(v_coord.y * 5.0, 1.0);
    gl_FragColor = vec4(mod(coord.x * 1.0, 1.0),
                        mod(coord.y * 1.0, 1.0),
                        mod(normal.z * 5.0, 1.0) * 0.0,
                        1.0);
}
[/source]


[ my blog ]

Posted 26 June 2012 - 06:41 AM

A quick check: you are linking the shader program (glLinkProgram) after the calls to glBindAttribLocation(), aren't you?

I usually express the glVertexAttribPointer offsets as follows:

[source lang="cpp"]
struct Vertex *p = 0;
glVertexAttribPointer(VERTEX_POS_INDX, VERTEX_POS_SIZE, GL_FLOAT, GL_FALSE, sizeof(Vertex), &p->pos);
glVertexAttribPointer(VERTEX_NORMAL_INDX, VERTEX_NORMAL_SIZE, GL_FLOAT, GL_FALSE, sizeof(Vertex), &p->normal);
glVertexAttribPointer(VERTEX_TEXCOORD_INDX, VERTEX_TEXCOORD_SIZE, GL_FLOAT, GL_FALSE, sizeof(Vertex), &p->u);
[/source]


**Edited by larspensjo, 26 June 2012 - 06:53 AM.**

Posted 26 June 2012 - 07:13 AM

Better still, use the GL 3.x+ syntax (if it's available in your implementation) in your shader:

[source lang="cpp"]layout(location = 0) in vec4 position;[/source]
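Applied to the vertex shader from this thread, that might look as follows; a sketch only, assuming a GLSL 3.30 context (which also requires swapping attribute/varying for in/out), not tested here:

```glsl
#version 330 core

layout(location = 0) in vec3 a_position;  // matches VERTEX_POS_INDX
layout(location = 1) in vec3 a_normal;    // matches VERTEX_NORMAL_INDX
layout(location = 2) in vec2 a_coord;     // matches VERTEX_TEXCOORD_INDX

out vec2 v_coord;
out vec3 v_normal;

uniform mat4 u_mvp;

void main() {
    v_coord = a_coord;
    v_normal = a_normal;
    gl_Position = u_mvp * vec4(a_position, 1.0);
}
```

With explicit locations in the shader, neither glBindAttribLocation before linking nor glGetAttribLocation after it is needed.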

**Edited by mhagain, 26 June 2012 - 07:14 AM.**

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.

Posted 28 June 2012 - 05:12 AM

Got it solved!

I just had to use glGetAttribLocation to get the attribute identifiers from the linked shader program.

Nice to see! But that means the call to glBindAttribLocation() wasn't done before the linkage phase, or the identifiers would have been the ones you specified. Note that if you use the same VAO with more than one program, you can't rely on glGetAttribLocation(), as the results may differ between programs.

Posted 28 June 2012 - 08:17 AM

@larspensjo: Thanks, got it!

A quick check: you are linking the shader program (glLinkProgram) after the calls to glBindAttribLocation(), aren't you?

I thought you were warning me of a pitfall rather than reminding me to do the right thing.
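The two working approaches the thread converges on differ only in call order; a sketch (GL boilerplate omitted, not taken verbatim from the thread):

```cpp
// Option A: choose the locations yourself -- bind BEFORE linking.
glBindAttribLocation(programObject, VERTEX_POS_INDX,      "a_position");
glBindAttribLocation(programObject, VERTEX_NORMAL_INDX,   "a_normal");
glBindAttribLocation(programObject, VERTEX_TEXCOORD_INDX, "a_coord");
glLinkProgram(programObject);

// Option B: let the linker choose -- query AFTER linking.
glLinkProgram(programObject);
GLint posLoc    = glGetAttribLocation(programObject, "a_position");
GLint normalLoc = glGetAttribLocation(programObject, "a_normal");
GLint coordLoc  = glGetAttribLocation(programObject, "a_coord");
```

The original code bound the locations after linking, so the binds had no effect until the next link, and the linker-assigned locations didn't match the indices used in glVertexAttribPointer.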

[ my blog ]


Copyright © 1999-2017 GameDev.net, LLC

GameDev.net™, the GameDev.net logo, and GDNet™ are trademarks of GameDev.net, LLC.