

Member Since 26 Aug 2011
Offline Last Active Jun 24 2012 12:01 AM

Topics I've Started

Matrix to vector?

22 June 2012 - 08:24 PM

Heya! I've got a tree of objects, each with its own transformation matrix relative to its parent object. Given an object anywhere in the tree, I'd like to find its absolute position from the origin. This is simple enough: I walk down the tree, applying each transform along the way, until I hit the object in question. The resulting matrix is the absolute transformation matrix.

What's the best way to turn this into a vector that is the position relative to the origin? I have bounding boxes around each element, and I'd like to use this to find collisions.

I may be making this more complicated than I need to...
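For anyone else landing here: assuming column-major 4×4 matrices (legacy OpenGL's convention), the absolute position is just where the composed matrix sends the origin, which is its translation column. A minimal sketch — the `Mat4` type and function names here are my own, not from the poster's engine:

```cpp
#include <array>
#include <cassert>

// Column-major 4x4 matrix, matching legacy OpenGL's convention.
using Mat4 = std::array<float, 16>;

Mat4 identity() {
    Mat4 m{};
    m[0] = m[5] = m[10] = m[15] = 1.0f;
    return m;
}

// Compose two transforms: result = a * b (b is applied first).
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return r;
}

Mat4 translation(float x, float y, float z) {
    Mat4 m = identity();
    m[12] = x; m[13] = y; m[14] = z;  // translation lives in the last column
    return m;
}

// The absolute position is the last column of the composed matrix:
// it is exactly (world * [0,0,0,1]).
void position_of(const Mat4& world, float out[3]) {
    out[0] = world[12];
    out[1] = world[13];
    out[2] = world[14];
}
```

So after walking the tree and multiplying parent-to-child matrices into `world`, elements 12–14 are the position vector you want. This holds with rotation and scaling in the hierarchy too; the last column is always where the origin ends up.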

Calculating normals for mesh: More odd lighting stuff...

26 August 2011 - 07:19 PM

So, I've been trying to get an OBJ file to look good under lighting, to no avail. I got a little closer with some help earlier, but I'm kind of stuck again. I've got an OBJ file that has vertex and face data in it. For each vertex I basically calculate the normal for each of its associated faces, then average those normals to get one vector for that vertex. Here is the result of that:
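A sketch of the averaging step described above, in plain C++ — the `Vec3` type and function names are mine, not the poster's, and it assumes triangulated faces with consistent winding:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// For each triangle, add its (unnormalized) face normal to every vertex it
// touches, then normalize the sums. `tris` is triples of vertex indices.
std::vector<Vec3> average_normals(const std::vector<Vec3>& verts,
                                  const std::vector<int>& tris) {
    std::vector<Vec3> normals(verts.size(), Vec3{0, 0, 0});
    for (size_t i = 0; i + 2 < tris.size(); i += 3) {
        int a = tris[i], b = tris[i + 1], c = tris[i + 2];
        // Cross product of two edges; its length is proportional to the
        // face's area, which gives a natural area weighting for free.
        Vec3 n = cross(sub(verts[b], verts[a]), sub(verts[c], verts[a]));
        for (int idx : {a, b, c}) {
            normals[idx].x += n.x;
            normals[idx].y += n.y;
            normals[idx].z += n.z;
        }
    }
    for (Vec3& n : normals) n = normalize(n);
    return normals;
}
```

One classic cause of "weird" results with this approach: if the OBJ's faces don't all wind the same way, opposing face normals cancel during the averaging and you end up with near-zero or flipped vertex normals.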


And here are the normals "visualized" (lol):


Any suggestions as to why this looks so weird? Is it my lighting? My normal calculations? Another thing I heard about was duplicating shared vertices and assigning each copy its own normal. Would that do the trick? Any guidance would be greatly appreciated!
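On the shared-vertex question: yes, that's the standard fix when you want hard edges. Each face gets its own private copies of the vertices it uses, so each copy can carry that face's normal instead of a blended one. A minimal sketch of the "unweld" step (names are mine, and it assumes the same hypothetical `Vec3` as above):

```cpp
#include <cassert>
#include <vector>

struct Vec3 { float x, y, z; };

// Expand an indexed mesh so every face references its own private vertices.
// After this, a per-face normal can be assigned to each copy without
// affecting the neighbouring faces that used to share the vertex.
void unweld(const std::vector<Vec3>& verts,
            const std::vector<int>& indices,
            std::vector<Vec3>& outVerts,
            std::vector<int>& outIndices) {
    outVerts.clear();
    outIndices.clear();
    for (int idx : indices) {
        outIndices.push_back(static_cast<int>(outVerts.size()));
        outVerts.push_back(verts[idx]);  // duplicate the shared vertex
    }
}
```

The trade-off is more vertices (one per face corner instead of one per position), which is why smooth-shaded meshes usually keep shared vertices and only hard-edged ones get unwelded.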

Indexed primitive looks wonky with lighting

26 August 2011 - 10:48 AM

Hey there, not even 100% sure this is the right place for this question, so please be gentle! My first time posting on these forums.

I've got what I'm sure is a simple problem stemming from some deficiency with regard to my understanding of how crap works in OpenGL.

Basically, all I want is 2 "brick" meshes with lighting. In order to support future growth, I want to draw them as indexed primitives, and I'm attempting to compute normals from the faces. Here is the relevant code -- it creates the relevant vertices and indices, and computes the appropriate normals. I don't hook the normals up here, in the hope of ruling out that the normals were what was messing up my lighting. Note that I'm also shading this model with GL_FLAT.

void create_brick() {
  // Eight corners of a 2 x 0.5 x 2 box.
  Vector vertices[] = {
    Vector( 1.0f, 0.25f, 1.0f),
    Vector( 1.0f,-0.25f, 1.0f),
    Vector(-1.0f, 0.25f, 1.0f),
    Vector(-1.0f,-0.25f, 1.0f),
    Vector( 1.0f,-0.25f,-1.0f),
    Vector(-1.0f, 0.25f,-1.0f),
    Vector( 1.0f, 0.25f,-1.0f),
    Vector(-1.0f,-0.25f,-1.0f)  // corner 7, referenced by the indices below
  };

  // Six quads, four indices each.
  GLubyte indices[] = {
    0, 2, 3, 1,
    0, 1, 4, 6,
    0, 6, 5, 2,
    2, 5, 7, 3,
    7, 4, 1, 3,
    4, 7, 5, 6
  };

  // This took a surprising amount of energy.
  GLfloat * normals = compute_normals(vertices, 8, indices, 24);
  GLfloat * floatVertices = convert_vector_list(vertices, 8);

  glEnableClientState(GL_VERTEX_ARRAY);
  glVertexPointer(3, GL_FLOAT, 0, floatVertices);
  glDrawElements(GL_QUADS, 24, GL_UNSIGNED_BYTE, indices);

  // Huh? Normals?
  for(int i = 0; i < 24; i += 3) {
    printf("%d: (%f, %f, %f)\n", i, normals[i], normals[i+1], normals[i+2]);
  }

  // Be nice to yourself!
  free(normals);
  free(floatVertices);
}

void render_geometry() {
  glLoadIdentity();
  glTranslatef(-2.0f, -1.0f, -5.0f);
  create_brick();

  glLoadIdentity();
  glTranslatef(2.0f, -1.0f, -5.0f);
  create_brick();
}


Pretty sane, I think. Well, apparently not. Here is what is displayed:


What I expected was 2 rectangular primitives. What I got was something...else. Can anyone offer any guidance as to what I'm doing wrong? I can post my lighting code, if necessary!
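For reference, here is roughly what a quad-based `compute_normals` could look like. This is my own sketch, not the poster's implementation: it assumes a `Vector` with public `x`/`y`/`z`, quads with consistent outward winding, and a caller-freed flat float array like the original code expects.

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>

struct Vector {
    float x, y, z;
    Vector(float x_ = 0, float y_ = 0, float z_ = 0) : x(x_), y(y_), z(z_) {}
};

// One normal per vertex (vertCount * 3 floats), averaged over the quads
// that use the vertex. Caller frees the returned array.
float* compute_normals(const Vector* verts, int vertCount,
                       const unsigned char* indices, int indexCount) {
    float* normals = (float*)calloc((size_t)vertCount * 3, sizeof(float));
    for (int q = 0; q + 3 < indexCount; q += 4) {
        int a = indices[q], b = indices[q + 1], c = indices[q + 2];
        // Cross product of two edges of the quad gives the face normal.
        float e1x = verts[b].x - verts[a].x, e1y = verts[b].y - verts[a].y,
              e1z = verts[b].z - verts[a].z;
        float e2x = verts[c].x - verts[a].x, e2y = verts[c].y - verts[a].y,
              e2z = verts[c].z - verts[a].z;
        float nx = e1y * e2z - e1z * e2y;
        float ny = e1z * e2x - e1x * e2z;
        float nz = e1x * e2y - e1y * e2x;
        // Accumulate the face normal into all four corner vertices.
        for (int k = 0; k < 4; ++k) {
            int v = indices[q + k];
            normals[v * 3 + 0] += nx;
            normals[v * 3 + 1] += ny;
            normals[v * 3 + 2] += nz;
        }
    }
    for (int v = 0; v < vertCount; ++v) {
        float* n = &normals[v * 3];
        float len = std::sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
        if (len > 0) { n[0] /= len; n[1] /= len; n[2] /= len; }
    }
    return normals;
}
```

One caveat for this particular post: with `GL_FLAT`, OpenGL lights the whole quad using a single provoking vertex, so averaged per-vertex normals give odd flat-shaded results on a box. For crisp flat shading you'd want per-face normals with duplicated vertices instead.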

Edit: Made code a little more relevant and added a little clarification to setup.