Problems with custom matrices (SOLVED)
SITUATION:
Currently, I'm trying to implement a system that doesn't use glMatrixMode() or the GL transformation functions at all. Instead, I want to use my own matrix class and functions, then pass the matrix data directly to my vertex shader to perform the transformations.
PROBLEM:
Originally, when I was only using the modelview matrix for my early tests, it worked just fine: the camera and the object both moved as expected. Now that I have added the projection matrix, however, I'm seeing weird behavior.
As an example, I was trying to draw the teapot and make it rotate around the X axis. In my shader, I multiply each vertex position by my custom worldViewProj matrix. The output is me looking down at the teapot from the top, rather than from the side, and the teapot is vertically flattened as it spins around the X axis.
Is there something I have to disable in OpenGL for my custom projection matrix to work correctly? If code (C++ and/or shader) or images are necessary to diagnose the problem, please let me know.
Thanks.
[Edited by - n00body on April 3, 2008 5:26:26 PM]
I think code would help, specifically your matrix generation code, and maybe the OpenGL setup and drawing code.
In my libraries I calculate projection and transformation matrices myself and then use the glLoadMatrixf method to allow OpenGL to do the actual transforms itself. Why not do that, as the end effect is the same (i.e. you still get to use the matrix how ever you want in the shader code)?
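Computing the projection matrix yourself, as described above, can be done with a gluPerspective-equivalent function on the CPU. A minimal sketch (column-major storage, element `[col*4 + row]`, matching OpenGL's layout so the array can be fed to glLoadMatrixf or a shader uniform unchanged; the function name is mine, not from this thread):

```cpp
// Sketch of a gluPerspective-equivalent projection matrix, built by hand
// in column-major order. Function name is illustrative.
#include <cmath>

void perspective(float fovyDegrees, float aspect,
                 float zNear, float zFar, float m[16])
{
    // f = cot(fovy / 2), with fovy converted from degrees to radians.
    const float f = 1.0f / std::tan(fovyDegrees * 3.14159265358979f / 360.0f);
    for (int i = 0; i < 16; ++i) m[i] = 0.0f;
    m[0]  = f / aspect;                       // x scale
    m[5]  = f;                                // y scale
    m[10] = (zFar + zNear) / (zNear - zFar);  // depth remap
    m[11] = -1.0f;                            // clip-space w = -eye z
    m[14] = (2.0f * zFar * zNear) / (zNear - zFar);
}
```

Reading back OpenGL's own GL_PROJECTION_MATRIX after a gluPerspective call and comparing it element by element against this output is a quick sanity check.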
I can think of two errors:
- Your projection matrix is wrong (use glFrustum or gluPerspective and read the matrix back to compare OpenGL's projection matrix with your own), or
- The order in which the matrices are multiplied is wrong. I also use shaders and provide my own matrices. My modelviewprojection matrix (the one I provide as a shader parameter) is calculated as projection * view * model.
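The multiplication order described above can be sketched like this, for column-major 4x4 matrices stored as `float[16]` in OpenGL's convention (element `[col*4 + row]`). The function names are illustrative, not from this thread:

```cpp
// c = a * b for column-major 4x4 matrices.
void mul4(const float a[16], const float b[16], float c[16])
{
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            float sum = 0.0f;
            for (int k = 0; k < 4; ++k)
                sum += a[k * 4 + row] * b[col * 4 + k];
            c[col * 4 + row] = sum;
        }
}

// modelViewProj = projection * view * model. Applied to a column vector
// this transforms right-to-left: model first, then view, then projection.
void buildMVP(const float proj[16], const float view[16],
              const float model[16], float mvp[16])
{
    float pv[16];
    mul4(proj, view, pv);    // projection * view
    mul4(pv, model, mvp);    // (projection * view) * model
}
```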
I know this is gonna seem stupid, but I figured out the problem. I knew it couldn't have been the matrix code or the ordering, since I'd basically copied those straight from another example, so I was perplexed as to what could be wrong. I finally spotted it while looking around in the glew header.
I went to all the trouble of making sure my matrices were column-major so GL would like them, but then used the wrong function to pass them in. So as a result, it was sending the transpose, rather than the unmodified matrix.
Sorry about the wasted time. :(
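For anyone hitting the same thing: glUniformMatrix4fv(location, count, transpose, data) interprets `data` as column-major when `transpose` is GL_FALSE, and transposes it for you when GL_TRUE, so passing the wrong flag (or a `*Transpose*` variant) silently sends the transposed matrix. A small CPU-side illustration of what that transpose does to the layout (the helper is mine, not from this thread):

```cpp
// Transposes a 4x4 matrix stored as float[16]. In a column-major matrix
// the translation lives in elements 12..14; after transposing (i.e. the
// row-major layout) it ends up in elements 3, 7, and 11, which is why a
// wrong transpose flag scrambles the transform instead of failing loudly.
void transpose4(const float in[16], float out[16])
{
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            out[row * 4 + col] = in[col * 4 + row];
}
```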