using own matrix in opengl

21 comments, last by Zakwayda 16 years, 9 months ago
Quote:Original post by juxie
Quote:Original post by jyk
I've heard people say that, but I'm not exactly sure what they mean by it, and in any case I would argue that it's not really accurate (OpenGL and DirectX deal with transforms in basically the same way).

I think what people are referring to here is the difference in notational convention between the two APIs (row-vector vs. column-vector notation) and the implications for multiplication order.

When using the DirectX math library this is directly evident, but in OpenGL everything happens 'under the hood'. It could probably be argued that OpenGL itself doesn't really assume a notational convention; rather, it is simply the case that transforms are applied in the opposite of the order in which the corresponding function calls appear in the code. (Most OpenGL references use column-vector notation, however, so this is how people tend to think of things when working with OpenGL transform functions.)
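The difference in multiplication order can be made concrete. Below is a small sketch of my own (not from the thread), using plain Python lists as 3x3 homogeneous 2D matrices: the same "translate, then rotate" sequence written once in column-vector notation and once in row-vector notation, where each matrix is the transpose of its column-vector form.

```python
def mat_mul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def transpose(m):
    return [list(col) for col in zip(*m)]

# Column-vector convention: points are columns, transforms multiply on
# the left, so "translate, then rotate" is written R * T * p.
T = [[1, 0, 5],
     [0, 1, 0],
     [0, 0, 1]]          # translate x by 5
R = [[0, -1, 0],
     [1,  0, 0],
     [0,  0, 1]]         # rotate 90 degrees counter-clockwise
p = [[1], [0], [1]]      # point (1, 0) as a homogeneous column

col_result = mat_mul(R, mat_mul(T, p))

# Row-vector convention: the point is a row, transforms multiply on the
# right, and the same sequence reads left to right: p^T * T^T * R^T.
pT = [[1, 0, 1]]
row_result = mat_mul(mat_mul(pT, transpose(T)), transpose(R))

print(col_result)   # [[0], [6], [1]]
print(row_result)   # [[0, 6, 1]] -- same point, same order of effects
```

Both conventions land the point at (0, 6); only the written order of the factors differs.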

It's all a bit confusing, but I think the first thing you need to understand is that OpenGL and D3D/DirectX are fundamentally the same in terms of how they deal with transforms. I say 'fundamentally' because there are a number of superficial differences - for example, D3D maintains separate world and view matrices, while OpenGL combines them into a single modelview matrix - but the concepts are essentially the same.
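One of those superficial differences can be checked directly: keeping two stages separate (D3D style) versus pre-multiplying them into one combined matrix (OpenGL's modelview style) transforms a point identically. A sketch under my own assumptions (toy 3x3 homogeneous 2D matrices, not from the thread):

```python
def mat_mul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

world = [[1, 0, 2],
         [0, 1, 0],
         [0, 0, 1]]      # "world": translate x by 2
view  = [[0, -1, 0],
         [1,  0, 0],
         [0,  0, 1]]     # "view": rotate 90 degrees counter-clockwise
p = [[1], [1], [1]]      # point (1, 1) as a homogeneous column

# Apply the two matrices one after the other (separate stages)...
separate = mat_mul(view, mat_mul(world, p))

# ...or pre-multiply them into one combined "modelview" matrix.
modelview = mat_mul(view, world)
combined = mat_mul(modelview, p)

print(separate)   # [[-1], [3], [1]]
print(combined)   # [[-1], [3], [1]] -- identical
```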


I feel bad that I'm still pretty much confused.
I've tried to read quite a number of articles, but they seem to touch only on simpler transformations.
I'm not sure when I've done a transformation correctly and when I've done it incorrectly.

Is there anywhere I can read up on this?

Thanks.


Do you have a math program like Octave installed?
I would suggest doing some transformations by hand and examining the results to get an intuition for how they behave.
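As a stand-in for that experiment (an assumption of mine: Python with only the standard library, in case Octave isn't handy), you can build a rotation matrix by hand, apply it to the unit axes, and watch where they land:

```python
import math

def rotation(theta):
    """2x2 rotation matrix for angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s],
            [s,  c]]

def apply(m, v):
    """Apply a 2x2 matrix to a 2D vector."""
    return [sum(m[i][j] * v[j] for j in range(2)) for i in range(2)]

R = rotation(math.pi / 2)          # 90 degrees counter-clockwise
x_axis = apply(R, [1, 0])
y_axis = apply(R, [0, 1])

# Rounding hides floating-point noise: x lands on +y, y lands on -x.
print([round(c, 6) for c in x_axis])   # [0.0, 1.0]
print([round(c, 6) for c in y_axis])   # [-1.0, 0.0]
```

Seeing the basis vectors move is often the quickest way to build the intuition Basiror describes.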

http://www.8ung.at/basiror/theironcross.html
Quote:Original post by Basiror

Do you have a math program like Octave installed?
I would suggest doing some transformations by hand and examining the results to get an intuition for how they behave.


I have just downloaded Octave.
I'll give it a try, and hopefully it will help me understand matrices better.
Thanks, everyone.
Quote:Original post by Basiror
Quote:Original post by jyk
Just to make sure I'm not misunderstanding, when you say 'multiply from the right' are you referring to column-vector notation, or row-vector notation?
column
Oh, ok. Well, I'm with you on column-vector notation, and it does seem that most academic references use this convention (as do, of course, most OpenGL references). However, since DirectX (which uses row-vector notation) is so prevalent, I would guess that row-vector and column-vector notation are used about equally overall.

As far as intuitiveness goes, row-vector advocates point out that sequences of transforms read naturally from left to right when written using row-vector notation, while column-vector advocates sometimes use the counter-argument that column-vector notation closely mirrors function composition.
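The function-composition argument can be seen directly: in column-vector notation, R * T * p applies T first, exactly as the nested call applies the innermost function first. A quick illustration with my own toy transform functions (assumed, not from the thread):

```python
def translate_x5(p):
    """Translate a 2D point by +5 in x."""
    return (p[0] + 5, p[1])

def rotate_90ccw(p):
    """Rotate a 2D point 90 degrees counter-clockwise."""
    return (-p[1], p[0])

# Column-vector notation R * T * p mirrors this nesting exactly:
# the transform written nearest the point is applied first.
result = rotate_90ccw(translate_x5((1, 0)))
print(result)   # (0, 6)
```

Row-vector notation would write the same sequence as p * T * R, which reads left to right in the order the transforms happen.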

Again, I tend to use column vectors myself, but I'm still not sure if one or the other convention can be said to be more intuitive (or more prevalent overall) than the other.

