Archived

This topic is now archived and is closed to further replies.

DanG

"Disabling" OGL Matrices?


In my current app, I manually do all the matrix math, including rotations, translations, scaling, etc., and multiply EVERY vertex by the correct matrix, then by my projection matrix. Right now I set up all the OpenGL matrices to identity except the camera-to-screen matrix (I think that's set up by glViewport calls?) and use a default ortho camera at the origin. The thing is that when I give these manipulated vertices to OpenGL, they are still multiplied by the MODELVIEW and PROJECTION matrices, which are just identities. Is there a way to force OpenGL to stop multiplying the vertices and just render them as input?
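For reference, a minimal sketch of the setup described above, assuming legacy fixed-function OpenGL; Vec4, my_transform, vertices, and vertex_count are hypothetical stand-ins for the app's own types and math:

#include <GL/gl.h>

/* Hypothetical vertex type and transform, standing in for the app's own math. */
typedef struct { float x, y, z, w; } Vec4;
extern Vec4 my_transform(Vec4 v);      /* app's combined model/view/projection */
extern Vec4 vertices[];
extern int  vertex_count;

void draw_pretransformed(int width, int height)
{
    glViewport(0, 0, width, height);   /* viewport transform still applies */

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();                  /* projection left as identity */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();                  /* modelview left as identity */

    glBegin(GL_TRIANGLES);
    for (int i = 0; i < vertex_count; ++i) {
        /* The app does all the matrix math; GL still multiplies the result
           by the (identity) MODELVIEW and PROJECTION matrices. */
        Vec4 v = my_transform(vertices[i]);
        glVertex4f(v.x, v.y, v.z, v.w);
    }
    glEnd();
}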

quote:
Original post by tangentz
This may sound obvious, but how about if you just don't call glLoadIdentity()?



Again, I don't know enough about OGL, but it seems to me that the vertices are going to be multiplied by the OpenGL matrices regardless of the glLoadIdentity call. If I don't make the identity call, then the vertices are just being multiplied by an unknown matrix.

quote:
Original post by chowe6685
If the matrices are identities, why do you care whether they are used?


I've already done the calculations; I don't want OpenGL to waste time multiplying all the vertices by some identity matrices. I don't really know the speed penalty, but I simply don't want OpenGL doing this.

Multiplying by the default identities is nothing to worry about. No, I don't think it's possible to disable them. How would OpenGL know how to render what's input? It wouldn't have any information; just because you have it doesn't mean it does.

------------
Where'd the engine go? Where'd it go? Aaaaaah!!
MSN: nmaster42@hotmail.com, AIM: LockePick42, ICQ: 74128155

Man, just give him an answer!

Quoted from the www.opengl.org FAQ:

quote:

9.040 How do I bypass OpenGL matrix transformations and send 2D coordinates directly for rasterization?

There isn't a mode switch to disable OpenGL matrix transformations. However, if you set either or both matrices to the identity with a glLoadIdentity() call, typical OpenGL implementations are intelligent enough to know that an identity transformation is a no-op and will act accordingly.

More detailed information on using OpenGL as a rasterization-only API is in the OpenGL Game Developer’s FAQ.
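A minimal sketch of what that FAQ answer describes, assuming legacy fixed-function OpenGL: both matrices loaded with identity, and coordinates submitted already in clip space (x and y in [-1, 1]), so they pass through effectively unchanged.

#include <GL/gl.h>

/* Both matrices set to identity; coordinates are already in clip space,
   so the (formally still applied) matrix multiplies change nothing. */
void draw_clip_space_quad(void)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBegin(GL_QUADS);
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.5f,  0.5f);
    glVertex2f(-0.5f,  0.5f);
    glEnd();
}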


OpenGL has functions that let you load or multiply your own matrix onto its matrix stacks. So instead of multiplying your matrices into your vertices yourself, just let OpenGL do it for you. That way only one matrix is applied to the vertices, and it will be your own.
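A minimal sketch of that idea, assuming legacy fixed-function OpenGL; my_matrix is a hypothetical column-major 4x4 array (the layout glLoadMatrixf expects) holding the app's combined transform:

#include <GL/gl.h>

/* Hand OpenGL the app's combined matrix instead of transforming every
   vertex on the CPU. */
void use_own_matrix(const GLfloat my_matrix[16])
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();              /* keep projection out of the way */
    glMatrixMode(GL_MODELVIEW);
    glLoadMatrixf(my_matrix);      /* load the app's own matrix */
    /* ... submit untransformed vertices as usual ... */
}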

BTW, if you want to use hardware T&L then you're not going the right way about it. If you're at all concerned about performance you should let OpenGL handle the maths. Of course, for all I know your target system is a 2GHz Athlon with a TNT2.

____________________________________________________________
www.elf-stone.com
