"Disabling" OGL Matrices?

Started by
9 comments, last by DanG 21 years, 8 months ago
In my current app, I manually do all the matrix math, including rotations, translations, scaling, etc., and multiply EVERY vertex by the correct matrix, then by my projection matrix. Right now I set up all OpenGL matrices to identity except the camera-to-screen matrix (I think that's made by glViewport calls?) and use a default ortho camera at the origin. The thing is that when I give these manipulated vertices to OpenGL, they are still multiplied by the MODELVIEW and PROJECTION matrices, which are just identities. Is there a way to force OpenGL to stop multiplying the vertices and just render them as inputted?
Ambassador: Mr. Bush are you stoned or just really, REALLY dumb?Pres. Bush - I assure you I am not stoned.
This may sound obvious, but, how about if you just don't call
glLoadIdentity()? I hope you did all the calculations right.


Kami no Itte ga ore ni zettai naru!
神はサイコロを振らない!
quote:Original post by tangentz
This may sound obvious, but, how about if you just don't call
glLoadIdentity()?


Again, I don't know enough about OGL, but it seems to me that the vertices are going to be multiplied by OpenGL matrices independent of the glLoadIdentity call. If I don't make the identity call, then the vertices are being multiplied by an unknown matrix.

if the matrices are identities, why do you care whether they are used?
quote:Original post by chowe6685
if the matrices are identities, why do you care whether they are used?


I've already done the calculations; I don't want OpenGL to waste time multiplying all the vertices by some identity matrices. I don't really know the speed penalty, but I simply don't want OpenGL doing this.

Multiplying with the default identities is nothing to worry about. No, I don't think it's possible to disable them. How would OpenGL know how to render what's inputted? It wouldn't have any information. Just because you have that information doesn't mean OpenGL does.

------------
Where'd the engine go? Where'd it go? Aaaaaah!!
MSN: nmaster42@hotmail.com, AIM: LockePick42, ICQ: 74128155
_______________________________________Pixelante Game Studios - Fowl Language
Man, just give him an answer!

Quoted from the www.opengl.org FAQ:

quote:
9.040 How do I bypass OpenGL matrix transformations and send 2D coordinates directly for rasterization?

There isn't a mode switch to disable OpenGL matrix transformations. However, if you set either or both matrices to the identity with a glLoadIdentity() call, typical OpenGL implementations are intelligent enough to know that an identity transformation is a no-op and will act accordingly.

More detailed information on using OpenGL as a rasterization-only API is in the OpenGL Game Developer’s FAQ.

Gaiomard Dragon-===(UDIC)===-
THANKS! I guess I'll just keep it as is.
OpenGL has a few functions that let you multiply onto or load your own matrix into its matrix stacks. So instead of multiplying your matrices into your vertices yourself, just let OpenGL do it for you. That way you will only apply one matrix to the vertices, and it will be your own.
BTW, if you want to use hardware T&L then you're not going the right way about it. If you're at all concerned about performance you should let OpenGL handle the maths. Of course, for all I know your target system is a 2GHz Athlon with a TNT2.

____________________________________________________________
www.elf-stone.com


This topic is closed to new replies.
