Posted 11 October 2001 - 06:36 PM
Well, I'm making an app in OpenGL. It turns out that if I specify textures larger than 256x256, they get resized down to 256x256, either by my Voodoo card or by something else. Also, I can't specify a bit depth higher than 16 (though I know my system can display 24/32 bits). It doesn't crash or anything; it just never shows more than 65536 colors, which actually makes 24-bit pictures look bad. This only happens on some computers; on others it works fine.
I know that DirectX and other APIs manage to do both of those things even on my crappy Voodoo 3 card. Is there anything one can do during OpenGL setup/initialization to make sure these things are supported on more than just a few graphics cards? Does anyone know?