GL_COLOR_INDEX for textures.



I just have to ask; I've been trying all day with no results. Is it possible to use GL_COLOR_INDEX as the format for a texture? I can't find any tutorial that shows how. On reference sites I see a lot of info about it, but also notes that it is undefined, or something I don't understand. Please tell me, is it possible? Thanks

OK, I understand now that it is converted to RGB if I choose GL_COLOR_INDEX, but is there some other way I can do it? glDrawPixels was far too slow.
I was once told that I could fake it with a LUMINANCE texture, plus a texture holding the palette colors, and then something with pixel shading.
But I don't understand much of that. Does anyone understand what he meant, or know another way I can do it? The goal is that I want to be able to change one color and have all my textures change with it; it takes too long to reload all the textures.

I need advice, please

So you want the textures to be in color-index format, and you want those textures to change automatically when the palette changes...
As far as I know, you're stuck with these options:
  • If you have an nVidia card: look for the GL_EXT_paletted_texture and GL_EXT_shared_texture_palette extensions. (If you need help using these extensions, I can give you some code.)

  • If you have an ATI card: check this post.

  • If none of these applies: you'll have to write your own (fragment/texture) shader.

Post if I left you any doubts [wink]

Thanks a lot for the answer.
I've got an nVidia GeForce 2.
I'm new to OpenGL, so I don't exactly understand how to check these things.
But I'm really interested to know how =)
and how they are used; no response from Google =/

it takes too long to reload all the textures

Are you sure? How many textures, and at what resolution are they?

Well, in fact that is exactly what I'm doing right now. It feels wrong to reload all the textures every frame, but it runs smoothly, for the moment.
So far I've only tried with about 4-5 textures.
They are about 128x128.
In the end there will be, say, 75-125 of them, and since they have to be converted to RGB they take three times the memory that color-index textures would.
I'll stick with this for now.
But I hope to find a better solution!

So, if you want a better solution: I'm pretty sure your card supports paletted textures. It will be cleaner, faster, and use less memory [wink]

Here's how to do it:
Okay, first of all you'll be using extensions, so you need to download the glext.h header and include it somewhere after "gl.h".
Then you'll need a pointer to this special extension function.
You will need just one:

PFNGLCOLORTABLEEXTPROC glColorTableEXT = NULL;
Then you will have to initialize it somewhere in your initialization function and enable the shared palette:

glColorTableEXT = (PFNGLCOLORTABLEEXTPROC) wglGetProcAddress("glColorTableEXT");
glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);

So now you will load your textures (just once!) like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, texWidth, texHeight,
             0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, texData); // texData: your 8-bit index data

(This is assuming your palettes are 256 entries wide)

And every time you want to change your palette you will do something like this:

glColorTableEXT(GL_SHARED_TEXTURE_PALETTE_EXT, GL_RGB8, 256,
                GL_RGB, GL_UNSIGNED_BYTE, palette); // palette: 256 RGB triplets
I think it should run fine on your GeForce2, but if you get invalid pointer errors, it probably isn't supported. So you should always check whether these extensions are available, just in case. To do that, you'll call glGetString(GL_EXTENSIONS) and look for these strings: "GL_EXT_paletted_texture" and "GL_EXT_shared_texture_palette". The right thing to do would be to make your app always check for these strings and then decide whether it can use the extensions or should fall back to software-mode, extension-free routines. But for now, you could just call that and inspect the result with a breakpoint.

Here are a couple of nice places to look for info on extensions:
The OpenGL FAQ - Nice place to start from.
The OpenGL Extensions Registry - Lots of specs and more technical stuff.
And there's also a NeHe Tutorial which deals with extensions.

Hope that helped! Tell me if it worked [smile]

Thanks a lot!!! It works amazingly...
...I think. I'm too tired to change the code so it only loads the textures once.
But it works when I use those parameters and do the ColorTable thing, so I think it should be fine. I'll fix it tomorrow! =)

I just want to ask one more question. It's not that important, I'm just interested.
When I call glTexImage2D it never works. I don't get any errors, but when I draw my texture, only the color I selected last is shown.
No texture mapping, just solid color.

But if I call gluBuild2DMipmaps it works.
I always send dimensions that are a power of 2, so do you have any idea why it doesn't work? Do I have to do something more?
When I look up what the function does, it just scales the image and then uses glTexImage2D. Strange, huh?!

Hehe, good that it worked =]
You say you don't get any errors, but did you really try a glGetError() right after the call to glTexImage2D?
Maybe something is wrong with the function call. And I didn't understand whether you're using mipmaps or not. If you are, maybe you're not setting GL_TEXTURE_MIN_FILTER or GL_TEXTURE_MAG_FILTER correctly, e.g. GL_TEXTURE_MIN_FILTER to GL_NEAREST_MIPMAP_LINEAR or something like that. Or maybe you just forgot to glEnable(GL_TEXTURE_2D). I don't know, but I'm sure you'll find out. [wink]
