
Optimizing older paletted images


Hi, I already said it in another thread: we are working on a remake of Settlers 2. But we have a "little" problem with performance, or rather with the graphics card memory we waste. The images are 8-bit with a palette, but now I have to use 32-bit textures. That means they need four times the memory, to say nothing of the extra memory needed to round the image widths and heights up to powers of two... So the game needs approximately 60-80 MB of graphics memory. We tested it e.g. on a Geforce 2 MX (32 MB afaik?) and got only 40-90 fps (depending on the resolution). With a Geforce 4 Ti (64 MB) I got 300-600 instead (probably just enough memory). Are there any ideas to optimize it? I know they did it differently back then, but maybe there are some good ideas to optimize it, because the game is 10 years old and it wouldn't be nice if it only ran fluidly on a Geforce 2 at 640x480. [headshake] I have heard of the GL_EXT_paletted_texture extension. It could be a good alternative on older graphics cards, but for example my extension viewer tells me that my card (the Geforce 4) doesn't support 8-bit textures (so I have to use 16-bit indices, or what? oO)
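For reference, the 32-bit expansion I mean looks roughly like this. It is only a sketch; our real loader differs, and the 3-byte RGB palette layout and the index-0 color key are just assumptions here:

#include <stdlib.h>

/* Expand an 8-bit paletted image to RGBA8888.
   indices: w*h palette indices, palette: 256*3 bytes of RGB.
   Index 0 is treated as the transparent color key (assumption) and gets alpha 0. */
unsigned char *expandPalette(const unsigned char *indices,
                             const unsigned char *palette,
                             int w, int h)
{
    unsigned char *rgba = malloc((size_t)w * h * 4);
    for (int i = 0; i < w * h; ++i) {
        unsigned char idx = indices[i];
        rgba[i * 4 + 0] = palette[idx * 3 + 0];
        rgba[i * 4 + 1] = palette[idx * 3 + 1];
        rgba[i * 4 + 2] = palette[idx * 3 + 2];
        rgba[i * 4 + 3] = (idx == 0) ? 0 : 255;   /* color key -> transparent */
    }
    return rgba;
}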

If you are targeting the Geforce 2, then you can use paletted textures directly with the EXT_paletted_texture extension. However, starting from the Geforce 4, paletted textures are no longer supported in hardware, and the extension does not exist.

You could simply use the extension if it exists, and if it doesn't, convert to RGBA8888 at runtime and hope that all of your textures fit in memory. Not the ideal solution, but it might be sufficient for your case.
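Something like this, as a rough sketch (error handling omitted; expandToRGBA8888 is a hypothetical helper along the lines of the expansion sketched above, the palette here is assumed to be 256 RGBA entries, and glColorTableEXT has to be loaded as an extension function pointer, e.g. via wglGetProcAddress):

/* Pick the paletted path if the extension is advertised, else expand to RGBA8888. */
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
int hasPalette = ext && strstr(ext, "GL_EXT_paletted_texture") != NULL;

if (hasPalette) {
    /* 256-entry palette plus 8-bit indices -> 1 byte per texel in VRAM */
    glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256, GL_RGBA, GL_UNSIGNED_BYTE, palette);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, w, h, 0,
                 GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
} else {
    unsigned char *rgba = expandToRGBA8888(indices, palette, w, h);  /* hypothetical helper */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    free(rgba);
}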

You could also look into DXT texture compression which actually gives 6:1 compression on RGB888 images, and 8:1 or 4:1 compression on RGBA8888 images, depending on the format used. I don't know what kind of artwork you are using, but if it is mostly pixel-art, you'll probably find that DXT compression results in too much loss of quality. However, for normal images, DXT compression is quite good, and is actually the standard for today's games.
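If you try DXT, the quickest experiment (assuming the driver exposes GL_EXT_texture_compression_s3tc) is to hand glTexImage2D a compressed internal format and let the driver compress on upload. An offline compressor usually gives better quality, but for a test:

/* Driver-side DXT5 compression at upload time (needs the S3TC extension). */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);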

Good luck!

EXT_paletted_texture is not supported everywhere. I think ATI cards don't support it, and only early Detonator drivers supported it on the Geforce 2 and so on.
16-bit textures or DXT3/DXT5 compressed textures are the better solution.

Also, are you using those 60-80 MB of textures all at once in one scene? Using 16-bit or DXT may improve performance.
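For the 16-bit option you only have to change the internal format and let the driver convert on upload; GL_RGB5_A1 (5 bits per color channel plus a 1-bit alpha) usually fits color-keyed sprites better than GL_RGBA4. A sketch, with w/h/rgbaPixels standing in for your data:

/* Request a 16-bit internal format; the source data can stay RGBA8888. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);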

Quote:

You could also look into DXT texture compression which actually gives 6:1 compression on RGB888 images, and 8:1 or 4:1 compression on RGBA8888 images, depending on the format used. I don't know what kind of artwork you are using, but if it is mostly pixel-art, you'll probably find that DXT compression results in too much loss of quality. However, for normal images, DXT compression is quite good, and is actually the standard for today's games.



It's all pixel art. ;) Here are some examples:

Soldiers animation
Some icons of the GUI
Some buildings
Some trees

Quote:

If you are targeting the Geforce 2, then you can use paletted textures directly with the EXT_paletted_texture extension. However, starting from the Geforce 4, paletted textures are no longer supported in hardware, and the extension does not exist.


The Geforce 4 supports the GL_EXT_paletted_texture extension, it's in the report, but there is also this:

Quote:

No paletted texture support
This may cause performance loss in some older applications.


But this piece of text is in the Geforce 2 report, too. Does that mean there is only software support? [looksaround]

Quote:

Also, are you using those 60-80MB textures all at once in one scene?


Most of the textures are used, but only some of the time. E.g. the GUI stuff: you only need it when you open a window, but then you do need it, and usually no window is open. Of the many different little humans there are maybe 10 of 32 on screen (with all their animation steps). The same with the buildings: you never see all of them at once, but you can see any of them at any time, and all the buildings have separate animations... We thought of setting the priority of the important textures (the terrain textures and the screen border) high. But the rest is very varied and we can't say what is important and what is not.
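What I mean is something like this (glPrioritizeTextures is only a hint to the driver, and the texture names here are made up):

/* Hint that terrain and screen-border textures should stay resident (1.0 = highest). */
GLuint   ids[2]        = { terrainTex, borderTex };   /* hypothetical texture IDs */
GLclampf priorities[2] = { 1.0f, 1.0f };
glPrioritizeTextures(2, ids, priorities);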

Quote:

Using 16 bit or DXT may improve performance.


I don't know whether 16 bit is enough (4 bits each for A, R, G, B?), because the palette data is of course 24-bit; I'm going to test it. BTW: I don't really need the alpha channel, but there are no color keys in OpenGL, so I use it for transparency ;).

Quote:
Original post by Carrier
It's all pixel art. ;)

Hmm... some of those wouldn't look too bad with DXT, but others would look pretty crappy. You've got a pretty big mix there. :)

DXT compresses textures by storing only two primary colors per 4x4 tile -- two more are blended between those. Basically this means that if you use more than four colors in a 4x4 tile, colors will be lost. This does not work well with pixel art.
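Roughly, a DXT1 block looks like this; 8 bytes cover a whole 4x4 tile, so every texel has to pick one of just four colors (assuming a 32-bit unsigned int):

/* One DXT1 (BC1) block: 16 texels in 8 bytes, i.e. 4 bits per texel. */
typedef struct {
    unsigned short color0;    /* base color 0, RGB565 */
    unsigned short color1;    /* base color 1, RGB565 */
    unsigned int   selectors; /* 16 x 2-bit indices: the two base colors or the two blends */
} Dxt1Block;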

I would suggest giving it a try, however. Maybe just converting the GUI controls to DXT would be sufficient for you?

Quote:

The Geforce 4 supports the GL_EXT_paletted_texture extension, it's in the report, but there is also this:
Quote:

No paletted texture support
This may cause performance loss in some older applications.

But this piece of text is in the Geforce 2 report, too. Does that mean there is only software support? [looksaround]

I apologize -- you are correct. The Geforce 4 does support paletted textures, but it is the last in the Geforce line to support them.

I have a trusty Geforce 2 Ultra 64 MB right here and it blazes right through with paletted textures just fine. There is definitely no unpacking going on because I can use all of the memory without any slowdowns.

Quote:

I have a trusty Geforce 2 Ultra 64 MB right here and it blazes right through with paletted textures just fine. There is definitely no unpacking going on because I can use all of the memory without any slowdowns.


Yes, I would be happy if the extension is available on the Geforce 2. But then I don't understand why that message is in the report. As I said, the extension is listed as supported, but then I don't understand the note. It's quite contradictory. [oh]

Quote:
Original post by Carrier
Yes, I would be happy if the extension is available on the Geforce 2. But then I don't understand why that message is in the report. As I said, the extension is listed as supported, but then I don't understand the note. It's quite contradictory. [oh]

I would suggest the report is incorrect. From NVidia, regarding the EXT_paletted_texture extension:

Quote:
Selected NVIDIA GPUs: NV1x (GeForce 256, GeForce2, GeForce4 MX, GeForce4 Go, Quadro, Quadro2), NV2x (GeForce3, GeForce4 Ti, Quadro DCC, Quadro4 XGL), and NV3x (GeForce FX 5xxxx, Quadro FX 1000/2000/3000).

NV3 (Riva 128) and NV4 (TNT, TNT2) GPUs and NV4x GPUs do NOT support this functionality (no hardware support).

Future NVIDIA GPU designs will no longer support paletted textures.


That should answer your question for NVidia's cards. Now about ATI, I'm afraid I won't be able to help you out there. :(

I'm surprised the extension is available on the Geforce FX.

You can find a list here
delphi3d.net


I thought they had removed it from all their drivers long ago.

Also, when you are rendering, are you using alpha testing?
It should improve performance on those old GPUs but not on newer ones. Another thing that drags performance down is too much overdraw (fill rate). You can try rendering front to back.
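For reference, the fixed-function alpha test is just this (the 0.5 threshold is an arbitrary example -- tune it to your color-keyed alpha):

glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);   /* discard fragments with alpha <= 0.5 */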

Hopefully, there isn't any texture swapping taking place. That depends on how many textures you use in a scene.

[Edited by - V-man on April 20, 2006 8:54:23 AM]

No, I don't use alpha testing. Yes, I've noticed that fill rate is decisive in games like this. I already use VBOs for rendering the terrain, but I didn't get much more fps.
I've decided to use the extension if possible and, if not, 32-bit images. The modern graphics cards that don't support it have enough memory in any case. :)

BTW: I still have a question. When I generate a texture with glTexImage2D, the memory the game uses grows in the Task Manager. It looks like the textures are also created in RAM..?

Quote:
Original post by Carrier
BTW: I still have a question. When I generate a texture with glTexImage2D, the memory the game uses grows in the Task Manager. It looks like the textures are also created in RAM..?

That's correct. When you call glTexImage2D, OpenGL makes an internal copy of the texture for itself. This is how it automatically handles paging in and out of textures from VRAM.

Once you've uploaded a texture with glTexImage2D, you can free your in-memory texture.
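In other words, something like this (loadSprite is a hypothetical loader returning a malloc'd RGBA8888 buffer):

GLuint tex;
int w, h;
unsigned char *pixels = loadSprite("house.bmp", &w, &h);   /* hypothetical loader */

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

free(pixels);   /* safe: the driver keeps its own copy from here on */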

Quote:

Once you've uploaded a texture with glTexImage2D, you can free your in-memory texture.


Hmm, and how?
I don't mean the memory I allocate for the pixel data and pass to the function. I free that, of course.

Quote:
Original post by Carrier
... to say nothing of the extra memory needed to round the image widths and heights up to powers of two...


You probably know this, but you can pack all the images into one big texture and make only that a power of two, so very little is wasted (there are algorithms to pack the images together). You can then map parts of that texture onto each sprite (quad).
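A rough sketch of what I mean (the 1024x1024 atlas size and the AtlasRect type are made up, and the v direction depends on how you load your images):

/* Draw one sprite from an atlas; r gives its sub-rectangle in pixels. */
typedef struct { int x, y, w, h; } AtlasRect;

void drawSprite(const AtlasRect *r, float px, float py)
{
    const float atlasW = 1024.0f, atlasH = 1024.0f;
    float u0 = r->x / atlasW,          v0 = r->y / atlasH;
    float u1 = (r->x + r->w) / atlasW, v1 = (r->y + r->h) / atlasH;

    glBegin(GL_QUADS);
        glTexCoord2f(u0, v0); glVertex2f(px,        py);
        glTexCoord2f(u1, v0); glVertex2f(px + r->w, py);
        glTexCoord2f(u1, v1); glVertex2f(px + r->w, py + r->h);
        glTexCoord2f(u0, v1); glVertex2f(px,        py + r->h);
    glEnd();
}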

PS. the graphics look awesome! looks like my kind of game [smile]


Quote:
Original post by Carrier
Quote:

Once you've uploaded a texture with glTexImage2D, you can free your in-memory texture.


Hmm, and how?
I don't mean the memory I allocate for the pixel data and pass to the function. I free that, of course.

That's what I was referring to. :)

Once you pass the memory to glTexImage2D, you no longer need to keep the buffer around for yourself. If you did, there would be two copies in memory, yours and OpenGL's.

As I said earlier, OpenGL has to keep its own copy in order to handle texture paging. There is nothing you can do about this -- it is how OpenGL works.
