
Archived

This topic is now archived and is closed to further replies.

Texture Compression



Hi, I wanted to ask something. If I do texture compression (S3TC) in real time (load BMP -> convert to S3TC -> use as texture), will it hurt performance a lot?

Not really, S3TC was meant to be quite fast to compress (it's even faster to decompress).

Although, you probably don't want to do it in software; get the hardware to do it! In OpenGL you just do this:

  
GLuint texture;

glGenTextures( 1, &texture );
glBindTexture( GL_TEXTURE_2D, texture );
glTexImage2D( GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_ARB, width, height, 0,
              GL_RGB, GL_UNSIGNED_BYTE, data );


where data, width and height are all set beforehand. I guess it's similar in DirectX...


dHarding.net - Just click it.

Why not just convert it at design time? That'll save the middle step, unless you have a specific need for using a bitmap.



How many Microsoft employees does it take to screw in a light bulb?
None, they just declare darkness as a new standard.

>> why not just convert it during design time? That'll save the middle step unless you have a specific need for using a bitmap.

How can you save an image in the same format as the card uses for its texture compression? I've never heard of this...

------------
- outRider -

