

OpenGL TexImage with GL_RGB vs. GL_RGBA


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

2 replies to this topic

#1 m_shade   Members   -  Reputation: 122


Posted 12 February 2000 - 03:29 PM

I'm loading a texture image that doesn't require any alpha data. In 16-bit mode I notice a considerable difference in image quality between supplying pixel data for RGB only, leaving out A, and adding the A to my pixel data and setting it to 255 (full). I was wondering whether glTexImage2D, when you specify RGB, splits the bits 5/6/5 for R, G, and B, and when you specify RGBA, splits them 4/4/4/4 for R, G, B, and A. At first I thought it would always just fill in the alpha if it's not given, but this does not appear to be the case. I'm giving the data in 24- or 32-bit quantities per pixel. This may also be card/driver specific; I don't know, I've never seen it specified. I'm using a GeForce.

The reason this came up is because I had been using my own crude TGA loading routine on the project I'd been working on, but decided to grab the LoadTGA function from the Q1 source. After adding it, I noticed the texture quality of my skybox really sucked, so I compared the two functions and realized the Q1 source used RGBA and set the alpha to 255, while I had been using just RGB. I thought it was odd that this wasn't accounted for in the Q1 source (even though TGAs aren't really used in Q1, some of the skybox functionality is still there).


#2 DaBit   Members   -  Reputation: 122


Posted 16 February 2000 - 10:22 PM

You can give glTexImage2D a hint about what format to use. For example, if you want to hint the driver to use 16-bit textures, pass GL_RGB5 instead of GL_RGB for the 'internalFormat' parameter (the third one) of glTexImage2D. If you want to request 32-bit RGBA, pass GL_RGBA8.

Hope this helps,
DaBit.



#3 m_shade   Members   -  Reputation: 122


Posted 18 February 2000 - 07:38 AM

Ah, yes, that actually helps a lot; that's exactly the info I was looking for. I had forgotten all about the fact that you can specify the resolution of the color components with the internalFormat parameter.

Thanks







