babaliaris

OpenGL Image Formats: How do I tell glTexImage2D how to process the data for each file format?


Hello!

For those who don't know me, I've started quite a few threads about textures in OpenGL. I kept hitting bugs: textures not appearing correctly (even though my code and shaders were fine), or access violations in memory when uploading a texture to the GPU. I mostly assumed these were AMD driver bugs, because when someone else ran my code he got a correct result. Then someone told me: "Some driver implementations are more forgiving than others, so it may be that your driver doesn't forgive as easily. That might be why others see the output you were expecting." I didn't believe him and moved on.

Then @Hodgman shed some light on it for me. He explained a few things about images and what channels are (I had no clue), and with some research of my own I learned how digital images work in theory and what channels are. Reading this article about image formats taught me some more.

The question now is: if, for example, I want to upload a PNG to the GPU, can I be 100% sure I can use 4 channels? Even though the image is a PNG, it might not actually contain all 4 channels (RGBA). So I somehow need to retrieve that information, so the code below can tell the driver how to read the data based on the channel count.

I'm asking this so I know how to write the code below properly (the variables in capitals are the ones I want you to tell me how to specify):

stbi_set_flip_vertically_on_load(1);

//Try to load the image.
unsigned char *data = stbi_load(path.c_str(), &m_width, &m_height, &m_channels, HOW_MANY_CHANNELS_TO_USE);

//Image loaded successfully.
if (data)
{
    //Generate the texture and bind it.
    GLCall(glGenTextures(1, &m_id));
    GLCall(glActiveTexture(GL_TEXTURE0 + unit));
    GLCall(glBindTexture(GL_TEXTURE_2D, m_id));

    GLCall(glTexImage2D(GL_TEXTURE_2D, 0, WHAT_FORMAT_FOR_THE_TEXTURE, m_width, m_height, 0, WHAT_FORMAT_FOR_THE_DATA, GL_UNSIGNED_BYTE, data));
}

So, back to my question. If I load a PNG, tell stbi_load to use 4 channels, and then pass WHAT_FORMAT_FOR_THE_DATA = GL_RGBA to glTexImage2D, can I be sure the driver will read the data properly without an access violation?

I want to write code that, no matter the image file, will always read the data correctly and upload it to the GPU.

Almost 100% of the OpenGL tutorials and guides out there (even one I purchased on Udemy) don't explain any of this, which is why I was running into all these bugs and got stuck for months!

 

Here is some documentation about stbi_load that might help:

// Limitations:
//    - no 12-bit-per-channel JPEG
//    - no JPEGs with arithmetic coding
//    - GIF always returns *comp=4
//
// Basic usage (see HDR discussion below for HDR usage):
//    int x,y,n;
//    unsigned char *data = stbi_load(filename, &x, &y, &n, 0);
//    // ... process data if not NULL ...
//    // ... x = width, y = height, n = # 8-bit components per pixel ...
//    // ... replace '0' with '1'..'4' to force that many components per pixel
//    // ... but 'n' will always be the number that it would have been if you said 0
//    stbi_image_free(data)

 


Based on the stb documentation for stbi_load, your answer is just a little further down from what you quoted:

// An output image with N components has the following components interleaved
// in this order in each pixel:

// N=#comp components
// 1 grey
// 2 grey, alpha
// 3 red, green, blue
// 4 red, green, blue, alpha

HOW_MANY_CHANNELS_TO_USE should be 0, as per the documentation.

As for WHAT_FORMAT_FOR_THE_DATA, that will depend on what stbi_load returns in m_channels.

One idea:

GLenum format = GL_RGBA;

switch (m_channels)
{
    case 1: format = GL_RED;  break;
    case 2: format = GL_RG;   break;
    case 3: format = GL_RGB;  break;
    case 4: format = GL_RGBA; break;
}

Then pass format to glTexImage2D().

