Kylotan

BMPs in SDL and byte order


I'm working on the texture loading routine for the SDL/OpenGL 2D library mentioned in this thread. The problem I seem to be getting is that bitmap files (BMPs) don't seem to follow the rules. My process is:

1. Load an image using IMG_Load from the sdl_image library.
2. Use SDL_CreateRGBSurface to create a surface in a standard format (as suggested here).
3. Use SDL_BlitSurface to copy the loaded image onto the surface in the standard format.
4. Use that surface's pixels as the data for an OpenGL texture.

Now, I was led to believe that SDL_BlitSurface converts appropriately between surfaces that have different pixel formats, hence my using it here. However, when I load a BMP, it gets displayed with the red and blue channels reversed. JPG and PNG files are fine. It looks like the conversion never takes place. I was hoping not to have to sniff out the individual bitmasks for every different filetype/bitdepth/endianness, and instead rely on existing SDL code to do this. Am I missing something?

PS. TGA files seem to have the problem as well.
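Here's a minimal sketch of that pipeline, stripped down from what I'm actually doing (load_texture and the GL setup are placeholders rather than the real library code, and the masks assume a little-endian machine):

#include <SDL.h>
#include <SDL_image.h>
#include <SDL_opengl.h>

/* Sketch: load an image, blit it into a known 32-bit RGBA layout, upload to GL. */
GLuint load_texture(const char *path)
{
    SDL_Surface *loaded = IMG_Load(path);
    if (!loaded)
        return 0;

    /* "Standard format" target surface: R in the lowest byte (little-endian masks). */
    SDL_Surface *rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, loaded->w, loaded->h, 32,
                                             0x000000FF, 0x0000FF00,
                                             0x00FF0000, 0xFF000000);

    /* Disable per-surface alpha so the blit copies pixels instead of blending them. */
    SDL_SetAlpha(loaded, 0, SDL_ALPHA_OPAQUE);
    SDL_BlitSurface(loaded, NULL, rgba, NULL);   /* SDL should convert formats here */

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, rgba->w, rgba->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba->pixels);

    SDL_FreeSurface(rgba);
    SDL_FreeSurface(loaded);
    return tex;
}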

Could you post some code? The texture-loading code I've written does exactly what you describe (with the additional optimization that it checks if the surface is already in a "sane" format, and uploads that directly), and I've had no problems with it.
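For comparison, a rough sketch of the kind of check I mean (not my exact code; the masks assume a little-endian machine, so adjust for big-endian):

/* True if the surface is already 32-bit with R,G,B,A in the byte order GL expects. */
static int surface_is_gl_ready(const SDL_Surface *s)
{
    return s->format->BytesPerPixel == 4 &&
           s->format->Rmask == 0x000000FF &&
           s->format->Gmask == 0x0000FF00 &&
           s->format->Bmask == 0x00FF0000;
}

If that returns true I upload the surface's pixels directly as GL_RGBA; otherwise I convert first.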

SDL_ConvertSurface requires an existing surface to convert to. I don't have an existing surface, and when I create one I may as well blit to it.

The code I was using is essentially this:
ftp://ptah.lnf.kth.se/pub/misc/convgltex.c. That code is followed by:


SDL_SetAlpha(textureImage, 0, SDL_ALPHA_OPAQUE);
SDL_Surface* myNewSurface = conv_surf_gl(textureImage, true);

// Create texture from those pixels
glTexImage2D(GL_TEXTURE_2D,
             0,                 // base image level
             GL_RGBA8,          // internal format, don't change
             me.width,
             me.height,
             0,                 // no border
             GL_BGRA_EXT,       // external format
             GL_UNSIGNED_BYTE,
             myNewSurface->pixels);

The linked file was posted on the SDL list several years ago by someone who supposedly knew what he was doing.

I think the problem is how the 24-bit textures get created. Try this instead:

#else
    if (want_alpha) {
        rmask = 0x000000ff;
        gmask = 0x0000ff00;
        bmask = 0x00ff0000;
        amask = 0xff000000;
    } else {
        rmask = 0x0000ff;
        gmask = 0x00ff00;
        bmask = 0xff0000;
        amask = 0x000000;
    }
#endif

Note that you've got to adjust your glTexImage2D call as well for 24-bit images: GL_BGR instead of GL_BGRA.
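For the 24-bit case the adjusted upload might look something like this (a sketch, assuming the surface's bytes really do end up in B,G,R order; GL_BGR_EXT comes from the EXT_bgra extension, the same place GL_BGRA_EXT lives):

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // 24-bit rows may not be 4-byte aligned
glTexImage2D(GL_TEXTURE_2D,
             0,                 // base image level
             GL_RGB8,           // internal format (no alpha)
             me.width,
             me.height,
             0,                 // no border
             GL_BGR_EXT,        // external format for 24-bit BGR data
             GL_UNSIGNED_BYTE,
             myNewSurface->pixels);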

Quote:
Original post by Kylotan
SDL_ConvertSurface requires an existing surface to convert to.
No, just a pixel format.
typedef struct {
    SDL_Palette *palette;
    Uint8  BitsPerPixel;
    Uint8  BytesPerPixel;
    Uint32 Rmask, Gmask, Bmask, Amask;
    Uint8  Rshift, Gshift, Bshift, Ashift;
    Uint8  Rloss, Gloss, Bloss, Aloss;
    Uint32 colorkey;
    Uint8  alpha;
} SDL_PixelFormat;

SDL_PixelFormat GL_conversion_format = {
    NULL,          /* palette */
    32,            /* BitsPerPixel */
    4,             /* BytesPerPixel */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    0xFF000000, 0x00FF0000, 0x0000FF00, 0x000000FF,   /* R,G,B,A masks (R first in memory) */
    24, 16, 8, 0,                                     /* R,G,B,A shifts */
#else
    0x000000FF, 0x0000FF00, 0x00FF0000, 0xFF000000,
    0, 8, 16, 24,
#endif
    0, 0, 0, 0,    /* R,G,B,A loss */
    0,             /* colorkey */
    0              /* alpha */
};

... later:
surface_for_gl = SDL_ConvertSurface(input_surface, &GL_conversion_format, SDL_SWSURFACE);
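One usage note (the error check and cleanup below are mine, not anything the API forces on you): SDL_ConvertSurface allocates and returns a new surface and leaves the input untouched, so free the result when you're done with it.

SDL_Surface *surface_for_gl = SDL_ConvertSurface(input_surface, &GL_conversion_format, SDL_SWSURFACE);
if (surface_for_gl == NULL) {
    fprintf(stderr, "SDL_ConvertSurface failed: %s\n", SDL_GetError());
} else {
    /* upload surface_for_gl->pixels with glTexImage2D, then: */
    SDL_FreeSurface(surface_for_gl);
}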

Travis, in this code I am always generating 32bpp images. I pass true to the supplied function.

C-Junkie, you're right; I was foolishly reading the documentation that says "Converts a surface to the same format as another surface." instead of looking at the function prototype like a sensible programmer would. :) However that doesn't explain why SDL_BlitSurface doesn't appear to be doing the job.

Hmm. Well, I can sorta see why it might be getting inverted:
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = want_alpha ? 0xff000000 : 0;

For Intel (little-endian), that's RGBA, not BGRA. So you'd use GL_RGBA instead of GL_BGRA_EXT in your glTexImage2D call.

But I don't see why that would be different for PNG/JPEG...
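One way to sidestep the guessing is to pick the external format from the surface's own masks rather than hard-coding it; a little-endian sketch (the helper name is made up):

/* Pick the GL external format that matches this surface's byte layout (little-endian). */
static GLenum gl_format_for(const SDL_Surface *s)
{
    if (s->format->BytesPerPixel == 4)
        return (s->format->Rmask == 0x000000FF) ? GL_RGBA : GL_BGRA_EXT;
    else
        return (s->format->Rmask == 0x000000FF) ? GL_RGB : GL_BGR_EXT;
}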

From what I gather, BMP and TGA are natively stored in a reversed byte order to JPG and PNG, at least on Win32. This information is stored with the loaded surface and in theory SDL_BlitSurface should convert any of these to the appropriate type when needed. However in this case it seems to be failing to do this. I've never had a problem when blitting to the screen so I assume it's some sort of problem with blitting to this artificial surface.
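A quick way to check is to print both surfaces' masks and compare (a debugging sketch, not from my actual code; textureImage and myNewSurface are the surfaces from my earlier post):

#include <stdio.h>

/* Debugging aid: print a surface's pixel layout. */
static void dump_format(const char *name, const SDL_Surface *s)
{
    printf("%s: %u bpp  R=%08X G=%08X B=%08X A=%08X\n",
           name, (unsigned)s->format->BitsPerPixel,
           (unsigned)s->format->Rmask, (unsigned)s->format->Gmask,
           (unsigned)s->format->Bmask, (unsigned)s->format->Amask);
}

/* ... */
dump_format("loaded", textureImage);
dump_format("target", myNewSurface);

If the two layouts differ, SDL_BlitSurface is supposed to be converting between them; if the texture still comes out with red and blue swapped, the copy clearly isn't reordering the channels.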

Your problem comes from how SDL generates its image compared to the way glTexImage2D works: the red and blue channels in SDL end up swapped. What you want to do is load the image with SDL_LoadBMP or IMG_Load, get a pointer to the pixels, then go through them and swap the red and blue values around. I will send you the code when I get home, since right now I am at school. You can also check www.gametutorials.com and get the texture image tutorial on SDL and OpenGL, which also explains how to fix this. If you still can't figure out the problem, email me at jeromefabioc@hotmail.com and I can send you the code.
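In the meantime, here's a rough sketch from memory of the swap I mean, for a 32-bit surface with one byte per channel (the code I send later may differ):

/* Swap the red and blue bytes of every pixel in a 32-bit surface. */
static void swap_red_blue(SDL_Surface *s)
{
    int x, y;
    if (SDL_MUSTLOCK(s))
        SDL_LockSurface(s);
    for (y = 0; y < s->h; ++y) {
        Uint8 *p = (Uint8 *)s->pixels + y * s->pitch;
        for (x = 0; x < s->w; ++x, p += 4) {
            Uint8 tmp = p[0];
            p[0] = p[2];
            p[2] = tmp;
        }
    }
    if (SDL_MUSTLOCK(s))
        SDL_UnlockSurface(s);
}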
