I'm using ye olde LazyFoo (who has graciously updated for SDL2) as a guide here; I've got two images loading in and can display them. What I can't get to work is the color key (a transparent background for an image). I tried setting a static color (arbitrarily a shade of bright pink, 0xFF, 0x00, 0xF2), saved it out to both a .bmp and a .png (the latter loaded with the latest SDL_Image), and neither gives me a transparent color; instead I see lots of that bright pink.
The trouble, as it turns out, is that the pink I see in my game window is NOT the pink I saved via GIMP. I grabbed a pixel (only a single sample) from the Texture and got this:
25/04/2014 00:16:48,004228 DEBUG [Texture.cpp:29] transparency showdown! R GIMP 255 SDL 16
25/04/2014 00:16:48,004228 DEBUG [Texture.cpp:30] transparency showdown! G GIMP 0 SDL 0
25/04/2014 00:16:48,005228 DEBUG [Texture.cpp:31] transparency showdown! B GIMP 252 SDL 214
25/04/2014 00:16:48,005228 DEBUG [Texture.cpp:32] transparency showdown! A GIMP 255 SDL 255
which boggles my brain a bit: the colors are changing! (Numerically, not visually.)
This sample is taken from the PNG version (I haven't done it for the BMP version yet; I've been at work thinking on this all day), where the "GIMP" value is the one I set within the GIMP editor, and "SDL" is the value that SDL tells me the pixel actually holds. So the transparency color I'm setting isn't the color that's drawn, but (as you might guess) to my eyes the image renders to screen with that precise shade of pink.
My first hunch is "compression?" but PNG is supposed to be lossless, no? My second hunch is "pixel format?" but I've stuck very closely to LazyFoo and I'm using the SDL function to get that format.
So...what could it be?
It will be some time before I get home and try saving the PNG without compression (just to rule it out), but maybe this is actually to be expected and I'm "doing it wrong"? Is there a better way or standard practice for selecting a ColorKey color that I'm not aware of? I want to be able to have arbitrary color keys (per image, loaded with a properties sheet), so I hope that's possible.
Many thanks in advance for your thoughts. I don't want to do too much of a code dump on you all, but just in case someone's interested:
void VideoSystemDelegate::startup(std::string const windowTitle, const int xStartingPos, const int yStartingPos, const int width, const int height, unsigned int const flags) {
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        throw VideoSystemDelegateException("Failed to start SDL Video. " + std::string(SDL_GetError()));
    if (!SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1"))
        LOG_WARN("Couldn't enable linear texture filtering. SDL: " + std::string(SDL_GetError()));
    window = SDL_CreateWindow(windowTitle.c_str(), xStartingPos, yStartingPos, width, height, flags);
    if (window == nullptr)
        throw VideoSystemDelegateException("Failed to create window. SDL: " + std::string(SDL_GetError()));
    renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED); // is this where VSYNC can be turned on? (last field)
    if (renderer == nullptr)
        throw VideoSystemDelegateException("Failed to create renderer. SDL: " + std::string(SDL_GetError()));
    SDL_SetRenderDrawColor(renderer, 0xFF, 0xFF, 0xFF, 0xFF); // not sure why LazyFoo does this.
    int imgFlags = IMG_INIT_PNG;
    if (!(IMG_Init(imgFlags) & imgFlags))
        throw VideoSystemDelegateException("Failed to init PNG loading. IMG: " + std::string(IMG_GetError()));
    initialized = true;
}
void VideoSystemDelegate::renderTexture(Texture t, int x, int y) {
    if (!initialized) {
        LOG_ERROR("Attempted to render texture but drawing is NOT online.");
        return;
    }
    SDL_Rect renderQuad = { x, y, t.width(), t.height() };
    SDL_RenderCopy(renderer, t.texture(), nullptr, &renderQuad);
}
Here's the texture method involved:
void Texture::loadFromFile(const std::string& path) {
    LOG_DEBUG("Loading texture " + path);
    deallocateTexture();
    SDL_Texture* finalTexture = nullptr;
    SDL_Surface* loadedSurface = IMG_Load(path.c_str());
    if (loadedSurface == nullptr) {
        LOG_ERROR("Unable to load image from " + path + " SDL: " + SDL_GetError());
        return; // TODO throw?
    }
    if (SDL_SetColorKey(loadedSurface, SDL_TRUE,
                        SDL_MapRGB(loadedSurface->format,
                                   TransparencyKeyRed,
                                   TransparencyKeyGreen,
                                   TransparencyKeyBlue)) != 0)
        LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()));
    finalTexture = SDL_CreateTextureFromSurface(VideoSystemDelegate::singleton().getRenderer(), loadedSurface);
    if (finalTexture == nullptr) {
        LOG_ERROR("Failed to create texture from image loaded from " + path + " SDL: " + SDL_GetError());
        SDL_FreeSurface(loadedSurface);
        return;
    }
    widthOfTexture = loadedSurface->w;
    heightOfTexture = loadedSurface->h;
    theTextureItself = finalTexture;
    SDL_FreeSurface(loadedSurface);
}
And the method in my Game class (with a comment referring to a separate issue I'm experiencing):
void Game::doAllTheGraphicsStuff() {
    //VideoSystemDelegate::singleton().clearScreen(); // If I do this, I see only white.
    VideoSystemDelegate::singleton().renderTexture(testBackground, 0, 0);
    VideoSystemDelegate::singleton().renderTexture(testImage, 0, 0);
    VideoSystemDelegate::singleton().updateScreen();
}
The clearScreen() and updateScreen() calls are simply wrappers for SDL_RenderClear() and SDL_RenderPresent() respectively. (The VideoSystemDelegate and Texture classes are *friends* simply because the Texture class requires the renderer, and I wanted to restrict the passing around of SDL stuff in my efforts to decouple.)
ADDENDUM: I saved the file out at compression level zero, and the color values come out identical to the pattern above. It's so odd, no? The red value I set is max (255) but comes back as 16, and that gap doesn't match the gap in any of the other channels. So, one candidate down.
There aren't many options for saving a PNG with GIMP...and both GIMP and SDL_Image use libpng, so I would think one could ingest whatever the other spits out.
If you've had success with LazyFoo's tutorials and your own custom images, what did you save them out as? What did you use as your ColorKey?