SDL Color Keying trouble (my first time)


I'm using ye olde LazyFoo (who has graciously updated his tutorials for SDL2) as a guide here; I've got two images loading in and can display them. What I can't get to work is the color key (a transparent background for an image). I tried setting a static key color (arbitrarily a shade of bright pink, 0xff, 0x00, 0xfc), saved it out as both a .bmp and a .png (the latter loaded with the latest SDL_image), and neither gives me a transparent color; instead I see lots of that bright pink.

The trouble, as it turns out, is that the pink I see in my game window is NOT the pink I saved via GIMP. I grabbed a pixel (only a single sample) from the Texture and got this:


25/04/2014 00:16:48,004228 DEBUG [Texture.cpp:29] transparency showdown! R GIMP 255 SDL 16
25/04/2014 00:16:48,004228 DEBUG [Texture.cpp:30] transparency showdown! G GIMP 0 SDL 0
25/04/2014 00:16:48,005228 DEBUG [Texture.cpp:31] transparency showdown! B GIMP 252 SDL 214
25/04/2014 00:16:48,005228 DEBUG [Texture.cpp:32] transparency showdown! A GIMP 255 SDL 255


which boggles my brain a bit: the colors are changing! (Numerically, not visually.)

This sample was taken from the PNG version (I haven't done it for the BMP version yet; I've been at work thinking about this all day). The "GIMP" value is the one I set within the GIMP editor, and "SDL" is the value that SDL tells me the pixel actually holds. So the transparency color I'm setting isn't the color that ends up in the texture, but (as you might guess) to my eyes the image renders to screen with that precise shade of pink.

My first hunch is "compression?" but PNG is supposed to be lossless, no? My second hunch is "pixel format?" but I've stuck very closely to LazyFoo and I'm using the SDL function to get that format.
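If it helps to check the pixel format hunch, one quick diagnostic (just a sketch, reusing my LOG_DEBUG macro from the code below; the path is only an example) would be to log what the loaded surface actually reports:


SDL_Surface* loadedSurface = IMG_Load("testImage.png");	// example path
if (loadedSurface != nullptr) {
	// SDL_GetPixelFormatName turns the SDL_PIXELFORMAT_* enum into a readable string.
	LOG_DEBUG(std::string("Surface format: ") + SDL_GetPixelFormatName(loadedSurface->format->format));
	LOG_DEBUG("Bits per pixel: " + std::to_string(loadedSurface->format->BitsPerPixel));
	SDL_FreeSurface(loadedSurface);
}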

So...what could it be?


It will be some time before I get home and can try saving the PNG without compression (just to rule it out), but maybe this is actually to be expected and I'm "doing it wrong"? Is there a better way or standard practice for choosing a color key color that I'm not aware of? I want to be able to have arbitrary color keys (per image, loaded from a properties sheet), so I hope that's possible; something like the sketch below is what I'm after.
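To be concrete about what I mean by a properties sheet, here's roughly what I have in mind (purely a sketch; the SpriteProperties struct and its fields are made up, and the parsing is left out):


struct SpriteProperties {	// hypothetical, filled in from a per-sprite properties file
	std::string imagePath;
	Uint8 keyRed;
	Uint8 keyGreen;
	Uint8 keyBlue;
};

SDL_Surface* loadKeyedSurface(const SpriteProperties& props) {
	SDL_Surface* surface = IMG_Load(props.imagePath.c_str());
	if (surface == nullptr)
		return nullptr;
	// Map the per-image key against this surface's own pixel format before setting it.
	Uint32 key = SDL_MapRGB(surface->format, props.keyRed, props.keyGreen, props.keyBlue);
	if (SDL_SetColorKey(surface, SDL_TRUE, key) != 0)
		LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()));
	return surface;
}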

Many thanks in advance for your thoughts. I don't want to do too much of a code dump on you all, but just in case someone's interested:


void VideoSystemDelegate::startup(std::string const windowTitle, const int xStartingPos, const int yStartingPos, const int width, const int height, unsigned int const flags) {
	if (SDL_Init(SDL_INIT_VIDEO) < 0)
		throw VideoSystemDelegateException("Failed to start SDL Video. " + std::string(SDL_GetError()));

	if (!SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1"))
		LOG_WARN("Couldn't enable linear texture filtering. SDL: " + std::string(SDL_GetError()));

	window = SDL_CreateWindow(windowTitle.c_str(), xStartingPos, yStartingPos, width, height, flags);
	if (window == nullptr)
		throw VideoSystemDelegateException("Failed to create window. SDL: " + std::string(SDL_GetError()));
		
	renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED); // is this where VSYNC can be turned on? (last field)
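	// (Assumption based on the SDL2 docs: yes, OR SDL_RENDERER_PRESENTVSYNC into these flags,
	// i.e. SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC, to enable vsync.)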
	if (renderer == nullptr)
		throw VideoSystemDelegateException("Failed to create renderer. SDL: " + std::string(SDL_GetError()));

	SDL_SetRenderDrawColor(renderer, 0xFF, 0xFF, 0xFF, 0xFF); // not sure why LazyFoo does this.
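	// (Note: this sets the color SDL_RenderClear fills the backbuffer with, as well as the color
	// used by draw primitives, which is why clearing the screen gives a white backdrop.)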

	int imgFlags = IMG_INIT_PNG;
	if (!(IMG_Init(imgFlags) & imgFlags))
		throw VideoSystemDelegateException("Failed to init PNG loading. IMG: " + std::string(IMG_GetError()));

	initialized = true;
}

void VideoSystemDelegate::renderTexture(Texture t, int x, int y) {
	if (!initialized) {
		LOG_ERROR("Attempted to render texture but drawing is NOT online.");
		return;
	}

	SDL_Rect renderQuad = { x, y, t.width(), t.height() };
	SDL_RenderCopy(renderer, t.texture(), nullptr, &renderQuad);
} 

Here's the texture method involved:


void Texture::loadFromFile(const std::string& path) {
	LOG_DEBUG("Loading texture " + path);
	deallocateTexture();

	SDL_Texture* finalTexture = nullptr;	
	SDL_Surface* loadedSurface = IMG_Load(path.c_str());
	if (loadedSurface == nullptr) {
		LOG_ERROR("Unable to load image from " + path + " SDL: " + SDL_GetError());
		return;	// TODO throw?
	}

	if (SDL_SetColorKey(loadedSurface, SDL_TRUE, 
		SDL_MapRGB(loadedSurface->format,
		TransparencyKeyRed,
		TransparencyKeyGreen,
		TransparencyKeyBlue) != 0))
		LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()) );

	finalTexture = SDL_CreateTextureFromSurface(VideoSystemDelegate::singleton().getRenderer(), loadedSurface);
	if (finalTexture == nullptr) {
		LOG_ERROR("Failed to create texture from image loaded from " + path + " SDL: " + SDL_GetError());
		SDL_FreeSurface(loadedSurface);
		return;
	}

	widthOfTexture = loadedSurface->w;
	heightOfTexture = loadedSurface->h;
	theTextureItself = finalTexture;

	SDL_FreeSurface(loadedSurface);
}

And the method in my Game class (with a comment referring to a separate issue I'm experiencing):


void Game::doAllTheGraphicsStuff() {
	//VideoSystemDelegate::singleton().clearScreen();	// If I do this, I see only white.
	VideoSystemDelegate::singleton().renderTexture(testBackground, 0, 0);
	VideoSystemDelegate::singleton().renderTexture(testImage, 0, 0);
	VideoSystemDelegate::singleton().updateScreen();
}

The clearScreen() and updateScreen() calls are simply wrappers for SDL_RenderClear() and SDL_RenderPresent() respectively. (The VideoSystemDelegate and Texture classes are *friends* simply because the Texture class requires the renderer, and I wanted to restrict the passing around of SDL objects in my efforts to decouple.)
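For completeness, those wrappers are just thin passthroughs over the renderer created in startup(), roughly:


void VideoSystemDelegate::clearScreen() {
	SDL_RenderClear(renderer);	// fills with the draw color set in startup(), i.e. white
}

void VideoSystemDelegate::updateScreen() {
	SDL_RenderPresent(renderer);	// flips the backbuffer to the window
}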

ADDENDUM: I saved the file out at compression level zero, and the color values come out exactly the same as above. It's so odd, no? The red value I set is max (255), yet SDL reports 16, and that difference doesn't correspond to anything obvious in the other channels. So, one candidate (compression) is ruled out.

There aren't many options for saving a PNG from GIMP, and both GIMP and SDL_image use libpng, so I would think one could ingest whatever the other spits out.

If you've had success with LazyFoo's tutorials and your own custom images, what did you save them out as? What did you use as your ColorKey?


Okay, I finally got some free time to play with this again, and I made some progress. There are a few issues with my code that I'm finding here.

I made a table and started testing variables. The first step was to set up LazyFoo's color_keying tutorial with no edits and get it working again as a baseline.


I tried his images and my images in his code...his images work, mine do not.

I tried his images and my images in my code...his images work, mine do not.

I also tried loading and saving his images with GIMP the way I saved mine...and his images still work.

At this point, I realized I'd (unconsciously) been turning two knobs, not one: I wasn't only exchanging our images, but also our color keys.

As soon as I tried to pass and store his color keys the way I've been passing and storing mine, his images also failed.

Then I tried to just directly set mine, and my images work in my code.

So... I do still have a question, haha.

This works:


SDL_SetColorKey(loadedSurface, SDL_TRUE, SDL_MapRGB(loadedSurface->format, 0xFF, 0x00, 0xFC));

and this doesn't:


Uint8 TransparencyKeyRed = 0xFF;	
Uint8 TransparencyKeyGreen = 0x00;
Uint8 TransparencyKeyBlue = 0xFC;

SDL_SetColorKey(loadedSurface, SDL_TRUE,
	SDL_MapRGB(loadedSurface->format,
	TransparencyKeyRed,
	TransparencyKeyGreen,
	TransparencyKeyBlue));

So... it appears that I'm using Uint8 incorrectly? SDL_MapRGB() does take Uint8s, and Uint8 is typedef'd to uint8_t, which in turn is an unsigned char. That all makes sense, but I'm getting this odd behavior nonetheless.

How can I pass these values around? Ultimately I'd like them to come from sprite properties files rather than being hardcoded. Also, the pixel sample values I logged above were also held in Uint8s, and I logged them with std::to_string(); maybe that's why the values look so weird there?
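For what it's worth, std::to_string() on a Uint8 should be fine: the value promotes to the int overload, so it prints the number rather than a character. A sketch of how I plan to pass and log the key values (nothing here is from the tutorial, just my own test):


Uint8 keyRed = 0xFF;
Uint8 keyGreen = 0x00;
Uint8 keyBlue = 0xFC;

// std::to_string picks the int overload via integral promotion, so this logs "key 255 0 252".
LOG_DEBUG("key " + std::to_string(keyRed) + " " + std::to_string(keyGreen) + " " + std::to_string(keyBlue));

// Beware: concatenating a Uint8 (an unsigned char) directly appends it as a character rather than
// a number, e.g. std::string("key ") + keyRed, which is one way garbage can sneak into a log.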

Any ideas would be appreciated. :) If nothing else, I hope somebody learns something from my struggles here.

I suspect my mind is not entirely stable.

This works:


	Uint32 passedInKey = SDL_MapRGB(loadedSurface->format, TransparencyKeyRed, TransparencyKeyGreen, TransparencyKeyBlue);
	
	if (SDL_SetColorKey(loadedSurface, SDL_TRUE, passedInKey) < 0)
		LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()));
	else
		LOG_DEBUG("It *thinks* it set the transparency correctly.");

Maybe I'm just too tired to see how that's functionally different from the inline evaluation I was previously using. But it works, so... I guess I'll go with it, haha.

So strange.
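For anyone who finds this thread later: comparing the two versions again, the functional difference looks like parenthesis placement rather than Uint8 itself. In the loadFromFile snippet quoted in the first post, the closing parenthesis lands so that the != 0 sits inside SDL_SetColorKey's argument list, which means the key actually passed is the result of SDL_MapRGB(...) != 0, i.e. 1, instead of the mapped pink. Side by side:


// As quoted in loadFromFile above: the comparison ends up INSIDE the argument list, so
// SDL_SetColorKey receives the boolean (SDL_MapRGB(...) != 0), i.e. a key of 1, not the pink.
if (SDL_SetColorKey(loadedSurface, SDL_TRUE,
	SDL_MapRGB(loadedSurface->format,
	TransparencyKeyRed,
	TransparencyKeyGreen,
	TransparencyKeyBlue) != 0))
	LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()));

// What was intended, and what the working version above effectively does: map first,
// then compare SDL_SetColorKey's return value.
Uint32 key = SDL_MapRGB(loadedSurface->format, TransparencyKeyRed, TransparencyKeyGreen, TransparencyKeyBlue);
if (SDL_SetColorKey(loadedSurface, SDL_TRUE, key) != 0)
	LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()));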

