Member Since 16 Oct 2000
Offline Last Active Aug 24 2015 09:14 PM

Topics I've Started

SDL Color Keying trouble (my first time)

25 April 2014 - 02:55 PM

I'm using ye olde LazyFoo (who has graciously updated for SDL2) as a guide here; I've got two images loading in and can display them. What I can't get to work is the colorkey (a transparent background for an image). I've tried setting a static color (arbitrarily a shade of bright pink: 0xFF, 0x00, 0xF2), saved this out to both a .bmp and a .png (the latter loaded with the latest SDL_image), and neither gives me a transparent color; instead I see lots of that bright pink.
The trouble, as it turns out, is that the pink I see in my game window is NOT the pink I saved via GIMP. I grabbed a pixel (only a single sample) from the Texture and got this:

25/04/2014 00:16:48,004228 DEBUG [Texture.cpp:29] transparency showdown! R GIMP 255 SDL 16
25/04/2014 00:16:48,004228 DEBUG [Texture.cpp:30] transparency showdown! G GIMP 0 SDL 0
25/04/2014 00:16:48,005228 DEBUG [Texture.cpp:31] transparency showdown! B GIMP 252 SDL 214
25/04/2014 00:16:48,005228 DEBUG [Texture.cpp:32] transparency showdown! A GIMP 255 SDL 255

which boggles my brain a bit: the colors are changing! (Numerically, not visually.)
This sample is taken from the PNG version (I haven't done it for the BMP version yet...I've been at work thinking on this all day), where the "GIMP" value is the one I set within the GIMP editor, and "SDL" is the value that SDL is telling me the pixel actually holds. So, the transparency color I'm setting isn't the color that's drawn, but (as you might guess) to my eyes the image renders to screen with that precise shade of pink.
My first hunch is "compression?" but PNG is supposed to be lossless, no? My second hunch is "pixel format?" but I've stuck very closely to LazyFoo and I'm using the SDL function to get that format.
So...what could it be?

It will be some time before I get home and try saving the PNG without compression (just to rule it out), but maybe this is actually to be expected and I'm "doing it wrong"? Is there a better way or standard practice for choosing a ColorKey color that I'm not aware of? I want to be able to have arbitrary (per-image, loaded from a properties sheet) color keys, so I hope that's possible. :)
Many thanks in advance for your thoughts. I don't want to do too much of a code dump on you all, but just in case someone's interested:

void VideoSystemDelegate::startup(std::string const windowTitle, const int xStartingPos, const int yStartingPos, const int width, const int height, unsigned int const flags) {
	if (SDL_Init(SDL_INIT_VIDEO) < 0)
		throw VideoSystemDelegateException("Failed to start SDL Video. " + std::string(SDL_GetError()));

	if (!SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1"))
		LOG_WARN("Couldn't enable linear texture filtering. SDL: " + std::string(SDL_GetError()));

	window = SDL_CreateWindow(windowTitle.c_str(), xStartingPos, yStartingPos, width, height, flags);
	if (window == nullptr)
		throw VideoSystemDelegateException("Failed to create window. SDL: " + std::string(SDL_GetError()));

	renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED); // is this where VSYNC can be turned on? (last field)
	if (renderer == nullptr)
		throw VideoSystemDelegateException("Failed to create renderer. SDL: " + std::string(SDL_GetError()));

	SDL_SetRenderDrawColor(renderer, 0xFF, 0xFF, 0xFF, 0xFF); // not sure why LazyFoo does this.

	int imgFlags = IMG_INIT_PNG;
	if (!(IMG_Init(imgFlags) & imgFlags))
		throw VideoSystemDelegateException("Failed to init PNG loading. IMG: " + std::string(IMG_GetError()));

	initialized = true;
}

void VideoSystemDelegate::renderTexture(Texture t, int x, int y) {
	if (!initialized) {
		LOG_ERROR("Attempted to render texture but drawing is NOT online.");
		return;
	}

	SDL_Rect renderQuad = { x, y, t.width(), t.height() };
	SDL_RenderCopy(renderer, t.texture(), nullptr, &renderQuad);
}
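On the VSYNC question in the code comment above: yes, it lives in that last parameter. SDL2's renderer flags are a bitmask, so vsync is requested by OR-ing in SDL_RENDERER_PRESENTVSYNC. A sketch of the one-line change (a fragment, not a full program):

```cpp
// SDL2: renderer flags are a bitmask, so vsync is requested by
// combining SDL_RENDERER_PRESENTVSYNC into the final argument.
renderer = SDL_CreateRenderer(window, -1,
                              SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
```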

Here's the texture method involved:

void Texture::loadFromFile(const std::string& path) {
	LOG_DEBUG("Loading texture " + path);

	SDL_Texture* finalTexture = nullptr;
	SDL_Surface* loadedSurface = IMG_Load(path.c_str());
	if (loadedSurface == nullptr) {
		LOG_ERROR("Unable to load image from " + path + " SDL: " + std::string(SDL_GetError()));
		return;	// TODO throw?
	}

	if (SDL_SetColorKey(loadedSurface, SDL_TRUE, TransparencyKeyBlue) != 0)
		LOG_ERROR("Unable to set transparency key. SDL: " + std::string(SDL_GetError()));

	finalTexture = SDL_CreateTextureFromSurface(VideoSystemDelegate::singleton().getRenderer(), loadedSurface);
	if (finalTexture == nullptr) {
		LOG_ERROR("Failed to create texture from image loaded from " + path + " SDL: " + std::string(SDL_GetError()));
		SDL_FreeSurface(loadedSurface);
		return;
	}

	widthOfTexture = loadedSurface->w;
	heightOfTexture = loadedSurface->h;
	theTextureItself = finalTexture;
	SDL_FreeSurface(loadedSurface);
}


And the method in my Game class (with a comment referring to a separate issue I'm experiencing :) )

void Game::doAllTheGraphicsStuff() {
	//VideoSystemDelegate::singleton().clearScreen();	// If I do this, I see only white.
	VideoSystemDelegate::singleton().renderTexture(testBackground, 0, 0);
	VideoSystemDelegate::singleton().renderTexture(testImage, 0, 0);
}

The clearScreen() and updateScreen() calls are simply wrappers for SDL_RenderClear() and SDL_RenderPresent respectively. (The VideoSystemDelegate and Texture classes are *friends* simply because the Texture class requires the renderer and I wanted to restrict the passing around of SDL stuff...in my efforts to decouple.)
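For comparison, the conventional SDL2 frame order is clear, then draw, then present; if the clear happens after the copies (or the present runs before them), only the current draw color reaches the screen, and the startup code above sets that draw color to white, which would fit the all-white symptom. A sketch of the usual loop body, using the wrapper names from the post (a fragment, not a full program):

```cpp
// Typical SDL2 frame: clear -> draw -> present, in that order.
// SDL_RenderClear fills the backbuffer with the current draw color
// (white, per the SDL_SetRenderDrawColor call in startup), so every
// RenderCopy must land between the clear and the present.
SDL_RenderClear(renderer);                               // clearScreen()
SDL_RenderCopy(renderer, background, nullptr, nullptr);  // renderTexture(testBackground, 0, 0)
SDL_RenderCopy(renderer, sprite, nullptr, &destQuad);    // renderTexture(testImage, 0, 0)
SDL_RenderPresent(renderer);                             // updateScreen()
```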


ADDENDUM: I saved the file out at compression level zero, and the color values come out identical to the pattern above. Odd, no? My red value is max (255), yet SDL reports 16, and that shift doesn't match what happens to any of the other channels. So, one candidate down.


There aren't many options for saving a PNG with GIMP...and both GIMP and SDL_Image use libpng, so I would think one could ingest whatever the other spits out.


If you've had success with LazyFoo's tutorials and your own custom images, what did you save them out as? What did you use as your ColorKey?

Hey career-software engineers: do your friends approach you with game ideas? (Mine do.)

04 April 2014 - 03:09 PM

I wasn't sure what else to call this; no matter what I put in the subject bar it felt sensationalist to me. Sorry for that.


Anyway, I'm a software engineer professionally and my friends know it. Every so often, one of my friends will approach me with a "Video game" idea. It's a little like when people come on this board and post "I have a game idea! How can I get a program to make it magically for me!?" in which case we invariably link Mr. Tom Sloper's website to them and let them down gently.


But, these are my friends. I don't have the time or energy to make an RPG (by myself), nor the art skills to make a viral, cutesy iOS game (never mind what a crowded market that is to enter), but...I am unskilled at letting them down easy. In some cases my standard appeal of "I just don't have the time to make a game" works. But sometimes I see the letdown in their eyes and, well, I feel bad. They really believed I could make their dream a reality.


Maybe the worst part is that I probably *can* if I am willing to commit literal years of my life to the task, but...haha, well, even then. Hiring artists, design and programming help, a musician...you all know the drill. <3


I'm very interested to know if any of you has had this experience, i.e. a good friend approaches you, knows you make software, and asks you to "go into the Indie business" with them. How have you handled that? If you have a tried-and-true method for easy letdowns, I'd very much like to hear it. :)


If not, commiseration is okay too. I'll be honest: just writing this post was cathartic for me. Maybe I just needed to talk about it. <3

SDL2, mingw64, and Code::Blocks

24 March 2014 - 09:24 PM

This warning is a new one on me. I am trying to build a shockingly simple SDL2 app with Code::Blocks and mingw64 and here's what I'm getting: 

||Warning: .drectve `/manifestdependency:"type='win32' name='Microsoft.VC90.CRT' version='9.0.21022.8' processorArchitecture='amd64' publicKeyToken='1fc8b3b9a1e18e3b'" /DEFAULTLIB:"MSVCRT" /DEFAULTLIB:"OLDNAMES" ' unrecognized|

My Google-fu has failed me and I'm totally out of ideas. :) I'm hoping that, if I list out everything I've learned and seen so far, maybe one of you will see what I'm missing.


The source is super-simple. It's just this:

#include "SDL.h"

int main(int argc, char* args[]) {

    return 0;
}
YES, those two function calls are commented out; the warning is triggered by the #include of SDL.h alone.


I do have quite a few VS redists installed (including one with the exact version number in that warning), but I don't think that's really connected here. This post (here on Gamedev) and this post (from the SDL forums) seem to imply that I'm (accidentally) trying to link either:

  • mingw SDL against VS libs, or

  • VS SDL against mingw libs.

Then in this spectacularly frustrating post, a user puts up an error very, very similar to mine (same warning, different version, and some actual errors), but then shortly after posts that they fixed it...so the solution is never shared. :(


I installed mingw-64 (which is the only mingw on my system), but it's worth mentioning that VS 2010 is also on here. I've pointed Code::Blocks at mingw; it's on the path and the only "g++" on my system, so that's got to be what it's running. :)


I would like to play around with SDL2, so I grabbed SDL2-devel-2.0.3-mingw.tar.gz which, on the site, is linked beside the MinGW64 that I downloaded. I.e., in spite of what the warning is telling me, it appears that I am in fact trying to use mingw libs with a mingw compiler. Maybe this is somehow a 32-bit vs. 64-bit issue? (Again, it looks to me like it's not, and I did try throwing -m64 into the options; it had no effect.) Or maybe I'm just way more wrong than I think I am.


I used sdl-config to get the cflags and libs flags, but here's the first catch: I couldn't run it in Windows (it gave me a "These are 16-bit" error), so I ran it in my Cygwin window. (I'm mostly certain this is okay, but I'm including it here anyway.)


sdl-config --cflags gave me

-I/usr/local/cross-tools/x86_64-w64-mingw32/include/SDL2 -Dmain=SDL_main

sdl-config --libs gave me

-L/usr/local/cross-tools/x86_64-w64-mingw32/lib -lmingw32 -lSDL2main -lSDL2 -mwindows
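A native MinGW-w64 compile line translating those flags might look something like this sketch (the F:/SDL paths are placeholders for wherever the development archive was unpacked; adjust to the actual layout):

```shell
# Hypothetical native compile line; the include/lib paths below are
# placeholders for the unpacked SDL2-devel-2.0.3-mingw layout.
g++ main.cpp -o game.exe \
    -I"F:/SDL/SDL2-2.0.3/x86_64-w64-mingw32/include/SDL2" -Dmain=SDL_main \
    -L"F:/SDL/SDL2-2.0.3/x86_64-w64-mingw32/lib" \
    -lmingw32 -lSDL2main -lSDL2 -mwindows
```

If a line like that links cleanly, the .drectve warning generally means GNU ld is consuming an MSVC-built object or import library somewhere; checking that no VS-era SDL2.lib is on the library search path is a cheap sanity test.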

I translated those /usr/local/cross-tools/ paths to Windows paths, pointing at the pre-compiled libraries that I unpacked with the rest of SDL 2.0.3.


It occurred to me that possibly I need to build the libs myself, so I installed Make for Windows and tried that and ran into some problems. (This is where I reveal to you what a giant noob I am with make.)

f:\SDL\SDL2-2.0.3>make native
make install-package arch=i686-w64-mingw32 prefix=/usr
make[1]: Entering directory `f:/SDL/SDL2-2.0.3'
-d was unexpected at this time.
make[1]: *** [install-package] Error 255
make[1]: Leaving directory `f:/SDL/SDL2-2.0.3'
make: *** [native] Error 2

The "native" specifier there seems to be trying to crank out 32-bit, when I would prefer 64-bit; perhaps that's just the first step, but I'm not sure. The only page I found on "-d was unexpected" was this StackOverflow post that looked like I'd need to debug SDL2's makefile, which I wished to avoid. I did try "cross", which gave me an even fancier error.

f:\SDL\SDL2-2.0.3>make cross
for arch in i686-w64-mingw32 x86_64-w64-mingw32; do \
            make install-package arch=$arch prefix=/usr/local/cross-tools/$arch;
arch was unexpected at this time.
make: *** [cross] Error 255

I'm at a loss and I'm very hopeful that one of you has possibly encountered this (or something like it) before and can spot what I've missed or done incorrectly here.


Thank you in advance!

C++ Inheritance confusion

25 February 2014 - 01:42 PM

This has got to be a simple mistake, but I've stared at it for too long now and I just don't see it. :( (Excuse the canned example; it's pretty close to the real thing, though.)

class Foo {
public:
	Foo(vector<string> things, string place) : m_thingList(things) {
		validatePlacePriorToAssignment(place);
		log("in Foo ctor");
	}

	virtual ~Foo() {}

private:
	vector<string> m_thingList;
	string m_validatedPlace;

	void validatePlacePriorToAssignment(string p) {
		// do stuff to validate place; if not valid, throw exception
		// else
		m_validatedPlace = p;
	}
};

class Bar : public Foo {
public:
	const string defaultPlace = "Validville";

	Bar(vector<string> stuff) : Foo(stuff, defaultPlace) {
		log("in Bar ctor");
	}
};

void someFunctionSomewhere(vector<string> allTheThings) {
	try { Bar barbar(allTheThings); }
	catch (...) { log("sadness"); }
}

Okay, here's the weirdness I'm seeing: there is a log entry for the Foo ctor but not the Bar ctor. I'm stack-allocating an instance of a derived class; my understanding is that the base ctor is called and THEN the derived ctor is called...but in this case that doesn't seem to be happening!


I'm worried that I'm doing inheritance wrong, or using initializer lists wrong, or using them wrong in the context of inheritance, or that for some reason it's wrong (not allowed) to validate prior to assigning in a constructor. I think I'm doing this all correctly, but obviously I'm not, so here I am. :)

I've read through a bunch of tutorials and stackoverflow examples this afternoon, and - for the life of me - I can't figure out why this isn't working; it looks like all the sample code. If you can catch it, I'd be especially grateful.


Thank you in advance. <3

Odd usage of friend keyword in C++ class.

27 February 2013 - 02:19 PM

It's my esteemed privilege to try to repair some old legacy code (or at least gather enough evidence to convince the powers that be to let my team rewrite it). In digging through this code (which is devoid of comments), I've come across something that seems a bit weird. Here's a paraphrasing of the declaration:


class foo {
	class event;

	class request {
		friend class event;
	};

	class response {
		friend class event;
	};
};


Now, I've seen nested class definitions before, and I've seen the friend keyword in use before, but I've never seen it used quite like this. There are several other classes with this very structure: an unspecified class 'event' friended by 'request', 'response', and others. Is the goal to instantiate foo1, foo2, etc., then iterate over unspecified object pointers, controlling them through this 'event' interface?


Maybe it's a design paradigm one of you has seen in the past (and not just something this programmer made up) so I thought I'd ask. What could be gained from this? (Other than some obfuscation, which isn't out of the question.) Naturally it's hard (maybe impossible) to determine what the point is without the code base (which I can't share directly). However, these classes aren't instantiated in the code I have because it's a library, and while I technically know what the library accomplishes, I'm now at a loss as to how I might wield it.


Oh, I found a comment. It says "// TODO: FIX". It's the only one. I lol'd.


So, possibly you've seen this or have a sharper imagination than I. In any case, thanks in advance for your time. :)