chbrules

SDL Blitting bug


I can't figure this out for the life of me, and neither could my friend. I have an image that holds all the frames of my player's 4-directional animation sequences: a 4x3 grid of 32x32px frames. I load it into RAM successfully with SDL_image's IMG_Load() function, but for some odd reason my CutIMG() function isn't working! I've tried debugging it, error checking, everything. The furthest I've gotten is that SDL_BlitSurface() is returning -1 (an error). Here's what I have. My CPlayer class has two surface pointers in it:
SDL_Surface *char_temp;
SDL_Surface *char_temp_slice;
Then I have the constructor do the work:
CPlayer::CPlayer()
{
	pStatus = new sStatus;
	bDead = false;
	time = 1;
	loop = 0;
	frame_speed = 250;

	pStatus->Compass = CBase::Direction::east; //default direction

	//Load Gabey
	char_temp = IMG_Load(".\\textures\\player\\neon_base.bmp");
	char_temp_slice = SDL_DisplayFormat(char_temp);

	for(int y=0; y < 4; y++)
	{
		for(int x=0; x < 3; x++)
		{
			if( CutIMG( char_temp_slice, char_temp, x*32, y*32, 32, 32 ) == -1 ) { char_temp_slice = char_temp; }

			SDL_SetColorKey( char_temp_slice, SDL_SRCCOLORKEY, 0xFF00FF);

			character.push_back( *char_temp_slice );
		}
	}
}
And then finally here is my CutIMG() function:
int CSDL_Shared::CutIMG(SDL_Surface *to, SDL_Surface *from, int x_start, int y_start, int width_crop, int height_crop)
{
	SDL_Rect imag;
	imag.x = x_start;
	imag.y = y_start;
	imag.w = width_crop;
	imag.h = height_crop;

	SDL_Rect dest;
	dest.x = 0;
	dest.y = 0;

	if( SDL_BlitSurface(from, &imag, to, &dest) == -1 ) { return -1; }

	return 0;
}
I've just been going over this in my mind again and again. I've looked up tutorials, debugged, and checked my Focus On SDL book; nothing! Please help. =/

The bug is that you are re-using the same surface every time you cut from the sheet. SDL_DisplayFormat is not what you need here: it only converts a surface into the correct format to be displayed; it doesn't give you a fresh surface for each frame.

This is what you need to do:

1. Create a new SDL surface that is of the same size and pixel format as the size of the character you want to extract. To do that take a look at SDL_CreateRGBSurface.

2. After you have that surface, which represents the final image, you will then need to blit on the source from your char_temp image.

3. After that you can modify the color key and push back the surface.

A general note: all of this should happen inside the for loops, otherwise every element ends up referring to the same surface. Since these are pointers, you can never simply copy one with =. That assignment is valid and legal, but it only copies the pointer, not the pixels, which is not what you want here.

So wrapping up with some pseudo C++ code:

for(int y = 0; y < 4; y++)
{
	for(int x = 0; x < 3; x++)
	{
		// The masks and depth come from the char_temp surface
		SDL_PixelFormat *fmt = char_temp->format;
		char_temp_slice = SDL_CreateRGBSurface( SDL_SWSURFACE, 32, 32,
			fmt->BitsPerPixel, fmt->Rmask, fmt->Gmask, fmt->Bmask, fmt->Amask );

		// Now blit the 32x32 slice onto the fresh surface
		CutIMG( char_temp_slice, char_temp, x*32, y*32, 32, 32 );
		// Set transparent color
		SDL_SetColorKey( char_temp_slice, SDL_SRCCOLORKEY, 0xFF00FF );
		// Save it now!
		character.push_back( *char_temp_slice );
	}
}
That should be about it. Feel free to ask if you have any other questions. I think this is about what I had to do when I did the same thing quite a while back.

- Drew

What do I use for the last 4 parameters of the SDL_CreateRGBSurface() function? My SDL book says they're the bit depths of the channels, but they're all in hex and don't make sense. =/

Ahh yes! Here ya go, from the docs:


/* Create a 32-bit surface with the bytes of each pixel in R,G,B,A order,
as expected by OpenGL for textures */

SDL_Surface *surface;
Uint32 rmask, gmask, bmask, amask;

/* SDL interprets each pixel as a 32-bit number, so our masks must depend
on the endianness (byte order) of the machine */

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff000000;
gmask = 0x00ff0000;
bmask = 0x0000ff00;
amask = 0x000000ff;
#else
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0xff000000;
#endif
