Writing a Pac-Man game in C# on Windows with GDI+, need help with framerate.

Hello,

As the topic says, I'm writing a Pac-Man clone in C# using the GDI+ library. So far it has been going fairly well: I have sprite data loading from XML files and animating beautifully, and a state machine managing game states and user input. However, my GDI+ 2D "engine" dies when it draws the map. It's a 28x22 grid of 25-pixel square tiles; that's 616 tiles.

The way my engine works, it just clears an internally stored Image object with my default background color, then loops through all IRenderable objects registered with it (each visible game object, including tiles) and calls their Render() method, passing the Graphics object so they can draw to its Image buffer. At the end of that loop it returns the Image, which is set as the background of a panel that I call Refresh() on.

The tile render code is as such:
[source=C#]
public override void Render(System.Drawing.Graphics graphics, System.Drawing.Image image)
{
    if (_source != Rectangle.Empty)
    {
        // The tile lives on a tile sheet: draw only its source rectangle.
        graphics.DrawImage(_background, Destination, _source.X, _source.Y,
            _source.Width, _source.Height, GraphicsUnit.Pixel, ImageAttributes);
    }
    else
    {
        // Standalone image: draw the whole thing.
        graphics.DrawImage(_background, Destination, 0, 0,
            _background.Width, _background.Height, GraphicsUnit.Pixel, ImageAttributes);
    }
}
[/source]
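
In rough outline, the surrounding loop looks like this (simplified; the class and field names here are placeholders, not the engine's real ones):

[source=C#]
using System.Collections.Generic;
using System.Drawing;

// Rough outline of the clear-and-redraw-everything loop described above.
// Class and field names are placeholders.
public class Engine2D
{
    private readonly List<IRenderable> _renderables = new List<IRenderable>();
    private readonly Image _buffer = new Bitmap(28 * 25, 22 * 25); // 700x550
    private readonly Color _backgroundColor = Color.Black;

    public Image RenderFrame()
    {
        using (Graphics g = Graphics.FromImage(_buffer))
        {
            g.Clear(_backgroundColor);              // wipe the whole buffer
            foreach (IRenderable r in _renderables) // tiles, sprites, etc.
                r.Render(g, _buffer);
        }
        return _buffer; // set as the panel background, then Refresh()
    }
}

public interface IRenderable
{
    void Render(Graphics graphics, Image image);
}
[/source]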

The tile's Render() is set up so that if the tile is on a sheet of tiles it uses the source coordinates, but if not it just uses the entire sheet's dimensions.

Now, I have ANOTHER program that can draw the same number of tiles using the same redraw-each-time method, but it draws the tiles using DrawRectangle and methods like that, not from an Image. Is this DrawImage method what's killing me? Does anyone have any suggestions?

I know this might all seem like overkill for Pac-Man, but I'm building everything so that I can reuse it for any 2D arcade-style game I want to make later. Thanks,
---------------------------------------------------------------
"The problem with computers is they do what you tell them."
"Computer programmers know how to use their hardware." - Geek#
Well, GDI+ isn't exactly speedy to begin with. However, a simple tip to make things perform better: update only what has changed from one frame to the next, and do not redraw things in the next frame if they are the same as in the previous frame.
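
For tiles, one way to do that is to keep a list of the cells that changed and repaint only those each frame. A minimal sketch (all names here are made up, not from your engine):

[source=C#]
using System.Collections.Generic;
using System.Drawing;

// Sketch: remember which 25x25 cells changed since the last frame and
// repaint only those; everything else keeps last frame's pixels.
public class TileLayer
{
    private const int TileSize = 25;
    private readonly List<Point> _dirtyCells = new List<Point>();

    public void MarkDirty(int cellX, int cellY)
    {
        _dirtyCells.Add(new Point(cellX, cellY));
    }

    // Call once per frame instead of redrawing all 616 tiles.
    public void RepaintDirty(Graphics g, Bitmap tileSheet, Rectangle[,] sources)
    {
        foreach (Point c in _dirtyCells)
        {
            Rectangle dest = new Rectangle(c.X * TileSize, c.Y * TileSize,
                                           TileSize, TileSize);
            g.DrawImage(tileSheet, dest, sources[c.X, c.Y], GraphicsUnit.Pixel);
        }
        _dirtyCells.Clear();
    }
}
[/source]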
Yeah, I've thought about that. Are there any tricks to implementing it?

Say you had a character standing on a floor tile, and then the floor tile cracks. The tile would be redrawn, but the character himself hasn't changed, so he wouldn't get redrawn, and the tile would end up drawn on top of the guy.

How do you get around stuff like that?

---------------------------------------------------------------
"The problem with computers is they do what you tell them."
"Computer programmers know how to use their hardware." - Geek#
GDI is not very performance-minded, as has been said. It ought to be able to handle Pac-Man, but it may be more trouble than it's worth.

Personally, I think the best and most performant way to get graphics running well on GDI is to use it only to obtain a framebuffer, and to write your own blitting routines. The fastest routines I have for this will fill a 640x480 screen with 32x32 tiles (300 tiles) at over 1200 frames per second. The clipping blitter I have has almost no overhead, since I resize the blit rather than if-checking the extents, and the transparency blitter is roughly two-thirds that speed, or ~800 fps. When you're done blitting, you use GDI once again to copy your framebuffer onto your window's surface. Older game programming books (i.e. for DOS, when this was the only way to go) and the internet have tons of info on writing your own blitters.

But man, it's too early to be replying to forum posts... Since you're doing this in C#, the only way to get this kind of performance would be to use pointers within an 'unsafe' code block, which you may be uncomfortable with if you've only got a C# background. I'm a C/C++ guy at home and a C# guy at work.
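
To make that concrete, a bare-bones blit in C# looks something like the following. This is only a sketch: it assumes both bitmaps are 32bpp ARGB, that the rectangles are already clipped in-bounds, and it has to be compiled with /unsafe.

[source=C#]
using System.Drawing;
using System.Drawing.Imaging;

// Sketch of a raw pixel blit inside an 'unsafe' block: copies srcRect
// from src into dst at (dstX, dstY). Assumes 32bpp ARGB bitmaps and
// pre-clipped rectangles; compile with /unsafe.
public static unsafe class Blitter
{
    public static void Blit(Bitmap dst, int dstX, int dstY,
                            Bitmap src, Rectangle srcRect)
    {
        BitmapData sd = src.LockBits(srcRect, ImageLockMode.ReadOnly,
                                     PixelFormat.Format32bppArgb);
        BitmapData dd = dst.LockBits(
            new Rectangle(dstX, dstY, srcRect.Width, srcRect.Height),
            ImageLockMode.WriteOnly, PixelFormat.Format32bppArgb);
        try
        {
            byte* srcRow = (byte*)sd.Scan0;
            byte* dstRow = (byte*)dd.Scan0;
            for (int y = 0; y < srcRect.Height; y++)
            {
                uint* s = (uint*)srcRow;
                uint* d = (uint*)dstRow;
                for (int x = 0; x < srcRect.Width; x++)
                    d[x] = s[x]; // straight copy; a transparency blitter
                                 // would test the alpha byte here instead
                srcRow += sd.Stride; // stride may include row padding
                dstRow += dd.Stride;
            }
        }
        finally
        {
            src.UnlockBits(sd);
            dst.UnlockBits(dd);
        }
    }
}
[/source]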


[EDIT] - Part of your performance issue is the fact that you seem to be using tiles which are 25x25 pixels. Tiles whose size is a multiple of 4, and particularly a power of 2, should really help your framerate. The same goes for your window's framebuffer, which you seem to have set at 700x550; stick to a more 'normal' resolution like 640x480.

throw table_exception("(╯°□°)╯︵ ┻━┻");

Hrm, well this sucks. I'm quite comfortable with C++, but not for my current project.

I tried doing a DrawImage(...) with a source size of 32x32 instead of 25x25 and it didn't seem to make a difference. I know sprite size is a big deal with DirectX and other graphics APIs, but is the same true in GDI?

I wanted to draw each tile individually so I could implement random map generation. However, it seems now I'll have to settle for background maps with tile data on top, so one DrawImage will do the whole map and I can just drop a few sprites on top.

---------------------------------------------------------------
"The problem with computers is they do what you tell them."
"Computer programmers know how to use their hardware." - Geek#
I thought GDI+ was loosely based on DX7. Although it is aimed more at "corporate graphics" than at games, I agree that Pac-Man shouldn't be a problem. You could create your random map by drawing the tiles onto an extra bitmap when you start the level and using that as your background source.

i.e.

Pre-Play -> Create "Background" bitmap from the tiles.

In-Play  -> Draw "Background" to the backbuffer
         -> Draw "Pacman" etc. to the backbuffer
         -> Draw the backbuffer to the screen

If you need to crack a tile you can just update the "Background" image.
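
A minimal sketch of that flow (all names here are made up for illustration):

[source=C#]
using System.Drawing;

// Sketch of the pre-rendered background approach: bake all tiles into one
// bitmap at level start, so each frame costs a single DrawImage for the
// whole map plus one per moving sprite.
public class Level
{
    private const int TileSize = 25;
    private Bitmap _background;

    // Pre-play: draw the 28x22 tile map into a single bitmap once.
    public void BuildBackground(Bitmap tileSheet, Rectangle[,] sources)
    {
        _background = new Bitmap(28 * TileSize, 22 * TileSize);
        using (Graphics g = Graphics.FromImage(_background))
        {
            for (int y = 0; y < 22; y++)
                for (int x = 0; x < 28; x++)
                    g.DrawImage(tileSheet,
                        new Rectangle(x * TileSize, y * TileSize, TileSize, TileSize),
                        sources[x, y], GraphicsUnit.Pixel);
        }
    }

    // In-play: one blit for the whole map, then the sprites on top.
    public void RenderFrame(Graphics backbuffer)
    {
        backbuffer.DrawImage(_background, 0, 0);
        // ...DrawImage calls for Pac-Man, the ghosts, pellets, etc...
    }

    // If a tile "cracks", repaint just that one cell of the background.
    public void UpdateTile(Bitmap tileSheet, Rectangle newSource, int x, int y)
    {
        using (Graphics g = Graphics.FromImage(_background))
        {
            g.DrawImage(tileSheet,
                new Rectangle(x * TileSize, y * TileSize, TileSize, TileSize),
                newSource, GraphicsUnit.Pixel);
        }
    }
}
[/source]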

Of course, your other option is to create a 2D engine with Managed DirectX, using the D3DX Sprite class; there is a sample in the April SDK.

This topic is closed to new replies.
