What to do for the fourth?

Wow, July. Where is the year going? Still don't know what I'm gonna do for Y2K. Probably just set a couple of lawn chairs on the back porch and await The Rapture :)

On the CE front, I have a question that I'll throw out into the ether here before posting it to the newsgroups. After hearing the presentation on CE Games, I'm wondering if I'm building my sprite engine the best way. The sprite engine that the authors present is pretty simple. Their sprites are basically wrappers around HBITMAPs. To put a sprite on the screen, they just do a couple of BitBlts (or a transparent BitBlt if the hardware supports it). The way I'm doing it is quite different.

Basically, I'm planning to do things similarly to the way I did 'em with my existing game packs, only under CE. This means that I'm triple-buffered. Behind the visible screen are a Background (the background stuff that almost never changes, like the maze in Pac Man) and a Canvas (the background with the sprites drawn on it). These are both DIBSections, so I can mess with the bits. To draw stuff on the screen, I copy pixels into the Canvas DIBSection's bits and then BitBlt just the changed region from the Canvas to the screen. This is considerably more difficult than simply transparently BitBlt-ing HBITMAPs to the screen, but it seemed much more efficient. Since all the sprite stuff is drawn to the Canvas with memory operations, I only need a single blit to replace the changed stuff on the screen.

The other big advantage of this approach is that Sprites don't need to be HBITMAPs. Since I'm never actually drawing 'em to the screen, just to the canvas's bits, they can simply be little bags of bits.
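For the curious, here's a minimal sketch of what one frame of that scheme looks like in plain C, assuming 8-bit pixels and a single transparent color key. The names (`Sprite`, `restore_rect`, `draw_sprite`, `TRANSPARENT_KEY`) are mine for illustration, not from any real engine:

```c
#include <string.h>

#define TRANSPARENT_KEY 0  /* pixel value treated as "see-through" */

/* A sprite is just a bag of bits: raw pixels plus dimensions. */
typedef struct {
    int w, h;
    const unsigned char *pixels;  /* w*h bytes, row-major */
} Sprite;

/* Restore the Background under a rectangle (erases last frame's sprite). */
static void restore_rect(unsigned char *canvas, const unsigned char *background,
                         int pitch, int x, int y, int w, int h)
{
    for (int row = 0; row < h; row++)
        memcpy(canvas + (y + row) * pitch + x,
               background + (y + row) * pitch + x, (size_t)w);
}

/* Compose a sprite onto the Canvas bits, skipping color-keyed pixels. */
static void draw_sprite(unsigned char *canvas, int pitch,
                        const Sprite *s, int x, int y)
{
    for (int row = 0; row < s->h; row++)
        for (int col = 0; col < s->w; col++) {
            unsigned char p = s->pixels[row * s->w + col];
            if (p != TRANSPARENT_KEY)
                canvas[(y + row) * pitch + (x + col)] = p;
        }
}
```

Once every sprite has been composed this way, a single BitBlt of the union of the dirty rectangles pushes the Canvas to the screen. GDI never sees the individual sprites.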

The presenters made it sound like blitting a buncha HBITMAPs to the screen is fast fast fast, so you don't need to be as clever as I am being. I'm skeptical, though. They kept talking about how HBITMAP stuff can be kept in video memory, but is this really the case? I wasn't aware that CE devices had separate video hardware.

Any opinions? Am I overengineering? Their demo didn't seem overly impressive, but it might've been the low framerate of the internet streaming that was making it look less-than-optimal.

Interestingly, this little column seems to have gotten some new readers. I got some good comments on older news items.

On my complaint a few weeks ago about how stuff like the hPrevInst parameter seems to stick with us like a bad case of herpes, I got some info from a former Windows API guy. It was pretty much what I had expected. The philosophy is basically "don't change anything unless you have to". I guess having an unused parameter on the stack is better than having to go and re-tool the parameter lists every release. It's also one less hassle, as it's easier to keep the documentation honest. I'm still torn.

I guess it's a better solution than Apple had with their Toolbox calls. In the past, whenever Apple figured out a better way of doing things, they just wrote a slick new set of functions, but they left the old ones in place as-is. Hence, if you want to play a sound or read a file, there are two or three ways of doing it. Of course, only one way is the "right" way, but it's awfully difficult to figure out which way is recommended and which way is just a holdover from 1987. Apple's new "Carbon" API is basically their attempt to dump everything but the latest way of doing things, so they're not porting fossil functions to their new Mach-based kernel.

I think my favorite method is that used by StarView (AKA the best class library that you've never heard of and never will). When they went from version 1.0 to 2.0, they didn't look back. They rewrote the stuff that needed rewriting. They included a utility, however, that scanned your source code and told you which stuff you'd have to change. It wasn't too sophisticated -- it would just point out functions and classes that had changed. Stuff like:

Line 103: Accelerator is no longer a separate class. It is now an attribute of Menu.

Line 372: OutputDevice::Line() no longer takes X and Y coordinates. It now takes two Point objects.

Once you had addressed all the stuff that the 1.0-to-2.0 utility pointed out, you were pretty much ported to the new version.

Interestingly, it appears that Apple is doing just this with "Carbon". They've got a utility that scans your source code and tells you which functions will be going away. I certainly think this is a better solution than a person scratching his head and wondering why his app isn't correctly detecting the other instance of the app, because hPrevInst has been NULL since 1992.

I also got a message about one of the comments on my bookshelf page. I had mentioned that I didn't really like Debugging the Development Process much because it didn't cover debugging too well. He pointed out that the book is more about modifying the development process itself than debugging. I guess this was a pun that went over my head when I originally read the book. I read it several years ago and don't currently own it, so I probably ought to give it a second look. Thanks for keeping me honest.

FWIW, the bookshelf page is probably going to go away eventually. I plan to write reviews of the best stuff there and put 'em in the gamedev.net book review page. That way, everything will be where you're looking for it, and you won't see stuff like book reviews scattered around everywhere.