GDI, slow?

Started by
5 comments, last by Lalle 22 years, 10 months ago
Is GDI really slow, or am I just writing inefficient code? Right now I have a memory DC which holds the image that is going to be drawn to the screen. To get the different parts of the image (background, ship, enemies...) I BitBlt from another DC. This results in a really slow game. I use 24-bit BMP images; should I switch to another format, or change to 16-bit or something? Are there any general tips for making fast code? Is there any difference between using resources and loading external bitmaps? Should I store variables on the heap or on the stack? Should I make all large structures (bitmaps, etc.) static?
GDI is VERY slow!!!

I don't think it's your code.

You should use something like DirectDraw for speed; it's a lot faster than GDI.

Edited by - Slinger on June 12, 2001 12:10:50 PM
GDI is reasonably slow, but it depends on the context:
Stretch-blitting a large surface onto another with antialiasing is definitely a no-no. Stay away from it. Evil!
However, a straight BitBlt should not be all THAT slow. Depending on what kind of game/graphics you're doing, you should be able to get a pretty nice framerate anyway.

Some tips:
1. Limit the size of the rectangles you are blitting to the absolute minimum. If something's not changing from frame to frame, then don't blit it needlessly.
2. Don't ever EVER use dynamic allocation in your inner loop. There are few evils worse than a "new" in time-critical code.
3. Make sure all DCs involved have the same pixel format as the video mode you are in. Conversions take time, sometimes LOTS of time.

If you are doing a lot of large blits that you cannot avoid, you probably are better off going to a "real" graphics API like DirectDraw, as Slinger said.

People might not remember what you said, or what you did, but they will always remember how you made them feel.
Mad Keith the V.
It's only funny 'till someone gets hurt. And then it's just hilarious. Unless it's you.
GDI is slow because its whole point is to put a bridge (layer) between your code and the graphics card, as far removed from the hardware as possible. Why? Because GDI must support ALL graphics cards, even cards from the '80s. Good: compatibility. Bad: slow as hell.

DirectX, in contrast, is much faster because it is only a thin layer between your code and the graphics card. Good: speed. Bad: some incompatibilities (it requires DirectX to be installed!).
-----------------------------------------------"When you are a pawn, the only way out is revolution."
Thanks a lot for your answers, guys!
It would be great if somebody could also answer the other questions I asked:
what bit depth to use, etc.
Considering hardly any graphics programs even fully support 16-bit bitmaps, you're probably fine with 24. That's what I use. Go with DirectDraw (buy one of André LaMothe's books about DirectX; it's not that difficult) and use 24-bit color for your bitmap files. You can convert them to 16-bit in the game (or whatever), assuming you only load and convert bitmaps during initialization and map changes, etc.
GDI is extremely slow, and I wouldn't wish it upon anyone who wanted to write a game in it. GDI's just not meant for games; it's okay for stuff like menus, I guess, but definitely not for the main game part. I wrote some of a Tetris clone in GDI and it was hell; now I'm going to try to write it in OpenGL. Just stay away from GDI.

