DX7 VC++6 Slow Alpha Blending


Limitations: VC++ 6 and DX7

FPS = 60 (Dither)
FPS drops significantly when rendering Alpha Blended Text

Any Ideas?

Code as below. This is called to draw a surface with alpha blending:

// Alpha mode

DXDraw::RGBMASK mask = DXDraw::GetInstance()->GetRGBmask();
DWORD r, g, b, color;

// Render an alpha-blended rectangle
for( int y = rc.top; y < rc.bottom; y++ )
{
    for( int x = 0; x < rc.right; x++ )
    {
        color = pDestStart[ x ];
        MergeColor( pimpl->backgroundColor, color );
        pDestStart[ x ] = color;
    }
    pDestStart += pitch;
}

Merge Color Code

// Average the two colors: add each channel, then halve the result
#define MergeColor( clr1, clr2 ) \
    r = ( ( clr1 & mask.rMask ) + ( clr2 & mask.rMask ) ) >> 1; \
    g = ( ( clr1 & mask.gMask ) + ( clr2 & mask.gMask ) ) >> 1; \
    b = ( ( clr1 & mask.bMask ) + ( clr2 & mask.bMask ) ) >> 1; \
    color = ( r & mask.rMask ) | ( g & mask.gMask ) | ( b & mask.bMask );
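As an aside, if the surface format turns out to be RGB565 (an assumption — check `mask` at runtime), the three per-channel averages can be collapsed into a single masked average. This is a sketch of the well-known trick, not the engine's own code; `Avg565` is a made-up name:

```cpp
#include <cstdint>

// Assumes RGB565. The mask 0xF7DE clears the low bit of each 5/6/5 channel,
// so the shifted XOR term cannot carry across channel boundaries.
// Per channel this computes (a & b) + ((a ^ b) >> 1) == floor((a + b) / 2).
static inline uint16_t Avg565( uint16_t a, uint16_t b )
{
    return (uint16_t)( ( a & b ) + ( ( ( a ^ b ) & 0xF7DEu ) >> 1 ) );
}
```

This gives the same rounded-down 50/50 average as MergeColor in one pass, without the three mask/shift/re-mask sequences.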
Is that DirectDraw? If so, the reads are what kill your performance. It might be faster to draw your surface completely into a memory buffer (with alpha) and do a single full blit to the primary surface once you're done.
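To illustrate the pattern (a hypothetical sketch with plain arrays standing in for DirectDraw surfaces — `Surface16`, `Blend565`, and `ComposeAndBlit` are made-up names, not DX7 API):

```cpp
#include <cstdint>
#include <vector>

// Stand-in for a 16-bit surface; px lives in ordinary system memory.
struct Surface16 {
    int width, height;
    std::vector<uint16_t> px;
    Surface16( int w, int h ) : width( w ), height( h ), px( w * h, 0 ) {}
    uint16_t& at( int x, int y ) { return px[ y * width + x ]; }
};

// Per-channel 50/50 average in RGB565, done entirely in system memory.
static uint16_t Blend565( uint16_t a, uint16_t b )
{
    uint32_t r  = ( ( a & 0xF800u ) + ( b & 0xF800u ) ) >> 1;
    uint32_t g  = ( ( a & 0x07E0u ) + ( b & 0x07E0u ) ) >> 1;
    uint32_t bl = ( ( a & 0x001Fu ) + ( b & 0x001Fu ) ) >> 1;
    return (uint16_t)( ( r & 0xF800u ) | ( g & 0x07E0u ) | ( bl & 0x001Fu ) );
}

// Compose in sysBuf (cheap reads), then copy once to videoBuf —
// the analogue of a single Blt to the primary surface.
void ComposeAndBlit( Surface16& videoBuf, Surface16& sysBuf, uint16_t bg )
{
    for ( int y = 0; y < sysBuf.height; ++y )
        for ( int x = 0; x < sysBuf.width; ++x )
            sysBuf.at( x, y ) = Blend565( bg, sysBuf.at( x, y ) );
    videoBuf.px = sysBuf.px;   // one bulk copy, no per-pixel video reads
}
```

The point is that every read happens in system memory; the video surface is only ever written, once, in bulk.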

And additionally, what jbadams said ;)


Why do you think you need to stick with VC++6 and DirectX7?

Perhaps he is maintaining some VERY old code that would generate thousands of errors if put through a modern compiler, and a one-line fix in the original environment would be faster than fixing six thousand errors and then still being confronted with the original bug in a new ecosystem.

If that's the case I sympathise with him, but my DirectDraw is a bit rusty.

If that's not the case... yeah, get a real compiler and API.

Looks like you're emulating alpha blending by doing it yourself, directly on the back buffer. You should check whether the driver supports hardware alpha blending (blit) and use that instead (look at the API documentation for how to do that; there were plenty of examples in the DX7 SDK).
You may need to attach an 8-bit alpha surface in order to get smooth alpha blending with a 16-bit image (I assume you're using 16-bit because you said you were using dithering).

Emulation will slow your application down badly because you're reading from GPU memory and then sending the data back.
If you're forced to emulate (which would mean a very ancient card, like a 1995 video card), requesting that the back buffer be stored in system memory rather than video memory could help. Can't remember if that was possible though (was it called system memory rather than local memory?).
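For reference, creating an off-screen surface in system memory under DX7 looked roughly like this — a sketch from memory, so double-check it against the SDK; `width`, `height`, and `lpDD7` (an `IDirectDraw7*`) are assumed to exist, and error handling is omitted:

```cpp
// Sketch: ask DirectDraw 7 for an off-screen surface in system memory,
// so per-pixel reads during emulated blending never touch video memory.
DDSURFACEDESC2 ddsd;
ZeroMemory( &ddsd, sizeof( ddsd ) );
ddsd.dwSize  = sizeof( ddsd );
ddsd.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;
ddsd.dwWidth  = width;
ddsd.dwHeight = height;

LPDIRECTDRAWSURFACE7 lpSysSurface = NULL;
HRESULT hr = lpDD7->CreateSurface( &ddsd, &lpSysSurface, NULL );
```

Blend into that surface, then Blt it to the back/primary surface once per frame.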

