edb6377

DX7 VC++6 Slow Alpha Blending


Limitations: VC++ 6 and DX7

FPS = 60 with dithering.
FPS drops significantly when rendering alpha-blended text.

Any Ideas?


Code is below.

Called to draw the surface with an alpha blend:
[CODE]
// Alpha mode

DXDraw::RGBMASK mask = DXDraw::GetInstance()->GetRGBmask();
DWORD r, g, b, color;

// Render an alpha-blended rectangle: average each destination pixel
// with the background colour.
for( int y = rc.top; y < rc.bottom; y++ )
{
    for( int x = 0; x < rc.right; x++ )
    {
        color = pDestStart[ x ];
        MergeColor( pimpl->backgroundColor, color );
        pDestStart[ x ] = color;
    }
    pDestStart += pitch;
}
[/CODE]

Merge Color Code
[CODE]
// Average the two colours: add each channel and divide the result by two
#define MergeColor( clr1, clr2 )\
r = ( ( clr1 & mask.rMask ) + ( clr2 & mask.rMask ) ) >> 1;\
g = ( ( clr1 & mask.gMask ) + ( clr2 & mask.gMask ) ) >> 1;\
b = ( ( clr1 & mask.bMask ) + ( clr2 & mask.bMask ) ) >> 1;\
color = ( r & mask.rMask ) | ( g & mask.gMask ) | ( b & mask.bMask );
[/CODE]
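
For reference, the same masked-channel trick generalises to an arbitrary alpha level. A minimal sketch, assuming a 16-bit surface and the same channel masks as above; the BlendColor helper is illustrative, not part of the original code:
[CODE]
// Blend 'src' over 'dst' with an alpha level in 0..256 (256 = fully opaque),
// masking each channel before and after the arithmetic, exactly as MergeColor does.
inline DWORD BlendColor( DWORD src, DWORD dst, DWORD alpha,
                         DWORD rMask, DWORD gMask, DWORD bMask )
{
    DWORD r = ( ( src & rMask ) * alpha + ( dst & rMask ) * ( 256 - alpha ) ) >> 8;
    DWORD g = ( ( src & gMask ) * alpha + ( dst & gMask ) * ( 256 - alpha ) ) >> 8;
    DWORD b = ( ( src & bMask ) * alpha + ( dst & bMask ) * ( 256 - alpha ) ) >> 8;
    return ( r & rMask ) | ( g & gMask ) | ( b & bMask );
}
// With alpha = 128 this produces the same result as MergeColor.
[/CODE]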
Is that DirectDraw? If so, reading back from the surface kills your performance. You might be faster if you completely draw your surface into a memory buffer (with alpha) and do a full blit to the primary surface once you're done.


And additionally, what jbadams said ;)
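
Roughly what that could look like with DirectDraw 7 — a minimal sketch, assuming the engine already holds an IDirectDraw7* and a back-buffer surface. The names CreateScratchSurface, DrawAlphaText, pScratch and pBackBuffer are illustrative, and the blend loop itself is the one from the original post:
[CODE]
#include <windows.h>
#include <ddraw.h>

// Sketch: create an offscreen surface in system memory, blend there (reads hit
// system RAM, not video memory), then move the finished rectangle to the back
// buffer with a single blit.
IDirectDrawSurface7* CreateScratchSurface( IDirectDraw7* pDD, DWORD width, DWORD height )
{
    DDSURFACEDESC2 ddsd;
    ZeroMemory( &ddsd, sizeof( ddsd ) );
    ddsd.dwSize         = sizeof( ddsd );
    ddsd.dwFlags        = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
    ddsd.dwWidth        = width;
    ddsd.dwHeight       = height;
    ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;

    IDirectDrawSurface7* pSurface = NULL;
    if( FAILED( pDD->CreateSurface( &ddsd, &pSurface, NULL ) ) )
        return NULL;
    return pSurface;
}

// Assumes pScratch was created with the same pixel format and dimensions
// as the back buffer.
void DrawAlphaText( IDirectDrawSurface7* pScratch, IDirectDrawSurface7* pBackBuffer, RECT rc )
{
    DDSURFACEDESC2 ddsd;
    ZeroMemory( &ddsd, sizeof( ddsd ) );
    ddsd.dwSize = sizeof( ddsd );

    // Lock the system-memory surface and run the existing blend loop on it.
    if( SUCCEEDED( pScratch->Lock( NULL, &ddsd, DDLOCK_WAIT, NULL ) ) )
    {
        // ... MergeColor loop over ddsd.lpSurface, using ddsd.lPitch ...
        pScratch->Unlock( NULL );
    }

    // One blit pushes the finished rectangle to the back buffer.
    pBackBuffer->Blt( &rc, pScratch, &rc, DDBLT_WAIT, NULL );
}
[/CODE]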
[quote name='jbadams' timestamp='1344309865' post='4966895']
[i][b]why[/b] do you think you need to stick with VC++6 and DirectX7?[/i]
[/quote]

Perhaps he is maintaining some VERY old code that would generate thousands of errors if put through a modern compiler, and a one-line fix in the original environment would be faster than fixing thousands of errors and then still being confronted with the original problem in a new ecosystem.

If that's the case I sympathise with him, but my DirectDraw is a bit rusty.

If that's not the case... yeah, get a real compiler and API.
Please note that I've split off an off-topic discussion of continued usage of Visual C++ 6 into a new topic: "[url="http://www.gamedev.net/topic/629235-using-visual-c-6/"]Using Visual C++ 6[/url]". Please keep this discussion on topic from this point onwards.
Looks like you're emulating alpha blending by doing it yourself directly on the back buffer. You should check whether the driver supports hardware alpha blending (blits) and use that instead (look at the API documentation for how to do that; there were plenty of examples in the DX7 SDK).
You may need to attach an 8-bit alpha surface in order to get smooth alpha blending with a 16-bit image (I assume you're using 16-bit because you said you were using dithering).

Emulation will slow your application down badly because you're reading from GPU memory and then sending the data back.
If you're forced into emulation (that would mean a very ancient card, like a 1995 video card), requesting the back buffer to be stored in local memory rather than hardware memory might help. Can't remember if that was possible though (was it called system memory rather than local memory?).
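
For the capability check, something along these lines — a minimal sketch assuming DirectDraw 7, where pDD stands in for the existing IDirectDraw7 interface (the exact DDFXALPHACAPS_* flags to test are listed in ddraw.h):
[CODE]
// Sketch: ask the driver and the HEL what they support before resorting to a
// CPU blend.
DDCAPS hwCaps, helCaps;
ZeroMemory( &hwCaps,  sizeof( hwCaps ) );
ZeroMemory( &helCaps, sizeof( helCaps ) );
hwCaps.dwSize  = sizeof( hwCaps );
helCaps.dwSize = sizeof( helCaps );

if( SUCCEEDED( pDD->GetCaps( &hwCaps, &helCaps ) ) )
{
    // A non-zero value here means the driver reports some form of alpha
    // blitting; check the individual DDFXALPHACAPS_* bits for the variant
    // you need before relying on it.
    if( hwCaps.dwFXAlphaCaps != 0 )
    {
        // ... use a hardware alpha blit instead of the CPU loop ...
    }
}

// If the CPU loop is unavoidable, creating the surface you read from with
// DDSCAPS_SYSTEMMEMORY (see the CreateScratchSurface sketch above) keeps
// those reads out of video memory.
[/CODE]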