# [DX10] Sprites & alpha channel gamma II (or how to do linear fades in DX10)


## Recommended Posts

Hey guys, just revisiting the discussion from here: http://www.gamedev.net/community/forums/topic.asp?topic_id=546148 From Demirug:
> The Direct3D 10 behavior is the correct one. As the name sRGB says, it affects only the color channels. If you draw your alpha channel in Photoshop, you will see it in gamma space and not in linear space as it should be.
Isn't the DX10 behavior a bit awkward sometimes? For example, when the alpha channel of a sprite is used for a fade-in or fade-out, the sprite fades in a strange, non-linear way, because of how alpha blending is performed in DX10:
`toSRGB(fromSRGB(src) * srcBlend + fromSRGB(dst) * dstBlend)`
where, for simplicity, toSRGB(x) = pow(x, 1/2.2) and fromSRGB(x) = pow(x, 2.2). So if, for example, the background is black, then dst = (0, 0, 0, 0), and the alpha blending equation reduces to:
`toSRGB(fromSRGB(src) * srcBlend)`
The alpha value (srcBlend), i.e. the fade progress, then has a non-linear (power-law) response over time:
`pow(pow(src, 2.2) * srcBlend, 1/2.2) = src * pow(srcBlend, 1/2.2)`
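To make the distortion concrete, here is a small numerical sketch (plain Python, using the simplified pow-2.2 curves from above rather than the exact piecewise sRGB curve, and a black background) of how the written sRGB value responds to a linear ramp in srcBlend:

```python
def to_srgb(x):
    """Simplified encode: linear -> gamma space (approximates the sRGB curve)."""
    return x ** (1.0 / 2.2)

def from_srgb(x):
    """Simplified decode: gamma -> linear space."""
    return x ** 2.2

def blended(src, src_blend):
    """DX10-style sRGB-correct blend of src over a black background (dst = 0)."""
    return to_srgb(from_srgb(src) * src_blend)

# A linear ramp in srcBlend does NOT give a linear ramp in the output:
src = 1.0  # a white sprite
for alpha in (0.25, 0.5, 0.75):
    # blended(src, alpha) equals src * alpha ** (1 / 2.2)
    print(alpha, blended(src, alpha))
```

At srcBlend = 0.5, the sprite is already at about 73% of its full sRGB brightness, which is exactly the "strange non-linear fade" described above.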
So the DX10 behavior may be the correct one, but some things don't work intuitively with gamma-corrected alpha blending, two of them quite important: importing bitmaps from Photoshop (the issue of my previous post) and performing fades. What I mean is: how can the DX10 behavior be the correct one when you can't even do a simple linear fade with its gamma-corrected blending?

What do you think about this?

[Edited by - IrYoKu1 on November 3, 2009 2:51:22 PM]
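One possible workaround (a sketch under the same simplified pow-2.2 assumptions, not something from the original thread) is to pre-compensate the fade parameter before it reaches the blender: feed pow(t, 2.2) as srcBlend, so that the gamma-space output ends up linear in the fade progress t:

```python
def to_srgb(x):
    """Simplified encode: linear -> gamma space."""
    return x ** (1.0 / 2.2)

def from_srgb(x):
    """Simplified decode: gamma -> linear space."""
    return x ** 2.2

def blended(src, src_blend):
    """DX10-style sRGB-correct blend over a black background."""
    return to_srgb(from_srgb(src) * src_blend)

def linear_fade_blend(t):
    """Pre-compensated blend factor: pow(t, 2.2) makes the final
    sRGB-encoded result equal src * t, i.e. linear in t."""
    return t ** 2.2

src = 1.0
for t in (0.25, 0.5, 0.75):
    out = blended(src, linear_fade_blend(t))
    # out == src * t, a perceptually straight fade in gamma space
```

In a real DX10 renderer this would mean applying the pow in the shader or on the CPU before the value is used as a blend factor; the blend unit itself stays untouched.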