joshfiles

Convert Grayscale Bitmap


I have spent about a week roaming this forum and the rest of the internet looking for a way to render a color bitmap in grayscale with DirectX 9.0. After a lot of searching I found out it is basically done with a palette swap. So here is my problem.

I load the bitmap in GDI and obtain the palette information, as long as it is a 256-color bitmap or less. I then take the average of each entry's channels and use it to create a new palette where every channel has a value equal to that average. I set peFlags to 255 to take care of the alpha channel. I then delete the GDI version and load the bitmap into a surface (D3DXLoadSurfaceFromFile), after which I call:

    pDevice->SetPaletteEntries(500, m_peGray);
    pDevice->SetCurrentTexturePalette(500);

where pDevice is my LPDIRECT3DDEVICE9. The 500 doesn't really matter; I have tried tons of unsigned ints. I have also tried running both of those calls before the surface call, as well as running them in the render loop. The program compiles correctly and neither of those functions returns an error, but the picture simply renders as a color image when I present it. I present using the back buffer:

    pDevice->UpdateSurface(m_pSurface, &m_SourceRect, pRenderSurface, &m_DestPoint);

where pRenderSurface is a pointer to the back buffer.

All I want to do is load a bitmap, convert it to grayscale, and use it as a background. If anyone has any ideas or can offer any help, please let me know. Thank you for anything you can offer, and have a great day.

Read the documentation of the D3DXColorAdjustSaturation() function in the D3DX SDK.

It says the grayscale color is computed as:

    r = g = b = 0.2125*r + 0.7154*g + 0.0721*b;


You will need to do this for every pixel of your bitmap and store the computed value, per pixel, in a new empty 8-bit bitmap of the same dimensions.

Doesn't D3DXLoadSurfaceFromFile or D3DXCreateTextureFromFileEx convert the file to whatever image format the surface is? For example, converting it to X8R8G8B8 from A8R8G8B8.
