DesignerX

Win32Api problem

This topic is 5409 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.



My problem is this: I have a 128x128 array of unsigned chars, where each element represents a pixel with a value between 0 and 255 (a grayscale height map). What is the fastest way to draw this array on a specific window?

When I use SetDIBitsToDevice() I have to expand the array by a factor of three (to hold RGB values for each pixel, as the bitmap functions demand), which consumes a lot of memory with larger map dimensions. Is there any way to let Windows treat each element in the array as a single 8-bit pixel?

If you know of any solution, I'll be happy to hear about it. Thanks!

P.S. I tried SetPixel; it worked, but very, very slowly.

Quote:
Original post by DesignerX
Is there any way to let Windows treat each element in the array as a single 8-bit pixel?
Sort of. You have to store indices into the system palette rather than 8-bit color specifications, though. See GetSystemPaletteEntries.
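A related option that avoids the system palette entirely is to describe the array as an 8-bit DIB carrying its own 256-entry grayscale color table: with biBitCount set to 8, GDI treats each byte as an index into the bmiColors table, so the raw height map can be passed to SetDIBitsToDevice with no 3x expansion. A minimal sketch, assuming hdc is a valid device context and heightmap points at width*height bytes (the function name and parameters are illustrative):

```c
#include <windows.h>

void DrawGrayscale(HDC hdc, const BYTE *heightmap, int width, int height)
{
    /* BITMAPINFO declares room for only one RGBQUAD, so reserve 256. */
    struct {
        BITMAPINFOHEADER header;
        RGBQUAD          palette[256];
    } bmi = {0};

    bmi.header.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.header.biWidth       = width;
    bmi.header.biHeight      = -height;  /* negative = top-down row order */
    bmi.header.biPlanes      = 1;
    bmi.header.biBitCount    = 8;        /* one byte per pixel */
    bmi.header.biCompression = BI_RGB;

    /* Grayscale color table: index i maps to gray level i. */
    for (int i = 0; i < 256; ++i) {
        bmi.palette[i].rgbRed   = (BYTE)i;
        bmi.palette[i].rgbGreen = (BYTE)i;
        bmi.palette[i].rgbBlue  = (BYTE)i;
    }

    /* Each DIB scanline must be padded to a DWORD boundary; a 128-byte
     * row already is, so the array can be passed through directly. */
    SetDIBitsToDevice(hdc, 0, 0, width, height, 0, 0, 0, height,
                      heightmap, (const BITMAPINFO *)&bmi, DIB_RGB_COLORS);
}
```

If the map is redrawn every frame, CreateDIBSection with the same BITMAPINFO may be worth considering instead, since it gives a writable pixel buffer that can be blitted with BitBlt.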


By using GameDev.net, you agree to our community Guidelines, Terms of Use, and Privacy Policy.