
Unable to get unfiltered textures on recent Radeon HD cards




I have a program that shows images from certain hardware devices. I want to show the actual pixels coming from the hardware, so I have set the MAG filter to GL_NEAREST.

glBindTexture( CGLwnd::arbTextureTarget, mnTextureIndex );

// Set texture parameters:
// Linear filtering when the texture is minified...
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
// ...but nearest when magnified, because we want to see the actual pixels when zooming!
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST );

This works well on NVIDIA hardware and on older Radeon cards, giving the hard-edged pixels I want. On recent Radeon HD cards, however, the texture pixels are blurred when rendered, as if the card or driver were adding extra filtering automatically. Turning FSAA off completely in my program makes no difference, and I have also fiddled with the Catalyst Control Center, but in vain.

Is this something that can be set from my program? If it has to be set in the control center, where should I look?


OK, I tested some more and found that changing the MIN filter to GL_NEAREST as well made it work. Seems like a bug in the Radeon driver...
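For reference, a minimal sketch of the workaround described above, reusing the `CGLwnd::arbTextureTarget` and `mnTextureIndex` names from the original snippet (these are assumed to be defined elsewhere in the poster's code):

```cpp
glBindTexture( CGLwnd::arbTextureTarget, mnTextureIndex );

// Workaround for the blurring seen on recent Radeon HD drivers:
// use GL_NEAREST for BOTH the MIN and the MAG filter, not just MAG.
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
```

The trade-off is that views rendered smaller than the source image will now alias instead of being smoothed, since minification no longer averages neighboring texels; for inspecting raw hardware pixels when zoomed in, that is usually acceptable.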
