nahoj84

Unable to get unfiltered textures on recent Radeon HD cards



Hi,

I have a program that shows images from certain hardware devices. I want to be able to show the actual pixels from the hardware, so I have set the MAG filter to GL_NEAREST.

glBindTexture( CGLwnd::arbTextureTarget, mnTextureIndex );

// Set texture parameters (glTexParameteri, since these take enum values):
// linear filtering when the texture is minified...
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
// ...but nearest when magnified, because we want to see the actual pixels when zooming!
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST );


This works well on NVIDIA hardware and also on older Radeon cards: the pixels get hard edges, just as I want. On recent Radeon HD cards, however, the texture pixels are blurred when rendered, as if the card/driver were adding some extra filtering automatically. Turning FSAA off completely in my program makes no difference, and fiddling with the Catalyst Control Center has been in vain as well.

Is this something that can be fixed from my program? If it has to be set in the Catalyst Control Center instead, where should I look?
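
As a sanity check, the filter state can be read back right before drawing to confirm that nothing else in the program overwrote it. A minimal sketch, reusing the target and texture name from above:

GLint minFilter = 0, magFilter = 0;
glBindTexture( CGLwnd::arbTextureTarget, mnTextureIndex );
glGetTexParameteriv( CGLwnd::arbTextureTarget, GL_TEXTURE_MIN_FILTER, &minFilter );
glGetTexParameteriv( CGLwnd::arbTextureTarget, GL_TEXTURE_MAG_FILTER, &magFilter );
// Expect minFilter == GL_LINEAR and magFilter == GL_NEAREST here;
// anything else means some other code path changed the parameters.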

Thanks,

Ok, I tested some more stuff and found out that changing the MIN filter to GL_NEAREST as well made it work. That would explain the blur: the MIN filter applies whenever the texture is minified, so if the driver classifies a near-1:1 mapping as slight minification, the GL_LINEAR MIN filter kicks in instead of the GL_NEAREST MAG filter. Since NVIDIA and older Radeon cards render the same scene sharp, it still seems like a bug (or at least a rounding difference) in the Radeon driver...
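
For anyone hitting the same issue, the combination that worked is nearest filtering in both directions. A minimal sketch, reusing the identifiers from the first post:

glBindTexture( CGLwnd::arbTextureTarget, mnTextureIndex );
// Nearest in both directions: no interpolation whether the texture is
// minified or magnified, so the source pixels stay sharp.
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
glTexParameteri( CGLwnd::arbTextureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST );

If mipmaps were in use, GL_NEAREST_MIPMAP_NEAREST would be the corresponding MIN filter, but for plain video frames GL_NEAREST is enough.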

