Posted 15 April 2014 - 10:53 PM
Posted 15 April 2014 - 11:30 PM
The interpolation of pixel values is called filtering and is a standard feature of your video hardware. In 2D games and GUIs you often want to avoid the softening and keep a 1:1 mapping; in that case you sample the texture with nearest filtering (take the value at the nearest texel position). The standard filtering in 3D games is linear filtering, which interpolates the pixel value from the 4 neighbouring texels. Further on, if you use mipmapping (you want to scale your image down without flickering artifacts), the hardware will interpolate not only the values of a single texture, but between two neighbouring mipmap levels (a mipmap is a downsampled version of the same texture). This is standard when using hardware rendering and comes at no cost.
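To make the difference between the two modes concrete, here is a minimal software sketch of nearest vs. bilinear sampling on a hypothetical 2x2 grayscale texture (the hardware does exactly this math for you, per pixel):

```cpp
#include <array>
#include <cmath>

// Tiny 2x2 grayscale "texture" for illustration (hypothetical data).
using Texture2x2 = std::array<std::array<float, 2>, 2>;

// Nearest filtering: snap (u, v) to the closest texel and return its value.
// u and v are texel coordinates in [0, 1] for this 2x2 texture.
float sampleNearest(const Texture2x2& tex, float u, float v) {
    int x = static_cast<int>(std::round(u));
    int y = static_cast<int>(std::round(v));
    return tex[y][x];
}

// Linear (bilinear) filtering: blend the 4 neighbouring texels,
// weighted by the fractional distance to each.
float sampleLinear(const Texture2x2& tex, float u, float v) {
    int x0 = static_cast<int>(std::floor(u));
    int y0 = static_cast<int>(std::floor(v));
    int x1 = std::min(x0 + 1, 1);
    int y1 = std::min(y0 + 1, 1);
    float fx = u - x0;
    float fy = v - y0;
    float top    = tex[y0][x0] * (1.0f - fx) + tex[y0][x1] * fx;
    float bottom = tex[y1][x0] * (1.0f - fx) + tex[y1][x1] * fx;
    return top * (1.0f - fy) + bottom * fy;
}
```

Sampling exactly between four texels with `sampleLinear` gives their weighted average (hence the softening), while `sampleNearest` just picks one of them (hence the crisp, blocky look when upscaling).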
In fact, even when coding only a 2D game you use the 3D rendering capability of the hardware: just render your 2D sprites as 3D quads with a fixed z-coordinate.
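A minimal, API-agnostic sketch of the idea (the vertex layout and function name here are made up for illustration): a 2D sprite becomes the four vertices of a quad that all share one fixed z value.

```cpp
#include <array>

// Hypothetical vertex layout: position plus texture coordinates.
struct Vertex {
    float x, y, z;  // position; z is the same for every vertex of a sprite
    float u, v;     // texture coordinates mapping the sprite image onto the quad
};

// Build a quad for a sprite at (x, y) with the given width/height.
// All four vertices share the same fixed z, so the "3D" quad is effectively 2D.
std::array<Vertex, 4> makeSpriteQuad(float x, float y, float w, float h, float fixedZ) {
    return {{
        { x,     y,     fixedZ, 0.0f, 0.0f },  // top-left
        { x + w, y,     fixedZ, 1.0f, 0.0f },  // top-right
        { x,     y + h, fixedZ, 0.0f, 1.0f },  // bottom-left
        { x + w, y + h, fixedZ, 1.0f, 1.0f },  // bottom-right
    }};
}
```

You would upload quads like this to a vertex buffer and draw them with the regular 3D pipeline, getting filtering, blending and batching for free.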
Edited by Ashaman73, 15 April 2014 - 11:32 PM.
Posted 15 April 2014 - 11:41 PM
Posted 16 April 2014 - 01:14 AM
So with the sampling, is this something you can just instruct DirectX to do, or would you have to write a shader for it?
No shader needed. Texture filtering was one of the first hardware-supported features, and DX9 has all the support you need out of the box; it is most likely only a simple flag you need to set. Best to start here.
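In Direct3D 9 that flag is a per-sampler state. A sketch of both modes, assuming an already-created `IDirect3DDevice9*` named `device` (this fragment configures the pipeline and is not runnable on its own):

```cpp
// Crisp 1:1 pixel art: point (nearest) filtering for minification and magnification.
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);

// Smooth scaling: bilinear filtering, plus linear blending between mipmap levels.
device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
```

The first argument (`0`) is the sampler/texture stage index; set the states once before drawing and they apply to everything rendered with that stage.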