FOV variation radically alters textures

Started by
5 comments, last by wintertime 8 years, 11 months ago

Hello Everyone,

My world camera needs to support a very wide range of FOVs, from 0.02 radians (~1.14º) to 0.785 radians (~45º).

The thing is, between 30º and 45º (and above) everything looks fine, but below 30º my textures start distorting badly, and the lower I go the worse it gets, until the mip filter is being applied to almost everything.

I don't understand why this terrible behaviour is happening, or whether there is any way to fix it other than changing my camera behaviour, which would cost me a lot of time, so any help would be appreciated.

The attached images are both from the same point of view; only the FOV changes.

As always, apologies for my bad English.

Cheers

If you look closely, you get the same effect at the bigger FOV for points farther away. The way to tackle this is to increase the number of anisotropic taps (D3DSAMP_MAXANISOTROPY).

Hmm, do you think it will decrease the overall performance of the shader that much?

It helps a bit, but not always; that (D3DSAMP_MAXANISOTROPY) parameter can even be counterproductive... and it doesn't solve the problem completely...

It will likely decrease performance, as any more expensive texture filtering operation does. But the "so much" can only be answered by profiling.

Edit:

It helps a bit, but not always; that (D3DSAMP_MAXANISOTROPY) parameter can even be counterproductive... and it doesn't solve the problem completely...


I'm not surprised. The farther away you look, the more the horizontal and vertical derivatives diverge (the angle becomes more grazing). There's only so much the hardware filtering can do about that. One could of course apply custom filtering manually, but that would cost even more performance.
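To illustrate the point about diverging derivatives, here is a rough sketch (my own simplification of how hardware picks a mip level, not an exact spec) comparing isotropic and anisotropic LOD selection from the screen-space derivatives:

```python
import math

def mip_lods(ddx, ddy, max_aniso=16):
    # ddx, ddy: screen-space derivatives of the texel coordinate
    # (uv already scaled by the texture size), one per screen axis.
    rho_x = math.hypot(ddx[0], ddx[1])
    rho_y = math.hypot(ddy[0], ddy[1])
    major, minor = max(rho_x, rho_y), min(rho_x, rho_y)
    # Plain trilinear must cover the larger footprint axis, so it
    # picks the mip level from the major derivative.
    iso_lod = math.log2(max(major, 1e-9))
    # Anisotropic filtering takes up to max_aniso samples along the
    # major axis, so it can pick a mip sized for the minor axis.
    ratio = min(major / max(minor, 1e-9), max_aniso)
    aniso_lod = math.log2(max(major / ratio, 1e-9))
    return iso_lod, aniso_lod

# Grazing view of the ground plane: 1 texel per pixel horizontally,
# 16 texels per pixel vertically.
iso, aniso = mip_lods((1.0, 0.0), (0.0, 16.0))
print(iso, aniso)  # trilinear picks mip 4 (blurry), aniso picks mip 0
```

The more grazing the angle, the bigger the major/minor ratio, and once it exceeds the D3DSAMP_MAXANISOTROPY cap the blur comes back, which matches what the hardware can and cannot fix here.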

Thank you so much unbird, this has helped me a lot. I had never seen such strange things haha, I have always fixed these problems with plain linear filters...

Regards :)

If your scene wasn't just a flat plane, the reason for this would be much more obvious. Try scattering some objects around the scene.

Varying the field of view has a similar effect to a zoom lens: low FOV angles are zoomed in, high FOV is zoomed out. Zooming in and out changes how many texels of your textures map to the same screen space, which in turn affects how much filtering is needed.
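A quick back-of-the-envelope sketch of that zoom effect, using the FOV range from the original post (the 1024-pixel viewport width is just an assumption for the example):

```python
import math

def pixel_footprint(fov_radians, width_px):
    # Width of the view (at unit distance) covered by one screen
    # pixel, assuming the fov spans the viewport horizontally.
    return 2.0 * math.tan(fov_radians / 2.0) / width_px

wide = pixel_footprint(0.785, 1024)   # ~45 degrees
narrow = pixel_footprint(0.02, 1024)  # ~1.14 degrees
print(wide / narrow)  # ~41: each pixel at the narrow fov sees about
# 41x less of the scene, i.e. the camera is zoomed in about 41x
```

So the same surface that was minified (and mipmapped) at 45º can be hugely magnified at 1.14º, and vice versa for distant grazing surfaces, which is why the filtering behaviour changes so drastically across the range.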

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

More simply, the problem is how mipmapping is done:
GPUs only support mipmaps where both dimensions are scaled by the same amount (which needs only about 1/3 more memory, not 4x), and the GPU must sample the smaller mipmap to avoid skipping texels, which would look worse than some blur.
With anisotropic filtering it can sample a bigger mipmap, because it is allowed to take more samples over a bigger area.
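The 1/3-more-memory claim is easy to check; a tiny sketch (the helper name is mine) that counts the texels in a full mip chain:

```python
def mip_chain_texels(width, height):
    # Each mip level halves both dimensions (rounding down, min 1),
    # down to 1x1, like a GPU's full mip chain.
    total, w, h = 0, width, height
    while True:
        total += w * h
        if w == 1 and h == 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

base = 1024 * 1024
full = mip_chain_texels(1024, 1024)
print(full / base)  # ~1.333: about a third more memory, not 4x
```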

This topic is closed to new replies.
