quadratic interpolation?

2 comments, last by zedzeek 19 years, 5 months ago
Is it possible to do quadratic interpolation using OpenGL (Linux, NVIDIA GeForce FX)? I found a formula: for two coordinates a and b, the quadratic interpolation is a*(1 - f*f) + b*(f*f), but I'm not sure if this is correct. If so, should I put it in a shader?
Well, yeah, it's possible with a shader.

You don't see quadratic interpolation in graphics much, other than in tessellation algorithms, which you're most likely not doing here. It's not much more expressive than linear interpolation, and not much of an efficiency savings over cubic interpolation.

What, exactly, are you trying to accomplish? What are you performing the quadratic interpolation on, and why?
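For reference, the three blends side by side in GLSL might look something like this (just a sketch; the function names are mine, and I'm assuming f runs from 0 to 1):

    // mix(a, b, t) computes a*(1.0 - t) + b*t, so each blend is just a choice of weight.
    float linearBlend(float a, float b, float f)    { return mix(a, b, f); }
    float quadraticBlend(float a, float b, float f) { return mix(a, b, f * f); }                    // the formula you posted
    float cubicBlend(float a, float b, float f)     { return mix(a, b, f * f * (3.0 - 2.0 * f)); }  // smoothstep-style cubic

The quadratic weight f*f starts slow but arrives at b at full speed, while the smoothstep-style cubic eases at both ends.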
Quote: Original post by Sneftel
Well, yeah, it's possible with a shader.

You don't see quadratic interpolation in graphics much, other than in tessellation algorithms, which you're most likely not doing here. It's not much more expressive than linear interpolation, and not much of an efficiency savings over cubic interpolation.

What, exactly, are you trying to accomplish? What are you performing the quadratic interpolation on, and why?


Well, actually, I was discussing linear and cubic interpolation in my current graphics engine with someone, and he asked if I could do quadratic interpolation instead of either of those. As he is a math/CS major, I thought I would try out some of what he was saying.

Nothing specific, then. Since he seemed to know a lot about it, I thought I'd google/ask.
a*(1-f*f) + b*(f*f) is not a very complicated equation; you could even do it in hardware on a GeForce 1 with register combiners, and even more easily with GLSL. Just define what a, b, and f are and you're laughing.
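For instance, a minimal fragment shader along those lines could look like this (a sketch only; the uniform names are mine, and a and b here are colors, but they could be whatever you're blending):

    uniform vec4 a;   // start value
    uniform vec4 b;   // end value
    uniform float f;  // interpolation parameter in [0, 1]

    void main()
    {
        // quadratic blend: a*(1.0 - f*f) + b*(f*f), i.e. mix with weight f*f
        gl_FragColor = mix(a, b, f * f);
    }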
