# OpenGL quadratic interpolation?


## Recommended Posts

Is it possible to do a quadratic interpolation using OpenGL (Linux, NVIDIA GeForce FX)? I found a formula: for two coordinates a and b, the quadratic interpolation is a*(1-f*f) + b*(f*f), but I am not sure if this is correct. If so, should I put it in a shader?

Well, yeah, it's possible with a shader.

You don't see quadratic interpolation in graphics too much, other than in tessellation algorithms, which you're most likely not doing here. It's not much more expressive than linear interpolation, and not much of an efficiency savings over cubic interpolation.

What, exactly, are you trying to accomplish? What are you performing the quadratic interpolation on, and why?

Quote:
> Original post by Sneftel
>
> Well, yeah, it's possible with a shader.
>
> You don't see quadratic interpolation in graphics too much, other than tessellation algorithms which you're most likely not doing here. It's not much more expressive than linear interpolation, and not much of an efficiency savings over cubic interpolation.
>
> What, exactly, are you trying to accomplish? What are you performing the quadratic interpolation on, and why?

Well, actually, I was discussing linear and cubic interpolation in my current graphics engine with someone, and he asked if I could do quadratic interpolation instead of the former two. As he is a math/CS major, I thought I would try out some of what he was saying.

Nothing specific, then. And since he seemed to know a lot about it, I thought I'd google/ask.

a*(1-f*f) + b*(f*f) is not a very complicated equation; you could even do it in hardware on a GeForce 1 with register combiners, and even more easily with GLSL. Just define what a, b and f are and you're laughing.
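In GLSL it really is only a couple of lines. A minimal fragment-shader sketch, where the uniform names colorA, colorB and f are made up for the example (a, b and f could just as well be texture samples, vertex attributes, or anything else):

```glsl
// Quadratic blend between two colors; what a, b and f mean is up to you.
uniform vec4 colorA;   // "a" in the formula
uniform vec4 colorB;   // "b" in the formula
uniform float f;       // interpolation parameter, 0..1

void main()
{
    float t = f * f;                               // quadratic weight
    gl_FragColor = colorA * (1.0 - t) + colorB * t;
    // equivalently: gl_FragColor = mix(colorA, colorB, f * f);
}
```

The built-in mix() does the a*(1-t) + b*t part for you, so squaring f before passing it in is all the "quadratic" there is to it.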
