raydog

How to normalize gradient?



I have a linear rainbow gradient that I want to normalize so I can sample it, but I don't really know how to do it. I want to pass a value t from 0 to 1, where t = 0 returns the starting color and t = 1 returns the ending color. The gradient has more than two colors. No actual index array is being used; all I have is a handful of defined colors that make up the linear gradient, and all color values are floating-point. I'm trying to figure out something that will return a color for any t between 0 and 1.
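One way to do what you describe, assuming the color stops are evenly spaced along the gradient (that assumption, and the function/variable names, are mine, not from the post): map t into "segment space", pick the two neighboring stops, and lerp between them.

```python
# Sketch: sample a multi-stop linear gradient at t in [0, 1].
# Assumes evenly spaced stops; colors are (r, g, b) tuples of floats.

def sample_gradient(colors, t):
    """Return the interpolated color at parameter t."""
    n = len(colors) - 1           # number of segments between stops
    t = min(max(t, 0.0), 1.0)     # clamp t to [0, 1]
    x = t * n                     # position in segment space
    i = min(int(x), n - 1)        # index of the segment's left stop
    f = x - i                     # fraction within that segment
    a, b = colors[i], colors[i + 1]
    return tuple((1.0 - f) * ca + f * cb for ca, cb in zip(a, b))

# Example: a five-stop rainbow-ish gradient, red through blue.
rainbow = [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0),
           (0.0, 1.0, 1.0), (0.0, 0.0, 1.0)]
print(sample_gradient(rainbow, 0.0))   # first stop: (1.0, 0.0, 0.0)
print(sample_gradient(rainbow, 1.0))   # last stop:  (0.0, 0.0, 1.0)
```

If your stops aren't evenly spaced, you'd store a position alongside each color and search for the segment containing t instead of computing the index directly.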

HSV = Hue, Saturation, Value

Hue = position around the rainbow (the color wheel)
Saturation = how far the color is from grey
Value = how light or dark it is

So you convert your start and end colors to HSV, interpolate in HSV based on t, then convert the result back to RGB. Seriously, I'm not kidding. Google "HSV RGB conversion". Colors are a weird thing.
