How to normalize a gradient?

I have a linear rainbow gradient (more than two colors) that I want to normalize so I can sample it: I pass in a value t from 0 to 1, where t = 0 returns the starting color and t = 1 returns the ending color. There's no index array; all I have is a handful of defined floating-point colors that make up the gradient. I'm trying to figure out a function that returns a color for any t between 0 and 1.
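
To show what I mean, here's roughly the kind of function I'm after. This is just my own sketch, assuming the colors sit in a plain array of evenly spaced stops and doing a straight lerp between the two neighbouring stops in RGB; I don't know if that's the right approach:

#include <vector>
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

// Plain linear interpolation between two colors.
Color lerpColor(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Sample a gradient of evenly spaced stops at t in [0, 1]:
// t = 0 returns the first stop, t = 1 returns the last.
Color sampleGradient(const std::vector<Color>& stops, float t)
{
    t = std::clamp(t, 0.0f, 1.0f);
    float scaled = t * (stops.size() - 1);     // map t onto the stop index range
    std::size_t i = static_cast<std::size_t>(scaled);
    if (i >= stops.size() - 1)                 // t == 1 lands exactly on the last stop
        return stops.back();
    float local = scaled - i;                  // position between stop i and stop i+1
    return lerpColor(stops[i], stops[i + 1], local);
}

int main()
{
    // A simple rainbow: red, yellow, green, cyan, blue.
    std::vector<Color> rainbow = {
        {1, 0, 0}, {1, 1, 0}, {0, 1, 0}, {0, 1, 1}, {0, 0, 1}
    };
    Color c = sampleGradient(rainbow, 0.37f);
    std::printf("%.3f %.3f %.3f\n", c.r, c.g, c.b);
}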

HSV = Hue, Saturation, Value

Hue = the position on the rainbow / color wheel
Saturation = how far the color is from grey
Value = how light or dark it is

So you convert your start and end colors to HSV, interpolate the HSV components based on t, then convert the result back to RGB. Seriously, I'm not kidding. Google "HSV RGB conversion". Color is a weird thing.
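
Something like this, maybe (untested sketch, not from any particular library; the conversions follow the standard hue-sector formulas, and wrapping the hue the short way around the wheel is just my own choice):

#include <cmath>
#include <cstdio>

struct RGB { float r, g, b; };   // components in [0, 1]
struct HSV { float h, s, v; };   // h in degrees [0, 360), s and v in [0, 1]

HSV rgbToHsv(RGB c)
{
    float mx = std::fmax(c.r, std::fmax(c.g, c.b));
    float mn = std::fmin(c.r, std::fmin(c.g, c.b));
    float d  = mx - mn;

    float h = 0.0f;
    if (d > 0.0f) {
        if      (mx == c.r) h = 60.0f * std::fmod((c.g - c.b) / d, 6.0f);
        else if (mx == c.g) h = 60.0f * ((c.b - c.r) / d + 2.0f);
        else                h = 60.0f * ((c.r - c.g) / d + 4.0f);
        if (h < 0.0f) h += 360.0f;
    }
    float s = (mx > 0.0f) ? d / mx : 0.0f;
    return { h, s, mx };
}

RGB hsvToRgb(HSV c)
{
    float C = c.v * c.s;                                                  // chroma
    float X = C * (1.0f - std::fabs(std::fmod(c.h / 60.0f, 2.0f) - 1.0f));
    float m = c.v - C;

    float r, g, b;
    if      (c.h <  60.0f) { r = C; g = X; b = 0; }
    else if (c.h < 120.0f) { r = X; g = C; b = 0; }
    else if (c.h < 180.0f) { r = 0; g = C; b = X; }
    else if (c.h < 240.0f) { r = 0; g = X; b = C; }
    else if (c.h < 300.0f) { r = X; g = 0; b = C; }
    else                   { r = C; g = 0; b = X; }
    return { r + m, g + m, b + m };
}

// Interpolate two RGB colors through HSV space at t in [0, 1],
// taking the shorter way around the hue wheel.
RGB lerpHsv(RGB a, RGB b, float t)
{
    HSV ha = rgbToHsv(a), hb = rgbToHsv(b);
    float dh = hb.h - ha.h;
    if (dh >  180.0f) dh -= 360.0f;
    if (dh < -180.0f) dh += 360.0f;
    HSV out = { std::fmod(ha.h + dh * t + 360.0f, 360.0f),
                ha.s + (hb.s - ha.s) * t,
                ha.v + (hb.v - ha.v) * t };
    return hsvToRgb(out);
}

int main()
{
    RGB red  = {1, 0, 0};
    RGB blue = {0, 0, 1};
    RGB mid  = lerpHsv(red, blue, 0.5f);   // lands around magenta/violet
    std::printf("%.3f %.3f %.3f\n", mid.r, mid.g, mid.b);
}

For a gradient with more than two stops you'd apply the same per-segment idea as in your sketch above, just doing the lerp in HSV instead of RGB.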
