GLSL and noise()

Recommended Posts

OK, this is a bit sad, but when are we going to get working noise() functions in GLSL? I could really use them right now...

Well, they're in the specification, I believe; the OpenGL Shading Language book describes them, at least. I tried to use them once myself in a shader I'd written in RenderMonkey, but it obviously didn't work on my hardware (Radeon 9600), so it fell back to software rendering, which was painfully slow...
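
For reference, the built-ins described in the spec are noise1() through noise4(). A minimal fragment shader using them might look like the sketch below (the varying name and the scale factor are just illustrative); on most consumer cards of this era they either force a software fallback or aren't implemented at all:

// GLSL 1.10 fragment shader using the built-in noise1() from the spec.
// On much consumer hardware this hits a software path or returns nothing
// useful, which is exactly the problem discussed in this thread.
varying vec3 objectPos;   // assumed to be passed in from the vertex shader

void main()
{
    // noise1() returns a value in roughly [-1, 1]; remap to [0, 1].
    float n = 0.5 + 0.5 * noise1(objectPos * 4.0);
    gl_FragColor = vec4(vec3(n), 1.0);
}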

Technically speaking, the NV40 line of cards could do noise(), since it has a high enough instruction count; depending on how it's implemented, though, it could still be painfully slow.

I'm trying to work out how to use the simplex noise code mentioned in ATI's GDC05 GLSL presentation (see the Forum FAQ for details).
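
In the meantime, here is a minimal sketch of doing the noise in the shader itself. This is not the ATI/simplex code from that presentation, just a cheap hash-based 2D value noise to show the general shape of a procedural replacement for noise():

// Not the GDC05 simplex code -- just a small hash-based 2D value noise.
// hash() maps an integer lattice point to a pseudo-random value in [0, 1].
float hash(vec2 p)
{
    return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

// Bilinearly interpolated value noise with a smoothstep fade curve.
float valueNoise(vec2 p)
{
    vec2 i = floor(p);
    vec2 f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f);   // smoothstep fade

    float a = hash(i);
    float b = hash(i + vec2(1.0, 0.0));
    float c = hash(i + vec2(0.0, 1.0));
    float d = hash(i + vec2(1.0, 1.0));

    return mix(mix(a, b, u.x), mix(c, d, u.x), u.y);
}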

How interesting; I was just chatting with a friend a couple of hours ago about how nice it would be if noise() were supported on something other than Wildcats.

I'm sure it won't be too long until noise is available on commodity graphics hardware. Until then you can use the method Perlin describes in GPU Gems, i.e. generate a 3D texture filled with noise and sample that from the shader...
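
A rough sketch of that approach on the shader side; the sampler name, the octave count, and the assumption that the application has already uploaded a small tileable 3D noise texture with REPEAT wrapping are all mine:

// Sketch of the 3D-noise-texture approach: the application fills a small
// (e.g. 64^3) tileable texture with noise values, and the shader builds a
// fractal sum of a few octaves from it to approximate noise().
uniform sampler3D noiseTex;   // assumed pre-filled noise texture
varying vec3 objectPos;       // assumed to come from the vertex shader

float fractalNoise(vec3 p)
{
    float sum  = 0.0;
    float amp  = 0.5;
    float freq = 1.0;
    for (int i = 0; i < 4; ++i)          // four octaves
    {
        sum  += amp * texture3D(noiseTex, p * freq).r;
        amp  *= 0.5;
        freq *= 2.0;
    }
    return sum;
}

void main()
{
    float n = fractalNoise(objectPos);
    gl_FragColor = vec4(vec3(n), 1.0);
}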
