GLSL and noise()

3 comments, last by Kuladus 18 years, 8 months ago
Ok, this is a bit sad, but when are we going to get noise() for GLSL? I could really use it right now...
Well, they're in the specification, I believe; the OpenGL Shading Language book describes them, at least. I tried to use them once in a shader I'd written in RenderMonkey, but it obviously didn't work on my hardware (Radeon 9600), so it fell back to software rendering, which was painfully slow...
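For reference, the GLSL specification does define built-in noise functions (noise1 through noise4, returning values in [-1, 1]). A minimal fragment shader using them might look like the sketch below; the varying name is illustrative, and as noted above, on most hardware of this era the call either falls back to software or just returns zero:

```glsl
// Hypothetical fragment shader calling the noise1() built-in from the
// GLSL spec. "position" is assumed to be passed from the vertex shader.
varying vec3 position;

void main()
{
    // Spec says the result is in [-1, 1]; remap to [0, 1] for display.
    float n = noise1(position * 4.0);
    gl_FragColor = vec4(vec3(0.5 + 0.5 * n), 1.0);
}
```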
Technically speaking, the NV40 line of cards could do noise(), as it has a high enough instruction count; depending on how it's implemented, though, it could still be painfully slow.

I'm trying to work out how to use the simplex noise code mentioned in ATI's GDC05 GLSL presentation (see the Forum FAQ for details).
How interesting, I was just chatting with a friend a couple hours ago about how it would be nice if noise() was supported on something other than Wildcats.
I'm sure it won't be too long until noise is available on commodity graphics hardware. Until then you can use the method Perlin describes in GPU Gems, i.e. generate a 3D texture filled with noise...
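The shader side of that workaround might look something like this sketch. It assumes the application has uploaded a small tileable 3D texture (say 64^3, GL_REPEAT wrapping, linear filtering) filled with random values; the sampler and varying names are made up for illustration:

```glsl
// Texture-based noise workaround: sum a few octaves of a precomputed
// random 3D texture instead of calling the (unsupported) noise() built-in.
uniform sampler3D noiseTex;  // assumed: filled with random values by the app
varying vec3 position;

float fbm(vec3 p)            // fractal sum of the stored noise
{
    float sum = 0.0;
    float amp = 0.5;
    for (int i = 0; i < 4; i++) {
        // Texture stores values in [0, 1]; remap each sample to [-1, 1].
        sum += amp * (2.0 * texture3D(noiseTex, p).r - 1.0);
        p   *= 2.0;          // double the frequency each octave
        amp *= 0.5;          // halve the amplitude
    }
    return sum;
}

void main()
{
    float n = fbm(position * 0.25);
    gl_FragColor = vec4(vec3(0.5 + 0.5 * n), 1.0);
}
```

With GL_REPEAT and linear filtering, the hardware does the wrapping and trilinear interpolation for free, which is what makes this so much faster than a software noise() fallback.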

This topic is closed to new replies.
