Underwater distortion

19 comments, last by _the_phantom_ 19 years, 9 months ago
I was talking to a friend about Quake 1 the other day, and I remembered the effect Quake used when you go underwater. I was just wondering: is there any way to do this in OpenGL? It's obviously direct manipulation of the video buffer, but how do I get access to it in OpenGL?
Theoretically, you could use glReadPixels and glDrawPixels.

In practice, this is far too slow to do. I don't know of any good way to do this.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
Easiest way is with a vertex shader. After the matrix has been applied, check the x value and, depending on it, displace in the y direction by a given amount. If you have no access to vertex shaders, you could render the entire level to a texture, then display a few quads on screen using that texture and displace the texture coords.
How about rendering to texture, then mapping that onto a tessellated/skewed quad? You could always play with the texture coordinates on those vertices as well.

I guess the best way to do it would be a pixel shader though (also with RTT as above). Since you can work out sin/cos in shaders now AFAIK, it shouldn't be that hard to offset the texture coordinates.

Just brainstorming...
Quote:Original post by Ready4Dis
Easiest way is with a vertex shader. After the matrix has been applied, check the x variable, and depending on it's value, displace in the y direction a given amount.


This won't work right, at least I don't think so. The underwater warp is clearly a framebuffer effect: it distorts everything on screen. Displacing vertices won't look the same, and it might not even work correctly.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.
A little (OK, waaaay) off topic: In reality, there is no such distortion when you go underwater[wink]. Have fun with your implementation though.
apply them on the y axis in the vertex program after matrix projection
I'm a little worried about how tessellated the quad would have to be to make the vertex shader implementation look good (we're talking about a smooth trig curve here). I don't think doing this in the pixel shader would be much of a performance hit. You could still work out the y-offset on a per-vertex basis and let it lerp for each of the pixels... this would still look a lot better than simply moving the vertices.

When you do it though, try to get rid of those nasty "wrap-around" effects in the posted image ;)
First, thanks to everyone who took the time to help me!
But after some time trying to achieve this effect, I'm still kinda lost...

Pixels Shaders: well, a little too advanced for me yet ^_^
and also, not all cards support them, so, pixel shaders would be the last resource.

I tried rendering the whole screen to a texture for later use on a mesh, but a lot of detail was lost (even using a 512x512 texture), so that was no good.

And then, glReadPixels... the effect worked out great, but it was so slow it killed the game's performance, even at lower resolutions...

So, I ask again: is there no other way besides those three? I'm not an OpenGL expert, but come on, an effect that could be done under DOS can't be done by modern graphics cards? Is there no way to retrieve the pixels on the screen that is faster than glReadPixels?
Quote:Original post by SLotman
an effect that could be done on DOS cannot be done by modern graphics cards?


Don't you love the irony?


The only efficient way to do this is through a pixel shader. Just sit down and learn, they're not as hard as they sound.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

This topic is closed to new replies.
