Cloud/smoke/thing algorithm (well, vague idea) that's probably too slow/looks crappy


Does this sound at all plausible? OK: your scene is rendered normally in one pass (without the vaporous objects), and then the clouds in view are rendered as white on black (you'd need some kind of sorting, or to render the rest of the scene in yet another pass in black to cover the clouds) to a texture/backbuffer. Then you run the fastest blur filter you can think of on this backbuffer, and render it over the original scene (minus the clouds) with some type of additive blending. It seems to me that this would be too slow, but what do you think?

[edited by - cowsarenotevil on October 9, 2003 8:09:09 PM]
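In outline, the proposed passes would look something like this (pure pseudocode; every name is made up and no particular API is implied):

    // Sketch of the proposed pass structure (all names hypothetical).
    renderScene(opaqueObjects);                // pass 1: the normal scene
    bindRenderTarget(cloudBuffer);             // offscreen texture/backbuffer
    renderObjects(clouds, SOLID_WHITE);        // clouds as white...
    renderObjects(opaqueObjects, SOLID_BLACK); // ...masked by black occluders
    fastBlur(cloudBuffer);                     // fastest blur available
    compositeAdditive(cloudBuffer, screen);    // blend over the original scene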

---
Hmm... a few issues I can think of:

* Occlusion: after the blur, clouds will show on the edges of occluded objects. If you have a smoke plume 20 feet away and a flagpole 10 feet away, you want the edge between the flagpole and the plume to be sharp, but it would be blurred.

* Blurring: fast blurs don't tend to blur very much.

* Blending: the blending should not be additive. That's not how smoke works. Smoke is a partially opaque object and should be rendered as such. That part wouldn't be too hard to fix, though (see the blend modes below).
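For reference, the two blend modes in OpenGL terms (the calls are real; a working GL context is assumed):

    // Additive blending: dst = src + dst. Brightens everything it touches;
    // good for glows, wrong for smoke.
    glBlendFunc(GL_ONE, GL_ONE);

    // "Over" (alpha) blending: dst = src.a * src + (1 - src.a) * dst.
    // This treats the smoke as a partially opaque layer, as described above.
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);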


How appropriate. You fight like a cow.

---
Yes, those are very good points. I just picked additive blending because it would be fastest; you could also mask a solid gray, or even a shaded version of the model, onto it (this could look cool for ghosts, if nothing else). The occlusion is definitely a good point, but I'll work that out last, because it would still work somewhat without it, and it hurts my brain too much right now. Perhaps for the blur you could do a random noise thing and then average pixels in close proximity, but if it's not already obvious, fast image filtering is not something I know anything about.

---
Actually, if the occlusion thing were fixed, this could conceivably be very interesting for ghost/spirit effects. After all, the single accumulation pass would mean that ghosts wouldn't show through each other, which tends to be the convention in most films. That could be cool.


How appropriate. You fight like a cow.

---
Yeah, it could... I pondered the occlusion, and it seems that the way to do it would be to blur into the white area, but not out, so that nothing could possibly be covered with white that wasn't covered by the original objects. It would also be more realistic, in that the models wouldn't get bigger (and if you didn't like that, you could just make your models fatter to begin with).

EDIT: This would create a "dent" in the object where it's occluded, but it would probably still look better.

[edited by - cowsarenotevil on October 10, 2003 6:38:05 PM]
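One way to read that "blur in, but not out" idea is as a per-pixel mask step (a sketch; the names are made up, and the mask is assumed binary before the blur):

    #include <cstddef>
    #include <vector>

    // Keep the blurred value only where the original (binary) mask was
    // white, so the cloud blurs inward but never grows past its silhouette.
    std::vector<float> blurInwardOnly(const std::vector<float>& blurred,
                                      const std::vector<float>& originalMask) {
        std::vector<float> out(blurred.size());
        for (std::size_t i = 0; i < blurred.size(); ++i)
            out[i] = (originalMask[i] > 0.0f) ? blurred[i] : 0.0f;
        return out;
    }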

---
>> * Blurring: fast blurs don't tend to blur very much.

Well, this depends. Summed-area tables (Google for more information) can do box-filter blurs of arbitrary size, with an arbitrary blur radius for each pixel, and that's just four table lookups per pixel (depth of field has been done in real time with this method, in a pixel shader).

Of course, blurring in image space doesn't give perfect results, but that's not what people expect of games.

- Mikko Kauppila

---
OMG, I just had a sudden inspiration. All right: instead of blurring anything, you'd simply make the "white" polygons shaded with a head-on diffuse light. That would probably be a good enough approximation, you wouldn't have to deal with occlusion at all, and it'd be fast. I think I'm going to patent this.
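A minimal sketch of that head-on shading, evaluated per vertex (the struct and function names are made up):

    struct Vec3 { float x, y, z; };

    float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Head-on diffuse light: full brightness where the surface faces the
    // camera, falling to zero at silhouette edges; that falloff is what
    // stands in for the blur. 'normal' and 'toCamera' are unit vectors.
    float mistIntensity(const Vec3& normal, const Vec3& toCamera) {
        float d = dot(normal, toCamera);
        return d > 0.0f ? d : 0.0f;
    }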

---
quote:
Original post by cowsarenotevil
OMG, I just had a sudden inspiration. All right: instead of blurring anything, you'd simply make the "white" polygons shaded with a head-on diffuse light. That would probably be a good enough approximation, you wouldn't have to deal with occlusion at all, and it'd be fast. I think I'm going to patent this.


Huh, interesting... so you'd rely on the curvature of the object edges to effect the blur? That could work very well for relatively high-poly meshes of organic objects (less so for objects with angular edges).


How appropriate. You fight like a cow.

---
quote:
Original post by Sneftel
quote:
Original post by cowsarenotevil
OMG, I just had a sudden inspiration. All right: instead of blurring anything, you'd simply make the "white" polygons shaded with a head-on diffuse light. That would probably be a good enough approximation, you wouldn't have to deal with occlusion at all, and it'd be fast. I think I'm going to patent this.


Huh, interesting... so you'd rely on the curvature of the object edges to effect the blur? That could work very well for relatively high-poly meshes of organic objects (less so for objects with angular edges).


Yeah, that was what I was thinking. For sharper objects, you could use smooth normals to improve it a bit, but you must remember, most vapours aren't particularly angular. Now for the implementation...

---
UPDATE: It looks, um, weird... I tried it with the teapot (yes, I cheated and used Photoshop, if you're wondering how I did it so fast, but it shows the effect very clearly) and it looks mostly good, except the handle and spout look awful (they're not really attached to the teapot, which is why) and the base isn't that great, because the angle is slightly sharper. But nevertheless, it looks good, and with relatively curvy objects (clouds, smoke, even ghosts if modeled well) it should be fine.

---
Your first idea is rather interesting, still...

To solve your occlusion problem, blur before you render the geometry. The simplest way to blur would be to render to a smaller texture, and rely on texture filtering to stretch it.
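A CPU sketch of that "render small, stretch back" idea, assuming a grayscale mask stored as floats with even dimensions (all names are made up):

    #include <algorithm>
    #include <vector>

    // "Render to a smaller texture": average 2x2 blocks down to half size.
    std::vector<float> downsample(const std::vector<float>& src, int w, int h) {
        std::vector<float> dst((w / 2) * (h / 2));
        for (int y = 0; y < h / 2; ++y)
            for (int x = 0; x < w / 2; ++x)
                dst[y * (w / 2) + x] = 0.25f *
                    (src[(2 * y) * w + (2 * x)]     + src[(2 * y) * w + (2 * x + 1)] +
                     src[(2 * y + 1) * w + (2 * x)] + src[(2 * y + 1) * w + (2 * x + 1)]);
        return dst;
    }

    // "Rely on texture filtering to stretch it": bilinear lookup with
    // normalized coordinates u, v in [0, 1].
    float sampleBilinear(const std::vector<float>& img, int w, int h,
                         float u, float v) {
        float fx = u * (w - 1), fy = v * (h - 1);
        int x0 = (int)fx, y0 = (int)fy;
        int x1 = std::min(x0 + 1, w - 1), y1 = std::min(y0 + 1, h - 1);
        float tx = fx - x0, ty = fy - y0;
        float top = img[y0 * w + x0] * (1 - tx) + img[y0 * w + x1] * tx;
        float bot = img[y1 * w + x0] * (1 - tx) + img[y1 * w + x1] * tx;
        return top * (1 - ty) + bot * ty;
    }

Repeating the downsample before stretching widens the blur, at the cost of blockier gradients, which may explain the complaint about bilinear filtering below.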

---
quote:
Original post by Deyja
Your first idea is rather interesting, still...

To solve your occlusion problem, blur before you render the geometry. The simplest way to blur would be to render to a smaller texture, and rely on texture filtering to stretch it.


Wait, how can I blur before rendering the geometry? The only way this will work is with a 2D backbuffer, and you can't really do depth testing on a 2D image... Also, I tried the stretchy thing already, and it looks horrible with bilinear filtering.

---
Yeah, I definitely need a better blur, because there's basically no point in doing it based only on the shading; it's just like generic alpha blending with shading anyway (except that you don't need to deal with depth at all if you make a new backbuffer for each blended object, but that will probably end up slower than a halfway decent sorting algorithm...).

---
It would be interesting to combine this technique with metaballs. It seems like the two would go well together in terms of making plumes of smoke.


How appropriate. You fight like a cow.

---
Well, you could probably achieve a nearly identical effect by using circular 2D particles and blurring them, and it would use less power.

EDIT: What might be cool is to have a fairly low-poly object with each vertex acted on by some algorithm to make it drift up; the vertices would slowly fade away and be respawned as part of new polygons (a rough sketch follows below). There's probably no hope of that working in real time on top of some blurring algorithm and two passes already...

EDIT: Someone must know a fast blur algorithm. If you do, post it, even if it sucks. If it works at all, I want it.

[edited by - cowsarenotevil on October 11, 2003 10:48:52 PM]
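A rough sketch of the drift-and-respawn idea from the first EDIT above (every name and constant here is made up):

    #include <cstdlib>
    #include <vector>

    struct MistVertex {
        float x, y, z;  // position
        float alpha;    // fades toward zero as the vertex rises
    };

    // Per-frame update: vertices drift upward, fade out, and respawn
    // near the origin of the plume once fully transparent.
    void updateMist(std::vector<MistVertex>& verts, float dt) {
        for (MistVertex& v : verts) {
            v.y     += 0.5f * dt;   // drift up
            v.alpha -= 0.2f * dt;   // slowly fade away
            if (v.alpha <= 0.0f) {  // respawn as part of new geometry
                v.x = (std::rand() / (float)RAND_MAX - 0.5f) * 0.1f;
                v.y = 0.0f;
                v.z = (std::rand() / (float)RAND_MAX - 0.5f) * 0.1f;
                v.alpha = 1.0f;
            }
        }
    }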

---
>> EDIT: Someone must know a fast blur algorithm. If you do, post it, even if it sucks. If it works at all, I want it.

Here's the idea of summed area tables in one dimension. Two (or more) dimensions aren't harder.

Let's say you have a one-dimensional array f[x] (x = 0...N-1).

Now create another table F[x] (of size N as well), in which each entry is the sum of f up to that point, that is:

F[x] = sum(i=0..x) f(i)

(this can be done in linear time in the implementation, of course!)

Now, to blur over h pixels around position x, we just do blur = (F[x+h] - F[x-h-1]) / (2h+1).

This is intuitive, since F[x+h] - F[x-h-1] is the sum of all entries of f between x-h and x+h. Then we just divide by the number of pixels (2h+1) to get the average.

In two dimensions you have f[x,y], and each entry of F[x,y] is the sum of all entries of f at or below y and at or to the left of x. That is,

F[x,y] = sum(j=0..y) sum(i=0..x) f(i,j)

(again doable in linear time)

Then blur = (F[x+h,y+h] - F[x-h-1,y+h] - F[x+h,y-h-1] + F[x-h-1,y-h-1]) / (2h+1)^2
(draw it on paper to see why the corner terms cancel correctly)
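In C++, the table construction and the four-lookup blur look like this (a sketch with clamped borders; the names are arbitrary):

    #include <algorithm>
    #include <vector>

    // Build F in one linear pass: F[y*w+x] = sum of f over [0..x] x [0..y].
    std::vector<double> buildSAT(const std::vector<double>& f, int w, int h) {
        std::vector<double> F(w * h);
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x) {
                double left = x > 0 ? F[y * w + (x - 1)] : 0.0;
                double up   = y > 0 ? F[(y - 1) * w + x] : 0.0;
                double diag = (x > 0 && y > 0) ? F[(y - 1) * w + (x - 1)] : 0.0;
                F[y * w + x] = f[y * w + x] + left + up - diag;
            }
        return F;
    }

    // Box blur of radius r at (x, y): four lookups and a divide.
    // The box is clamped at the image borders, so edge pixels average
    // over a smaller neighbourhood.
    double boxBlur(const std::vector<double>& F, int w, int h,
                   int x, int y, int r) {
        int x1 = std::min(x + r, w - 1), y1 = std::min(y + r, h - 1);
        int x0 = std::max(x - r - 1, -1), y0 = std::max(y - r - 1, -1);
        auto at = [&](int xx, int yy) {
            return (xx < 0 || yy < 0) ? 0.0 : F[yy * w + xx];
        };
        double sum = at(x1, y1) - at(x0, y1) - at(x1, y0) + at(x0, y0);
        return sum / ((x1 - x0) * (y1 - y0));
    }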

- Mikko Kauppila

[edited by - uutee on October 12, 2003 1:17:52 AM]

---
Regarding your original idea (rendering to a texture), I tested something similar a while ago.

It works exactly as you described: the scene is rendered without the mist. The mist is then rendered against a black copy of the scene into a texture, blurred, and then alpha-blended back into the scene. And, as expected, you get blurred edges against the occluders.

[screenshots: mist_1.jpg, mist_2.jpg]

And if you can bear installing the Macromedia Shockwave plugin, you can see it in real time here:

http://www.station-zero.com/medion/demos/misty.htm


[edited by - M3d10n on October 12, 2003 1:01:21 PM]

---
Yes, it's a good idea (albeit not entirely new; quite a few variations of that approach are known). An advantage of this approach is that the mist/fog/vapour object can actually be lit, can receive shadows, etc. With some small modifications, you could even have atmospheric volume shadows through the mist.

There are two major issues, but both can be solved:

1) The blurring. Fast box filters exist, of course, but they require copying the texture/framebuffer back into main memory, processing it on the CPU, and copying the result back. That's horribly slow. With shaders, though, some basic blurring can be implemented directly on the 3D card. The choice of filters is obviously limited, but you can get some rather pleasing results.

2) The occlusion. The basic idea, as you mentioned, is to render the vaporous objects against a mask of the scene. Only the zbuffer is needed from the scene here, and render textures (pbuffers) can contain a zbuffer, just like the main framebuffer.

The idea is as follows (a code sketch follows at the end of this post):

* Render your scene without mist.
* Copy the scene zbuffer into the zbuffer of an empty pbuffer.
* Render your vaporous particles into the pbuffer, using z testing. They don't need to be black and white; you can even use colour (which can give nice coloured mist or toxic/radioactive gas effects).
* Blur the pbuffer using shaders (various approaches exist; some need to be combined with the next step, for example jittering).
* Blend the pbuffer back into the scene.

A different variation gets rid of the zbuffer copy overhead but introduces another one. This one is good if you have plenty of fillrate:

* Render the scene without mist into a pbuffer.
* Draw the colour portion of the pbuffer into the framebuffer (which doesn't need a depth buffer, BTW). Use a textured quad for that.
* Clear the colour portion of the pbuffer.
* Continue as in the first method.

Of course, there is still 'real' volumetric rendering. This will give the best results in terms of quality and flexibility, but it is not very performance efficient, although modern hardware can help: using fat buffers, you can partially defer the volume tracing onto the GPU.
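A fragment sketching the first variant in today's terms, with an FBO standing in for the pbuffer. Only the GL calls are real API; the handles and helpers (mistFBO, renderScene, blurTexture, drawFullscreenQuad) are hypothetical, and a working GL context, FBO setup, and compatible depth formats are assumed:

    // Pass 1: the scene without mist, into the default framebuffer.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    renderScene();                                     // hypothetical

    // Copy the scene depth into the offscreen mist target, so the
    // particles are z-tested against the real occluders.
    glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, mistFBO);   // hypothetical handle
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                      GL_DEPTH_BUFFER_BIT, GL_NEAREST);

    // Pass 2: particles into the offscreen target, depth-tested but not
    // depth-written, so the copied zbuffer masks them correctly.
    glBindFramebuffer(GL_FRAMEBUFFER, mistFBO);
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_FALSE);
    renderMistParticles();                             // hypothetical

    // Pass 3: blur the offscreen colour (e.g. a separable shader pass),
    // then alpha-blend the result over the scene on a fullscreen quad.
    blurTexture(mistColorTex);                         // hypothetical
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    drawFullscreenQuad(mistColorTex);                  // hypothetical
    glDepthMask(GL_TRUE);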

---
Oh, and BTW: if you use additive blending instead of standard alpha blending, you can use the very same technique to get impressive light glows and coronas.

---
For the occlusion-dependent blurring, be sure to read on the ATI page how they did depth of field: they took care of the occlusion part too, so as not to blur far-away objects over near ones.

The idea is simple: for every sample, just check that it's not too far away, in depth, from the point where you actually want to compute the blurred smoke. If it is, drop the sample; otherwise accumulate it with the remaining ones, and output the blur.

It was some DX9 pixel-shader paper about depth of field.
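A CPU-side sketch of that sample-rejection idea (not the actual ATI shader; all names here are hypothetical):

    #include <cmath>

    // Gather a box of samples, rejecting any whose depth differs from the
    // centre pixel by more than 'tolerance', so near occluders stay sharp.
    float depthAwareBlur(const float* color, const float* depth,
                         int w, int h, int cx, int cy,
                         int radius, float tolerance) {
        float sum = 0.0f;
        int count = 0;
        float centerDepth = depth[cy * w + cx];
        for (int y = cy - radius; y <= cy + radius; ++y)
            for (int x = cx - radius; x <= cx + radius; ++x) {
                if (x < 0 || y < 0 || x >= w || y >= h) continue;
                if (std::fabs(depth[y * w + x] - centerDepth) > tolerance)
                    continue;                  // drop the sample
                sum += color[y * w + x];
                ++count;
            }
        return count > 0 ? sum / count : color[cy * w + cx];
    }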




If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

davepermen.net
