Per-pixel motion blur

Started by
5 comments, last by Josh Klint 13 years ago
I implemented a velocity buffer for both moving and static objects. Once you have this data, how do you write a good per-pixel motion blur shader?

I wrote a shader that performs lookups along the reverse direction of velocity. So if a pixel is moving to the left, it will look up pixels to the right of it and blend these with the color at that pixel:
#version 150

uniform sampler2D texture0;   // color
uniform sampler2DMS texture1; // motion vector
uniform vec2 buffersize;
uniform vec4 drawcolor;

#define MOTIONBLURSAMPLES 64

in vec2 ex_texcoords0;

out vec4 out_Color;

void main(void)
{
	vec4 color = vec4(0.0); // must be initialized before accumulating
	// Decode the stored velocity from [0,1] back to [-1,1], negated so we
	// step against the direction of motion.
	vec2 motion = -(texelFetch(texture1, ivec2(ex_texcoords0 * buffersize), 0).xy * 2.0 - 1.0);

	vec2 motionblurtexcoord = ex_texcoords0;
	float samples = 0.0;
	float strength = 1.0;

	for (int i = 0; i < MOTIONBLURSAMPLES; i++) {
		// Step 5 pixels per iteration along the reversed velocity.
		motionblurtexcoord += motion / buffersize * 0.5 * 10.0;
		color += texture(texture0, motionblurtexcoord) * strength;
		samples += strength;
		strength *= 0.98; // fade out more distant samples
	}

	out_Color = color / samples;
}


This is not at all a realistic motion blur. Instead of a moving object leaving a trail, each pixel gathers samples from behind its path of motion:
[attachment=1772:motionblur.png]

Imagine a truck driving past the camera, in front of a jungle. You would want the truck to leave a faint trail behind it as it moves to the left. Instead, you would see the rear bumper of the truck turning green as it picked up the color of the jungle in the background, while the jungle pixels just behind the truck would be completely unaffected, because they are not in motion. Here's what my shader looks like when an object spins counterclockwise:
[attachment=1773:Image10.png]

The proper way to do this would be to look up the velocity at each pixel, and then copy that pixel's color to the pixel it "lands" on, by offsetting the coordinates by the velocity vector. But that isn't possible without scatter writing. Is there a way to do this properly?

10x Faster Performance for VR: www.ultraengine.com

I highly recommend checking out Eric Lengyel's article in Game Engine Gems Vol. 1.
Holy crap I started a blog - http://unobvious.typepad.com/
Looking at chapter 17, I think he is using basically the same technique for the post-filter.

The proper way would be to sample an n x n grid and add pixel samples that have velocities that intersect with the pixel you are working on.
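As a rough sketch of that idea (the uniform names, kernel radius, and thresholds here are my own illustrative choices, not from any particular engine): for each pixel, examine a small neighborhood, and accumulate any neighbor whose velocity sweeps it across the current pixel.

```glsl
// Sketch of a gather-based approximation of scatter: a neighbor contributes to
// this pixel if stepping along that neighbor's velocity passes near this pixel.
// Assumes velocityTex already holds decoded [-1,1] screen-space velocity.
#version 150

uniform sampler2D colorTex;    // scene color (hypothetical name)
uniform sampler2D velocityTex; // per-pixel velocity (hypothetical name)
uniform vec2 buffersize;

in vec2 ex_texcoords0;
out vec4 out_Color;

void main(void)
{
	const int RADIUS = 4; // n x n grid with n = 2*RADIUS+1
	vec2 pixel = ex_texcoords0 * buffersize;
	vec4 accum = texture(colorTex, ex_texcoords0);
	float weight = 1.0;

	for (int y = -RADIUS; y <= RADIUS; y++) {
		for (int x = -RADIUS; x <= RADIUS; x++) {
			vec2 neighbor = pixel + vec2(x, y);
			vec2 uv = neighbor / buffersize;
			vec2 vel = texture(velocityTex, uv).xy * buffersize; // pixels/frame
			// Project the offset onto the neighbor's velocity segment and
			// test whether the swept path passes within one pixel of us.
			vec2 toCenter = pixel - neighbor;
			float t = clamp(dot(toCenter, vel) / max(dot(vel, vel), 1e-5), 0.0, 1.0);
			float dist = length(toCenter - vel * t);
			if (dist < 1.0 && length(vel) > 0.5) {
				accum += texture(colorTex, uv);
				weight += 1.0;
			}
		}
	}

	out_Color = accum / weight;
}
```

This only catches velocities shorter than the kernel radius, so very fast motion still needs a larger search or a dilation pre-pass.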


Blurring both forward and backwards along the velocity might be an okay solution to this. There is still inherent error, but it's probably okay to do it like this, since it's impossible to improve without scatter write.
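A minimal sketch of that centered blur, assuming a velocity texture already decoded to [-1,1] (texture names and the sample count are illustrative):

```glsl
// Sketch: sample symmetrically along +velocity and -velocity so the blur
// extends both in front of and behind the moving surface.
#version 150

uniform sampler2D colorTex;    // scene color (hypothetical name)
uniform sampler2D velocityTex; // decoded velocity (hypothetical name)

in vec2 ex_texcoords0;
out vec4 out_Color;

#define SAMPLES 16

void main(void)
{
	vec2 velocity = texture(velocityTex, ex_texcoords0).xy;
	vec2 step = velocity / float(SAMPLES); // scale to taste per engine

	vec4 color = texture(colorTex, ex_texcoords0);
	for (int i = 1; i <= SAMPLES / 2; i++) {
		color += texture(colorTex, ex_texcoords0 + step * float(i));
		color += texture(colorTex, ex_texcoords0 - step * float(i));
	}
	// 1 center sample + SAMPLES symmetric samples
	out_Color = color / float(SAMPLES + 1);
}
```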


It's a tricky problem, since like you said it's really a scatter problem that you're trying to solve through gathering. A few things you can try...
  1. Stretch the vertices in the direction opposite the velocity. This helps keep adjacent frames from looking too "separated", particularly for skinned meshes. However it can cause a lot of artifacts if you stretch along the vertex normal, since in some cases it can cause the triangles to "tear apart" (think a cube with normals that point in the direction of the face). What you really want is to stretch along a shared, averaged normal. It also doesn't really work for objects that just rotate, like a wheel. It can also help to only do this when rendering velocity separately from your main pass (or G-Buffer pass), since it makes the artifacts much less noticeable.
  2. "Dilate" your velocity either by blurring, or by pulling high velocities into adjacent pixels
  3. Use the previous frame's velocity and/or color buffers. For objects that don't move really really fast, the previous frame's velocity or color buffer can give you information to "fill in the gaps" outside of the object's silhouette.
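The dilation idea in point 2 could be sketched as a small pre-pass that replaces each pixel's velocity with the largest-magnitude velocity in its neighborhood (texture names and radius are my own illustrative choices):

```glsl
// Sketch of velocity dilation: pull the strongest nearby velocity into each
// pixel so fast-moving silhouettes "bleed" past the object's edge.
#version 150

uniform sampler2D velocityTex; // decoded [-1,1] velocity (hypothetical name)
uniform vec2 buffersize;

in vec2 ex_texcoords0;
out vec4 out_Color;

void main(void)
{
	const int RADIUS = 2;
	vec2 best = vec2(0.0);
	float bestLen = 0.0;

	for (int y = -RADIUS; y <= RADIUS; y++) {
		for (int x = -RADIUS; x <= RADIUS; x++) {
			vec2 uv = ex_texcoords0 + vec2(x, y) / buffersize;
			vec2 v = texture(velocityTex, uv).xy;
			float len = dot(v, v); // squared length is enough for comparison
			if (len > bestLen) {
				bestLen = len;
				best = v;
			}
		}
	}

	out_Color = vec4(best, 0.0, 1.0);
}
```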
You should check out the presentations for Lost Planet, Crysis, Little Big Planet, and Killzone 2. They all have information about their motion blur implementations.
LucasArts' recent "Real-time Frame Rate Up-conversion" presentation contains some interesting ideas for fixing up these common motion blur artifacts.
Thanks for the tips. This is why I love GameDev.net!


