Posted 09 February 2012 - 05:10 AM
Should it not be possible to create a good approximation in practically zero time by rendering a series of textured quads with additive blending? I'm considering unsigned distance for now because it's conceptually easier to grok, but I guess signed works just the same by dividing by 2 and offsetting by 1/2.
The idea is this:
A distance map encodes the distance to the nearest set pixel (as a value from 0 to 255, assuming an 8-bit texture). Pixels that are "in" or very close to it are white or nearly so, and pixels that are far away are black. The closer a pixel gets to "in", the lighter the shade of grey it takes.
Put another way: the closer a pixel is to "in", the higher the likelihood that a uniformly distributed random sample from a disk with a certain radius will "hit" an "in" pixel.
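To sanity-check that claim, here's a quick Monte Carlo sketch (my own, not from any library): it estimates the probability that a uniform disk sample lands inside a solid "in" region (modelled as a half-plane) whose edge is a given distance away. The disk radius and sample count are arbitrary choices.

```python
import random

def hit_probability(dist, radius=8.0, n=50000, seed=0):
    # Fraction of uniform samples from a disk of the given radius that
    # land inside a solid "in" half-plane whose edge is `dist` units
    # away from the pixel being shaded.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # uniform point in a disk via rejection sampling
        while True:
            dx = rng.uniform(-radius, radius)
            dy = rng.uniform(-radius, radius)
            if dx * dx + dy * dy <= radius * radius:
                break
        if dx <= -dist:  # sample fell on the "in" side of the edge
            hits += 1
    return hits / n
```

Running it for increasing distances shows the probability falling off monotonically, which is exactly the gradient the accumulation buffer would pick up.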
Thus: enable additive blending, set the output scale to 1/256, and render 256 fullscreen quads, each biased by a uniformly distributed random offset within some small radius (ideally 128, I figure?). The 256 quads could all go into one vertex buffer and be rendered with a single draw call.
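As a CPU stand-in for what the blend unit would do, here is a small simulation of the idea (function name, mask layout, and the tiny disk radius are all my own illustrative choices): each "quad" draws the mask shifted by a random disk offset, scaled by 1/N, into an accumulation buffer.

```python
import random

def accumulate_jittered(mask, n_quads=256, radius=4.0, seed=0):
    # mask[y][x] is 1 for "in" pixels, 0 otherwise.
    # Each pass adds a jittered copy of the mask scaled by 1/n_quads,
    # mimicking additive blending of n_quads offset fullscreen quads.
    rng = random.Random(seed)
    h, w = len(mask), len(mask[0])
    acc = [[0.0] * w for _ in range(h)]
    for _ in range(n_quads):
        # uniform offset inside a disk (rejection sampling)
        while True:
            dx = rng.uniform(-radius, radius)
            dy = rng.uniform(-radius, radius)
            if dx * dx + dy * dy <= radius * radius:
                break
        ox, oy = round(dx), round(dy)
        for y in range(h):
            for x in range(w):
                sx, sy = x + ox, y + oy
                # out-of-bounds samples read as "not in"; a real
                # renderer would clamp or wrap the texture instead
                if 0 <= sx < w and 0 <= sy < h and mask[sy][sx]:
                    acc[y][x] += 1.0 / n_quads
    return acc
```

With a mask whose left half is "in", the accumulated brightness is near 1 deep inside the region, falls off across the boundary, and reaches 0 once a pixel is farther from the region than the jitter radius; so the usable gradient only extends that far out.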
Pixels will end up brighter the closer they are to an "in" texel. If that isn't accurate enough, use a 16-bit texture (or enable/disable writes to the three colour channels of a typical RGBA8 texture one after another) and do the same thing a few thousand times.
Graphics cards are ridiculously fast at drawing textured quads. At a "typical" texture size of, say, 512x512, I would expect to draw a few hundred textured quads at several hundred FPS, even on a not-so-high-end card.