# image manipulation algorithm


## Recommended Posts

I'm developing an application that will warp images with "goo"-like effects (like Liquify in Photoshop). Does anyone know where I can find resources for achieving these effects? Thanks, toxie

##### Share on other sites
A simple implementation of liquify:

You have two maps: the original color map and a distortion map.
The color map is an ordinary RGB(A) surface, and the distortion map is a surface with a 2D vector field encoded in its color components.

To produce the result image, the color map is sampled with each sample offset by the distortion map: the distortion vector is added to the original sampling position per pixel (a dependent texture read).

To construct the distortion field, render some geometry into the distortion channel. To emulate Photoshop, use an area of Gaussian-distributed vectors so that the edge of the distortion is as smooth as possible. The colors (vectors) determine the per-pixel offset of the result.
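To make the vector-encoding idea concrete, here is a minimal sketch (my illustration, not code from this thread; all names are hypothetical) of packing a 2D offset vector into 8-bit red/green channel values and unpacking it again:

```python
# Hedged sketch: pack a 2D distortion vector (components in -1..1) into
# 8-bit red/green channel values, and unpack it again.

def encode_vector(dx, dy):
    # Map -1..1 to 0..255 so the vector fits in ordinary color channels.
    r = int(round((dx * 0.5 + 0.5) * 255))
    g = int(round((dy * 0.5 + 0.5) * 255))
    return r, g

def decode_vector(r, g):
    # Inverse mapping: 0..255 back to roughly -1..1.
    return r / 255.0 * 2.0 - 1.0, g / 255.0 * 2.0 - 1.0
```

Note that a zero vector encodes to mid-gray, which is why an "empty" distortion map looks like a flat gray surface.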

-Nik

##### Share on other sites
Thank you very much. This is how I understand what you wrote:

1) Construct a distortion map by setting the color values to numbers that represent vectors.

2) To construct the distortion map vectors:
a) define the geometric area of the distortion
b) use a Gaussian distribution of vectors in that area

3) Using the distortion map, offset each pixel in the image by the vector stored in the corresponding "pixel" of the distortion map.
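Step 3) can be sketched in software terms. A minimal grayscale illustration (hypothetical names, not code from this thread; the image is a 2D list of rows):

```python
# Hedged sketch of step 3: for each output pixel, read the distortion map,
# offset the sampling position by that vector, and fetch the color there.

def apply_distortion(color, distortion, scale=4.0):
    # color: 2D list of pixel values; distortion: 2D list of (dx, dy) in -1..1.
    h, w = len(color), len(color[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = distortion[y][x]
            # Clamp the offset sample position to the image bounds.
            sx = min(max(int(x + dx * scale), 0), w - 1)
            sy = min(max(int(y + dy * scale), 0), h - 1)
            out[y][x] = color[sy][sx]  # the "dependent read"
    return out
```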

Does that match what you said? I could definitely use more help with step 3). I guess I could look for mathematics references, unless you know of anywhere that's good.

Thanks again,
toxie

##### Share on other sites

As for the right question:

You'd probably need to use a fragment shader to perform per-fragment texture coordinate distortion.
As I primarily use Direct3D instead of OpenGL, I'm not sure of the exact syntax; however, the concept is very simple, as the following fragment shader snippet demonstrates (not complete code as-is, intended for concept demonstration):

```hlsl
// input textures:
sampler colormap;
sampler distortionmap;

// one of the fragment/pixel shader input parameters, representing the
// undistorted texture coordinate of the screen quad; initialized by the VS, of course :)
float2 tc;

// some arbitrary value for use as the distortion strength
float distortionscale = 0.01f;

// take the distortion vector from the red and green components of the distortion map:
float2 distortion = tex2D(distortionmap, tc).rg;

// if the distortion texture is fixed point, you should scale the distortion
// vectors from 0...1 to -1...1 here (code omitted).

// finally, perform a dependent texture read to effectively distort the
// color map by the values from the distortion map:
return tex2D(colormap, tc + distortion * distortionscale);
```

-Nik

[Edited by - Nik02 on October 12, 2004 6:57:11 AM]

##### Share on other sites
Oh man, I'm sorry,

I meant to ask about 2b, not 3; that's why the question was confusing. Sorry about that. However, for the question you did answer, I'm going to have to do something more low-level, since I'm on a Pocket PC and only have a "DirectDraw-like" API available. That's OK, though; I can probably figure something out, hopefully.

If you know of any references that would help with defining a Gaussian distribution of vectors, though, I would really appreciate it!

thanks again,
toxie

##### Share on other sites
The Gaussian function. In this particular application, the vector direction can be derived from the brush movement direction alone, with only the weight of the addition determined by the Gaussian function, seeded with the distance from the brush center. Of course, there are no limits to the technique other than your imagination.

The function, when plotted on a texture, should represent a blurred white circle on black background.
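As a concrete sketch of the above (my illustration, not code from this thread; names are hypothetical): the weight of each distortion vector falls off with distance from the brush center according to the Gaussian function, and the direction comes from the brush movement:

```python
import math

def gaussian_weight(dist, sigma):
    # Classic Gaussian falloff: 1.0 at the center, smoothly approaching 0.
    return math.exp(-(dist * dist) / (2.0 * sigma * sigma))

def brush_vector(px, py, cx, cy, move_dx, move_dy, sigma):
    # Direction from the brush movement; weight from distance to the brush center.
    dist = math.hypot(px - cx, py - cy)
    w = gaussian_weight(dist, sigma)
    return move_dx * w, move_dy * w
```

Plotting `gaussian_weight` of the distance to the center over a texture gives exactly the blurred white circle on black described above.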

Performing the distortion on raw pixel data in software is essentially the same as the fragment shader technique: you offset the sampling points with the distortion map. However, the data lives in your own arrays rather than in textures. For smooth results, use a bilinear or bicubic filter to look up the original color data [smile]
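The bilinear lookup mentioned above can be sketched as follows (illustrative grayscale version with hypothetical names, not code from this thread): instead of snapping the offset sample position to the nearest pixel, blend the four surrounding pixels by the fractional position:

```python
import math

def bilinear_sample(img, x, y):
    # Blend the four pixels surrounding the (possibly fractional) position (x, y).
    h, w = len(img), len(img[0])
    x = min(max(x, 0.0), w - 1.0)
    y = min(max(y, 0.0), h - 1.0)
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1.0 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1.0 - fx) + img[y1][x1] * fx
    return top * (1.0 - fy) + bot * fy
```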

-Nik

##### Share on other sites
Wow, that seemed too easy for you. Thanks very much; I really appreciate your help. I'll post the results back here, but it won't be for a while. Thanks again,

toxie
