
Hi!
I'm trying to use this radial shader on spheres, but unsuccessfully. The problem is related to screen coordinates: the shader only works if the sphere is at the center of the screen, as this picture shows:

 

[Screenshot: sphere at the center of the screen, radial effect rendered correctly]

If it's not, this is what happens:

[Screenshot: sphere off-center, radial effect still anchored to the screen center]

 

I discovered that if I convert the position of the sphere to screen coordinates, the shader works fine, as I show in this code using the function cam.worldToScreen():

 

 

// render the sphere into an FBO
fbo.begin();
ofClear(0);
cam.begin();
shader.begin();
sphere.setRadius(10);
sphere.setPosition(v.x, v.y, v.z);
sphere.draw();
shader.end();
cam.end();
fbo.end();

// copy the depth attachment into its own FBO
depthFbo.begin();
ofClear(0);
fbo.getDepthTexture().draw(0, 0);
ofSetColor(255);
depthFbo.end();

// radial (light-scattering) post-process pass
radialBuffer.begin();
ofClear(0);
radial.begin();
f = cam.worldToScreen(v); // sphere position in screen coordinates
radial.setUniform3f("ligthPos", f.x, f.y, f.z);
radial.setUniformTexture("depth", depthFbo.getTexture(), 1);
fbo.draw(0, 0);
radial.end();
radialBuffer.end();

radialBuffer.draw(0, 0);
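For reference, cam.worldToScreen() used above conceptually multiplies the point by the combined view-projection matrix, does the perspective divide, and maps the result to viewport pixels. A hypothetical standalone sketch of that math (this is not oF's actual implementation, and the identity matrix below is only for illustration):

```python
import numpy as np

def world_to_screen(point, view_proj, viewport_w, viewport_h):
    """Project a world-space point to screen (pixel) coordinates.

    view_proj: combined 4x4 view-projection matrix (applied to a column
    vector here; this convention is an assumption for the sketch).
    """
    p = view_proj @ np.append(point, 1.0)          # to clip space
    ndc = p[:3] / p[3]                             # perspective divide -> [-1, 1]
    x = (ndc[0] * 0.5 + 0.5) * viewport_w          # map to pixels
    y = (1.0 - (ndc[1] * 0.5 + 0.5)) * viewport_h  # flip Y: screen origin is top-left
    return np.array([x, y, ndc[2]])

# With an identity view-projection, the world origin lands at the screen center:
print(world_to_screen(np.array([0.0, 0.0, 0.0]), np.eye(4), 1024, 768))
```

This is why a sphere at the view center "just works": its screen-space position happens to coincide with the default light position the shader assumes.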

And this is the shader:

 

#version 150

in vec2 varyingtexcoord;
uniform sampler2DRect tex0;
uniform sampler2DRect depth;
uniform vec3 ligthPos;

float exposure = 0.19;
float decay = 0.9;
float density = 2.0;
float weight = 1.0;
int samples = 25;

out vec4 fragColor;
const int MAX_SAMPLES = 100;

void main()
{
    vec2 texCoord = varyingtexcoord;
    // vector from this fragment toward the light, split into `samples` steps
    vec2 deltaTextCoord = texCoord - ligthPos.xy;
    deltaTextCoord *= 1.0 / float(samples) * density;
    vec4 color = texture(tex0, texCoord);
    float illuminationDecay = 0.6;
    for (int i = 0; i < MAX_SAMPLES; i++) {
        if (i == samples) {
            break;
        }
        // march toward the light, accumulating progressively dimmer samples
        texCoord -= deltaTextCoord;
        vec4 sample = texture(tex0, texCoord);
        sample *= illuminationDecay * weight;
        color += sample;
        illuminationDecay *= decay;
    }
    fragColor = color * exposure;
}

 

The problem is that I want to apply this effect to multiple spheres, and I don't know how to do it, because I pass the position of the sphere as a uniform variable to the radial post-processing shader.

But what I actually want is to discover a way to apply this (or other) shaders to the meshes themselves, so I don't have to deal with this problem every time. What if a shader doesn't have a uniform to determine the position of the object? What is the way to make a shader work in screen coordinates? How can I do it? I need some light here, because I feel I'm missing something.

This is an illustrative picture of what I want to do:

 

[Mock-up: a radial glow centered on each of several spheres]

But I want post-processing shaders that can have the information of the objects, and not always compute the effect at the center of the screen (the center of each object is what I want). Any suggestion will be appreciated.

Edited by prxtxn

#version 150

// these are for the programmable pipeline system and are passed in
// by default from OpenFrameworks
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 textureMatrix;
uniform mat4 modelViewProjectionMatrix;

in vec3 position;
in vec4 color;
in vec4 normal;
in vec2 texcoord;

vec4 temp;
// this is the end of the default functionality

// this is something we're creating for this shader
out vec2 varyingtexcoord;

// this is coming from our C++ code
uniform float mouseX;

void main()
{
    // here we move the texture coordinates
    varyingtexcoord = vec2(texcoord.x, texcoord.y);

    // send the vertices to the fragment shader
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

 

In the VS, multiply the position by the world matrix and send this to the shader.

Now in the FS:

Define an array of lights and pass data from the CPU to it (position, color, radius).

Given the position of a light and the actual fragment, you compute the distance; if the distance is 0, then you are at (0.5, 0.5) of your texcoord of the corona texture... which is lame anyway...
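A minimal sketch of that idea in the fragment shader. The uniform names, MAX_LIGHTS, and the smoothstep falloff are assumptions for illustration, not from the posted code; the per-light data would be uploaded from the CPU each frame:

```glsl
#version 150

in vec2 varyingtexcoord;
uniform sampler2DRect tex0;
out vec4 fragColor;

const int MAX_LIGHTS = 8;
uniform int numLights;
uniform vec2 lightPos[MAX_LIGHTS];     // screen-space positions, in pixels
uniform vec3 lightColor[MAX_LIGHTS];
uniform float lightRadius[MAX_LIGHTS]; // falloff radius, in pixels

void main() {
    vec4 color = texture(tex0, varyingtexcoord); // base scene color, added once
    for (int i = 0; i < MAX_LIGHTS; i++) {
        if (i >= numLights) break;
        // distance from this fragment to the light, in pixels
        float dst = distance(varyingtexcoord, lightPos[i]);
        // simple smooth falloff: full effect at the light, none past the radius
        float att = 1.0 - smoothstep(0.0, lightRadius[i], dst);
        color.rgb += lightColor[i] * att;
    }
    fragColor = color;
}
```

The key point is that every light lives in one array, so one full-screen pass handles all of them, and each light only affects fragments within its own radius.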

Share this post


Link to post
Share on other sites

Thanks, I actually fixed the problem of the coordinates, but now I have another problem, because I don't know how to avoid recalculating samples that have already been calculated. This is my problem: on each loop iteration, the function computes the effect for its own coordinates, plus the other coordinates... This is what happens:

[Screenshot: a halo computed correctly at each point, but with extra streaks from the other iterations]

Notice how the halo is computed well for each point, plus the other iterations... any idea? This is the shader now:

 

#version 150

in vec2 varyingtexcoord;
uniform sampler2DRect tex0;

uniform int size;

float exposure = 0.79;
float decay = 0.9;
float density = .9;
float weight = .1;
int samples = 25;

out vec4 fragColor;
const int MAX_SAMPLES = 25;
const int N = 3;
uniform vec2 ligthPos[N];

int a = 1;

vec4 halo(vec2 pos) {
    float illuminationDecay = 1.2;
    vec2 texCoord = varyingtexcoord;
    vec2 current = pos.xy;
    vec2 deltaTextCoord = texCoord - current;
    deltaTextCoord *= 1.0 / float(samples) * density;
    vec4 color = texture(tex0, texCoord);

    for (int i = 0; i < MAX_SAMPLES; i++) {
        texCoord -= deltaTextCoord;
        vec4 sample = texture(tex0, texCoord);
        sample *= illuminationDecay * weight;
        color += sample;
        illuminationDecay *= decay;
    }
    return color;
}

void main() {
    vec2 uv = varyingtexcoord;
    vec4 color = texture(tex0, uv);

    vec4 accum = vec4(0.0);
    for (int e = 0; e < N; e++) {
        vec2 current = ligthPos[e];
        accum += halo(current);
    }

    fragColor = accum * exposure;
}

 

I need a way to avoid these extra calculations. I understand the problem, since the fragment shader computes all the pixels on the screen; I want a way to avoid that. Any clues?

To be honest, from this I don't see how they calculate additional samples. I'd recommend that you explain here what your shader exactly does (even though you've shown us the code), so maybe someone can guess how to split this into two phases.

The function "halo" does the calculation that generates that "glow" effect / radial blur. It takes an origin point. My attempt is to have a shader with multiple origin points; for that, I pass an array of origin positions. As you can see in the pictures, the shader calculates the different origin points well, but each loop iteration keeps generating the samples for the other points.
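One likely cause of those extra streaks: halo() adds the full base color, texture(tex0, texCoord), once per light, so the scene (including the other spheres) gets accumulated N times, and each light's ray march also picks up the other spheres' bright pixels along the way. A hedged sketch of a possible fix, assuming the same uniforms and constants as the shader above: return only the scattered light from halo(), add the base color exactly once in main(), and fade each light's contribution with distance so far-away lights don't streak across the whole screen (the 300-pixel radius is a made-up value):

```glsl
// sketch only: varyingtexcoord, tex0, ligthPos, N, samples, density,
// weight, decay, exposure, MAX_SAMPLES as in the shader above
vec4 halo(vec2 pos) {
    float illuminationDecay = 1.2;
    vec2 texCoord = varyingtexcoord;
    vec2 deltaTextCoord = (texCoord - pos) * (1.0 / float(samples) * density);
    vec4 scattered = vec4(0.0); // accumulate only the samples, not the base color
    for (int i = 0; i < MAX_SAMPLES; i++) {
        texCoord -= deltaTextCoord;
        scattered += texture(tex0, texCoord) * illuminationDecay * weight;
        illuminationDecay *= decay;
    }
    return scattered;
}

void main() {
    vec4 color = texture(tex0, varyingtexcoord); // base color added exactly once
    for (int e = 0; e < N; e++) {
        // fade this light's halo out with distance (radius in pixels is a guess)
        float fade = 1.0 - smoothstep(0.0, 300.0, distance(varyingtexcoord, ligthPos[e]));
        color += halo(ligthPos[e]) * fade;
    }
    fragColor = color * exposure;
}
```

With this structure each fragment still loops over all N lights, but the fade term makes distant lights contribute nothing, which is usually what the "avoid the extra calculations" request comes down to in a single-pass fragment shader.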
