Writing my own pathtracer on mobile device

Started by
55 comments, last by IsItSharp 9 years, 3 months ago

So like this?


No - see here: if x1 is negative and x2 is positive, your code will return x1, but the correct return value is x2, as that is the first intersection along the ray. Something like:
if(discriminant > 0){
    float x1 = (-b - (float)Math.sqrt(discriminant)) / (2.0f * a);
    float x2 = (-b + (float)Math.sqrt(discriminant)) / (2.0f * a);

    if (x1 < 0) return x2;
    if (x2 < 0) return x1;
    return Math.min(x1,x2);
}
You can check this will always return the smallest positive solution, and a negative value (not necessarily -1!) if there is no positive solution. It could probably be optimized, though.
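As a quick sanity check of that branch logic, here is a standalone sketch (the snippet above wrapped in a function; the class and method names are just for illustration):

```java
// Returns the smallest positive root of a*t^2 + b*t + c = 0,
// or a negative value if there is no positive root.
class RootCheck {
    static float smallestPositiveRoot(float a, float b, float c) {
        float discriminant = b * b - 4.0f * a * c;
        if (discriminant > 0) {
            float x1 = (-b - (float) Math.sqrt(discriminant)) / (2.0f * a);
            float x2 = (-b + (float) Math.sqrt(discriminant)) / (2.0f * a);
            if (x1 < 0) return x2; // ray origin is inside the sphere
            if (x2 < 0) return x1;
            return Math.min(x1, x2); // both hits in front: take the nearer one
        }
        return -1.0f; // no real intersection
    }
}
```

For t² − 3t + 2 (roots 1 and 2) it returns 1; for t² − t − 2 (roots −1 and 2) it correctly skips the negative root and returns 2.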

So like this?


Yes - but what I really meant was: remember to make sure the direction vectors actually are normalized before giving them to your intersect function, since it depends on the direction vector having unit length.
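For reference, normalization just divides each component by the vector's length. A minimal standalone sketch using plain float arrays (your own Vector3D class presumably does the equivalent):

```java
class Norm {
    // Returns {x, y, z} scaled to unit length.
    static float[] normalize(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[] { x / len, y / len, z / len };
    }

    // Length of a 3-component vector, for checking the result.
    static float length(float[] v) {
        return (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }
}
```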

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”


Thanks again Bacterius for your help, I really appreciate it :)

So after implementing the Vector, Ray and Sphere classes, I think I am ready to calculate my first image, am I?


                Sphere sphere = new Sphere(Color.BLUE, false, new Vector3D(10.0f, 10.0f, 0.0f), 1.5f);
                Vector3D camera = new Vector3D(0.0f, 0.0f, 0.0f);

                for (int x = 0; x < width; x++) {
                    for (int y = 0; y < height; y++) {
                        Ray ray = new Ray(camera, new Vector3D((float)x, (float)y, 0.0f).normalize());
                        if(sphere.intersect(ray) > -1.0f)
                            this.canvas.drawPoint(x,y, paint);
                    }
                }

So here is the code, and the second point I am struggling with: I expected to see a round blue ball, but instead I just see a blue line:

http://img5.fotos-hochladen.net/uploads/screenshot2014soug08w4d6.png

Uhm.....

Your ray generation code is nonsense. For e.g. x = 0, y = 0 you create the vector (0, 0, 0) and normalize it - all of its components are divided by sqrt(0^2 + 0^2 + 0^2), and since sqrt(0) = 0 (or technically +0 or -0, both are correct answers), nothing changes the fact that you're dividing by 0. (Fyi. in mathematics sqrt(0) depends on convention - there are multiple definitions - one of the conventions also specifies that 0^2 is undefined, thus also sqrt(0) is undefined.)
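You can see the failure directly: normalizing the zero vector divides 0 by 0, which in IEEE floating point silently yields NaN rather than throwing an exception. A standalone demonstration (the helper is hypothetical, but mirrors what a typical normalize does):

```java
class ZeroNormalize {
    // Returns the x component after normalization.
    static float normalizedX(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z); // 0 for the zero vector
        return x / len; // 0/0 -> NaN, which then poisons every later computation
    }
}
```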

You want your rays to fan out from left to right, with x = 0 in the middle column of pixels and y = 0 in the middle row of pixels - and your z component determines your field-of-view angle. Technically you want something like this:


float xdir = (x / (float)width) * 2.0f - 1.0f; // Keep x direction in interval <-1; 1>
float ydir = ((y / (float)height) * 2.0f - 1.0f) * aspect; // Keep y direction in interval <-1; 1>, aspect is (height / width) in your case
float zdir = 1.0f / (float)tan(fov); // Where fov represents field-of-view angle (in radians)
Ray ray = new Ray(camera, new Vector3D(xdir, ydir, zdir).normalize());

I hope the computation of xdir and ydir makes sense; zdir can be derived using trigonometry.

This will generate your ray directions correctly - assuming your intersection code is also correct, you should see a sphere.
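As a check on the geometry (under the convention in the snippet above, where the edge column has xdir = 1): a ray (1, 0, zdir) with zdir = 1/tan(fov) makes an angle of exactly fov with the view axis, so fov here acts as the half-angle of the horizontal field of view. A small sketch verifying that:

```java
class FovCheck {
    // Angle between the view axis (0,0,1) and the edge ray (1, 0, zdir),
    // where zdir = 1 / tan(fov). This recovers fov itself.
    static double edgeAngle(double fov) {
        double zdir = 1.0 / Math.tan(fov);
        return Math.atan2(1.0, zdir);
    }
}
```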

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

Fyi. in mathematics sqrt(0) depends on convention - there are multiple definitions - one of the conventions also specifies that 0^2 is undefined, thus also sqrt(0) is undefined


o.O

I have never heard of such a convention. The square root of zero has always been, and always will be, zero, and zero squared is most definitely zero. Are you confusing it with 0^0?

That said, yes, dividing by zero is always a mistake, and indeed you need some positive z component to drive the camera rays in a particular direction - there are different camera types, but the one given by Vilem is probably the simplest and easiest (and most common).


Thanks for your little hint with the x, y and z coordinates, @Vilem Otte.

I changed my loop to this:


                Sphere sphere = new Sphere(Color.BLUE, false, new Vector3D(0.0f, 1.0f, 10f), 2.8f);
                Vector3D camera = new Vector3D(0.0f, 0.0f, 0.0f);
                float aspect = (float)height / (float)width;

                for (int x = 0; x < width; x++) {
                    for (int y = 0; y < height; y++) {
                        float xdir = (x / (float)width) * 2.0f - 1.0f;
                        float ydir = ((y / (float)height) * 2.0f - 1.0f) * aspect;
                        float zdir = 1.0f / (float)Math.tan(255);
                        Ray ray = new Ray(camera, new Vector3D(xdir, ydir, zdir).normalize());
                        if(sphere.intersect(ray) > -1.0f)
                            this.canvas.drawPoint(x,y, paint);
                    }
                }

But I get a strange "appearance". If I set the FOV value to 255 I get this image:

http://img5.fotos-hochladen.net/thumbnail/fov255t14jw2so50_thumb.jpg

but if I set it just one point higher, to 256, I get this:

fov256lj5xczf398.png

Are there any problems in my calculation, or is this completely normal?

Reading the docs helps - or reading what Vilem has written. The argument is in radians, not degrees. Even in degrees the value would be silly (> 180°).

Shadertoy is really useful for simple, brute-force examples, like this small path tracer: https://www.shadertoy.com/view/4sfGDB

It's easy to just start modifying the source and learning what each line does. And you can avoid all the non-relevant problems.

Reading the docs helps - or reading what Vilem has written. The argument is in radians, not degrees. Even in degrees the value would be silly (> 180°).

Thanks, now it works fine:


        float fov = 35 * (float)Math.PI / 180;
        float zdir = 1.0f / (float)Math.tan(fov);

Now I have to get to the really hard part: bouncing rays, diffuse materials, calculating colors and light :(

Edit: So what is the next step? Should I generate the reflection ray from the intersection? If so, how do I get the intersection point between the ray and the sphere? I only get a float value back from my function.
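The float coming back from an intersect function like the one above is the ray parameter t (the distance along the ray, when the direction is normalized), so the hit point is just origin + t * direction. A minimal sketch using plain float arrays instead of a Vector3D class:

```java
class HitPoint {
    // point = origin + t * direction, component-wise.
    static float[] at(float[] origin, float[] dir, float t) {
        return new float[] {
            origin[0] + t * dir[0],
            origin[1] + t * dir[1],
            origin[2] + t * dir[2]
        };
    }
}
```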

Okay, so I did a little research and implemented a recursive Trace function which takes a ray and a depth counter as parameters. It looks like this:


        public Color Trace(Ray ray, int depth)
        {
            float distance = 5000.0f;
            BaseObject hitObject = null;
            foreach (BaseObject obj in this.scene.Objects)
            {
                float currentDistance = obj.Intersect(ray);
                if (currentDistance < distance && currentDistance > 0)
                {
                    distance = currentDistance;
                    hitObject = obj;
                }
            }

            if (hitObject == null) // no object was hit
                return Color.Black;
            if (hitObject.isEmitter) // a light source was hit
                return hitObject.surfaceColor;
            if (depth == MAX_DEPTH)
                return Color.Black;

            Vector3D HitPoint = ray.origin.add(ray.direction.multiply(distance));
            Vector3D normal = hitObject.Normal(HitPoint);

            Ray reflectionRay = null;

            if (hitObject.mat == Material.Diffuse)
            {
                reflectionRay = new Ray(HitPoint, Vector3D.getRandomVectorInHemisphere(1.0f));
            }

            Color returnColor = Trace(reflectionRay, depth + 1);

            // Modulate the surface color by the incoming color (components are 0-255)
            float r = hitObject.surfaceColor.R * returnColor.R / 255.0f;
            float g = hitObject.surfaceColor.G * returnColor.G / 255.0f;
            float b = hitObject.surfaceColor.B * returnColor.B / 255.0f;

            return Color.FromArgb(255, (int)r, (int)g, (int)b);
        }

And I call the function like this:


            float fov = 35 * (float)Math.PI / 180;
            float zdir = 1.0f / (float)Math.Tan(fov);
            float aspect = (float)height / (float)width;

            //BWorker.RunWorkerAsync(new Tuple<int, int, Bitmap, float, float, float>(height, width, drawArea, fov, zdir, aspect));

            for (int y = 0; y < pB_Result.Height; y++)
            {
                for (int x = 0; x < pB_Result.Width; x++)
                {
                    float xdir = (x / (float)width) * 2.0f - 1.0f;
                    float ydir = ((y / (float)height) * 2.0f - 1.0f) * aspect;
                    Ray ray = new Ray(new Vector3D(0.0f, 0.0f, 0.0f), new Vector3D(xdir, ydir, zdir).normalize());

                    float r = 0, g = 0, b = 0;
                    for (int i = 0; i < 3; i++)
                    {

                        Color c = Trace(ray, 0);
                        r += c.R;
                        g += c.G;
                        b += c.B;
                    }

                    drawArea.SetPixel(x, y, Color.FromArgb(255, (int)r / 3, (int)g / 3, (int)b / 3));
                }
            }
            pB_Result.Image = drawArea;

But I only get this (there is one sphere on the left side of the light and one sphere on the right):

studioray1imagu5l84phyea.png

I think it all depends on my "getRandomVectorInHemisphere" function.

Currently I am just generating a random vector:


        public static Vector3D getRandomVectorInHemisphere(float radius)
        {
            float x = (float)new Random(DateTime.Now.Millisecond).NextDouble() * radius;
            float y = (float)new Random(DateTime.Now.Millisecond).NextDouble() * radius;
            float z = (float)new Random(DateTime.Now.Millisecond).NextDouble() * radius;

            return new Vector3D(x, y, z).multiply((float)new Random(DateTime.Now.Millisecond).NextDouble());
        }

So, can anyone here give me a hint how to compute a random direction vector for the bounce ray in the hemisphere the normal is pointing into?

Edit: If I do something like this:


            int abortCounter = 0;
            while (abortCounter < 500)
            {
                Vector3D b = new Vector3D((float)new Random(523940).NextDouble() - 0.5f, (float)new Random(5231).NextDouble() - 0.5f, (float)new Random(25061).NextDouble() - 0.5f);
                b.normalize();
                if (b.Dotproduct(normal) > 0)
                    return b;
                abortCounter++;
                if (abortCounter == 499)
                    return b;
                else
                    return b;
            }
            return null;

I get this:

studioray2imaggm6krs7x9h.png

Your random number generator and the logic behind it are wrong. Sadly I have to leave the PC in a few minutes, so I won't be able to give a full answer with some background on why that is (I will do that later).

As for now - your random direction should be normalized (unless you normalize it in the ray constructor - which I don't think you do). There are a few rules about the "random" direction and the random number generation behind it in general - you should use different random number generators for different surfaces. For now, go with:


// Initialize the RNG once for the whole random direction generator object - SEED is some number (e.g. 123456, or the current timestamp, or anything :) )
Random rng = new Random(SEED);

...

// Your random direction into sphere generation code

Vector3D v = new Vector3D();
v.x = (float)(rng.NextDouble() * 2.0 - 1.0);
v.y = (float)(rng.NextDouble() * 2.0 - 1.0);
v.z = (float)(rng.NextDouble() * 2.0 - 1.0);
v.normalize();

...

// Your random direction into hemisphere
Vector3D v = GenerateRandomDirectionIntoSphere();
if (v.dot(normal) < 0.0)
{
    // Where negate does return new Vector3D(-v.x, -v.y, -v.z);
    v = v.negate();
}

The process of "how" you generate random numbers determines how good your path tracer is (I will elaborate later ;) ).
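One common refinement in that direction is cosine-weighted hemisphere sampling - a hedged sketch of the kind of thing "better random numbers" can mean, not necessarily what Vilem has in mind. This uses Malley's method: pick a uniform point on the unit disk and project it up onto the hemisphere around the z axis (in practice you would then rotate the result into the frame of the surface normal):

```java
import java.util.Random;

class CosineSample {
    // Cosine-weighted direction in the hemisphere around (0, 0, 1):
    // uniform point on the unit disk, lifted onto the hemisphere.
    static double[] sample(Random rng) {
        double r = Math.sqrt(rng.nextDouble());      // sqrt makes the disk sampling uniform in area
        double phi = 2.0 * Math.PI * rng.nextDouble();
        double x = r * Math.cos(phi);
        double y = r * Math.sin(phi);
        double z = Math.sqrt(Math.max(0.0, 1.0 - x * x - y * y)); // lift onto the unit sphere
        return new double[] { x, y, z };
    }
}
```

Directions generated this way land more often near the normal, matching the cosine term in the diffuse rendering equation, which reduces noise compared to uniform hemisphere sampling.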


This topic is closed to new replies.
