Writing my own path tracer on a mobile device

55 comments, last by IsItSharp 9 years, 3 months ago

Salt'n'Pepper? You are probably hitting the same object you bounced the ray off again. This can happen due to precision issues. Give your Trace function an ignoreObject parameter, skip the collision check for it, and pass hitObject when you recurse.




Another quick'n'dirty way to fix it is to "nudge" your hit point slightly outside the object (or inside, in the case of refraction) to make sure it doesn't intersect it again. A bit ugly, though, and unreliable with floating-point arithmetic, although it tends to work most of the time (I actually read a paper where the authors used fixed-point arithmetic to make it reliable).

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

Your random number generator, and the logic behind it, is wrong. Sadly I have to leave my PC in a few minutes, so I won't be able to give a full answer and some background on why that is (I will do that later).

For now: your random direction should be normalized (unless you normalize it in the Ray constructor, which I don't think you do). There are a few rules about the "random" direction, and about random number generation in general - you should use different random number generators for different surfaces. For now, go with:


// Initialize the RNG once for the whole random-direction generator object.
// SEED is some number (e.g. 123456, or the current timestamp, or anything :) )
Random rng = new Random(SEED);

...

// Your random-direction-into-sphere generation code
Vector3D v = new Vector3D();
v.x = rng.NextDouble() * 2.0 - 1.0;
v.y = rng.NextDouble() * 2.0 - 1.0;
v.z = rng.NextDouble() * 2.0 - 1.0;
v.normalize();

...

// Your random direction into hemisphere
Vector3D v = GenerateRandomDirectionIntoSphere();
if (v.dot(normal) < 0.0)
{
    // Where negate does return new Vector3D(-v.x, -v.y, -v.z);
    v = v.negate();
}

The "how" of generating random numbers determines how good your path tracer is (I will elaborate later ;) ).
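Until that elaboration arrives, one common uniform alternative is rejection sampling: normalizing a point drawn from a cube biases directions toward the cube's corners, whereas rejecting points outside the unit ball first gives a uniform direction on the sphere. A sketch in Java (the Vec3 and Sampling names are made up for illustration; the thread's Vector3D exposes the same dot/normalize/negate operations):

```java
import java.util.Random;

// Hypothetical minimal vector class, standing in for the thread's Vector3D.
class Vec3 {
    double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    double dot(Vec3 o) { return x * o.x + y * o.y + z * o.z; }
    double length() { return Math.sqrt(dot(this)); }
    Vec3 normalize() { double l = length(); return new Vec3(x / l, y / l, z / l); }
    Vec3 negate() { return new Vec3(-x, -y, -z); }
}

class Sampling {
    static final Random RNG = new Random(123456); // seed once, reuse everywhere

    // Rejection sampling: draw points in the unit cube, keep only those
    // inside the unit ball, then project onto the sphere surface. Unlike
    // normalizing an arbitrary cube point, this gives a uniform direction.
    static Vec3 randomUnitVector() {
        while (true) {
            double x = RNG.nextDouble() * 2.0 - 1.0;
            double y = RNG.nextDouble() * 2.0 - 1.0;
            double z = RNG.nextDouble() * 2.0 - 1.0;
            double len2 = x * x + y * y + z * z;
            if (len2 > 1e-9 && len2 <= 1.0)   // reject the corners of the cube
                return new Vec3(x, y, z).normalize();
        }
    }

    // Flip into the hemisphere around the surface normal, as in the snippet above.
    static Vec3 randomInHemisphere(Vec3 normal) {
        Vec3 v = randomUnitVector();
        return v.dot(normal) < 0.0 ? v.negate() : v;
    }
}
```

The rejection loop discards roughly half the candidates (the ball fills about 52% of the cube), which is usually cheaper than the trigonometry of a direct spherical-coordinate construction.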

Works somewhat better :)

[image: studioray3imagecqnwiv378.png]

Another quick'n'dirty way to fix it is to "nudge" your hitpoint slightly outside the object (or inside, in the case of refraction) to make sure it doesn't intersect it again.

If I debug through the code I understand the problem. But how do I move the hit point in the right direction, away from the object?

Give your Trace function an ignoreObject parameter, skip the collision check for it, and pass hitObject when you recurse.

Hm, if I add a parameter "ignoreObject" of type "BaseObject", set hitObject to ignoreObject, and skip the loop when ignoreObject is not null, I get just the light in my image:


        public Color Trace(Ray ray, int depth, BaseObject ignoreObject = null)
        {
            float distance = 5000.0f;
            BaseObject hitObject = ignoreObject;
            Vector3D HitPoint = null;

            if (hitObject == null)
            {
                foreach (BaseObject obj in this.scene.Objects)
                {
                    float currentDistance = obj.Intersect(ray);
                    if (currentDistance < distance && currentDistance > 0)
                    {
                        distance = currentDistance;
                        hitObject = obj;
                    }

                    HitPoint = ray.origin.add(ray.direction.multiply(distance));
                }
            }

            if (distance == 5000.0f) // no object was hit
                return Color.Black;
            if (hitObject.isEmitter) // a light source was hit
                return hitObject.surfaceColor;
            if (depth == MAX_DEPTH)
                return Color.Black;

            Vector3D normal = hitObject.Normal(HitPoint);
           
            Ray reflectionRay = null;

            if (hitObject.mat == Material.Diffuse)
            {
                Vector3D randomVector = Vector3D.getRandomVectorInHemisphere();
                if (randomVector.Dotproduct(normal) < 0.0)
                    randomVector = randomVector.negate();
                reflectionRay = new Ray(HitPoint, randomVector);
            }

            Color returnColor = Trace(reflectionRay, depth + 1, hitObject);

            float r = hitObject.surfaceColor.R * returnColor.R;
            float g = hitObject.surfaceColor.G * returnColor.G;
            float b = hitObject.surfaceColor.B * returnColor.B;

            r /= 255.0f;
            g /= 255.0f;
            b /= 255.0f;

            return Color.FromArgb(255, (int)r, (int)g, (int)b);
        }

Ah, I understand the idea now. I hope I did it right:


        public Color Trace(Ray ray, int depth, BaseObject missObject = null)
        {
            float distance = 5000.0f;
            BaseObject hitObject = null;
            Vector3D HitPoint = null;

            foreach (BaseObject obj in this.scene.Objects)
            {
                if (obj == missObject)
                    continue;
                float currentDistance = obj.Intersect(ray);
                if (currentDistance < distance && currentDistance > 0)
                {
                    distance = currentDistance;
                    hitObject = obj;
                }
            }

            if (distance == 5000.0f) // no object was hit
                return Color.Black;
            if (hitObject.isEmitter) // a light source was hit
                return hitObject.surfaceColor;
            if (depth == MAX_DEPTH)
                return Color.Black;


            HitPoint = ray.origin.add(ray.direction.multiply(distance));
            Vector3D normal = hitObject.Normal(HitPoint);
           
            Ray reflectionRay = null;

            if (hitObject.mat == Material.Diffuse)
            {
                Vector3D randomVector = Vector3D.getRandomVectorInHemisphere();
                if (randomVector.Dotproduct(normal) < 0.0)
                    randomVector = randomVector.negate();
                reflectionRay = new Ray(HitPoint, randomVector);
            }

            Color returnColor = Trace(reflectionRay, depth + 1, hitObject);

            float r = hitObject.surfaceColor.R * returnColor.R;
            float g = hitObject.surfaceColor.G * returnColor.G;
            float b = hitObject.surfaceColor.B * returnColor.B;

            r /= 255.0f;
            g /= 255.0f;
            b /= 255.0f;

            return Color.FromArgb(255, (int)r, (int)g, (int)b);
        }

But all I get is this:

[image: studioray4imaggozunmk04q.png]

There is still something wrong. Any ideas or hints on where I should look?

Hm, no one? :(

I would really appreciate it if someone could solve the problem or give me a little hint :)

I'm still at work today (I've got a demo day tomorrow with one application, so I will most likely be sleeping for just a few hours). But we're running tests that take a very long time, so...

Why you see black/colored pixels

The hint you want has already been mentioned:

You cast a ray from the camera origin to the sphere and compute your hit point as "HitPoint = Origin + Distance * Direction", which will most likely land you INSIDE the sphere (due to floating-point precision).

Now, a quick hack is to use something like "HitPoint = Origin + Distance * 0.99 * Direction". This isn't a real solution, but the hack works 99% of the time and is absolutely valid for a start. Ideally you would make the offset relative to your distance: floats are quite precise between 0.0 and 1.0, less precise in the thousands, and even less precise once you get into the millions and billions - the larger the absolute value, the worse the problem, so the offset also needs to be larger.

For a quick look into the precision problem, check this site: https://randomascii.wordpress.com/2012/03/08/float-precisionfrom-zero-to-100-digits-2
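The cutoff can be demonstrated in a couple of lines: above 2^24 a 32-bit float cannot even represent consecutive integers, so adding a small fixed epsilon to a large hit distance changes nothing. A quick check (sketched in Java, since the tracer is later ported there):

```java
// Demonstrates why a fixed tiny offset must scale with the hit distance:
// near 1.0 a small epsilon survives the addition, but at 2^24 = 16777216
// the spacing between adjacent floats is already 1.0, so a sub-ulp epsilon
// is rounded away entirely.
class FloatPrecision {
    static boolean epsilonVanishes(float base, float eps) {
        return base + eps == base; // true when eps is below the rounding threshold
    }
}
```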

E.g. modify your hitpoint calculation as:


HitPoint = ray.origin.add(ray.direction.multiply(distance * 0.99f));

This way you should see something. (It won't be perfect and lots of rays will miss; you will need better sampling than just randomly shooting rays - once you have this running, I will describe a few ways to improve it.)
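A slightly more robust variant of the same hack, not mentioned above but common in ray tracers, is to offset the hit point along the surface normal instead of scaling the distance; this pushes the point off the surface regardless of the angle of incidence. A sketch with plain float arrays standing in for the thread's Vector3D:

```java
// Push the hit point a small epsilon along the surface normal, away from
// the surface, before spawning the bounce ray. Sketch only; the arrays
// are (x, y, z) triples, and EPS is an assumed scene-scale constant.
class PointOffset {
    static final float EPS = 1e-3f;

    static float[] offsetHitPoint(float[] hit, float[] normal) {
        return new float[] {
            hit[0] + normal[0] * EPS,
            hit[1] + normal[1] * EPS,
            hit[2] + normal[2] * EPS,
        };
    }
}
```

For refraction rays the sign flips (offset against the normal, into the object), which the distance-scaling trick cannot express as easily.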

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

Hi Vilem,

Thank you very much for your late post.

I followed your tip, and I also skip the intersection check for the last intersected object in my Trace class:


        public Color Trace(Ray ray, int depth, BaseObject missObject = null)
        {
            float distance = 5000.0f;
            BaseObject hitObject = null;
            Vector3D HitPoint = null;

            foreach (BaseObject obj in this.scene.Objects)
            {
                if (obj == missObject)
                    continue;
                float currentDistance = obj.Intersect(ray);
                if (currentDistance < distance && currentDistance > 0.1f)
                {
                    distance = currentDistance;
                    hitObject = obj;
                }
            }

            if (distance == 5000.0f) // no object was hit
                return Color.Black;
            if (hitObject.isEmitter) // a light source was hit
                return hitObject.surfaceColor;
            if (depth == MAX_DEPTH)
                return Color.Black;


            HitPoint = ray.origin.add(ray.direction.multiply(distance));
            Vector3D normal = hitObject.Normal(HitPoint);
           
            Ray reflectionRay = null;

            if (hitObject.mat == Material.Diffuse)
            {
                Vector3D newDirection = Vector3D.getRandomVectorInHemisphere();
                if (newDirection.Dotproduct(normal) < 0.0)
                    newDirection = newDirection.negate();
                HitPoint = ray.origin.add(ray.direction.multiply(distance * 0.99f));
                reflectionRay = new Ray(HitPoint, newDirection.normalize());
            }

            Color returnColor = Trace(reflectionRay, depth + 1, hitObject);

            float r = hitObject.surfaceColor.R * returnColor.R;
            float g = hitObject.surfaceColor.G * returnColor.G;
            float b = hitObject.surfaceColor.B * returnColor.B;

            r /= 255.0f;
            g /= 255.0f;
            b /= 255.0f;

            return Color.FromArgb(255, (int)r, (int)g, (int)b);
        }

So this is the image I get with RAYS_PER_PIXEL = 1:

[image: pathtracerv4ic7b8p2zdka.png]

Looks pretty good so far, thanks!

But if I increase RAYS_PER_PIXEL to 4, I get this strange image:

[image: pathtracerv4ih7jvbe8qzl.png]

So, any more hints maybe? :/

Okay, so I think I found the problem: because I wanted to speed things up a bit, I used the .NET Parallel.For loop, which looks like this:


                    //System.Threading.Tasks.Parallel.For(0, RAYS_PER_PIXEL, i =>
                    //{
                    //    Color c = Trace(ray, 0);
                    //    r += c.R;
                    //    g += c.G;
                    //    b += c.B;
                    //});

If I use a normal for loop:


                    for (int i = 0; i < RAYS_PER_PIXEL; i++)
                    {
                        Color c = Trace(ray, 1);
                        r += c.R;
                        g += c.G;
                        b += c.B;
                    }

Everything works fine:

[image: pathtracerv4ioran6df1xt.png]
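For reference, the Parallel.For version fails for two reasons: the shared r/g/b accumulators are incremented from several threads without synchronization (lost updates), and a single Random instance is generally not safe to share across threads. A race-free pattern is to have each task return its own sample and reduce at the end; a sketch in Java, since the tracer was ported there anyway (traceRed is a hypothetical stand-in for one color channel of Trace, returning a dummy value so the pattern is testable):

```java
import java.util.stream.IntStream;

class ParallelAccumulate {
    // Hypothetical stand-in for the red channel of Trace(ray, depth);
    // returns a fixed dummy contribution per sample.
    static int traceRed(int sampleIndex) {
        return 10;
    }

    // Each parallel task produces its own value; the sum() reduction
    // combines them safely, so no shared mutable accumulator is needed.
    static int sumSamples(int raysPerPixel) {
        return IntStream.range(0, raysPerPixel)
                        .parallel()
                        .map(ParallelAccumulate::traceRed)
                        .sum();
    }
}
```

In the C# version the equivalent idea would be a reduction over per-task partial sums instead of `r += c.R` on shared variables, plus one random generator per thread.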

So now I've ported the C# .NET code to Android Java. But now I've got a problem which I encountered at an early stage of my C# .NET version:

[image: screenshot2014tsafj0mx15.png]

There is also a sphere between the two lit spheres.

My code in java:

http://pastebin.com/BpmL6Qtf

I've been comparing the Java code with my working .NET code for 5 hours now, and I couldn't find anything different...

General debugging hints ahead:

Write some tests, or step through both implementations simultaneously and compare the intermediate results to see where they diverge. Or generate logs which you can then compare with diff.

Be aware that floating-point calculations can produce different results depending on the language/compiler settings used, etc. (Edit: and hardware!)

The strange thing is, if I add planes to the scene I see the reflected rays, but I don't see the lights anymore?

[image: screenshot2014hd92u6i7ls.png]

Edit: Okay, I found the problem:

[image: screenshot2014z6ef5r8idq.png]

Old one:


        if(discriminant > 0){
            float x1 = (-b - (float)Math.sqrt(discriminant) / (2.0f * a));
            float x2 = (-b + (float)Math.sqrt(discriminant) / (2.0f * a));


            if (x1 >= 0 && x2 >= 0) return x1;
            if (x1 < 0 && x2 >= 0) return x2;
            else return -1.0f;
        }

New one:


        if(discriminant > 0){
            float x1 = (-b - (float)Math.sqrt(discriminant)) / (2.0f * a);
            float x2 = (-b + (float)Math.sqrt(discriminant)) / (2.0f * a);


            if (x1 >= 0 && x2 >= 0) return x1;
            if (x1 < 0 && x2 >= 0) return x2;
            else return -1.0f;
        }
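For reference, here is the corrected quadratic embedded in a complete ray/sphere intersection, in the same spirit as the fixed snippet above (the variable names and the surrounding setup are assumptions, since the thread only shows the root selection):

```java
// Ray/sphere intersection sketch. Ray: origin o, normalized direction d,
// both as (x, y, z) float arrays. Sphere: center c, radius r.
// Returns the nearest positive hit distance, or -1 on a miss.
class SphereIntersect {
    static float intersect(float[] o, float[] d, float[] c, float r) {
        // Vector from sphere center to ray origin.
        float ox = o[0] - c[0], oy = o[1] - c[1], oz = o[2] - c[2];
        float a = d[0] * d[0] + d[1] * d[1] + d[2] * d[2]; // 1 if d is normalized
        float b = 2.0f * (ox * d[0] + oy * d[1] + oz * d[2]);
        float cc = ox * ox + oy * oy + oz * oz - r * r;
        float discriminant = b * b - 4.0f * a * cc;
        if (discriminant <= 0) return -1.0f;               // no real intersection

        // Note the parentheses: the WHOLE numerator is divided by 2a -
        // exactly the bug fixed in the "new one" above.
        float x1 = (-b - (float) Math.sqrt(discriminant)) / (2.0f * a);
        float x2 = (-b + (float) Math.sqrt(discriminant)) / (2.0f * a);

        if (x1 >= 0) return x1;  // nearest hit in front of the ray (x1 <= x2)
        if (x2 >= 0) return x2;  // ray origin is inside the sphere
        return -1.0f;            // sphere is entirely behind the ray
    }
}
```

A unit ray along +z from the origin against a radius-1 sphere at (0, 0, 5) should report a hit at distance 4, and the same sphere behind the ray should report a miss.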

Quite funny to run a path tracer on a smartwatch ^^

So the next step is anti-aliasing...

