Writing my own pathtracer on a mobile device


Hm, still the same result.

Did you mean this:


Vector3D direction = new Vector3D(xdir + r1 * dxSize, ydir + r2 * dySize, zdir);

instead of this:


Vector3D direction = new Vector3D(xdir + r1 * dxSize, ydir + r2 * dxSize, zdir);

? Even if you did, it still isn't working.

Actually...

Vector3D direction = new Vector3D(xdir + r1 * dxSize, ydir + r2 * dySize, zdir);

direction.normalize();

But I think I forgot some division (that is why your rays are "spread" out that much; they shouldn't be).



But I think I forgot some division (that is why your rays are "spread" out that much; they shouldn't be).

Which division do you mean? :)

My apologies, man, I made a stupid mistake there.

What we want when computing dx and dy are the differences between two adjacent rays along the X and Y axes of the viewport. To get them, we take the base vector (the ray straight through the centre of the screen), then the ray through the pixel one to the right of the centre and the ray through the pixel one above the centre. The previous code didn't do that.


float xone = (((float)width * 0.5f + 1.0f) / (float)width) * 2.0f - 1.0f;
float yone = ((((float)height * 0.5f + 1.0f) / (float)height) * 2.0f - 1.0f) * aspect;

These are the fixed xone and yone values. I intentionally kept the computation in its longer form (you can actually optimise it to 2/width and (2/height)*aspect, respectively).
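For reference, the algebraically simplified form would be (a minimal sketch, using the same width, height and aspect variables as above):

float xone = 2.0f / (float)width;               // one-pixel step along X in normalized coordinates
float yone = (2.0f / (float)height) * aspect;   // one-pixel step along Y, aspect-corrected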

Now it should work properly.

And btw, Happy New Year to the whole community here :)


Hi Vilem,

thank you very much for your reply.

And btw, Happy New Year to the whole community here :)


Thanks, you too!

So I got to this code now:


    private CColor ComputeSample(int x, int y, int width, int height){
        float fov = 160.0f * (float)Math.PI / 180.0f;
        float zdir = 1.0f / (float)Math.tan(fov);
        float aspect = (float)height / (float)width;

        float xdir = (x / (float) width) * 2.0f - 1.0f;
        float ydir = ((y / (float) height) * 2.0f - 1.0f) * aspect;

        /*
            Anti-Aliasing
         */
        float xone = (((float)width * 0.5f + 1.0f) / (float)width) * 2.0f - 1.0f;
        float yone = ((((float)height * 0.5f + 1.0f) / (float)height) * 2.0f - 1.0f) * aspect;

        Vector3D base = new Vector3D(0.0f, 0.0f, zdir).normalize();
        Vector3D xvec = new Vector3D(xone, 0.0f, zdir).normalize();
        Vector3D yvec = new Vector3D(0.0f, yone, zdir).normalize();
        Vector3D dx = xvec.sub(base);
        Vector3D dy = yvec.sub(base);

        float dxSize = dx.magnitude();
        float dySize = dy.magnitude();

        int AA_SAMPLES = 4;

        CColor c = new CColor(0.0f, 0.0f, 0.0f);

        for(int aa = 0; aa < AA_SAMPLES; aa++){
            float r1 = getRndFloat(rnd) * 0.5f;
            float r2 = getRndFloat(rnd) * 0.5f;
            Vector3D direction = new Vector3D(xdir + r1 * dxSize, ydir + r2 * dySize, zdir).normalize();
            Ray ray = new Ray(mCamera, direction);
            c.add(Trace(ray, 1));
        }

        return c.divide(AA_SAMPLES);

        //Ray ray = new Ray(mCamera, new Vector3D(xdir, ydir, zdir).normalize());
//        return Trace(ray,1);
    }


And I think it works now:

http://img5.fotos-hochladen.net/uploads/adelpath0302043hzdliocx.png

Sorry to ask you something that is not directly related to path tracing, but maybe you can help me:

As you can see in the image above, there is a green area on the left sphere. I implemented a function which saves the current colors held in the accumulator to a bitmap file in the local storage of the smartphone. Sadly, these strange artefacts appear from time to time (sometimes both spheres are red or green).

In my Renderer class I use this function to get my bitmap:


    public Bitmap GetBitmap(){
        int width = mCanvas.getWidth();
        int height = mCanvas.getHeight();

        Bitmap rBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

        int i = 0;
        for(int x = 0; x < width; x++){
            for(int y = 0; y < height; y++){
                int r = (int)(accumulator[i*3+0] / (float)samples);
                int g = (int)(accumulator[i*3+1] / (float)samples);
                int b = (int)(accumulator[i*3+2] / (float)samples);

                rBitmap.setPixel(x,y,Color.rgb(r,g,b));
                i++;
            }
        }

        return rBitmap;
    }

I think I do this the correct way, because I just copy-pasted the code from the DisplayResult class. But could these artefacts be caused by the fact that I can save the image at any time by a button press, while the accumulator already holds the color values of the next sample for a few pixels?

I am not sure if you are familiar with Android, but I use this code to save my image to a FileOutputStream:


    public void saveImage(){
        Bitmap bitmap = mRender.GetBitmap();

        try{
            Date d = new Date();
            SimpleDateFormat sdf = new SimpleDateFormat("hhmm ddMMyy");
            sdf.setTimeZone(TimeZone.getTimeZone("Europe/Amsterdam"));
            String dateString = sdf.format(d);

            File file = new File(Environment.getExternalStorageDirectory(), "AdelPath_" + dateString + ".png");
            if(!file.exists())
                file.createNewFile();

            FileOutputStream fos = new FileOutputStream(file);
            bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
            fos.flush();
            fos.close();
        } catch(Exception e){
            Log.e("Error --->", e.toString());
        }
    }

I asked around on different forums but I didn't get an answer which solved my problem.

Thanks for your patience in helping me :)

About saving the image and your issue.

There are multiple possible reasons for the issue:

1.) Due to the asynchronous and highly parallel nature of the code, the accumulator buffer is re-written while you are reading from it inside the loop. See the following diagram:

| Thread 1 | Thread 2  |
|----------|-----------|
| Render   |           |
| Render   | GetBitmap |   <- GetBitmap reads the accumulator while Render is still writing to it
| Render   | GetBitmap |
| Render   | saveImage |
| Render   |           |
Now GetBitmap doesn't read the correct data (it is re-written while GetBitmap reads it). The proper solution is to use a mutex lock on the shared resources (the accumulator buffer and samples, as both can be changed during the rendering phase).

First of all you need an instance of a mutex-like lock object to synchronize your code (in core Java this is typically a java.util.concurrent.Semaphore with a single permit); it must be accessible from both the render and getBitmap functions.
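A minimal sketch of such a shared lock, assuming it lives as a field on your renderer class (the name mutex matches the pseudo-code below):

// import java.util.concurrent.Semaphore;  (at the top of the file)

// One permit => only one thread may hold the lock at a time (mutex-style behaviour).
// It guards the shared accumulator buffer and the samples counter.
private final Semaphore mutex = new Semaphore(1);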

Your render and getBitmap will then look like:


public void render() {
    try {
        mutex.acquire();
        try {
            ... // The code of render will be here (or at least the part of code where you write into accumulator and samples - assuming you write to samples too)
        }
        finally {
            mutex.release();
        }
    }
    catch (InterruptedException ex) {
        ...
    }
}

...

public Bitmap getBitmap() {
    ... // Here you create bitmap object
    try {
        mutex.acquire();
        try {
            ... // The nested loop from getBitmap goes here
        }
        finally {
            mutex.release();
        }
    }
    catch (InterruptedException ex) {
        ...
    }
    // Here you return rBitmap
}

This way (using the mutex object), you synchronize your code to work like this:

| Thread 1 | Thread 2  |
|----------|-----------|
| Render   |           |
| Render   |           |
|          | GetBitmap |   <- GetBitmap waits for the lock, then reads a consistent buffer
|          | GetBitmap |
| Render   | saveImage |   <- saveImage no longer touches the accumulator, so it may overlap
| Render   |           |
Of course, this is only going to help you when those functions are actually processed on multiple threads (and I assume they are).
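For illustration, a minimal sketch of such a setup, assuming a renderer object mRender that exposes a locked render-step method (the names here are hypothetical, only mirroring the snippets in this thread):

// Hypothetical worker thread: keeps accumulating samples in the background
// while the UI thread stays free to call GetBitmap()/saveImage() from a button handler.
Thread renderThread = new Thread(new Runnable() {
    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            mRender.RenderOneStep(); // acquires the shared semaphore internally
        }
    }
});
renderThread.start();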

Another possible reason would be out-of-range colors (when your code is sequential, it is most likely related to the divisions or type casts) in this part of the code:
int r = (int)(accumulator[i*3+0] / (float)samples);
int g = (int)(accumulator[i*3+1] / (float)samples);
int b = (int)(accumulator[i*3+2] / (float)samples);

Make sure that r, g and b are indeed in the range 0 to 255 (just Log.v them to see whether something weird is happening). Although that would most likely affect the whole image, not just parts of it.
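If they can go out of range, a small clamp before packing the color is a common safeguard, since Color.rgb() expects each channel in 0..255; the helper below is a hypothetical addition, not something from your existing code:

// Hypothetical helper: clamp an averaged channel into 0..255 before passing it to Color.rgb().
private static int clampChannel(float value) {
    if (value < 0.0f) return 0;
    if (value > 255.0f) return 255;
    return (int) value;
}

It would be used like Color.rgb(clampChannel(accumulator[i*3+0] / (float)samples), ...) inside GetBitmap().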


Wow, thanks for your detailed answer, you are great! I really appreciate your help :D

So I implemented the mutex approach (it is a Semaphore in Android/Java). My two functions now look like this:


    public void RenderOneStep(){
        try{
            mSemaphore.acquire();
            try{
                int width = mCanvas.getWidth();
                int height = mCanvas.getHeight();
                int i = 0;
                for(int x = 0; x < width; x++){
                    for(int y = 0; y < height; y++){
                        CColor cPixel = ComputeSample(x,y,width,height);
                        accumulator[i*3+0] += cPixel.getR();
                        accumulator[i*3+1] += cPixel.getG();
                        accumulator[i*3+2] += cPixel.getB();
                        i++;
                    }
                }

                samples++;
            }
            finally{
                mSemaphore.release();
            }
        } catch (Exception e){
            Log.e("Exception in RenderOneStep-->", e.getMessage());
        }
    }

    public Bitmap GetBitmap(){
        int width = mCanvas.getWidth();
        int height = mCanvas.getHeight();

        Bitmap rBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

        try {
            mSemaphore.acquire();
            try{
                int i = 0;
                for(int x = 0; x < width; x++){
                    for(int y = 0; y < height; y++){
                        int r = (int)(accumulator[i*3+0] / (float)samples);
                        int g = (int)(accumulator[i*3+1] / (float)samples);
                        int b = (int)(accumulator[i*3+2] / (float)samples);

                        if(r > 255 || r < 0) {
                            Log.e("Value of Red: ", String.valueOf(r));
                        }
                        if(g > 255 || g < 0 ) {
                            Log.e("Value of Green: ", String.valueOf(g));
                        }
                        if(b > 255 || b < 0) {
                            Log.e("Value of Blue: ", String.valueOf(b));
                        }

                        rBitmap.setPixel(x,y,Color.rgb(r,g,b));
                        i++;
                    }
                }
            }
            finally {
                mSemaphore.release();
            }
        } catch (Exception e){
            Log.e("Exception in GetBitmap-->", e.getMessage());
        }

        return rBitmap;
    }

But this doesn't fix the problem. As you can see above, I now also check that the colors are in the correct range in GetBitmap().

As you said, there are a lot of strange red values over 255 (a lot of 286). But if I check the same colors in my DisplayResult function, I don't get any message in my log, although I think the processing is the same, isn't it?


    public Canvas DisplayResult(){
        int width = mCanvas.getWidth();
        int height = mCanvas.getHeight();

        Paint p = new Paint();
        int i = 0;
        for(int x = 0; x < width; x++){
            for(int y = 0; y < height; y++){
                int r = (int)(accumulator[i*3+0] / (float)samples); // the stored color is divided by the number of samples
                int g = (int)(accumulator[i*3+1] / (float)samples);
                int b = (int)(accumulator[i*3+2] / (float)samples);

                if(r > 255 || r < 0) {
                    Log.e("Value of Red: ", String.valueOf(r));
                }
                if(g > 255 || g < 0 ) {
                    Log.e("Value of Green: ", String.valueOf(g));
                }
                if(b > 255 || b < 0) {
                    Log.e("Value of Blue: ", String.valueOf(b));
                }

                p.setColor(Color.rgb(r,g,b));
                mCanvas.drawPoint(x,y,p);
                i++;
            }
        }

        Paint myPaint = new Paint();
        myPaint.setColor(Color.WHITE);
        mCanvas.drawText("Samples: " + String.valueOf(this.samples), 10, 25, myPaint);

        return mCanvas;
    }

Any idea why this happens? I mean, I am doing the exact same thing...

