Hi Vilem,
thank you very much for your reply.
And btw, Happy New Year to the whole community here!
Thanks, you too!
So I've got the code to this point now:
private CColor ComputeSample(int x, int y, int width, int height){
    float fov = 160.0f * (float)Math.PI / 180.0f;
    float zdir = 1.0f / (float)Math.tan(fov);
    float aspect = (float)height / (float)width;
    float xdir = (x / (float)width) * 2.0f - 1.0f;
    float ydir = ((y / (float)height) * 2.0f - 1.0f) * aspect;

    /*
     * Anti-Aliasing
     */
    float xone = (((float)width * 0.5f + 1.0f) / (float)width) * 2.0f - 1.0f;
    float yone = ((((float)height * 0.5f + 1.0f) / (float)height) * 2.0f - 1.0f) * aspect;

    Vector3D base = new Vector3D(0.0f, 0.0f, zdir).normalize();
    Vector3D xvec = new Vector3D(xone, 0.0f, zdir).normalize();
    Vector3D yvec = new Vector3D(0.0f, yone, zdir).normalize();

    Vector3D dx = xvec.sub(base);
    Vector3D dy = yvec.sub(base);
    float dxSize = dx.magnitude();
    float dySize = dy.magnitude();

    int AA_SAMPLES = 4;
    CColor c = new CColor(0.0f, 0.0f, 0.0f);
    for(int aa = 0; aa < AA_SAMPLES; aa++){
        float r1 = getRndFloat(rnd) * 0.5f;
        float r2 = getRndFloat(rnd) * 0.5f;
        Vector3D direction = new Vector3D(xdir + r1 * dxSize, ydir + r2 * dySize, zdir).normalize();
        Ray ray = new Ray(mCamera, direction);
        c.add(Trace(ray, 1));
    }
    return c.divide(AA_SAMPLES);

    //Ray ray = new Ray(mCamera, new Vector3D(xdir, ydir, zdir).normalize());
    //return Trace(ray, 1);
}
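Just to convince myself the footprint maths is right, I checked it in isolation (standalone sketch with a hypothetical 320x240 resolution, not tied to the renderer): xone is the NDC x-coordinate of the pixel one step right of the screen centre, which algebraically reduces to 2/width, so dxSize/dySize really do measure the ray spread across a single pixel, i.e. the footprint the r1/r2 jitter fills.

```java
// Standalone sanity check (hypothetical resolution, not my renderer classes).
public class FootprintCheck {
    // Same formula as in ComputeSample: NDC x of the pixel right of centre.
    public static float xone(int width) {
        return (((float) width * 0.5f + 1.0f) / (float) width) * 2.0f - 1.0f;
    }

    public static void main(String[] args) {
        int width = 320;
        // Both print roughly the same value: one pixel == 2/width in NDC.
        System.out.println(xone(width));
        System.out.println(2.0f / width);
    }
}
```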
And I think it works now:
http://img5.fotos-hochladen.net/uploads/adelpath0302043hzdliocx.png
Sorry to ask you something which is not directly related to path tracing, but maybe you can help me:
As you can see in the image above, there is a green area on the left sphere. I implemented a function which saves the current colours held in the accumulator to a bitmap file in the smartphone's local storage. Sadly, these strange artefacts appear from time to time (sometimes both spheres are red or green).
In my Renderer class I use this function to get my bitmap:
public Bitmap GetBitmap(){
    int width = mCanvas.getWidth();
    int height = mCanvas.getHeight();
    Bitmap rBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    int i = 0;
    for(int x = 0; x < width; x++){
        for(int y = 0; y < height; y++){
            int r = (int)(accumulator[i*3+0] / (float)samples);
            int g = (int)(accumulator[i*3+1] / (float)samples);
            int b = (int)(accumulator[i*3+2] / (float)samples);
            rBitmap.setPixel(x, y, Color.rgb(r, g, b));
            i++;
        }
    }
    return rBitmap;
}
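One detail I double-checked while pasting this: with x in the outer loop and i++ in the inner one, GetBitmap reads the accumulator column-major (i == x * height + y). That's fine as long as the Display Result class fills it in the same order, but if the render loop writes row-major, every channel lands on the wrong pixel. A tiny standalone sketch of the two conventions (hypothetical helper names, not my classes):

```java
// Standalone sketch: the same (x, y) pixel maps to different flat indices
// depending on which loop is outermost. Mixing the two conventions when
// writing and reading the accumulator scrambles the colour channels.
public class IndexOrder {
    public static int columnMajor(int x, int y, int height) { return x * height + y; }
    public static int rowMajor(int x, int y, int width)      { return y * width + x; }

    public static void main(String[] args) {
        int width = 4, height = 3;
        // Pixel (2, 1) lands at different offsets under each convention:
        System.out.println(columnMajor(2, 1, height)); // 7
        System.out.println(rowMajor(2, 1, width));     // 6
    }
}
```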
I think I'm doing this the correct way, because I just copy-pasted the code from the Display Result class. But could these artefacts be caused by the fact that I can save the image whenever I want by a button press, while the accumulator already holds the colour values of the next sample for a few pixels?
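If saving really can interleave with rendering like that, one way I thought of to rule it out is to snapshot the accumulator under a lock, so a save never sees a half-finished sample pass. A minimal standalone sketch of that idea (hypothetical class and method names, not my actual renderer):

```java
// Minimal sketch: the render thread adds one full sample pass under a lock,
// and the save path copies the accumulator under the same lock, so a
// snapshot never mixes pixels from two different passes.
public class AccumulatorSnapshot {
    private final float[] accumulator; // 3 floats (r, g, b) per pixel
    private int samples = 0;
    private final Object lock = new Object();

    public AccumulatorSnapshot(int pixels) {
        this.accumulator = new float[pixels * 3];
    }

    // Called by the render loop once per finished sample pass.
    public void addSamplePass(float[] pass) {
        synchronized (lock) {
            for (int i = 0; i < accumulator.length; i++)
                accumulator[i] += pass[i];
            samples++;
        }
    }

    // Called from the save path: copy and average everything atomically.
    public float[] snapshotAveraged() {
        synchronized (lock) {
            float[] out = new float[accumulator.length];
            for (int i = 0; i < accumulator.length; i++)
                out[i] = accumulator[i] / (float) samples;
            return out;
        }
    }

    public static void main(String[] args) {
        AccumulatorSnapshot acc = new AccumulatorSnapshot(1);
        acc.addSamplePass(new float[]{2f, 4f, 6f});
        acc.addSamplePass(new float[]{4f, 4f, 6f});
        System.out.println(acc.snapshotAveraged()[0]); // averaged red channel
    }
}
```

GetBitmap would then read from the snapshot array instead of the live accumulator.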
I am not sure if you are familiar with Android, but I use this code to save my image through a FileOutputStream:
public void saveImage(){
    Bitmap bitmap = mRender.GetBitmap();
    try{
        Date d = new Date();
        SimpleDateFormat sdf = new SimpleDateFormat("hhmm ddMMyy");
        sdf.setTimeZone(TimeZone.getTimeZone("Europe/Amsterdam"));
        String dateString = sdf.format(d);
        File file = new File(Environment.getExternalStorageDirectory(), "AdelPath_" + dateString + ".png");
        if(!file.exists())
            file.createNewFile();
        FileOutputStream fos = new FileOutputStream(file);
        bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
        fos.flush();
        fos.close(); // without closing the stream the PNG can end up truncated
    } catch(Exception e){
        Log.e("Error --->", e.toString());
    }
}
I asked around on different forums, but I didn't get an answer which solved my problem.
Thanks for your endurance in helping me!