hi all,
I am building a simple raytracer to generate images (first time doing anything of the sort). Right now the scene is just a sphere with a plane as a background. When the sphere is in the middle of the screen, the image looks like it should ( http://hometown.aol.com/donate52/images/result1.jpg ), but if I move the sphere, it leaves a sort of "trail" of rings behind the sphere leading back to the center ( http://hometown.aol.com/donate52/images/result2.jpg ). I was wondering if any of you have come across a problem like this and could shed some light. Here is part of my code.
This function shoots the rays from the viewpoint (0, 0, 0.1) and looks for intersections with the sphere. Po is the eye point, P is the direction, and Pt is the calculated point along the ray:
color trace(Point P, double step) {
    // step along the ray in fixed increments, looking for a hit
    for(int Z = 1; Z < 11; Z += 1) {
        Point Pt = Po + (P - Po) * Z;
        if(intersectSphere(Pt)) {
            // shade using lambertian model
            float val = acos(Dot(Normalize(Vector(Pt, sphere)), Normalize(light)));
            if(val < 0) { val = 0; }
            color local = SPHERE_COLOR * val;
            //color reflected = trace(Pt, step+1);
            return local;
        }
        if(intersectPlane(Pt)) {
            return PLANE_COLOR;
        }
    }
    return BACKGROUND;
}
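As a side note on the shading step: from what I've read, the Lambertian term is the cosine of the angle between the normal and the light direction, which is the dot product itself, so taking acos() of it gives an angle in radians rather than a brightness. A minimal standalone sketch of what I think the shading is supposed to compute (Vec3 here is just a stand-in for my Point/Vector classes, which aren't shown):

```cpp
#include <algorithm>
#include <cmath>

// stand-in for the Point/Vector classes used elsewhere
struct Vec3 {
    double x, y, z;
};

double dot(const Vec3 &a, const Vec3 &b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vec3 normalize(const Vec3 &v) {
    double len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Lambertian intensity: the cosine of the angle between the surface
// normal and the direction to the light, clamped to zero so that
// back-facing surfaces go dark. dot(n, l) already *is* that cosine,
// so no acos() is needed.
double lambert(const Vec3 &normal, const Vec3 &toLight) {
    return std::max(0.0, dot(normalize(normal), normalize(toLight)));
}
```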
This is the sphere collision method. The Round() method just rounds the floats in order to rule out errors with really small decimals. P is the point being tested against the sphere surface, and sphere is the point at the center of the sphere:
bool intersectSphere(Point P) {
    // squared distance from the sample point to the sphere center
    float total = Dot((P - sphere), (P - sphere));
    float rad2 = radius * radius;
    total = Round(total, 1);
    rad2 = Round(rad2, 1);
    return total == rad2;
}
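One alternative I've seen described is solving for the intersection analytically instead of stepping along the ray and comparing rounded distances: substituting the ray P(t) = origin + t*dir into |P(t) - center|^2 = r^2 gives a quadratic in t, and the discriminant tells you hit or miss exactly. A rough standalone sketch of that approach (Vec3 again being a stand-in for my Point class):

```cpp
#include <cmath>

// stand-in for the Point/Vector classes used elsewhere
struct Vec3 { double x, y, z; };

Vec3 sub(const Vec3 &a, const Vec3 &b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}
double dot(const Vec3 &a, const Vec3 &b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Analytic ray-sphere intersection. Ray: P(t) = origin + t*dir (dir need
// not be unit length). Expanding |P(t) - center|^2 = r^2 gives
//   (d.d) t^2 + 2 (d.oc) t + (oc.oc - r^2) = 0,   oc = origin - center.
// On a hit, writes the nearest positive root into tHit and returns true.
bool intersectSphere(const Vec3 &origin, const Vec3 &dir,
                     const Vec3 &center, double radius, double &tHit) {
    Vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(dir, oc);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return false;        // ray misses the sphere entirely
    double sq = std::sqrt(disc);
    double t0 = (-b - sq) / (2.0 * a);   // nearer root
    double t1 = (-b + sq) / (2.0 * a);   // farther root
    tHit = (t0 > 1e-6) ? t0 : t1;       // skip hits behind the eye
    return tHit > 1e-6;
}
```

With this there is no Round() needed, and the hit point is exact: origin + tHit*dir.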
Am I casting the rays wrong, or testing for intersection incorrectly? I'm open to any suggestions; I haven't been able to figure out the problem and it's driving me crazy. Thanks!