Ray Tracing: Sphere distortion due to Camera Movement


#1 sinbag   Members   -  Reputation: 104


Posted 06 January 2013 - 03:09 PM

I am building a ray tracer from scratch. My problem is that when I change the camera coordinates, the sphere is rendered as an ellipse. I don't understand why this happens. Here are some images showing the artefacts:
Sphere: 1 1 -1 1.0 (center, radius)
Camera: 0 0 5 0 0 0 0 1 0 45.0 1.0 (eyepos, lookat, up, fovy, aspect)
[attached image: sphere rendered as expected]
But when I change the camera coordinates, the sphere looks distorted, as shown below:
Camera: -2 -2 2 0 0 0 0 1 0 45.0 1.0
[attached image: sphere stretched into an ellipse]
I don't understand what is wrong. If someone can help, that would be great!
I set up my image plane as follows:

// Computing the camera's u, v, w axes:
Vecter a = Normalize(eye - lookat);  // from lookat toward the eye (Camera_eye - Camera_lookAt)
Vecter b = up;                       // camera up vector
Vecter m_w = a;                      // w points back along the viewing direction
Vecter m_u = b.cross(m_w);           // u = up x w
m_u.normalize();
Vecter m_v = m_w.cross(m_u);         // v = w x u

 

After that I compute a direction for each pixel from the camera position (eye), as shown below:

 

// Then computing the per-pixel ray directions:
int half_w = m_width * 0.5;
int half_h = m_height * 0.5;
double half_fy = fovy() * 0.5;
double angle = tan((M_PI * half_fy) / 180.0);   // tangent of the vertical half-angle

for (size_t k = 0; k < pixels.size(); k++) {
    double j = pixels[k].x();   // pixel column (width)
    double i = pixels[k].y();   // pixel row (height)

    double XX = aspect() * angle * ((j - half_w) / (double)half_w);
    double YY = angle * ((half_h - i) / (double)half_h);

    Vecter dir = (m_u * XX + m_v * YY) - m_w;
    directions.push_back(dir);
}

 

After that:

 

for (size_t k = 0; k < directions.size(); k++) {
    Ray ray(eye, directions[k]);
    int depth = 0;
    t_color += Trace(g_primitive, ray, depth);
}

I have only provided the code that I think is relevant, but I can post more snippets if needed.

 

Thanks for your help.

#2 ApochPiQ   Moderators   -  Reputation: 14252


Posted 07 January 2013 - 06:12 AM

Most likely there is a discrepancy between the field-of-view angle you are using to generate rays and the field of view you expect to "see" based on the rendered image and your physical eye's FOV. This kind of distortion is common with FOV mismatches.

Experiment with changing how close together your rays are per pixel. Making the rays closer together will narrow your FOV and give a sort of "zoom lens" effect that minimizes projection distortion, but it also limits how much you can "see" along each axis. Making the rays farther apart increases distortion but lets you see more. Play with various settings until you get an intuitive sense for how your FOV affects the rendered results.
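
For illustration, here is a minimal standalone sketch of that relationship (not the original poster's code; Vec3, fovyDegrees, width, and height are made-up names). The tangent of the vertical half-angle scales how far apart neighbouring per-pixel directions fan out, so shrinking the FOV packs the rays closer together:

#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

// Build one camera-space direction per pixel. 'scale' is tan(fovy / 2);
// a smaller fovy gives a smaller scale, so neighbouring directions end up
// packed closer together (narrower view, less stretching near the edges).
std::vector<Vec3> makeDirections(double fovyDegrees, double aspect,
                                 int width, int height)
{
    const double pi = 3.14159265358979323846;
    const double scale = std::tan(fovyDegrees * 0.5 * pi / 180.0);
    std::vector<Vec3> dirs;
    dirs.reserve(static_cast<size_t>(width) * height);

    for (int i = 0; i < height; ++i) {
        for (int j = 0; j < width; ++j) {
            // Map the pixel centre to [-1, 1] on each axis, then scale by
            // tan(fovy / 2) (and by the aspect ratio horizontally).
            double xx = aspect * scale * ((j + 0.5) - 0.5 * width)  / (0.5 * width);
            double yy =          scale * (0.5 * height - (i + 0.5)) / (0.5 * height);
            dirs.push_back({xx, yy, -1.0}); // same as u*xx + v*yy - w in world space
        }
    }
    return dirs;
}

With fovyDegrees = 45 and aspect 1.0, the corner directions are roughly 30 degrees off the view axis, which is far enough off-axis that a sphere's outline is visibly stretched by the perspective projection; dropping the FOV to around 25 degrees keeps the corner rays much closer to the axis, at the cost of covering less of the scene.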
