Ray Tracing: Sphere distortion due to Camera Movement


I am building a ray tracer from scratch. My question is: when I change the camera coordinates, the sphere turns into an ellipse, and I don't understand why. Here are some images to show the artefacts:
Sphere: 1 1 -1 1.0 (center, radius)
Camera: 0 0 5 0 0 0 0 1 0 45.0 1.0 (eye position, look-at, up, fovy, aspect)
But when I change the camera coordinates, the sphere looks distorted, as shown below:
Camera: -2 -2 2 0 0 0 0 1 0 45.0 1.0
I don't understand what is wrong. If someone can help, that would be great!
I set up my image plane as follows:

//Computing the u,v,w axes of the camera as follows:
Vector a = Normalize(eye - lookat); //Camera_eye - Camera_lookAt
Vector b = up;                      //Camera up
Vector m_w = a;
Vector m_u = b.cross(m_w);
Vector m_v = m_w.cross(m_u);


After that, I compute a direction for each pixel from the camera position (eye), as shown below:


//Then computing the directions as follows:
int half_w = m_width * 0.5;
int half_h = m_height * 0.5;
double half_fy = fovy() * 0.5;
double angle = tan( ( M_PI * half_fy ) / (double)180.0 );

for(size_t k = 0; k < pixels.size(); k++){
  double j = pixels[k].x(); //width
  double i = pixels[k].y(); //height

  double XX = aspect() * angle * ( (j - half_w) / (double)half_w );
  double YY = angle * ( (half_h - i) / (double)half_h );

  Vector dir = (m_u * XX + m_v * YY) - m_w;
  directions.push_back(dir);
}


After that:


for(const Vector& dir : directions){
  Ray ray(eye, dir);
  int depth = 0;
  t_color += Trace(g_primitive, ray, depth);
}

I have only provided the code that I think is important; if more snippets are required, I will provide them.


Thanks for your help.

Most likely there is a discrepancy between the field-of-view angle you are using to generate rays and the field of view you expect to "see" in the rendered image. This kind of distortion is common with FOV mismatches: a wide-angle perspective projection stretches objects away from the centre of the image, so an off-axis sphere renders as an ellipse.

Experiment with changing how close together your rays are per pixel. Packing the rays closer together narrows your FOV and gives a sort of "zoom lens" effect that minimizes projection distortion, but it also obviously limits the range you can "see" on each axis. Spreading the rays further apart increases distortion but lets you see a wider range. Play with various settings until you get an intuitive sense of how your FOV affects the rendered results.
