Ray generation with rotated camera...

Started by
7 comments, last by forsandifs 13 years, 5 months ago
For a system like the one illustrated in the following picture:



I am using the following code to generate rays from the camera center to each pixel on its orthogonally oriented screen:

Ray.Origin = CamPos;
float3 XTerm = PixelWidth * (-ResWidth/2 + PixelCoord.x + 0.5) * normalize( RightVector );
float3 YTerm = PixelHeight * (ResHeight/2 - PixelCoord.y - 0.5) * normalize( UpVector );
float3 ZTerm = normalize( LookVector ) * CamLength;
Ray.Direction = normalize(ZTerm + YTerm + XTerm);
return Ray;
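(For reference, the same per-pixel construction can be sketched in plain Python; the function and parameter names mirror the snippet above rather than any particular engine API, and are assumptions for illustration only.)

```python
import math

def normalize(v):
    # Scale a 3-vector to unit length.
    n = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / n, v[1] / n, v[2] / n)

def generate_ray(cam_pos, look, up, right, pixel,
                 res_w, res_h, pixel_w, pixel_h, cam_length):
    """Ray from the camera through the centre of pixel (x, y)."""
    look, up, right = normalize(look), normalize(up), normalize(right)
    # Signed offsets of the pixel centre along the Right and Up axes.
    sx = pixel_w * (-res_w / 2 + pixel[0] + 0.5)
    sy = pixel_h * ( res_h / 2 - pixel[1] - 0.5)
    direction = (sx * right[0] + sy * up[0] + cam_length * look[0],
                 sx * right[1] + sy * up[1] + cam_length * look[1],
                 sx * right[2] + sy * up[2] + cam_length * look[2])
    return cam_pos, normalize(direction)

# Axis-aligned camera, pixel near the image centre: the ray points
# almost straight down the look axis (+Z).
origin, d = generate_ray((0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0),
                         (800, 450), 1600, 900, 0.001, 0.001, 0.8)
```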


This works well when the camera is aligned with the world axes ( LookVector(0,0,1); UpVector(0,1,0); RightVector(1,0,0); ) and produces the following picture:



But when I change the orientation vectors to represent a 45 degree rotation around the y axis (i.e. LookVector(1,0,1); UpVector(0,1,0); RightVector(1,0,-1); ), it does not work as intended and produces the following aberration:



I have been banging my head against this problem all day. I've tried generating the rays with a world-aligned orientation and then applying the appropriate rotation matrix to the direction, but I always get the same messed-up result! The system seems so easy to visualise and the method I'm using seems so intuitively correct! I have no idea what I'm doing wrong. Please help.
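(As a sanity check on the orientation vectors themselves: rotating the world-aligned basis by 45 degrees about Y with an explicit rotation matrix does reproduce the vectors listed above, up to normalization. A minimal Python sketch, with a hand-rolled `rotate_y` since no particular math library is in use here:)

```python
import math

def rotate_y(v, angle_deg):
    # Rotate a 3-vector about the world Y axis (right-handed convention).
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return (c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2])

# Rotating the world-aligned basis by 45 degrees about Y:
look  = rotate_y((0, 0, 1), 45)  # proportional to (1, 0, 1)
right = rotate_y((1, 0, 0), 45)  # proportional to (1, 0, -1)
up    = rotate_y((0, 1, 0), 45)  # unchanged: (0, 1, 0)
```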
I don't see any obvious problems there (although I didn't proof all of your math). It's a little atypical to use non-unit-length direction vectors, I think, but you appear to be normalizing them in your example code.

In principle, generating the ray directions in local space and then transforming them should work, and that appears to be what you're doing currently. If you get the same results whether you do it 'manually' as in the code you posted or by using a matrix transform, that suggests the problem may be elsewhere.

Do you have any way to create a debug visualization? Or is your only means of visualization the output of the raytracer itself? If you do have some other means of visualization, it seems you could create another camera, position it so as to observe the first camera, and then step through, ray by ray, to see if the ray directions are coming out wrong. (Of course the problem could be somewhere else in the pipeline, but that might at least help you narrow things down.)
Thank you.

Unfortunately the ray trace output is my only means of visualization.

It might be worth noting that:

when I rotate around the Look axis by 45 degrees ( LookVector(0,0,1); UpVector(1,1,0); RightVector(1,-1,0); ) I get the correct result:


but when I rotate around the Right axis by 45 degrees ( Look(0,1,1); Up(0,1,-1); Right(1,0,0); ) I get the wrong result:

Maybe it's because the camera position is too close to the view plane, so your field of view is very large. What are the values of PixelWidth, PixelHeight and CamLength?
Quote:Original post by knighty
Maybe it's because the camera position is too close to the view plane, so your field of view is very large. What are the values of PixelWidth, PixelHeight and CamLength?


PixelWidth = 0.001;

PixelHeight = 0.001;

CamLength = 0.1;

(ResHeight and ResWidth are 900 and 1600 respectively).
That's what I thought: your field of view is too large. With the values you gave, the field of view (in the left-right direction) is:
2 * arctan(PixelWidth * ResWidth / 2 / CamLength) = 2 * arctan(0.001 * 1600 / 2 / 0.1) ≈ 165.75°
which is very large.

If you set a variable FOV as the field of view angle, CamLength should be (if I did the math correctly):

CamLength = PixelWidth * ResWidth / 2 / tan( FOV / 2 )

For an FOV=90° it gives: CamLength = 0.8
After filling in the vectors for the bottom-left corner of your second image (rotated 45 degrees about Y), I noticed your field of view is indeed very large.
With a resolution of 1600x900 and a pixel width/height of 0.001, you get a view plane of size 1.6 by 0.9. The camera sits only 0.1 from this plane, which gives a horizontal half-angle of about 83 degrees from the center of your view. This is far too large to produce a good image.

Try setting your CamLength to 1.0 for instance.

EDIT: beaten apparently :)
Thank you very much guys! That certainly improves matters a great deal, rotation is now working as intended. (I didn't mean to have such a large FOV btw, I had erred in my mental image of the dimensions by a factor of 10).

Now I have another headache though.

Using the method presented in the OP (with corrected FOV) the picture does not respond (well) to changes in camera position.

Using CamPos(0,0,0) gives the same result as CamPos(0,2,5) (where the triangle is at a distance of 10 and has a height of 2 and a width of 2). However, when I use CamPos(2,0,0) I get a displacement of the triangle to the left of the screen AND to the bottom of the screen...

Interestingly though, when I calculate the direction by first calculating the pixel centers with the world-aligned axes, applying the appropriate rotation matrix, and then taking normalize(PixelCenter - CamPos), the picture responds correctly to changes in CamPos, BUT I get a blank screen when CamPos.z < 0 ... which is not intended behaviour...

This is doubly puzzling to me because as far as my grasp of the geometry goes, calculating the direction using the camera orientation vectors should give the same result as rotating the pixel centers...
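(The expected equivalence does hold, provided the rotation is applied to the pixel-centre *offset* from the camera rather than to an absolute position; if the pixel centres are rotated about the world origin while CamPos is nonzero, the two methods diverge. A sketch under assumed names and arbitrary offset values:)

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def rotate_y(v, angle_deg):
    # Rotate a 3-vector about the world Y axis.
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return (c * v[0] + s * v[2], v[1], -s * v[0] + c * v[2])

cam_pos = (2.0, 0.0, 0.0)
sx, sy, cam_length = 0.3, -0.2, 0.8  # arbitrary pixel offsets (assumed values)

# Method 1: combine the offsets along the rotated basis vectors.
look, up, right = rotate_y((0, 0, 1), 45), (0, 1, 0), rotate_y((1, 0, 0), 45)
d1 = normalize(tuple(sx * r + sy * u + cam_length * l
                     for r, u, l in zip(right, up, look)))

# Method 2: rotate the local pixel-centre OFFSET, add it to CamPos,
# then take normalize(PixelCenter - CamPos).
pixel_center = tuple(p + o for p, o in
                     zip(cam_pos, rotate_y((sx, sy, cam_length), 45)))
d2 = normalize(tuple(pc - p for pc, p in zip(pixel_center, cam_pos)))

# d1 == d2: rotating the offset (not the absolute position) keeps the
# two constructions identical regardless of CamPos.
```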

Any ideas?
Solved! The secondary problem of the image not responding to changes in camera position was due to a silly mistake on my part.

All is working as intended now! Thank you very much once again guys!

It's always a massive relief when you find the solution to the problem and can be once again 95% certain that you aren't going crazy and that logic is actually infallible.

This topic is closed to new replies.
