

Ray Tracing Camera problem


I'm writing a ray tracer and encountered a problem with my camera that I was hoping someone could help me out with.

Scene using Pov-Ray
Same scene using my ray tracer

From what I can tell, it looks like I may have inadvertently used a fisheye algorithm for my camera.

My camera code. CCamera includes:
m_fFOV = the field of view
m_vLookAt, m_vUp, m_vRight = normalized vectors pointing in each direction
m_vPosition = position of the camera
// CCamera::GetRay() = Returns a ray with a start point at the near clipping plane, and an end point at the far clipping plane pointing in a direction to pass from the viewpoint (CCamera::m_vPosition) through the pixel(x,y) in the viewport

// parameters <x> and <y> are the pixel locations of the pixel to be mapped in the output image

// <Width> and <Height> are the total dimensions of the output image

// <Default> is the color of the ray if no objects are intersected (background color)

const CRay CCamera::GetRay( int x, int y, int Width, int Height, CColor Default ) const
{
	// Aspect Ratio * FOV
	float AspectFOV = float(Height)/float(Width) * m_fFOV;

	// Rotate around the Up vector by a factor of x/Width * FOV,
	// then around the Right vector by a factor of y/Height * AspectRatio * FOV
	CVector RayDir = m_vLookAt.Rotate( m_vUp,    m_fFOV/2.0f - (m_fFOV * float(x)/float(Width-1)) )
	                          .Rotate( m_vRight, (AspectFOV * float(y)/float(Height-1)) - AspectFOV/2.0f );

	return CRay( m_vPosition + (RayDir * m_fNearClip), m_vPosition + (RayDir * m_fFarClip), Default );
}

// ... Rotate function for Vectors, as used in m_vLookAt.Rotate()

const CVector CVector::Rotate( const CVector &NormalAxis, float Angle ) const
{
	const CVector Axis = NormalAxis.UnitVector();
	float x = Axis.m_fX;
	float y = Axis.m_fY;
	float z = Axis.m_fZ;

	float cosA  = cosf(Angle);
	float sinA  = sinf(Angle);
	float icosA = 1.0f - cosA;

	float xsinA = x * sinA;
	float ysinA = y * sinA;
	float zsinA = z * sinA;

	// Fill the axis-angle rotation matrix
	float m[4][4] = { { cosA + x*x*icosA,  x*y*icosA - zsinA,  x*z*icosA + ysinA,  0.0f },
	                  { x*y*icosA + zsinA, cosA + y*y*icosA,   y*z*icosA - xsinA,  0.0f },
	                  { x*z*icosA - ysinA, y*z*icosA + xsinA,  cosA + z*z*icosA,   0.0f },
	                  { 0.0f,              0.0f,               0.0f,               1.0f } };

	return CVector( m_fX*m[0][0] + m_fY*m[1][0] + m_fZ*m[2][0],
	                m_fX*m[0][1] + m_fY*m[1][1] + m_fZ*m[2][1],
	                m_fX*m[0][2] + m_fY*m[1][2] + m_fZ*m[2][2] );
}

Can anyone see what I may be doing wrong? Thanks in advance.

I made the same mistake when I first tried to write a raytracer. You must not rotate the ray directions.

Instead, you should find the coordinates of the corners of your image on your view plane, and create row and column vectors... Then you can interpolate on those row and column vectors to get the actual position of the center of your pixel in space... I hope you see what I mean.

And the ray direction is...

RayDirection = PixelPosition - CameraOrigin;

Obviously, you will need to normalize that vector. The code to do all this isn't very complicated either.
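A minimal sketch of that approach, using an illustrative standalone Vec3 type rather than the thread's CCamera members (all names here are assumptions): size the view plane from the FOV, then interpolate across it per pixel.

```cpp
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    Vec3 Normalized() const {
        float len = std::sqrt(x * x + y * y + z * z);
        return {x / len, y / len, z / len};
    }
};

// Pinhole ray generation: place a view plane one unit in front of the
// camera along lookAt, size it from the horizontal FOV, and interpolate
// across it to find the pixel's position relative to the camera origin.
Vec3 PixelRayDir(const Vec3& lookAt, const Vec3& up, const Vec3& right,
                 float fovRadians, int x, int y, int width, int height)
{
    float halfW = std::tan(fovRadians * 0.5f);           // half-width of view plane
    float halfH = halfW * float(height) / float(width);  // keep the aspect ratio

    // u, v sweep from -1 to +1 across the image.
    float u = 2.0f * float(x) / float(width - 1) - 1.0f;
    float v = 1.0f - 2.0f * float(y) / float(height - 1); // flip so +v is up

    // Pixel position relative to the camera origin, so this difference
    // is already PixelPosition - CameraOrigin; just normalize it.
    Vec3 pixelPos = lookAt + right * (u * halfW) + up * (v * halfH);
    return pixelPos.Normalized();
}
```

The returned direction can then be scaled by the near and far clip distances and added to the camera position, exactly as the original GetRay() does when it builds its CRay.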



All I have to say is "Man, am I stupid!!!"

Thanks for your help Max_Payne, but it turns out the camera was working fine (for the most part). It was the plane that wasn't working right.

The image as it was supposed to look (with messed up plane intersection)

If you take the messed-up picture from my previous post and flip it vertically, it will more closely resemble the Pov-Ray picture.

I can't believe I spent all day on this.
