Ilici

OpenGL Ray generation

This topic is 4798 days old, which is more than the 365-day threshold we allow for new replies. Please post a new topic.

Recommended Posts

I'm beginning work on a raytracer and I've managed to get the basics up and running, but I've started running into some problems with ray generation. When I set up my camera at (0, 0, -10) looking at (0, 0, 0), with up (0, 1, 0), the ray in the middle of the 512x512 viewport should be (0, 0, 1), right? (Pointing straight at the lookat point.) Here's my code for creating the matrices:
inline void mat_perspective(matrix44* res, float fov, float aspect, float hither, float yonder)
{
	float invtanf = 1.0f / tanf(fov / 2.0f);
	mat_load(res,
		invtanf / aspect, 0, 0, 0,
		0, invtanf, 0, 0,
		0, 0, -(yonder + hither) / (hither - yonder), yonder * hither / (hither - yonder),
		0, 0, 1, 0);
}

inline void mat_lookat(matrix44* res, const vec3& Position, const vec3& View, const vec3& Up)
{
	vec3 dir, right, newUp;
	vec_sub(View, Position, &dir);
	vec_normalize(dir, &dir);

	vec_normalize(Up, &right);
	vec_cross(dir, right, &right);

	vec_cross(right, dir, &newUp);

	mat_load(res,
		right.x, newUp.x, dir.x, Position.x,
		right.y, newUp.y, dir.y, Position.y,
		right.z, newUp.z, dir.z, Position.z,
		0, 0, 0, 1.0f);
}

inline void mat_viewport(matrix44* res, float x, float y, float width, float height)
{
	mat_load(res, 
		(width - x) / 2, 0, 0, (width + x) / 2,
		0, (height - y) / 2, 0, (height + y) / 2,
		0, 0, 1, 0,
		0, 0, 0, 1);
}



And for creating the camera and generating rays:
void cam_perspective(Camera* cam, const vec3& pos, const vec3& view, const vec3& up, 
						float fov, float width, float height, float hither, float yonder)
{
	mat_lookat(&cam->WorldToCamera, pos, view, up);
	mat_perspective(&cam->CameraToScreen, fov, width / height, hither, yonder);
	mat_viewport(&cam->ScreenToRaster, 0, 0, width, height);

	mat_inverse(cam->WorldToCamera, &cam->CameraToWorld);
	mat_inverse(cam->CameraToScreen, &cam->ScreenToCamera);
	mat_inverse(cam->ScreenToRaster, &cam->RasterToScreen);

	mat_mul(cam->RasterToScreen, cam->ScreenToCamera, &cam->RasterToCamera);

	mat_print(cam->RasterToScreen);

	cam->scr_width = width;
	cam->scr_height = height;
	cam->hither = hither;
	cam->yonder = yonder;
}

void cam_generate_ray(const Camera& cam, const Sample& sample, ray3* ray)
{
	vec_load(&ray->o, 0, 0, 0);
	vec_load(&ray->d, sample.imageX, sample.imageY, 0);

	//we need to transform this as a point
	mat_transf_point(cam.RasterToCamera, ray->d, &ray->d);

	vec_normalize(ray->d, &ray->d);
	ray->mint = 0.0f;
	ray->maxt = (cam.yonder - cam.hither) / ray->d.z;

	mat_transf_point(cam.WorldToCamera, ray->o, &ray->o);
	mat_transf_vec(cam.CameraToWorld, ray->d, &ray->d);
}
I know it's quite a bit of code to post, but I've looked in different places and the perspective matrices are given quite differently. For example: 1 | 2 | 3 | 4. I'm using the one OpenGL uses, but with 1 in m[3][2] (taken from "Physically Based Rendering" by Pharr and Humphreys). Another question: I've seen some places use NDCs and some use canonical coordinates. Is there any important difference affecting the end result?

I'm not sure about your code, but here is what I use. Note that I use a left-handed coordinate system, while you're using a right-handed one (at least I think so, since you're using an OpenGL matrix format, aren't you?).

bool QLCamera::GenerateRay(uint x, uint y, ray &r) const
{
	r.Set(viewp[0] + dx*x + sx*cx + dy*y - sy*cy - position, position);
	return true;
}

void QLCamera::SetupCamera()
{
	vector3 at(position + direction);
	vector3 up(0, 1, 0);
	vector3 p(position);
	ResetCamera();

	position = p;
	direction = at - position;

	matrix m;
	m.LookAt(position, at, up);
	m.Invert();

	// Transform the four viewplane corners into world space
	m.Apply(viewp[0]);
	m.Apply(viewp[1]);
	m.Apply(viewp[2]);
	m.Apply(viewp[3]);

	dx.Set((viewp[1] - viewp[0]) / res_x);
	dy.Set((viewp[2] - viewp[0]) / res_y);
	sx.Set(dx / samples_x);
	sy.Set(dy / samples_y);

	diameter = focal / fstop;
	ppmx = res_x / cameras[current_camera].w;
	ppmy = res_y / cameras[current_camera].h;
}


I don't do the job using matrices. Instead, I use four points that represent the viewplane corners, and transform them before I generate the rays. Then I interpolate through those corners to generate each ray.
Sorry if that does not answer your question, but I hope it helps.
By the way, is there some reason to use your method instead of mine? Is yours faster?

Given the camera position and lookat point you provided, then yes, the center ray should originate at (0, 0, -10) and point in the direction (0, 0, 1) after normalization.

I've always used the same method as cignox1 described - four points provide the viewplane corners (based on desired FOV and aspect ratio) and then the rays for each individual pixel are interpolated from those four corners. Generally in my experience this avoids some pitfalls with weird aspect problems and the occasional camera rotation oddity.

You said you've run into some problems but you didn't give any details about what sort of problems. Are rays being generated incorrectly? Is there an aspect or FOV problem? Is there a strange magnification problem (camera seems too zoomed in/zoomed out)?

Quote:
Original post by ApochPiQ
You said you've run into some problems but you didn't give any details about what sort of problems. Are rays being generated incorrectly? Is there an aspect or FOV problem? Is there a strange magnification problem (camera seems too zoomed in/zoomed out)?


Yes, the rays are generated incorrectly: in the middle of the viewport I don't get (0, 0, 1) but something close to it, off by quite a large angle. I was wondering if my matrices are correct.

Sounds like it's most likely a calculation problem somewhere. Depending on how close the values are it may even be a floating-point precision issue, but I doubt it if the discrepancy is large enough to be noticeable.
