Ray Tracing - generating eye rays

I'm writing a simple ray tracer (it was meant to be a one-day project, but now I'm stumped) that will simply draw the silhouette of the objects in the scene. Now, I've read around a bit and found out how to generate the eye rays, but there seem to be a few different methods. I'm trying the following (using PixelToaster to get the pixels on screen):
#include "PixelToaster.h"
#include "RTSphere.h"
#include <Ray.h>
#include <limits>

using namespace PixelToaster;
using namespace std;

void clear(vector<TrueColorPixel> &image, TrueColorPixel &color);

int main(int argc, char *argv[])
{
	const int width = 800;
	const int height = 600;

	Display display("Ray Tracer", width, height, Output::Default, Mode::TrueColor);
	vector<TrueColorPixel> pixels(width * height);

	clear(pixels, TrueColorPixel(0, 0, 255));

	float fovW = toRadians(45.0f);
	float fovH = (float)height / (float)width * fovW;
	float tanFovW = tan(fovW);
	float tanFovH = tan(fovH);
	float zNear = -1.0f;
	float zFar = -1000.0f;

	Vector3f camPos(0.0f, 0.0f, 0.0f);
	Vector3f lookAt(0.0f, 0.0f, zNear);
	lookAt.normaliseSelf();
	
	RTSphere sphere(0.5f, Vector3f(0.0f, 0.0f, -5.0f));

	//Perform raytracing.
	unsigned int index = 0;
	for(int v = 0; v < height; ++v)
	{
		for(int u = 0; u < width; ++u)
		{
			Vector3f target(lookAt);
			float x = ((2.0f * (float)u) - (float)width) / (float)width * tanFovW;
			float y = ((2.0f * (float)v) - (float)height) / (float)height * tanFovH;

			target = Vector3f(x, y, zNear);
			target.normaliseSelf();
			Rayf ray(camPos, target);

			float result = sphere.intersect(ray);
			if(result < zFar)
			{
				pixels[index].r = 255;
				pixels[index].b = pixels[index].g = pixels[index].r;
			}

			++index;
		}
	}

	while(display.open())
	{
		display.update(pixels);
	}

	return 0;
}

void clear(vector<TrueColorPixel> &image, TrueColorPixel &color)
{
	vector<TrueColorPixel>::iterator it = image.begin();

	while(it != image.end())
	{
		*it = color;
		++it;
	}
}




but that gives me a solid black screen for output. To debug it, I've tried putting the x, y, and z directions of the eye rays into the pixel r, g, b values (screenshots of the ray X, Y, Z, and combined XYZ directions were attached via ImageShack). So what's wrong there, exactly? I dunno about you, but that doesn't look right to me :P I haven't discounted the fact that I may also have something wrong with my ray-sphere intersection code, so here's that as well:
float RTSphere::intersect(const Rayf &ray)
{
	// Solves |ray.mPos + t * ray.mDir - mPos|^2 = mRadius^2 for t,
	// assuming ray.mDir is unit length.
	Vector3f delta = ray.mPos - mPos;                // sphere centre to ray origin
	float A = delta.dot(ray.mDir);                   // half the linear coefficient
	float B = delta.dot(delta) - mRadius * mRadius;  // constant term
	float C = A * A - B;                             // discriminant
	// Nearest hit distance along the ray, or +inf if the ray misses.
	return C > 0 ? (-A - sqrt(C)) : std::numeric_limits<float>::infinity();
}
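(For reference, that's intended to solve |O + t*D - C|^2 = r^2 for t with the ray direction D normalised; writing delta = O - C it expands to t^2 + 2*(delta . D)*t + (delta . delta - r^2) = 0, so the nearest root is t = -A - sqrt(A^2 - B) with A and B as in the code, which is what the return line gives when the discriminant A^2 - B is positive.)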




Any ideas?
Your eye ray calculations are fine - I think you're just forgetting to remap the X/Y/Z direction values from [-1, 1] into the displayable [0, 255] range for the pixel values (i.e. doing static_cast<int>(x * 255) instead of something like static_cast<int>((x * 0.5f + 0.5f) * 255)).
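A minimal sketch of that remap, reusing the pixels/target/index names from the listing above (the toByte helper name is just for illustration):

// Hypothetical helper: map a direction component from [-1, 1] to a displayable byte.
inline unsigned char toByte(float c)
{
	float v = (c * 0.5f + 0.5f) * 255.0f;   // remap [-1, 1] -> [0, 255]
	if (v < 0.0f)   v = 0.0f;               // clamp, in case the input isn't normalised
	if (v > 255.0f) v = 255.0f;
	return static_cast<unsigned char>(v);
}

// Debug view, inside the pixel loop:
pixels[index].r = toByte(target.x);
pixels[index].g = toByte(target.y);
pixels[index].b = toByte(target.z);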

However your RTSphere::intersect test returns the distance along the input ray at which it intersects the sphere (or +inf), which will be positive here (e.g. 4.5 for a ray going directly towards the sphere's centre). But you then test whether those intersection distances are less than zFar = -1000, which will never be true.
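For example, a minimal sketch of a corrected test, assuming intersect keeps returning the non-negative hit distance (or +inf on a miss) and using a positive maxDistance in place of the negative zFar:

const float maxDistance = 1000.0f;   // illustrative far limit along the ray

float result = sphere.intersect(ray);
if (result > 0.0f && result < maxDistance)   // hit somewhere in front of the camera
{
	pixels[index].r = pixels[index].g = pixels[index].b = 255;
}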
*facepalm*

Thanks heaps for that :)

