bladerunner627

Ray-tracing - Primitive Shape Distortions


I started writing a simple ray-tracer on Friday night but I've hit a snag.

My short-term goal was to create a 100% reflective sphere more or less in the center of the scene and a "full bright" opaque colored plane below it to demonstrate the sphere reflecting the rays.

For my test case at the moment I am only rendering an opaque sphere.

Unfortunately, whenever I change the sphere's origin in the scene, instead of simply translating to that position it seems to stretch along whatever axis I'm moving it on.

At first I wondered whether my ray directions weren't being generated correctly, but the code looks right to me.

void generateRays()
{
    // Generate one ray per pixel.
    for(uint32 i(0); i < width; ++i)
    {
        for(uint32 j(0); j < height; ++j)
        {
            Vec4<float> p((float)i, (float)j, 0.0f);
            Vec4<float> d = p - origin;

            d.Normalize();

            pRays[i][j].setDone(false);
            pRays[i][j].setOrigin(origin);
            pRays[i][j].setDirection(d);
        }
    }
}





My next thought was maybe it's the sphere-ray intersection... again, I pored over this for several hours but nothing sticks out.



bool testRay(jb_ray* ray)
{
    Vec4<float> dir = ray->getDirection();
    Vec4<float> org = ray->getOrigin();

    // Quadratic equation coefficients.
    float a = dir * dir;
    dir *= 2;
    Vec4<float> omo = org - origin;

    float b = dir * omo;
    float c = omo*omo - radius*radius;

    float quad = b*b - 4*a*c;

    float time0 = -1.0f;
    float time1 = -1.0f;

    if(quad > 0)
    {
        time0 = (-b + sqrt(quad)) / (2*a);
        time1 = (-b - sqrt(quad)) / (2*a);
    }
    else if(quad == 0)
    {
        time0 = -b / (2*a);
    }
    else
    {
        return false;
    }

    Vec4<float> t0Origin
    (
        (ray->getOrigin().x + ray->getDirection().x) * time0,
        (ray->getOrigin().y + ray->getDirection().y) * time0,
        (ray->getOrigin().z + ray->getDirection().z) * time0
    );

    ray->setOrigin(t0Origin);

    // Calculate the surface normal at the hit point.
    Vec4<float> t0Normal(t0Origin);
    t0Normal -= origin;

    float invRadius = 1.0f/radius;
    t0Normal *= invRadius;

    t0Normal.Normalize();
    ray->setDirection(t0Normal);

    ray->setColor(diffuse);
    ray->incReflections();

    // Let's ignore secondary intersections for now...
    // In this case that would be the point where the ray leaves the sphere.

    /*
    Vec4<float> t1Normal
    (
        ray->getOrigin().x + ray->getDirection().x,
        ray->getOrigin().y + ray->getDirection().y,
        ray->getOrigin().z + ray->getDirection().z
    );

    t1Normal *= t1;
    t1Normal - origin;
    t1Normal *= iRad;
    t1Normal.Normalize();
    */

    return true;
}





My render routine is pretty straightforward (though inefficient): generate the rays, trace each one by testing it against every object in the scene, and set each pixel to the ray's resulting color.


void traceRays()
{
    for(uint32 i(0); i < width; ++i)
    {
        for(uint32 j(0); j < height; ++j)
        {
            jb_ray* ray = &pRays[i][j];

            bool intersected = true;

            while(intersected)
            {
                if(ray->isDone())
                    break;

                if(ray->numReflections() > MAX_RAY_REFLECTIONS)
                    break;

                if(shapes.size() == 0)
                    break;

                for(uint32 k(0); k < shapes.size(); ++k)
                {
                    intersected = shapes[k]->testRay(ray);

                    if(intersected && shapes[k]->isOpaque())
                        break;
                }
            }
        }
    }
}

void Render()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    // glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
    // We'll just ignore resizing for now...
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    rayGen.generateRays();
    rayGen.traceRays();

    for(uint32 i(0); i < SCREEN_WIDTH; ++i)
    {
        for(uint32 j(0); j < SCREEN_HEIGHT; ++j)
        {
            jb_ray* ray = rayGen.getRay(i, j);

            pixels[j*SCREEN_WIDTH + i] = ray->getColor();
        }
    }

    glDrawPixels(SCREEN_WIDTH, SCREEN_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    glutSwapBuffers();
}





Any help would be greatly appreciated :) Thank you!

The fact that you're doing your ray/sphere intersection math with vec4s is troubling. Make sure that nothing with a nonzero w creeps in, or do it with vec3s.
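To illustrate the pitfall: under the usual homogeneous convention, points carry w = 1 and directions w = 0. A minimal sketch (hypothetical Vec4f type, not the poster's Vec4) of how a stray w inflates the quadratic's coefficients:

```cpp
// Hypothetical illustration of the w pitfall: points carry w = 1,
// directions w = 0. Subtracting two points correctly yields w = 0,
// but if a vector with w = 1 ends up in a 4-component dot product,
// a, b, and c in the quadratic all pick up spurious +1 terms.
struct Vec4f { float x, y, z, w; };

float dot4(Vec4f a, Vec4f b) { return a.x*b.x + a.y*b.y + a.z*b.z + a.w*b.w; }

Vec4f sub4(Vec4f a, Vec4f b) { return {a.x-b.x, a.y-b.y, a.z-b.z, a.w-b.w}; }
```

If the poster's `operator*` on Vec4 is a 4-component dot, any point that slips in with w = 1 quietly corrupts `a`, `b`, and `c` above.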

Are you sure that the stretching you're seeing isn't simply a result of perspective projection? (This is extremely noticeable with spheres.)

An image would also be helpful.

Here's a before and after.

The first is before translation, the second is the sphere translated two units to the right. The third one is down one unit, left one unit.

[images: ray_sphere x3]

Looks like some really funky FOV.

[Edited by - bladerunner627 on December 5, 2010 8:50:31 PM]

Wow, that's some serious stretching!

That could easily be the result of some obscenely high FOV; however, it's unclear from the code you've provided.

The field of view is defined by the distance between the image plane and your camera origin. You may find it easier to specify the FOV directly in your calculations.

Your camera position should be constant for all rays if you're aiming to simulate a pinhole camera. You also need to look at the directions of your rays - they're spread far too widely.
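To make the pinhole model concrete, here's a minimal sketch (hypothetical names; assumes a camera at the origin looking down +z with the image plane planeDist units away). Every ray shares the one camera origin and only the direction varies per pixel:

```cpp
#include <cmath>

// Hypothetical sketch of pinhole ray generation: camera at the origin
// looking down +z, image plane at distance planeDist. All rays share
// the camera origin; only the direction changes per pixel.
struct Vec3 { float x, y, z; };

Vec3 pixelRayDir(float i, float j, float width, float height, float planeDist)
{
    // Center the pixel grid on the optical axis, then normalize.
    Vec3 d{ i - width * 0.5f, j - height * 0.5f, planeDist };
    float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
    return Vec3{ d.x / len, d.y / len, d.z / len };
}
```

Note how planeDist directly controls the FOV: a larger distance narrows the cone of directions, a smaller one widens it.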

Quote (Original post by _Sauce_): The field of view is defined by the distance between the image plane and your camera origin. You may find it easier to specify the FOV directly in your calculations.


Oh..oh...doooh! Well that would probably be it then. How would I calculate the appropriate distance for a given FOV?

I forgot to mention, the dimensions of the image plane are also involved in determining the FOV, which means you can either move the camera origin further away from the image plane, or reduce the size of the image plane. I don't have access to my code at the moment, so I can't tell you exactly how to determine the values to use for a particular FOV, but I can tell you that it's just a matter of some simple trig.

At the moment your image plane is the same size as your output image - instead you can multiply the x and y coords of the ray's direction component by some small factor (< 1), or set the camera origin a long distance behind the image plane.
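The simple trig alluded to above can be sketched like this (hypothetical helper; assumes fovRadians is the full horizontal field of view): since tan(fov/2) = (width/2) / dist, the plane distance follows directly.

```cpp
#include <cmath>

// A sketch of the trig mentioned above (hypothetical helper):
// for a horizontal field of view fovRadians and an image plane of
// the given width, tan(fov/2) = (width/2) / dist, so the camera
// origin must sit this far behind the image plane.
float planeDistanceForFov(float fovRadians, float width)
{
    return (width * 0.5f) / std::tan(fovRadians * 0.5f);
}
```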

[Edited by - _Sauce_ on December 5, 2010 10:56:58 PM]

Thank you :)

I'd better read up in one of my computer graphics books; I believe the math is in there.

I'll let you know how it works out, though I won't have anything for a few weeks due to finals. Thanks again!

These lines don't look right to me:

Vec4<float> p((float)i, (float)j, 0.0f);
Vec4<float> d = p - origin;

It seems like no matter where you put the camera, the rays get traced toward the point (i, j, 0) in world space. That can't be right, and it would explain why your directions are coming out wrong.

I think what you really want is something like

d = (2*i / width - 1) * right + (2*j/height - 1) *(width/height)* up + (1 / tan(fov)) * forward;

where (right,up,forward) are orthogonal unit vectors describing the camera orientation, and fov is the field of view.
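That formula can be sketched in code like this (hypothetical Vec3 and helpers; (right, up, forward) are assumed orthonormal, and fov is treated as the half-angle, matching the 1/tan(fov) term):

```cpp
#include <cmath>

// A sketch of the direction formula above. Vec3 and its helpers are
// hypothetical; (right, up, forward) are assumed to be orthonormal
// unit vectors, and fov is the half-angle in radians.
struct Vec3 { float x, y, z; };

static Vec3 scale(Vec3 v, float s) { return {v.x*s, v.y*s, v.z*s}; }
static Vec3 add(Vec3 a, Vec3 b)    { return {a.x+b.x, a.y+b.y, a.z+b.z}; }

Vec3 rayDirection(float i, float j, float width, float height, float fov,
                  Vec3 right, Vec3 up, Vec3 forward)
{
    float u = 2.0f * i / width - 1.0f;                        // -1..1 across the image
    float v = (2.0f * j / height - 1.0f) * (width / height);  // aspect-corrected
    Vec3 d = add(add(scale(right, u), scale(up, v)),
                 scale(forward, 1.0f / std::tan(fov)));
    return d; // normalize before tracing
}
```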

