Ray Tracer

Started by
135 comments, last by Tipotas688 14 years ago
If I do:

BYTE r = GetRValue(tempSphere->color) * DIFFUSE_COEF * lambert;

where I multiply everything, as I should have in the first place, I get this:


[screenshot hosted on Photobucket]
Sorry, I don't have time to go through this with you; maybe someone else will help.
Just a shot in the dark, but when I read your code I wondered about this line:

D3DXVECTOR3 pointOfIntersection = ray.origin + ray.destination*minimumIntersectDistance;

Now I don't know how your Ray class is designed but should it not be something like this instead:

D3DXVECTOR3 pointOfIntersection = ray.origin + ray.m_direction * minimumIntersectDistance;

I assume that minimumIntersectDistance is the number of ray-lengths you have to travel down the ray to intersect the nearest object. So I think you get the wrong point in space to start your color calculations from.
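To illustrate the point about evaluating the ray at the hit distance, here is a minimal sketch. Note that Vec3 is a stand-in for D3DXVECTOR3 (which needs the D3DX headers), and the names are illustrative, not taken from the poster's code:

```cpp
#include <cassert>

// Stand-in for D3DXVECTOR3, just so the snippet is self-contained.
struct Vec3 {
    float x, y, z;
};

Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// A point along a ray is origin + direction * t, where t is the distance
// traveled (assuming direction is normalized). Using a "destination" point
// here instead of a direction would land somewhere else entirely.
Vec3 pointAt(Vec3 origin, Vec3 direction, float t) {
    return origin + direction * t;
}
```

With origin (0,0,0), direction (0,0,1) and t = 2, this yields (0,0,2), i.e. two units down the ray.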
Yup, that's possible. Since I don't speak English very well, I used "destination" when I meant "direction".
My ray is a struct with just an origin, a direction, and the nearest intersection, named like this:

struct Ray
{
    D3DXVECTOR3 origin;
    D3DXVECTOR3 destination;
    float nearestIntersect;
};

And tbh, float nearestIntersect; is redundant, so I could just take it out.
OK, so you use Ray.destination as the direction of the ray. I would suggest that you rename it to remove the confusion. :-)
done!
But I still don't get what's wrong with my code :'(
Hi, first of all, I highly suggest you use normalized color values (0-1). This not only simplifies the whole pipeline, but also lets you represent values brighter than white (bright lights, or, why not, HDR rendering).
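To sketch what that suggestion looks like in practice (hypothetical helper names, not from the poster's code): work in normalized floats internally and convert to 8-bit only at output time, clamping so anything brighter than white saturates instead of wrapping around.

```cpp
#include <algorithm>
#include <cassert>

// Convert an 8-bit channel (0-255) to a normalized float (0-1).
float toLinear(unsigned char c) {
    return c / 255.0f;
}

// Convert back for display: clamp to [0,1] first, so values above 1
// (bright lights, accumulated contributions) saturate to white rather
// than overflowing the byte.
unsigned char toByte(float v) {
    float clamped = std::min(std::max(v, 0.0f), 1.0f);
    return static_cast<unsigned char>(clamped * 255.0f + 0.5f);
}
```

All the shading math (diffuse coefficient, lambert term, and so on) then happens on the floats, where multiplication can never wrap around a byte.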

I'm not sure I understand these lines:
Quote:
BYTE r = GetRValue(tempSphere->color)+DIFFUSE_COEF * lambert;
BYTE g = GetGValue(tempSphere->color)+DIFFUSE_COEF * lambert;
BYTE b = GetBValue(tempSphere->color)+DIFFUSE_COEF * lambert;

tempSphere->color = RGB(r, g, b);


Try with:
BYTE r = GetRValue(tempSphere->color) * DIFFUSE_COEF * lambert;
BYTE g = GetGValue(tempSphere->color) * DIFFUSE_COEF * lambert;
BYTE b = GetBValue(tempSphere->color) * DIFFUSE_COEF * lambert;
Are you sure you want to store the final RGB back in the object's color? Wouldn't that overwrite the color and then screw the whole thing to hell?

Why can't you just return an RGB value and display that? What's the point of storing it back in the temp sphere? (I don't do C++, but maybe that change will affect the object, so everything will be f.cked up.)
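A minimal sketch of that suggestion, with hypothetical names (the bit layout matches GDI's COLORREF, 0x00BBGGRR, so the snippet stands alone without windows.h): shade returns the lit color and leaves the sphere's base color untouched for the next ray.

```cpp
#include <cassert>

// Hypothetical minimal sphere; illustrative, not the poster's actual type.
struct Sphere {
    unsigned long color;  // packed 0x00BBGGRR, like a COLORREF
};

const float DIFFUSE_COEF = 1.0f;  // placeholder value for the sketch

// Return the shaded color instead of writing it back into the sphere,
// so the sphere's base color survives for subsequent rays.
unsigned long shade(const Sphere& s, float lambert) {
    unsigned char r = static_cast<unsigned char>((s.color & 0xFF) * DIFFUSE_COEF * lambert);
    unsigned char g = static_cast<unsigned char>(((s.color >> 8) & 0xFF) * DIFFUSE_COEF * lambert);
    unsigned char b = static_cast<unsigned char>(((s.color >> 16) & 0xFF) * DIFFUSE_COEF * lambert);
    return r | (g << 8) | (b << 16);
}
```

Shading a white sphere at lambert = 0.5 yields mid-grey, and the sphere's stored color is unchanged afterwards.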
Quote:
Hi, first of all, I highly suggest you use normalized color values (0-1). This not only simplifies the whole pipeline, but also lets you represent values brighter than white (bright lights, or, why not, HDR rendering).


I am not sure how to draw on a per-pixel basis, so I am using SetPixel, which takes a COLORREF value; it's not even DirectX.
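For what it's worth, SetPixel doesn't prevent working in normalized floats: you can keep floats throughout the tracer and pack them into a COLORREF only at the very end. A sketch (the packing below reproduces what the RGB() macro in windows.h does, so the snippet stands alone):

```cpp
#include <algorithm>
#include <cassert>

// Pack normalized float channels into a COLORREF-style 0x00BBGGRR value.
// Clamping first means over-bright values saturate to white.
unsigned long packColor(float r, float g, float b) {
    auto to8 = [](float v) -> unsigned long {
        float clamped = std::min(std::max(v, 0.0f), 1.0f);
        return static_cast<unsigned long>(clamped * 255.0f + 0.5f);
    };
    return to8(r) | (to8(g) << 8) | (to8(b) << 16);
}

// With GDI, the final draw would then be something like:
//   SetPixel(hdc, x, y, packColor(r, g, b));
```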

Quote:
cignox1
BYTE r = GetRValue(tempSphere->color) * DIFFUSE_COEF * lambert;
BYTE g = GetGValue(tempSphere->color) * DIFFUSE_COEF * lambert;
BYTE b = GetBValue(tempSphere->color) * DIFFUSE_COEF * lambert;

The last picture you see:
[screenshot hosted on Photobucket]

is already written like your quote; if you check a few posts above, I changed it. Sorry for the confusion.

Quote:
Are you sure you want to store the final rgb in the objects rgb? Wouldn't that rewrite the color, then screw the whole thing to hell?

I am creating a temporary sphere object that holds the data I need; I do the calculations and then return just its color, but I need the object that is nearest to the camera for the ray tracing.
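The nearest-object search described above can be sketched without mutating any sphere at all; a hypothetical version (names and the flat distance list are illustrative assumptions, not the poster's code) just tracks the index of the smallest positive hit distance:

```cpp
#include <cassert>
#include <limits>
#include <vector>

// Result of a nearest-hit query.
struct Hit {
    int sphereIndex;  // -1 means the ray missed everything
    float t;          // distance along the ray to the hit
};

// Given one intersection distance per sphere (negative = miss),
// pick the closest hit in front of the camera.
Hit nearestHit(const std::vector<float>& hitDistances) {
    Hit best{-1, std::numeric_limits<float>::max()};
    for (int i = 0; i < static_cast<int>(hitDistances.size()); ++i) {
        float t = hitDistances[i];
        if (t > 0.0f && t < best.t) {  // ignore misses and behind-camera hits
            best = {i, t};
        }
    }
    return best;
}
```

The winning index is then used to look up the real sphere for shading, so no temporary copy of the object is needed.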

This topic is closed to new replies.
