

Arjan B

Member Since 04 Nov 2007
Offline Last Active Today, 09:19 AM
-----

Topics I've Started

Path tracing - Direct lighting

09 January 2014 - 03:51 PM

In many explanations of path tracers, such as http://www.thepolygoners.com/tutorials/GIIntro/GIIntro.htm, direct lighting is applied at each intersection point of the path. So at every intersection point, a ray is traced towards a light, and a different direction is sampled to continue the path.

 

Suppose I send out a ray that bounces three times and then hits the one light in the scene. Adding direct lighting at every bounce would then be equivalent to not doing so, but instead tracing three paths, each ending at the light: the first path goes from the first intersection point straight to the light; the second goes from the first to the second intersection point and then to the light; the third goes from the first to the second to the third intersection point and then to the light.

 

Is my understanding correct about this direct lighting approach? Does it sum up to the same end result?

 

If so, doesn't this lead to many more "samples" hitting a light than normally would, resulting in incorrect lighting?
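The equivalence can be checked numerically with a toy model (all constants below are made up): from every intersection point, the sampled continuation direction hits the single light with probability q and picks up emission e; otherwise the path continues with albedo a, for at most D bounces. In the usual formulation of this direct-lighting scheme (often called next event estimation), emission found by the continuation ray is not added again, which is what prevents the double counting asked about above.

```cpp
#include <random>

const double q = 0.3;  // probability a sampled direction hits the light
const double e = 1.0;  // light emission
const double a = 0.8;  // surface albedo
const int    D = 4;    // maximum path length

// Pure path tracing: emission is scored only when the continuation ray
// happens to hit the light.
double PureSample(std::mt19937& mt) {
    std::uniform_real_distribution<double> urd(0.0, 1.0);
    double weight = 1.0;
    for (int k = 0; k < D; ++k) {
        if (urd(mt) < q) return weight * e;  // hit the light
        weight *= a;                         // bounce off a surface
    }
    return 0.0;
}

// Direct-light sampling at every vertex: add the direct term q*e at each
// bounce, and ignore emission when the continuation ray hits the light,
// otherwise it would be counted twice.
double DirectSample(std::mt19937& mt) {
    std::uniform_real_distribution<double> urd(0.0, 1.0);
    double weight = 1.0, total = 0.0;
    for (int k = 0; k < D; ++k) {
        total += weight * q * e;       // shadow ray toward the light
        if (urd(mt) < q) return total; // continuation hit the light: add NO emission
        weight *= a;
    }
    return total;
}
```

Both estimators converge to q·e·(1 + r + r² + r³) with r = (1−q)·a, so they sum to the same end result; the direct-lighting version just has much lower variance.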


Path tracing - Incorrect lighting

03 January 2014 - 06:49 AM

I have started implementing a simple path tracer, but I have run into a problem: most of the spheres I place in the scene end up not being lit at all.

 

Scene setup

First, let me show the setup of the scene:

[Image: scene setup — a central light-emitting sphere surrounded by four smaller spheres A–D]

 

I have a sphere in the middle with a radius of 4 that emits light. Around it are four spheres with a radius of 1.

 

Code

A path cannot bounce off a surface more than 5 times. A path also terminates if a ray does not intersect any object in the world. Finally, a path can terminate by returning the emitted color of the material that was just hit; this happens on 20% of the hits.

Color TraceRay(const Ray& ray, unsigned depth) {
	const unsigned maxDepth = 5;
	if (depth > maxDepth)
		return Color(0.f, 0.f, 0.f);

	// Terminate if the ray leaves the scene.
	float t;
	Shape* shape = NULL;
	if (!world.Intersect(ray, t, &shape))
		return Color(0.f, 0.f, 0.f);

	Point p = ray(t);
	Normal n = shape->GetNormal(p);

	// With probability pEmit, terminate by returning the emitted color,
	// weighted by 1/pEmit to keep the estimator unbiased.
	const float pEmit = 0.2f;
	if (urd(mt) < pEmit) {
		return shape->emittance * (1.f / pEmit);
	}
	else {
		// Otherwise continue the path in a random direction in the
		// hemisphere around the normal, weighted by the cosine term
		// and by 1/(1 - pEmit) for the surviving branch.
		Vector newDir = RandomDirection(n);
		Ray newRay(p, newDir, 0.001f);
		return TraceRay(newRay, depth+1) * Dot(n, newDir) * (1.f / (1.f - pEmit));
	}
}

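To see why the 1/pEmit and 1/(1 − pEmit) factors keep this termination scheme unbiased, here is a toy version of it (all constants are made up; the albedo a stands in for the BRDF and cosine factor, and there is no depth cap): in a scene where every bounce sees emission e and albedo a, the radiance satisfies L = e + a·L, so L = e / (1 − a), and the estimator below converges to exactly that.

```cpp
#include <random>

// Russian-roulette style estimator mirroring TraceRay: with probability
// pEmit, terminate and return the emission divided by pEmit; otherwise
// keep going and divide the throughput by (1 - pEmit).
double RouletteSample(std::mt19937& mt) {
    std::uniform_real_distribution<double> urd(0.0, 1.0);
    const double pEmit = 0.2, e = 1.0, a = 0.5;
    double weight = 1.0;
    for (;;) {
        if (urd(mt) < pEmit) return weight * e / pEmit; // terminate on "emit"
        weight *= a / (1.0 - pEmit);                    // continue the path
    }
}
```

Averaging many samples gives e / (1 − a) = 2 for these constants, so the weighting itself is not the source of the dark image.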

 

When bouncing off a surface, a random new direction in the same hemisphere as the surface normal must be generated. I generate three random floats, which form a vector v. I normalize this vector and check whether it lies in the same hemisphere as the surface normal. If so, I return v; if not, I flip v and return that.

Vector RandomDirection(const Normal& n) {
	Vector v(urd(mt), urd(mt), urd(mt));
	Normalize(v);
	return Dot(v, n) < 0.f ? -v : v;
}

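One subtle point: if urd produces numbers in [0, 1) — the default range of std::uniform_real_distribution — then all three components of v are non-negative, so only one octant of directions is ever generated (and flipping only covers its mirror octant, not the full hemisphere). Normalizing a point drawn uniformly from a cube is also not uniform over directions. A common fix is rejection sampling from the unit ball; a sketch, assuming a generator uniform on [-1, 1] (Vec3 stands in for the post's Vector/Normal types):

```cpp
#include <cmath>
#include <random>

struct Vec3 { double x, y, z; };

double Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Uniformly distributed direction in the hemisphere around n: draw points
// uniformly from the cube, reject those outside the unit ball, normalize,
// and flip into n's hemisphere.
Vec3 RandomDirection(const Vec3& n, std::mt19937& mt) {
    std::uniform_real_distribution<double> urd(-1.0, 1.0); // note: [-1, 1]
    for (;;) {
        Vec3 v{urd(mt), urd(mt), urd(mt)};
        double len2 = Dot(v, v);
        if (len2 > 1.0 || len2 < 1e-12) continue; // outside the ball: reject
        double invLen = 1.0 / std::sqrt(len2);
        v = {v.x * invLen, v.y * invLen, v.z * invLen};
        return Dot(v, n) < 0.0 ? Vec3{-v.x, -v.y, -v.z} : v;
    }
}
```

Rejection keeps the distribution uniform over the sphere before the flip, which a direct normalize-the-cube-sample approach does not.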

 

After every pixel has been sampled, I present the results so far. The function below is called 500 times to take 500 samples per pixel. All sampled colors are summed and divided by the number of samples for the final resulting color.

void TraceRays(unsigned maxIterations, sf::Texture& texture) {
	for (unsigned x = 0; x < camera.film.GetWidth(); x++) {
		for (unsigned y = 0; y < camera.film.GetHeight(); y++) {
			Ray ray = camera.GetRay(x, y);
			Color c = camera.film.GetPixel(x, y);
			Color l = TraceRay(ray, 0);
			camera.film.SetPixel(x, y, l + c);
		}
	}

	ClearImage();
	for (unsigned x = 0; x < camera.film.GetWidth(); x++) {
		for (unsigned y = 0; y < camera.film.GetHeight(); y++) {
			Color c = camera.film.GetPixel(x,y);
			c /= maxIterations;
			image.setPixel(x, y, c.ToSFMLColor());
		}
	}
	texture.update(image);
}


 

Results

The light emitting sphere is clearly visible. You can also see sphere D being slightly lit in the lower right corner.

 

However, none of the other spheres are being lit. I would expect at least a few of the paths that bounce on spheres A and B to bounce in the direction of the light emitting sphere, leading to those pixels being brightened.

 

[Image: rendered result — only the emitting sphere and part of sphere D are visible]

 

 

Questions

I'm having a hard time debugging this pixel by pixel. I'm hoping someone here can make an educated guess about what I'm doing wrong, either from the resulting image or by browsing through the code above.

 

Any help would be greatly appreciated!


FPS Network Architecture

23 August 2011 - 02:34 PM

Hey forum!

I've been wanting to create a multiplayer FPS.
For the network part I've come up with the following:


  • There will be one central server, acting as a lobby. All clients can connect to this server and request a list of games being hosted. A client can then pick one of those games and request the information needed to actually join it.
  • Every game is hosted by one player, the host. The host acts as the central point connecting all the clients in the game. The host's game state is the one true game state, i.e. when one of the clients disagrees with the host's game state, that's too bad, because the host is the boss around here.
  • Whenever a client provides input for the game (such as clicking a mouse button, moving the mouse, or pressing a key), this input is sent to the server, which in turn sends it to all other clients.
  • To provide some illusion of smooth motion, game states are updated locally according to the locally known input states. But there is always some delay between the actual input and the arrival of the message announcing it. That's why these messages should carry some sort of timestamp, so that a received input update can be handled as if it really happened at that point in time. To make this work, we need to keep a buffer of game states, so we can apply the update to a previous game state and recalculate the current one.
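The rewind-and-recalculate idea above can be sketched with a toy, single-value game state (all names and the velocity model here are made up for illustration): a snapshot is kept per tick, and a late-arriving input stamped in the past discards the snapshots it invalidates and resimulates forward.

```cpp
#include <cstdint>
#include <vector>

// An input changes the velocity from its tick onward.
struct Input { std::uint32_t tick; int dVelocity; };

class StateHistory {
public:
    // snapshots_[t] is the position at the start of tick t.
    StateHistory() : snapshots_{0} {}

    std::uint32_t CurrentTick() const {
        return static_cast<std::uint32_t>(snapshots_.size() - 1);
    }

    // Simulate one tick forward using all inputs received so far.
    void StepForward() {
        int vel = 0;
        for (const Input& in : inputs_)
            if (in.tick <= CurrentTick()) vel += in.dVelocity;
        snapshots_.push_back(snapshots_.back() + vel);
    }

    // Handle an input that may be timestamped in the past: rewind to its
    // tick, then recompute every later snapshot with the input included.
    void ApplyInput(const Input& in) {
        inputs_.push_back(in);
        std::uint32_t now = CurrentTick();
        snapshots_.resize(in.tick + 1);        // discard invalidated snapshots
        while (CurrentTick() < now) StepForward();
    }

    int PositionAt(std::uint32_t tick) const { return snapshots_.at(tick); }

private:
    std::vector<int> snapshots_;
    std::vector<Input> inputs_;
};
```

An input for tick 2 arriving at tick 5 thus produces the same state as if it had been applied on time, which is the behavior the timestamps are meant to guarantee.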

But now I'm still having some questions:

  • What sort of timestamp should I use? A simple counter that starts at 0 and increases at, for example, 60 Hz? How do I synchronize this counter across all clients?
  • Would it be okay to handle some collisions server-side only, such as a bullet hitting a player or a player picking up a health pack? The server sends a message that it happened and voilà, it happened everywhere. Otherwise I'd have to implement some kind of query ("Did I pick up this health pack?"), which seems the inferior choice.
  • How long should I keep old game states? Discard everything before the oldest update from any of the players? But what if somebody in the game is AFK for 15 minutes; that would make everyone's buffers grow pretty big. Maybe send an obligatory status update every 2 seconds or so?
Thank you in advance,
Arjan

OpenGL Shaders and Depth

12 July 2011 - 07:17 AM

Hi forum!

I've been trying to get the hang of using shaders and have been doing this with OpenGL.
However, things aren't looking the way I want them to ^^.

[Image: rendered teapot with severe depth/lighting artifacts]

It seems like something is going wrong when calculating the depth of each to-be-drawn pixel.
All I'm trying to do so far is draw everything in red with diffuse lighting.

Vertex shader:
varying vec3 normal;
varying vec4 pos;

void main() {
	normal = normalize(gl_NormalMatrix * gl_Normal);
	pos = gl_ModelViewMatrix * gl_Vertex;
	gl_Position = ftransform();
}

Fragment shader:
varying vec3 normal;
varying vec4 pos;

void main() {
	vec4 color = vec4(1.0, 0.0, 0.0, 1.0);
	vec4 lightPos = gl_LightSource[0].position;
	vec4 lightVec = normalize(lightPos - pos);

	vec4 diffuse = color * max(0.0, dot(normal, lightVec.xyz));

	gl_FragColor = diffuse;
}

Set up code:
int main(int argc, char **argv) {
	// initialize glut
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
	glutInitWindowSize(400, 400);
	glutCreateWindow("GLSL Testing");
	glClearColor(0, 0, 0, 1);
	glEnable(GL_CULL_FACE);
	glEnable(GL_DEPTH_TEST);

	glutDisplayFunc(DisplayFunction);
	glutReshapeFunc(ReshapeFunction);
	glutIdleFunc(IdleFunction);
	glutKeyboardFunc(KeyboardFunction);

	// initialize glew
	glewInit();
	
	// set up shaders
	SetShaders();

	glutMainLoop();

	CleanUp();

	return 0;
}

Rendering function:
void DisplayFunction() {
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

	float lightPos[] = {-2, 3, 20, 1};
	glLightfv(GL_LIGHT0, GL_POSITION, lightPos);

	glTranslatef(0, 0, 5);
	glRotatef(30, 1, 1, 0);
	glutSolidTeapot(1);

	glLoadIdentity();
	gluLookAt(0, 0, 0, 0, 0, 1, 0, 1, 0);

	glutSwapBuffers();
}

When I comment out glEnable(GL_CULL_FACE), I get nothing but a black screen. Also, I thought setting the light position's z-value to -20 would light the side of the teapot we're looking at.
These two things together make me think the problem might lie in calculating the normals.

Does anybody see what causes the errors you can see in the image?

[XNA] Matrix Transformations: Pointing a gun at a crosshair

17 January 2010 - 10:20 AM

Hi there! I'm trying to make a game using XNA where a tank-like vehicle can be moved left and right, and automatically aims at the mouse cursor. I know I can calculate a direction vector like this:
Vector3 gunDirection = new Vector3(mousePosition, 0.0f) - tank.position;

A rotation around the Z-axis in radians can then be obtained like this (using Math.Atan2 rather than plain atan, so the quadrant comes out right when X is negative):
float rotation = (float)Math.Atan2(gunDirection.Y, gunDirection.X);

I run into problems when I want to apply a rotation to the "Gun" mesh in the tank's model. The transformation matrix of the mesh can be retrieved this way:
Matrix gunTransform = tank.model.Bones["Gun"].Transform;

How do I alter the gun's transformation matrix so that it keeps its translation and scale, but has the new rotation? Thank you in advance, Arjan B
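For illustration, the decomposition being asked about can be sketched in C++ with a minimal 2-D homogeneous matrix in XNA's row-vector convention (translation in the bottom row; all names here are hypothetical): the uniform scale is the length of a basis row, the translation is the bottom row, and the transform is rebuilt with the new rotation.

```cpp
#include <array>
#include <cmath>

// Row-major 3x3 homogeneous 2-D transform, row-vector convention
// (a stand-in for the relevant part of an XNA Matrix).
using Mat3 = std::array<std::array<double, 3>, 3>;

// Build scale * rotation * translation for a rotation about Z.
Mat3 Compose(double scale, double rotation, double tx, double ty) {
    double c = std::cos(rotation), s = std::sin(rotation);
    return {{{  scale * c, scale * s, 0.0 },
             { -scale * s, scale * c, 0.0 },
             {  tx,        ty,        1.0 }}};
}

// Keep the translation and (uniform) scale of m, but replace its rotation.
Mat3 WithRotation(const Mat3& m, double newRotation) {
    double scale = std::hypot(m[0][0], m[0][1]); // length of first basis row
    return Compose(scale, newRotation, m[2][0], m[2][1]);
}
```

In XNA itself this presumably corresponds to rebuilding the bone transform as something like Matrix.CreateScale(scale) * Matrix.CreateRotationZ(rotation) * Matrix.CreateTranslation(gunTransform.Translation).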
