
Arjan B

Member Since 04 Nov 2007
Last Active Apr 09 2016 10:32 AM

Topics I've Started

Uploading a 1D texture in OpenGL - All (0, 0, 0, 1)

25 November 2015 - 10:01 AM

Hi! I first create a lookup-table for a transfer function and then try to upload it as a 1D texture as follows:

// Fill a 1024-entry lookup table from the transfer function.
for (unsigned i = 0; i < 1024; i++)
  tfm0[i] = qfeGetProbability(tf0, (float)i/1023.f);

// (Re)create the texture object and upload the table on texture unit 17.
glActiveTexture(GL_TEXTURE17);
if (glIsTexture(tfmTex0))
  glDeleteTextures(1, &tfmTex0);
glGenTextures(1, &tfmTex0);
glBindTexture(GL_TEXTURE_1D, tfmTex0);
glTexImage1D(GL_TEXTURE_1D, 0, GL_R16F, 1024, 0, GL_RED, GL_FLOAT, tfm0);

Right before the rendering call I make sure all the textures I need are bound to the right texture units:

glActiveTexture(GL_TEXTURE17);
glBindTexture(GL_TEXTURE_1D, tfmTex0); 

Then, I set my uniform variable for the 1D texture:

glUniform1f(glGetUniformLocation(shaderProgram, "tf0"), 17);

And this is how the 1D texture is defined in the fragment shader, where sampleNorm is a value between 0 and 1:

uniform sampler1D tf0;
vec4 tfValue = texture1D(tf0, sampleNorm);

Somehow, all of the tfValues end up being (0, 0, 0, 1), which I suspect is a default fallback value.


To be sure that I uploaded the values to the graphics card correctly, I also have this check right before the draw call:

float values[1024];
glActiveTexture(GL_TEXTURE17);
// Read back the 1D texture bound on unit 17 to verify the upload.
glGetTexImage(GL_TEXTURE_1D, 0, GL_RED, GL_FLOAT, values);

This reads the texture I uploaded back into "normal" CPU memory, and the values are exactly the ones I expect.


Does anyone have an idea of where things might be going wrong? What would cause the sampler in the fragment shader to return (0, 0, 0, 1), when it should be returning my values in the R-channel?


Thank you in advance,

Arjan


SPH Fluid Simulation - Explodes

14 June 2014 - 03:37 AM

For a school project, I'm implementing an SPH fluid simulation based on the 2003 paper by Müller et al., "Particle-Based Fluid Simulation for Interactive Applications". I've implemented the calculation of density and pressure, the pressure forces, the viscosity forces and gravity. Now that I'm adding a bounding box, things start going wrong.
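For context, the density/pressure step follows the paper's poly6 kernel and its p = k(rho - rho0) equation of state, roughly like the sketch below. The types, constants and the brute-force neighbour loop are simplified stand-ins for illustration, not my actual code:

#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct Particle {
    Vec3  pos, vel, force;
    float density, pressure;
};

static float Dist2(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

const float PI      = 3.14159265f;
const float h       = 0.0457f;  // smoothing radius (example value)
const float mass    = 0.02f;    // particle mass (example value)
const float k       = 3.0f;     // gas stiffness (example value)
const float restRho = 998.f;    // rest density (example value)

void ComputeDensityPressure(std::vector<Particle>& ps) {
    const float poly6 = 315.f / (64.f * PI * std::pow(h, 9.f));
    for (Particle& pi : ps) {
        pi.density = 0.f;
        for (const Particle& pj : ps) {
            float r2 = Dist2(pi.pos, pj.pos);
            if (r2 <= h * h) {
                float d = h * h - r2;
                pi.density += mass * poly6 * d * d * d;  // W_poly6(r, h)
            }
        }
        pi.pressure = k * (pi.density - restRho);        // p = k(rho - rho0)
    }
}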


My response to a collision is to move the particle back to the contact point, reflect its velocity around the normal of the box at the contact point, and damp the magnitude a bit by some bounce factor.
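Reduced to a single axis-aligned floor plane for clarity, and reusing the Particle type from the sketch above, the response looks roughly like this (bounce is the damping factor):

void ResolveFloorCollision(Particle& p, float floorY, float bounce) {
    if (p.pos.y < floorY) {
        p.pos.y = floorY;                 // move back to the contact point
        if (p.vel.y < 0.f)
            p.vel.y = -p.vel.y * bounce;  // reflect around the normal (0,1,0) and damp
    }
}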


Now consider the following scenario. Two particles, p2 above p1, start out floating somewhere in the bounding box. They are too far apart for the pressure or viscosity forces to act on them, so gravity pulls them both down. p1 reaches the bottom of the bounding box, bounces a little and then settles on the bottom. p2 is still too far away for pressure/viscosity forces, and then, within a single timestep, p2 hits the bottom as well and is placed there. Now p1 and p2 are extremely close to each other, which makes the pressure force enormous and propels the particles away from each other at extreme speed.


What kind of solution would you suggest? Just decrease the timestep? Use penalty forces instead of projection?
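By penalty forces I mean something like the following, where the projection is replaced by a spring force proportional to penetration depth (the stiffness is a value I'd have to tune):

void ApplyFloorPenalty(Particle& p, float floorY, float stiffness) {
    float depth = floorY - p.pos.y;      // > 0 when below the floor
    if (depth > 0.f)
        p.force.y += stiffness * depth;  // push back along the floor normal
}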


Thanks in advance!


Path tracing - Direct lighting

09 January 2014 - 03:51 PM

Many explanations of path tracers, such as http://www.thepolygoners.com/tutorials/GIIntro/GIIntro.htm, apply direct lighting at each intersection point of the path. So at every intersection point, they trace a ray towards a light and sample a new direction to continue the path.
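As a sketch, this is the scheme as I understand it. All types and helper functions below are placeholders I made up for illustration, not the tutorial's code; only the structure (explicit light sampling at each bounce, plus a continuation ray) is the point:

struct Vec   { float x, y, z; };
struct Color {
    float r, g, b;
    Color operator*(float s)        const { return {r * s, g * s, b * s}; }
    Color operator*(const Color& c) const { return {r * c.r, g * c.g, b * c.b}; }
    Color operator+(const Color& c) const { return {r + c.r, g + c.g, b + c.b}; }
};
struct Hit { Vec p, n; };
struct Ray { Vec o, d; };
struct LightSample { Vec p, dir; Color emitted; float pdf; };

// Assumed to exist elsewhere in the renderer:
bool  Intersect(const Ray& ray, Hit& hit);
bool  Visible(const Vec& a, const Vec& b);
LightSample SampleLight(const Vec& from);
Color Brdf(const Hit& hit, const Vec& wi);
Vec   SampleHemisphere(const Vec& n);
float Dot(const Vec& a, const Vec& b);
float HemispherePdf();

const unsigned maxDepth = 5;

Color TracePath(const Ray& ray, unsigned depth) {
    Hit hit;
    if (depth > maxDepth || !Intersect(ray, hit))
        return {0.f, 0.f, 0.f};

    // Direct lighting: explicitly sample a point on a light at this bounce.
    Color direct = {0.f, 0.f, 0.f};
    LightSample ls = SampleLight(hit.p);
    if (Visible(hit.p, ls.p))
        direct = ls.emitted * Brdf(hit, ls.dir) * (Dot(hit.n, ls.dir) / ls.pdf);

    // Indirect lighting: continue the path in a sampled direction. If the
    // continuation ray happens to hit a light, its emission is typically
    // ignored, since that light's contribution was already added above.
    Vec d = SampleHemisphere(hit.n);
    Color indirect = TracePath({hit.p, d}, depth + 1)
                   * Brdf(hit, d) * (Dot(hit.n, d) / HemispherePdf());

    return direct + indirect;
}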


Suppose I send out a ray that bounces three times and then hits the one light in the scene. Adding direct lighting at every bounce would then be equivalent to not doing so, but instead tracing three paths that each end at the light: the first goes from the first intersection point to the light; the second goes from the first to the second intersection point and then to the light; the third goes from the first to the second to the third intersection point and then to the light.


Is my understanding correct about this direct lighting approach? Does it sum up to the same end result?


If so, doesn't this lead to many more "samples" hitting a light than would otherwise, leading to incorrect lighting?


Path tracing - Incorrect lighting

03 January 2014 - 06:49 AM

I have started implementing a simple path tracer. However, I have run into a problem: placing spheres into the scene leaves most of them not lit at all.


Scene setup

First, let me show the setup of the scene:

[Image: scene setup]

I have a sphere in the middle with a radius of 4 that emits light. Around it are four spheres with a radius of 1.


Code

A path terminates in one of three ways: a path cannot bounce off a surface more than 5 times; a ray that does not intersect any object in the world also ends the path; and finally, with a 20% chance per hit, the path terminates by returning the emitted color of the material that was just hit.

Color TraceRay(const Ray& ray, unsigned depth) {
	const unsigned maxDepth = 5;
	if (depth > maxDepth)
		return Color(0.f, 0.f, 0.f);

	// Find the nearest intersection; rays that escape the scene return black.
	float t;
	Shape* shape = NULL;
	if (!world.Intersect(ray, t, &shape))
		return Color(0.f, 0.f, 0.f);

	Point p = ray(t);
	Normal n = shape->GetNormal(p);

	// With probability pEmit, terminate and return the emitted color,
	// weighted by 1/pEmit; otherwise bounce in a random hemisphere
	// direction and continue, weighted by 1/(1 - pEmit).
	const float pEmit = 0.2f;
	if (urd(mt) < pEmit) {
		return shape->emittance * (1.f / pEmit);
	}
	else {
		Vector newDir = RandomDirection(n);
		Ray newRay(p, newDir, 0.001f);  // small offset to avoid self-intersection
		return TraceRay(newRay, depth+1) * Dot(n, newDir) * (1.f / (1.f - pEmit));
	}
}


When bouncing off a surface, a random new direction in the same hemisphere as the surface normal must be generated. I generate three random floats which form a vector v. I normalize this vector and check whether it is in the same hemisphere as the surface normal. If so, return v. If not, flip v and return it.

Vector RandomDirection(const Normal& n) {
	Vector v(urd(mt), urd(mt), urd(mt));
	Normalize(v);
	return Dot(v, n) < 0.f ? -v : v;
}


After every pixel has been sampled, I present the results so far. The function below is called 500 times to take 500 samples per pixel. All sampled colors are summed and then divided by the number of samples to produce the final color.

void TraceRays(unsigned maxIterations, sf::Texture& texture) {
	// Take one new sample per pixel and accumulate it on the film.
	for (unsigned x = 0; x < camera.film.GetWidth(); x++) {
		for (unsigned y = 0; y < camera.film.GetHeight(); y++) {
			Ray ray = camera.GetRay(x, y);
			Color c = camera.film.GetPixel(x, y);
			Color l = TraceRay(ray, 0);
			camera.film.SetPixel(x, y, l + c);
		}
	}

	// Average the accumulated samples and copy the result to the displayed image.
	ClearImage();
	for (unsigned x = 0; x < camera.film.GetWidth(); x++) {
		for (unsigned y = 0; y < camera.film.GetHeight(); y++) {
			Color c = camera.film.GetPixel(x, y);
			c /= maxIterations;
			image.setPixel(x, y, c.ToSFMLColor());
		}
	}
	texture.update(image);
}


Results

The light-emitting sphere is clearly visible. You can also see sphere D being slightly lit in the lower right corner.


However, none of the other spheres are lit. I would expect at least a few of the paths that bounce off spheres A and B to bounce towards the light-emitting sphere, brightening those pixels.


[Image: rendered result]

Questions

I'm having a hard time debugging this pixel by pixel, so I'm hoping someone here can make an educated guess about what I'm doing wrong, either from the resulting image or by browsing through the code above.


Any help would be greatly appreciated!


FPS Network Architecture

23 August 2011 - 02:34 PM

Hey forum!

I've been wanting to create a multiplayer FPS.
For the network part I've come up with the following:


  • There will be one central server, acting as a lobby. All clients can connect to this server and request a list of games being hosted. A client can then pick one of those games and ask the server for the information needed to actually join it.
  • Every game is hosted by one player, the host. The host acts as the central point connecting all the clients in the game. The host's gamestate is the one true gamestate, i.e. when one of the clients disagrees with the host's gamestate, that's too bad, because the host is the boss around here.
  • Whenever a client provides input for the game (such as clicking a mouse button, moving the mouse, or pressing a key), this input is sent to the host, which in turn sends it to all other clients.
  • To provide some illusion of smooth motion, gamestates are updated locally according to the locally known input states. But there is always some delay between an actual input and the arrival of the message announcing it. That's why these messages should carry some sort of timestamp, so that when we receive an input update we can handle it as if it really happened at that point in time. To make this work, we need to keep a buffer of gamestates, so we can apply the update to a previous gamestate and recalculate the ones after it (see the sketch below).
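To make the timestamping concrete, this is the kind of message and buffer I have in mind. It's a rough sketch: all names are illustrative, and the actual simulation step is assumed to exist elsewhere:

#include <cstdint>
#include <deque>
#include <vector>

struct InputMessage {
    uint32_t tick;       // simulation tick at which the input actually happened
    uint32_t playerId;
    uint8_t  keysDown;   // bitmask of held keys
    float    mouseDx, mouseDy;
};

struct GameState {
    uint32_t tick;
    // positions, velocities, health, ... omitted in this sketch
};

// Assumed to exist elsewhere: advances a state by one tick, applying the
// given inputs for that tick.
GameState Simulate(const GameState& s, const std::vector<InputMessage>& inputs);

std::deque<GameState>     stateHistory;  // recent states, oldest first
std::vector<InputMessage> inputLog;      // inputs received so far

std::vector<InputMessage> InputsAt(uint32_t tick) {
    std::vector<InputMessage> out;
    for (const InputMessage& m : inputLog)
        if (m.tick == tick) out.push_back(m);
    return out;
}

// A late input arrived: rewind to its tick, then re-simulate up to the
// present with the late input included.
void ApplyLateInput(const InputMessage& in, uint32_t currentTick) {
    inputLog.push_back(in);

    while (!stateHistory.empty() && stateHistory.back().tick > in.tick)
        stateHistory.pop_back();

    while (!stateHistory.empty() && stateHistory.back().tick < currentTick)
        stateHistory.push_back(Simulate(stateHistory.back(),
                                        InputsAt(stateHistory.back().tick)));
}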

But now I'm still having some questions:

  • What sort of timestamp should I use? A simple counter that starts at 0 and increases at, for example, 60 Hz? How do I synchronize this counter across all clients?
  • Would it be okay to handle some collisions only on the server side, such as a bullet hitting a player or a player picking up a health pack? The server sends a message that it happened and voilà, it happened everywhere. Otherwise I'd have to implement some kind of query ("Did I pick up this health pack?"), which seems the inferior choice.
  • How long should I keep old gamestates? Discard anything older than the oldest update from any of the players? But what if somebody in the game is AFK for 15 minutes? That would make everyone's buffers grow pretty big. Maybe send an obligatory status update every 2 seconds or so?
Thank you in advance,
Arjan
