mg_mchenry

Member Since 12 Oct 2005
Offline Last Active Sep 10 2012 11:05 AM

Topics I've Started

Perspective, Camera, Model - basic matrix math

31 August 2012 - 11:27 AM

My matrices are all muddy, and I wonder if anyone has a favorite reference on this.

I have a coordinate system where +x is to the right, +y goes into the screen, and +z is up. But it doesn't always quite work out.

I tried baking my camera matrix into my perspective matrix. Is that a mistake? Should people avoid that?
My camera has x rotation, z rotation, and a location. To get my camera matrix, I start from an identity matrix, do the rotations, then translate by the camera location.

Then I have an orthographic projection matrix. I multiply the projection matrix by the camera matrix and pass the result into the shader. Should I keep them separate?

Well, it's all mixed up. Moving the camera along +z has the opposite of the expected result, and y also seems flipped when moving the camera. Yet objects appear on screen as expected, and translating objects acts as expected.

I think the camera translations need to be inverted?

The modelview matrix works exactly as expected, as does the projection matrix.

[EDIT]
OK, OK! Before anyone attacks: I see now that the camera does not belong in the projection matrix. I took it out and tacked it onto the modelview matrix. Makes sense.
http://www.opengl.org/archives/resources/faq/technical/viewing.htm

It's still not doing the right thing, but I'll work it out.
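For the record, the separation that FAQ describes looks roughly like this in host code. This is a minimal sketch, assuming the gl-matrix 2.x API; every variable name and number here is illustrative, not from the original post. The key point is that the view matrix is the inverse of the camera's world transform, which is exactly why camera translations appear flipped if you forget to invert:

	// Illustrative camera state.
	var cameraPosition = [0, -10, 5];
	var cameraXRotation = Math.PI / 4;
	var cameraZRotation = 0;

	// Build the camera's world transform: translate, then rotate.
	var camera = mat4.create();
	mat4.translate(camera, camera, cameraPosition);
	mat4.rotateZ(camera, camera, cameraZRotation);
	mat4.rotateX(camera, camera, cameraXRotation);

	// The view matrix is the INVERSE of the camera transform:
	// moving the camera along +z must move the world along -z.
	var view = mat4.create();
	mat4.invert(view, camera);

	// Keep the projection separate and combine per object as P * (V * M).
	var projection = mat4.create();
	mat4.ortho(projection, -10, 10, -10, 10, 0.1, 100.0);

	var modelMatrix = mat4.create();
	var modelView = mat4.create();
	mat4.multiply(modelView, view, modelMatrix);
	// Upload projection as uPMatrix and modelView as uMVMatrix.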

Sharp lines with linear filtering look awful (WHY!?)

05 July 2012 - 03:39 PM

My previous post is here:
http://www.gamedev.net/topic/626376-per-vertex-lighting-are-these-ugly-lines-normal/

When trying to light some procedurally-generated terrain, I see these alarmingly bright lines:
[screenshot: lit terrain with bright lines along polygon edges]

These lines run along polygon edges, so I thought maybe doing lighting in the fragment shader would help. In that case the vertex normals are still subject to linear interpolation, and even after re-normalizing, it produced almost exactly the same lines.

I tried vertex smoothing. It helped a little, mainly by removing the areas of highest contrast. I tried tessellating quads into four triangles instead of two (with a newly generated midpoint), but that didn't help at all.

Finally, I decided the problem must be that I want to filter linearly across each quad, and the only place I can do that is when sampling a texture, so I tried packing my lighting data into a texture, with one pixel per vertex.
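The texture setup for that experiment would look roughly like this. A minimal sketch with illustrative names: gridW and gridH are the vertex-grid dimensions, and lightValues holds one RGBA texel per vertex (none of these names are from the original post):

	// One RGBA texel per terrain vertex.
	var gridW = 64, gridH = 64;
	var lightValues = new Uint8Array(gridW * gridH * 4);

	var tex = gl.createTexture();
	gl.bindTexture(gl.TEXTURE_2D, tex);
	gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gridW, gridH, 0,
	              gl.RGBA, gl.UNSIGNED_BYTE, lightValues);
	// LINEAR filtering interpolates bilinearly between texel centers;
	// CLAMP_TO_EDGE also keeps non-power-of-two sizes legal in WebGL.
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);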

Result:
[screenshot: side-by-side comparison, nearest filtering (left) vs. linear filtering (right)]

On the left, you see the terrain rendered with the lighting texture using nearest filtering. You can see there is one pixel per vertex, centered neatly on it. On the right, you see what it looks like with linear filtering. I really, really did not expect to see those sharp lines. I expected to see nice blurry shapes.

I tried resizing the image in paint.net and was surprised that bilinear interpolation in an image-editing tool displays the same lines:
[image: resize comparison in paint.net; bilinear shows the same lines]
The third image is bicubic filtering. That is more like the result I was hoping for.
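For what it's worth, one middle ground that avoids a higher-resolution texture is a standard fragment-shader trick (this is not from the original post; uTexSize, uLightSampler, and vTextureCoord are illustrative names): warp the texture coordinate with smoothstep so the hardware's bilinear fetch follows a smooth curve between texel centers instead of a straight line:

	// Fragment-shader snippet. uTexSize is the lighting texture's
	// size in texels; texel centers sit at integer coordinates here.
	vec2 texel = vTextureCoord * uTexSize - 0.5;
	vec2 f = smoothstep(0.0, 1.0, fract(texel));
	vec2 uv = (floor(texel) + 0.5 + f) / uTexSize;
	vec4 light = texture2D(uLightSampler, uv);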


I feel like I'm doing this all wrong. I don't think I want to generate a higher-res texture just to do some fancy spline interpolation on it. I think I just have to hide the artifacts using some noise from the ground texture, normal smoothing, less contrast, more ambient light, and maybe some noise in the lighting calculation.

Any thoughts are appreciated.

One more thing: in the side-by-side, you can see that on the left side, even though all coloring comes from the fragment shader with no lighting whatsoever (just a nearest-pixel texture lookup), you can still see lines on polygon edges. I have absolutely no explanation for that. Where can that possibly be coming from?

Dumb, dumb, dumb. I accidentally left some lighting in the shader; it wasn't just a texture lookup. With that removed, the lines along polygon edges on the left disappear as expected, but the lines remain in the bilinear filter:
[screenshot: corrected comparison; texture lookup only on the left, bilinear filtering on the right]
It just doesn't look "linear" to me.

Per-vertex lighting - Are these ugly lines normal?

14 June 2012 - 03:12 PM

I am lighting a patch of terrain per-vertex in WebGL, and finding highlights that stand out way too much between some vertices. Is that normal? Does switching to per-pixel lighting magically solve that? If I smooth out the terrain, does it go away?

Here is a live example:
http://dl.dropbox.co...in/terrain.html

[screenshot: terrain with per-vertex lighting and overly bright highlights between vertices]

Separate topic: I am showing the vertex normals to prove they are reasonably sane, and they appear to be. I tried to draw lines by drawing triangles where two points are identical, but nothing was displayed at all, so I offset one of the points by a small amount, which is why the lines look like needles. Is that normal? What is the correct way to render lines?
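On the lines question: WebGL can draw line primitives directly, so the degenerate-triangle trick shouldn't be necessary. A minimal sketch (the buffer, attribute-location, and count names are illustrative):

	// Each normal contributes two vertices, packed into normalLineBuffer:
	// the vertex position and the position offset along the normal.
	gl.bindBuffer(gl.ARRAY_BUFFER, normalLineBuffer);
	gl.vertexAttribPointer(aVertexPositionLoc, 3, gl.FLOAT, false, 0, 0);
	gl.enableVertexAttribArray(aVertexPositionLoc);
	gl.drawArrays(gl.LINES, 0, lineVertexCount); // every vertex pair is one segment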

My main issue is that the linear interpolation of the light weighting between vertices just seems ugly. Will interpolating the normals and calculating the light weighting per pixel improve things at all for a directional light?

Here's my GLSL:

<script id="shader-fs" type="x-shader/x-fragment">
	precision mediump float;

	varying vec4 vColor;
	varying vec3 vLightWeighting;

	void main(void) {
		gl_FragColor = vec4(vColor.rgb * vLightWeighting, vColor.a);
	}
</script>

<script id="shader-vs" type="x-shader/x-vertex">
	attribute vec3 aVertexPosition;
	attribute vec3 aVertexNormal;
	attribute vec4 aVertexColor;

	uniform mat4 uMVMatrix;
	uniform mat4 uPMatrix;
	uniform mat3 uNMatrix;

	uniform vec3 uAmbientColor;

	uniform vec3 uLightingDirection;
	uniform vec3 uDirectionalColor;

	varying vec4 vColor;
	varying vec3 vLightWeighting;

	void main(void) {
		gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
		vColor = aVertexColor;

		// Transform the normal with the normal matrix, then combine the
		// ambient term with the directional term computed per vertex.
		vec3 transformedNormal = uNMatrix * aVertexNormal;
		float directionalLightWeighting = max(dot(transformedNormal, uLightingDirection), 0.0);
		vLightWeighting = uAmbientColor + uDirectionalColor * directionalLightWeighting;
	}
</script>
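On the per-pixel question, a per-fragment variant of the shader above would look roughly like this. A sketch only: the varying name vTransformedNormal is illustrative, and the vertex shader would set it to uNMatrix * aVertexNormal instead of computing vLightWeighting. Re-normalizing the interpolated normal smooths the falloff, though the slope of the lighting can still break at triangle edges:

<script id="shader-fs-perfragment" type="x-shader/x-fragment">
	precision mediump float;

	varying vec4 vColor;
	varying vec3 vTransformedNormal; // uNMatrix * aVertexNormal, from the vertex shader

	uniform vec3 uAmbientColor;
	uniform vec3 uLightingDirection;
	uniform vec3 uDirectionalColor;

	void main(void) {
		// Interpolated normals shrink between vertices, so re-normalize.
		vec3 normal = normalize(vTransformedNormal);
		float directionalLightWeighting = max(dot(normal, uLightingDirection), 0.0);
		vec3 lightWeighting = uAmbientColor + uDirectionalColor * directionalLightWeighting;
		gl_FragColor = vec4(vColor.rgb * lightWeighting, vColor.a);
	}
</script>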

I did something similar to this with per-vertex lighting (although using the fixed-function pipeline, not GLSL) a few years ago and didn't see these kinds of artifacts:
http://claritydevjou...ss-week-11.html
[screenshot: earlier per-vertex-lit terrain without these artifacts]

SlimDX - How to submit minor corrections to samples

22 June 2009 - 04:57 AM

I just started using SlimDX after long-held plans of switching off of MDX (Managed DirectX). The conversion was easy and nearly painless. THANKS! However, while figuring things out, I found a few errors in the samples. What is the best way for me to submit these fixes back to the project?

Lighting Problems - Still

05 December 2007 - 02:26 AM

[Edit: The problem has changed somewhat - check below] I'm using Managed DirectX and C#, with the CustomVertex types. The object you see above is PositionNormalColored. There is no texture. My setup code looks like this:
			device.RenderState.Lighting = true;

			// White directional light
			device.Lights[0].DiffuseColor = ColorValue.FromArgb(0xFFFFFF);
			device.Lights[0].Type = LightType.Directional;
			device.Lights[0].Direction = new Vector3(-1, -0.05f, -1);
			device.Lights[0].Update();
			device.Lights[0].Enabled = true;

			// Mid-gray global ambient
			device.RenderState.AmbientColor = 0x808080;
			//Material material = new Material();
			//material.AmbientColor = ColorValue.FromArgb(0xFFFFFF);
			//device.Material = material;

The normals are correct, and the directional light is working fine. The ambient color does not show up at all.

I have some PositionTextured objects in the scene that appear black with lighting enabled, which is to be expected because they have no normals. If I enable the material, the textured objects show up with the ambient color. However, the PositionNormalColored object shows up with a BLUE ambient color. I cannot get rid of this blue.

If I switch the globe to PositionNormalTextured, I get the ambient lighting, but the directional lighting does not work. I feel like I've tried every combination of everything... Help?

[Edited by - mg_mchenry on December 7, 2007 4:00:11 PM]
