LordSputnik

Member Since 18 Jul 2008
Offline Last Active Aug 29 2012 06:52 AM

Topics I've Started

Normalized Device Co-ordinates to World Space

17 August 2012 - 04:40 PM

I did write a wonderfully detailed post about my problem, then accidentally closed the tab. So this one will be a little briefer, but hopefully still detailed enough to get some help!

Anyway, I'm trying to convert mouse co-ordinates to world space. At the moment, I'm passing in two normalized device co-ordinates, (x,y,-1.0f) and (x,y,1.0f), then transforming them by the inverse of (proj_matrix*view_matrix). I'm expecting to get two points - one on the near clipping plane and one on the far clipping plane, but I'm not. The near plane is 30, and the far plane is 5000, but I'm getting z values of 0.7 and 7 respectively.

I'm not doing any multiplication by any w value to get to clipping space - could that be the problem? If so, how should I get the w value to multiply all the elements by?

Here's the bits of my code that are relevant:

Ray newray(0.2f, -0.2f, -1.0f, 0.2f, -0.2f, 1.0f);
newray.SetMatrices(cam_->GetProjectionMatrix(), cam_->GetViewMatrix());
newray.Calculate();

class Ray
{
  cml::matrix44f_c inv_mat_;
  vector3f start_, end_;

  vector3f transformed_start_, transformed_end_;
public:
  Ray(float sx, float sy, float sz, float dx, float dy, float dz);
  void SetRayEnds(float sx, float sy, float sz, float dx, float dy, float dz);

  void SetMatrices(const cml::matrix44f_c & proj, const cml::matrix44f_c & view);
  void Calculate();

  vector3f GetYIntersection(float y);
};

Ray::Ray(float sx, float sy, float sz, float dx, float dy, float dz) :
	inv_mat_(cml::identity_4x4()),
	start_(sx, sy, sz),
	end_(dx, dy, dz)
{
}

void Ray::SetRayEnds(float sx, float sy, float sz, float dx, float dy, float dz)
{
	start_.set(sx, sy, sz);
	end_.set(dx, dy, dz);
}

void Ray::SetMatrices(const cml::matrix44f_c & proj, const cml::matrix44f_c & view)
{
	inv_mat_ = cml::inverse(proj * view);
}

void Ray::Calculate()
{
	transformed_start_ = cml::transform_point(inv_mat_, start_);
	transformed_end_ = cml::transform_point(inv_mat_, end_);
}

To all the matrix and graphics wizards, what am I doing wrong? Is this the way that you would approach the problem?
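My best guess at the moment: transform the NDC point as a homogeneous (x, y, z, 1) vector and do the perspective divide by the resulting w afterwards, rather than using transform_point(), which I believe skips the divide. A sketch of what I mean (I haven't verified the details of cml's vector4f and operator*, so treat this as pseudocode):

vector3f UnprojectNDC(const cml::matrix44f_c & inv_proj_view, const vector3f & ndc)
{
	// Apply inverse(proj * view) to the homogeneous point (x, y, z, 1).
	cml::vector4f h(ndc[0], ndc[1], ndc[2], 1.0f);
	h = inv_proj_view * h; // after this, w is generally not 1

	// Perspective divide to get back to a world-space position.
	return vector3f(h[0] / h[3], h[1] / h[3], h[2] / h[3]);
}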

Thanks for your help!
Ben

EDIT: World space, not eye space.

Calculating Tangent and Bitangent

04 September 2010 - 12:51 PM

I need some feedback and advice! We have a deferred rendering system which supports animated objects, and we're hoping to add tangent-space normal mapping to it.

The renderer currently sends the following per-pixel to the post renderer:

Diffuse Color
World Space Normals
Specular Intensity and Exponent
World Space Position

The way I see it, there are two methods of doing this:

1. Calculate the tangent and bitangent for each mesh per frame of its animation, using the new object-space positions of the vertices and their UV co-ordinates (see the sketch after this list). This seems very much the traditional method of doing things. The normal can be calculated as the cross product of the tangent and bitangent.

2. Render the UV co-ordinates of each vertex to a render target. Then use the world-space co-ordinate texture to calculate the tangent per fragment in each frame (in screen space). Use a cross product of the global normal texture and the tangent to calculate the bitangent.

From either method, use the tangent and bitangent to apply the tangent space normal map.
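For reference, here's roughly the per-triangle calculation I have in mind for method 1, the standard UV-delta approach. This is only a sketch (Vec2/Vec3 are placeholder types); in practice the per-triangle results would be accumulated at each vertex and orthonormalized:

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

Vec3 Sub(const Vec3 & a, const Vec3 & b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vec3 Scale(const Vec3 & v, float s) { return { v.x * s, v.y * s, v.z * s }; }

// Solve e1 = T*du1 + B*dv1 and e2 = T*du2 + B*dv2 for the tangent T and
// bitangent B, where e1/e2 are edge vectors and du/dv are the UV deltas.
void TriangleTangents(const Vec3 & p0, const Vec3 & p1, const Vec3 & p2,
                      const Vec2 & uv0, const Vec2 & uv1, const Vec2 & uv2,
                      Vec3 & tangent, Vec3 & bitangent)
{
	Vec3 e1 = Sub(p1, p0);
	Vec3 e2 = Sub(p2, p0);
	float du1 = uv1.x - uv0.x, dv1 = uv1.y - uv0.y;
	float du2 = uv2.x - uv0.x, dv2 = uv2.y - uv0.y;

	float r = 1.0f / (du1 * dv2 - du2 * dv1); // assumes non-degenerate UVs
	tangent   = Scale(Sub(Scale(e1, dv2), Scale(e2, dv1)), r);
	bitangent = Scale(Sub(Scale(e2, du1), Scale(e1, du2)), r);
}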

I'm thinking that the second method, although requiring the generation of a UV texture, would still be faster as the calculations are done on the GPU, as opposed to the CPU for the first method. But is it worth the extra texture for the speed boost I'd gain?

Thoughts appreciated!

Sput

GLSL Getting 3D Position from Depth

16 August 2010 - 11:18 AM

Hey everyone! Over the past week, I've been writing a shader which takes the depth value from my depth buffer and two texture co-ordinates, and attempts to use them to reconstruct the 3D world-space position for use in deferred lighting.

However, something is all wrong with the co-ordinates it generates. I'm pretty sure it's something to do with the inverse combined camera and projection matrix I'm using, but I'm not completely sure.

This is the relevant GLSL shader code:

uniform mat4 ModelProject;

vec3 DepthToPos(vec2 texcoord)
{
	vec2 screen = texcoord;
	float depth = texture2D(tex4, screen).x;

	// Map texture co-ordinates and depth from [0,1] into NDC [-1,1].
	screen.x = (screen.x * 2.0) - 1.0;
	screen.y = (screen.y * 2.0) - 1.0;
	depth = (depth * 2.0) - 1.0;

	// Unproject, then do the perspective divide.
	vec4 world = inverse(ModelProject) * vec4(screen, depth, 1.0);
	return world.xyz / world.w;
}

vec3 fragpos = DepthToPos(gl_TexCoord[0].st);

vec4 final;
final.x = fragpos.x;
final.y = fragpos.y;
final.z = -fragpos.z;
final.w = 0.0;
final /= 32.0; // Scale the output down so that values fit in [-1.0,1.0]; float literal, since older GLSL won't divide a vec4 by an int.
gl_FragColor = final;

I have my engine rendering a cube at the moment. The cube is centered around (0.0,0.0,-30.0). Here's the output of my shader:

[screenshot of the shader output]

Taking the center of the cube as an example, you can see that the RGB is:

R: 82
G: 49
B: 133

According to my shader, these values correspond to:

X: 82/255 = 0.322, and 0.322 * 32 = 10.3
Y: 49/255 = 0.192, and 0.192 * 32 = 6.2
Z: 133/255 = 0.522, and 0.522 * 32 = 16.7

And I know for a fact it's a 2x2 cube, meaning that the x and y values should be no bigger than sqrt(1+1) = sqrt(2), about 1.41, right?

I'm passing the matrix in like so:
mat44 CombinedMatrix = ProjectionMatrix * CameraMatrix;

loc = glGetUniformLocation(ShaderID, "ModelProject");
glUniformMatrix4fv(loc, 1, false, CombinedMatrix.data());
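One thing I plan to try: inverting the combined matrix once on the CPU and uploading the inverse, instead of calling inverse() for every fragment. A sketch of what I mean, where Inverse() stands in for whatever inversion routine my maths library provides, and InvModelProject is a hypothetical replacement uniform:

// Upload the inverse once per frame instead of inverting per fragment.
mat44 InvCombined = Inverse(ProjectionMatrix * CameraMatrix); // Inverse() is a placeholder

GLint loc = glGetUniformLocation(ShaderID, "InvModelProject");
glUniformMatrix4fv(loc, 1, GL_FALSE, InvCombined.data());

While I'm at it, I should double-check the storage order of my mat44: if it's row-major, the transpose argument to glUniformMatrix4fv needs to be GL_TRUE.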

So, what's up with it? :S

Thanks for all your help,

Sput

Corrupt Vertex Buffer Objects?

18 December 2009 - 11:58 AM

Hey everyone! I'm having a problem getting vertex buffer objects to retain their data. I've simplified my problem down so that it shows the issue without being too complicated. I initialise an array of GLubyte values in the pattern 0,1,0, 0,1,0, ... (repeated upward-pointing vectors).
if(isShaded)
{
	for(int i = 0; i < Stats[0]; i++)
	{
		NVertices[(i*3)] = 0;
		NVertices[(i*3)+1] = 1;
		NVertices[(i*3)+2] = 0;
	}
}

I generate a series of VBOs:
///Initialise Vertex Buffer Objects
VBuffers = new GLuint[(Stats[5] + Stats[6])];
glGenBuffers((Stats[5] + Stats[6]), VBuffers); //Indices and vertices.

I update the VBO which I use for normal data using this array. I know that the data is intact at this point, because I output a value which I know should be 1.
if(isShaded)
{
	cout << bool(glIsBuffer(VBuffers[2])) << endl;
	cout << "VBuffers[2]: " << VBuffers[2] << endl;
	glBindBuffer(GL_ARRAY_BUFFER, VBuffers[2]);
	glBufferData(GL_ARRAY_BUFFER, VertexProduct[1]*sizeof(GLubyte), NVertices, GL_STATIC_DRAW);
	cout << "NVertices[1]: " << NVertices[1] << endl;
}


I then unbind the GL_ARRAY_BUFFER by binding a zero value, and delete the array holding the data.
cout << "Blank Binding!" << endl;
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

delete[] NVertices; NVertices = NULL; // NVertices is an array allocated with new[], so it needs delete[].

I use glMapBuffer to grab a pointer to the normal data in the VBO, and the data is completely changed.
glBindBuffer(GL_ARRAY_BUFFER, VBuffers[2]);
GLubyte* Pointer = (GLubyte*)glMapBuffer(GL_ARRAY_BUFFER, GL_READ_WRITE);
for(int i = 0; i < Stats[0]; i++)
{
	cout << "NVertices[" << (i*3) << "]: "  << int(Pointer[(i*3)]) << "\t";
	cout << "NVertices[" << (i*3)+1 << "]: "  << int(Pointer[(i*3)+1]) << "\t";
	cout << "NVertices[" << (i*3)+2 << "]: "  << int(Pointer[(i*3)+2]) << "\t" << endl;
}

glUnmapBuffer(GL_ARRAY_BUFFER);

glBindBuffer(GL_ARRAY_BUFFER, 0);
Pointer = NULL; // Don't delete this pointer: it belongs to the GL, and glUnmapBuffer above has already invalidated it.
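As a cross-check, I could also read the data back with glGetBufferSubData instead of mapping the buffer. A sketch, where byte_count stands in for the size of the normal data (presumably 3 * Stats[0] bytes):

// Alternative read-back: copy the buffer contents into a client-side array.
GLubyte* copy = new GLubyte[byte_count];
glBindBuffer(GL_ARRAY_BUFFER, VBuffers[2]);
glGetBufferSubData(GL_ARRAY_BUFFER, 0, byte_count, copy);
glBindBuffer(GL_ARRAY_BUFFER, 0);

for(int i = 0; i < Stats[0]; i++)
{
	cout << int(copy[(i*3)]) << " " << int(copy[(i*3)+1]) << " " << int(copy[(i*3)+2]) << endl;
}

delete[] copy; // This copy is client-owned, unlike the mapped pointer.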

Here's the output from my code; each line of 3 NVertices comprises the xyz of a normal: http://pastebin.com/mdc08864

An example from the output is the NVertices[1] byte. Initially it is set to 1, as it should be. But when I grab the data back from the graphics card, the same location in the array is now 4.

I've been trying to figure this out all of today, and I just can't see where I'm going wrong. I've checked tutorials and read the whole section in the Red Book again, but I can't make any sense of it. Any help would be much appreciated!

Many Thanks,
LordSputnik

[Edited by - LordSputnik on December 19, 2009 6:15:43 AM]

Circle centre

26 November 2009 - 10:39 AM

Hey everyone! I was wondering, is it possible to find the centre of a circle given two points on the circumference? I thought not, since these two points could be any distance apart, so you could only find a centre if the radius was known. Many thanks!
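To make the reasoning concrete: if the radius r were known as well, there would generally be two candidate centres, one on either side of the chord, both lying on its perpendicular bisector. A quick sketch of that calculation:

#include <cmath>

struct Point { float x, y; };

// Given two points on the circumference and a known radius, find the two
// possible centres. Returns false if the points are further apart than the
// diameter (or coincide).
bool CircleCentres(Point p1, Point p2, float r, Point & c1, Point & c2)
{
	float dx = p2.x - p1.x, dy = p2.y - p1.y;
	float d = std::sqrt(dx * dx + dy * dy); // chord length
	if (d == 0.0f || d > 2.0f * r)
		return false;

	Point mid = { (p1.x + p2.x) * 0.5f, (p1.y + p2.y) * 0.5f };
	float h = std::sqrt(r * r - (d * 0.5f) * (d * 0.5f)); // midpoint-to-centre distance
	float ux = -dy / d, uy = dx / d; // unit perpendicular to the chord

	c1 = { mid.x + h * ux, mid.y + h * uy };
	c2 = { mid.x - h * ux, mid.y - h * uy };
	return true;
}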
