Bonehed316

Matrices, shaders, and pointers


Okay, this is one of those problems that shouldn't be a problem. Or maybe an anomaly in the earth's magnetic field has developed around my house, making all kinds of weird things happen. Yes, that must be it.

So, I've got this engine working pretty decently so far (still very early on). I've set up my own math-related classes for doing 3D math in my engine so that I can send those numbers to the renderer, which is decoupled from any specific API (although designed around the features of DirectX, as that is the API I am working with at the moment). All of that, in theory, works well. My 3D math skills are pretty lame, but I think I know how to find the basic formulas in books and other resources and transfer them into code (the only true use for derivatives, imo).

Anyway, on to my problems. What I am currently doing is merely rendering a test cube while I set up all the features of my renderer, so I can see when things go wrong. This has evolved throughout development to use the current interfaces of my renderer and other engine components. Currently, I am setting the transform matrices from the main loop, in the absence of a more proper way just yet (such as a scene graph, etc.), using my own matrix and vector structures. It looks something like this:
	Interface->SetTransform( TT_WORLD, FMatrix::Identity());
	Interface->SetTransform( TT_VIEW, FMatrix::Identity().CameraLookAtMatrixLH(FVector( 0.0f, 3.0f,-10.0f ),FVector(),FVector(0.0f,1.0f,0.0f)));
	Interface->SetTransform( TT_PROJECTION, FMatrix::Identity().ProjectionMatrixPerspectiveFOVLH(PI/4.0f,Viewport->SizeX/Viewport->SizeY,1.0f,1000.0f ));
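
For reference, here is roughly what those two types look like; this is a simplified sketch with approximate member names, not the exact code:

	struct FVector
	{
		float X, Y, Z;

		FVector() : X(0.0f), Y(0.0f), Z(0.0f) {}        // default constructor: 0,0,0
		FVector( float InX, float InY, float InZ ) : X(InX), Y(InY), Z(InZ) {}
	};

	struct FMatrix
	{
		float m[16];                                     // 4x4, assuming row-major storage here

		// Static function that returns an instance initialized to an identity matrix.
		static FMatrix Identity()
		{
			FMatrix Result;
			for( int i = 0; i < 16; ++i )
				Result.m[i] = ( i % 5 == 0 ) ? 1.0f : 0.0f;   // 1s on the diagonal
			return Result;
		}

		operator float*() { return m; }                  // the float* cast used for shader constants
	};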

What this is doing is calling the Identity() method, which is a static function that returns an instance of the struct initialized to an identity matrix. It works as it is supposed to. That value can then be used to set a camera look-at matrix or a perspective matrix. The FVector struct initializes to 0,0,0 with a default constructor, and in this instance it is the look-at parameter. These numbers have been used since the early beginnings of my renderer, so I know they are valid and should work, and they do work ;)

So, what is the problem, you ask? Well, to transform the cube I'm rendering, I've been using a vertex shader and sending the matrix to the shader constants as you are supposed to. There are two problems with this procedure, however.

First, the matrix I am giving to the hardware is World*View*Projection, which I believe would be correct, except it isn't. Maybe I read wrong (in several different places), but I am quite sure that the matrices are set correctly and are correctly referenced when multiplying. Like so:

	WantedState.WorldViewProjection = (WantedState.Matrices[TS_WORLD] * WantedState.Matrices[TS_VIEW]) * WantedState.Matrices[TS_PROJECTION];

The result, however, is incorrect, but it becomes correct if the order of the projection and view matrices is reversed. I know matrix multiplication isn't commutative, but isn't it supposed to be world*view*projection, or am I just crazy, illiterate, and blind? My projection matrix code, or even the multiplication code, could be wonky, but it seems to work (I suck at math, so I can't do it longhand to check). I'll post more code if anyone thinks it would help.

Second, I'm setting the result of the multiply/transpose into a struct in my render state management code, which holds the shader constant data needed to make the call (StartRegister, Data, and RegisterCount) until the render state is about to be committed to hardware (so no redundant values are ever set). The matrix is passed as a float*, and the pointer is valid in the debugger. This all happens after the vertices are sent to the vertex buffer. The problem is, calling CreateIndexBuffer causes the pointer to my matrix data to become invalid. It's valid right before the call, but never right after, even though the matrix it refers to hasn't been touched (the pointer actually points to the float m[16]; inside the matrix, which is what the float* cast returns). Is this some trick of pointers that I'm missing, or some fluke, or some evil that harasses me for sport? I can reset the pointer after creating the index buffer, and then it stays valid through the call to DrawIndexedPrimitive and works fine, OR allocating the pointer data with new works as well, but that either creates a memory leak or means I have to constantly delete[] the pointer before setting new data, which makes my design cumbersome.
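
For the multiplication-order question, the convention I've been trying to match is the usual D3DX row-vector one, where v' = v * World * View * Projection and the combined matrix gets transposed before being uploaded as vertex shader constants. Just for comparison (this is plain D3DX/D3D9 code, not my classes; Device and Aspect stand in for my device pointer and viewport aspect ratio):

	// Reference path using d3dx9.h, same camera numbers as above.
	D3DXMATRIX  World, View, Proj, WVP;
	D3DXVECTOR3 Eye( 0.0f, 3.0f, -10.0f ), At( 0.0f, 0.0f, 0.0f ), Up( 0.0f, 1.0f, 0.0f );

	D3DXMatrixIdentity( &World );
	D3DXMatrixLookAtLH( &View, &Eye, &At, &Up );
	D3DXMatrixPerspectiveFovLH( &Proj, D3DX_PI / 4.0f, Aspect, 1.0f, 1000.0f );

	WVP = World * View * Proj;                  // world*view*projection, same order I use
	D3DXMatrixTranspose( &WVP, &WVP );          // HLSL constants default to column-major packing
	Device->SetVertexShaderConstantF( 0, (const float*)&WVP, 4 );

The vertex shader side of that is just output.Pos = mul( input.Pos, WorldViewProj ); with the matrix sitting in c0-c3. If my own multiply and projection code really do match D3DX, then my world*view*projection should behave the same way.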
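
And here's roughly the shape of the render state management bit from the second problem, again simplified with made-up names, so you can see where that float* lives between being set and being committed (the commit eventually boils down to SetVertexShaderConstantF; the transpose is omitted here for brevity):

	// Pending vertex shader constant data, held until the state is committed to hardware.
	struct FPendingShaderConstant
	{
		unsigned int StartRegister;
		const float* Data;              // points at a matrix's internal float m[16]
		unsigned int RegisterCount;     // float4 registers; 4 for a 4x4 matrix
	};

	// Filled in when the transforms change:
	FPendingShaderConstant Constant;
	Constant.StartRegister = 0;
	Constant.Data          = (float*)WantedState.WorldViewProjection;   // float* cast returns m
	Constant.RegisterCount = 4;

	// ...vertices go into the vertex buffer (pointer still valid)...
	// ...CreateIndexBuffer gets called (pointer no longer valid after this)...

	// Later, when the render state is committed:
	Device->SetVertexShaderConstantF( Constant.StartRegister, Constant.Data, Constant.RegisterCount );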
