I am importing various animations from PowerVR POD files. I need to convert between matrices and quaternions, since I want to implement animation blending and inter-frame interpolation.
My problem is that, when reading data exported from PowerVR as quaternions, translations and scalings, the matrices built from those TRS values do not match (not even closely) the matrices I read if I export the same models with matrices.
Also, if I try to decompose a matrix into TRS, then after extracting the scaling component (and removing the translation component) the resulting matrix is not a rotation matrix: its columns are unit length, but its rows are not.
For example, when I export as matrices (the model has about 100 bones, with about 250 frames), matrix number 11174 is, in column order:
Perhaps there is some bug in my conversion code, although I have checked it a thousand times. I have also created random translations, rotations and scalings, converted them to matrices, converted those back to TRS, and got the same values. That is why I have posted the parameters read from the POD file here.
I received no answer on the Imgtec site.
Or perhaps the frame matrices do not come from a simple T*R*S transform and there is something more to them.
The animations are done in 3ds Max.
Any solution or explanation for this is welcome.
Needless to say, when the model is driven by matrices rebuilt from the TRS read from the POD file, it deforms horribly. The first frame looks good, but the following ones do not.
I am developing fractal terrain generation based on an original 256x256 heightmap.
I need to use a texture to represent this heightmap together with other gradients at each point, i.e., 3 floats per 'texel'.
In the vertex shader, I am doing:
    uniform sampler2D TexHeightMap;

    vec4 GetVertCellParameters( uint i, uint j )
    {
        return texture( TexHeightMap, vec2( i, j ) ) * 2500.0f;
    }
The * 2500.0f multiplication is only there to make sure the values are not somewhere between 0 and 1; in the final code the function will just return texture( TexHeightMap, vec2( i, j ) );
On the CPU, I am using:
For some reason, I am only getting zeros in the vertex shader (when calling GetVertCellParameters).
I checked the pvBytes parameter in the memory view and the data looks fine.
I just need to get the float values in the vertex shader unmodified, exactly as I send them, and access them via texture() as a vec4.
I am working on a simple terrain demo and I have a 256x256 height-map that stores not only height but also other information, each grid point holding 8 floats.
I need this grid available in video card memory (so I can create a VBO or something similar).
Most importantly, in the vertex shader, when drawing the terrain: given the x and y of a vertex (Z is vertical), I will compute the grid cell in which the x-y point lies, and then I need to access, in the vertex shader, the 8 floats stored at the closest of the 256x256 grid points of the height map.
If someone has a suggestion on how to do it, any help is appreciated.