
#5294781 Computing Matrices on the GPU

Posted by StanLee on 03 June 2016 - 08:13 AM



I'm in a situation where I need to create a large number of model-view-projection (MVP) matrices (around 100 or more) out of normal and position vectors which are stored in textures.

I already implemented this by downloading the two textures and constructing the matrices on the client side, but, as expected, this is very slow; the performance impact is simply too big.


How can I do this entirely on the GPU? I don't even need the matrices on the client side; they are only used later for rendering.


My idea so far is to use an SSBO (Shader Storage Buffer Object) to store the matrices. I take every sample point of my texture (each of which corresponds to one matrix), put it into a VBO, and render it to a screen quad. This should result in one fragment shader invocation per sample point. In the fragment shader I then sample the position and the normal from my textures, construct an MVP matrix, and store it in the SSBO. To get the right index into the SSBO, every sample point is passed to the vertex shader together with its corresponding SSBO index.
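The per-point work described above boils down to building a transform from one position and one normal. A minimal sketch of that construction (in numpy, standing in for the shader math; `model_matrix` and the choice of helper axis are my own illustration, not the poster's actual code):

```python
import numpy as np

def model_matrix(position, normal):
    """Build a 4x4 model matrix whose local z-axis is aligned with
    `normal` and whose translation is `position`. Hypothetical helper
    illustrating the per-sample-point construction."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    # Pick a helper axis that is not (nearly) parallel to the normal.
    helper = np.array([0.0, 1.0, 0.0]) if abs(n[1]) < 0.99 else np.array([1.0, 0.0, 0.0])
    x = np.cross(helper, n)
    x /= np.linalg.norm(x)
    y = np.cross(n, x)          # completes the orthonormal basis
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2] = x, y, n
    m[:3, 3] = position         # translation in the last column
    return m

M = model_matrix([1.0, 2.0, 3.0], [0.0, 0.0, 1.0])
```

Multiplying this by the view and projection matrices (which are uniform per frame) yields the MVP, so only the part above actually varies per sample point.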


The question is: is this the fastest way to generate the matrices, or are there faster alternatives? I also thought about using 2D textures to store the matrices, but that would require four texture fetches later to read one matrix (assuming a four-channel texture), and I don't know whether that is faster than using SSBOs.
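The "four fetches" figure follows from the storage layout: a mat4 is sixteen floats, and an RGBA texel holds four, so each matrix spans exactly four texels. A small sketch of that packing (column-major, matching GLSL's default matrix layout; the helper names are my own):

```python
import numpy as np

def pack_mat4(m):
    # Each column of the 4x4 matrix becomes one RGBA texel,
    # so one matrix occupies exactly four texels.
    return [tuple(m[:, i]) for i in range(4)]

def unpack_mat4(texels):
    # Reassemble the matrix from four fetched texels.
    return np.array(texels, dtype=float).T

m = np.arange(16, dtype=float).reshape(4, 4)
texels = pack_mat4(m)           # four "RGBA" tuples -> four fetches
restored = unpack_mat4(texels)  # round-trips back to the original matrix
```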




#5043003 Camera for Raytracing

Posted by StanLee on 14 March 2013 - 04:09 AM

Thanks a lot! Your code helped me figure out that I had completely forgotten to normalize the scale of my image plane.
Before normalization my image plane was centered around (0, 0, 500) with its top-left corner at (-512, 384, 500) in world space, so without adjusting the object positions and sizes absolutely nothing could be seen.
Now I used your proposed equation for the focal length f:


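The equation itself did not survive in the post, but a standard relation between horizontal field of view and focal length, consistent with the 1024-unit-wide image plane above, is f = (w / 2) / tan(θ / 2). A hedged sketch, assuming this is the equation meant:

```python
import math

def focal_length(image_width, hfov_degrees):
    # f = (w / 2) / tan(theta / 2): the distance at which an image
    # plane of width w subtends a horizontal field of view of theta.
    # Assumed formula; the original equation was lost in extraction.
    return (image_width / 2.0) / math.tan(math.radians(hfov_degrees) / 2.0)

f = focal_length(1024, 90.0)  # plane spanning -512..512, 90 degree FOV
```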
Everything works fine now, though I still have a question: when I choose a wide horizontal field of view, say 120°, spheres near the border of the screen appear stretched. Is this the so-called "fisheye" effect caused by the wide field of view?