
#5278151 Directly access XMMATRIX elements

Posted by on 25 February 2016 - 01:42 PM


1) The OP is asking why he's getting an error trying to access "members _11 _21 _31" etc. I have correctly answered why this is occurring. You have not.

2) While I do assume the OP is using Visual Studio, since we are dealing with the DirectX Math library here, the advice of "going to the definition" of XMMATRIX is sound and teaches the OP how to get to the root of the problem.

3) I have not recommended disabling SIMD, but it appears to be the only way to access elements NAMED _11, _21, _31, etc. if you need to do so for learning purposes.


4) There are additional ways to access the individual components of the matrix, including the one you pointed out.
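(For reference, one common way, which may or may not be the one meant above, is to copy the XMMATRIX into the XMFLOAT4X4 storage type with XMStoreFloat4x4; that struct always exposes the named members regardless of whether intrinsics are enabled. A minimal sketch:)

    #include <DirectXMath.h>
    #include <cstdio>

    int main()
    {
        using namespace DirectX;
        XMMATRIX mat = XMMatrixTranslation(1.0f, 2.0f, 3.0f);

        // Copy into the storage type; XMFLOAT4X4 always has the named members.
        XMFLOAT4X4 f;
        XMStoreFloat4x4(&f, mat);
        std::printf("_41 = %f, _42 = %f, _43 = %f\n", f._41, f._42, f._43);
        return 0;
    }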

#5277972 Directly access XMMATRIX elements

Posted by on 24 February 2016 - 04:59 PM

Hi there,

Seems like you need to get a little more familiar with C++ and your dev environment. I assume you're using Visual Studio... if so, you can right-click on any class name and select "Go To Definition".

If you do that for XMMATRIX, it will bring you to this bit of code:


    #ifdef _XM_NO_INTRINSICS_
        union {
            XMVECTOR r[4];
            struct {
                float _11, _12, _13, _14;
                float _21, _22, _23, _24;
                float _31, _32, _33, _34;
                float _41, _42, _43, _44;
            };
            float m[4][4];
        };
    #else
        XMVECTOR r[4];
    #endif

As you can see, if _XM_NO_INTRINSICS_ is defined you have access to "float _11, _12, _13, _14;" etc. via the union; otherwise the data for the structure is defined as XMVECTOR r[4];

So the simple answer would be to define _XM_NO_INTRINSICS_ if you NEED access to those members for learning purposes. However, judging by the name of that define, I assume you will lose all SIMD optimizations!
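A minimal sketch of what that looks like (the define has to appear before the header is included, in every translation unit that uses it):

    // Define before including DirectXMath.h so XMMATRIX falls back to the
    // scalar (non-SIMD) layout that exposes the named _11.._44 members.
    #define _XM_NO_INTRINSICS_
    #include <DirectXMath.h>
    #include <cstdio>

    int main()
    {
        DirectX::XMMATRIX mat = DirectX::XMMatrixIdentity();
        // Only compiles with _XM_NO_INTRINSICS_ defined:
        std::printf("_11 = %f, _22 = %f\n", mat._11, mat._22);
        return 0;
    }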

Good luck!

#5270185 Temporally Smoothing SSAO?

Posted by on 08 January 2016 - 06:16 PM


I ran across this blog post today, so I thought I'd link it here for you. It discusses temporal reprojection of SSAO.


#5264764 Questions on Baked GI Spherical Harmonics

Posted by on 03 December 2015 - 11:57 AM

Just a quick semi-answer whilst glancing over your question...

Here's the GDC presentation by Robert Cupisz


And a link to his website, where he's reposted the slides along with a video and some more info.


#5169914 Reconstructing Position From Depth Buffer

Posted by on 28 July 2014 - 07:28 PM

Hi BlueSpud,


It's actually quite simple to understand if you think of it this way. To retrieve a view-space coordinate:


- the screen coordinate comes in as 0 to 1 in x and y
- you remap that to -1 to 1 like so: screencoord.xy * 2 - 1 (you might have to flip the sign of the results depending on the API)
- you now have an xy value you can picture as lying on the near plane of your camera... so the z coordinate of the vector is whatever your near plane value is
- you now have to figure out how to scale the -1 to 1 xy values properly to represent the dimensions of the near plane "quad" in view space... this is easy
- you just use some trig to figure out the width and height of the "quad" at the near plane... basically it's tan(FOV * 0.5) * near
- you'll also probably have to multiply that by the aspect ratio for x

- now after you do all this you will have calculated a vector from the eye to the near plane
- you just have to scale this vector by the depth of that pixel, accounting for the ratio to the near plane (i.e. multiply by depth / near)

Basically, that's how you think of it... if you draw some pictures you should be able to get it. There's a rough sketch of the math below.
In order to recover the position in another space it's basically the same, but you move along that space's axis directions in x/y/z and then add the eye position.
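Here's that view-space reconstruction written out as plain CPU-side C++ (the function and variable names are mine, and it assumes a symmetric perspective projection and a linear view-space depth value), so you can sanity-check the numbers before porting it to a shader:

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // u, v: screen coordinates in 0..1; linearDepth: view-space distance along +z.
    // fovY is the full vertical field of view in radians.
    Vec3 ReconstructViewPos(float u, float v, float linearDepth,
                            float fovY, float aspect, float nearZ)
    {
        // Remap 0..1 -> -1..1 (flip the sign of y here if your API needs it).
        float x = u * 2.0f - 1.0f;
        float y = v * 2.0f - 1.0f;

        // Half-extents of the near-plane "quad" in view space:
        // tan(FOV * 0.5) * near, with the aspect ratio applied to x.
        float halfH = std::tan(fovY * 0.5f) * nearZ;
        float halfW = halfH * aspect;

        // Vector from the eye to this pixel on the near plane.
        Vec3 ray = { x * halfW, y * halfH, nearZ };

        // Scale the ray so its z equals the pixel's view-space depth (depth / near).
        float t = linearDepth / nearZ;
        return Vec3{ ray.x * t, ray.y * t, ray.z * t };
    }

    int main()
    {
        Vec3 p = ReconstructViewPos(0.5f, 0.5f, 10.0f,
                                    1.0472f /* ~60 degrees */, 16.0f / 9.0f, 0.1f);
        std::printf("%f %f %f\n", p.x, p.y, p.z); // expect roughly (0, 0, 10)
        return 0;
    }

The centre of the screen at a depth of 10 should come out as (0, 0, 10); once that checks out, the same few lines drop straight into a pixel shader.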

#5157671 Ideas for rendering huge vegetation (foliage)

Posted by on 02 June 2014 - 05:22 PM

You may find this recent series of devblogs by Casey Muratori interesting. He goes into detail about his grass planting system; it's less about the rendering side of things and more about proper placement of the ground cover itself to get the best coverage with the least amount of geometry. I don't think he goes into anything about LOD, but it does get you thinking about how to provide the best coverage.


It starts in "Working On The Witness, Part 5" and continues through his reasoning into Part 8, where he has some code for optimal placement.



#5155907 Doing local fog (again)

Posted by on 25 May 2014 - 12:02 PM

Hey there. This came out a week ago and may be of interest to you. There are some good videos on the site, so you can see if the effect is to your liking before even reading the paper. It's essentially a ray-marching post-process. I believe he's also working on a demo, which will be nice.


#5077653 Image Based Reflections - DX11

Posted by on 14 July 2013 - 01:02 PM

Thanks for the tips Hodgman!

I'm going to give it a try when I get back to work next week. I'll report my findings if I'm able to pull it off. I like how you're sampling the different mips à la irradiance map; I wouldn't have thought of that. Do you fade near the edges of the quad? Or do some sort of clamping?

#5077453 Image Based Reflections - DX11

Posted by on 13 July 2013 - 06:38 PM



That is a great article but not the method that the Unreal Engine is using.

If you read the comments on one of the videos that accompany that article, the author (Sébastien Lagarde) states:


"The algorithm aim to replace dynamic reflection. Goal is performance. So all is static and computed offline (No characters). All the details + code can be found at the links in the description of the video.The algorithm was design for current gen platform DX9/PS3/XBOX360, for modern platform there can be better way. Image-based reflection of Unreal are better quality but at a higher cost. All depends on your targets framerate."

So I'm wondering if anyone out there has implemented the Unreal method!

#5077442 Image Based Reflections - DX11

Posted by on 13 July 2013 - 05:47 PM

I've been reading a bit about Image Based Reflections as seen in the Epic Samaritan Demo. There's a bit of information on how they work over on this UDN page: http://udn.epicgames.com/Three/ImageBasedReflections.html

I was wondering if anyone has attempted supporting this type of reflection in their own engine, or had any links they could point me to so I can learn more about how this is achieved. Since it seems to be just a quad reflector, you could probably check intersection inside a pixel shader to determine texture coordinates for the quad (rough sketch of what I mean below). I'm wondering how this might be accelerated to robustly support multiple quads in a scene and limit the intersection checks.
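Something like this is the kind of per-pixel test I have in mind, written here as plain C++ rather than shader code; the rectangular-quad parameterisation and the names are my own guess, not anything taken from the UDN page:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    static Vec3  sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3  cross(Vec3 a, Vec3 b) {
        return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
    }

    // Rectangular reflector described by one corner and two perpendicular edges.
    struct ReflectorQuad { Vec3 corner, edgeU, edgeV; };

    // Intersect a reflection ray with the quad's plane; if the hit lies inside
    // the quad, return true and the texture coordinates to sample it with.
    bool IntersectQuad(Vec3 origin, Vec3 dir, const ReflectorQuad& q, float& u, float& v)
    {
        Vec3 n = cross(q.edgeU, q.edgeV);
        float denom = dot(n, dir);
        if (std::fabs(denom) < 1e-6f) return false;   // ray parallel to the quad

        float t = dot(n, sub(q.corner, origin)) / denom;
        if (t < 0.0f) return false;                   // quad is behind the ray

        Vec3 hit = { origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t };
        Vec3 rel = sub(hit, q.corner);

        // Project onto the edges to get 0..1 texture coordinates.
        u = dot(rel, q.edgeU) / dot(q.edgeU, q.edgeU);
        v = dot(rel, q.edgeV) / dot(q.edgeV, q.edgeV);
        return u >= 0.0f && u <= 1.0f && v >= 0.0f && v <= 1.0f;
    }

For multiple quads you'd presumably loop over (or cull down to) a small list per pixel, which is exactly the part I'm asking about.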


Thanks in advance!

#5010736 DX9 -> Dx11 Port... Nothing being drawn?

Posted by on 14 December 2012 - 03:06 PM

Haha! Did you have the same issue, or are you just saluting me for figuring it out? It may seem trivial, but when you're in the middle of a port that took 3 days to compile and run without errors, it's hard to zero in. Hoping this post helps someone out in the future!

#5010468 DX9 -> Dx11 Port... Nothing being drawn?

Posted by on 13 December 2012 - 10:34 PM


If anyone else is having this problem... you want to make sure you see the viewport outline in PIX. The vertex values after the transforms were correct, in that they should have been displayed on screen. But the viewport structure itself wasn't properly set up (D3D11_VIEWPORT and RSSetViewports): it was set to 0 pixels x 0 pixels, because setting up the backbuffer is somewhat of a special case in that you have to grab it from the swap chain. A sketch of the setup is below.
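Roughly what the correct setup looks like (helper and variable names are mine; error checking omitted):

    #include <d3d11.h>

    // Bind the swap chain's backbuffer and a viewport that actually covers it.
    void BindBackbuffer(ID3D11Device* device, ID3D11DeviceContext* context,
                        IDXGISwapChain* swapChain, UINT width, UINT height)
    {
        // The "special case": the backbuffer texture comes from the swap chain.
        ID3D11Texture2D* backBuffer = nullptr;
        swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                             reinterpret_cast<void**>(&backBuffer));

        ID3D11RenderTargetView* rtv = nullptr;
        device->CreateRenderTargetView(backBuffer, nullptr, &rtv);
        backBuffer->Release();

        context->OMSetRenderTargets(1, &rtv, nullptr);

        // The part that bit me: if this is left zeroed, the viewport is
        // 0 x 0 pixels and nothing ever reaches the backbuffer.
        D3D11_VIEWPORT vp = {};
        vp.Width    = static_cast<float>(width);
        vp.Height   = static_cast<float>(height);
        vp.MinDepth = 0.0f;
        vp.MaxDepth = 1.0f;
        context->RSSetViewports(1, &vp);
    }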


#5010445 DX9 -> Dx11 Port... Nothing being drawn?

Posted by on 13 December 2012 - 08:43 PM

Wait a minute, isn't there supposed to be a rectangle in PIX representing the viewport?
I guess that means I'm way off!

#5010421 DX9 -> Dx11 Port... Nothing being drawn?

Posted by on 13 December 2012 - 07:30 PM

- I'm clearing the backbuffer to a different color every frame, and that is properly flickering
- PIX shows that there is output to the viewport, as you can see in the image
- Nonetheless, it appears no pixels are being written to the backbuffer

Any ideas? Thanks!

[attached image]

#4903554 Cascade Stability

Posted by on 17 January 2012 - 02:58 AM


There is a way to calculate the proper scale and offset directly from your cascade shadow view-proj matrices. It's right in the article, but it wasn't working due to a bug in my matrix inversion code.

The way to get the scale remains

float scale[n] = splitRadius[0] / splitRadius[n]

The offset is calculated as

float4 offset[n] = float4(0.0f, 0.0f, 0.0f, 1.0f) * Inverse(ShadowMat[n]) *