Picking in DX11

This topic is 1757 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.


Hello,

I'm trying to get picking working by following this tutorial: http://www.rastertek.com/dx11tut47.html

I'm not sure if it's me, or if the author confuses coordinate spaces at the end of this tutorial. Can someone take a fresh look at this? Namely, he states that multiplying a vector by the inverse view matrix gives a result in view space. Shouldn't it be in world space? And then we go from world space into object space and do the final test there? His ray intersection doesn't take the sphere's position into account, so the final test looks like it's in object space, but he also says it's in world space... So yeah, thoughts?

Edited by keym


As far as I understand things, a typical view matrix is "inverted" in the sense that if the camera's own transform is expressed in world space, then the matrix required to transform things from world space into the camera's space is the inverse of that camera matrix. The naming gets confusing: I use "camera matrix" for the matrix that defines the camera's location and orientation in world space, and "view matrix" for its inverse.

You can confirm this in many code samples where the view matrix is constructed. The code rarely calls an actual matrix inversion, though, since in the view matrix case the inverse can be computed cheaply.
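For example, for a rigid transform (rotation plus translation, no scale) the inverse can be written out directly instead of running a general matrix inversion. A minimal sketch, using my own made-up layout rather than any particular API:

```cpp
// Rigid transform: orthonormal 3x3 rotation plus a translation.
// For such a matrix, inverse = (R transposed, -R^T * t), which is why
// look-at helpers can build the view matrix without a general inversion.
struct Mat34 {
    float r[3][3];  // rotation rows
    float t[3];     // translation
};

Mat34 InvertRigid(const Mat34& m) {
    Mat34 inv;
    // R^-1 == R^T when R is orthonormal (pure rotation).
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            inv.r[i][j] = m.r[j][i];
    // t' = -R^T * t
    for (int i = 0; i < 3; ++i)
        inv.t[i] = -(inv.r[i][0] * m.t[0] +
                     inv.r[i][1] * m.t[1] +
                     inv.r[i][2] * m.t[2]);
    return inv;
}
```

So a camera matrix with identity rotation and translation (1, 2, 3) inverts to a view matrix translating by (-1, -2, -3), matching the intuition that the view matrix moves the world opposite to the camera.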

Yes, in the code the ray is transformed into local/object space by the inverse world matrix of the sphere. The beauty of this is that in local space the sphere sits at the origin (0,0,0), so translation doesn't have to be accounted for in the ray-sphere intersection test.

The advantage of this technique is that it also supports things like scaling / non-uniform scaling in the world matrix. The ray-sphere test itself always stays the same; only the ray's position and direction change.
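A rough sketch of that object-space test (naming is mine, not the tutorial's): once the ray has been transformed by the inverse world matrix, the sphere is centered at the origin and no center term appears in the quadratic:

```cpp
// Minimal vector helpers for the sketch.
struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Ray-sphere test in object space. The sphere sits at the origin with the
// given radius, so no translation shows up in the quadratic coefficients.
bool RaySphereIntersect(const Vec3& rayOrigin, const Vec3& rayDir, float radius) {
    float a = Dot(rayDir, rayDir);
    float b = 2.0f * Dot(rayOrigin, rayDir);
    float c = Dot(rayOrigin, rayOrigin) - radius * radius;
    // Real roots of a*t^2 + b*t + c = 0 mean the ray hits the sphere.
    return (b * b - 4.0f * a * c) >= 0.0f;
}
```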

Cheers!

Edited by kauna


Well... shouldn't it be this simple:

object space ----[world a.k.a. model matrix]----> world space

world space ----[view a.k.a. camera matrix]----> view space

view space ----[projection matrix]----> clip space

object space <----[inverse world a.k.a. model matrix]---- world space

world space <----[inverse view a.k.a. camera matrix]---- view space

view space <----[inverse projection matrix]---- clip space

?

Anyway, this is how it *seems right* to me, but I'm not a guru here. Maybe I'm being picky ;) about naming, and that wasn't the intention of this topic (but I still wanted to clarify the naming before asking my question(s) and causing more confusion).

So, the reason I'm posting is that (obviously) I have a problem with picking. The catch is that my renderer uses a right-handed coordinate system, like OpenGL (for the sake of compatibility: I have an OpenGL renderer in this app too, and I don't want to negate every needed value to get the same result; that would only cause more errors in the future).

So I construct my projection matrix with D3DXMatrixPerspectiveFovRH() and my view matrix with D3DXMatrixLookAtRH(). Before sending them to HLSL I transpose them (I have to, otherwise I get incorrect results: D3DX stores matrices row-major, but HLSL packs them column-major by default). All is sweet and dandy until picking occurs. I'm pretty sure I'm doing something wrong, because this is my first attempt at renderer-independent picking. I follow what's in the tutorial, but the intersection test gives incorrect results. For simplicity my sphere is at (0,0,0), so I don't have to care about the world and inverse world matrices. I'm guessing something is wrong with my matrices, but it's hard to track down.

Also I'm not sure what's going on here (tutorial):

// Adjust the points using the projection matrix to account for the aspect ratio of the viewport.
m_D3D->GetProjectionMatrix(projectionMatrix);
pointX = pointX / projectionMatrix._11;
pointY = pointY / projectionMatrix._22;


and how exactly the unprojecting part works. I mean, I have mouse coordinates that I rescale into the [-1, 1] range, but how do I get from a vec2 to a vec3? Where does the third component come from?
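My current guess at how the vec2 becomes a vec3 (this is just what I'm trying; the names are mine, not the tutorial's): x and y come from the mouse coordinates divided by the projection's _11 and _22 entries, which undoes the FOV/aspect-ratio scaling, and the third component is fixed by convention — the ray points down the view axis, so z is 1 (or -1 in a right-handed view space):

```cpp
struct Vec3 { float x, y, z; };

// Build a view-space pick-ray direction from mouse coords in [-1, 1].
// Dividing by proj._11 and proj._22 undoes the projection's x/y scaling
// (FOV and aspect ratio); the z component is supplied by convention.
Vec3 ViewSpaceRayDir(float ndcX, float ndcY,
                     float proj11, float proj22,
                     bool rightHanded) {
    Vec3 dir;
    dir.x = ndcX / proj11;
    dir.y = ndcY / proj22;
    dir.z = rightHanded ? -1.0f : 1.0f;  // view axis fills in the 3rd component
    return dir;
}
```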


Solved.

It looks like all my math was OK, but I forgot one thing: my rendering WinAPI control has an offset in x and y (because I have a sidebar and other stuff on the side), and I forgot to take that into account when reading the mouse position over the viewport. For instance, I got [0,0] at the origin of the window, not of the rendering control. Now everything works. Thanks for looking.
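For anyone hitting the same thing: the fix boils down to subtracting the control's offset before rescaling into [-1, 1]. A minimal sketch (names and layout are illustrative, not my actual code):

```cpp
struct NdcPoint { float x, y; };

// Convert a mouse position given relative to the window into normalized
// device coordinates of the rendering control inside that window.
NdcPoint MouseToNdc(int mouseX, int mouseY,          // mouse, window-relative
                    int ctrlX, int ctrlY,            // control origin in window
                    int ctrlWidth, int ctrlHeight) { // control size in pixels
    // Re-express the mouse relative to the control itself -- the step I
    // had forgotten.
    int localX = mouseX - ctrlX;
    int localY = mouseY - ctrlY;
    NdcPoint p;
    p.x =  (2.0f * localX) / ctrlWidth  - 1.0f;
    p.y = -((2.0f * localY) / ctrlHeight - 1.0f);  // screen y grows downward
    return p;
}
```

Under Win32 the control-relative position can also be obtained directly with ScreenToClient() on the control's HWND.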



