BlackJoker

Members
  • Content count: 384
  • Joined
  • Last visited

Community Reputation: 1328 Excellent

About BlackJoker
  • Rank: Member
  1. Can no one out of the 200+ viewers help me with this issue?
  2. Hi, I am trying to implement the correct user experience for the rotation tool in my game engine. I want it to behave visually like the same tool in Maya or Unity: when I rotate an object, the rotation tool should rotate as well, BUT its axes should always stay on the near camera plane and never cross to the far side of the tool, as shown in the first two attached screenshots from Unity. You can see there that the X axis (red) moves to the upper part of the tool instead of going to the back, and the same holds for the Y and Z axes. I currently have something similar, but my code has a huge limitation: it gives me a correct quaternion for the rotation, but to keep the axes aligned I have to overwrite the tool's existing rotation, so I cannot accumulate rotation and cannot implement the correct visual experience for the tool (see the next two screenshots). As you can see, the visual representation of the tool does not change even though I rotate the object itself. Here is the code I am using currently:

        /// <summary>
        /// Calculate a quaternion which rotates one point to face another.
        /// </summary>
        /// <param name="objectPosition">Your object's position.</param>
        /// <param name="targetPosition">The position of the object to face.</param>
        /// <param name="upVector">The nominal "up" vector (typically Vector3.Y).</param>
        /// <remarks>Note: this does not work when objectPosition is straight below or straight above targetPosition.</remarks>
        /// <returns>The face-to-face orientation quaternion.</returns>
        public static QuaternionF RotateToFace(ref Vector3F objectPosition, ref Vector3F targetPosition, ref Vector3F upVector)
        {
            Vector3F D = objectPosition - targetPosition;
            Vector3F right = Vector3F.Normalize(Vector3F.Cross(upVector, D));
            Vector3F backward = Vector3F.Normalize(Vector3F.Cross(right, upVector));
            Vector3F up = Vector3F.Cross(backward, right);

            Matrix4x4F rotationMatrix = new Matrix4x4F(
                right.X, right.Y, right.Z, 0,
                up.X, up.Y, up.Z, 0,
                backward.X, backward.Y, backward.Z, 0,
                0, 0, 0, 1);

            QuaternionF orientation;
            QuaternionF.RotationMatrix(ref rotationMatrix, out orientation);
            return orientation;
        }

    And I am using the following hack to rotate all axes correctly and keep them at 90 degrees to each other:

        private void TransformRotationTool(Entity current, Camera camera)
        {
            var m = current.Transform.GetRotationMatrix();

            if (current.Name == "RightAxisManipulator")
            {
                var rot = QuaternionF.RotateToFace(current.GetRelativePosition(camera), Vector3F.Zero, m.Right);
                rot.Z = rot.Y = 0;
                rot.Normalize();
                current.Transform.SetRotation(rot);
            }
            if (current.Name == "UpAxisManipulator")
            {
                var rot = QuaternionF.RotateToFace(current.GetRelativePosition(camera), Vector3F.Zero, m.Up);
                rot.X = rot.Z = 0;
                rot.Normalize();
                current.Transform.SetRotation(rot);
            }
            if (current.Name == "ForwardAxisManipulator")
            {
                var rot = QuaternionF.RotateToFace(current.GetRelativePosition(camera), Vector3F.Zero, m.Forward);
                rot.X = rot.Y = 0;
                rot.Normalize();
                current.Transform.SetRotation(rot);
            }
            if (current.Name == "CurrentViewManipulator" || current.Name == "CurrentViewCircle")
            {
                var billboardMatrix = Matrix4x4F.BillboardLH(
                    current.GetRelativePosition(camera), Vector3F.Zero, camera.Up, camera.Forward);
                var rot = MathHelper.GetRotationFromMatrix(billboardMatrix);
                current.Transform.SetRotation(rot);
            }
        }

    As you can see, I zero out two of the three quaternion axis components and renormalize the quaternion to keep the axes perpendicular to each other. When I try to accumulate rotation on top of this, I get a completely incorrect result. The last image shows what happens to my tool when I try to combine the face-the-camera rotation with some base rotation. This issue is driving me crazy. Could anyone help me implement the correct behaviour? (A sketch of the direction I am currently exploring follows below this post.)
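    One direction I am exploring (a rough, untested sketch; Vector3F.Dot, Vector3F.LengthSquared, vector-by-scalar multiplication and QuaternionF.Identity are assumptions about my math library, the rest matches the code above) is to rebuild each ring's orientation every frame from two inputs: its rotation axis taken from the object's accumulated rotation (m.Right / m.Up / m.Forward) and the tool-to-camera direction projected into the plane perpendicular to that axis:

        // Rough sketch, untested: orient one ring manipulator so it spins around
        // axisWorld (taken from the accumulated rotation) while its visible half
        // always faces the camera. axisWorld is assumed to be normalized;
        // Dot, LengthSquared and QuaternionF.Identity are assumed helpers.
        private static QuaternionF AlignRingToCamera(Vector3F axisWorld, Vector3F toolToCamera)
        {
            // Project the tool->camera direction onto the plane perpendicular to the ring axis.
            Vector3F inPlane = toolToCamera - axisWorld * Vector3F.Dot(toolToCamera, axisWorld);
            if (inPlane.LengthSquared() < 1e-6f)
            {
                // The camera looks straight down the ring axis; return identity as a
                // fallback (the caller can keep the previous orientation instead).
                return QuaternionF.Identity;
            }
            inPlane = Vector3F.Normalize(inPlane);

            // Orthonormal basis: side vector, ring axis, camera-facing in-plane direction.
            // Which local axis of the ring mesh maps to which row is an assumption and
            // may need to be permuted to match the actual gizmo geometry.
            Vector3F side = Vector3F.Normalize(Vector3F.Cross(axisWorld, inPlane));
            Matrix4x4F basis = new Matrix4x4F(
                side.X, side.Y, side.Z, 0,
                axisWorld.X, axisWorld.Y, axisWorld.Z, 0,
                inPlane.X, inPlane.Y, inPlane.Z, 0,
                0, 0, 0, 1);

            QuaternionF orientation;
            QuaternionF.RotationMatrix(ref basis, out orientation);
            return orientation;
        }

    The idea is that the accumulated rotation only supplies the ring axis, while the in-plane part of the basis always comes from the camera, so the visible half of each ring should stay on the near side regardless of how much rotation has been accumulated.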
  3. Ok, it seems the ortho projection was enough after all; I had just passed incorrect znear/zfar values. I thought the maximum zfar could be 1, but obviously I was mistaken.
  4. Hi, I am trying to render my 3D object at exact screen-space coordinates with correct shading. I tried an OrthoOffCenterLH matrix for this, but it produces strange results: the front part of the object disappears and reappears while it rotates. I think this is due to the nature of the ortho projection (but maybe I am wrong). I want the object to be displayed correctly, as when I render it with a perspective projection. To achieve this I tried to use a PerspectiveOffCenterLH matrix, but when I apply it, I see nothing at all. Here are the parameters I pass to the method:

        PerspectiveOffScreenProjection = Matrix4x4F.PerspectiveOffCenterLH(0, Width, Height, 0, 100, 1);

    ZFar and ZNear are flipped because I am using a reversed depth buffer. And here is how I build the matrix itself:

        public static void PerspectiveOffCenterLH(float left, float right, float bottom, float top, float znear, float zfar, out Matrix4x4F result)
        {
            float zRange = zfar / (zfar - znear);

            result = new Matrix4x4F();
            result.M11 = 2.0f * znear / (right - left);
            result.M22 = 2.0f * znear / (top - bottom);
            result.M31 = (left + right) / (left - right);
            result.M32 = (top + bottom) / (bottom - top);
            result.M33 = zRange;
            result.M34 = 1.0f;
            result.M43 = -znear * zRange;
        }

    I also render without a View matrix, to fix the object's position and not let it move anywhere. If I put ZNear and ZFar back in their usual order, the object renders distorted like in the first screenshot, but it should render like in the second screenshot. Can anyone help me fix this issue? (A small numeric check of the depth terms of this formula follows below.)
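    To sanity-check what the znear/zfar swap changes in this formula, here is a tiny standalone snippet that just evaluates the depth terms (M33 and M43) for both parameter orders; note that M11 and M22 also scale with znear, so the swap affects the XY scale as well:

        using System;

        // Evaluate the depth terms of the PerspectiveOffCenterLH formula above for
        // both parameter orders, to see exactly what the reversed-depth swap does.
        class ZRangeCheck
        {
            static void PrintDepthTerms(float znear, float zfar)
            {
                float zRange = zfar / (zfar - znear); // goes into M33
                float m43 = -znear * zRange;          // goes into M43
                Console.WriteLine($"znear={znear}, zfar={zfar}: M33={zRange}, M43={m43}");
            }

            static void Main()
            {
                PrintDepthTerms(1f, 100f);   // usual order
                PrintDepthTerms(100f, 1f);   // swapped order, as in the call above
            }
        }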
  5. Hi, I am trying to implement a rotation tool in my own engine with the same user experience as the rotation tool in Unity or Maya. When the user rotates an object with the tool, the tool rotates too, but its axes never visually move to the back side of the tool: they always stay on the camera-facing side, within +/-90 degrees on the X/Y/Z axes. I used this code to make each of the axes face the camera:

        // objectPosition is your object's position
        // objectToFacePosition is the position of the object to face
        // upVector is the nominal "up" vector (typically Vector3.Y)
        // Note: this does not work when objectPosition is straight below or straight above objectToFacePosition
        QuaternionF RotateToFace(Vector3F objectPosition, Vector3F objectToFacePosition, Vector3F upVector)
        {
            Vector3F D = objectPosition - objectToFacePosition;
            Vector3F right = Vector3F.Normalize(Vector3F.Cross(upVector, D));
            Vector3F backward = Vector3F.Normalize(Vector3F.Cross(right, upVector));
            Vector3F up = Vector3F.Cross(backward, right);

            Matrix4x4F rotationMatrix = new Matrix4x4F(
                right.X, right.Y, right.Z, 0,
                up.X, up.Y, up.Z, 0,
                backward.X, backward.Y, backward.Z, 0,
                0, 0, 0, 1);

            QuaternionF orientation;
            QuaternionF.RotationMatrix(ref rotationMatrix, out orientation);
            return orientation;
        }

    Then I just apply the resulting rotateToFaceQuaternion to each axis. If I apply only rotateToFaceQuaternion, the axes always face the camera (which is correct), but when I try to rotate the whole tool by some quaternion, it no longer rotates correctly: current_rotation * rotateToFaceQuaternion gives an incorrect result, because after applying this rotation the axes of the tool end up on the far side of the tool (which is not correct). My question is: how do I rotate the axes of the tool so that they always face the camera, but can still be rotated +/-90 degrees, and when an axis reaches the edge of the tool on one side it immediately appears on the other side (like in Unity or Maya), i.e. an axis is never allowed to appear on the far side of the tool? (A sketch of the simplest idea I have so far follows below this post.)
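    For the "appears on the other side immediately" part, the simplest idea I have so far (again just an untested sketch; Vector3F.Dot and unary negation on Vector3F are assumptions about my math library) is to take each world-space gizmo axis after the accumulated rotation is applied and mirror it whenever it points away from the camera:

        // Sketch: after applying the accumulated tool rotation, flip any world-space
        // gizmo axis that points away from the camera so the handle jumps to the
        // near side instead of sliding behind the tool. Untested.
        private static Vector3F FlipAxisTowardCamera(Vector3F axisWorld, Vector3F toolToCamera)
        {
            // Mirror the axis through the tool's center when it faces away from the camera.
            return Vector3F.Dot(axisWorld, toolToCamera) < 0 ? -axisWorld : axisWorld;
        }

    The flipped axes could then be used to rebuild each manipulator's basis instead of zeroing quaternion components.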
  6. Dual contouring implementation on GPU

    Thanks for the link!
  7. Dual contouring implementation on GPU

    @Anfaenger It would be cool if QEF_SOLVER_SVD2 had at least a few comments explaining what is going on in the code. I am starting to think there is no well-commented QEF code anywhere on the Internet.
  8. Dual contouring implementation on GPU

    Thanks for that. After googling a little I found this article from GPU Gems: http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter37.html I will study it in more detail.
  9. Hi. I've recently started studying the dual contouring / manifold dual contouring algorithms, and I see that they use an octree. But because HLSL/GLSL has no pointers, it is impossible to implement an octree in shaders the same way it exists on the CPU. So I wanted to ask: has anyone already run into this issue and can say how to implement such algorithms completely on the GPU side? Maybe there is a way to replace the octree with a data structure more suitable for shaders, or something like that? (A sketch of the kind of pointer-free layout I have in mind follows below.)
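    To make the question a bit more concrete, this is the kind of pointer-free layout I imagine (my own sketch, not taken from any paper): nodes stored in a flat array with children referenced by integer indices, so the same layout can be uploaded as a structured buffer and walked in a shader with indices instead of pointers.

        using System.Collections.Generic;

        // Sketch of a pointer-free ("linear") octree node. The CPU builds a flat
        // List<GpuOctreeNode> and uploads it to the GPU; a shader walks it using
        // integer indices. The exact field layout is my own guess, not from a paper.
        public struct GpuOctreeNode
        {
            public int FirstChild; // index of the first of 8 consecutive children, or -1 for a leaf
            public int DataIndex;  // index into a separate vertex/QEF buffer, or -1
            public int Depth;      // subdivision level of this node
            public int Padding;    // keeps the struct 16 bytes for GPU-friendly alignment
        }

        public static class LinearOctreeBuilder
        {
            // Append a node and return its index, so parents store child indices
            // instead of object references.
            public static int AddNode(List<GpuOctreeNode> nodes, int depth)
            {
                nodes.Add(new GpuOctreeNode { FirstChild = -1, DataIndex = -1, Depth = depth, Padding = 0 });
                return nodes.Count - 1;
            }

            // Subdivide a node: allocate 8 consecutive children and link them by index.
            public static void Subdivide(List<GpuOctreeNode> nodes, int parentIndex)
            {
                var parent = nodes[parentIndex];
                parent.FirstChild = nodes.Count;
                for (int i = 0; i < 8; i++)
                {
                    AddNode(nodes, parent.Depth + 1);
                }
                nodes[parentIndex] = parent;
            }
        }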
  10. Swapchain presentation freezing issue

    Hah, that's really interesting. I was already starting to think this was a bug :)
  11. Hi. I found that when I create a swapchain with swapEffect = Discard, then dispose it and create a new swapchain with swapEffect = FlipSequential, then dispose that one and create a swapchain with swapEffect = Discard again, it stops presenting anything on the screen and there is no error: the image just stops updating. But if I then create a new swapchain with swapEffect = FlipSequential, presenting starts working again; from that moment on, however, nothing is shown whenever a swapchain with the Discard effect is created during the current lifecycle. This seems to happen only for the Discard -> FlipSequential -> Discard sequence. Could someone say whether this is by design? Has anyone faced a similar issue in DirectX 11? P.S. I forgot to mention that I did this with a desktop window on Windows 8.1/10. (A sketch of the recreation sequence I am experimenting with follows below.)
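    For reference, here is a simplified sketch of the kind of teardown/recreation sequence I am experimenting with (SharpDX-style types are assumed here, my real engine has its own wrappers; the ClearState/Flush calls are just my attempt to fully detach the old swapchain, I do not know whether they are actually required):

        using SharpDX.Direct3D11;
        using SharpDX.DXGI;
        using Device = SharpDX.Direct3D11.Device;

        // Sketch (SharpDX-style types assumed): fully release the old swapchain
        // before creating a new one with a different SwapEffect.
        public static class SwapChainRecreation
        {
            public static SwapChain Recreate(Device device, Factory1 factory, SwapChain oldSwapChain,
                System.IntPtr windowHandle, int width, int height, SwapEffect newEffect)
            {
                // Drop every outstanding reference to the old back buffer first.
                device.ImmediateContext.ClearState();
                device.ImmediateContext.Flush();
                oldSwapChain.Dispose();

                var description = new SwapChainDescription
                {
                    BufferCount = newEffect == SwapEffect.Discard ? 1 : 2, // flip model needs at least 2 buffers
                    ModeDescription = new ModeDescription(width, height, new Rational(60, 1), Format.R8G8B8A8_UNorm),
                    IsWindowed = true,
                    OutputHandle = windowHandle,
                    SampleDescription = new SampleDescription(1, 0),
                    Usage = Usage.RenderTargetOutput,
                    SwapEffect = newEffect
                };
                return new SwapChain(factory, device, description);
            }
        }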
  12. Ok, thanks. I will try a Texture2DArray first. Hope it works.
  13. I just thought that I could create a Texture3D, fill each slice with a different 2D texture, and sample them by depth, correct? This could be a solution to my problem. The only issue I see is that it would consume more memory. Btw, I suppose it is not possible to use differently sized slices in a Texture3D? They all have to be the same size, correct?
  14. Can you please give some examples of how to implement this?
  15. Ok, I see that I cannot do that in DX11, because I also need to dynamically update the textures in the array from frame to frame, and with a Texture2DArray that would be somewhat problematic, to put it mildly. (A sketch of the per-slice update I had in mind follows below.)
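    For completeness, this is the kind of per-slice update I had in mind (a sketch with SharpDX-style types assumed; all slices of a Texture2DArray must share the same size and format, and a single slice is addressed through its subresource index):

        using SharpDX.Direct3D11;
        using SharpDX.DXGI;
        using Device = SharpDX.Direct3D11.Device;
        using Resource = SharpDX.Direct3D11.Resource;

        // Sketch (SharpDX-style types assumed): a Texture2DArray whose individual
        // slices are overwritten from frame to frame with UpdateSubresource.
        public static class TextureArrayUpdate
        {
            public static Texture2D CreateArray(Device device, int width, int height, int sliceCount)
            {
                return new Texture2D(device, new Texture2DDescription
                {
                    Width = width,
                    Height = height,
                    MipLevels = 1,
                    ArraySize = sliceCount,
                    Format = Format.R8G8B8A8_UNorm,
                    SampleDescription = new SampleDescription(1, 0),
                    Usage = ResourceUsage.Default,
                    BindFlags = BindFlags.ShaderResource,
                    CpuAccessFlags = CpuAccessFlags.None,
                    OptionFlags = ResourceOptionFlags.None
                });
            }

            public static void UpdateSlice(Device device, Texture2D textureArray, int slice, int width, int height, byte[] rgbaPixels)
            {
                // With MipLevels = 1 the subresource index of a slice is simply the slice index.
                int subresource = Resource.CalculateSubResourceIndex(0, slice, 1);
                int rowPitch = width * 4; // 4 bytes per pixel for R8G8B8A8
                device.ImmediateContext.UpdateSubresource(rgbaPixels, textureArray, subresource, rowPitch, rowPitch * height);
            }
        }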