Zosimas

Members

  • Content count: 25
  • Joined
  • Last visited

Community Reputation

  122 Neutral

About Zosimas

  • Rank: Member
  1. DirectX bounding sphere weird results

    In Managed Direct3D there are four overloads of this function; the one I wrote above is one of them. When I changed it to what you said, like this: [CODE] objectRadius = Geometry.ComputeBoundingSphere(vertexData, m.NumberVertices, m.NumberBytesPerVertex, objectCenter) [/CODE] everything worked great. BUT, I found that only meshes exported with the KW X-port plugin from 3ds Max have this problem. The Panda Exporter plugin seems to work well with this, but unfortunately it doesn't export tangents, so I can't use it. Thanks for the help.
  2. I made a simple box in 3D Studio Max and exported it with the KW X-port plugin in .X format. Now I'm loading this mesh in my Direct3D project and calculating its bounding sphere like this:
    [CODE]
    m = Mesh.FromFile(My.Application.Info.DirectoryPath & "\Data\Models\Box.X", MeshFlags.Managed, DEV)
    Using vb As VertexBuffer = m.VertexBuffer
        Dim vertexData As GraphicsStream = vb.Lock(0, 0, LockFlags.None)
        objectRadius = Geometry.ComputeBoundingSphere(vertexData, m.NumberVertices, m.VertexFormat, objectCenter)
        vb.Unlock()
    End Using
    [/CODE]
    The problem is that I'm getting weird results for the center and radius:
    [CODE]
    objectCenter.X : -944.5798
    objectCenter.Y : 0
    objectCenter.Z : 944.5798
    objectRadius   : 0.007336923
    [/CODE]
    although the box's width, height and length are about 50, and the box's position is 0. What could be going wrong here?
  3. I'm using VB.NET and DirectX9 to animate a mesh. The following code is the Draw function:
    [CODE]
    p = FX.Begin(Direct3D.FX.None)
    For k = 0 To p - 1
        FX.BeginPass(k)
        For iattrib As Integer = 0 To mesh.NumberAttributes - 1
            Dim boneCombo As BoneCombination = mesh.BoneTable(iattrib)
            For i As Integer = 0 To mesh.NumberInfluences - 1
                Dim matrixIndex As Integer = boneCombo.BoneId(i)
                If matrixIndex <> -1 Then
                    Dim tempMatrix As Matrix = offsetMatrices(matrixIndex) * frameMatrices(matrixIndex).Combined
                    'DXDEV.Transform.SetWorldMatrixByIndex(i, tempMatrix) '''''''''''''''''''''
                End If
            Next
            DXDEV.RenderState.VertexBlend = DirectCast(boneCombo.BoneId.Length, VertexBlend) - 1
            mesh.MeshData.Mesh.DrawSubset(iattrib)
        Next
        FX.EndPass()
    Next k
    FX.End()
    [/CODE]
    As you can see, multiple world matrices are passed to the device at different indices. The thing is that I use an HLSL effect file to render the mesh, and I don't know how to pass and use multiple world matrices in my effect. I want to replace the commented line: [CODE] DXDEV.Transform.SetWorldMatrixByIndex(i, tempMatrix) [/CODE] with a line that passes those world matrices to my effect, plus some HLSL code to use them in my vertex shader. Any help will be appreciated.
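    A minimal sketch of what the effect side could look like, in the style of the SDK skinned mesh sample. Everything named here is an assumption, not code from the post: WorldMatrices and ViewProjection are hypothetical effect parameters (the application would fill WorldMatrices each frame with the offsetMatrices(i) * frameMatrices(i).Combined products, e.g. via Effect.SetValue), and the vertex declaration is assumed to carry blend weights and ubyte4 blend indices:
    [CODE]
    // Hypothetical parameter names: WorldMatrices, ViewProjection, MAX_MATRICES.
    static const int MAX_MATRICES = 26;
    float4x3 WorldMatrices[MAX_MATRICES]; // bone palette, uploaded once per frame
    float4x4 ViewProjection;

    struct VS_INPUT
    {
        float4 Position     : POSITION;
        float4 BlendWeights : BLENDWEIGHT;
        float4 BlendIndices : BLENDINDICES;
    };

    float4 SkinnedVS(VS_INPUT IN) : POSITION
    {
        // Blend the vertex position by each influencing bone's world matrix.
        int4 indices = D3DCOLORtoUBYTE4(IN.BlendIndices);
        float weights[4] = (float[4])IN.BlendWeights;
        float3 worldPos = 0;
        for (int i = 0; i < 4; i++)
            worldPos += mul(IN.Position, WorldMatrices[indices[i]]) * weights[i];
        return mul(float4(worldPos, 1.0), ViewProjection);
    }
    [/CODE]
    A real version would respect NumberInfluences (deriving the last weight as one minus the sum of the others, as the SDK sample does) rather than always blending four bones.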
  4. Unity 3D camera vectors

    Quote: "As for the 'eye' point (the camera position), you can get that by 'un-inverting' that part of the matrix (I can post some code for this if you need it)."
    I don't understand what you mean, but fortunately the eye position is something I don't have to extract, because I have the original. If you mean something else, please post the code.
    Quote: "That said, I am really curious as to why you need to do this. Why do you need to extract the eye, at, and up vectors from a view matrix that's already been constructed? What do you need them for?"
    I'm using this camera class in my project to move through space, and I also create a water effect on a plane. So I have to construct the reflectionViewMatrix every frame to render the scene to a texture as a reflection. To construct this reflectionViewMatrix I need the original Eye, LookAt, Up and Right vectors. I tried to use the vectors extracted from the viewMatrix, but it doesn't work right. Conversely, it works fine when I have the original vectors.
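    For context, one common way to build that reflection view for a horizontal water plane, sketched with the vector names from this thread (waterHeight is a hypothetical parameter, and this is not necessarily how the camera class in question does it): reflect the Eye and LookAt points across the plane, flip the vertical component of the Up direction, and feed the results back into the look-at construction.
    [CODE]
    // Hedged sketch: mirror the camera across the water plane y = waterHeight.
    float3 rEye = float3(Eye.x, 2.0 * waterHeight - Eye.y, Eye.z); // reflect a point
    float3 rAt  = float3(At.x,  2.0 * waterHeight - At.y,  At.z);
    float3 rUp  = float3(Up.x, -Up.y, Up.z);                       // reflect a direction
    // reflectionViewMatrix = LookAtLH(rEye, rAt, rUp)
    [/CODE]
    Because the mirrored basis flips triangle winding, the cull mode is usually inverted while rendering the reflection pass.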
  5. Unity 3D camera vectors

    Quote: "Quote: 'Of course these vectors are normalized and that's the problem!' Why is that a problem? Direction vectors are *usually* normalized."
    The problem is that I don't want a direction vector but a position vector. Supposing that the LookAt vector the viewMatrix returns came from this: zaxis = normal(At - Eye), do you think this is going to solve my problem: LookAtPosition = Eye + zaxis?
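    As far as the math goes, yes; a quick check of the idea as a sketch, using the names from the post. Since zaxis keeps only the direction of At - Eye, Eye + zaxis is not the original At, but it lies on the same view ray, so a look-at construction rebuilds the identical matrix from it:
    [CODE]
    float3 zaxis = normalize(At - Eye);  // what the view matrix actually stores
    float3 LookAtPosition = Eye + zaxis; // a valid target, one unit in front of the camera
    // normalize(LookAtPosition - Eye) == zaxis, so the rebuilt view matrix is unchanged
    [/CODE]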
  6. (HLSL) Spot light calculations

    I think I know all the basic maths about vectors and matrices. You can make a start and we'll see how it goes, if you want. :) Anyway, this is not only for me; it can help others who may have this problem, too.
  7. (HLSL) Spot light calculations

    Sorry, this is my fault. I didn't mean that. [embarrass] I can calculate this direction and any other direction, distance or whatever else I may need, but I can't find a way to combine them to reach a conclusion. The complex mathematics are my problem! Is this cone between my EyePosition and my pixel? That's the question!
  8. (HLSL) Spot light calculations

    This is the pixel shader code I use to calculate whether a pixel is in the cone:
    [CODE]
    float3 Ln = normalize(IN.LightVec);
    float ConeDot;
    if (LightDirTarg.w == 0) { /*if w==0 : Direction  //if w==1 : Target*/
        ConeDot = dot(Ln, normalize(LightDirTarg.xyz));
    } else {
        ConeDot = dot(Ln, normalize(LightDirTarg.xyz - LightPosition.xyz));
    }
    if (ConeDot <= ((LightConeAngle / 360) * 2) - 1) {
        /*In light*/
    } else {
        /*In shadow*/
    }
    [/CODE]
    As you said, I only have the LightPosition, the LightDirection and the cone angle. And of course my pixel position and normal, and the eye position. The problem is that a new factor now enters, the eye direction, and it confuses me. I really can't find a way to calculate it! [totally]
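    For comparison, the conventional spot-cone test checks the dot product against the cosine of the half-angle rather than a linear remap of degrees. A minimal sketch using the names from the post, under two stated assumptions: IN.LightVec points from the surface toward the light, and LightConeAngle is the full opening angle in degrees:
    [CODE]
    // Hedged sketch: standard spot-cone membership test.
    float3 Ln = normalize(IN.LightVec);           // assumed: surface -> light
    float3 spotDir = normalize(LightDirTarg.xyz); // light -> scene (the w == 0 case)
    float cosHalfAngle = cos(radians(LightConeAngle * 0.5));
    if (dot(-Ln, spotDir) >= cosHalfAngle) { /*In light*/ } else { /*In shadow*/ }
    [/CODE]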
  9. (HLSL) Spot light calculations

    Yes, the ray is obviously the one that you explained, but how am I going to calculate this cone/triangle or whatever in my pixel shader?
  10. As you can see in the image below, I made a spot light shader that looks pretty good. Now I have an interesting question. In real life, when the atmosphere is foggy, the space inside the red lines in the image becomes brighter. I want to find a way to calculate (in my pixel shader) whether a pixel is in this area, to make a more realistic spot light. Of course I have all the necessary vectors that can help: WorldView, WorldNormal, LightPosition, LightDirection, EyePosition. Any suggestions?
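    One way to attack this, as a hedged sketch rather than a full scattering model: treat the beam as an infinite cone with its apex at LightPosition and axis LightDirection, and test in the pixel shader whether the eye ray toward the pixel crosses that cone. The helper below is illustrative only; EyeRayHitsCone, worldPos and halfAngleDeg are hypothetical names, and a real effect would also limit the hit to points nearer than the surface and attenuate by distance along the beam:
    [CODE]
    // Hedged sketch: does the eye ray toward this pixel pass through the light cone?
    // Apex at lightPos, axis lightDir, halfAngleDeg = half the cone's opening angle.
    bool EyeRayHitsCone(float3 eyePos, float3 worldPos,
                        float3 lightPos, float3 lightDir, float halfAngleDeg)
    {
        float3 V  = normalize(worldPos - eyePos); // view ray direction
        float3 d  = normalize(lightDir);          // cone axis
        float3 CO = eyePos - lightPos;            // apex -> ray origin
        float cos2 = cos(radians(halfAngleDeg));
        cos2 *= cos2;

        // Points P on the cone satisfy dot(P - apex, d)^2 == cos2 * |P - apex|^2;
        // substituting P = eyePos + t * V gives a quadratic in t.
        float a = dot(V, d) * dot(V, d) - cos2;
        float b = 2.0 * (dot(V, d) * dot(CO, d) - cos2 * dot(V, CO));
        float c = dot(CO, d) * dot(CO, d) - cos2 * dot(CO, CO);

        float disc = b * b - 4.0 * a * c;
        if (disc < 0.0) return false;             // ray misses the infinite double cone

        float t = (-b - sqrt(disc)) / (2.0 * a);  // nearer hit (assumes a != 0)
        if (t < 0.0) t = (-b + sqrt(disc)) / (2.0 * a);
        if (t < 0.0) return false;                // both hits behind the eye

        // Keep only the forward half of the double cone (the lit side of the apex).
        float3 hit = eyePos + t * V;
        return dot(hit - lightPos, d) > 0.0;
    }
    [/CODE]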
  11. I'm not sure what you are asking, but if I understood right, you don't have to use SetTexture in your program at all if you use an effect. You just declare your samplers in the shader and then combine the textures in your pixel shader. Simple example:
    [CODE]
    sampler2D texMap;
    sampler2D g_buffer_norm;
    sampler2D g_buffer_pos;
    sampler2D g_random;

    float3 tex1 = tex2D(texMap, i.uv).xyz;
    float3 tex2 = tex2D(g_buffer_norm, i.uv).xyz;
    float3 tex3 = tex2D(g_buffer_pos, i.uv).xyz;
    float3 tex4 = tex2D(g_random, i.uv).xyz;

    float3 result = lerp(tex1, tex2, 0.5);
    result = lerp(result, tex3, 0.5);
    result = lerp(result, tex4, 0.5);
    o.color = result;
    [/CODE]
  12. Unity 3D camera vectors

    No, I don't have these vectors. If you look at my first comment, it refers to a camera system at toymaker.info that uses only the position of the eye and calculates the Right, Look and Up vectors from the rotation. Of course these vectors are normalized, and that's the problem! I use this camera class in a project, and I want to get these three vectors for another reason, but I can't!
  13. Unity 3D camera vectors

    The formula for D3DXMatrixLookAtLH is this:
    [CODE]
    zaxis = normal(At - Eye)
    xaxis = normal(cross(Up, zaxis))
    yaxis = cross(zaxis, xaxis)

     xaxis.x           yaxis.x           zaxis.x          0
     xaxis.y           yaxis.y           zaxis.y          0
     xaxis.z           yaxis.z           zaxis.z          0
    -dot(xaxis, eye)  -dot(yaxis, eye)  -dot(zaxis, eye)  1
    [/CODE]
    Now, this code returns a ViewMatrix. I want to do the reverse job: how can I extract from this ViewMatrix the original Eye, At and Up vectors that I used to create it?
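    For reference, a sketch of the reverse step in HLSL-style notation, following the row-major layout quoted above (DecomposeView is a hypothetical helper). The basis vectors sit in the columns of the upper 3x3 block, and because that basis is orthonormal, the translation row can be projected back onto it to recover the eye point. The original At and Up cannot be recovered exactly; only the view ray and the orthonormalized up survive.
    [CODE]
    // Hedged sketch: recover camera vectors from a view matrix V built as above.
    void DecomposeView(float4x4 V, out float3 Eye, out float3 At, out float3 Up)
    {
        float3 xaxis = float3(V._11, V._21, V._31); // Right
        float3 yaxis = float3(V._12, V._22, V._32); // orthonormalized Up
        float3 zaxis = float3(V._13, V._23, V._33); // normal(At - Eye): look direction
        // Row 4 stores (-dot(xaxis,eye), -dot(yaxis,eye), -dot(zaxis,eye)), so:
        Eye = -(V._41 * xaxis + V._42 * yaxis + V._43 * zaxis);
        At  = Eye + zaxis; // any point Eye + t * zaxis (t > 0) rebuilds the same matrix
        Up  = yaxis;
    }
    [/CODE]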
  14. Ohhh, thank you very much!!! I guess I need sleep!!!!!
  15. Why, when I use this in my HLSL shader, does everything work fine:
    [CODE]
    struct vertexOutput
    {
        float4 HPosition     : POSITION;
        float2 UV            : TEXCOORD0;
        float3 LightVec      : TEXCOORD1;
        float3 WorldNormal   : TEXCOORD2;
        float3 WorldTangent  : TEXCOORD3;
        float3 WorldBinormal : TEXCOORD4;
        float3 WorldView     : TEXCOORD5;
    };

    /*Pixel Shader*/
    float4 depthMap_PS(vertexOutput IN) : COLOR
    {
        float3 Ln = normalize(IN.LightVec);
        float c = length(Ln);
        return float4(c, 0, 0, 1);
    }
    [/CODE]
    but when I use this I get an "E_FAIL" error:
    [CODE]
    struct vertexOutput
    {
        float4 HPosition     : POSITION;
        float2 UV            : TEXCOORD0;
        float3 LightVec      : TEXCOORD1;
        float3 WorldNormal   : TEXCOORD2;
        float3 WorldTangent  : TEXCOORD3;
        float3 WorldBinormal : TEXCOORD4;
        float3 WorldView     : TEXCOORD5;
    };

    /*Pixel Shader*/
    float4 depthMap_PS(vertexOutput IN) : COLOR
    {
        float c = length(IN.lightVec);
        return float4(c, 0, 0, 1);
    }
    [/CODE]
    Why can't I get IN.LightVec in the pixel shader?