About lucky6969b

Personal Information

  • Role
    3D Animator

  1. If I have an array declared in the effect file as uniform extern float4 lightDirs[4]; how can I set lightDirs from the program? Is it m_effect->SetMatrixArray(lightDirs, &lightDirs[0], 4); or m_effect->SetMatrix(lightDirs[i], &lightDirs[i]); per element, where lightDirs on the application side is an array of D3DXHANDLE? Thanks, Jack
  2. Hello, are there any generalized Bézier or B-spline transformations in matrix form? I want to pass the matrix into the shader directly and have it modify the model position on the fly while keeping the rest intact. Thanks, Jack
  3. lucky6969b

    Using a B-Spline Transformation.

    The major reason I couldn't render the results of the B-spline computation was that the number of vertices coming out of the function is higher than the original input. For example, if I pass in the standard cube, which has 8 vertices, I get 48 vertices out. Rendering the original 8 vertices is no problem because the indices are predefined, but when the function returns I no longer know anything about the index buffer. Thanks, Jack
  4. Does this data look good?

     Input: D3DXVECTOR3(0,0,0); D3DXVECTOR3(0,1,0); D3DXVECTOR3(1,1,0); D3DXVECTOR3(1,0,1);

     Results:
     Vertex  0: 0 0 0   Vertex  1: 0 1 0   Vertex  2: 0 1 0   Vertex  3: 0 0 0
     Vertex  4: 0 0 0   Vertex  5: 0 1 0   Vertex  6: 0 1 0   Vertex  7: 1 1 0
     Vertex  8: 1 1 0   Vertex  9: 0 1 0   Vertex 10: 0 1 0   Vertex 11: 1 1 0
     Vertex 12: 1 1 0   Vertex 13: 1 0 1   Vertex 14: 1 0 1   Vertex 15: 1 1 0
     Vertex 16: 1 1 0   Vertex 17: 1 0 1   Vertex 18: 1 0 1   Vertex 19: 0 0 0
     Vertex 20: 0 0 0   Vertex 21: 1 0 1   Vertex 22: 1 0 1   Vertex 23: 0 0 0
  5. Are there any free shader resources publicly available on the net? There used to be a lot, but they all seem to be gone. Thanks, Jack
  6. lucky6969b

    Rendering and compositing G-buffers.

    Dear all, I am wondering: when I output a stencil/depth value through a second color target, I end up with an additional sampler. Assuming I write color2 out to a texture of some kind without leaving the shader, how does the sampler get knowledge of that stencil texture?

    ```hlsl
    sampler depthSampler;

    //----------------------------------------------------------------------
    // Render particle information into the particle buffer
    //----------------------------------------------------------------------
    struct PBUFFER_OUTPUT
    {
        float4 color0 : COLOR0;
        float4 color1 : COLOR1;
        float4 color2 : COLOR2; // depth buffer
    };
    ```

    Thanks, Jack
  7. When I render directly to the primary surface with the z-buffer on, the particles do go behind the buildings. However, when I turn the z-buffer on while rendering into the G-buffer (render targets) and then composite the surfaces with the z-buffer off, the depth behaviour is gone and the particles end up in front of the buildings. How come? Thanks, Jack
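The hardware z-test only applies while geometry is being rasterised; once the G-buffers are composited as full-screen quads with Z off, the depth stored during the G-buffer pass is never consulted. A common workaround is to redo the test manually in the composite shader from the saved depth targets. A hedged HLSL sketch, with hypothetical sampler names that would be bound to the render-target textures from the application:

```hlsl
// Sketch only: sampler names are hypothetical; bind each G-buffer
// render-target texture to them from the application before this pass.
sampler sceneColorSampler;
sampler particleColorSampler;
sampler sceneDepthSampler;     // depth written during the building pass
sampler particleDepthSampler;  // depth written during the particle pass

float4 CompositePS(float2 uv : TEXCOORD0) : COLOR0
{
    float4 sceneColor    = tex2D(sceneColorSampler, uv);
    float4 particleColor = tex2D(particleColorSampler, uv);
    float  sceneDepth    = tex2D(sceneDepthSampler, uv).r;
    float  particleDepth = tex2D(particleDepthSampler, uv).r;

    // Redo the z-test manually: keep the particle only where it is
    // closer to the camera than the scene geometry.
    float visible = (particleDepth <= sceneDepth) ? 1.0 : 0.0;
    return lerp(sceneColor, particleColor, particleColor.a * visible);
}
```

This assumes both passes wrote comparable depth values (same projection and encoding) into their respective targets.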
  8. The following states are only available in SM 4.0 or up. Are there any equivalent states for SM 3.0?

     ```hlsl
     //AlphaToCoverageEnable = FALSE;
     //BlendEnable[0] = TRUE;
     //BlendEnable[1] = TRUE;
     //RenderTargetWriteMask[0] = 0x0F;
     //RenderTargetWriteMask[1] = 0x0F;
     ```

     Thanks, Jack
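For the D3D9 effect framework (SM 3.0 targets), the closest equivalents appear to be the fixed-function states below; the per-target write masks map onto the D3D9 ColorWriteEnable family, while alpha-to-coverage has no standard D3D9 state (only vendor-specific hacks). A hedged sketch; verify the exact state names against your runtime:

```hlsl
// Rough D3D9 effect-state equivalents (sketch; verify before relying on it):
AlphaBlendEnable  = TRUE;   // ~ BlendEnable[0]
ColorWriteEnable  = 0x0F;   // ~ RenderTargetWriteMask[0] (RGBA)
ColorWriteEnable1 = 0x0F;   // ~ RenderTargetWriteMask[1], D3D9-era state
// AlphaToCoverageEnable has no standard D3D9 equivalent.
```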
  9. For Direct3D 9 — yes, sorry, I am about to upgrade, please bear with me — is it possible to render into 2 targets and 1 stencil view simultaneously?

     ```cpp
     pd3dDevice->OMGetRenderTargets(1, &pOldRTV, &pOldDSV);
     ```

     For DX9, do I have to do it like this? I may take some performance hit, but that's OK; I am going to upgrade anyway.

     ```cpp
     pd3dDevice->SetRenderTarget(1, pParticleView);
     // Render the particles
     RenderParticles(pd3dDevice, pEffect, pVB, pParticleTex, numParts, renderTechnique);

     pd3dDevice->SetRenderTarget(2, pParticleColorView);
     RenderParticles(pd3dDevice, pEffect, pVB, pParticleTex, numParts, renderTechnique);

     CComPtr<IDirect3DSurface9> pStencilBuffer;
     pd3dDevice->GetDepthStencilSurface(&pStencilBuffer);
     pd3dDevice->SetRenderTarget(3, pStencilBuffer);
     RenderParticles(pd3dDevice, pEffect, pVB, pParticleTex, numParts, renderTechnique);
     ```
  10. I want to load a texture in the Lab color space for mean-shift clustering operations. I am looking into a D3DFMT... parameter, but the image must stay in its original RGBA format. Any ideas? Thanks, Jack
  11. If I have a bunch of "points" in my world, how is it possible to construct a convex volume out of them? I've checked the Recast source code; there, the original source of the convex volume is a soup of triangles or polygons. If I use the rubber-band algorithm I end up with a contour, but there are way too many "vertices" around it. Is the only thing I can do to check, while tracing along the contour, whether the direction of the lines has changed, and create a vertex wherever the direction changes? Thanks, Jack
  12. Thanks, Joe. That helps a lot!
  13. Actually, should I move the uvw's around, or should I manipulate the lattices directly? Which way is better if I don't want the plumb rendered as a cube? Thanks, Jack
  14. I would like to build an S-shaped mesh, pretty much like a B-spline, then perhaps create a lattice on it. I think I should start off with a box; when the upper vertices get transformed into a B-spline shape, the u,v stay the same. Any ideas? Thanks, Jack
  15. I can't seem to find it. If I generate 3D uvw coordinates in the vertex shader, I need a 3D texture sampler anyway, but I can't find it in the MSDN. Thanks, Jack
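In D3D9-level HLSL a volume texture is sampled through sampler3D and the tex3D intrinsic. A hedged sketch (the texture/sampler names are hypothetical):

```hlsl
// A 3D sampler in D3D9 HLSL: declare sampler3D and fetch with tex3D.
texture volumeTex;   // bound from C++ as an IDirect3DVolumeTexture9
sampler3D volumeSampler = sampler_state
{
    Texture   = <volumeTex>;
    MinFilter = Linear;
    MagFilter = Linear;
};

float4 SampleVolumePS(float3 uvw : TEXCOORD0) : COLOR0
{
    return tex3D(volumeSampler, uvw);
}
```

Note that tex3D is a pixel-shader fetch; sampling inside a vertex shader requires vs_3_0 vertex-texture support, which is far more restricted.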