BrechtDebruyne

Members
  • Content count: 47
  • Joined
  • Last visited

Community Reputation

154 Neutral

About BrechtDebruyne

  • Rank: Member

Recent Posts

  1. Soft body physics

    That's no problem; I am an undergraduate mathematics student and will eventually cover that. But I'll need to have a bit of patience then :)
  2. Soft body physics

    I am interested in learning about (and possibly implementing) soft body physics. Are there books or articles available that you can recommend, giving a detailed overview of the simulation process with academic rigor? (I may be interested in specialising in this field, so please no books/articles with 'eased'/'fuzzy' math.) Also, what would the mathematical prerequisites be? Is a college-level introductory analysis/calculus and linear algebra course sufficient? And is it important to have a rigid body simulation engine running before you attempt soft body physics, or are they largely independent? (A rough mass-spring sketch of what such a simulation involves is included after this post list.)
  3. What can software rasterizers be used for today?

    [quote name='Radikalizm' timestamp='1354114934' post='5004979']I've seen real-time raytracers done with GPGPU solutions, but no rasterizers as far as I can remember. There's a good reason for a lack of GPGPU rasterizers though, as DX and OGL would always outperform GPGPU solutions as they can use the actual rasterizer hardware, while a GPGPU solution would need to do rasterization completely in software. Maybe there are some obscure use cases where a GPGPU rasterizer would be actually useful, but in general it'd be better to stick with libraries like DX and OGL.[/quote]
    Well, I'm going to have to try GPGPU raytracing then.
  4. What can software rasterizers be used for today?

    Another thing I'm wondering about: I've been thinking of experimenting with some rendering techniques, like rasterization, on the GPU using something like CUDA. However, I have no CUDA experience and I'm wondering whether that is feasible. Could there be advantages over just using DX/GL?
  5. I decided to code my own software renderer, because I think it would make for a great learning experience. And although it will probably stop there for me, I'd like to know whether there is still a use for software rasterizers today, and what they can be used for. Can they be used for anything besides systems that lack a GPU? And what are the most important target devices today that lack a GPU?
  6. [CODE]#include <iostream>
     int main() { std::cout << "Hello World!" << std::endl; return 0; }[/CODE]
  7. std::cout << "hello world!";
  8. std::cout << "hello world!";
  9. std::cout << "hello world!";
  10. What are the advantages and disadvantages of multi-stream / multi-index rendering (as is done in one of the D3D10 samples)? Is it often used in graphics engines? When should I bother to implement it? (A short multi-stream binding sketch is included after this post list.)
  11. Thanks, that cleared up all my questions.
  12. I am aware that there is a sample on working without FX in the sample browser, and I have already checked it. However, some questions arise. In the sample:

    [CODE]D3DXMATRIXA16 mWorldViewProj;
    D3DXMATRIXA16 mWorld;
    D3DXMATRIXA16 mView;
    D3DXMATRIXA16 mProj;

    mWorld = g_World;
    mView = g_View;
    mProj = g_Projection;
    mWorldViewProj = mWorld * mView * mProj;

    VS_CONSTANT_BUFFER* pConstData;
    g_pConstantBuffer10->Map( D3D10_MAP_WRITE_DISCARD, NULL, ( void** )&pConstData );
    pConstData->mWorldViewProj = mWorldViewProj;
    pConstData->fTime = fBoundedTime;
    g_pConstantBuffer10->Unmap();[/CODE]

    They copy their D3DXMATRIX values into D3DXMATRIXA16 variables. According to MSDN, these matrices are 16-byte aligned and optimised for the Intel Pentium 4. So, my first question: 1) Is it necessary to copy matrices to D3DXMATRIXA16 before sending them to the constant buffer? And if not, why don't we just use D3DXMATRIXA16 all the time?

    I have another question about managing multiple constant buffers within one shader. Suppose that, within your shader, you have multiple constant buffers that need to be updated at different times:

    [CODE]cbuffer cbNeverChanges
    {
        matrix View;
    };

    cbuffer cbChangeOnResize
    {
        matrix Projection;
    };

    cbuffer cbChangesEveryFrame
    {
        matrix World;
        float4 vMeshColor;
    };[/CODE]

    Then how would I set these buffers at different times?

    [CODE]g_pd3dDevice->VSSetConstantBuffers( 0, 1, &g_pConstantBuffer10 );[/CODE]

    gives me the possibility to set multiple buffers, but that is within one call. 2) Is that okay even if my constant buffers are updated at different times? And I suppose I have to make sure the constant buffers are in the same positions in the array as the order in which they appear in the shader? (A hedged sketch of one way to handle this is included after the post list.)
  13. That's just *insert f word* awesome! Thanks a lot
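
Regarding the soft body question in item 2, here is a minimal sketch of one common starting point: a mass-spring model integrated with semi-implicit Euler. It is only an illustration of the kind of computation involved, not material from any particular book; all types, names, and parameters below are made up for the example, and a real soft body engine would add constraints, collision handling, and usually a more stable (implicit) integrator.

[CODE]#include <cmath>
#include <cstddef>
#include <vector>

// Minimal mass-spring soft body sketch (illustrative only).
struct Vec3 { float x = 0.0f, y = 0.0f, z = 0.0f; };

Vec3  operator+(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3  operator-(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vec3  operator*(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
float dot(Vec3 a, Vec3 b)        { return a.x * b.x + a.y * b.y + a.z * b.z; }
float length(Vec3 a)             { return std::sqrt(dot(a, a)); }

struct Particle { Vec3 position, velocity, force; float mass = 1.0f; };
struct Spring   { std::size_t a, b; float restLength, stiffness, damping; };

// One time step: accumulate gravity and spring forces, then integrate
// velocities and positions with semi-implicit Euler.
void step(std::vector<Particle>& particles, const std::vector<Spring>& springs, float dt)
{
    const Vec3 gravity { 0.0f, -9.81f, 0.0f };
    for (auto& p : particles) p.force = gravity * p.mass;

    for (const auto& s : springs) {
        Vec3 delta = particles[s.b].position - particles[s.a].position;
        float len = length(delta);
        if (len <= 0.0f) continue;
        Vec3 dir = delta * (1.0f / len);

        // Hooke's law plus damping along the spring direction.
        float stretch = len - s.restLength;
        Vec3  relVel  = particles[s.b].velocity - particles[s.a].velocity;
        float fMag    = s.stiffness * stretch + s.damping * dot(relVel, dir);

        Vec3 f = dir * fMag;
        particles[s.a].force = particles[s.a].force + f;
        particles[s.b].force = particles[s.b].force - f;
    }

    for (auto& p : particles) {
        p.velocity = p.velocity + p.force * (dt / p.mass);
        p.position = p.position + p.velocity * dt;
    }
}[/CODE]

Semi-implicit Euler is shown only because it is the simplest integrator to write down; stiff springs usually push you toward implicit methods, which is where the calculus and linear algebra background comes in.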
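Regarding the multi-stream question in item 10, here is a small sketch, assuming a hypothetical device pointer and two hypothetical vertex buffers (positions and normals). It only shows the mechanism: the InputSlot field of each input-layout element selects a vertex stream, and IASetVertexBuffers binds one buffer per slot. Note that an ordinary indexed draw in D3D10 still uses a single index buffer shared by all streams.

[CODE]#include <d3d10.h>

// Hypothetical buffers: positions in one stream, normals in another.
extern ID3D10Buffer* g_pPositionVB;
extern ID3D10Buffer* g_pNormalVB;

// The 4th field of each element (InputSlot) selects the vertex stream.
const D3D10_INPUT_ELEMENT_DESC g_Layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D10_INPUT_PER_VERTEX_DATA, 0 },
    { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT, 1, 0, D3D10_INPUT_PER_VERTEX_DATA, 0 },
};
// ... create an input layout from g_Layout with CreateInputLayout as usual ...

// Bind both streams in one call: slot 0 gets positions, slot 1 gets normals.
void BindStreams( ID3D10Device* pDevice )
{
    ID3D10Buffer* buffers[2] = { g_pPositionVB, g_pNormalVB };
    UINT strides[2] = { 3 * sizeof(float), 3 * sizeof(float) };
    UINT offsets[2] = { 0, 0 };
    pDevice->IASetVertexBuffers( 0, 2, buffers, strides, offsets );
}[/CODE]

The usual trade-off is flexibility, such as updating or swapping one attribute stream without touching the others, against a bit more bookkeeping compared with a single interleaved buffer.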
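Regarding question 2 in item 12, here is a sketch of one way to handle constant buffers that are updated at different times. The slot index passed to VSSetConstantBuffers must match the cbuffer's register in the shader, so it is safest to pin the registers explicitly with register(b0), register(b1), and so on. The buffer variables and the CBChangesEveryFrame struct below are hypothetical mirrors of the cbuffers quoted in the post.

[CODE]#include <d3d10.h>
#include <d3dx10.h>

// HLSL side (slots pinned explicitly):
//   cbuffer cbNeverChanges      : register( b0 ) { matrix View; };
//   cbuffer cbChangeOnResize    : register( b1 ) { matrix Projection; };
//   cbuffer cbChangesEveryFrame : register( b2 ) { matrix World; float4 vMeshColor; };

// Hypothetical globals, one ID3D10Buffer* per cbuffer in the shader.
extern ID3D10Buffer* g_pCBNeverChanges;      // slot b0
extern ID3D10Buffer* g_pCBChangeOnResize;    // slot b1
extern ID3D10Buffer* g_pCBChangesEveryFrame; // slot b2

struct CBChangesEveryFrame   // CPU-side mirror of the b2 cbuffer (hypothetical layout)
{
    D3DXMATRIX  mWorld;
    D3DXVECTOR4 vMeshColor;
};

// Bind each buffer to its own slot once; the bindings persist until changed,
// so this does not need to be repeated just because a buffer was updated.
void BindConstantBuffers( ID3D10Device* pDevice )
{
    ID3D10Buffer* cbs[3] = { g_pCBNeverChanges, g_pCBChangeOnResize, g_pCBChangesEveryFrame };
    pDevice->VSSetConstantBuffers( 0, 3, cbs );
}

// Call when the data changes (e.g. once per frame): update only that buffer.
void UpdatePerFrameBuffer( const D3DXMATRIX& world, const D3DXVECTOR4& meshColor )
{
    CBChangesEveryFrame* pData = nullptr;
    g_pCBChangesEveryFrame->Map( D3D10_MAP_WRITE_DISCARD, 0, reinterpret_cast<void**>( &pData ) );
    pData->mWorld     = world;
    pData->vMeshColor = meshColor;
    g_pCBChangesEveryFrame->Unmap();
}[/CODE]

Whether splitting by update frequency is worth it depends on how much data actually changes; for small amounts, a single buffer updated every frame is often just as fast.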