bioM

Member
  • Content Count: 41
  • Joined
  • Last visited

Community Reputation: 144 Neutral

About bioM

  • Rank: Member
  1. So this is all the artist's work, and the programmer does nothing but load a model that already has the normal map?
  2. Can you explain what normal mapping means, or point me to an article about it?
  3. I'm looking for a standard way to design a model or scene object so that it renders with good performance. What I care about is the number of vertices and polygons. Is this a good approach? Does anybody have comments or related techniques?
  4. Has anybody tried multihead for a multi-monitor system with Direct3D? I intend to do research on this, but I still haven't found any sample or detailed documentation for it.
  5. Does Direct3D support any mechanism to stream the 3D rendering into an AVI?
  6. To change the frame buffer, you can lock the texture surface, get the frame buffer, loop over each pixel, and do your work on that pixel. There is one issue: you cannot lock your texture directly (because it was created with the RENDERTARGET usage), so you have to copy the render-target surface to another surface and work on that copy (see the read-back sketch after this list).
  7. CodeSampler.com has an example for swap chains.
  8. In my testing, the GetRenderTargetData(..) function seems to stall the CPU and slows down performance: without this function my app runs at 100% CPU, but with it CPU usage is only 40%. Can anybody show me how to get the render-target surface without stalling the CPU?
  9. I am developing a 3D engine based on DirectX, and now I want to set it up on a multi-monitor system. The requirements are: render the main scene, and display that scene on different monitors with different cameras. My solution is:
     TASK 1: use the main graphics card to render the scene;
     TASK 2: use the swap chains supported by DX to display the scene in different windows with different cameras;
     TASK 3: create a DC for each monitor (using the Windows APIs);
     TASK 4: each frame, copy the off-screen (back) buffer to each window and StretchBlt it to each monitor's DC in fullscreen mode.
     => It works, but I have a problem with TASK 4: the StretchBlt takes a lot of CPU and cuts my rendering frame rate in half. So my questions are: is this a proper solution for a multi-monitor system, and is there any way to improve TASK 4? (See the swap-chain sketch after this list.)
  10. Thanks. Does that mean I have to create more than one D3D device? From the DX functions I can see that each device is created for a specific graphics card.
  11. I would like to render multiple scenes in multiple windows, with each window rendered by its own graphics card. So my question is: does DX support that? If it is possible, can anybody show me which technique I should use? (See the device-per-adapter sketch after this list.)
  12. I have a 3D test application rendering a scene with a house model (.x file format, 8 MB, 22,000 vertices, 3 textures of 28 KB, 20 KB, and 42 KB) with the frame rate set at 10 fps. The performance is pretty bad: it takes 50% CPU usage. So my question is whether the number of vertices is too big, or the technique I use to render is not proper.
  13. I found out why. It was because of my SetRenderState calls. My solution is to use alpha testing:
      D3DCOLOR colorkey = 0xFFFF00FF;                             // color key used when loading the texture
      m_pd3dDevice->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
      m_pd3dDevice->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_EQUAL);
      m_pd3dDevice->SetRenderState(D3DRS_ALPHAREF, 0xFF);         // alpha reference is a 0-255 value, not a color
      instead of using alpha blending:
      m_pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
      m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
      m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
  14. As you can see in my image, the Batman image's transparent part hides parts of the other images; my red arrow points at exactly that... why? Sorry, I missed my code. Here are the details:
      // create the texture:
      D3DCOLOR colorkey = 0xFFFF00FF;
      HRESULT hr = D3DXCreateTextureFromFileEx(m_pD3DDevice, "file.png", 0, 0, 1, 0,
                                               D3DFMT_A8R8G8B8, D3DPOOL_MANAGED,
                                               D3DX_FILTER_NONE, D3DX_DEFAULT,
                                               colorkey, &m_SrcInfo, NULL, &pTexture);
      if (FAILED(hr))
      {
          return hr;
      }
      // blend states and draw the textured quad:
      m_pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
      m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
      m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
      m_pd3dDevice->SetTexture(0, pTexture);
      //m_pd3dDevice->SetRenderState(D3DRS_ZENABLE, FALSE);
      m_pd3dDevice->SetFVF(D3DFVF_TLVERTEX);
      m_pd3dDevice->SetStreamSource(0, vertexBuffer, 0, sizeof(TLVERTEX));
      m_pd3dDevice->DrawPrimitive(D3DPT_TRIANGLEFAN, 0, 2);
  15. I have read an article about drawing transparent 2D images in Direct3D using textured quads. It worked fine, but when I tried to draw two or more textures, the transparent part of each image comes out wrong, as in the link below. Can anybody tell me the reason why?
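
A minimal read-back sketch for item 6 above, assuming a Direct3D 9 device and a texture created with D3DUSAGE_RENDERTARGET; the function and variable names are illustrative, not from the original posts. The idea is to copy the render-target surface into a lockable system-memory surface with GetRenderTargetData and loop over the pixels there (sizes and formats must match, and the render target must not be multisampled).

    #include <d3d9.h>

    // Read back the pixels of a render-target texture. Render targets live in
    // video memory and cannot be locked directly, so copy them into a lockable
    // system-memory surface first.
    HRESULT ReadRenderTargetPixels(IDirect3DDevice9* pDevice,
                                   IDirect3DTexture9* pRenderTexture,
                                   UINT width, UINT height)
    {
        IDirect3DSurface9* pRTSurface  = NULL;   // surface of the render-target texture
        IDirect3DSurface9* pSysSurface = NULL;   // lockable copy in system memory

        HRESULT hr = pRenderTexture->GetSurfaceLevel(0, &pRTSurface);
        if (FAILED(hr)) return hr;

        // Same size and format as the render target, but in D3DPOOL_SYSTEMMEM.
        hr = pDevice->CreateOffscreenPlainSurface(width, height, D3DFMT_A8R8G8B8,
                                                  D3DPOOL_SYSTEMMEM, &pSysSurface, NULL);
        if (FAILED(hr)) { pRTSurface->Release(); return hr; }

        // Copy video memory -> system memory (this call synchronizes with the GPU).
        hr = pDevice->GetRenderTargetData(pRTSurface, pSysSurface);
        if (SUCCEEDED(hr))
        {
            D3DLOCKED_RECT locked;
            if (SUCCEEDED(pSysSurface->LockRect(&locked, NULL, D3DLOCK_READONLY)))
            {
                for (UINT y = 0; y < height; ++y)
                {
                    DWORD* row = (DWORD*)((BYTE*)locked.pBits + y * locked.Pitch);
                    for (UINT x = 0; x < width; ++x)
                    {
                        DWORD pixel = row[x];   // do your per-pixel work here
                        (void)pixel;
                    }
                }
                pSysSurface->UnlockRect();
            }
        }

        pSysSurface->Release();
        pRTSurface->Release();
        return hr;
    }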
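
For item 9, a sketch of replacing TASK 4's StretchBlt copy with additional swap chains, so Direct3D presents directly to each window instead of going through GDI. It assumes a Direct3D 9 device created with an automatic depth-stencil; the window list, camera setup, and DrawScene call are placeholders rather than anything from the original engine.

    #include <d3d9.h>
    #include <vector>

    struct WindowTarget
    {
        HWND                 hWnd;
        IDirect3DSwapChain9* pSwapChain;
    };

    // Create one extra swap chain per secondary window.
    HRESULT CreateWindowTargets(IDirect3DDevice9* pDevice,
                                const std::vector<HWND>& windows,
                                std::vector<WindowTarget>& targets)
    {
        for (size_t i = 0; i < windows.size(); ++i)
        {
            D3DPRESENT_PARAMETERS pp = {};
            pp.Windowed         = TRUE;
            pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
            pp.BackBufferFormat = D3DFMT_UNKNOWN;   // use the current display format
            pp.hDeviceWindow    = windows[i];

            WindowTarget t = { windows[i], NULL };
            HRESULT hr = pDevice->CreateAdditionalSwapChain(&pp, &t.pSwapChain);
            if (FAILED(hr)) return hr;
            targets.push_back(t);
        }
        return S_OK;
    }

    // Each frame: render the scene once per window (with that window's camera)
    // into its swap chain's back buffer, then present that swap chain.
    void RenderAllWindows(IDirect3DDevice9* pDevice,
                          std::vector<WindowTarget>& targets)
    {
        for (size_t i = 0; i < targets.size(); ++i)
        {
            IDirect3DSurface9* pBackBuffer = NULL;
            if (SUCCEEDED(targets[i].pSwapChain->GetBackBuffer(0, D3DBACKBUFFER_TYPE_MONO,
                                                               &pBackBuffer)))
            {
                pDevice->SetRenderTarget(0, pBackBuffer);
                pDevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                               D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);

                pDevice->BeginScene();
                // SetCameraForWindow(i);  // placeholder: this window's view/projection
                // DrawScene();            // placeholder: draw the shared scene
                pDevice->EndScene();

                pBackBuffer->Release();
            }
            targets[i].pSwapChain->Present(NULL, NULL, targets[i].hWnd, NULL, 0);
        }
    }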
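
For items 10 and 11, a sketch of enumerating the Direct3D 9 adapters and creating one device per adapter, each tied to its own window. Whether this actually spreads the work across separate physical cards depends on the hardware and drivers, and the helper name and parameters here are illustrative only.

    #include <d3d9.h>
    #include <vector>

    // Create one D3D9 device per adapter ordinal, each presenting to its own window.
    std::vector<IDirect3DDevice9*> CreateDevicePerAdapter(IDirect3D9* pD3D,
                                                          const std::vector<HWND>& windows)
    {
        std::vector<IDirect3DDevice9*> devices;
        UINT adapterCount = pD3D->GetAdapterCount();

        for (UINT adapter = 0; adapter < adapterCount && adapter < windows.size(); ++adapter)
        {
            D3DPRESENT_PARAMETERS pp = {};
            pp.Windowed               = TRUE;
            pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;
            pp.BackBufferFormat       = D3DFMT_UNKNOWN;
            pp.hDeviceWindow          = windows[adapter];
            pp.EnableAutoDepthStencil = TRUE;
            pp.AutoDepthStencilFormat = D3DFMT_D24S8;

            IDirect3DDevice9* pDevice = NULL;
            HRESULT hr = pD3D->CreateDevice(adapter,              // adapter ordinal (per output)
                                            D3DDEVTYPE_HAL,
                                            windows[adapter],     // focus window
                                            D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                            &pp, &pDevice);
            if (SUCCEEDED(hr))
                devices.push_back(pDevice);   // render this window's scene with this device
        }
        return devices;
    }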