Hi, for GPGPU I want to use a pixel shader to read integers, process them, and write integers to a render target.
Reading ints is fine, since creating an integer texture succeeds.
But creating a render target with DXGI_FORMAT_R32G32B32_UINT fails. Why? Thanks.
My card is an 8800GTX.
P.S. I have now tested DXGI_FORMAT_R32G32B32A32_UINT and it works! Why does the 3-channel format fail while the 4-channel one works? Isn't DX10 supposed to be "all caps supported"?
D3D10_TEXTURE2D_DESC descTex;
ZeroMemory(&descTex, sizeof(descTex));
descTex.Width = texW;
descTex.Height = texH;
descTex.MipLevels = 1;
descTex.ArraySize = 1;
descTex.Format = DXGI_FORMAT_R32G32B32_UINT; // fails! only 4-channel formats like DXGI_FORMAT_R32G32B32A32_UINT work
descTex.SampleDesc.Count = 1;
descTex.SampleDesc.Quality = 0;
descTex.Usage = D3D10_USAGE_DEFAULT;
descTex.BindFlags = D3D10_BIND_RENDER_TARGET; // render target
descTex.CPUAccessFlags = 0;
descTex.MiscFlags = 0;
g_pd3dDevice->CreateTexture2D(&descTex, NULL, &pTex);
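One way to see whether this is a device limitation rather than a bug is to query the format before creating the texture. This is only a sketch (it assumes the g_pd3dDevice from above); in D3D10, render-target support for the 96-bit 3-channel R32G32B32_* formats is optional per device, while the 128-bit 4-channel R32G32B32A32_* formats are guaranteed renderable:

```cpp
// Sketch: ask the device whether the 3-channel format can be bound
// as a render target (assumes g_pd3dDevice was created as below).
UINT support = 0;
HRESULT hr = g_pd3dDevice->CheckFormatSupport(DXGI_FORMAT_R32G32B32_UINT, &support);
if (FAILED(hr) || !(support & D3D10_FORMAT_SUPPORT_RENDER_TARGET))
{
    // The 3-channel format is not renderable on this device.
    // Fall back to DXGI_FORMAT_R32G32B32A32_UINT, which D3D10
    // guarantees as a render target, and ignore alpha in the shader.
}
```

If the flag is missing, the CreateTexture2D failure is expected behavior for that card, not an API misuse.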
P.S. I set a placeholder format during the initial creation of the device:
UINT createDeviceFlags = 0;
#ifdef _DEBUG
createDeviceFlags |= D3D10_CREATE_DEVICE_DEBUG;
#endif
DXGI_SWAP_CHAIN_DESC sd;
ZeroMemory( &sd, sizeof(sd) );
sd.BufferCount = 1;
sd.BufferDesc.Width = 256;
sd.BufferDesc.Height = 256;
sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM; // placeholder
sd.BufferDesc.RefreshRate.Numerator = 60;
sd.BufferDesc.RefreshRate.Denominator = 1;
sd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
sd.OutputWindow = g_hWnd;
sd.SampleDesc.Count = 1;
sd.SampleDesc.Quality = 0;
sd.Windowed = TRUE;
V_RETURN(D3D10CreateDeviceAndSwapChain(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, createDeviceFlags, D3D10_SDK_VERSION, &sd, &g_pSwapChain, &g_pd3dDevice));
About the sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM: I never intended to use this format, since I actually render to a much bigger texture (4k*4k), which has nothing to do with the swap chain. In fact I only render *once* and never enter the message loop. Is that a problem? Thanks a lot!
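For what it's worth, the swap chain's back-buffer format shouldn't matter if you never Present to it. A minimal sketch of binding the offscreen texture instead of the back buffer, assuming the pTex created above with D3D10_BIND_RENDER_TARGET:

```cpp
// Sketch: render into the offscreen texture, not the swap chain
// (assumes pTex, texW, texH, g_pd3dDevice from the snippets above).
ID3D10RenderTargetView* pRTV = NULL;
HRESULT hr = g_pd3dDevice->CreateRenderTargetView(pTex, NULL, &pRTV);
if (SUCCEEDED(hr))
{
    g_pd3dDevice->OMSetRenderTargets(1, &pRTV, NULL);

    // Match the viewport to the offscreen texture, not the 256x256 window.
    D3D10_VIEWPORT vp = { 0, 0, texW, texH, 0.0f, 1.0f };
    g_pd3dDevice->RSSetViewports(1, &vp);

    // ... Draw once, then copy to a staging texture and Map to read back.
}
```

Since the draw targets pTex, the DXGI_FORMAT_R8G8B8A8_UNORM back buffer is never written, so its format is irrelevant to the GPGPU pass.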
[Edited by - yk_cadcg on May 13, 2007 1:00:42 AM]