I've just set up my constant buffer in DirectX 11 and everything seemed to be going fine, but the call to IASetVertexBuffers breaks while I debug. I haven't any idea why; could an experienced programmer please take a look and tell me if I've made a mistake in syntax or something?
here is a screenshot of where it breaks: http://img194.imageshack.us/img194/6109/herrrx.jpg
HRESULT Object::CompileShaderFromFile( WCHAR* szFileName, LPCSTR szEntryPoint, LPCSTR szShaderModel, ID3DBlob** ppBlobOut )
{
    HRESULT hr = S_OK;

    DWORD dwShaderFlags = D3DCOMPILE_ENABLE_STRICTNESS;
#if defined( DEBUG ) || defined( _DEBUG )
    // Set the D3DCOMPILE_DEBUG flag to embed debug information in the shaders.
    // Setting this flag improves the shader debugging experience, but still allows
    // the shaders to be optimized and to run exactly the way they will run in
    // the release configuration of this program.
    dwShaderFlags |= D3DCOMPILE_DEBUG;
#endif

    ID3DBlob* pErrorBlob = NULL; // initialize so the checks below are safe if the call fails early
    hr = D3DX11CompileFromFile( szFileName, NULL, NULL, szEntryPoint, szShaderModel,
                                dwShaderFlags, 0, NULL, ppBlobOut, &pErrorBlob, NULL );
    if( FAILED(hr) )
    {
        if( pErrorBlob != NULL )
            OutputDebugStringA( (char*)pErrorBlob->GetBufferPointer() );
        if( pErrorBlob ) pErrorBlob->Release();
        return hr;
    }
    if( pErrorBlob ) pErrorBlob->Release();

    return S_OK;
}
if( FAILED( hr ) )
{
    MessageBox( NULL,
                L"The FX file cannot be compiled. Please run this executable from the directory that contains the FX file.", L"Error", MB_OK );
    return hr;
}
You don't create the buffer!
You need something like `device->CreateBuffer(&bd, &initData, &vertexBuffer)`, or whatever your variables are named exactly.
OK guys, I've taken what you said and made some changes, yet I still get the same problem. I've commented the code so it might be easier to read.
The constant buffer is created and set up at the bottom of Object::InitObject, and the UpdateSubresource call is in Object::Render to update the constant buffer, yet I still hit the same crash shown in the screenshot on `pImmediateContext->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &stride, &offset);`.
I get the feeling something is supposed to be passed from InitObject into the UpdateSubresource parameters, but I'm not sure what. If you could take a look at this updated code and point me in the right direction, that'd be great. Thanks!
// create the vertex buffer
D3D11_BUFFER_DESC bd;
ZeroMemory(&bd, sizeof(bd));

bd.Usage = D3D11_USAGE_DYNAMIC;             // write access by CPU, read access by GPU
bd.ByteWidth = sizeof(VERTEX) * 3;          // size is the VERTEX struct * 3
bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;    // use as a vertex buffer
bd.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; // allow CPU to write to the buffer

device->CreateBuffer(&bd, NULL, &g_pVertexBuffer); // create the buffer

// copy the vertices into the buffer
D3D11_MAPPED_SUBRESOURCE ms;
pImmediateContext->Map(g_pVertexBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &ms); // map the buffer (subresource index and map flags are UINTs, so 0 rather than NULL)
memcpy(ms.pData, OurVertices, sizeof(OurVertices)); // copy the data
pImmediateContext->Unmap(g_pVertexBuffer, 0); // unmap the buffer
}
// CompileShaderFromFile and its error handling are unchanged from the first post.
Has your OFFSET type got a fourth float member? Remember that your cbuffer size must be a multiple of 16, and UpdateSubresource must update the entire cbuffer in D3D11. So when you call UpdateSubresource, you're reading from a 12-byte source into a 16-byte destination; you overflow the bounds of the source data and - BOOM!
Simple solution: pad OFFSET to 4 floats (you can leave it as float3 in your shader).
Alternatively, if you really don't want to do that, use a dynamic cbuffer and Map it with discard instead.
Your main loop looks very suspect: you're calling Render before you call InitGraphics, meaning your objects haven't been created yet when you call Render. It's also a very bad idea to create and destroy objects like that every frame. Object creation is expensive, so do it once only during startup.