void Buffer::createBuffer( unsigned int bufferSize, void* data, bool dynamic, unsigned int bindFlag )
{
    m_enabled = true;
    // Note: 'dynamic' is currently unused; the buffer is always created with default usage.
    m_bufferDescriptor.Usage = D3D11_USAGE_DEFAULT;
    m_bufferDescriptor.ByteWidth = bufferSize;
    m_bufferDescriptor.BindFlags = bindFlag;
    m_bufferDescriptor.CPUAccessFlags = 0;

    m_initData.pSysMem = data;
    m_initData.SysMemPitch = 0;
    m_initData.SysMemSlicePitch = 0;

    HRESULT hr = DeviceManager::getInstance().getDevice()->CreateBuffer( &m_bufferDescriptor, &m_initData, &m_buffer );
    if (FAILED(hr))
    {
        std::cout << "Failed to create a D3D11 Buffer object with code: " << hr << std::endl;
    }
}
// This function is called from a derived class.
void VertexBuffer::createBufferAndLayout( unsigned int bufferSize, void* data, bool dynamic, bool position, bool normal, const std::vector<unsigned int>& textureCoordinateDimensions )
{
    Buffer::createBuffer(bufferSize, data, dynamic, D3D11_BIND_VERTEX_BUFFER);
}
The device is valid and running on an AMD Radeon HD 4850. It is created at feature level 10_1, the maximum this card supports, with the D3D11_CREATE_DEVICE_SINGLETHREADED | D3D11_CREATE_DEVICE_DEBUG flags, and the application has been added to the debug list in the DirectX Control Panel.
The call succeeds when the buffer size is 128 bytes, which happens to be 16-byte aligned. If I instead call it with a buffer size of 20000, it crashes with this call stack:
atidxx32.dll!6d1d775e()
[Frames below may be incorrect and/or missing, no symbols loaded for atidxx32.dll]
atidxx32.dll!6d1d7809()
atidxx32.dll!6ce91874()
atidxx32.dll!6ce90ada()
atidxx32.dll!6ce903b3()
atidxx32.dll!6ce548b6()
atidxx32.dll!6ce57ba3()
atidxx32.dll!6ce5be26()
atidxx32.dll!6ce5b77c()
atidxx32.dll!6ce53bbb()
atidxx32.dll!6ce68471()
atiuxpag.dll!6da31d60()
d3d11.dll!CResource<ID3D11Texture3D>::CLS::FinalConstruct() + 0x18e bytes
d3d11.dll!CTexture1D::CLS::FinalConstruct() + 0x35 bytes
d3d11.dll!TCLSWrappers<CBuffer>::CLSFinalConstructFn() + 0x13 bytes
d3d11.dll!CLayeredObjectWithCLS<CUnorderedAccessView>::FinalConstruct() + 0x61 bytes
d3d11.dll!CLayeredObjectWithCLS<CBuffer>::CreateInstance() + 0x68 bytes
d3d11.dll!CDevice::CreateLayeredChild() + 0x98 bytes
d3d11.dll!CBridgeImpl<ID3D11LayeredDevice,ID3D11LayeredDevice,CLayeredObject<CDevice> >::CreateLayeredChild() + 0x22 bytes
d3d11.dll!CD3D11LayeredChild<ID3D11DeviceChild,NDXGI::CDevice,64>::FinalConstruct() + 0x2a bytes
d3d11.dll!NDXGI::CDeviceChild<IDXGISurface>::FinalConstruct() + 0x1b bytes
d3d11.dll!NDXGI::CResource::FinalConstruct() + 0x23 bytes
d3d11.dll!CLayeredObject<NDXGI::CResource>::CreateInstance() + 0x68 bytes
d3d11.dll!NDXGI::CDevice::CreateLayeredChild() + 0x135 bytes
d3d11.dll!CBridgeImpl<ID3D11LayeredDevice,ID3D11LayeredDevice,CLayeredObject<NDXGI::CDevice> >::CreateLayeredChild() + 0x22 bytes
D3D11SDKLayers.dll!CD3D11LayeredChild<ID3D11DepthStencilState,NDebug::CDevice,32>::FinalConstruct() + 0x2d bytes
D3D11SDKLayers.dll!NDebug::CDeviceChild<ID3D11Counter>::FinalConstruct() + 0x50 bytes
D3D11SDKLayers.dll!NDebug::CResource<ID3D11Buffer>::FinalConstruct() + 0x1d bytes
D3D11SDKLayers.dll!NDebug::CBuffer::FinalConstruct() + 0x15 bytes
D3D11SDKLayers.dll!CLayeredObject<NDebug::CBuffer>::CreateInstance() + 0x53 bytes
D3D11SDKLayers.dll!NDebug::CDevice::CreateLayeredChild() + 0x90 bytes
D3D11SDKLayers.dll!CBridgeImpl<ID3D11LayeredDevice,ID3D11LayeredDevice,CLayeredObject<NDebug::CDevice> >::CreateLayeredChild() + 0x22 bytes
d3d11.dll!NOutermost::CDeviceChild::FinalConstruct() + 0x29 bytes
d3d11.dll!CUseCountedObject<NOutermost::CDeviceChild>::CUseCountedObject<NOutermost::CDeviceChild>() + 0x48 bytes
d3d11.dll!CUseCountedObject<NOutermost::CDeviceChild>::CreateInstance() + 0x6e bytes
d3d11.dll!NOutermost::CDevice::CreateLayeredChild() + 0xd0 bytes
d3d11.dll!CDevice::CreateBuffer_Worker() + 0xde bytes
d3d11.dll!CDevice::CreateBuffer() + 0x18 bytes
D3D11SDKLayers.dll!NDebug::CDevice::CreateBuffer() + 0x105 bytes
> SpaceSim.exe!Buffer::createBuffer(unsigned int bufferSize=20000, void * data=0x020dca50, bool dynamic=false, unsigned int bindFlag=1) Line 46 + 0x3a bytes C++
The vertex format is a 3-float position, a 3-float normal, and a 2-float texcoord, i.e. 32 bytes per vertex. In the 128-byte case I am passing only 4 vertices to the buffer; in the 20000-byte case I am passing 625 vertices.
The data pointer is still 16-byte aligned, but as you can see the crash happens somewhere inside the driver, and since I have no symbols for atidxx32.dll I am really wondering what is going on. Because it dies in the driver, the D3D debug runtime gives me no useful information either. I also tried creating the buffer with 225 vertices, and that fails with the same call stack.
Side question:
I am new to D3D11, but I know D3D9 quite well, so most concepts probably carry over between the two. Am I right in thinking that once the buffer is created, D3D11 has made a copy of the vertex data on the GPU (as in D3D9), so it is safe to delete the CPU-side buffer?