ID3D11Buffer* CreateVertexBufferFromGeometry(Mesh& mesh) const {
[cut unrelated stuff]
ID3D11Buffer* buffer;
D3D11_BUFFER_DESC bd;
ZeroMemory(&bd, sizeof(bd));
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(float) * mesh.NumberOfFloatsPerVertex * mesh.vertexData.size();
bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
bd.CPUAccessFlags = 0;
bd.MiscFlags = 0;
D3D11_SUBRESOURCE_DATA InitData;
ZeroMemory(&InitData, sizeof(InitData));
InitData.pSysMem = mesh.vertexData.data();
RenderDevice->CreateBuffer(&bd, &InitData, &buffer);
return buffer;
}
Driver crash on creation of vertex buffer (D3D11)
Reading / writing what address?
It changes every time IIRC (don't have the code with me atm), but it's not 0xCCCCCCCC or 0x00000000 or any other usual null/known address.
What is mesh.vertexData? It would be my first suspect.
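To make that concrete: whether the original ByteWidth expression is right depends entirely on what vertexData holds. A minimal sketch of the two common layouts (these Mesh definitions are assumptions for illustration; the real one isn't shown in the post):

#include <cstdint>
#include <vector>

// Assumed layout #1: a flat std::vector<float> with every component packed in order.
struct MeshFlat {
    std::vector<float> vertexData;        // x0,y0,z0,u0,v0, x1,y1,z1,u1,v1, ...
    unsigned NumberOfFloatsPerVertex = 5; // matches the layout above
};

// size() already counts every float, so multiplying by NumberOfFloatsPerVertex
// as well would ask CreateBuffer to read far past the end of the data.
inline uint32_t ByteWidthFlat(const MeshFlat& m) {
    return static_cast<uint32_t>(sizeof(float) * m.vertexData.size());
}

// Assumed layout #2: one struct per vertex.
struct VertexPT { float pos[3]; float uv[2]; };

struct MeshStructured {
    std::vector<VertexPT> vertexData;     // one element per vertex
};

// Here size() counts vertices, so the per-vertex stride is needed.
inline uint32_t ByteWidthStructured(const MeshStructured& m) {
    return static_cast<uint32_t>(sizeof(VertexPT) * m.vertexData.size());
}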
Off topic, but FWIW, #1 below is the same as #2, but less efficient, more verbose, and more prone to human error:
//#1
D3D11_BUFFER_DESC bd;
ZeroMemory(&bd, sizeof(bd));
//#2
D3D11_BUFFER_DESC bd = {};
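Both end up with a zeroed struct: the empty braces value-initialize every member, and there's no separate size argument to get wrong.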
If it's a violation *reading* from a valid-looking address (one that looks like it comes some sane number of bytes after the start of vertexData.data()), then I would assume vertexData doesn't contain enough bytes to fill the buffer in its entirety. Check that mesh.NumberOfFloatsPerVertex is correct.
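One way to catch that before the driver does is a quick sanity check that the requested ByteWidth doesn't exceed the bytes actually backing pSysMem. A sketch, assuming vertexData is a std::vector<float> as the usage in the question suggests:

#include <cassert>
#include <cstdint>
#include <vector>

// Sanity check to run before ID3D11Device::CreateBuffer. The vertexData parameter
// stands in for mesh.vertexData from the question and is assumed to be a
// std::vector<float>; swap sizeof(float) for the real element type if it differs.
inline bool VertexBufferSizeIsSane(const std::vector<float>& vertexData,
                                   uint32_t requestedByteWidth) {
    const uint64_t availableBytes = static_cast<uint64_t>(vertexData.size()) * sizeof(float);
    // CreateBuffer copies ByteWidth bytes from pSysMem, so the source must hold at least that many.
    assert(requestedByteWidth <= availableBytes && "ByteWidth exceeds the bytes behind pSysMem");
    return requestedByteWidth <= availableBytes;
}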
This was it! There was an error in the calculation of floatsPerVertex. Thank you!
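For anyone hitting the same thing later: one way to stop a hand-counted floats-per-vertex value from drifting is to derive it from the vertex type at compile time. A sketch with a hypothetical all-float vertex struct (not the one from this thread):

#include <cstddef>

// Hypothetical vertex layout, purely for illustration.
struct Vertex {
    float position[3];
    float normal[3];
    float uv[2];
};

// Derived from the struct, so it can't go out of sync when the layout changes.
constexpr std::size_t NumberOfFloatsPerVertex = sizeof(Vertex) / sizeof(float);
static_assert(sizeof(Vertex) % sizeof(float) == 0,
              "Vertex must contain only floats (no padding)");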