[RESOLVED] ID3D11Device::CreateBuffer returning E_INVALIDARG for no apparent reason



I'm using a little helper function to create vertex buffers:
[source lang="cpp"]void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)
{
    D3D11_BUFFER_DESC bufferDesc;
    D3D11_SUBRESOURCE_DATA subResData;

    bufferDesc.Usage = bufferUsage;
    bufferDesc.ByteWidth = bufferSize;
    bufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    bufferDesc.CPUAccessFlags = processorAccess;
    bufferDesc.MiscFlags = 0;
    bufferDesc.StructureByteStride = 0;

    subResData.pSysMem = vertices;
    subResData.SysMemPitch = 0;
    subResData.SysMemSlicePitch = 0;

    HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, buffer));
}[/source]

And it causes an E_INVALIDARG error, but doesn't give a clue about which argument is the invalid one. Here is how the function is called in the program:
[source lang="cpp"]vertexBuffer = 0;
VertexType* vertices = new VertexType[vertexCount];

vertices[0].position = XMFLOAT3(-1.0f, -1.0f, 0.0f); // just random values to see if it debugs at all
vertices[0].texture = XMFLOAT2(0.0f, 1.0f);
vertices[1].position = XMFLOAT3(0.0f, 1.0f, 0.0f);
vertices[1].texture = XMFLOAT2(0.5f, 0.0f);
vertices[2].position = XMFLOAT3(1.0f, -1.0f, 0.0f);
vertices[2].texture = XMFLOAT2(1.0f, 1.0f);

AssetManager->CreateVertexBuffer(&vertexBuffer, sizeof(VertexType) * vertexCount, &vertices);[/source]

Could the problem be in "bufferDesc.StructureByteStride = 0;"? Am I supposed to leave it at 0? All the tutorial project files leave it at 0 and they compile just fine. But the problem can't be in the project setup itself; I linked all the needed libraries and set the paths to the SDK.

What are the default values for the bufferUsage and processorAccess parameters, since you don't specify them? D3D11_USAGE_DEFAULT and 0? Edited by SamiHuutoniemi


What are the default values for the bufferUsage and processorAccess parameters, since you don't specify them? 0 and 0?

They are:

[source lang="cpp"]void CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage = D3D11_USAGE_DEFAULT, D3D11_CPU_ACCESS_FLAG processorAccess = D3D11_CPU_ACCESS_READ);[/source]

It should result in the same buffer creation that I saw in a tutorial, which used to work with no problems.

I always set CPUAccessFlags to 0 if I go with D3D11_USAGE_DEFAULT. I don't think you can use D3D11_CPU_ACCESS_READ for this. Edited by SamiHuutoniemi


I always set it to 0 if I go with D3D11_USAGE_DEFAULT. I don't think you can use D3D11_CPU_ACCESS_READ for this.

lol it actually worked! But what kind of access does 0 mean? Read/write?

No. Default usage means that only the GPU can read and write. That other parameter has something to do with staging. Can't say; I've never used it.
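For reference, CPU read access is what staging resources are for: you copy a GPU resource into a staging buffer and then Map it on the CPU. A minimal sketch, assuming an already-created `device`, `context`, a source buffer `srcBuffer`, and its size `byteWidth` (all hypothetical names, not from the thread's code):

```cpp
// Hypothetical readback sketch: copy a GPU buffer into a staging
// buffer with CPU read access, then Map it to inspect the contents.
D3D11_BUFFER_DESC desc = {};
desc.Usage = D3D11_USAGE_STAGING;           // CPU-accessible copy target
desc.ByteWidth = byteWidth;
desc.BindFlags = 0;                         // staging resources can't be bound to the pipeline
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;

ID3D11Buffer* staging = nullptr;
HRESULT hr = device->CreateBuffer(&desc, nullptr, &staging);
if (SUCCEEDED(hr)) {
    context->CopyResource(staging, srcBuffer);   // GPU -> staging copy
    D3D11_MAPPED_SUBRESOURCE mapped;
    if (SUCCEEDED(context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped))) {
        // mapped.pData now points at the buffer contents
        context->Unmap(staging, 0);
    }
    staging->Release();
}
```

This also shows why the original call failed: D3D11_CPU_ACCESS_READ belongs with STAGING usage, not DEFAULT.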

Is vertexCount == 3?

AssetManager->CreateVertexBuffer(&vertexBuffer,sizeof(VertexType)*vertexCount, &vertices);

void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)

What are the default values?

ID3D11Buffer** buffer means pointer to pointer.
In the creation call you have the buffer pointer instead of the *buffer pointer:
HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, *buffer));


Is vertexCount == 3?

AssetManager->CreateVertexBuffer(&vertexBuffer,sizeof(VertexType)*vertexCount, &vertices);

void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)

What are the default values?

ID3D11Buffer** buffer means pointer to pointer.
In the creation call you have the buffer pointer instead of the *buffer pointer:
HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, *buffer));
Firstly, it has already been solved. But to answer your questions: about the parameters, they are default values; he specified them a bit further up when I asked. About the pointer: if you look closely, buffer IS ID3D11Buffer**. It wouldn't have compiled otherwise.


No. Default usage means that only the GPU can read and write. That other parameter has something to do with staging. Can't say; I've never used it.

This is the correct answer. The DX SDK clearly explains which combinations of parameters are valid, and CPU access read (the fact that you even want to do this sets off more alarm bells, by the way) is not valid for default usage. MSDN page for D3D11_USAGE: http://msdn.microsoft.com/en-us/library/windows/desktop/ff476259%28v=vs.85%29.aspx
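As a rough summary of the rules on that page: DEFAULT and IMMUTABLE resources allow no CPU access, DYNAMIC requires write-only CPU access, and STAGING allows read and/or write. A self-contained sketch of those rules (the flag values are copied from d3d11.h so this compiles without the Windows SDK; the check function itself is hypothetical, not part of the API):

```cpp
#include <cstdint>

// Values mirror the d3d11.h enums (copied here only to keep the
// sketch self-contained; real code should include <d3d11.h>).
enum Usage { USAGE_DEFAULT = 0, USAGE_IMMUTABLE = 1, USAGE_DYNAMIC = 2, USAGE_STAGING = 3 };
const uint32_t CPU_ACCESS_WRITE = 0x10000;
const uint32_t CPU_ACCESS_READ  = 0x20000;

// Hypothetical helper summarizing the D3D11_USAGE rules:
// DEFAULT/IMMUTABLE allow no CPU access, DYNAMIC requires
// write-only CPU access, STAGING allows read and/or write.
bool CpuAccessValidForUsage(Usage usage, uint32_t cpuAccess) {
    switch (usage) {
    case USAGE_DEFAULT:
    case USAGE_IMMUTABLE:
        return cpuAccess == 0;
    case USAGE_DYNAMIC:
        return cpuAccess == CPU_ACCESS_WRITE;
    case USAGE_STAGING:
        return cpuAccess != 0 &&
               (cpuAccess & ~(CPU_ACCESS_READ | CPU_ACCESS_WRITE)) == 0;
    }
    return false;
}
```

The original call, DEFAULT usage combined with D3D11_CPU_ACCESS_READ, fails this check, which matches the E_INVALIDARG.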

Sorry, didn't see it was resolved; page refresh.

Thanks for showing me this. I guess it's not such a good idea to have a buffer creator function anyway.

There's nothing wrong with having a buffer creation routine so long as it works correctly. There are interesting and useful things you can do with one, including calling SetPrivateData to give your buffer a name (really helps with tracking resource leaks), adding your buffer to a resource list which is then used for Releasing them (you don't need to worry about adding each individual buffer to your shutdown routine anymore), consolidating your error checking for buffers in one place, etc.

But the buffer creation routine does have to work correctly first.
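The resource-list idea above can be sketched generically. A minimal, hypothetical tracker (not from anyone's actual codebase here) that works with anything exposing a COM-style Release():

```cpp
#include <vector>

// Minimal COM-style resource tracker (hypothetical sketch).
// Register each resource once at creation time; one ReleaseAll()
// call at shutdown frees everything, so individual buffers never
// need to be added to the shutdown routine by hand.
template <typename T>
class ResourceList {
public:
    T* Track(T* resource) {
        if (resource) resources.push_back(resource);  // ignore failed creations
        return resource;                              // pass-through for convenience
    }
    void ReleaseAll() {
        for (T* r : resources) r->Release();
        resources.clear();
    }
private:
    std::vector<T*> resources;
};
```

With this, a creation helper can do `return list.Track(buffer);` and shutdown collapses to a single `list.ReleaseAll();`.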

What mhagain said!

If you'd like another sample of a create vertex buffer routine, this is one that I have in my framework:

[source lang="cpp"]void RenderableObject::CreateVertexBuffer(const Vertex* vertices, bool dynamic) {
    HRESULT hr;

    if (pVertexBuffer) {
        pVertexBuffer->Release();
        pVertexBuffer = nullptr;
    }

    D3D11_BUFFER_DESC vertexBufferDesc;
    ZeroMemory(&vertexBufferDesc, sizeof(vertexBufferDesc));

    if (dynamic) {
        vertexBufferDesc.Usage = D3D11_USAGE_DYNAMIC;
        vertexBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
    }
    else {
        vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
        vertexBufferDesc.CPUAccessFlags = 0;
    }
    vertexBufferDesc.ByteWidth = textured ? sizeof(TextureVertex) * numVertices : sizeof(ColorVertex) * numVertices;
    vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    vertexBufferDesc.MiscFlags = 0;

    D3D11_SUBRESOURCE_DATA vertexBufferData;
    ZeroMemory(&vertexBufferData, sizeof(vertexBufferData));
    vertexBufferData.pSysMem = vertices;

    hr = pD3DEngine->GetDevice()->CreateBuffer(&vertexBufferDesc, &vertexBufferData, &pVertexBuffer);
    if (FAILED(hr)) {
        MessageBox(0, L"RenderableObject::CreateVertexBuffer(): Could not create Vertex buffer!", L"Error!", 0);
        return; // don't dereference pVertexBuffer if creation failed
    }
    // name is a std::string member; use its length, not sizeof(name)
    pVertexBuffer->SetPrivateData(WKPDID_D3DDebugObjectName, static_cast<UINT>(name.size()), name.c_str());
}[/source]

Off the top of my head I don't know if it would have helped in this situation, but in general when you hit a problem make sure you have the D3D debug runtime active. It will print out pretty detailed information about the problems which are happening to the Visual Studio output window.
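For anyone finding this later: with D3D11 the debug layer is enabled at device creation time. A minimal sketch, assuming the usual D3D11CreateDevice call (variable names are illustrative):

```cpp
// Enable the D3D11 debug layer in debug builds so invalid parameter
// combinations are explained in the Visual Studio output window
// instead of just returning E_INVALIDARG.
UINT flags = 0;
#ifdef _DEBUG
flags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

ID3D11Device* device = nullptr;
ID3D11DeviceContext* context = nullptr;
HRESULT hr = D3D11CreateDevice(
    nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, flags,
    nullptr, 0,                 // use the default feature-level list
    D3D11_SDK_VERSION, &device, nullptr, &context);
```

With the debug layer active, the CPU-access mistake in this thread would have produced a descriptive message naming the offending D3D11_BUFFER_DESC fields.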