ID3D11Device::CreateBuffer returning E_INVALIDARG for no apparent reason [RESOLVED]

Started by
12 comments, last by _the_phantom_ 11 years, 9 months ago
I'm using a little helper function to create vertex buffers:
void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)
{
    D3D11_BUFFER_DESC bufferDesc;
    D3D11_SUBRESOURCE_DATA subResData;

    bufferDesc.Usage = bufferUsage;
    bufferDesc.ByteWidth = bufferSize;
    bufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    bufferDesc.CPUAccessFlags = processorAccess;
    bufferDesc.MiscFlags = 0;
    bufferDesc.StructureByteStride = 0;

    subResData.pSysMem = vertices;
    subResData.SysMemPitch = 0;
    subResData.SysMemSlicePitch = 0;

    HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, buffer));
}

And it causes an E_INVALIDARG error, but it doesn't give a clue about which argument is the invalid one. Here is how the function is called in the program:

vertexBuffer = 0;

VertexType* vertices = new VertexType[vertexCount];

vertices[0].position = XMFLOAT3(-1.0f, -1.0f, 0.0f); //just random values to see if it debugs at all
vertices[0].texture = XMFLOAT2(0.0f, 1.0f);
vertices[1].position = XMFLOAT3(0.0f, 1.0f, 0.0f);
vertices[1].texture = XMFLOAT2(0.5f, 0.0f);

vertices[2].position = XMFLOAT3(1.0f, -1.0f, 0.0f);
vertices[2].texture = XMFLOAT2(1.0f, 1.0f);


AssetManager->CreateVertexBuffer(&vertexBuffer,sizeof(VertexType)*vertexCount, &vertices);


Could the problem be in "bufferDesc.StructureByteStride = 0;"? Am I supposed to leave it at 0? Because all the tutorial project files leave it at 0 and they compile just fine. But the problem can't be in the project setup itself; I linked all the needed libraries and set the paths to the SDK.
What are the default values for the bufferUsage and processorAccess parameters, since you don't specify them? D3D11_USAGE_DEFAULT and 0?

What are the default values for the bufferUsage and processorAccess parameters, since you don't specify them? 0 and 0?


They are
void CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void * vertices, D3D11_USAGE bufferUsage = D3D11_USAGE_DEFAULT, D3D11_CPU_ACCESS_FLAG processorAccess = D3D11_CPU_ACCESS_READ);

It should result in the same buffer creation that I saw in a tutorial which used to work with no problems.
I always set it to 0 if I go with D3D11_USAGE_DEFAULT. I don't think you can use D3D11_CPU_ACCESS_READ for this.

I always set it to 0 if I go with D3D11_USAGE_DEFAULT. I don't think you can use D3D11_CPU_ACCESS_READ for this.


lol it actually worked, but what kind of access does 0 mean? READWRITE?
No. Default usage means that only the GPU can read and write. That other parameter has something to do with staging. Can't say, I've never used it.
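To make the rule concrete, here is a small standalone sketch of the check the runtime is effectively doing. The enum names mirror D3D11's, but the values and the function itself are simplified stand-ins for illustration, not the real SDK definitions:

```cpp
#include <cassert>
#include <cstdint>

// Simplified stand-ins for D3D11_USAGE and D3D11_CPU_ACCESS_FLAG
// (illustrative values, not the real SDK definitions).
enum Usage { USAGE_DEFAULT, USAGE_IMMUTABLE, USAGE_DYNAMIC, USAGE_STAGING };
enum CpuAccess : uint32_t { CPU_ACCESS_NONE = 0, CPU_ACCESS_WRITE = 1, CPU_ACCESS_READ = 2 };

// Encodes the rule from the D3D11_USAGE documentation: DEFAULT and
// IMMUTABLE resources allow no CPU access, DYNAMIC allows CPU write
// only, and only STAGING allows CPU reads.
bool IsValidCpuAccess(Usage usage, uint32_t cpuAccess)
{
    switch (usage) {
    case USAGE_DEFAULT:
    case USAGE_IMMUTABLE:
        return cpuAccess == CPU_ACCESS_NONE;
    case USAGE_DYNAMIC:
        return cpuAccess == CPU_ACCESS_WRITE;
    case USAGE_STAGING:
        return cpuAccess != CPU_ACCESS_NONE &&
               (cpuAccess & ~(CPU_ACCESS_READ | CPU_ACCESS_WRITE)) == 0;
    }
    return false;
}
```

Under these rules, the thread's original combination (DEFAULT usage with CPU read access) is rejected, which is why CreateBuffer returned E_INVALIDARG.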
Is vertexCount == 3?

Your call:
AssetManager->CreateVertexBuffer(&vertexBuffer,sizeof(VertexType)*vertexCount, &vertices);

Your definition:
void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)

What are the default values?


ID3D11Buffer** buffer means pointer to pointer.
In the creation you have buffer pointer instead of *buffer pointer
HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, *buffer));

Firstly, it has already been solved. But to answer your questions. About the parameters: they are default values; he specified them a bit further up when I asked. About the pointer: if you look closely, buffer IS ID3D11Buffer**. It wouldn't have compiled otherwise.
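The pointer point is worth spelling out: passing `buffer` straight through (not `*buffer`) is correct, because CreateBuffer itself wants an address to write the new interface pointer into. A minimal sketch of that out-parameter pattern, where `FakeBuffer`, `CreateFakeBuffer`, and `HelperCreate` are hypothetical stand-ins for ID3D11Buffer, ID3D11Device::CreateBuffer, and the thread's helper:

```cpp
#include <cassert>

struct FakeBuffer { int id; };  // hypothetical stand-in for ID3D11Buffer

// Stand-in for ID3D11Device::CreateBuffer: it writes the address of a
// newly created object through the pointer-to-pointer out-parameter.
void CreateFakeBuffer(FakeBuffer** ppOut)
{
    *ppOut = new FakeBuffer{42};
}

// Mirrors the helper in the thread: it receives FakeBuffer** and
// forwards it unchanged -- no extra dereference is needed, because
// CreateFakeBuffer itself expects an address to write into.
void HelperCreate(FakeBuffer** buffer)
{
    CreateFakeBuffer(buffer);  // pass `buffer`, not `*buffer`
}
```

Dereferencing to `*buffer` at the call site would instead pass the (null) pointer's current value, which is exactly the mistake the quoted reply suggested introducing.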

No. Default usage means that only the GPU can read and write. That other parameter has something to do with staging. Can't say I've never used it.


This is the correct answer. The DX SDK clearly explains what combinations of parameters are valid, and CPU access read (the fact that you even want to do this sends more alarm bells, by the way) is not valid for default usage. MSDN page for D3D11_USAGE: http://msdn.microsoft.com/en-us/library/windows/desktop/ff476259%28v=vs.85%29.aspx

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Sorry, didn't see it was resolved; page refresh :(
Eeek, you're right

This topic is closed to new replies.
