ID3D11Device::CreateBuffer returning E_INVALIDARG for no apparent reason [RESOLVED]


I'm using a little helper function to create vertex buffers:
void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)
{
    D3D11_BUFFER_DESC bufferDesc;
    D3D11_SUBRESOURCE_DATA subResData;

    bufferDesc.Usage = bufferUsage;
    bufferDesc.ByteWidth = bufferSize;
    bufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    bufferDesc.CPUAccessFlags = processorAccess;
    bufferDesc.MiscFlags = 0;
    bufferDesc.StructureByteStride = 0;

    subResData.pSysMem = vertices;
    subResData.SysMemPitch = 0;
    subResData.SysMemSlicePitch = 0;

    HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, buffer));
}

It causes an E_INVALIDARG error, but doesn't give a clue about which argument is the invalid one. Here is how the function is called in the program:

vertexBuffer = 0;

VertexType* vertices = new VertexType[vertexCount];

vertices[0].position = XMFLOAT3(-1.0f, -1.0f, 0.0f); // just random values to see if it debugs at all
vertices[0].texture = XMFLOAT2(0.0f, 1.0f);

vertices[1].position = XMFLOAT3(0.0f, 1.0f, 0.0f);
vertices[1].texture = XMFLOAT2(0.5f, 0.0f);

vertices[2].position = XMFLOAT3(1.0f, -1.0f, 0.0f);
vertices[2].texture = XMFLOAT2(1.0f, 1.0f);

AssetManager->CreateVertexBuffer(&vertexBuffer, sizeof(VertexType)*vertexCount, &vertices);


Could the problem be in "bufferDesc.StructureByteStride = 0;"? Am I supposed to leave it at 0? All the tutorial project files leave it at 0 and they compile just fine. But the problem can't be in the project setup itself; I linked all the needed libraries and the paths to the SDK.

What are the default values for the bufferUsage and processorAccess parameters, since you don't specify them? D3D11_USAGE_DEFAULT and 0? Edited by SamiHuutoniemi


What are the default values for the bufferUsage and processorAccess parameters, since you don't specify them? 0 and 0?


They are
void CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void * vertices, D3D11_USAGE bufferUsage = D3D11_USAGE_DEFAULT, D3D11_CPU_ACCESS_FLAG processorAccess = D3D11_CPU_ACCESS_READ);

It should result in the same buffer creation as in a tutorial that used to work with no problems.


I always set it to 0 if I go with D3D11_USAGE_DEFAULT. I don't think you can use D3D11_CPU_ACCESS_READ for this.


lol, it actually worked, but what kind of access does 0 mean? Read/write?

Is vertexCount == 3?

Your call:
AssetManager->CreateVertexBuffer(&vertexBuffer, sizeof(VertexType)*vertexCount, &vertices);

Your definition:
void AssetManager::CreateVertexBuffer(ID3D11Buffer** buffer, unsigned int bufferSize, void* vertices, D3D11_USAGE bufferUsage, D3D11_CPU_ACCESS_FLAG processorAccess)

What are the default values?


ID3D11Buffer** buffer means a pointer to a pointer.
In the creation you pass the buffer pointer instead of the *buffer pointer:
HR(gD3DDevice->CreateBuffer(&bufferDesc, &subResData, *buffer));


Firstly, it has already been solved. But to answer your questions: about the parameters, they are default values; he specified them a bit further up when I asked. About the pointer: if you look closely, buffer IS ID3D11Buffer**. It wouldn't have compiled otherwise.


No. Default usage means that only the GPU can read and write. That other parameter has something to do with staging. Can't say; I've never used it.


This is the correct answer. The DX SDK clearly explains which combinations of parameters are valid, and CPU access read (the fact that you even want to do this raises more alarm bells, by the way) is not valid for default usage. MSDN page for D3D11_USAGE: http://msdn.microsoft.com/en-us/library/windows/desktop/ff476259%28v=vs.85%29.aspx
