
DirectX10 ID3D10Buffer, custom non-struct vertex buffer format



#1 messup000   Members   -  Reputation: 108
Posted 29 August 2012 - 10:32 PM

Hey gamedev,

First post, woohoo. I'm having a little trouble passing binary array data to the vertex buffer.
I am currently passing position and color to the vertex buffer (ID3D10Buffer) in what I had assumed was the correct format. Below are both the working struct version and my non-working byte version.

//create a basic vertex container
struct vPC
{
D3DXVECTOR3 pos;
D3DXVECTOR4 color;
vPC( D3DXVECTOR3 p, D3DXVECTOR4 c )
{
  pos = p;
  color = c;
}
};
//vertex buffer code (working struct version)
vPC* v = NULL;
pVertexBuffer->Map(D3D10_MAP_WRITE_DISCARD, 0, (void**) &v);
v[0] = vPC( D3DXVECTOR3(-1,-1,0), D3DXVECTOR4(1,0,0,1) );
v[1] = vPC( D3DXVECTOR3(0,1,0), D3DXVECTOR4(0,1,0,1) );
v[2] = vPC( D3DXVECTOR3(1,-1,0), D3DXVECTOR4(0,0,1,1) );
pVertexBuffer->Unmap();


My version of the raw byte code:
//returns the size in bytes of one element of the given format
int DXRenderer::FormatToByteNumber(DXGI_FORMAT f)
{
int byteNumber = 0;
switch(f)
{
  case DXGI_FORMAT_R32G32_FLOAT:
   byteNumber = 8;
   break;
  case DXGI_FORMAT_R32G32B32_FLOAT:
   byteNumber = 12;
   break;
  case DXGI_FORMAT_R32G32B32A32_FLOAT:
   byteNumber = 16;
   break;
  default:
   break; //unhandled formats return 0
}
return byteNumber;
}


//vertex buffer code (byte version)
char* byteOut = NULL;
float* vertArr = NULL;
int byteCount = 0;
int bytePosition = 0;

float pos[] =   {-1,-1, 0, 0,  0, 1, 0, 0,  1,-1, 0, 0};
float color[] = { 1, 0, 0, 1,  0, 1, 0, 1,  0, 0, 1, 1};

//map the buffer and write straight into the mapped pointer;
//assigning byteOut to a new heap array here would overwrite
//the mapped pointer, so the GPU buffer would never be filled
pVertexBuffer->Map(D3D10_MAP_WRITE_DISCARD, 0, (void**) &byteOut);

for(int i = 0; i < vertexCount; i++)
{
for(int j = 0; j < inputCount; j++)
{
  if(strcmp(layout[j].SemanticName,"POSITION") == 0)
  {
   vertArr = pos;
  }
  else if(strcmp(layout[j].SemanticName,"TEXCOORD") == 0)
  {
   vertArr = texture;
  }
  else if(strcmp(layout[j].SemanticName,"COLOR") == 0)
  {
   vertArr = color;
  }
  else if(strcmp(layout[j].SemanticName,"NORMAL") == 0)
  {
   vertArr = normal;
  }
  else if(strcmp(layout[j].SemanticName,"TRANSFORM") == 0)
  {
   vertArr = transform;
  }
  byteCount = FormatToByteNumber(layout[j].Format);

  //source arrays are padded to 4 floats per vertex, hence i*4
  memcpy(byteOut+bytePosition, &(vertArr[i*4]), byteCount);
  bytePosition += byteCount;
}
}
pVertexBuffer->Unmap();

Here's the draw code that runs afterwards. This works with the first example, which has the exact same vertex size (28 bytes) as the second version.
// Set primitive topology
pD3DDevice->IASetPrimitiveTopology( D3D10_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP );
//get technique desc
D3D10_TECHNIQUE_DESC techDesc;
pBasicTechnique->GetDesc( &techDesc );

for( UINT p = 0; p < techDesc.Passes; ++p )
{
  //apply technique
  pBasicTechnique->GetPassByIndex( p )->Apply( 0 );
  
  //draw
  pD3DDevice->Draw( numVertices, 0 );
}
pSwapChain->Present(0,0);

I've been reading through Frank Luna's DX10 book and haven't found a thing. Hopefully you can help me write a flexible buffer input rather than having to predefine the different shader structs before runtime.

Thank you so much,

Robert

Edited by messup000, 29 August 2012 - 10:33 PM.

