Quad not showing up

6 comments, last by 3DModelerMan 12 years ago
I'm trying to create a quad with this code:


SVertex vertices[] =
{
    SVertex(core::Vector3(-w, -h, 0.0f), 0.0f, 0.0f, core::Vector3(0,0,0)),
    SVertex(core::Vector3(-w, h, 0.0f), 1.0f, 0.0f, core::Vector3(0,0,0)),
    SVertex(core::Vector3(w, h, 0.0f), 0.0f, 1.0f, core::Vector3(0,0,0)),
    SVertex(core::Vector3(w, -h, 0.0f), 1.0f, 1.0f, core::Vector3(0,0,0))
};
const unsigned int numIndices = 6;
unsigned int indices[numIndices] =
{
    0, 1, 2,
    0, 2, 3
};


Then I just create the vertex buffer and index buffer normally. I set the primitive type to triangle list. w and h are just the size of the quad that I generate. But when I render this with DrawIndexed it only draws a triangle. Is there anything wrong with my vertex/index data?
Hi,

Can you show the actual drawing code?

Cheers!
Is it OK to use 4 vertices for 2 triangles in DX11?
Try giving coordinates for both triangles one after the other, and see what happens then.

Another thing: I'm too tired to look at it closely right now, but make sure the second triangle isn't facing the opposite direction (so that culling removes it).
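One way to rule out winding/culling is to bind a rasterizer state with culling disabled. A sketch, assuming a standard D3D11 setup (the helper name is illustrative, not from the poster's engine):

```cpp
#include <d3d11.h>

// Create a rasterizer state that draws both front- and back-facing
// triangles, so a wrongly wound triangle still shows up.
ID3D11RasterizerState* createNoCullState(ID3D11Device* device)
{
    D3D11_RASTERIZER_DESC desc = {};
    desc.FillMode        = D3D11_FILL_SOLID;
    desc.CullMode        = D3D11_CULL_NONE; // disable backface culling
    desc.DepthClipEnable = TRUE;

    ID3D11RasterizerState* state = nullptr;
    device->CreateRasterizerState(&desc, &state);
    return state;
}

// Usage: m_deviceContext->RSSetState(noCullState);
```

If the missing triangle appears with culling off, the winding order of the second triangle is the culprit.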
Off the top of my head...
What does your shader look like? Try turning off backface culling to see if the triangle is rendered backwards.

Do you have the DX debug layer on? If yes, do you get warnings? Errors?

And as DannyZB says, can you actually post all of the drawing code?
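Enabling the debug layer is just a flag at device creation. A sketch, assuming the usual D3D11CreateDevice path (names are illustrative):

```cpp
#include <d3d11.h>

// Create the device with the debug layer on, so the runtime reports
// warnings/errors (e.g. suspicious buffer sizes) in the debug output.
bool createDebugDevice(ID3D11Device** outDevice, ID3D11DeviceContext** outContext)
{
    UINT flags = 0;
#if defined(_DEBUG)
    flags |= D3D11_CREATE_DEVICE_DEBUG; // requires the SDK debug layers installed
#endif
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, flags,
        nullptr, 0,               // use default feature levels
        D3D11_SDK_VERSION,
        outDevice, nullptr, outContext);
    return SUCCEEDED(hr);
}
```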
"There will be major features. none to be thought of yet"
Sorry, I was sick and couldn't get on. I already tried disabling culling and that didn't work. I'll post the drawing code as soon as I can get to the computer it's on. DX debug isn't on. I know DirectX 11 supports drawing a quad with four vertices, because it supports indexed primitives and that's what they're for.
My shader is just the one from the MSDN draw-a-triangle tutorial. Here's my drawing code:


void CDX11Driver::drawMeshBuffer(IMeshBuffer* mb)
{
    CDX11MeshBuffer* dxMb = (CDX11MeshBuffer*)mb;
    m_deviceContext->IASetIndexBuffer(dxMb->m_indexBuffer, DXGI_FORMAT_R32_UINT, 0);
    m_deviceContext->IASetVertexBuffers( 0, 1, &dxMb->m_vertexBuffer, &dxMb->m_stride, &dxMb->m_offset );
    m_deviceContext->IASetPrimitiveTopology(dxMb->getPrimitiveTopology());
    setMaterial(mb->getMaterial());
    m_deviceContext->DrawIndexed(dxMb->getNumIndices(), 0, 0);
}

Shaders are set inside setMaterial(). I can't figure out why it's doing this; ordinary triangles draw just fine.
I just opened it up in PIX and it shows two primitives, but only three vertices. The first primitive is made up of the first three vertices, as I expected. The second one is just the last vertex repeated over and over. I initialize the vertex and index data as shown above, and then I create the buffers like this:


CDX11MeshBuffer::CDX11MeshBuffer(
    ID3D11Device* d3dDevice,
    const std::string& name,
    E_PRIMITIVE_TYPE primitiveType,
    IMaterial* defaultMat,
    bool isStatic,
    SVertex vertices[],
    unsigned int indices[],
    unsigned int numIndices)
    : IMeshBuffer(name, primitiveType, defaultMat, isStatic),
      m_stride(0), m_offset(0), m_numIndices(numIndices)
{
    //Create index buffer
    {
        core::DebugUtils::out << "Creating index buffer.\n";
        D3D11_BUFFER_DESC bufferDesc;
        ZeroMemory( &bufferDesc, sizeof(bufferDesc) );
        if ( m_isStatic )
            bufferDesc.Usage = D3D11_USAGE_IMMUTABLE; //Or a different static format if this is the wrong one.
        else
            bufferDesc.Usage = D3D11_USAGE_DEFAULT;
        bufferDesc.ByteWidth = sizeof( unsigned int ) * 3;
        bufferDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;
        bufferDesc.CPUAccessFlags = 0;
        bufferDesc.MiscFlags = 0;
        D3D11_SUBRESOURCE_DATA initData;
        initData.pSysMem = indices;
        initData.SysMemPitch = 0;
        initData.SysMemSlicePitch = 0;
        if ( FAILED(d3dDevice->CreateBuffer( &bufferDesc, &initData, &m_indexBuffer ) ) )
        {
            core::DebugUtils::out << "Error: Failed to create index buffer.\n";
            m_indexBuffer = 0;
        }
    }
    //Create vertex buffer
    {
        core::DebugUtils::out << "Creating vertex buffer.\n";
        D3D11_BUFFER_DESC bd;
        ZeroMemory( &bd, sizeof(bd) );
        if ( m_isStatic )
            bd.Usage = D3D11_USAGE_IMMUTABLE; //Or a different static format if this is the wrong one.
        else
            bd.Usage = D3D11_USAGE_DEFAULT;
        bd.ByteWidth = sizeof( SVertex ) * 3;
        bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
        bd.CPUAccessFlags = 0;
        bd.MiscFlags = 0;
        D3D11_SUBRESOURCE_DATA initData;
        ZeroMemory( &initData, sizeof(initData) );
        initData.pSysMem = vertices;
        if ( FAILED(d3dDevice->CreateBuffer( &bd, &initData, &m_vertexBuffer ) ) )
        {
            core::DebugUtils::out << "Error: Failed to create vertex buffer.\n";
            m_vertexBuffer = 0;
        }
    }
    //Calculate stride and offset
    m_stride = sizeof(SVertex);
    m_offset = 0;
}
Never mind. I finally figured out the problem: the ByteWidth for the buffers was wrong. I was sizing them for 3 elements instead of using the actual vertex and index counts.
