# OpenGL problem with texcoords (using multiple buffers)

## Recommended Posts

Hi, I am trying to render a mesh using two vertex buffers: one for the vertices, the other for the UV coordinates. http://www.jaapvanderwulp.nl/tech/doom3.jpg This is my code, but as you can see in the image, it does not work. Could it be that DirectX handles texture coordinates differently from OpenGL? The mesh does look correct in OpenGL. Or am I forgetting something? Here is the draw code:
```cpp
LPDIRECT3DVERTEXBUFFER9 Vertex_Buffer;
LPDIRECT3DVERTEXBUFFER9 Coord_Buffer;
LPDIRECT3DVERTEXDECLARATION9 VertexDeclaration = NULL;

for(uint i = 0; i < NumSubMeshes; i++)
{
    uint vertexsize = SubMeshes[i].VertexCount * sizeof(CVector3);
    uint coordsize  = SubMeshes[i].VertexCount * sizeof(CTexCoord);

    // Create the position buffer.
    if(FAILED(gEngine.pDirect3DDevice->CreateVertexBuffer(vertexsize, 0, 0,
        D3DPOOL_DEFAULT, &Vertex_Buffer, NULL)))
    {
        MessageBox(NULL, L"CreateVertexBuffer", L"Vertex_Buffer", MB_OK);
        return;
    }

    // Lock the buffer so we can write to it.
    CVector3 *Vertices;
    if(FAILED(Vertex_Buffer->Lock(0, vertexsize, (void**)&Vertices, 0)))
    {
        MessageBox(NULL, L"Lock", L"Vertex_Buffer", MB_OK);
        return;
    }

    // Copy the submesh's position data into the vertex buffer.
    memcpy(Vertices, SubMeshes[i].Vertexes, vertexsize);

    // Unlock when you're done copying data into the buffer.
    if(FAILED(Vertex_Buffer->Unlock()))
    {
        MessageBox(NULL, L"Unlock", L"Vertex_Buffer", MB_OK);
        return;
    }

    // Create the texcoord buffer.
    if(FAILED(gEngine.pDirect3DDevice->CreateVertexBuffer(coordsize, 0, 0,
        D3DPOOL_DEFAULT, &Coord_Buffer, NULL)))
    {
        MessageBox(NULL, L"CreateVertexBuffer", L"Coord_Buffer", MB_OK);
        return;
    }

    // Lock the buffer so we can write to it.
    CTexCoord *pCoords;
    if(FAILED(Coord_Buffer->Lock(0, coordsize, (void**)&pCoords, 0)))
    {
        MessageBox(NULL, L"Lock", L"Coord_Buffer", MB_OK);
        return;
    }

    // Copy the submesh's UV data into the buffer.
    memcpy(pCoords, SubMeshes[i].UVs, coordsize);

    // Unlock when you're done copying data into the buffer.
    if(FAILED(Coord_Buffer->Unlock()))
    {
        MessageBox(NULL, L"Unlock", L"Coord_Buffer", MB_OK);
        return;
    }

    // Declaration: positions from stream 0, texcoords from stream 1.
    D3DVERTEXELEMENT9 Declaration[] =
    {
        {0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
        {1, 0, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
        D3DDECL_END()
    };

    // Create the vertex declaration so DirectX knows how our vertex is made up.
    if(FAILED(gEngine.pDirect3DDevice->CreateVertexDeclaration(Declaration, &VertexDeclaration)))
    {
        MessageBox(NULL, L"CreateVertexDeclaration", L"", MB_OK);
        return;
    }

    if(!VertexDeclaration)
    {
        MessageBox(NULL, L"!VertexDeclaration", L"", MB_OK);
        return;
    }

    // Set the stream sources.
    if(FAILED(gEngine.pDirect3DDevice->SetStreamSource(0, Vertex_Buffer, 0, sizeof(CVector3))))
    {
        MessageBox(NULL, L"SetStreamSource", L"Vertex_Buffer", MB_OK);
        return;
    }

    if(FAILED(gEngine.pDirect3DDevice->SetStreamSource(1, Coord_Buffer, 0, sizeof(CTexCoord))))
    {
        MessageBox(NULL, L"SetStreamSource", L"Coord_Buffer", MB_OK);
        return;
    }

    // Set the vertex declaration.
    if(FAILED(gEngine.pDirect3DDevice->SetVertexDeclaration(VertexDeclaration)))
    {
        MessageBox(NULL, L"SetVertexDeclaration", L"", MB_OK);
        return;
    }

    HRESULT r = gEngine.pDirect3DDevice->DrawPrimitive(D3DPT_TRIANGLELIST, 0, SubMeshes[i].VertexCount / 3);
    if(FAILED(r))
    {
        wchar_t t[512];
        wsprintfW(t, L"Error 0x%08X", r);
        MessageBox(NULL, t, L"DrawPrimitive", MB_OK);
        return;
    }

    VertexDeclaration->Release();
    Vertex_Buffer->Release();
    Coord_Buffer->Release();
}
```
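Since the mesh textures correctly in OpenGL but not here, one common culprit worth ruling out is the flipped V axis: OpenGL's texture origin is the bottom-left corner while Direct3D's is the top-left, so UVs authored for one API often need `v' = 1 - v` for the other. A minimal sketch of that flip, assuming a `CTexCoord` with plain `u`/`v` floats like the one in the post:

```cpp
#include <cassert>
#include <vector>

// Assumed layout of the post's CTexCoord: two floats, u then v.
struct CTexCoord { float u, v; };

// Flip the V axis of every UV in place. OpenGL's texture origin is
// bottom-left, Direct3D 9's is top-left, so v maps to 1 - v between them.
void FlipV(std::vector<CTexCoord>& uvs)
{
    for (CTexCoord& t : uvs)
        t.v = 1.0f - t.v;
}
```

This would be applied once when loading the mesh (before the `memcpy` into `Coord_Buffer`), not every frame.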



##### Share on other sites
I can't see anything obviously wrong with the code you've posted. Are you getting any debug warnings while running your app? It's quite possible a state is invalid but the HRESULTs won't return an error.

Have you checked against the reference rasterizer? It shouldn't be necessary, but it can't hurt to verify that it's not an odd characteristic of your hardware.

Have you done a frame capture and analyzed it in PIX? You should be able to dig down to the pipeline configuration and see how it's assembling the data it actually renders.

Also, have you double-checked against the MultiStreamRendering Sample? Ignore the D3D10 stuff...

hth
Jack

##### Share on other sites
Well, I just found out that it isn't because I'm using multiple buffers.
I put everything into one buffer and the result is still the same.
On to the harder part.
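For reference, the usual single-buffer route is an interleaved vertex read entirely from stream 0, with the declaration's offset field picking out each attribute. A minimal sketch of such a layout (the struct name and field order are assumptions, not the poster's actual types; the matching `D3DVERTEXELEMENT9` entries are shown in the comments since they need the D3D9 headers):

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical interleaved vertex: position followed by one UV set.
// A matching declaration would read both attributes from stream 0:
//   {0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
//   {0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
//   D3DDECL_END()
struct InterleavedVertex
{
    float x, y, z; // position, bytes 0..11
    float u, v;    // texcoord, bytes 12..19
};

// The stride passed to SetStreamSource(0, ...) would be sizeof(InterleavedVertex).
```

The offsets in the declaration must match the struct's actual byte layout, which is why checking them with `offsetof` is worthwhile.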
