Hi, I'm currently working on an abstraction layer between Direct3D9 and OpenGL 2.1.
Basically there is a virtual class (the rendering interface) that can be implemented by a D3D9 or a GL21 class. The problem I have is that I don't know how to store and send the vertices. D3D9 provides FVF, where every vertex contains position, texture UV and color, with the color packed into a single DWORD. OpenGL handles the individual vertex elements in different locations in memory (an array for positions, an array for texture UVs and an array for colors, in vec4 float format).

Basically I want an API-independent way to create vertices: the application should store them in a block of memory, then send them all at once to the render interface, but how? I know that OpenGL can handle vertices in a D3D-compatible interleaved layout (with glVertexAttribPointer, managing the stride and pointer values), which seems to be slower than the native way, but how do I manage, for example, the colors? D3D9 accepts a single 32-bit value in integer format, while OpenGL manages it in a more dynamic (but heavier) way, storing each color channel as a 32-bit floating-point value. In this case I could pass the 32-bit integer value to the vertex shader and then convert it into a vec4 there, right? But all these operations seem too heavy.

In the future I want this abstraction layer to be good enough to implement other rendering backends like Direct3D 11, OpenGL ES 2.0 and others. So my main question is: is there a nice way to abstract these two rendering APIs without changing the rest of the code that uses the rendering interface?