Thank you for the input,

I have since investigated the problem further, but first a heads up: my matrices are already in row-major order. I've explicitly set that in DirectX 11, and OpenGL seems to work with it too. Furthermore, the multiplication order still has to be reversed for some reason, otherwise I get a funny-looking vertex soup. I believe this is because I multiply my view-projection matrix with the world matrix in the shader:

out.vPos = mul(in.vPos, mul(mWorld, mViewProj));
// translates to this in GLSL:
out.vPos = (mViewProj * mWorld) * in.vPos;

I really don't have a clue why this is necessary, though, since I'm using row-major ordering in both cases. Any idea what the reason behind this is?

I'd really avoid having this difference if I were you. There's no reason you have to use one multiplication order in D3D and the opposite in GL. It will be a lot easier if you just use the same conventions in both APIs.

When multiplying a 4x4 matrix and a 4-component vector, you're actually treating the vector as a matrix. You've got a choice whether to treat it as a 4x1 or a 1x4 matrix. When multiplying matrices, the inner dimensions have to match (the number of columns of the left-hand side must equal the number of rows of the right-hand side).

This means that if you treat the vector as a 4x1 matrix (4 rows, 1 column, AKA a "column vector"), it must appear on the right of the matrix, and if you treat it as a 1x4 matrix (1 row, 4 columns, AKA a "row vector"), it must appear on the left of the matrix.

i.e.

4x4 * 4x1 < valid

1x4 * 4x4 < valid

4x4 * 1x4 < invalid / can't be done

4x1 * 4x4 < invalid / can't be done

GLSL and HLSL do this automatically: if you put your vector on the right of a matrix, you're using the column-vector convention; if you put your vector on the left of the matrix, you're using the row-vector convention.

So, as you can see from your code, you're using column-vectors in GL and row-vectors in D3D.

This is a huge problem, because the correct way to construct a transformation matrix differs depending on this convention.

If you're using the column-vector convention, your matrices look like this:

|xx yx zx tx|   |vx|
|xy yy zy ty| * |vy|
|xz yz zz tz|   |vz|
| 0  0  0  1|   | 1|

And if you're using the row-vector convention, your matrices look like this:

               |xx xy xz 0|
|vx vy vz 1| * |yx yy yz 0|
               |zx zy zz 0|
               |tx ty tz 1|

i.e. if you're using the row-vector convention, you store your basis vectors and translation in the rows of the matrix. If you're using the column-vector convention, you store your basis vectors in the columns of the matrix.

Each convention also uses the opposite matrix concatenation order from the other.

Row-vector convention uses result = vector * local * global, column-vector convention uses result = global * local * vector.

Are you using the same math library across both APIs, or are you, for example, using D3DX for D3D and GLM for GL?

If you're using two different libraries, it's possible that each of them is constructing matrices using the opposite convention to the other.

A column-vector convention matrix, stored in column-major order looks like:

xx, xy, xz, 0, yx, yy, yz, 0, zx, zy, zz, 0, tx, ty, tz, 1

A column-vector convention matrix, stored in row-major order looks like:

xx, yx, zx, tx, xy, yy, zy, ty, xz, yz, zz, tz, 0, 0, 0, 1

A row-vector convention matrix, stored in column-major order looks like:

xx, yx, zx, tx, xy, yy, zy, ty, xz, yz, zz, tz, 0, 0, 0, 1

A row-vector convention matrix, stored in row-major order looks like:

xx, xy, xz, 0, yx, yy, yz, 0, zx, zy, zz, 0, tx, ty, tz, 1

So as well as making sure you're using row-major order in both D3D and GL (or column-major in both), you also need to make sure that both of your math libraries follow the column-vector convention, or that both follow the row-vector convention. If you do that, then the shader code and matrix multiplication order will be exactly the same across both APIs.