No, I know about the SVD. It's a
type of complete orthogonal factorization, but it's expensive to compute. And you don't really need it if all you want is the minimum norm solution.
See, for example, this note in the LAPACK manual. It suggests that you can find the solution by computing (converting the notation):
A*P = Q1 * R2^T * Q2^T
x = P * Q2 * R2^-T * Q1^T * b
That is, you end up inverting the triangular matrix. But I'm not sure that you actually need the inverse explicitly; forward/backward substitution against R2^T should work just fine, too. Either way, you need to make sure that R2 is actually invertible, so that R2^-T exists at all.
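Here's a minimal NumPy/SciPy sketch of that recipe, just to convince myself it works. SciPy doesn't expose the RZ factorization (xTZRZF) directly, so I'm emulating that step with a QR of the transposed leading block of R; the rank-2 test matrix and the tolerance are just illustration, not anything from LAPACK itself. Note the triangular solve with trans='T' in place of forming R2^-T:

```python
import numpy as np
from scipy.linalg import qr, solve_triangular

rng = np.random.default_rng(0)
# hypothetical rank-deficient 5x4 test matrix of rank 2
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))
b = rng.standard_normal(5)

# Step 1: QR with column pivoting, A[:, piv] = Q1 * R
Q1, R, piv = qr(A, pivoting=True)
tol = max(A.shape) * np.finfo(float).eps * abs(R[0, 0])
k = int((np.abs(np.diag(R)) > tol).sum())   # numerical rank

# Step 2: RZ step emulated via QR of the transpose:
# R[:k, :]^T = Q2 * R2  =>  R[:k, :] = R2^T * Q2^T
Q2, R2 = qr(R[:k, :].T, mode='economic')

# Step 3: x = P * Q2 * R2^-T * Q1^T * b,
# using a triangular solve instead of an explicit inverse
c = Q1.T @ b
y = solve_triangular(R2, c[:k], trans='T')  # solves R2^T y = c[:k]
x = np.zeros(A.shape[1])
x[piv] = Q2 @ y                             # apply P to undo the pivoting

# compare with the SVD-based minimum-norm solution
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_ref))
```

Since R[:k, :] has full row rank k after the pivoted QR, R2 comes out invertible, so the triangular solve is well defined here.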
In general, the triangular matrix obtained from a complete orthogonal factorization isn't invertible. See, for example:
[ 1 0 ]   [ 0 0 ]   [ 1 0 ]
[ 0 1 ] * [ 1 0 ] * [ 0 1 ]
But I think that because a QR factorization was done, maybe the orthogonal matrices are such that the triangular matrix will always be either invertible or diagonal.
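For what it's worth, a quick numerical check is consistent with that: with column pivoting, the rank deficiency gets pushed to the trailing diagonal entries of R, so the leading k-by-k block stays invertible, and that block is what the later step turns into the square triangular factor. A small sketch with a hypothetical rank-1 matrix (not a proof, just a sanity check):

```python
import numpy as np
from scipy.linalg import qr

# hypothetical rank-1 3x3 example
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])

Q, R, piv = qr(A, pivoting=True)
d = np.abs(np.diag(R))
# first diagonal entry is large, the rest collapse to roundoff
print(d[0] > 1e-8, d[1] < 1e-8, d[2] < 1e-8)
```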
I'm just not sure how to prove it to myself, or what special cases I'd need to consider. I could look at the LAPACK source, but I have a hard time working my way through the logic of linear algebra in source-code form, especially with the way I think LAPACK packs matrices together.