I know you said this will run on the GPU, but out of curiosity (if nothing else) I sat down to look for an analytical solution to this problem, under some definition of an optimal solution. Whether it is feasible for you depends on the constraints on your solution and implementation, but I did arrive at an analytical solution, so do with it what you want.
- Let v_n be the n-th vector in your initial set of vectors.
- Let V = [v_1, v_2, v_3 ... v_n] be the matrix whose columns are the vectors v_n.
- Let p be the final vector, which should be perpendicular to all vectors v_n.
Now, define the problem as finding p such that p is as perpendicular as possible to all vectors v_n. The definition of an optimal solution here is that the sum of the squared projections of p onto the vectors v_n is as small as possible. The trivial solution to this is p = 0, so constrain p to be a unit vector. The problem is then stated as:
- minimize norm(V^T*p)^2 over p, subject to norm(p)^2 = 1, or equivalently
- minimize p^T*V*V^T*p over p, subject to p^T*p = 1.
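A quick numerical sketch (using NumPy, with made-up example vectors) showing that the two statements of the objective above are the same quantity:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((3, 5))    # columns are five arbitrary 3-D vectors
p = np.array([0.0, 0.0, 1.0])      # some unit vector

a = np.linalg.norm(V.T @ p) ** 2   # norm(V^T*p)^2: sum of squared projections
b = p @ V @ V.T @ p                # p^T*V*V^T*p
print(np.isclose(a, b))            # the two forms agree
```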
This is a standard constrained quadratic optimization problem. Introduce a Lagrange multiplier λ to turn the constrained optimization into an unconstrained one:
- minimize p^T*V*V^T*p - λ*(p^T*p - 1) over p.
Solve the optimization by setting the partial derivative of the function to be minimized, with respect to p, equal to zero (the derivative is 2*V*V^T*p - 2*λ*p; the factors of two cancel):
- V*V^T*p_opt - λ*p_opt = 0
- V*V^T*p_opt = λ*p_opt
This is a standard eigenvector problem. At an eigenvector, the objective p^T*V*V^T*p equals the eigenvalue λ, so the optimal solution p_opt is the eigenvector of V*V^T with the minimum eigenvalue.
The product V*V^T is a 3x3 matrix, assuming your initial vectors v_n are 3-dimensional, so you "only" need to find the eigenvector for the smallest eigenvalue of a (relatively speaking) quite small matrix. Furthermore, V*V^T is fairly trivial to compute: V*V^T = sum(v_n*v_n^T) over all n; that is, the sum of the outer products of all the vectors.
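The whole derivation can be sketched in a few lines of NumPy (assuming 3-D input vectors; the function name is mine, not anything standard):

```python
import numpy as np

def most_perpendicular(vectors):
    """Return the unit vector minimizing the sum of squared projections
    onto the given vectors: the eigenvector of V*V^T with the smallest
    eigenvalue."""
    V = np.column_stack(vectors)          # 3 x n matrix [v_1 v_2 ... v_n]
    M = V @ V.T                           # 3x3; equals sum(v_n*v_n^T) over n
    eigvals, eigvecs = np.linalg.eigh(M)  # eigh: symmetric input, ascending eigenvalues
    return eigvecs[:, 0]                  # eigenvector of the smallest eigenvalue

vs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
p = most_perpendicular(vs)
# p is ±[0, 0, 1]: exactly perpendicular to both inputs
```

`np.linalg.eigh` is the right tool here because V*V^T is symmetric, and it returns eigenvalues in ascending order, so the first eigenvector is the one you want.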
Keep in mind that the optimal solution is not unique: if p_opt is a solution, then its negation -p_opt is an equally optimal solution. This is not a problem with my solution, but with your problem. Remember that the cross product is not commutative, so AxB != BxA; in fact, AxB = -BxA. The operands are order-sensitive, but your set of vectors is inherently unordered. Unless you have additional constraints that indicate in which direction you want the final perpendicular vector to point, you have to deal with this ambiguity.
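To make the sign ambiguity concrete, here is a small check (NumPy, with hypothetical example vectors) that both p_opt and -p_opt achieve exactly the same objective value:

```python
import numpy as np

V = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
M = V @ V.T                                 # 3x3 matrix V*V^T
_, eigvecs = np.linalg.eigh(M)
p_opt = eigvecs[:, 0]                       # eigenvector of the smallest eigenvalue

obj = lambda q: q @ M @ q                   # the objective p^T*V*V^T*p
print(np.isclose(obj(p_opt), obj(-p_opt)))  # True: both directions are optimal
```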