Fitting a plane to a 3D point set

The question raised previously on this forum was:

Quote: "I have a set of points which are almost all in the same plane. How do I find the plane which best fits these points? I thought of a least squares method for planes. In theory it seems possible, but I haven't found any websites that explain how to do it. I tried to derive it myself but got stuck very quickly. Any ideas? If not, I will have to resort to taking cross-products of my points and averaging them."

A non-iterative method is now available: direct and exact orthogonal regression in 3D. The set of formulas is provided in an e-paper at the link:
http://www.scribd.com/people/documents/10794575-jjacquelin
Then select the paper "Regressions et trajectoires en 3D" ("Regressions and trajectories in 3D").
Some of the derivations are in French, but you may go straight to the practical formulas on page 18.
Another set of formulas is provided on page 7, useful in the particular case of orthogonal fitting of a straight line in 3D.
1. Calculate the average of all input points (that's the point your plane will pass through).
2. Calculate the covariance matrix of the points minus the average.
3. Find the two eigenvectors of that matrix with the largest eigenvalues.
4. Calculate the cross product of those two vectors and you get the normal of your plane (a sketch in code follows below).
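For reference, here is a minimal sketch of those four steps in Python with NumPy. The function name fit_plane_pca is my own, and I'm relying on numpy.linalg.eigh returning eigenvalues in ascending order; treat it as an illustration of the recipe above rather than anyone's reference implementation.

import numpy as np

def fit_plane_pca(points):
    # points: (n, 3) array-like of 3D points, n >= 3
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)                # 1. average of all input points
    centered = pts - centroid                  # 2. points minus average...
    cov = centered.T @ centered / len(pts)     #    ...and their covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues sorted in ascending order
    v1, v2 = eigvecs[:, 2], eigvecs[:, 1]      # 3. two largest eigenvectors
    normal = np.cross(v1, v2)                  # 4. cross product gives the plane normal
    return centroid, normal / np.linalg.norm(normal)

The plane passes through the centroid with that unit normal; equivalently, the normal is the eigenvector belonging to the smallest eigenvalue.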
You know that each point satisfies the equation

a x + b y + c z + d = 0


We have n (at least 3) points (x1, y1, z1), (x2, y2, z2), ..., (xn, yn, zn).
This means we have at least 3 homogeneous linear equations in the 4 unknowns (a, b, c, d):

a x1 + b y1 + c z1 + d = 0
a x2 + b y2 + c z2 + d = 0
...
a xn + b yn + c zn + d = 0


This can be set up as a matrix system

|x1 y1 z1 1| |a|   |0|
|x2 y2 z2 1| |b|   |0|
|x3 y3 z3 1| |c| = |0|
|    ...   | |d|   |.|
|xn yn zn 1|       |0|


If the matrix of homogeneous points is X, and the vector of unknowns is p, then the solution to the equation Xp = 0 is given by the null space of X. Computing this null space in a least-squares sense can be done with the Singular Value Decomposition: take the SVD of X to produce U, S, V. Then the last column of V (assuming an implementation that sorts the singular values in decreasing order, which they all do) will be the value of (a, b, c, d).
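A hedged sketch of that recipe with NumPy (fit_plane_svd is a name I made up; numpy.linalg.svd returns V transposed with singular values in decreasing order, so the last row of Vt is the wanted column of V):

import numpy as np

def fit_plane_svd(points):
    # Stack the points into the n x 4 matrix X of homogeneous rows [x y z 1].
    pts = np.asarray(points, dtype=float)
    X = np.hstack([pts, np.ones((len(pts), 1))])
    # The least-squares "null space" of X p = 0 is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(X)
    a, b, c, d = Vt[-1]
    scale = np.linalg.norm((a, b, c))          # rescale so (a, b, c) is a unit normal
    return np.array([a, b, c, d]) / scale

Usage: a, b, c, d = fit_plane_svd(pts) gives the plane a*x + b*y + c*z + d = 0 with (a, b, c) of unit length.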
Hello Steve132,

Quote: "You know that each point satisfies the equation
a x + b y + c z + d = 0 "

No, each point doesn't satisfy the equation exactly, because the points aren't located precisely on the plane, only close to it.

Hello biki_,

Quote:
"1. Calculate the average of all input points (that's the point your plane will pass through).
2. Calculate the covariance matrix of the points minus the average.
3. Find the two eigenvectors of that matrix with the largest eigenvalues.
4. Calculate the cross product of those two vectors and you get the normal of your plane."

Right. This is a well-known method, generally sufficient in practice. But the least-squares fit isn't correctly achieved if what one wants to minimize are the ORTHOGONAL distances between the points and the plane.
Having a bunch of points that you want to fit to a plane boils down to solving a system of linear equations. The system is both overdetermined (more points than needed) and inconsistent: no plane passes exactly through all the points.
This is a very common problem in real-life scenarios in which values are subject to a degree of measurement error.

Google "Algebraic Reconstruction Techniques". The Wiki entry focuses on CT scanner machines but the method can be applied to practically anything. I've got a text book (Elementary Linear Algebra) that provides a method for approximating the solution using successive orthogonal projections, which should solve the plane fitting problem.
Hello taz0010,

Quote:
"Google "Algebraic Reconstruction Techniques". The Wikipedia entry focuses on CT scanners, but the method can be applied to practically anything. I've got a textbook (Elementary Linear Algebra) that provides a method for approximating the solution using successive orthogonal projections, which should solve the plane fitting problem."

Fine, but all these methods are iterative, leading to approximations of the solution (as accurate as we want, of course).
On the other hand, the formulas provided on page 18 of the paper already cited ("Regressions et trajectoires en 3D"), link:
http://www.scribd.com/people/documents/10794575-jjacquelin
are exact formulas (i.e. an exact analytical solution of the orthogonal fitting problem) and the computation isn't iterative: the formulas directly give the definitive result in a single pass.

The document does not display, at least in Opera, and downloading it is not possible without registering at sites I'm unwilling to register with. Can you provide the document at a more accessible location?

Anyway, have you compared your algorithm with the established iterative processes? How does the runtime compare? How does the quality of the solution compare? How does this change for different sizes of data sets (from hundreds to millions of points)? How is your algorithm affected by outliers?
Least Square Method

