Point cloud registration
In this post, I mainly address the least-squares optimization problem in point cloud registration:

$$\min_{R,\,t} \; \sum_{i=1}^{N} \left\| (R\,p_i + t) - q_i \right\|^2$$

where $(p_i, q_i)$ are the correspondence point pairs in the two point cloud pieces, and $R$, $t$ are the rotation and translation components of the rigid transformation matrix $T$.
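As a concrete setup (a minimal numpy sketch of my own; the cloud size, `R_true`, and `t_true` are arbitrary illustrative choices), we can generate correspondence pairs from a known rigid transform and then try to recover it:

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((100, 3))       # source points p_i

theta = np.pi / 6                       # ground-truth rotation: 30 degrees about z
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.3, 1.0])     # ground-truth translation

Q = P @ R_true.T + t_true               # target points q_i = R p_i + t
# Registration: recover (R, t) from the correspondence pairs (p_i, q_i).
```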
Generally, there are two approaches to solving this problem: a closed-form solution and an iterative solution (both have been widely used in various non-linear optimization problems).
I briefly summarize some important points here. Please refer to here for the details of the closed-form derivation, and here for the iterative solution.
Closed-form solution
The optimal translation is

$$t^{*} = \bar{q} - R\,\bar{p}$$

where $\bar{p} = \frac{1}{N}\sum_{i} p_i$ and $\bar{q} = \frac{1}{N}\sum_{i} q_i$ are the means of the correspondence points, and $R$ is the rotation matrix.
The optimal rotation is

$$R^{*} = U V^{\top}$$

where $U$ and $V$ are the two SVD component results of the matrix

$$W = \sum_{i=1}^{N} (q_i - \bar{q})(p_i - \bar{p})^{\top} = U \Sigma V^{\top},$$

which is composed from the normalized (centered) correspondence points (see the derivation for the details).
Note that the SVD here is not applied to solve a least-squares problem (the least squares here is different from the conventional format; check my previous post in the linear algebra section).
Besides, the solution may not be a true rotation matrix, because the derivation process solves for the optimal orthogonal matrix. Hence, $R^{*}$ may be a reflection matrix (the determinant of $R^{*}$ is $-1$). The determinant of an orthogonal matrix equals either $1$ or $-1$ (this can be proved from $R^{\top}R = I$ and $\det(R^{\top})\det(R) = 1$), but $\det(\text{rotation matrix}) = 1$ and $\det(\text{reflection matrix}) = -1$. In this case, the rotation matrix should be modified as below, so that we can guarantee $\det(R^{*}) = 1$ (the reasons and details are in the derivation link above):

$$R^{*} = U \begin{bmatrix} 1 & & \\ & 1 & \\ & & \det(U V^{\top}) \end{bmatrix} V^{\top}$$
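Putting the closed-form recipe together, here is a minimal numpy sketch (my own illustration; the function name is hypothetical, and the reflection guard via the $\det(UV^{\top})$ diagonal entry follows the modification above):

```python
import numpy as np

def closed_form_registration(P, Q):
    """Closed-form solution of min_{R,t} sum_i ||R p_i + t - q_i||^2 via SVD.

    P, Q: (N, 3) arrays of corresponding points p_i, q_i. Returns (R, t).
    """
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)   # means of the correspondences
    W = (Q - q_bar).T @ (P - p_bar)                 # sum_i (q_i - q_bar)(p_i - p_bar)^T
    U, _, Vt = np.linalg.svd(W)                     # W = U S V^T
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against a reflection
    R = U @ D @ Vt                                  # det(R) = +1 guaranteed
    t = q_bar - R @ p_bar                           # optimal translation from the means
    return R, t
```

On the synthetic pair above, `closed_form_registration(P, Q)` recovers `R_true` and `t_true` up to floating-point error.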
Iterative solution (Gauss-Newton method)
There is another solution, which takes a more flexible and general form.
We represent the rigid transformation matrix as a vector $\xi$ that collates the rotation and translation components:

$$\xi = [\omega^{\top}, t^{\top}]^{\top} = [\omega_x, \omega_y, \omega_z, t_x, t_y, t_z]^{\top}$$

Suppose we have $N$ pairs of correspondence points. We define the error function

$$e_i(\xi) = T\,\tilde{p}_i - \tilde{q}_i$$

where $\tilde{p}_i$ and $\tilde{q}_i$ are homogeneous coordinates. Note that we discard the last element of $e_i$ after the calculation (it is always $1 - 1 = 0$), and $\xi$ is included in $T$. The original objective is

$$\min_{\xi} F(\xi) = \min_{\xi} \frac{1}{2} \sum_{i=1}^{N} \left\| e_i(\xi) \right\|^2$$
Given a tiny disturbance $\Delta\xi$, apply a Taylor expansion:

$$e_i(\xi + \Delta\xi) \approx e_i(\xi) + J_i\,\Delta\xi$$

where $J_i = \frac{\partial e_i}{\partial \xi}$ is the Jacobian matrix (the first-order derivative is called the Jacobian, and the second-order derivative is called the Hessian).
Our target now is to find the best iteration direction $\Delta\xi$, which minimizes $F(\xi + \Delta\xi)$. In other words, we have

$$\Delta\xi^{*} = \arg\min_{\Delta\xi} \frac{1}{2} \sum_{i=1}^{N} \left\| e_i(\xi) + J_i\,\Delta\xi \right\|^2$$

Calculate the derivative w.r.t. $\Delta\xi$, and set the derivative equal to $0$. We have

$$\Big( \sum_{i=1}^{N} J_i^{\top} J_i \Big)\,\Delta\xi = -\sum_{i=1}^{N} J_i^{\top} e_i(\xi)$$

which is often abbreviated as $H\,\Delta\xi = g$ with $H = \sum_i J_i^{\top} J_i$ and $g = -\sum_i J_i^{\top} e_i$. Since $\xi$ is an initial guess, we can calculate $\Delta\xi$ (i.e., this equation tells us how to update $\xi$).
Note that $H$ is an (approximate) Hessian matrix too. Pose graph optimization also uses this method for optimization.
Note that we cannot directly add $\Delta\xi$ to update $\xi$ because of the rotation component. One alternative way is to update as below (because $\Delta\omega$ is tiny, we have $\sin\theta \approx \theta$, $\cos\theta \approx 1$):

$$R \leftarrow R\,(I + [\Delta\omega]_{\times}), \qquad t \leftarrow t + \Delta t$$

where $[\Delta\omega]_{\times}$ is the skew-symmetric matrix of $\Delta\omega$.
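Below is a minimal Gauss-Newton sketch in numpy (my own illustration; it assumes the right-perturbation convention $R \leftarrow R(I + [\Delta\omega]_{\times})$ used above, works on inhomogeneous 3-vectors rather than the homogeneous form, and the fixed iteration count is arbitrary):

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix [v]_x, so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def gauss_newton_registration(P, Q, iters=10):
    """Gauss-Newton sketch for min_xi sum_i ||R p_i + t - q_i||^2."""
    R, t = np.eye(3), np.zeros(3)                     # initial guess
    for _ in range(iters):
        H = np.zeros((6, 6))                          # H = sum_i J_i^T J_i
        b = np.zeros(6)                               # b = sum_i J_i^T e_i
        for p, q in zip(P, Q):
            e = R @ p + t - q                         # residual e_i
            J = np.hstack([-R @ skew(p), np.eye(3)])  # d e_i / d [dw, dt]
            H += J.T @ J
            b += J.T @ e
        dxi = np.linalg.solve(H, -b)                  # normal equations: H dxi = -b
        dw, dt = dxi[:3], dxi[3:]
        R = R @ (np.eye(3) + skew(dw))                # small-angle update R <- R (I + [dw]_x)
        U, _, Vt = np.linalg.svd(R)                   # re-project onto SO(3)
        R = U @ Vt
        t = t + dt
    return R, t
```

In practice one would stop when $\|\Delta\xi\|$ falls below a tolerance instead of running a fixed number of iterations.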
Iterative solution (Levenberg-Marquardt method)
LM is similar to the Gauss-Newton method, but it controls a maximum bound for the update step by solving the damped system $(H + \lambda I)\,\Delta\xi = g$ instead (a larger $\lambda$ yields a smaller step). Therefore, it is usually more robust than the Gauss-Newton method.
Please refer to here for the detailed introduction.
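As a sketch of how the damping interacts with the Gauss-Newton machinery (my own illustration; `build_normal_equations`, `error_after`, and `apply_update` are hypothetical helpers standing in for the loop body above):

```python
import numpy as np

def lm_step(H, g, lam):
    """One Levenberg-Marquardt step: solve (H + lam * I) dxi = g."""
    return np.linalg.solve(H + lam * np.eye(H.shape[0]), g)

# A common lambda schedule (variants differ across sources):
#   lam = 1e-3
#   while not converged:
#       H, g, err = build_normal_equations(R, t)   # as in the Gauss-Newton loop
#       dxi = lm_step(H, g, lam)
#       if error_after(R, t, dxi) < err:
#           R, t = apply_update(R, t, dxi)         # step accepted
#           lam /= 10                              # behave more like Gauss-Newton
#       else:
#           lam *= 10                              # shrink the step, more like gradient descent
```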
An extension: Lagrange Multiplier Method
The Lagrange multiplier method has been widely used in various constrained optimization problems.
This method has a very intuitive geometric explanation of why it works (the tangent lines at the optimal point must be parallel). Check HERE for the details.
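To make the parallel-tangent picture concrete, here is a tiny worked example of my own (not taken from the linked page): minimize $f(x, y) = x^2 + y^2$ subject to $g(x, y) = x + y - 1 = 0$. The Lagrangian is

$$\mathcal{L}(x, y, \lambda) = x^2 + y^2 + \lambda\,(x + y - 1)$$

Setting $\nabla \mathcal{L} = 0$ gives $2x + \lambda = 0$, $2y + \lambda = 0$, and $x + y = 1$, hence $x = y = \tfrac{1}{2}$ with $\lambda = -1$. At this point $\nabla f = (1, 1)$ is parallel to $\nabla g = (1, 1)$: the level circle of $f$ is tangent to the constraint line, exactly the geometric condition above.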