Suppose you have a huge number of similar objects in world space. Ideally these objects would be represented as instances, but their meshes are stored in world space as unrelated objects. The meshes are known to have identical geometry, topology and material assignment. It is also given that the unknown transformation applied in creating each mesh is affine in nature (i.e. some combination of scale, rotation, shearing and translation). Given two such meshes in world space, how can one recover the transformation that maps one mesh onto the other?

Let the vertices of mesh $M_0$ be $a_0, a_1, a_2, \ldots, a_{n-1}$ and the vertices of mesh $M_1$ be $b_0, b_1, b_2, \ldots, b_{n-1}$. Because the meshes share topology, vertex $i$ of $M_0$ corresponds to vertex $i$ of $M_1$. Each $a_i$ or $b_i$ is a $1 \times 4$ row vector holding the $x$, $y$, $z$ and $w$ components of the corresponding vertex. The $w$ component is always $1$.

We have

$$b_i = a_i \times T, \quad i = 0 \text{ to } n-1, \tag{1}$$

where $T$ is an unknown $4 \times 4$ matrix. Please consider $T$ as made up of four $4 \times 1$ column matrices, $T_x$, $T_y$, $T_z$ and $T_w$.

Now make a big $n \times 1$ matrix $B_x$, where the $i^{th}$ row of $B_x$ is $b_i.x$; build $B_y$, $B_z$ and $B_w$ the same way from the $y$, $z$ and $w$ components. Also make a big $n \times 4$ matrix $A$, where the $i^{th}$ row of $A$ is $a_i$. Then $B_x = A \times T_x$, and similarly for the other columns of $T$.

$B_x$, $B_y$, $B_z$, $B_w$ and $A$ are known. $T_x$, $T_y$, $T_z$ and $T_w$ are unknown.

Since $n$, the number of vertices, is usually large compared to the other matrix dimension, which is $4$, we have an over-constrained system (i.e. more equations than unknowns). We can find the value of $T$ which gives the minimum least-squares error.
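The setup above can be sketched in NumPy (an assumption; the original gives no code). Here `verts0` and `verts1` are hypothetical `(n, 3)` arrays of corresponding world-space vertices, and `verts1` is fabricated from a known scale-plus-translation so the data actually fits the model:

```python
import numpy as np

rng = np.random.default_rng(0)
verts0 = rng.random((100, 3))            # vertices of M0, one (x, y, z) row each

# Fabricate M1 by applying a known affine map (scale + translation) to M0.
verts1 = verts0 * np.array([2.0, 3.0, 1.5]) + np.array([1.0, -2.0, 0.5])

n = verts0.shape[0]
A = np.hstack([verts0, np.ones((n, 1))])             # n x 4; row i is a_i with w = 1
B_x = np.hstack([verts1, np.ones((n, 1))])[:, 0:1]   # n x 1; row i is b_i.x
```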

Let $e_i$ be the $1 \times 4$ matrix which denotes the error at vertex $i$:

$$e_i = a_i \times T - b_i$$

Let $E_x$ be the $n \times 1$ matrix where the $i^{th}$ row is $e_i.x$.

Note that

$$E_x = A \times T_x - B_x \tag{2}$$

Let “$^T$” be the transpose operator. From $(2)$, the sum of the squared errors in the $x$ components is the $1 \times 1$ matrix

$$E_x^T \times E_x \tag{3}$$

Similarly,

$$E_y^T \times E_y, \quad E_z^T \times E_z, \quad E_w^T \times E_w \tag{4}$$

are the sums of the squared errors in the $y$, $z$ and $w$ components; each can be minimized independently.

From $(3)$, substituting $(2)$,

$$E_x^T \times E_x = (A \times T_x - B_x)^T \times (A \times T_x - B_x) = T_x^T \times A^T \times A \times T_x - T_x^T \times A^T \times B_x - B_x^T \times A \times T_x + B_x^T \times B_x \tag{5}$$

Now $T_x^T \times A^T \times B_x$ and $B_x^T \times A \times T_x$ are each $1 \times 1$ matrices, and each is the transpose of the other, so they are the same. So $(5)$ can be written as

$$E_x^T \times E_x = T_x^T \times A^T \times A \times T_x - 2 \times B_x^T \times A \times T_x + B_x^T \times B_x \tag{6}$$

Differentiate this matrix equation with respect to the unknown, i.e. $T_x$. The minimum error occurs where the derivative equals zero and the second derivative is a positive definite matrix.

Please use the following identities from matrix calculus in doing the differentiation. Let $M$ be an $n \times n$ matrix, $p$ an $n \times 1$ column vector and $q$ a constant $n \times 1$ column vector:

$$\frac{d}{dp}\left(p^T \times M \times p\right) = (M + M^T) \times p$$

$$\frac{d}{dp}\left(q^T \times p\right) = q$$

Applying these to $(6)$ (with $M = A^T \times A$, which is symmetric, and $q = 2 \times A^T \times B_x$) and equating the derivative to zero, we get a linear equation for $T_x$, which we can solve using any of the popular numerical methods for solving linear equations:

$$2 \times A^T \times A \times T_x - 2 \times A^T \times B_x = 0$$

or

$$A^T \times A \times T_x = A^T \times B_x$$

or

$$T_x = (A^T \times A)^{-1} \times A^T \times B_x$$

Note that the second derivative of $(6)$ is $2 \times A^T \times A$, which is a positive definite matrix (provided $A$ has full column rank, i.e. the vertices of $M_0$ are not all coplanar). So solving for the above value of $T_x$ will give us the least-squares minimum.
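Numerically, the normal-equation solve for $T_x$ can be sketched as follows (assuming NumPy; `T_true` is a made-up affine matrix used only to generate test data, with last column $(0, 0, 0, 1)^T$ as expected for the row-vector-times-matrix convention):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
A = np.hstack([rng.random((n, 3)), np.ones((n, 1))])   # n x 4 source vertices

# A made-up affine transform for testing; its last column is (0, 0, 0, 1)^T.
T_true = np.array([[1.0,  0.0, 0.0, 0.0],
                   [0.5,  2.0, 0.0, 0.0],
                   [0.0,  0.0, 1.0, 0.0],
                   [3.0, -1.0, 2.0, 1.0]])
B_x = A @ T_true[:, 0]                 # x components of the target vertices

# Normal equation: (A^T A) T_x = A^T B_x
T_x = np.linalg.solve(A.T @ A, A.T @ B_x)
```

In practice, solving the symmetric system with `np.linalg.solve` (or a Cholesky factorization) is preferable to explicitly forming the inverse $(A^T \times A)^{-1}$.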

Similarly we can solve for $T_y$, $T_z$ and $T_w$ separately (using $B_y$, $B_z$ and $B_w$ in place of $B_x$), and thus recover $T$.
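All four columns can in fact be solved in one call, since a least-squares routine handles a multi-column right-hand side column by column. A sketch, again assuming NumPy and a made-up test transform:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
verts0 = rng.random((n, 3))

# A made-up affine transform combining scale, shear and translation;
# the last column stays (0, 0, 0, 1)^T.
T_true = np.array([[ 2.0,  0.3, 0.0, 0.0],
                   [ 0.0,  1.5, 0.1, 0.0],
                   [ 0.2,  0.0, 0.8, 0.0],
                   [ 1.0, -2.0, 0.5, 1.0]])

A = np.hstack([verts0, np.ones((n, 1))])   # n x 4 vertices of M0 (homogeneous)
B = A @ T_true                             # n x 4 vertices of M1

# lstsq minimizes ||A T - B|| column by column, recovering T_x..T_w together.
T, residuals, rank, sv = np.linalg.lstsq(A, B, rcond=None)
```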

Ref: *Practical Least Squares for Computer Graphics*, SIGGRAPH 2007 Course.


28 May 2010