Least-squares fitting is the standard way to fit a line to 2D points. Its 3D analogue, which minimizes the orthogonal distances from the points to the line, is called the 3D Orthogonal Distance Regression (ODR) line, and it can be found easily using SVD (singular value decomposition).

Assuming we have a set of 3D points (x_{0}, y_{0}, z_{0}) through (x_{n}, y_{n}, z_{n}), the algorithm (in MATLAB) is as follows:

```matlab
% Step 1: arrange the points in a matrix, one point per row
points = [x0 y0 z0 ; x1 y1 z1 ; .... ; xn yn zn];

% Step 2: find the mean of the points
avg = mean(points, 1);

% Step 3: subtract the mean from all points
subtracted = bsxfun(@minus, points, avg);

% Step 4: perform SVD
[~, ~, V] = svd(subtracted);

% Step 5: find the direction vector
% (the right singular vector corresponding to the largest singular value)
direction = V(:, 1);

% Optional: normalize the direction vector
% (the columns of V are already unit length, so this is a no-op here)
direction = direction / norm(direction);

% The line is defined by the point 'avg' and the direction vector 'direction'
p0 = avg;
d = direction;

% Parametric equation of the line: P = p0 + t*d
```
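For readers outside MATLAB, the same steps translate almost line-for-line to NumPy. A minimal sketch with made-up sample points (note that `np.linalg.svd` returns V transposed, so MATLAB's `V(:, 1)` corresponds to the first *row* of `Vt`):

```python
import numpy as np

# Step 1: arrange the points in a matrix, one point per row
points = np.array([[1.0, 2.1, 2.9],
                   [2.0, 4.0, 6.1],
                   [3.1, 5.9, 9.0],
                   [4.0, 8.1, 12.0]])

# Steps 2-3: subtract the mean from all points
avg = points.mean(axis=0)
subtracted = points - avg

# Step 4: SVD (NumPy returns V transposed: its rows are the right singular vectors)
_, _, Vt = np.linalg.svd(subtracted)

# Step 5: the right singular vector of the largest singular value
direction = Vt[0]          # already unit length

# Parametric equation of the line: P = avg + t * direction
```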

## 12 comments


## Carl

December 11, 2015 at 9:54 AM (UTC -4)

I would like to ask whether the equation obtained at the end looks like this:

x=mean_x+v(1,1)*t

y=mean_y+v(1,2)*t

z=mean_z+v(1,3)*t

where t is the parameter value for each point

## Mehran Maghoumi

December 11, 2015 at 9:57 AM (UTC -4)

Yes 🙂

## Carl

December 14, 2015 at 9:33 AM (UTC -4)

I tried it with the following data set:

points = [1 1 1;2 2 2;3 3 3;4 4 4;5 5 5];

All of the components of “direction” should be the same, but if I choose

direction = V(1, :);

they are not the same. So should the final result be

direction = V(:, 1);

Would you mind having a check?

## Mehran Maghoumi

December 14, 2015 at 10:46 AM (UTC -4)

Thanks for spotting the error Carl 🙂

I fixed the error. Also added the optional step of normalizing the direction vector. It should produce correct results now.
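Carl's test case is a handy sanity check: for points lying exactly on a line with direction (1, 1, 1), the fitted direction must have three equal components. A NumPy sketch of that check (where `Vt[0]` plays the role of MATLAB's `V(:, 1)`, since NumPy returns V transposed):

```python
import numpy as np

# Perfectly collinear points along (1, 1, 1)
points = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3],
                   [4, 4, 4], [5, 5, 5]], dtype=float)

subtracted = points - points.mean(axis=0)
_, _, Vt = np.linalg.svd(subtracted)

# The right singular vector of the largest singular value;
# every component should be +/- 1/sqrt(3), up to one global sign
direction = Vt[0]
```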

## Dave

December 15, 2015 at 11:58 PM (UTC -4)

Or skip steps 2 and 3 and, per Rodger Stafford, change step 4 to [~, ~, V] = svd(cov(points));

## Mehran Maghoumi

December 16, 2015 at 8:04 AM (UTC -4)

Yes, that is true: by centering the points and taking the SVD, we are essentially computing the covariance matrix.
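This shortcut can be verified numerically. A NumPy sketch with made-up sample points (`rowvar=False` tells `np.cov` that columns, not rows, are the variables):

```python
import numpy as np

points = np.array([[0.0, 0.1, 0.2],
                   [1.0, 2.0, 3.1],
                   [2.1, 3.9, 6.0],
                   [3.0, 6.1, 8.9]])

# Route 1: SVD of the mean-centered points (steps 2-4 above)
centered = points - points.mean(axis=0)
_, _, Vt_centered = np.linalg.svd(centered)

# Route 2: SVD of the covariance matrix, skipping the explicit centering
# (np.cov centers the data internally)
_, _, Vt_cov = np.linalg.svd(np.cov(points, rowvar=False))

# Both routes give the same direction, up to an overall sign
```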

## Ron

June 22, 2017 at 4:25 PM (UTC -4)

Hi Mehran,

I would like to know why the direction vector is given by V(:, 1).

thank you

## Mehran Maghoumi

June 22, 2017 at 4:49 PM (UTC -4)

Recall that the goal here is to minimize the distance between the points and their projections onto the obtained line. This is equivalent to finding the first principal component of the covariance matrix. In layman's terms, we want to find the dominant direction along which the data is spread in space, just as in PCA. That direction is the singular vector corresponding to the largest singular value of the SVD. Take a look here for more details:

https://en.wikipedia.org/wiki/Principal_component_analysis#Singular_value_decomposition
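This equivalence is easy to confirm numerically: the leading right singular vector of the centered data matches the eigenvector of the covariance matrix with the largest eigenvalue. A NumPy sketch with made-up sample points:

```python
import numpy as np

points = np.array([[0.0, 0.0, 0.1],
                   [1.1, 2.0, 2.9],
                   [1.9, 4.1, 6.0],
                   [3.0, 5.9, 9.1]])
centered = points - points.mean(axis=0)

# SVD route: leading right singular vector
_, _, Vt = np.linalg.svd(centered)
v_svd = Vt[0]

# PCA route: eigenvector of the covariance matrix with the largest eigenvalue
cov = centered.T @ centered / (len(points) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh sorts eigenvalues in ascending order
v_pca = eigvecs[:, -1]

# The two directions agree up to an overall sign
```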

## Ron

June 23, 2017 at 12:08 PM (UTC -4)

Thank you Mehran

I read the link you posted, and I agree with the equivalence you established with PCA.

I know the svd decomposition consists of the right singular vector, the diagonal matrix of the singular values and the left singular matrix.

From your script you used the column of the left singular vector corresponding to the largest singular value

What I wanted to find out was whether the right singular vector could also give the same direction information.

Thanks

## Mehran Maghoumi

June 23, 2017 at 4:48 PM (UTC -4)

No problem.

I'm not sure I follow: in the code I posted above, I'm using the right singular vector (not the left one). According to the link below, MATLAB's svd function outputs the left singular vectors first, then the singular values, and finally the right singular vectors:

https://www.mathworks.com/help/matlab/ref/svd.html#outputarg_V

## Ron

June 23, 2017 at 6:27 PM (UTC -4)

Sorry about the typo.

What I meant to ask was whether the left singular matrix gives us any indication of the direction of the best-fit line as well.

If not, do you have any idea what the elements of the left singular matrix give us?

(To avoid ambiguity, I'm referring to U in [U, ~, ~] = svd(points).)

I can see the significance of the diagonal matrix of singular values and of the right singular matrix; however, I am curious what the elements of the left singular matrix U give us.

Some insight on that would be much appreciated.

Thanks.

## Mehran Maghoumi

June 24, 2017 at 11:43 AM (UTC -4)

I'm unsure whether the columns of the left singular matrix have any meaning by themselves. However, in [U, S, V] = svd(points), the columns of U*S are the principal components (the projections of the data onto the principal axes, i.e., the transformed data along the direction of each principal axis).

Also, one thing to note: in the code above, when the points are arranged in a matrix, each row represents one point and each column represents one axis. If this matrix is transposed so that rows represent axes and columns represent points, then the roles of U and V in the SVD are reversed. In other words, the columns of U would then give you the principal directions.

Hope this helps.
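Both points can be checked numerically. A NumPy sketch with made-up sample points, showing that U*S reproduces the projections of the centered data and that transposing the data matrix swaps the roles of U and V:

```python
import numpy as np

points = np.array([[0.0, 0.1, 0.0],
                   [1.0, 2.1, 3.0],
                   [2.0, 3.9, 6.1],
                   [3.1, 6.0, 9.0]])
centered = points - points.mean(axis=0)

# Thin SVD of the n-by-3 centered matrix
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Columns of U*S are the principal components:
# the centered data projected onto the principal axes (same as centered @ Vt.T)
scores = U * S

# Transposing the data matrix (axes as rows, points as columns)
# swaps the roles of U and V: the first column of the new U
# matches the original leading right singular vector, up to sign
U2, _, _ = np.linalg.svd(centered.T, full_matrices=False)
```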