3D Line Fitting in 5 Easy Steps with SVD

In 2D, a line is usually fit with least squares. The 3D analogue is the Orthogonal Distance Regression (ODR) line, which minimizes the sum of squared perpendicular distances from the points to the line. This line can be found easily using SVD (singular value decomposition).

Assuming we have a set of 3D points (x0, y0, z0) to (xn, yn, zn), the algorithm (in MATLAB) is as follows:

% Step 1: arrange points in a matrix format
points = [x0 y0 z0 ; 
          x1 y1 z1 ;
            ....   ;
          xn yn zn];

% Step 2: find the mean of the points
avg = mean(points, 1);

% Step 3: subtract the mean from all points (this centers the data at the origin)
subtracted = bsxfun(@minus, points, avg);   % points - avg also works in R2016b and later

% Step 4: perform SVD
[~, ~, V] = svd(subtracted);

% Step 5: find the direction vector 
%        (which is the right singular vector corresponding to the largest singular value)
direction = V(:, 1);
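
% Optional (per the comment thread below): normalize the direction vector;
% the columns of V are already unit length, so this is effectively a no-op
direction = direction / norm(direction);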

% Line is 'avg' and 'direction'
p0 = avg;
d = direction;

% Parametric equation: P = p0 + t*d
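
As a quick end-to-end check, here is a minimal usage sketch (not from the original post; the line parameters and noise level are illustrative). It samples noisy points along a known line and recovers the direction:

% Usage sketch: sample noisy points along a known line, then fit
t_true = linspace(0, 10, 50)';                     % 50 parameter values
p_true = [1 2 3];                                  % a point on the line
d_true = [1 1 2] / norm([1 1 2]);                  % unit direction vector
points = bsxfun(@plus, p_true, t_true * d_true) + 0.05 * randn(50, 3);

avg = mean(points, 1);
[~, ~, V] = svd(bsxfun(@minus, points, avg));
direction = V(:, 1);

% 'direction' should be close to d_true up to sign (SVD does not fix the sign)
disp([direction, d_true']);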

 

14 comments


    • Carl on December 11, 2015 at 9:54 AM

    I would like to ask: is the equation obtained at the end like this?
    x=mean_x+v(1,1)*t
    y=mean_y+v(1,2)*t
    z=mean_z+v(1,3)*t

    where t is a parameter value for each point

    1. Yes 🙂

        • Carl on December 14, 2015 at 9:33 AM

        I tried it with the following data set:
        points = [1 1 1;2 2 2;3 3 3;4 4 4;5 5 5];
        All of the components of “direction” should be the same, but if I choose
        direction = V(1, :);
        they are not. So should the final result be
        direction = V(:, 1);
        instead? Would you mind checking?

        1. Thanks for spotting the error, Carl 🙂
          I fixed it, and also added the optional step of normalizing the direction vector. It should produce correct results now.

            • Dave on December 15, 2015 at 11:58 PM

            Or skip steps 2 and 3 and, per Rodger Stafford, change step 4 to [~, ~, V] = svd(cov(points));

          1. Yes, that works because either way we are essentially computing the covariance matrix of the points.
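
            For completeness, a minimal sketch of that variant (not from the original post). Note that cov() centers the data internally, so steps 2 and 3 become unnecessary, but the mean is still needed as a point on the line:

              [~, ~, V] = svd(cov(points));   % cov() subtracts the mean internally
              direction = V(:, 1);
              p0 = mean(points, 1);           % still needed as a point on the line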

    • Ron on June 22, 2017 at 4:25 PM

    Hi Mehran,
    I would like to know why the direction vector is given by V(:, 1).
    Thank you

    1. Recall that the goal here is to minimize the distances between the points and their projections onto the fitted line. This is equivalent to finding the first principal axis of the covariance matrix. In layman’s terms, we want to find the dominant direction along which the data is spread in space, just as in PCA. This corresponds to the right singular vector associated with the largest singular value of the SVD. Take a look here for more details:
      https://en.wikipedia.org/wiki/Principal_component_analysis#Singular_value_decomposition
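
      If the Statistics and Machine Learning Toolbox is available, this equivalence can be checked in one line (a sketch, not from the original post):

        coeff = pca(points);                % columns of coeff are the principal axes
        disp([coeff(:, 1), direction]);     % the first column equals 'direction' up to sign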

    • Ron on June 23, 2017 at 12:08 PM

    Thank you Mehran.
    I read the link you posted, and I agree with the equivalence you established with PCA.
    I know the SVD consists of the right singular vectors, the diagonal matrix of singular values, and the left singular vectors.
    From your script, you used the column of the left singular vector corresponding to the largest singular value.
    What I wanted to find out is whether the right singular vectors could also give the same direction information?
    Thanks

    1. No problem.
      I’m not sure I follow: in the code I posted above, I’m using the right singular vectors (not the left ones). According to the link below, MATLAB’s svd function outputs the left singular vectors first, then the singular values, and finally the right singular vectors:
      https://www.mathworks.com/help/matlab/ref/svd.html#outputarg_V

        • Ron on June 23, 2017 at 6:27 PM

        Sorry about the typo.
        What I meant to ask was whether the left singular matrix gives us any indication of the direction of the best-fit line as well?
        If not, do you have any idea what the elements of the left singular matrix give us?
        (To avoid ambiguity, I’m referring to U in [U, ~, ~] = svd(points).)

        I can see the significance of the diagonal matrix of singular values and of the right singular matrix;
        however, I am curious what the elements of the left singular matrix U give us.

        Some insight on that would be much appreciated.
        Thanks.

        1. I’m unsure if the columns of the left matrix of singular vectors have any meaning by themselves. However, in [U, S, V] = svd(points), the columns of U*S would be the principal components (the projections of the data onto the principal axes, or in other words, the transformed data along the direction of each principal axis).

          Also, one thing to note is that in the code above, when arranging the points in a matrix format, each row represents one point and each column represents one axis. If this matrix is transposed such that rows represent axes and columns are 3D points, then in SVD the role of U and V would be reversed. In other words, the columns of U would give you the principal components.
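
          A short sketch of both points (illustrative, reusing the variables defined in the post):

            [U, S, V] = svd(subtracted);     % rows of 'subtracted' are the centered points
            scores = U * S;                  % principal components (projections on the axes)
            % equivalently: scores = subtracted * V, since subtracted = U*S*V'

            [U2, ~, ~] = svd(subtracted');   % transposed: rows are axes, columns are points
            % U2(:, 1) now matches V(:, 1) from the first decomposition (up to sign)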

          Hope this helps.

    • Peter on September 12, 2017 at 9:57 AM

    It looks like this could be used to fit 3D points with time information (like a track in a particle detector). Would you then simply put in a 4D matrix to also get an estimate of “t”? I mean, not only p0 and the direction, but also an estimate of when the track would pass p0?

    • Matang Panchal on July 31, 2018 at 12:31 AM

    Thank you for the code, I implemented it in my project in C#.
