Accurate least-squares fit algorithm needed
I've experimented with two ways of implementing the least-squares fit (LSF) algorithm shown here.

The first code is the textbook approach, as described on Wolfram's page on LSF. The second code rearranges the equation to minimize machine errors. Both codes produce similar results for my data. I then compared these results against MATLAB's p=polyfit(x,y,1) function, using correlation coefficients to measure the "goodness" of fit and compare each of the three routines. I observed that while all three methods produced comparable results, at least for my data, MATLAB's routine had the best fit (the other two routines had results similar to each other).
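For reference, here is a minimal sketch of what the two equation-based variants typically look like for a straight-line fit; my actual code isn't shown here, so the exact form may differ, and the variable names (m1, b1, m2, b2) are just for illustration:

n = numel(x);
% (1) Textbook closed-form slope and intercept, as on Wolfram's LSF page:
m1 = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);
b1 = (sum(y) - m1*sum(x)) / n;
% (2) Rearranged (mean-centered) form, which reduces the cancellation
%     between the two large terms in the denominator above:
xm = mean(x);  ym = mean(y);
m2 = sum((x - xm).*(y - ym)) / sum((x - xm).^2);
b2 = ym - m2*xm;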
MATLAB's p=polyfit(x,y,1) function uses a Vandermonde matrix V (an n x 2 matrix) and QR factorization to solve the least-squares problem. In MATLAB code, it looks like:
V = [x1,1; x2,1; x3,1; ...; xn,1];   % pseudo-code for the n x 2 Vandermonde matrix
[Q,R] = qr(V,0);
p = R\(Q'*y);                        % performs the same as p = V\y
I'm not a mathematician, so I don't understand why this is more accurate. Although the difference is slight, in my case I need to obtain the slope from the LSF and multiply it by a large number, so the improvement in accuracy shows up in the results.
For reasons I can't go into, I cannot use MATLAB's routine in my work. So, I'm wondering if anyone has a recommendation for a more accurate equation-based approach that would be an improvement on the above two approaches, in terms of rounding errors/machine accuracy/etc.
Any comments are appreciated! Thanks in advance.
For polynomial fitting, you can create the Vandermonde matrix and solve the linear system, and you are done; a sketch follows below.
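As a rough illustration of that suggestion (my own sketch, not the asker's code, written for a general degree d), building the Vandermonde matrix and letting the backslash operator solve the rectangular system looks like this:

d = 1;                          % degree of the polynomial fit
V = ones(numel(x), d+1);        % columns will be x.^d, ..., x, 1
for k = d:-1:1
    V(:,k) = V(:,k+1) .* x(:);
end
p = V \ y(:);                   % least-squares solution; p(1) is the slope for d = 1

The backslash operator factors the rectangular V with QR, so this should behave essentially like the polyfit snippet quoted in the question.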
Another option is using a method such as Gauss-Newton to fit the data (since the system is linear, one iteration should be fine). There will be differences between the methods; one possible reason is Runge's phenomenon.
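If it helps, here is a minimal sketch of a single Gauss-Newton step for the linear model y ≈ V*p, reusing the n x 2 matrix V from above; this is my illustration under that assumption, not code from the question. Because the model is linear in p, the Jacobian is just V, so one step from a zero starting guess lands on the normal-equation solution:

J  = V;                  % Jacobian of V*p with respect to p (constant for a linear model)
p  = zeros(2,1);         % starting guess for [slope; intercept]
r  = y(:) - V*p;         % residuals at the current guess
dp = (J'*J) \ (J'*r);    % Gauss-Newton step (solves the normal equations)
p  = p + dp;             % after one step, p holds the least-squares fit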