
Talk:Gauss–Newton algorithm

From Wikipedia, the free encyclopedia
This is the talk page for discussing improvements to the Gauss–Newton algorithm article.
This is not a forum for general discussion of the subject of the article.


Confusion about 'normal equations'


The 'notes' section refers to something called 'the normal equations'. However, nothing of that name is mentioned in the main section ('Description') above the notes. Could the main section be altered to explicitly introduce these normal equations and also explain why they need to be solved? The thing is, at the bottom of the main section there is an expression for β^(s+1). This expression seems perfect for direct computation, as the desired quantity β^(s+1) has been neatly isolated. Why then would we still need to do some 'solving of the normal equations'? — Preceding unsigned comment added by 87.73.120.49 (talk) 22:24, 19 January 2016 (UTC)
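A note that may help: the closed-form expression is shorthand for the solution of the linear system (J^T J) Δβ = J^T r, which is exactly what "the normal equations" refers to; numerically one solves that system directly rather than forming the inverse. A minimal Matlab sketch of the two equivalent computations (the Jacobian and residual vector here are made up for illustration, not taken from the article):

%Hypothetical 3x2 Jacobian and residual vector, for illustration only
J = [1 2; 3 4; 5 6];
r = [1; 2; 3];

%As the closed-form expression reads: form the inverse explicitly
delta_inverse = inv(J'*J) * (J'*r);

%As done in practice: solve the normal equations (J'*J)*delta = J'*r
delta_solved = (J'*J) \ (J'*r);

%The two agree up to rounding error, but solving the system is cheaper
%and numerically better behaved than forming the inverse
disp(norm(delta_inverse - delta_solved))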

Name?


I always thought that Newton's method, when applied to systems of equations, is still called Newton's method (or Newton-Raphson), and that Gauss-Newton is a modified Newton's method for solving least squares problems. To wit, let f(x) = sum_i r_i(x)^2 be the least squares problem (x is a vector, and I hope you don't mind my being too lazy to properly type-set the maths). Then we need to solve g(x) := 2 A(x) r(x) = 0, where A(x) is the Jacobian matrix, so A_ij = derivative of r_j w.r.t. x_i. In my understanding, Newton's method is as described in the article: g'(x_k) (x_{k+1} - x_k) = -g(x_k). The derivative can be calculated as g'(x) = 2 A(x) A(x)^\top + 2 sum_i r_i(x) nabla^2 r_i(x). On the other hand, the Gauss-Newton method neglects the second term, so we get the iteration A(x_k) A(x_k)^\top (x_{k+1} - x_k) = -A(x_k) r(x_k).

Could you please tell me whether I am mistaken, preferably giving a reference if I am? Thanks. -- Jitse Niesen 18:33, 18 Nov 2004 (UTC)

It is quite possible that you are right. Please feel free to improve both this article and the corresponding section in Newton's method. I do not have an appropriate reference at hand, therefore I am unable to contribute to a clarification. - By the way, thanks for the personal message on my talk page. -- Frau Holle 22:23, 20 Nov 2004 (UTC)
I moved the description of the general Newton's method in R^n back to Newton's method and wrote here a bit about the modified method for least squares problems. -- Jitse Niesen 00:45, 5 Dec 2004 (UTC)
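
For reference, the relationship described in this thread, typeset (keeping the convention above, A_{ij} = \partial r_j / \partial x_i, so that \nabla f(x) = 2 A(x) r(x)):

\nabla^2 f(x) = 2\,A(x)\,A(x)^\top + 2 \sum_i r_i(x)\,\nabla^2 r_i(x).

Newton's method solves \nabla^2 f(x_k)\,(x_{k+1} - x_k) = -\nabla f(x_k) at each step; Gauss-Newton drops the second-derivative term, leaving

A(x_k)\,A(x_k)^\top (x_{k+1} - x_k) = -A(x_k)\,r(x_k).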

Demo Code


I have implemented the algorithm given in this article in Matlab. However, I don't know how to add it to the page or how to format it. It may be useful.

%GaussNewton_Demo

%Data
x = [0.038 0.1947 0.425 0.626 1.253 2.500 3.740]';
y = [0.05 0.127 0.094 0.2122 0.2729 0.2665 0.3317]';

%Abscissa values for plotting the fitted curve
X = 0.01:0.01:4;

%Initial guess for the parameters of the model y = B1*x./(B2+x)
B1 = 0.9;
B2 = 0.2;

%Plot the data
figure(1)
cla
hold on
scatter(x, y, 'rd', 'fill')
axis auto

%Gauss-Newton iterations
for i = 1:5

    %Jacobian of the residuals with respect to [B1; B2]
    J = [-x./(B2+x) (B1*x)./(B2+x).^2];
    %Hessian (approx)
    H = J'*J;

    %Calculate residuals
    r = y - (B1*x)./(B2+x);
    %Sum of squares of residuals (no semicolon, so it prints each iteration)
    SSR = sum(r.*r)

    %Plot the current fit
    Y = (B1*X)./(B2+X);
    plot(X, Y, 'LineWidth', 1)

    %Calculate the Gauss-Newton step
    Delta = pinv(H)*J'*r;

    %Apply the step to the parameters
    B1 = B1 - Delta(1);
    B2 = B2 - Delta(2);

    pause(1)

end

— Preceding unsigned comment added by MartyBebop (talkcontribs) 11:39, 12 April 2012 (UTC)
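
A possible refinement, not part of the original post: replace the fixed five iterations with a convergence test, stopping once the step becomes negligible. A sketch reusing x, y, B1, B2 from above (the tolerance 1e-8 is an arbitrary choice):

%Gauss-Newton with a hypothetical stopping criterion
tol = 1e-8;
for i = 1:100                              %safety cap on iterations
    J = [-x./(B2+x) (B1*x)./(B2+x).^2];    %Jacobian of the residuals
    r = y - (B1*x)./(B2+x);                %residuals
    Delta = (J'*J) \ (J'*r);               %solve the normal equations for the step
    B1 = B1 - Delta(1);
    B2 = B2 - Delta(2);
    if norm(Delta) < tol
        break                              %step is negligible: converged
    end
end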

data fitting


I think that in the Description section, when the data fitting example is introduced, it should read

{\boldsymbol{\beta}}^{(s+1)} = {\boldsymbol{\beta}}^{(s)} + \left(\mathbf{J_f}^{\mathsf{T}} \mathbf{J_f}\right)^{-1} \mathbf{J_f}^{\mathsf{T}} \mathbf{r}\left({\boldsymbol{\beta}}^{(s)}\right). — Preceding unsigned comment added by 129.240.215.211 (talk) 12:32, 27 May 2015 (UTC)
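
The plus sign is consistent, assuming the article's convention r_i(β) = y_i - f(x_i, β): then \partial r_i / \partial \beta_j = -(\mathbf{J_f})_{ij}, and substituting \mathbf{J_r} = -\mathbf{J_f} into the generic Gauss-Newton update gives

{\boldsymbol{\beta}}^{(s+1)} = {\boldsymbol{\beta}}^{(s)} - \left(\mathbf{J_r}^{\mathsf{T}} \mathbf{J_r}\right)^{-1} \mathbf{J_r}^{\mathsf{T}} \mathbf{r} = {\boldsymbol{\beta}}^{(s)} + \left(\mathbf{J_f}^{\mathsf{T}} \mathbf{J_f}\right)^{-1} \mathbf{J_f}^{\mathsf{T}} \mathbf{r},

because the sign flips cancel in \mathbf{J_r}^{\mathsf{T}} \mathbf{J_r} but not in \mathbf{J_r}^{\mathsf{T}} \mathbf{r}.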

Factor of 1/2 missing


Is there a factor of 1/2 missing in the derivation of \Delta \beta? Alzibub (talk) 11:44, 9 January 2021 (UTC)

Ignore that. I see my mistake now. Alzibub (talk) 15:21, 9 January 2021 (UTC)
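
For anyone with the same question: the factor is a matter of convention and cancels in the step. Writing the objective as S(\beta) = \sum_i r_i^2 with J the Jacobian of the residuals, the Gauss-Newton step is

\Delta \beta = -\left(2\,J^{\mathsf{T}} J\right)^{-1} \left(2\,J^{\mathsf{T}} r\right) = -\left(J^{\mathsf{T}} J\right)^{-1} J^{\mathsf{T}} r,

and with S(\beta) = \tfrac{1}{2} \sum_i r_i^2 both factors of 2 disappear, giving the same \Delta \beta.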
