Nonlinear regression

In statistics, nonlinear regression is the problem of fitting a model


 * $$ y = f(x,\theta) + \varepsilon $$

to observed (x, y) data, where x may be multidimensional and f is a nonlinear function of the parameters θ.

It is often erroneously thought that the use of least squares to estimate the parameters a, b, c in the model


 * $$y_i = a x_i^2 + b x_i + c + \varepsilon_i$$

is an instance of nonlinear regression. In fact, although the model is a quadratic (and hence nonlinear) function of x, it is linear in the parameters a, b, and c, so this is a case of linear regression; see that topic for explanation.
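Because the model is linear in the parameters, it can be fitted in a single step by ordinary least squares. A minimal sketch of this in Python (assuming NumPy is available; the data values are purely illustrative):

```python
import numpy as np

# Illustrative data; in practice x and y come from observations.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 3.9, 9.2, 16.1, 24.8])

# The model y = a*x^2 + b*x + c is linear in (a, b, c): build a
# design matrix with columns x^2, x, 1 and solve by ordinary
# linear least squares -- no iterative optimization is needed.
X = np.column_stack([x**2, x, np.ones_like(x)])
a, b, c = np.linalg.lstsq(X, y, rcond=None)[0]
```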

General
In general, there is no closed-form expression for the best-fitting parameters, as there is in linear regression. Usually, numerical optimization algorithms are applied to determine them. Again in contrast to linear regression, the goodness of fit may have many local maxima, so in practice initial guess values of the parameters are supplied to the optimization algorithm in an attempt to find the global maximum.
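As a sketch of this workflow (assuming SciPy is available; the exponential model, synthetic data, and starting values below are purely illustrative), scipy.optimize.curve_fit minimizes the sum of squared residuals numerically, starting from the supplied guess values:

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, A, B):
    # A model that is nonlinear in the parameters A and B.
    return A * np.exp(B * x)

# Synthetic data for illustration.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(1.5 * x) + rng.normal(scale=0.1, size=x.size)

# p0 supplies the guess values; a poor choice can leave the
# optimizer stuck at a local optimum or fail to converge.
theta, cov = curve_fit(f, x, y, p0=[1.0, 1.0])
```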

Linearization
Some nonlinear regression problems can be transformed into linear ones by a suitable transformation of the model formulation; the transformed problem can then be solved exactly by linear regression.

For example, consider the nonlinear regression problem (ignoring the error):


 * $$ y = A e^{B x} \,\!$$

Taking the logarithm of both sides gives


 * $$ \ln{(y)} = \ln{(A)} + B x \,\!$$,

an ordinary linear regression problem in the parameters ln(A) and B, whose exact solution is well known.
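A minimal sketch of the linearized fit (assuming NumPy and strictly positive y values; the data are illustrative):

```python
import numpy as np

# Illustrative positive data roughly following y = A * exp(B * x).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([2.1, 4.2, 9.3, 19.8, 41.0])

# Fit ln(y) = ln(A) + B*x by ordinary linear least squares;
# np.polyfit returns coefficients highest degree first.
B, lnA = np.polyfit(x, np.log(y), 1)
A = np.exp(lnA)
```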

While linearization allows for an unambiguous closed-form solution, it is not without potential drawbacks: the relative influence of the individual data values changes, as do the error structure of the model and the interpretation of any inferential results. For example, the log transformation turns multiplicative errors into additive ones, but errors that were additive on the original scale are neither additive nor normally distributed after the transformation. These may not be desired effects.
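One way to see the effect (a sketch under the same illustrative assumptions as above): with additive noise on the original scale, a direct nonlinear fit and a fit of the log-transformed data generally give different parameter estimates, because least squares on the log scale weights the residuals differently:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic data with *additive* noise on the original scale.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(1.5 * x) + rng.normal(scale=0.3, size=x.size)

# Direct nonlinear least squares on the original scale.
(A_nl, B_nl), _ = curve_fit(lambda x, A, B: A * np.exp(B * x),
                            x, y, p0=[1.0, 1.0])

# Linearized fit: least squares on the log scale, which
# effectively down-weights residuals at large y.
B_lin, lnA = np.polyfit(x, np.log(y), 1)
A_lin = np.exp(lnA)

# The two fits minimize different criteria, so the estimates
# generally differ even though the data are the same.
```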