Latest revision as of 10:01, 5 August 2017

  • http://www.seas.ucla.edu/~vandenbe/133A/
  • https://en.wikipedia.org/wiki/Gaussian_quadrature


This page is a rough summary of the various methods for optimization, curve fitting/linear regression, etc.

First, some definitions:

  • In statistics, linear regression is a method for fitting a curve (in the simplest case, a straight line) to a set of data points, typically by minimizing the squared discrepancy between the curve and the data.

[Figure: Linear Regression.png — a regression line fitted to a scatter of data points]
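A minimal sketch of such a fit with NumPy's least-squares polynomial fit; the data points below are made up for illustration:

```python
import numpy as np

# Hypothetical data, roughly following y = 2x + 1 with small deviations.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 5.0, 6.9, 9.1])

# Fit a straight line y = m*x + b by least squares.
m, b = np.polyfit(x, y, 1)
```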

Optimization

Unconstrained Optimization

  • Convex optimization

One-Dimensional Search Methods

  • Golden Section Search
  • Fibonacci Search
  • Newton's Method
  • Secant Method
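A minimal sketch of golden section search, the first method above, assuming a unimodal objective on a bracketing interval; the test function and interval are illustrative:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Shrink [a, b] around the minimum of a unimodal f by the golden ratio."""
    inv_phi = (math.sqrt(5) - 1) / 2  # ~0.618
    while b - a > tol:
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
        if f(c) < f(d):
            b = d        # minimum lies in [a, d]
        else:
            a = c        # minimum lies in [c, b]
    return (a + b) / 2

# Illustrative: minimum of (x - 2)^2 on [0, 5].
x_min = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```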

Unconstrained Optimization and Neural Networks

  • Descent methods
  • Line search
  • Descent methods with trust region
  • Steepest descent
  • Quadratic models
  • Conjugate gradient methods
  • Single-Neuron Training
  • Backpropagation Algorithm
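The first few bullets above (steepest descent combined with a line search) can be sketched as follows; the quadratic objective and the Armijo parameters are illustrative choices, not prescribed by this page:

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Steepest descent with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # gradient small: stop
            break
        alpha = alpha0
        # Backtrack until the sufficient-decrease (Armijo) condition holds.
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= beta
        x = x - alpha * g
    return x

# Illustrative convex quadratic: f(x) = x0^2 + 2*x1^2, minimum at the origin.
f = lambda x: x[0] ** 2 + 2.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
x_star = steepest_descent(f, grad, [3.0, -2.0])
```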

Newton-Type Methods

  • Newton’s method
  • Damped Newton methods
  • Quasi–Newton methods
  • DFP formula
  • BFGS formulas
  • Quasi–Newton implementation
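A sketch of the undamped Newton iteration listed above, assuming the gradient and Hessian are available in closed form; the test function is illustrative (a quadratic, so one step suffices):

```python
import numpy as np

def newton_method(grad, hess, x0, tol=1e-10, max_iter=50):
    """Undamped Newton iteration: solve H(x) p = -g(x), then step x <- x + p."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess(x), -g)
        x = x + p
    return x

# Illustrative quadratic f(x, y) = (x - 1)^2 + 10*(y + 2)^2.
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 20.0]])
x_star = newton_method(grad, hess, [0.0, 0.0])
```

Damped Newton variants scale the step `p` by a line-search factor to guarantee descent far from the minimum; quasi-Newton methods (DFP, BFGS) replace the exact Hessian with an approximation updated from gradient differences.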

Direct Search

  • Simplex method
  • Method of Hooke and Jeeves
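A simplified sketch in the spirit of Hooke and Jeeves, using only the exploratory coordinate moves and omitting the pattern-move acceleration of the full method; no derivatives are needed:

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-8):
    """Exploratory moves along each coordinate; halve the step size
    whenever no move improves the objective."""
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
                    break
        if not improved:
            step *= shrink
    return x

# Illustrative: derivative-free minimization of (x - 1)^2 + (y + 3)^2.
x_star = hooke_jeeves(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 3.0) ** 2,
                      [0.0, 0.0])
```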

Linear Data Fitting

  • “Best” fit
  • Linear least squares
  • Weighted least squares
  • Generalized least squares
  • Polynomial fit
  • Spline fit
  • Choice of knots
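A polynomial fit is itself a linear least-squares problem in the coefficients. A sketch with NumPy, using made-up data that lies exactly on y = 1 - 1.5x + x², so both the plain and the weighted fit recover the same coefficients (with noisy, unequal-variance data the weights would matter):

```python
import numpy as np

# Made-up data lying exactly on y = 1 - 1.5*x + x^2.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.5, 2.0, 5.5, 11.0])

# Polynomial fit as a linear least-squares problem: A c ~ y.
A = np.vander(x, 3, increasing=True)          # columns: 1, x, x^2
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Weighted least squares: scale each row of A (and y) by sqrt(weight).
w = np.array([1.0, 1.0, 4.0, 4.0, 1.0])       # illustrative weights
sw = np.sqrt(w)
coeffs_w, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
```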

Nonlinear Least Squares Problems

  • Gauss–Newton method
  • The Levenberg–Marquardt method
  • Powell’s Dog Leg Method
  • Secant version of the L–M method
  • Secant version of the Dog Leg method
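A minimal Gauss-Newton sketch for this section, using Rosenbrock-style residuals as an illustrative zero-residual test problem; the Levenberg-Marquardt method listed above would additionally damp the step via JᵀJ + λI:

```python
import numpy as np

def gauss_newton(residual, jac, x0, tol=1e-10, max_iter=100):
    """Gauss-Newton: repeatedly linearize r(x) and solve the linear
    least-squares subproblem min_p ||J p + r||^2 for the step p."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jac(x)
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        if np.linalg.norm(p) < tol:
            break
        x = x + p
    return x

# Illustrative zero-residual problem with minimum at (1, 1).
residual = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
jac = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])
x_star = gauss_newton(residual, jac, [0.0, 0.0])
```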

Duality

  • The Lagrange dual function
  • The Lagrange dual problem
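The two duality bullets can be made concrete on a toy one-variable problem (chosen here purely for illustration, not taken from this page):

```latex
% Toy problem: minimize x^2 subject to x >= 1  (primal optimum x* = 1, f* = 1)
\min_{x} \; x^2 \quad \text{subject to} \quad 1 - x \le 0

% Lagrangian, with multiplier \lambda \ge 0:
L(x, \lambda) = x^2 + \lambda (1 - x)

% Lagrange dual function: minimize L over x (stationary at x = \lambda/2):
g(\lambda) = \inf_x L(x, \lambda) = \lambda - \tfrac{\lambda^2}{4}

% Lagrange dual problem: maximize g over \lambda \ge 0, giving \lambda^* = 2
% and g(\lambda^*) = 1 = f(x^*): strong duality holds for this convex problem.
\max_{\lambda \ge 0} \; \lambda - \tfrac{\lambda^2}{4}
```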