Bibliographic Data
ISBN/EAN: 9780387249759
Language: English
Extent: xii, 687 pages, 25 b/w illustrations, 25 photos
Dimensions (D × H × W): 3.7 × 24.2 × 16.6 cm
Binding: hardcover
Product Safety Regulation
Manufacturer:
Springer Verlag GmbH
juergen.hartmann@springer.com
Tiergartenstr. 17
DE 69121 Heidelberg
Contents
Preface
1 Introduction
1.1 Introduction
1.2 Mathematics Foundations
1.2.1 Norm
1.2.2 Inverse and Generalized Inverse of a Matrix
1.2.3 Properties of Eigenvalues
1.2.4 Rank-One Update
1.2.5 Function and Differential
1.3 Convex Sets and Convex Functions
1.3.1 Convex Sets
1.3.2 Convex Functions
1.3.3 Separation and Support of Convex Sets
1.4 Optimality Conditions for Unconstrained Case
1.5 Structure of Optimization Methods
Exercises
2 Line Search
2.1 Introduction
2.2 Convergence Theory for Exact Line Search
2.3 Section Methods
2.3.1 The Golden Section Method
2.3.2 The Fibonacci Method
2.4 Interpolation Method
2.4.1 Quadratic Interpolation Methods
2.4.2 Cubic Interpolation Method
2.5 Inexact Line Search Techniques
2.5.1 Armijo and Goldstein Rule
2.5.2 Wolfe-Powell Rule
2.5.3 Goldstein Algorithm and Wolfe-Powell Algorithm
2.5.4 Backtracking Line Search
2.5.5 Convergence Theorems of Inexact Line Search
Exercises
3 Newton's Methods
3.1 The Steepest Descent Method
3.1.1 The Steepest Descent Method
3.1.2 Convergence of the Steepest Descent Method
3.1.3 Barzilai and Borwein Gradient Method
3.1.4 Appendix: Kantorovich Inequality
3.2 Newton's Method
3.3 Modified Newton's Method
3.4 Finite-Difference Newton's Method
3.5 Negative Curvature Direction Method
3.5.1 Gill-Murray Stable Newton's Method
3.5.2 Fiacco-McCormick Method
3.5.3 Fletcher-Freeman Method
3.5.4 Second-Order Step Rules
3.6 Inexact Newton's Method
Exercises
4 Conjugate Gradient Method
4.1 Conjugate Direction Methods
4.2 Conjugate Gradient Method
4.2.1 Conjugate Gradient Method
4.2.2 Beale's Three-Term Conjugate Gradient Method
4.2.3 Preconditioned Conjugate Gradient Method
4.3 Convergence of Conjugate Gradient Methods
4.3.1 Global Convergence of Conjugate Gradient Methods
4.3.2 Convergence Rate of Conjugate Gradient Methods
Exercises
5 Quasi-Newton Methods
5.1 Quasi-Newton Methods
5.1.1 Quasi-Newton Equation
5.1.2 Symmetric Rank-One (SR1) Update
5.1.3 DFP Update
5.1.4 BFGS Update and PSB Update
5.1.5 The Least Change Secant Update
5.2 The Broyden Class
5.3 Global Convergence of Quasi-Newton Methods
5.3.1 Global Convergence under Exact Line Search
5.3.2 Global Convergence under Inexact Line Search
5.4 Local Convergence of Quasi-Newton Methods
5.4.1 Superlinear Convergence of General Quasi-Newton Methods
5.4.2 Linear Convergence of General Quasi-Newton Methods
5.4.3 Local Convergence of Broyden's Rank-One Update
5.4.4 Local and Linear Convergence of DFP Method
5.4.5 Superlinear Convergence of BFGS Method
5.4.6 Superlinear Convergence of DFP Method
5.4.7 Local Convergence of Broyden's Class Methods
5.5 Self-Scaling Variable Metric (SSVM) Methods
5.5.1 Motivation to SSVM Method
5.5.2 Self-Scaling Variable Metric (SSVM) Method
5.5.3 Choices of the Scaling Factor
5.6 Sparse Quasi-Newton Methods
5.7 Limited Memory BFGS Method
Exercises
6 Trust-Region and Conic Model Methods
6.1 Trust-Region Methods
6.1.1 Trust-Region Methods
6.1.2 Convergence of Trust-Region Methods
6.1.3 Solving a Trust-Region Subproblem
6.2 Conic Model and Collinear Scaling Algorithm
6.2.1 Conic Model
6.2.2 Generalized Quasi-Newton Equation
6.2.3 Updates that Preserve Past Information
6.2.4 Collinear Scaling BFGS Algorithm
6.3 Tensor Methods
6.3.1 Tensor Method for Nonlinear Equations
6.3.2 Tensor Methods for Unconstrained Optimization
Exercises
7 Nonlinear Least-Squares Problems
7.1 Introduction
7.2 Gauss-Newton Method
7.3 Levenberg-Marquardt Method
7.3.1 Motivation and Properties
7.3.2 Convergence of Levenberg-Marquardt Method
7.4 Implementation of L-M Method
7.5 Quasi-Newton Method
Exercises
8 Theory of Constrained Optimization
8.1 Constrained Opti ...