000 03797cam a2200397Ka 4500
001 00011722
003 WSP
007 cr cnu|||unuuu
008 200416s2020 si ob 001 0 eng d
040 _aWSPC
_beng
_cWSPC
010 _z 2020001727
020 _a9789811216572
_q(ebook)
020 _a9811216576
_q(ebook)
020 _z9811216568
_q(hbk.)
020 _z9789811216565
_q(hbk.)
050 0 4 _aQA184.2
_b.G35 2020
072 7 _aMAT
_x002050
_2bisacsh
072 7 _aMAT
_x042000
_2bisacsh
072 7 _aCOM
_x014000
_2bisacsh
082 0 4 _a512/.5
_223
100 1 _aGallier, Jean H.
_94003
245 1 0 _aLinear algebra and optimization with applications to machine learning.
_nVolume II,
_pFundamentals of optimization theory with applications to machine learning
_h[electronic resource] /
_cby Jean Gallier, Jocelyn Quaintance.
260 _aSingapore ;
_aHackensack, NJ :
_bWorld Scientific,
_c[2020]
300 _a1 online resource (xvii, 877 p.)
504 _aIncludes bibliographical references and index.
538 _aMode of access: World Wide Web.
538 _aSystem requirements: Adobe Acrobat Reader.
520 _a"Volume 2 applies the linear algebra concepts presented in Volume 1 to optimization problems which frequently occur throughout machine learning. This book blends theory with practice by not only carefully discussing the mathematical under pinnings of each optimization technique but by applying these techniques to linear programming, support vector machines (SVM), principal component analysis (PCA), and ridge regression. Volume 2 begins by discussing preliminary concepts of optimization theory such as metric spaces, derivatives, and the Lagrange multiplier technique for finding extrema of real valued functions. The focus then shifts to the special case of optimizing a linear function over a region determined by affine constraints, namely linear programming. Highlights include careful derivations and applications of the simplex algorithm, the dual-simplex algorithm, and the primal-dual algorithm. The theoretical heart of this book is the mathematically rigorous presentation of various nonlinear optimization methods, including but not limited to gradient decent, the Karush-Kuhn-Tucker (KKT) conditions, Lagrangian duality, alternating direction method of multipliers (ADMM), and the kernel method. These methods are carefully applied to hard margin SVM, soft margin SVM, kernel PCA, ridge regression, lasso regression, and elastic-net regression. Matlab programs implementing these methods are included"--Publisher's website.
650 0 _aAlgebras, Linear.
_94004
650 0 _aMachine learning
_xMathematics.
_94005
655 0 _aElectronic books.
_93294
700 1 _aQuaintance, Jocelyn.
_94006
856 4 0 _uhttps://www.worldscientific.com/worldscibooks/10.1142/11722#t=toc
_zAccess to full text is restricted to subscribers.
505 0 _aPreface -- Introduction -- Preliminaries for optimization theory. Topology. Differential calculus. Extrema of real-valued functions. Newton's method and its generalizations. Quadratic optimization problems. Schur complements and applications -- Linear optimization. Convex sets, cones, H-polyhedra. Linear programs. The simplex algorithm. Linear programming and duality -- Nonlinear optimization. Basics of Hilbert spaces. General results of optimization theory. Introduction to nonlinear optimization. Subgradients and subdifferentials of convex functions. Dual ascent methods; ADMM -- Applications to machine learning. Positive definite kernels. Soft margin support vector machines. Ridge regression, lasso, elastic net. ν-SV regression -- Appendix A: Total orthogonal families in Hilbert spaces -- Appendix B: Matlab programs -- Bibliography -- Index.
942 _cEBK
999 _c72579
_d72579