
Linear Algebra and Optimization with Applications to Machine Learning

Volume II: Fundamentals of Optimization Theory with Applications to Machine Learning

By: Jean H Gallier, Jocelyn Quaintance

eText | 16 March 2020

At a Glance

eText


$126.49

Instant online reading in your Booktopia eTextbook Library *

Why choose an eTextbook?

Instant Access *

Purchase and read your book immediately

Read Aloud

Listen and follow along as Bookshelf reads to you

Study Tools

Built-in study tools like highlights and more

* eTextbooks are not downloadable to your eReader or an app and can be accessed via web browsers only. You must be connected to the internet and have no technical issues with your device or browser that could prevent the eTextbook from operating.

Volume 2 applies the linear algebra concepts presented in Volume 1 to optimization problems that frequently occur throughout machine learning. This book blends theory with practice by not only carefully discussing the mathematical underpinnings of each optimization technique but also by applying these techniques to linear programming, support vector machines (SVM), principal component analysis (PCA), and ridge regression.

Volume 2 begins by discussing preliminary concepts of optimization theory such as metric spaces, derivatives, and the Lagrange multiplier technique for finding extrema of real-valued functions. The focus then shifts to the special case of optimizing a linear function over a region determined by affine constraints, namely linear programming. Highlights include careful derivations and applications of the simplex algorithm, the dual-simplex algorithm, and the primal-dual algorithm.

The theoretical heart of this book is the mathematically rigorous presentation of various nonlinear optimization methods, including but not limited to gradient descent, the Karush-Kuhn-Tucker (KKT) conditions, Lagrangian duality, the alternating direction method of multipliers (ADMM), and the kernel method. These methods are carefully applied to hard margin SVM, soft margin SVM, kernel PCA, ridge regression, lasso regression, and elastic-net regression. Matlab programs implementing these methods are included.
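To illustrate the flavor of two topics the description names together, here is a minimal sketch of gradient descent applied to the ridge regression objective (1/2)‖Xw − y‖² + (λ/2)‖w‖². The book's accompanying programs are in Matlab; this Python version is illustrative only, and the function name and parameter choices are this sketch's own, not the book's.

```python
import numpy as np

def ridge_gd(X, y, lam=0.1, lr=0.01, steps=5000):
    """Gradient descent for ridge regression:
    minimize (1/2)||Xw - y||^2 + (lam/2)||w||^2.
    The gradient is X^T (Xw - y) + lam * w.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) + lam * w
        w -= lr * grad
    return w
```

Because the objective is strongly convex, the iterates converge to the unique minimizer, which can be checked against the closed-form solution w = (XᵀX + λI)⁻¹ Xᵀy.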

Read on Desktop, Tablet and Mobile

More in Algebra

Enriques Surfaces I - François Cossec

eTEXT

The Monodromy Group - Henryk Żołądek

eTEXT

Finite Groups I - Bertram Huppert

eTEXT

$349.00