Optimization Algorithms for Distributed Machine Learning : eBColl Synthesis Collection 12 - Gauri Joshi

Optimization Algorithms for Distributed Machine Learning

By: Gauri Joshi

eText | 25 November 2022

At a Glance

eText


$74.99



This book discusses state-of-the-art stochastic optimization algorithms for distributed machine learning and analyzes their convergence speed. The book first introduces stochastic gradient descent (SGD) and its distributed version, synchronous SGD, where the task of computing gradients is divided across several worker nodes. The author discusses several algorithms that improve the scalability and communication efficiency of synchronous SGD, such as asynchronous SGD, local-update SGD, quantized and sparsified SGD, and decentralized SGD. For each of these algorithms, the book analyzes its error-versus-iterations convergence and the runtime spent per iteration. The author shows that each of these strategies to reduce communication or synchronization delays encounters a fundamental trade-off between error and runtime.
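To make the synchronous-SGD setup described above concrete, here is a minimal sketch (not taken from the book): each of several simulated workers computes a minibatch gradient of a squared loss on its own data shard, and a server averages the gradients and applies one update per round. The function name, shard sizes, and learning rate are illustrative assumptions.

```python
import numpy as np

def synchronous_sgd(X, y, num_workers=4, lr=0.1, iters=200, seed=0):
    """Illustrative synchronous SGD: workers compute minibatch gradients
    of the squared loss on disjoint data shards; the server averages the
    gradients and takes one step per synchronized round."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    # partition the data across workers (hypothetical even split)
    shards = np.array_split(np.arange(len(X)), num_workers)
    for _ in range(iters):
        grads = []
        for idx in shards:
            # each worker samples a minibatch from its own shard
            batch = rng.choice(idx, size=min(8, len(idx)), replace=False)
            Xb, yb = X[batch], y[batch]
            grads.append(2 * Xb.T @ (Xb @ w - yb) / len(batch))
        # synchronization barrier: average all worker gradients, then update
        w -= lr * np.mean(grads, axis=0)
    return w
```

The barrier at the averaging step is exactly what the asynchronous and local-update variants discussed in the book relax, trading synchronization delay against gradient staleness or divergence between workers.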

