
Bandit Algorithms

By: Tor Lattimore, Csaba Szepesvári

eText | 16 July 2020

At a Glance

eText


$75.96

Instant online reading in your Booktopia eTextbook Library *

* eTextbooks are not downloadable to your eReader or an app and can be accessed via web browsers only. You must be connected to the internet, and your device and browser must be free of technical issues that could prevent the eTextbook from operating.

Description

Decision-making in the face of uncertainty is a significant challenge in machine learning, and the multi-armed bandit model is a commonly used framework to address it. This comprehensive and rigorous introduction to the multi-armed bandit problem examines all the major settings, including stochastic, adversarial, and Bayesian frameworks. A focus on both mathematical intuition and carefully worked proofs makes this an excellent reference for established researchers and a helpful resource for graduate students in computer science, engineering, statistics, applied mathematics and economics. Linear bandits receive special attention as one of the most useful models in applications, while other chapters are dedicated to combinatorial bandits, ranking, non-stationary problems, Thompson sampling and pure exploration. The book ends with a peek into the world beyond bandits with an introduction to partial monitoring and learning in Markov decision processes.
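To make the setting concrete, here is a minimal sketch (not taken from the book; the arm means, horizon, and function name are illustrative choices) of Thompson sampling on Bernoulli-reward arms: each arm keeps a Beta posterior over its unknown mean, the learner draws a sample from every posterior and plays the arm whose sample is largest.

```python
import random

def thompson_sampling(true_means, horizon, seed=0):
    """Thompson sampling for Bernoulli-reward arms (illustrative sketch).

    Maintains a Beta(successes + 1, failures + 1) posterior per arm,
    samples a mean from each posterior, and plays the largest sample.
    """
    rng = random.Random(seed)
    k = len(true_means)
    successes = [0] * k
    failures = [0] * k
    total_reward = 0
    for _ in range(horizon):
        # Draw one sample per arm from its Beta posterior; play the best.
        samples = [rng.betavariate(successes[i] + 1, failures[i] + 1)
                   for i in range(k)]
        arm = max(range(k), key=lambda i: samples[i])
        # Observe a Bernoulli reward from the chosen arm's unknown mean.
        reward = 1 if rng.random() < true_means[arm] else 0
        successes[arm] += reward
        failures[arm] += 1 - reward
        total_reward += reward
    return total_reward

# Example with three arms of illustrative means 0.3, 0.5, 0.7: as the
# horizon grows, play concentrates on the best arm.
print(thompson_sampling([0.3, 0.5, 0.7], horizon=10_000))
```

Over a long horizon the posteriors concentrate on the best arm, and quantifying how quickly that happens (the regret) is the kind of analysis the book develops for this and the other settings listed above.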
