• Corpus ID: 20541356

Understanding Machine Learning - From Theory to Algorithms

@book{ShalevShwartz2014Understanding,
  title={Understanding Machine Learning - From Theory to Algorithms},
  author={Shai Shalev-Shwartz and Shai Ben-David},
  publisher={Cambridge University Press},
  year={2014}
}
Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides an extensive theoretical account of the fundamental ideas underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics of the field, the book covers a wide… 
Mathematics for Machine Learning
This self-contained textbook bridges the gap between mathematical and machine learning texts, introducing the mathematical concepts with a minimum of prerequisites to derive four central machine learning methods.
Statistical Computational Learning
This chapter provides the formal background on statistical learning problems and surveys several theoretical results and algorithms in the areas of concept learning and convex learning, which occupy a central place in statistical computational learning.
A Falsificationist Account of Artificial Neural Networks
It is argued that the idea of falsification is central to the methodology of machine learning, and taking both aspects together gives rise to a falsificationist account of artificial neural networks.
A Brief Introduction to Machine Learning for Engineers
  • O. Simeone
  • Computer Science
    Found. Trends Signal Process.
  • 2018
This monograph aims at providing an introduction to key concepts, algorithms, and theoretical results in machine learning by building on first principles, while also exposing the reader to more advanced topics with extensive pointers to the literature within a unified notation and mathematical framework.
A PAC Approach to Application-Specific Algorithm Selection
Concepts from statistical and online learning theory are adapted to reason about application-specific algorithm selection, and dimension notions from statistical learning theory, historically used to measure the complexity of classes of binary- and real-valued functions, are relevant in a much broader algorithmic context.
Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
These algorithms have their primary application in training supervised machine learning models via regularized empirical risk minimization, the dominant paradigm for training such models, but they can also be applied in many other fields, including but not limited to data science, engineering, scientific computing, and statistics.
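Regularized empirical risk minimization, as mentioned in the snippet above, can be illustrated with a minimal sketch. This is not code from the cited work; it is a generic ridge-regression example (squared loss plus an L2 penalty, minimized by plain gradient descent), with the function name `ridge_erm` and all hyperparameter values chosen here for illustration:

```python
import numpy as np

def ridge_erm(X, y, lam=0.1, lr=0.1, steps=500):
    """Regularized ERM for linear regression:
    minimize (1/n) * ||Xw - y||^2 + lam * ||w||^2 via gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        # Gradient of the empirical risk plus the L2 regularizer
        grad = (2.0 / n) * X.T @ (X @ w - y) + 2.0 * lam * w
        w -= lr * grad
    return w

# Synthetic data: linear model with small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

w_hat = ridge_erm(X, y, lam=0.01)
print(w_hat)  # close to w_true, shrunk slightly toward zero by the penalty
```

The regularizer trades a small amount of training-set fit for stability of the learned parameters, which is the core idea behind the generalization guarantees such algorithms enjoy.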
Neural Network Methods for Natural Language Processing
This book focuses on the application of neural network models to natural language data, and introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models.
A New Perspective on Machine Learning: How to do Perfect Supervised Learning
This work's theoretical analysis shows that all practical machine learning tasks are asymptotically solvable in a perfect sense, and derives new error bounds for perfect learning that can quantify the difficulty of learning.
Mean-field inference methods for neural networks
  • Marylou Gabrié
  • Computer Science
    Journal of Physics A: Mathematical and Theoretical
  • 2020
A selection of classical mean-field methods and recent progress relevant to inference in neural networks is reviewed; the principles behind derivations of high-temperature expansions, the replica method, and message-passing algorithms are recalled, highlighting their equivalences and complementarities.


Boosting: Foundations and Algorithms
This book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics.
Neural Network Learning - Theoretical Foundations
The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification and in real prediction, and discuss the computational complexity of neural network learning.
Machine learning - a probabilistic perspective
  • K. Murphy
  • Computer Science
    Adaptive computation and machine learning series
  • 2012
This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
An introduction to Support Vector Machines
This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory.
Bayesian reasoning and machine learning
Comprehensive and coherent, this hands-on text develops everything from basic reasoning to advanced techniques within the framework of graphical models, and builds the analytical and problem-solving skills that equip readers for the real world.
Nearest-Neighbor Methods in Learning and Vision: Theory and Practice (Neural Information Processing)
This volume presents theoretical and practical discussions of nearest-neighbor (NN) methods in machine learning and examines computer vision as an application domain in which the benefit of these advanced methods is often dramatic.
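The nearest-neighbor methods discussed in that volume can be sketched in a few lines. This is a generic k-nearest-neighbor classifier, not code from the cited book; the function name `knn_predict` and the toy data are illustrative assumptions:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 2-D data: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.15, 0.1])))   # → 0
print(knn_predict(X, y, np.array([1.05, 0.95])))  # → 1
```

The brute-force distance computation shown here is the naive baseline; the practical interest of the volume lies in data structures and approximations that make such queries fast in high dimensions.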
Introduction to the Theory of Computation
Throughout the book, Sipser builds students' knowledge of conceptual tools used in computer science, the aesthetic sense they need to create elegant systems, and the ability to think through problems on their own.
Online Learning: Theory, Algorithms, and Applications
This dissertation describes a novel framework for the design and analysis of online learning algorithms, and proposes a new perspective on regret bounds based on the notion of duality in convex optimization.
Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications
  • D. Haussler
  • Mathematics, Computer Science
    Inf. Comput.
  • 1992