Boosting Algorithms for Maximizing the Soft Margin

Abstract

Algorithm 1: SoftBoost
1. Input: S = ⟨(x_1, y_1), …, (x_N, y_N)⟩, desired accuracy δ, and capping parameter ν ∈ [1, N].
2. Initialize: d^0 to the uniform distribution.
3. Do for t = 1, …
   (a) Train the classifier on d^{t−1} and {u^1, …, u^{t−1}} and obtain hypothesis h^t. Set u^t_n = h^t(x_n) y_n.
   (b) Calculate the edge γ_t of h^t: γ_t = d^{t−1} · u^t.
   (c) Set γ̂_t = (min_{m=1,…,t} γ_m) − δ.
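The extracted algorithm box cuts off after step (c). As a rough illustration only, here is a minimal Python sketch of the loop as shown, with the distribution update filled in under the assumption that it is the usual SoftBoost step: a relative-entropy projection of d^0 onto the ν-capped probability simplex intersected with the edge constraints d · u^m ≤ γ̂_t, followed by termination when that projection is infeasible or produces a zero weight. The `train_weak` function and its `.predict` interface are hypothetical stand-ins for the weak learner, and the generic SLSQP solver stands in for whatever projection routine the authors actually use.

```python
import numpy as np
from scipy.optimize import minimize


def softboost(X, y, train_weak, delta=0.05, nu=1.0, max_iter=50):
    """Sketch of Algorithm 1. `train_weak(X, y, d)` is an assumed
    interface returning a hypothesis with .predict(X) -> {-1, +1}."""
    N = len(y)
    d0 = np.full(N, 1.0 / N)            # step 2: uniform d^0
    d = d0.copy()
    U, edges, hypotheses = [], [], []
    for t in range(1, max_iter + 1):
        h = train_weak(X, y, d)         # step (a): train on d^{t-1}
        u = h.predict(X) * y            # u^t_n = h^t(x_n) y_n
        hypotheses.append(h)
        U.append(u)
        edges.append(float(d @ u))      # step (b): edge gamma_t
        gamma_hat = min(edges) - delta  # step (c)

        # Assumed step (d): relative-entropy projection of d^0 onto
        # {d : sum(d) = 1, 0 <= d_n <= 1/nu, d . u^m <= gamma_hat for m <= t}.
        Umat = np.asarray(U)
        kl = lambda v: float(np.sum(v * np.log(np.maximum(v, 1e-12) / d0)))
        cons = ({"type": "eq", "fun": lambda v: v.sum() - 1.0},
                {"type": "ineq", "fun": lambda v: gamma_hat - Umat @ v})
        res = minimize(kl, d, method="SLSQP",
                       bounds=[(0.0, 1.0 / nu)] * N, constraints=cons)

        # Assumed step (e): stop when the constraints become infeasible
        # or the new distribution contains a zero.
        if not res.success or np.any(res.x <= 1e-12):
            break
        d = res.x
    return hypotheses, edges
```

In the paper the returned hypotheses would then be combined into a final classifier f_w(x) = Σ_m w_m h^m(x), with the weights w_m chosen to maximize the soft margin; that combination step is omitted from the sketch since it falls outside the extracted text.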
