Sijia Cai

Lemma 1. Denote by $\bar{z}_c$ and $\bar{z}$ the mean vectors of $Z_c$ and $Z$, respectively, where $Z_c$ is the set of coding vectors of samples from class $c$. Then $L(Z)$ in FDDL is equivalent to the weighted sum of the squared distances between pairs of coding vectors:
\[
L(Z) = \sum_{c=1}^{C} \Bigg( \sum_{y_i = c,\, y_j = c} \Big( \frac{1}{n_c} - \frac{1}{2n} \Big) \| z_i - z_j \|_2^2 \; + \sum_{y_i = c,\, y_j \neq c} \Big( -\frac{1}{2n} \Big) \| z_i - z_j \|_2^2 \Bigg).
\]
Proof: …
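The identity in Lemma 1 can be checked numerically. Assuming $L(Z)$ denotes the Fisher discrimination term $\mathrm{tr}(S_W(Z)) - \mathrm{tr}(S_B(Z))$ as in FDDL (the snippet above does not restate the definition), the sketch below compares it against the pairwise form, with the sums taken over ordered pairs $(i, j)$:

```python
import numpy as np

rng = np.random.default_rng(0)
C, d = 3, 4
counts = [5, 7, 6]                       # n_c: samples per class
Z = rng.normal(size=(sum(counts), d))    # coding vectors, one per row
y = np.repeat(np.arange(C), counts)
n = len(y)

# Fisher term: trace of within-class scatter minus between-class scatter.
z_bar = Z.mean(axis=0)
tr_SW = sum(((Z[y == c] - Z[y == c].mean(axis=0)) ** 2).sum() for c in range(C))
tr_SB = sum((y == c).sum() * ((Z[y == c].mean(axis=0) - z_bar) ** 2).sum()
            for c in range(C))
fisher = tr_SW - tr_SB

# Pairwise form from Lemma 1, over ordered pairs (i, j).
D2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)  # squared distances
pairwise = 0.0
for c in range(C):
    n_c = (y == c).sum()
    same = (y[:, None] == c) & (y[None, :] == c)   # y_i = c, y_j = c
    diff = (y[:, None] == c) & (y[None, :] != c)   # y_i = c, y_j != c
    pairwise += (1 / n_c - 1 / (2 * n)) * D2[same].sum() \
                - 1 / (2 * n) * D2[diff].sum()

assert np.isclose(fisher, pairwise)
```

The check rests on the standard decomposition of the total scatter into within-class and between-class parts, with each scatter trace rewritten as a weighted sum of pairwise squared distances.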
Discriminative dictionary learning aims to learn a dictionary from training samples to enhance the discriminative capability of their coding vectors. Several discrimination terms have been proposed by assessing the prediction loss (e.g., logistic regression) or class separation criterion (e.g., Fisher discrimination criterion) on the coding vectors. In this …
Conventional representation based classifiers, ranging from the classical nearest neighbor classifier and nearest subspace classifier to the recently developed sparse representation based classifier (SRC) and collaborative representation based classifier (CRC), are essentially distance based classifiers. Though SRC and CRC have shown interesting …
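To make the "distance based" reading concrete, here is a minimal sketch of CRC's decision rule under the standard $\ell_2$-regularized formulation: the test sample is coded collaboratively over all training samples with ridge regression, and the class with the smallest reconstruction residual wins. The function name and interface are hypothetical, not from the paper:

```python
import numpy as np

def crc_classify(D, labels, y, lam=1e-2):
    """Sketch of a collaborative representation based classifier (CRC).

    D      : (d, N) dictionary with one training sample per column
    labels : (N,) class label of each column
    y      : (d,) test sample
    lam    : ridge regularization weight (assumed hyperparameter)
    """
    # Code y over ALL classes jointly: min ||y - D a||^2 + lam ||a||^2.
    alpha = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    # Distance-based decision: smallest class-wise reconstruction residual.
    classes = np.unique(labels)
    residuals = [np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
                 for c in classes]
    return classes[int(np.argmin(residuals))]
```

Replacing the ridge penalty with an $\ell_1$ penalty on the code recovers the SRC variant; the residual-comparison step, which is what makes both classifiers distance based, stays the same.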
Background modeling is a critical component for various vision-based applications. Most traditional methods tend to be inefficient when solving large-scale problems. In this paper, we introduce sparse representation into the task of large-scale stable-background modeling, and reduce the video size by exploring its "discriminative" frames. A cyclic …