Local Sample-weighted Multiple Kernel Clustering with Consensus Discriminative Graph

@article{Li2022LocalSM,
  title={Local Sample-weighted Multiple Kernel Clustering with Consensus Discriminative Graph},
  author={Liangchi Li and Siwei Wang and Xinwang Liu and En Zhu and Li Shen and Kenli Li and Kuan-Ching Li},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2022},
  volume={PP}
}
Multiple kernel clustering (MKC) aims to achieve optimal information fusion from a set of base kernels. Constructing precise, localized kernel matrices has proven to be of vital significance in applications, since unreliable similarity estimation between distant sample pairs degrades clustering performance. Although existing localized MKC algorithms exhibit improved performance compared with globally designed competitors, most of them adopt the KNN mechanism to localize the kernel matrix…
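As a rough, hedged illustration of the KNN-based localization idea mentioned in the abstract (not the authors' algorithm), the sketch below combines base kernels with given weights and then masks out entries outside each sample's k-nearest-neighbour set; all function names and parameters are hypothetical.

```python
import numpy as np

def knn_mask(K, k):
    """Boolean mask keeping, for each sample, only its k most similar samples
    (hypothetical helper; larger kernel value = more similar)."""
    n = K.shape[0]
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        order = np.argsort(-K[i])                  # most similar first
        neighbours = [j for j in order if j != i][:k]
        mask[i, neighbours] = True
    return mask | mask.T                           # symmetrise the neighbourhood

def localized_combined_kernel(kernels, weights, k=10):
    """Weighted sum of base kernels with unreliable long-range entries zeroed out,
    a simple stand-in for 'localizing' the kernel matrix via KNN."""
    K = sum(w * Km for w, Km in zip(weights, kernels))
    return K * knn_mask(K, k)
```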

Citations

Sample Weighted Multiple Kernel K-means via Min-Max optimization

A novel sample-weighted multiple kernel k-means via min-max optimization (SWMKKM) is proposed, which represents each sample's weight by the sum of its relationships with the other samples; this helps the clustering algorithm pay more attention to samples with a more positive effect on clustering and avoids unreliably overestimating poor-quality samples.
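A minimal sketch of the weighting idea described above, assuming a sample's "relationship with the others" is measured by its row sum in the combined kernel; the min-max optimisation itself is omitted and all names are hypothetical.

```python
import numpy as np

def sample_weights(K):
    """Weight each sample by the sum of its similarities to all other samples
    (row sum of the combined kernel minus the diagonal), so that well-connected
    samples receive larger weights and poor-quality samples are not overestimated."""
    s = K.sum(axis=1) - np.diag(K)   # relationship of each sample to the others
    return s / s.sum()               # normalise the weights
```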

Multiple Kernel Clustering with Dual Noise Minimization

This paper rigorously defines dual noise, proposes a novel parameter-free MKC algorithm that surpasses recent methods by large margins, and designs an efficient two-step iterative strategy to solve the resultant optimization problem.

Hard Sample Aware Network for Contrastive Deep Graph Clustering

A novel contrastive deep graph clustering method dubbed Hard Sample Aware Network (HSAN) is proposed, introducing a comprehensive similarity measure criterion and a general dynamic sample weighting strategy that mines not only hard negative samples but also hard positive samples, further improving the discriminative capability of the sample representations.

Cluster-guided Contrastive Graph Clustering Network

A Cluster-guided Contrastive deep Graph Clustering network (CCGC) is proposed that mines the intrinsic supervision information in high-confidence clustering results to improve the discriminative capability and reliability of the constructed sample pairs, and designs an objective function that pulls together samples from the same cluster while pushing away those from other clusters.

Interpolation-based Contrastive Learning for Few-Label Semi-Supervised Learning

An interpolation-based method is proposed to construct more reliable positive sample pairs, together with a novel contrastive loss that guides the embeddings of the learned network to change linearly between samples, improving the network's discriminative capability by enlarging the margins of the decision boundaries.

Continual Multi-view Clustering

This work proposes a continual approach built on the late fusion multi-view clustering framework that only needs to maintain a consensus partition matrix and update it with the partition of each newly arriving data view, rather than keeping all of them.
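To give a concrete flavour of the "maintain one consensus partition and update it per incoming view" idea, here is a toy late-fusion update that aligns the new view's partition to the consensus with an orthogonal Procrustes rotation and averages it in; this is an assumed simplification, not the paper's update rule, and every name is hypothetical.

```python
import numpy as np

def update_consensus(consensus, new_partition, t):
    """Toy continual late-fusion step: rotate the incoming view's partition matrix
    onto the current consensus (orthogonal Procrustes) and fold it into a running
    average, so only the consensus is stored rather than all previous views."""
    U, _, Vt = np.linalg.svd(new_partition.T @ consensus)
    aligned = new_partition @ (U @ Vt)            # best orthogonal alignment to the consensus
    return (t * consensus + aligned) / (t + 1)    # t = number of views already fused
```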

References

SHOWING 1-10 OF 66 REFERENCES

Simultaneous Global and Local Graph Structure Preserving for Multiple Kernel Clustering

A novel MKC method, structure-preserving multiple kernel clustering (SPMKC), is proposed with a new kernel affine weight strategy that learns an optimal consensus kernel from a predefined kernel pool and automatically assigns a suitable weight to each base kernel.

Localized Incomplete Multiple Kernel k-means

Different from existing MKKM-IK methods, LI-MKKM only requires the similarity of a sample to its k-nearest neighbors to align with their ideal similarity values, which helps the clustering algorithm focus on closer sample pairs that should stay together and avoids unreliable similarity evaluation for farther sample pairs.

Localized Simple Multiple Kernel K-means

A novel MKC algorithm with "local" kernel alignment is proposed, which only requires the similarity of a sample to its k-nearest neighbours to be aligned with the ideal similarity matrix; this helps the clustering algorithm focus on closer sample pairs that should stay together and avoids unreliable similarity evaluation for farther sample pairs.
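As a toy illustration of local kernel alignment (a simplified stand-in, not the algorithm from this paper), the snippet below scores a base kernel only on each sample's k-nearest-neighbour block against an ideal similarity matrix H, which is assumed here to be built from the current clustering partition; the names are hypothetical.

```python
import numpy as np

def local_alignment(K, H, k=10):
    """Sum of per-sample alignments between kernel K and ideal similarity H,
    restricted to each sample's k nearest neighbours in K (toy local alignment)."""
    n = K.shape[0]
    score = 0.0
    for i in range(n):
        nn = np.argsort(-K[i])[:k + 1]            # the sample itself plus its k neighbours
        Ki, Hi = K[np.ix_(nn, nn)], H[np.ix_(nn, nn)]
        score += (Ki * Hi).sum() / (np.linalg.norm(Ki) * np.linalg.norm(Hi) + 1e-12)
    return score
```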

Consensus Affinity Graph Learning for Multiple Kernel Clustering

This article proposes a new MKGC method that learns a consensus affinity graph directly via a thin auto-weighted fusion model, in which a self-tuned Laplacian rank constraint and a top-$k$ neighbors sparse strategy are introduced to improve the quality of the consensus affinity graph for accurate clustering.
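The "top-$k$ neighbors sparse strategy" can be pictured with the small sketch below, which keeps only the k largest off-diagonal affinities per row of a fused graph and re-symmetrises it; the auto-weighted fusion and the Laplacian rank constraint are left out, and the names are assumptions.

```python
import numpy as np

def topk_sparse_graph(A, k=10):
    """Keep the k largest off-diagonal affinities in each row of A, zero the rest,
    and symmetrise the result (a toy top-k sparsification of an affinity graph)."""
    A = A.copy()
    np.fill_diagonal(A, 0.0)
    S = np.zeros_like(A)
    for i in range(A.shape[0]):
        keep = np.argsort(-A[i])[:k]
        S[i, keep] = A[i, keep]
    return (S + S.T) / 2
```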

Multiple Kernel Clustering With Neighbor-Kernel Subspace Segmentation

A simple yet effective neighbor-kernel-based MKC algorithm is proposed, which back-projects the solution of the unconstrained counterpart onto its principal components and, through careful theoretical analysis, reveals an interesting insight into the exact-rank constraint in ridge regression.

Multiple Kernel k-Means Clustering with Matrix-Induced Regularization

This paper proposes an MKKM clustering algorithm with a novel, effective matrix-induced regularization to reduce the redundancy and enhance the diversity of the selected kernels, and shows that maximizing the kernel alignment for clustering can be viewed as a special case of this approach.
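A rough sketch of the kind of kernel-weight subproblem the summary points to, assuming H is a column-orthonormal partition/embedding matrix and lam a trade-off parameter: a per-kernel clustering cost is traded off against a matrix-induced redundancy penalty over the probability simplex. This is a generic illustration solved with an off-the-shelf optimiser, not the paper's exact solver, and all names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def kernel_weights_with_regularization(kernels, H, lam=1.0):
    """Toy kernel-weight update: minimise the combined kernel's k-means-style cost
    plus a matrix-induced penalty on selecting mutually redundant kernels,
    with the weights constrained to the probability simplex."""
    m = len(kernels)
    P = np.eye(H.shape[0]) - H @ H.T                          # residual projector from partition H
    a = np.array([np.trace(K @ P) for K in kernels])           # per-kernel clustering cost
    M = np.array([[np.trace(Kp @ Kq) for Kq in kernels]        # pairwise kernel "redundancy"
                  for Kp in kernels])
    obj = lambda mu: (mu ** 2) @ a + 0.5 * lam * mu @ M @ mu
    cons = ({'type': 'eq', 'fun': lambda mu: mu.sum() - 1.0},)
    res = minimize(obj, np.full(m, 1.0 / m), bounds=[(0.0, 1.0)] * m, constraints=cons)
    return res.x
```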

Projective Multiple Kernel Subspace Clustering

By incorporating the intrinsic structures of multi-view data, PMKSC alleviates the noise and redundancy in the original kernel space and obtains high-quality similarities that uncover the underlying clustering structures.

Multiple Kernel Clustering with Local Kernel Alignment Maximization

A novel MKC algorithm with "local" kernel alignment maximization is proposed, which only requires the similarity of a sample to its k-nearest neighbours to be aligned with the ideal similarity matrix; this helps the clustering algorithm focus on closer sample pairs that should stay together and avoids unreliable similarity evaluation for farther sample pairs.

Late Fusion Multiple Kernel Clustering With Proxy Graph Refinement

This article theoretically revisits the connection between late-fusion kernel base partitions and traditional spectral embedding, and constructs a proxy self-expressive graph from the kernel base partitions, which refines the individual kernel partitions and captures partition relations in a graph structure rather than through a simple linear transformation.
...