Corpus ID: 598018

Some Competitive Learning Methods

@inproceedings{Fritzke1997SomeCL,
  title={Some Competitive Learning Methods},
  author={Bernd Fritzke},
  year={1997}
}
(Some additions and refinements are planned for this document, so it will stay in draft status for a while.) Comments are welcome.

Abstract: This report has the purpose of describing several algorithms from the literature, all related to competitive learning. A uniform terminology is used for all methods. Moreover, identical examples are provided to allow a qualitative comparison of the methods. The on-line version of this document contains hyperlinks to Java implementations of several…
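As a rough illustration of the core these methods share, a single hard competitive learning step (winner-take-all: only the reference vector nearest the input is adapted) might be sketched as follows in Python. Function names and the learning rate are illustrative choices, not taken from the report:

```python
import numpy as np

def competitive_learning_step(units, x, lr=0.05):
    """One hard competitive learning step: find the reference vector
    (unit) closest to the input x and move only that winner toward x."""
    winner = int(np.argmin(np.linalg.norm(units - x, axis=1)))
    units[winner] += lr * (x - units[winner])
    return winner

# Toy run: 4 reference vectors adapt to 500 random 2-D inputs.
rng = np.random.default_rng(0)
units = rng.normal(size=(4, 2))
for x in rng.normal(size=(500, 2)):
    competitive_learning_step(units, x)
```

The methods surveyed in the report differ mainly in which additional units are adapted (neighborhoods, rankings) and in whether the set of units is fixed or grows.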
A Mixed Ensemble Approach for the Semi-supervised Problem
This approach consists of an ensemble unsupervised learning part, in which the labeled and unlabeled points are segmented into clusters; it takes advantage of the a priori information of the labeled points to assign classes to clusters, and then predicts classes for new incoming points with the ensemble method.
Architecture for graphical maps of Web contents
Efficient visualization of large collections of documents is a primary topic in intelligent navigation through Internet resources. The paper describes a set of tools explored in the course of…
Multi-topographic neural network communication and generalization for multi-viewpoint analysis
  • S. A. Shehabi, J. Lamirel
  • Computer Science
  • Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
  • 2005
A new generic multi-topographic neural network model, whose main area of application is clustering and knowledge extraction on documentary data, is presented; it is shown how its generalization mechanism and its mechanism of communication between topographies can be exploited within the framework of the SOM and NG models.
Clustering algorithms for scenario tree generation: Application to natural hydro inflows
This article uses a two-phase procedure to create the scenario tree: the first phase produces a tree that accurately represents the original probability distribution, and in the second phase that tree is reduced to make it tractable.
A visual data-mining methodology for seismic-facies analysis
Seismic facies analysis aims to identify clusters (groups) of similar seismic trace shapes, where each cluster can be considered to represent variability in lithology, rock properties, and/or fluid…
Context-based user grouping for multicasting in heterogeneous radio networks (Advances in Radio Science)
Along with the rise of sophisticated smartphones and smart spaces, the availability of both static and dynamic context information has steadily been increasing in recent years. Due to the popularity…
Operational Two-Stage Stratified Topographic Correction of Spaceborne Multispectral Imagery Employing an Automatic Spectral-Rule-Based Decision-Tree Preliminary Classifier
The novel operational two-stage SNLTOC system is presented, and its capability of reducing within-stratum spectral variance while preserving pixel-based spectral patterns (shapes) is assessed quantitatively.
Fuzzy voting in clustering
In this paper we present a fuzzy voting scheme for cluster algorithms. This fuzzy voting method allows us to combine several runs of cluster algorithms, resulting in a common fuzzy partition. This…

References

Showing 1-10 of 32 references
Incremental Learning of Local Linear Mappings
Simulation results for several classification tasks indicate fast convergence as well as good generalization in a new incremental network model for supervised learning.
A Growing Neural Gas Network Learns Topologies
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to…
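The Hebb-like rule referred to here is competitive Hebbian learning: for each input, the nearest and second-nearest units are connected by an edge. A minimal sketch (topology induction only, without GNG's error-driven unit insertion or edge aging; names are illustrative):

```python
import numpy as np

def competitive_hebbian_edge(units, x):
    """Competitive Hebbian rule: for input x, return the pair
    (nearest unit, second-nearest unit); these two get connected."""
    order = np.argsort(np.linalg.norm(units - x, axis=1))
    s1, s2 = int(order[0]), int(order[1])
    return (min(s1, s2), max(s1, s2))

# Toy run: edges accumulate between units that are nearest pairs
# for some input, tracing the topology of the data.
units = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]])
edges = set()
for x in [np.array([0.4, 0.1]), np.array([4.0, 4.5])]:
    edges.add(competitive_hebbian_edge(units, x))
```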
Fast adaptive k-means clustering: some empirical results
  • C. Darken, J. Moody
  • Computer Science
  • 1990 IJCNN International Joint Conference on Neural Networks
  • 1990
The authors present learning rate schedules for fast adaptive k-means clustering which surpass the standard MacQueen learning rate schedule (J. MacQueen, 1967) in speed and quality of solution by…
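For reference, the MacQueen schedule this work compares against adapts each winning center with a learning rate of 1/n, where n counts how often that center has won. A hedged sketch (names and the count initialization are illustrative):

```python
import numpy as np

def macqueen_step(centers, counts, x):
    """One step of MacQueen's online k-means: the winning center is
    moved toward x with learning rate 1/(number of wins so far), so
    each center is always the exact mean of the inputs it has won."""
    w = int(np.argmin(np.linalg.norm(centers - x, axis=1)))
    counts[w] += 1
    centers[w] += (x - centers[w]) / counts[w]
    return w
```

The 1/n decay makes early inputs move a center a lot and later inputs very little, which is what the faster schedules in this paper are designed to improve on.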
Fast Learning in Networks of Locally-Tuned Processing Units
We propose a network architecture which uses a single internal layer of locally-tuned processing units to learn both classification tasks and real-valued function approximations (Moody and Darken…
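A locally-tuned unit in this sense is typically a Gaussian bump that responds strongly only near its center. A minimal sketch of the forward pass of such a single-hidden-layer network (training of centers, widths, and weights is omitted; parameter names are illustrative):

```python
import numpy as np

def rbf_predict(x, centers, widths, weights):
    """Forward pass of a network of locally-tuned (Gaussian) units:
    each unit's activation decays with distance from its center, and
    the output is a weighted linear sum of the activations."""
    sq_dist = np.sum((x - centers) ** 2, axis=1)
    activations = np.exp(-sq_dist / (2.0 * widths ** 2))
    return float(activations @ weights)
```

The connection to competitive learning is that the unit centers are commonly placed by an unsupervised method such as k-means before the output weights are fit.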
Topology representing networks
This competitive Hebbian rule provides a novel approach to the problem of constructing topology-preserving feature maps and representing intricately structured manifolds, which makes it particularly useful in all applications where neighborhood relations have to be exploited or the shape and topology of submanifolds have to be taken into account.
Improving the Learning Speed in Topological Maps of Patterns
A method of improving the learning speed by starting the map with very few units and increasing that number progressively until the map reaches its final size, which dramatically reduces the time needed for the "unfolding" phase and also yields some improvements in the asymptotic convergence phase.
Adding Learned Expectation Into the Learning Procedure of Self-Organizing Maps
  • Lei Xu
  • Computer Science
  • Int. J. Neural Syst.
  • 1990
The self-organizing topological map is generalized by adding a learned expectation to its learning procedure, in order to improve its stability in nonstationary environments with unexpected inputs…
Adding a conscience to competitive learning
  • Duane DeSieno
  • Computer Science
  • IEEE 1988 International Conference on Neural Networks
  • 1988
The author introduces a modification of Kohonen learning that provides rapid convergence and improved representation of the input data, forming a better approximation of p(x) in many areas of pattern recognition, statistical analysis, and control.
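The conscience mechanism penalizes units that win too often, so that all units end up winning with roughly equal frequency and thus approximate p(x). A sketch of the biased winner selection (the bias constant and frequency update rate are illustrative values, not DeSieno's exact settings):

```python
import numpy as np

def conscience_winner(units, win_freq, x, bias=10.0):
    """Conscience-style winner selection: subtract a bias proportional
    to (1/N - p_j) from each unit's distance, where p_j is unit j's
    running win frequency, then pick the minimum. Units that win more
    often than 1/N of the time are handicapped."""
    n = len(units)
    d = np.linalg.norm(units - x, axis=1)
    w = int(np.argmin(d - bias * (1.0 / n - win_freq)))
    # Update running win frequencies toward the winner indicator.
    win_freq += 0.01 * ((np.arange(n) == w) - win_freq)
    return w
```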
Incremental grid growing: encoding high-dimensional structure into a two-dimensional feature map
In the proposed approach, nodes are added incrementally to a regular two-dimensional grid, which is drawable at all times irrespective of the dimensionality of the input space, resulting in a map that explicitly represents the cluster structure of the high-dimensional input.
Some methods for classification and analysis of multivariate observations
The main purpose of this paper is to describe a process for partitioning an N-dimensional population into k sets on the basis of a sample. The process, which is called 'k-means,' appears to give…
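The partitioning process described can be sketched as the batch iteration commonly run today (often called Lloyd's algorithm; MacQueen's original procedure updates the means one sample at a time). Initialization and iteration count below are illustrative choices:

```python
import numpy as np

def k_means(data, k, iters=20, seed=0):
    """Partition data into k sets, each represented by its mean:
    alternately assign every point to its nearest mean, then
    recompute each mean from the points assigned to it."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return centers, labels
```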