Corpus ID: 14982670

Medical datamining with probabilistic classifiers

@inproceedings{Abraham2007MedicalDW,
  title={Medical datamining with probabilistic classifiers},
  author={Ranjit Abraham and Jay B. Simha and S. Sitharama Iyengar},
  year={2007}
}

Figures and Tables from this paper

Using Medical History Embedded in Biometrics Medical Card for User Identity Authentication: Privacy Preserving Authentication Model by Features Matching
TLDR
This paper introduces the design of a privacy-preserving authentication model based on backward inference, which can verify whether a matching record is stored in the user's smart card.
Measuring Similarity by Prediction Class between Biomedical Datasets via Fuzzy Unordered Rule Induction
TLDR
This paper presents a novel scheme for measuring the similarity of two datasets by prediction class, namely SPC, which uses a machine learning model called Fuzzy Unordered Rule Induction to infer the similarity between two datasets from their common attributes and their degrees of relevance to a predicted class.

References

SHOWING 1-10 OF 25 REFERENCES
C4.5: Programs for Machine Learning
TLDR
A complete guide to the C4.5 system as implemented in C for the UNIX environment, which starts from simple core learning methods and shows how they can be elaborated and extended to deal with typical problems such as missing data and overfitting.
Support-Vector Networks
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Boosting and Naive Bayesian learning
TLDR
It is shown that boosting applied to naive Bayesian classifiers yields combination classifiers that are representationally equivalent to standard feedforward multilayer perceptrons, which are highly plausible computationally as models of animal learning.
The Power of Decision Tables
TLDR
Experimental results show that on artificial and real-world domains containing only discrete features, IDTM, an algorithm inducing decision tables, can sometimes outperform state-of-the-art algorithms such as C4.5.
Ridge Estimators in Logistic Regression
TLDR
It is shown how ridge estimators can be used in logistic regression to improve the parameter estimates and to predict new observations more accurately.
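The ridge idea summarized above carries over directly from linear to logistic regression: an L2 penalty on the coefficients shrinks the estimates and stabilizes predictions. A minimal one-feature sketch fitted by plain gradient descent (the function names, hyperparameters, and toy data below are illustrative, not from the paper):

```python
import math

def ridge_logistic_fit(xs, ys, lam=0.1, lr=0.1, n_iter=2000):
    """One-feature logistic regression with an L2 (ridge) penalty,
    fitted by plain gradient descent. xs: floats, ys: 0/1 labels."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(n_iter):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # P(y = 1 | x)
            gw += (p - y) * x
            gb += p - y
        w -= lr * (gw / n + lam * w)  # the penalty term shrinks w toward 0
        b -= lr * (gb / n)            # intercept is left unpenalized
    return w, b

def ridge_logistic_predict(xs, w, b):
    return [1 if w * x + b > 0 else 0 for x in xs]

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0, 0, 1, 1]
w, b = ridge_logistic_fit(xs, ys, lam=0.1)
print(ridge_logistic_predict(xs, ys and xs and xs, w, b) if False else ridge_logistic_predict(xs, w, b))
```

Increasing `lam` pulls `w` closer to zero, trading a little bias for less variance on new observations.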
Bayesian Network Classifiers
TLDR
Tree Augmented Naive Bayes (TAN) is singled out, which outperforms naive Bayes, yet at the same time maintains the computational simplicity and robustness that characterize naive Bayes.
A comparative study of discretization methods for naive-Bayes classifiers
TLDR
This study leads to a new discretization method, weighted non-disjoint discretization (WNDD), that combines the advantages of WPKID and NDD, and shows that among all the rival discretization methods, WNDD best helps naive-Bayes classifiers reduce average classification error.
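For intuition, here is a minimal sketch of the pipeline such discretization studies evaluate: continuous attributes are cut into bins, and a categorical naive-Bayes classifier with Laplace smoothing is trained on the bin indices. WNDD itself is not implemented; simple equal-width binning stands in for it, and all names and data here are illustrative:

```python
from collections import defaultdict
import math

def make_discretizer(values, n_bins=3):
    """Equal-width binning: returns a function mapping a value to a bin index."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    def to_bin(v):
        return max(0, min(int((v - lo) / width), n_bins - 1))
    return to_bin

class NaiveBayes:
    """Categorical naive Bayes with Laplace smoothing over discretized features."""
    def __init__(self, n_values=3):
        self.n_values = n_values  # bins per feature, used in the smoothing denominator

    def fit(self, rows, labels):
        self.classes = sorted(set(labels))
        n = len(labels)
        self.prior = {c: labels.count(c) / n for c in self.classes}
        self.class_n = {c: labels.count(c) for c in self.classes}
        self.counts = defaultdict(int)  # (class, feature index, bin) -> count
        for row, c in zip(rows, labels):
            for j, v in enumerate(row):
                self.counts[(c, j, v)] += 1
        return self

    def predict(self, row):
        def log_post(c):  # log prior + sum of smoothed log likelihoods
            lp = math.log(self.prior[c])
            for j, v in enumerate(row):
                lp += math.log((self.counts[(c, j, v)] + 1)
                               / (self.class_n[c] + self.n_values))
            return lp
        return max(self.classes, key=log_post)

# Toy medical-style attribute: patient age, binned before training.
ages = [22, 25, 47, 52, 46, 56]
labels = [0, 0, 1, 1, 1, 1]
to_bin = make_discretizer(ages, n_bins=3)
nb = NaiveBayes(n_values=3).fit([[to_bin(a)] for a in ages], labels)
```

Swapping in a better discretizer (e.g. one that weights borderline values, as WNDD does) changes only `make_discretizer`; the classifier is untouched, which is why discretization quality directly drives naive-Bayes error.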
Learning Optimal Augmented Bayes Networks
TLDR
A simple, polynomial time greedy algorithm for learning an optimal Augmented Bayes Network with respect to MDL score is presented.
Instance-based learning algorithms
TLDR
This paper extends the nearest neighbor algorithm, which has large storage requirements, and describes how those requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
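The storage-reduction idea summarized above can be sketched as an IB2-style rule: keep a training instance only when the instances stored so far would misclassify it. This is a simplified illustration under that one rule, not Aha et al.'s exact algorithm, and the data is made up:

```python
import math

def nn_predict(store, x):
    """Classify x by the label of its single nearest stored instance."""
    return min(store, key=lambda item: math.dist(item[0], x))[1]

def ib2_fit(points, labels):
    """IB2-style storage reduction: an instance is stored only if the
    instances kept so far misclassify it, so dense interior regions
    of each class contribute almost nothing to storage."""
    store = [(points[0], labels[0])]
    for x, y in zip(points[1:], labels[1:]):
        if nn_predict(store, x) != y:
            store.append((x, y))
    return store

# Two well-separated clusters: only 2 of the 6 instances need storing.
points = [(0, 0), (0, 1), (5, 5), (5, 6), (1, 0), (6, 5)]
labels = [0, 0, 1, 1, 0, 1]
store = ib2_fit(points, labels)
```

On noisy data this rule tends to hoard exactly the noisy points, which is why the paper's later variants also track how reliably each stored instance classifies its neighbors.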
A NEURAL NETWORK PRIMER
TLDR
This analysis makes clear the strong similarity between linear neural networks and the general linear model developed by statisticians.
...