
- Ron Kohavi, George H. John
- Artif. Intell.
- 1997

In the feature subset selection problem, a learning algorithm is faced with the problem of selecting a relevant subset of features upon which to focus its attention, while ignoring the rest. To achieve the best possible performance with a particular learning algorithm on a particular training set, a feature subset selection method should consider how the…
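
The wrapper idea summarized above — scoring feature subsets by the accuracy of the target learning algorithm itself, rather than by a learner-independent filter measure — can be sketched as follows. The toy 1-nearest-neighbour learner, the greedy forward search, and the example data are all illustrative simplifications, not the paper's exact setup.

```python
# Wrapper feature selection sketch: each candidate subset is evaluated
# by running the actual learner (here a toy 1-NN classifier) on it.

def nn_predict(train, labels, x, feats):
    """Classify x by its nearest training point, using only `feats`."""
    dist = lambda a, b: sum((a[f] - b[f]) ** 2 for f in feats)
    best = min(range(len(train)), key=lambda i: dist(train[i], x))
    return labels[best]

def accuracy(train, labels, feats):
    """Leave-one-out accuracy of the learner restricted to `feats`."""
    if not feats:
        return 0.0
    hits = 0
    for i in range(len(train)):
        rest, rest_y = train[:i] + train[i+1:], labels[:i] + labels[i+1:]
        hits += nn_predict(rest, rest_y, train[i], feats) == labels[i]
    return hits / len(train)

def greedy_wrapper(train, labels, n_feats):
    """Forward selection: grow the subset while the wrapped learner improves."""
    selected, best_acc = [], 0.0
    while True:
        gains = [(accuracy(train, labels, selected + [f]), f)
                 for f in range(n_feats) if f not in selected]
        acc, f = max(gains)
        if acc <= best_acc:
            return selected, best_acc
        selected, best_acc = selected + [f], acc

# Toy data: feature 0 determines the class; feature 1 is noise.
X = [(0.0, 5.2), (0.1, 1.1), (0.9, 4.8), (1.0, 0.3)]
y = [0, 0, 1, 1]
feats, acc = greedy_wrapper(X, y, 2)  # selects the relevant feature only
```

The point of the wrapper formulation is that the noisy feature is rejected precisely because adding it hurts the learner's own estimated accuracy, not because of any fixed relevance statistic.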

- George H. John, Pat Langley
- UAI
- 1995

When modeling a probability distribution with a Bayesian network, we are faced with the problem of how to handle continuous variables. Most previous work has either solved the problem by discretizing, or assumed that the data are generated by a single Gaussian. In this paper we abandon the normality assumption and instead use statistical methods for…
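
The contrast the abstract draws — a single Gaussian per class versus a nonparametric density estimate — can be sketched with a naive-Bayes-style classifier that models each class-conditional density as a kernel density estimate. The fixed bandwidth `h` and the toy data are illustrative choices, not the paper's exact estimator.

```python
# Class-conditional densities via kernel density estimation (one Gaussian
# kernel per training point) instead of a single fitted Gaussian.
import math

def kde(points, x, h=0.5):
    """Average of Gaussian kernels centred on the training points."""
    k = lambda u: math.exp(-u * u / 2) / math.sqrt(2 * math.pi)
    return sum(k((x - p) / h) for p in points) / (len(points) * h)

def classify(train, x):
    """Pick the class maximizing prior * KDE likelihood."""
    total = sum(len(v) for v in train.values())
    score = lambda c: (len(train[c]) / total) * kde(train[c], x)
    return max(train, key=score)

# Class "a" is bimodal; a single Gaussian would place its mass in the
# middle, exactly where class "b" actually lives.
train = {"a": [0.0, 0.1, 9.9, 10.0], "b": [4.9, 5.0, 5.1]}
```

Here `classify(train, 5.0)` correctly prefers `"b"` even though a single-Gaussian model of `"a"` would have mean 5.0 and claim that region for itself.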

- George H. John, Ron Kohavi, Karl Pfleger
- ICML
- 1994

We address the problem of finding a subset of features that allows a supervised induction algorithm to induce small high-accuracy concepts. We examine notions of relevance and irrelevance, and show that the definitions used in the machine learning literature do not adequately partition the features into useful categories of relevance. We present definitions…

- Ron Kohavi, George H. John, Richard Long, David Manley, Karl Pfleger
- ICTAI
- 1994

We present MLC++, a library of C++ classes and tools for supervised Machine Learning. While MLC++ provides general learning algorithms that can be used by end users, the main objective is to provide researchers and experts with a wide variety of tools that can accelerate algorithm development, increase software reliability, provide comparison tools, and…

- George H. John, Pat Langley
- KDD
- 1996

As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this…
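
One way to sample without fixing the sample size blindly in advance is dynamic sampling: keep enlarging the sample and re-running the miner until its output stops changing appreciably. The sketch below uses a simple mean estimate as a stand-in "miner", and the stopping threshold `eps` is an illustrative choice, not a value from the paper.

```python
# Dynamic sampling sketch: double the sample until the mined result
# stabilises, falling back to the full database only if it never does.
import random

def dynamic_sample_mean(data, start=1000, eps=0.01, seed=0):
    rng = random.Random(seed)
    n, prev = start, None
    while n <= len(data):
        est = sum(rng.sample(data, n)) / n    # "mine" the current sample
        if prev is not None and abs(est - prev) < eps:
            return est, n                     # result is stable: stop early
        prev, n = est, n * 2                  # otherwise enlarge the sample
    return sum(data) / len(data), len(data)   # fall back to the full database

data = [i % 2 for i in range(100_000)]        # a large "table", true mean 0.5
est, used = dynamic_sample_mean(data)         # typically uses far fewer rows
```

The static alternative would commit to one sample size for every database; the dynamic loop instead lets each database reveal how much data the mining task actually needs.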

- George H. John
- KDD
- 1995

Finding and removing outliers is an important problem in data mining. Errors in large databases can be extremely common, so an important property of a data mining algorithm is robustness with respect to errors in the database. Most sophisticated methods in machine learning address this problem to some extent, but not fully, and can be improved by…
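
One simple robustness scheme in the spirit of this abstract is: fit a model, discard the training records the fitted model itself flags as suspect (here, misclassified points), and refit on the cleaned data. The nearest-mean classifier and the single filtering pass below are illustrative simplifications, not the paper's method.

```python
# Fit, filter records the model disagrees with, refit on the cleaned data.

def fit_means(data):
    """Per-class mean of a 1-D feature; data is a list of (x, label)."""
    groups = {}
    for x, c in data:
        groups.setdefault(c, []).append(x)
    return {c: sum(v) / len(v) for c, v in groups.items()}

def predict(means, x):
    return min(means, key=lambda c: abs(means[c] - x))

def robust_fit(data):
    means = fit_means(data)                                   # first fit
    kept = [(x, c) for x, c in data if predict(means, x) == c]  # drop suspects
    return fit_means(kept)                                    # refit

# Class "a" clusters near 0, class "b" near 10; one corrupt record
# (a label error at 9.9) drags the naive mean of "a" far to the right.
data = [(0.0, "a"), (0.2, "a"), (0.1, "a"), (9.9, "a"),
        (9.8, "b"), (10.0, "b"), (10.2, "b")]
naive = fit_means(data)    # mean of "a" is 2.55, pulled by the error
robust = robust_fit(data)  # mean of "a" is back near 0.1
```

The single corrupt record shifts the naive class mean by more than two units, while the refit on filtered data recovers the uncontaminated estimate.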

- George H. John
- AAAI
- 1994

The most popular delayed reinforcement learning technique, Q-learning (Watkins 1989), estimates the future reward expected from executing each action in every state. If these estimates are correct, then an agent can use them to select the action with maximal expected future reward in each state, and thus perform optimally. Watkins has proved that Q-learning…
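
The estimate described above is maintained with the standard tabular update, Q(s,a) ← Q(s,a) + α[r + γ·max_a′ Q(s′,a′) − Q(s,a)] (Watkins 1989). The two-state chain environment below is an illustrative toy, not an example from the paper.

```python
# Tabular Q-learning on a tiny chain: states 0..2, reward on reaching 2.
import random

def step(s, a):
    """Action 1 moves right, action 0 stays; reward 1.0 at the terminal state."""
    s2 = min(s + 1, 2) if a == 1 else s
    return s2, (1.0 if s2 == 2 else 0.0), s2 == 2

def q_learn(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in range(3) for a in range(2)}
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # epsilon-greedy choice over the current estimates
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda act: Q[s, act])
            s2, r, done = step(s, a)
            best_next = 0.0 if done else max(Q[s2, 0], Q[s2, 1])
            Q[s, a] += alpha * (r + gamma * best_next - Q[s, a])  # Watkins update
            s = s2
    return Q

Q = q_learn()
# Q[(1, 1)] approaches 1.0 and Q[(0, 1)] approaches gamma * 1.0 = 0.9, so
# acting greedily with respect to Q moves right from every state.
```

Once the estimates converge, the greedy policy is optimal in exactly the sense the abstract describes: the action with maximal estimated future reward is selected in each state.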

- O. Z. Karimov, G. H. John, +4 authors, R. Airey
- Physical Review Letters
- 2003

Time-resolved optical measurements in (110)-oriented GaAs/AlGaAs quantum wells show a tenfold increase of the spin-relaxation rate as a function of applied electric field from 20 to 80 kV cm⁻¹ at 170 K and indicate a similar variation at 300 K, in agreement with calculations based on the Rashba effect. Spin relaxation is almost field independent below 20…