- Michael S. Gashler
- Journal of Machine Learning Research
- 2011

We present a breadth-oriented collection of cross-platform command-line tools for researchers in machine learning called Waffles. The Waffles tools are designed to offer a broad spectrum of functionality in a manner that is friendly for scripted automation. All functionality is also available in a C++ class library. Waffles is available under the GNU Lesser…

- Michael S. Gashler, Dan Ventura, Tony R. Martinez
- NIPS
- 2007

Many algorithms have been recently developed for reducing dimensionality by projecting data onto an intrinsic non-linear manifold. Unfortunately, existing algorithms often lose significant precision in this transformation. Manifold Sculpting is a new algorithm that iteratively reduces dimensionality by simulating surface tension in local neighborhoods. We…

- Michael S. Gashler, Michael R. Smith, Richard G. Morris, Tony R. Martinez
- Computational Intelligence
- 2016

Many data mining and data analysis techniques operate on dense matrices or complete tables of data. Real-world data sets, however, often contain unknown values. Even many classification algorithms that are designed to operate with missing values still exhibit deteriorated accuracy. One approach to handling missing values is to fill in (impute) the missing…
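
The imputation idea mentioned in this abstract can be illustrated with a minimal sketch: a column-mean fill for a dense matrix containing NaNs. This is a deliberately simple baseline, not the method the paper evaluates, and `impute_column_means` is a hypothetical helper name:

```python
import numpy as np

def impute_column_means(X):
    """Replace NaN entries with the mean of each column's observed values."""
    X = np.array(X, dtype=float)
    col_means = np.nanmean(X, axis=0)      # per-column mean, ignoring NaNs
    rows, cols = np.where(np.isnan(X))     # locations of missing entries
    X[rows, cols] = col_means[cols]        # fill each hole with its column mean
    return X

# The NaN in column 0 becomes (1 + 3) / 2 = 2.0.
filled = impute_column_means([[1.0, 10.0], [np.nan, 20.0], [3.0, 30.0]])
```

More sophisticated imputers would exploit correlations between columns rather than treating each column independently.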

- Michael S. Gashler, Christophe G. Giraud-Carrier, Tony R. Martinez
- 2008 Seventh International Conference on Machine…
- 2008

Using decision trees that split on randomly selected attributes is one way to increase the diversity within an ensemble of decision trees. Another approach increases diversity by combining multiple tree algorithms. The random forest approach has become popular because it is simple and yields good results with common datasets. We present a technique that…
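
The first diversity mechanism this abstract mentions, splitting on randomly selected attributes, can be sketched as choosing each split attribute from a random subset of the features. This toy sketch is not the paper's technique, and `random_split_attribute` is a hypothetical helper:

```python
import random

def random_split_attribute(attributes, k=None):
    """Choose a split attribute from a random subset of the available
    attributes. Randomizing this choice decorrelates the trees in an
    ensemble, since each tree sees a different sequence of splits."""
    k = k or max(1, int(len(attributes) ** 0.5))  # common sqrt-sized subset heuristic
    candidates = random.sample(attributes, k)     # random subset of attributes
    return random.choice(candidates)              # split on one of them at random

random.seed(0)
attr = random_split_attribute(["age", "income", "height", "weight"])
```

A gain-based variant would pick the best-scoring attribute within the random subset, as random forests do, rather than choosing uniformly.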

- Michael S. Gashler, Tony R. Martinez
- The 2011 International Joint Conference on Neural…
- 2011

Existing nonlinear dimensionality reduction (NLDR) algorithms make the assumption that distances between observations are uniformly scaled. Unfortunately, with many interesting systems, this assumption does not hold. We present a new technique called Temporal NLDR (TNLDR), which is specifically designed for analyzing the high-dimensional observations…

- Michael S. Gashler, Stephen C. Ashmore
- ICIC
- 2014

We present a method for training a deep neural network containing sinusoidal activation functions to fit to time-series data. Weights are initialized using a fast Fourier transform, then trained with regularization to improve generalization. A simple dynamic parameter tuning method is employed to adjust both the learning rate and regularization term, such…
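
The initialization step this abstract describes can be sketched roughly: take the FFT of the series and read off an amplitude, frequency, and phase for each sinusoidal unit. The names `fft_init` and `reconstruct` are hypothetical, and the regularized training that would follow is omitted:

```python
import numpy as np

def fft_init(series):
    """Derive (amplitude, frequency, phase) triples for sinusoidal units
    from the FFT of a time series."""
    n = len(series)
    spectrum = np.fft.rfft(series)
    amp = np.abs(spectrum) / n
    amp[1:] *= 2.0                     # each non-DC bin has a conjugate twin
    if n % 2 == 0:
        amp[-1] /= 2.0                 # the Nyquist bin has no twin for even n
    freq = np.arange(len(spectrum)) * 2 * np.pi / n
    phase = np.angle(spectrum)
    return amp, freq, phase

def reconstruct(t, amp, freq, phase):
    """Evaluate the initialized sum of sinusoids at times t."""
    t = np.asarray(t, dtype=float)[:, None]
    return np.sum(amp * np.cos(freq * t + phase), axis=1)

# A pure sinusoid is recovered almost exactly by the initialization alone.
t = np.arange(64)
x = 3.0 * np.sin(2 * np.pi * 4 * t / 64)
amp, freq, phase = fft_init(x)
x_hat = reconstruct(t, amp, freq, phase)
```

For real data the FFT fit is only a starting point; gradient training then adjusts the frequencies and phases away from the integer-period grid.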

- Michael S. Gashler, Stephen C. Ashmore
- Neurocomputing
- 2016

- Michael S. Gashler, Tony R. Martinez
- The 2011 International Joint Conference on Neural…
- 2011

We present an intelligent neighbor-finding algorithm called SAFFRON that chooses neighboring points while avoiding making connections between points on geodesically distant regions of a manifold. SAFFRON identifies the suitability of points to be neighbors by using a relaxation technique that alternately estimates the tangent space at each point, and…
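
The tangent-space estimate this abstract alternates with neighbor re-evaluation can be sketched as local PCA: the top principal directions of a centered neighborhood. Only this one step is shown, and `tangent_space` is a hypothetical helper, not the paper's code:

```python
import numpy as np

def tangent_space(neighborhood, d):
    """Estimate a d-dimensional tangent basis for a local neighborhood of
    points as the top-d principal directions of the centered points."""
    P = np.asarray(neighborhood, dtype=float)
    centered = P - P.mean(axis=0)              # shift the neighborhood to its centroid
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:d]                              # rows form an orthonormal tangent basis

# Points sampled from the z = 0 plane yield a tangent basis with no z component.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.normal(size=(10, 2)), np.zeros(10)])
basis = tangent_space(pts, 2)
```

A neighbor candidate can then be scored by how far it deviates from the tangent plane, which is one way to avoid shortcut connections across geodesically distant regions.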

- Luke B. Godfrey, Michael S. Gashler
- IEEE Transactions on Neural Networks and Learning…
- 2017

We present a neural network technique for the analysis and extrapolation of time-series data called neural decomposition (ND). Units with a sinusoidal activation function are used to perform a Fourier-like decomposition of training samples into a sum of sinusoids, augmented by units with nonperiodic activation functions to capture linear trends and other…
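
The model form this abstract describes, a sum of sinusoids augmented with nonperiodic units, can be sketched with fixed frequencies and a least-squares fit. The paper trains all parameters with a network; this toy only shows how the augmented basis captures a trend plus an oscillation, and `nd_basis` is a hypothetical helper:

```python
import numpy as np

def nd_basis(t, freqs):
    """Design matrix: sinusoids at the given frequencies, augmented with
    nonperiodic units (here a bias and a linear ramp)."""
    t = np.asarray(t, dtype=float)
    cols = [np.sin(f * t) for f in freqs] + [np.cos(f * t) for f in freqs]
    cols += [np.ones_like(t), t]       # nonperiodic units: bias + linear trend
    return np.column_stack(cols)

# Fit amplitudes by least squares, then extrapolate past the training window.
t_train = np.arange(0, 50)
y = 0.1 * t_train + np.sin(0.5 * t_train)   # linear trend plus oscillation
A = nd_basis(t_train, freqs=[0.5])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
t_future = np.arange(50, 60)
y_future = nd_basis(t_future, [0.5]) @ coef
```

Because the target lies exactly in the span of the basis, the extrapolation here is essentially exact; with real data the frequencies themselves would also need to be learned.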

- Michael S. Gashler, Dan Ventura, Tony R. Martinez
- IEEE Trans. Systems, Man, and Cybernetics, Part B
- 2011

We present an algorithm for manifold learning called manifold sculpting, which utilizes graduated optimization to seek an accurate manifold embedding. An empirical analysis across a wide range of manifold problems indicates that manifold sculpting yields more accurate results than a number of existing algorithms, including Isomap, locally linear embedding…