
Recurrent neural networks are powerful models for processing sequential data, but they are generally plagued by vanishing and exploding gradient problems. Unitary recurrent neural networks (uRNNs), which use unitary recurrence matrices, have recently been proposed as a means to avoid these issues. However, in previous experiments, the recurrence matrices…
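The mechanism behind this abstract can be sketched in a few lines: a unitary (here, simply real orthogonal) recurrence matrix preserves the hidden-state norm at every step, so repeated products of the matrix neither decay nor blow up. The 2×2 rotation below is a hypothetical minimal example, not the parameterization used in the paper.

```python
import math

def rotation(theta):
    """2x2 rotation matrix: the simplest real orthogonal (hence unitary) recurrence."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def apply(W, h):
    """Matrix-vector product W @ h for a 2x2 matrix and 2-vector."""
    return [W[0][0] * h[0] + W[0][1] * h[1],
            W[1][0] * h[0] + W[1][1] * h[1]]

def norm(h):
    return math.hypot(h[0], h[1])

W = rotation(0.3)          # fixed unitary recurrence matrix
h = [1.0, 0.0]             # initial hidden state
for _ in range(10_000):    # many "time steps"
    h = apply(W, h)

# The hidden-state norm is preserved (up to float rounding), so gradients
# propagated through W^T neither vanish nor explode -- the property uRNNs exploit.
print(norm(h))
```

A non-unitary matrix with spectral radius below or above 1 would instead drive the norm to 0 or to infinity over the same horizon.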

We introduce a novel speech enhancement algorithm for removing reverberation and noise from recorded speech data. Our approach centers around using a single-channel minimum mean-square error log-spectral amplitude (MMSE-LSA) estimator, which applies gain coefficients in a time-frequency domain to suppress noise and reverberation. The main contribution of this…
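The core operation the abstract describes — applying per-bin gain coefficients to a time-frequency representation — can be sketched with a toy power spectrogram. The Wiener-style gain below is a simpler stand-in for the MMSE-LSA gain (which additionally involves an exponential-integral term), and the spectrogram values are made up for illustration.

```python
# Hypothetical toy power spectrogram: rows = frequency bins, cols = frames.
noisy_power = [[4.0, 1.0], [9.0, 0.25]]
noise_power = [[1.0, 1.0], [1.0, 1.0]]   # assumed noise power estimate

def wiener_gain(noisy, noise):
    """Wiener-style gain snr/(1+snr), a simplified stand-in for the MMSE-LSA gain."""
    snr = max(noisy / noise - 1.0, 0.0)  # crude a-priori SNR estimate, floored at 0
    return snr / (1.0 + snr)

# Apply the gain elementwise in the time-frequency domain.
enhanced = [[wiener_gain(n, d) * n for n, d in zip(noisy_row, noise_row)]
            for noisy_row, noise_row in zip(noisy_power, noise_power)]
```

Bins at or below the noise-floor estimate get gain 0, while high-SNR bins pass nearly unchanged; a real system would then resynthesize the waveform with an inverse STFT.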

- Thomas Powers, Jeff A. Bilmes, David W. Krout, Les E. Atlas
- 2016 19th International Conference on Information…
- 2016

We develop a framework to select a subset of sensors from a field in which the sensors have an ingrained independence structure. Given an arbitrary independence pattern, we construct a graph that denotes pairwise independence between sensors, which means those sensors may operate simultaneously. The set of all fully-connected subgraphs (cliques) of this…
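The clique construction in this abstract can be illustrated with a brute-force sketch: build the pairwise-independence graph, enumerate every fully-connected subset, and keep the maximal ones. The 4-sensor field and its edge set below are hypothetical; real fields would call for a proper algorithm such as Bron–Kerbosch rather than exhaustive search.

```python
from itertools import combinations

# Hypothetical 4-sensor field: an edge means that pair may operate simultaneously.
sensors = [0, 1, 2, 3]
independent = {(0, 1), (0, 2), (1, 2), (2, 3)}

def is_clique(subset):
    """True if every pair in the subset is pairwise independent (fully connected)."""
    return all(tuple(sorted(p)) in independent for p in combinations(subset, 2))

# Enumerate all cliques by brute force, then keep only the maximal ones.
cliques = [set(s) for r in range(1, len(sensors) + 1)
           for s in combinations(sensors, r) if is_clique(s)]
maximal = [c for c in cliques if not any(c < other for other in cliques)]
```

Here the maximal cliques are {0, 1, 2} and {2, 3}: two candidate sets of sensors that can be scheduled together.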

- David W. Krout, Thomas Powers
- 17th International Conference on Information…
- 2014

This paper focuses on sensor management for distributed sensor fields in deep water, investigating the sensor placement problem for the barrier scenario. Field performance is often based solely on sensor coverage or probability-of-detection evaluations, which provide only a fleeting snapshot of how the field is performing at a particular time…

- Scott Wisdom, Thomas Powers, James W. Pitton, Les E. Atlas
- ArXiv
- 2016

Recurrent neural networks (RNNs) are powerful and effective for processing sequential data. However, RNNs are usually considered “black box” models whose internal structure and learned parameters are not interpretable. In this paper, we propose an interpretable RNN based on the sequential iterative soft-thresholding algorithm (SISTA) for solving the…
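The building block behind the SISTA-based architecture is the iterative soft-thresholding step: a gradient step on a least-squares term followed by L1 shrinkage. A minimal sketch, using plain lists and hypothetical dimensions; in the unfolded view, one such iteration corresponds to one network layer whose weights come from the dictionary and whose nonlinearity is the shrinkage.

```python
def soft_threshold(x, lam):
    """Proximal operator of the L1 norm: shrink x toward zero by lam."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def ista_step(x, A, y, eta, lam):
    """One ISTA iteration for min ||A x - y||^2 + lam * ||x||_1 (dense lists)."""
    m, n = len(y), len(x)
    # residual r = A x - y
    r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
    # gradient g = A^T r, then a step followed by shrinkage
    g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
    return [soft_threshold(x[j] - eta * g[j], eta * lam) for j in range(n)]
```

Interpretability comes from this correspondence: every learned weight in the unfolded network maps back to a quantity in the sparse-recovery optimization problem.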

- Scott Wisdom, Thomas Powers, James W. Pitton, Les E. Atlas
- 2017 IEEE International Conference on Acoustics…
- 2017

Historically, sparse methods and neural networks, particularly modern deep learning methods, have been relatively disparate areas. Sparse methods are typically used for signal enhancement, compression, and recovery, usually in an unsupervised framework, while neural networks commonly rely on a supervised training set. In this paper, we use the specific…

- Thomas Powers, David W. Krout, Les E. Atlas
- 2015 18th International Conference on Information…
- 2015

In this paper we develop a framework to select a subset of sensors from a field in which the sensors have an ingrained independence structure. Given an arbitrary independence pattern, we construct a graph that denotes pairwise independence between sensors, which means those sensors can operate simultaneously. The set of all fully-connected subgraphs…

- Scott Wisdom, Thomas Powers, James W. Pitton, Les E. Atlas
- ArXiv
- 2017

In this paper, we propose a novel recurrent neural network architecture for speech separation. This architecture is constructed by unfolding the iterations of a sequential iterative soft-thresholding algorithm (ISTA) that solves the optimization problem for sparse nonnegative matrix factorization (NMF) of spectrograms. We name this network architecture deep…
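When the sparse codes are constrained nonnegative, as in the NMF setting this abstract describes, the shrinkage in each unfolded ISTA iteration becomes one-sided — a ReLU shifted by the sparsity penalty — which is why each iteration resembles a standard network layer. A hedged sketch with made-up dimensions; the paper's architecture also learns these parameters per layer rather than fixing them.

```python
def relu_shifted(x, lam):
    """Nonnegative soft threshold max(x - lam, 0): the layer nonlinearity
    when codes are constrained nonnegative, as in sparse NMF."""
    return max(x - lam, 0.0)

def unfolded_layer(h, W, v, eta, lam):
    """One unfolded ISTA iteration h <- max(h - eta * W^T (W h - v) - eta*lam, 0),
    updating the nonnegative code h for one spectrogram frame v."""
    m, n = len(v), len(h)
    r = [sum(W[i][j] * h[j] for j in range(n)) - v[i] for i in range(m)]
    g = [sum(W[i][j] * r[i] for i in range(m)) for j in range(n)]
    return [relu_shifted(h[j] - eta * g[j], eta * lam) for j in range(n)]
```

Stacking several such layers, with the frame index advancing through time, yields a recurrent network whose forward pass is the unrolled optimizer.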

- Thomas Powers
- Hospital Materiel Management Quarterly
- 1992

- Thomas Powers, Rodolfo Laucirica, Stephen Brooks, Sonny Leopold, Michelle Whitson, James B. Farnum
- Journal of the Tennessee Medical Association
- 1990