
Previous research has shown that neural networks can model survival data in situations in which some patients' death times are unknown, i.e. right-censored. However, neural networks have rarely been shown to outperform their linear counterparts such as the Cox proportional hazards model. In this paper, we run simulated experiments and use real survival data…
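For reference, the linear baseline this abstract compares against can be sketched in a few lines of NumPy. This is a minimal illustration of the negative Cox partial log-likelihood under right censoring, not the paper's code; variable names and the toy data are assumptions of the sketch.

```python
import numpy as np

def cox_neg_partial_loglik(beta, X, time, event):
    """Negative Cox partial log-likelihood under right censoring.

    X: (n, p) covariates; time: (n,) observed times;
    event: (n,) 1 if the death was observed, 0 if right-censored.
    """
    risk = X @ beta                        # linear predictor, shape (n,)
    order = np.argsort(-time)              # sort subjects by descending time
    risk, event = risk[order], event[order]
    # running log-sum-exp of risks over each subject's risk set {j : t_j >= t_i}
    log_risk_set = np.logaddexp.accumulate(risk)
    # only uncensored subjects contribute terms to the partial likelihood
    return -np.sum((risk - log_risk_set)[event == 1])
```

With `beta` fixed at zero, distinct times, and every death observed, the loss reduces to log(n!), which is a convenient sanity check.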

- Uri Shaham, Alexander Cloninger, Ronald R. Coifman
- ArXiv
- 2015

We discuss approximation of functions using deep neural nets. Given a function f on a d-dimensional manifold Γ ⊂ R^m, we construct a sparsely-connected depth-4 neural network and bound its error in approximating f. The size of the network depends on the dimension and curvature of the manifold Γ, the complexity of f in terms of its wavelet description, and only…

- Alexander Cloninger, Wojciech Czaja, Ruiliang Bai, Peter J. Basser
- SIAM J. Imaging Sciences
- 2014

We present an algorithm to solve the two-dimensional Fredholm integral of the first kind with tensor product structure from a limited number of measurements, with the goal of using this method to speed up nuclear magnetic resonance spectroscopy. This is done by incorporating compressive sensing–type arguments to fill in missing measurements, using a priori…
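The tensor-product structure the abstract mentions can be illustrated with a small NumPy sketch (toy sizes, not the paper's implementation): a 2D forward model M = K1 F K2ᵀ is equivalent to one Kronecker-structured linear system, which is the structure compressive-sensing arguments can exploit.

```python
import numpy as np

rng = np.random.default_rng(0)
K1 = rng.standard_normal((5, 8))   # kernel sampled along dimension 1
K2 = rng.standard_normal((6, 9))   # kernel sampled along dimension 2
F  = rng.standard_normal((8, 9))   # toy stand-in for the unknown 2D spectrum

# Tensor-product forward model: each measurement mixes both kernels.
M = K1 @ F @ K2.T

# Equivalent Kronecker form: vec(M) = (K2 kron K1) vec(F),
# using column-major (Fortran-order) vectorization.
assert np.allclose(M.flatten(order="F"),
                   np.kron(K2, K1) @ F.flatten(order="F"))
```

The Kronecker identity is why subsampling the small matrix M still constrains the much larger unknown F through a single structured linear operator.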

- Ruiliang Bai, Alexander Cloninger, Wojciech Czaja, Peter J. Basser
- Journal of magnetic resonance
- 2015

Potential applications of 2D relaxation spectrum NMR and MRI to characterize complex water dynamics (e.g., compartmental exchange) in biology and other disciplines have increased in recent years. However, the large amount of data and long MR acquisition times required for conventional 2D MR relaxometry limit its applicability for in vivo preclinical and…

- Alexander Cloninger, Wojciech Czaja, Timothy Doster
- IGARSS
- 2014

As new sensing modalities emerge and the presence of multiple sensors per platform becomes widespread, it is vital to develop new algorithms and techniques that can fuse these data. Many previous attempts to address heterogeneous data integration for data classification applications were either highly data-dependent or relied on…

- Gal Mishne, Uri Shaham, Alexander Cloninger, Israel Cohen
- ArXiv
- 2015

Non-linear manifold learning enables high-dimensional data analysis, but requires out-of-sample extension methods to process new data points. In this paper, we propose a manifold learning algorithm based on deep learning to create an encoder, which maps a high-dimensional dataset to its low-dimensional embedding, and a decoder, which takes the embedded data…
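The out-of-sample extension idea can be shown with a toy NumPy sketch: fit a parametric encoder from the data to a precomputed embedding, then apply it to unseen points. Here a linear least-squares map and a synthetic embedding stand in for the deep network and the true manifold embedding the paper uses.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))      # "high-dimensional" training data
Y = X @ rng.standard_normal((10, 2))    # synthetic stand-in for a precomputed 2D embedding

# Fit a parametric encoder from data to embedding (linear least squares here;
# the paper trains a deep network in this role).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Out-of-sample extension: embed new points without rerunning manifold learning.
x_new = rng.standard_normal((1, 10))
y_new = x_new @ W
```

The point of the parametric map is exactly this last step: new samples get embedded by a single forward pass rather than by recomputing the embedding over the enlarged dataset.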

- Ariel Hafftka, Hasan Hüseyin Çelik, Alexander Cloninger, Wojciech Czaja, Richard G. Spencer
- 2015 International Conference on Sampling Theory…
- 2015

In [1], Cloninger, Czaja, Bai, and Basser developed an algorithm for compressive-sampling-based data acquisition for the solution of 2D Fredholm equations. We extend the algorithm to N-dimensional data by randomly sampling in 2 dimensions and fully sampling in the remaining N−2 dimensions. This new algorithm has direct applications to 3-dimensional nuclear…

- Alexander Cloninger, Wojciech Czaja, Timothy Doster
- AIPR
- 2013

- Xiuyuan Cheng, Alexander Cloninger, Ronald R. Coifman
- ArXiv
- 2017

The paper introduces a new kernel-based Maximum Mean Discrepancy (MMD) statistic for measuring the distance between two distributions given finitely many multivariate samples. When the distributions are locally low-dimensional, the proposed test can be made more powerful to distinguish certain alternatives by incorporating local covariance matrices and…
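For context, here is a minimal NumPy sketch of the standard (biased) kernel MMD estimator that such a statistic builds on; the local-covariance weighting the paper proposes is omitted, and the Gaussian kernel and bandwidth are assumptions of the sketch.

```python
import numpy as np

def mmd2_biased(X, Y, sigma=1.0):
    """Biased estimate of squared MMD with a Gaussian kernel.

    X: (n, d) samples from P; Y: (m, d) samples from Q.
    """
    def k(A, B):
        # pairwise squared distances, then Gaussian kernel
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()
```

The estimate is exactly zero when the two sample sets coincide and grows as the distributions separate, which is what makes it usable as a two-sample test statistic.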

- Alexander Cloninger
- ArXiv
- 2016

We note that building a magnetic Laplacian from the Markov transition matrix, rather than the graph adjacency matrix, yields several benefits for the magnetic eigenmaps algorithm. The two largest benefits are that the embedding becomes more stable as a function of the rotation parameter g, and the principal eigenvector of the magnetic Laplacian now…
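A rough NumPy sketch of the construction being contrasted: a Hermitian magnetic Laplacian assembled from the row-normalized Markov matrix P rather than the raw adjacency A. The symmetrization of P and the random toy graph are assumptions of this sketch, not necessarily the note's exact recipe.

```python
import numpy as np

rng = np.random.default_rng(2)
n, g = 6, 0.1                                  # graph size, rotation parameter g
A = rng.random((n, n)); A = (A + A.T) / 2      # symmetric edge weights
np.fill_diagonal(A, 0)
Theta = rng.random((n, n))
Theta = Theta - Theta.T                        # antisymmetric edge-flow matrix

P = A / A.sum(axis=1, keepdims=True)           # Markov transition matrix

# Magnetic Laplacian built from P instead of A; P is symmetrized so the
# elementwise product with the complex phase stays Hermitian.
L = np.eye(n) - 0.5 * (P + P.T) * np.exp(2j * np.pi * g * Theta)
assert np.allclose(L, L.conj().T)              # Hermitian => real eigenvalues
```

Hermitian structure is the key invariant: it guarantees a real spectrum, so the eigenvectors used for the embedding are well-behaved as g varies.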