Corpus ID: 236155093

Online structural kernel selection for mobile health

@article{Shin2021OnlineSK,
  title={Online structural kernel selection for mobile health},
  author={Eura Shin and Pedja Klasnja and Susan A. Murphy and Finale Doshi-Velez},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.09949}
}
Motivated by the need for efficient and personalized learning in mobile health, we investigate the problem of online kernel selection for Gaussian Process regression in the multi-task setting. We propose a novel generative process on the kernel composition for this purpose. We demonstrate that trajectories of kernel evolution can be transferred between users to improve learning, and that the learned kernels themselves are meaningful for an mHealth prediction goal.
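To make the idea of compositional kernel selection concrete, the sketch below shows a plain greedy search over sums and products of base kernels for a single GP regression task, scored by log marginal likelihood. This is only an illustration of the general technique the abstract refers to, not the paper's online, multi-task generative method; the scikit-learn API and the toy data are assumptions introduced here.

```python
# Illustrative sketch (not the paper's algorithm): greedy compositional kernel
# search for GP regression, scoring candidate structures by marginal likelihood.
# Assumes scikit-learn is installed; X and y below are toy stand-in data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, RationalQuadratic

BASE_KERNELS = [RBF(), ExpSineSquared(), RationalQuadratic()]

def score(kernel, X, y):
    """Fit a GP with the candidate kernel and return its log marginal likelihood."""
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    return gpr.log_marginal_likelihood_value_

def greedy_kernel_search(X, y, depth=2):
    """Grow a composite kernel by repeatedly adding or multiplying in a base kernel."""
    scored = [(score(k, X, y), k) for k in BASE_KERNELS]
    best_score, best = max(scored, key=lambda t: t[0])
    for _ in range(depth):
        candidates = [best + b for b in BASE_KERNELS] + [best * b for b in BASE_KERNELS]
        cand_score, cand = max(((score(k, X, y), k) for k in candidates),
                               key=lambda t: t[0])
        if cand_score <= best_score:
            break  # no expansion improved the marginal likelihood
        best_score, best = cand_score, cand
    return best

# Toy usage: a noisy periodic signal with a slow trend (a step-count-like series).
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 80).reshape(-1, 1)
y = np.sin(2 * X).ravel() + 0.1 * X.ravel() + 0.1 * rng.standard_normal(80)
print(greedy_kernel_search(X, y))
```

The search returns a composite kernel whose structure (periodic, smooth, additive trend) can be read off directly, which is the sense in which selected kernels can be "meaningful" for a prediction goal; the paper's contribution is doing this selection online and sharing kernel-evolution trajectories across users.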
