Corpus ID: 141463823

Multi-resolution Networks For Flexible Irregular Time Series Modeling (Multi-FIT)

@article{Rawat2019MultiresolutionNF,
  title={Multi-resolution Networks For Flexible Irregular Time Series Modeling (Multi-FIT)},
  author={Bhanu Pratap Singh Rawat and Iman Deznabi and Bharat Narasimhan and Bryon Kucharski and Rheeya Uppaal and Akhila Josyula and Madalina Fiterau},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.00125}
}
Missing values, irregularly collected samples, and multi-resolution signals commonly occur in multivariate time series data, making predictive tasks difficult. These challenges are especially prevalent in the healthcare domain, where patients' vital signs and electronic records are collected at different frequencies and occasionally have missing information due to imperfections in equipment or patient circumstances. Researchers have handled each of these issues differently, often handling…

Citations

A Novel LSTM for Multivariate Time Series with Massive Missingness

This paper proposes a novel model called the forward and backward variable-sensitive LSTM (FBVS-LSTM), which combines two decay mechanisms with additional informative data and is adapted to deal with the massive missingness of meteorological datasets.

A Review of Deep Learning Methods for Irregularly Sampled Medical Time Series Data

This paper reviews deep learning methods for irregularly sampled medical time series (ISMTS) data from the perspectives of technology and task, and implements some representative methods and compares them on four medical datasets with two tasks.

PIETS: Parallelised Irregularity Encoders for Forecasting with Heterogeneous Time-Series

Through extensive experiments on real-world data sets related to COVID-19, the proposed architecture, PIETS, is able to effectively model heterogeneous temporal data and outperforms other state-of-the-art approaches in the prediction task.

Imputing Missing Observations with Time Sliced Synthetic Minority Oversampling Technique

A simple yet novel time series imputation technique that improves over standard mean and median imputation by allowing a wider class of patient trajectories to be recognized by the model, and that also improves over aggregated classification models.
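The mean and median imputation referred to here are the standard column-wise fills. As a reference point only, a minimal sketch of those baselines using scikit-learn's SimpleImputer; the small matrix below is illustrative, not data from the paper.

```python
import numpy as np
from sklearn.impute import SimpleImputer

# toy patient-by-feature matrix with missing vitals (illustrative values)
X = np.array([[98.6, np.nan, 120.0],
              [np.nan, 72.0, 118.0],
              [99.1, 75.0, np.nan]])

# standard baselines: column-wise mean / median fill
X_mean = SimpleImputer(strategy="mean").fit_transform(X)
X_median = SimpleImputer(strategy="median").fit_transform(X)
```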

Process data based Anomaly detection in distributed energy generation using Neural Networks

This paper compares two neural-network approaches based on Long Short-Term Memory (LSTM) with respect to their ability to detect anomalous behavior in the real process data of a combined heat and power plant.

References

Showing 1-10 of 21 references

Recurrent Neural Networks for Multivariate Time Series with Missing Values

Novel deep learning models are developed based on the Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. The models take two representations of missing patterns, i.e., masking and time interval, and effectively incorporate them into a deep model architecture so that they not only capture the long-term temporal dependencies in time series but also utilize the missing patterns to achieve better prediction results.
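A minimal sketch of the two missing-pattern representations described above (masking and time interval), assuming a NumPy array with NaNs marking missing entries; this is illustrative and not the authors' released code.

```python
import numpy as np

def missingness_features(x, timestamps):
    """Build masking and time-interval representations for one multivariate series.

    x          : (T, D) array with np.nan marking missing entries
    timestamps : (T,) array of observation times

    Returns (values, mask, delta) where
      mask[t, d]  = 1 if x[t, d] was observed, else 0
      delta[t, d] = time elapsed since variable d was last observed
    """
    T, D = x.shape
    mask = (~np.isnan(x)).astype(float)
    delta = np.zeros((T, D))
    for t in range(1, T):
        gap = timestamps[t] - timestamps[t - 1]
        # if the previous value was observed the interval resets,
        # otherwise it keeps accumulating across missing steps
        delta[t] = np.where(mask[t - 1] == 1, gap, gap + delta[t - 1])
    values = np.nan_to_num(x, nan=0.0)  # placeholder fill; the model also sees the mask
    return values, mask, delta

# toy example: 4 irregularly spaced steps, 2 variables, some entries missing
x = np.array([[1.0, np.nan], [np.nan, 2.0], [3.0, np.nan], [np.nan, np.nan]])
t = np.array([0.0, 0.5, 1.5, 3.0])
vals, mask, delta = missingness_features(x, t)
```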

Temporal Belief Memory: Imputing Missing Data during RNN Training

TBM is a missing-value imputation method that considers time continuity and captures latent missing patterns based on the irregular real time intervals of the inputs; it outperforms all competitive baseline approaches on the septic shock early prediction task.
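The exact TBM update is not reproduced here; the sketch below is only an illustrative time-decay imputation in the same spirit, using the irregular time gaps so that a missing entry decays from the last observation toward the variable's empirical mean. The decay rate gamma is an assumed hyperparameter.

```python
import numpy as np

def decay_impute(x, timestamps, gamma=0.1):
    """Illustrative time-decay imputation (not the actual TBM algorithm):
    a missing value is filled with the last observation decayed toward the
    variable's mean, with the decay driven by the elapsed time."""
    T, D = x.shape
    col_mean = np.nanmean(x, axis=0)
    filled = x.copy()
    last_val = col_mean.copy()              # fall back to the mean before any observation
    last_time = np.full(D, timestamps[0], dtype=float)
    for t in range(T):
        for d in range(D):
            if np.isnan(filled[t, d]):
                w = np.exp(-gamma * (timestamps[t] - last_time[d]))
                filled[t, d] = w * last_val[d] + (1 - w) * col_mean[d]
            else:
                last_val[d] = filled[t, d]
                last_time[d] = timestamps[t]
    return filled
```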

The effects of the irregular sample and missing data in time series analysis.

It is concluded that irregularly sampled data sets with as much as 15 percent missing data can potentially be re-sampled or repaired for analysis with techniques that assume regular sampling without introducing substantial errors.
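As a hedged example of the kind of repair this reference evaluates, the snippet below re-samples an irregularly spaced series onto a regular grid with pandas and interpolates the gaps; the series, values, and 15-minute grid are illustrative choices, not taken from the paper.

```python
import numpy as np
import pandas as pd

# irregularly spaced observations of one vital sign (illustrative data)
ts = pd.Series(
    [72.0, 75.0, np.nan, 80.0, 78.0],
    index=pd.to_datetime(
        ["2019-01-01 00:00", "2019-01-01 00:07", "2019-01-01 00:21",
         "2019-01-01 00:40", "2019-01-01 01:05"]),
)

# re-sample onto a regular 15-minute grid, then interpolate in time
regular = ts.resample("15min").mean().interpolate(method="time")
print(regular)
```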

Functional Data Analysis for Sparse Longitudinal Data

We propose a nonparametric method to perform functional principal components analysis for the case of sparse longitudinal data. The method aims at irregularly spaced longitudinal data, where the number of repeated measurements available per subject is small.

Classification of Sparse and Irregularly Sampled Time Series with Mixtures of Expected Gaussian Kernels and Random Features

This paper proposes to first re-represent each time series through the Gaussian process (GP) posterior it induces under a GP regression model, then to define kernels over the space of GP posteriors and apply standard kernel-based classification.
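A minimal sketch of the first step only: re-representing one irregularly sampled series through the GP posterior it induces, evaluated on a fixed reference grid. The scikit-learn usage, kernel choice, and grid are assumptions for illustration; the full method additionally defines kernels over the posteriors themselves.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def gp_representation(times, values, ref_grid):
    """Fit a GP to one irregularly sampled series and return the posterior
    mean and standard deviation on a fixed reference grid; concatenated,
    these give a fixed-length feature vector for a downstream classifier."""
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(times.reshape(-1, 1), values)
    mean, std = gp.predict(ref_grid.reshape(-1, 1), return_std=True)
    return np.concatenate([mean, std])

# toy series: observations at irregular times
times = np.array([0.0, 0.4, 1.3, 2.7, 4.0])
values = np.array([1.0, 1.2, 0.7, 0.3, 0.5])
features = gp_representation(times, values, ref_grid=np.linspace(0, 4, 16))
```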

A scalable end-to-end Gaussian process adapter for irregularly sampled time series classification

This work proposes an uncertainty-aware classification framework based on a special computational layer known as the Gaussian process adapter that can connect irregularly sampled time series data to any black-box classifier learnable using gradient descent.

Missing data in medical databases: Impute, delete or classify?

Comparison of correlation analysis techniques for irregularly sampled time series

Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling.

Disease-Atlas: Navigating Disease Trajectories using Deep Learning

A deep learning approach is proposed to address limitations in standard joint models for longitudinal and time-to-event data, enhancing existing methods with the inherent flexibility and scalability of deep neural networks while retaining the benefits of joint modeling.

Long Short-Term Memory

A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
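For reference, a minimal PyTorch sketch of an LSTM used as a sequence classifier, the building block most of the models above extend; PyTorch and the layer sizes are assumptions for illustration, not part of the original paper.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Minimal LSTM sequence classifier: encode the series with an LSTM
    and predict from the final hidden state."""
    def __init__(self, n_features, hidden_size=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):            # x: (batch, time, features)
        _, (h_n, _) = self.lstm(x)   # h_n: (1, batch, hidden)
        return self.head(h_n[-1])    # logits: (batch, n_classes)

model = LSTMClassifier(n_features=8)
logits = model(torch.randn(4, 20, 8))  # batch of 4 series, 20 steps, 8 variables
```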