# Unsupervised Representation Learning for Time Series with Temporal Neighborhood Coding

@article{Tonekaboni2021UnsupervisedRL, title={Unsupervised Representation Learning for Time Series with Temporal Neighborhood Coding}, author={Sana Tonekaboni and Danny Eytan and Anna Goldenberg}, journal={ArXiv}, year={2021}, volume={abs/2106.00750} }

Time series are often complex and rich in information but sparsely labeled and therefore challenging to model. In this paper, we propose a self-supervised framework for learning generalizable representations for non-stationary time series. Our approach, called Temporal Neighborhood Coding (TNC), takes advantage of the local smoothness of a signal’s generative process to define neighborhoods in time with stationary properties. Using a debiased contrastive objective, our framework learns time…
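The neighborhood-based contrastive idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the encoder and discriminator here are fixed stand-ins (a real model learns both), and the window/neighborhood parameters are illustrative. The debiasing correction keeps a weighted positive term for "distant" samples, since a distant window may still share the anchor's state with some probability `w`.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(window):
    # Stand-in encoder: mean and std pooled over time. A real model would
    # be a learned network mapping a window to a representation vector.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def discriminator(z1, z2):
    # Dot-product score squashed to (0, 1); the paper learns a
    # discriminator jointly with the encoder.
    return 1.0 / (1.0 + np.exp(-np.dot(z1, z2)))

def tnc_loss(x, t, eta=10, window=20, w=0.05):
    """Debiased contrastive loss for one anchor window starting at t.

    Windows within +/- eta steps of t are treated as neighbors (assumed
    to share the anchor's stationary generative state); distant windows
    act as negatives, but their positive term is kept with weight w
    (the debiasing correction for possibly-similar "negatives").
    """
    anchor = encode(x[t:t + window])
    t_n = t + rng.integers(-eta, eta + 1)        # neighbor start time
    t_d = rng.integers(0, max(1, t - 5 * eta))   # distant start time
    z_n = encode(x[t_n:t_n + window])
    z_d = encode(x[t_d:t_d + window])
    p_n = discriminator(anchor, z_n)
    p_d = discriminator(anchor, z_d)
    return -(np.log(p_n) + (1 - w) * np.log(1 - p_d) + w * np.log(p_d))

x = rng.standard_normal((500, 3))   # toy multivariate series
loss = tnc_loss(x, t=200)
```

In the full framework, minimizing this loss over many anchors trains the encoder so that temporally neighboring windows map to nearby representations.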

## 36 Citations

### Contrastive Learning for Time Series on Dynamic Graphs

- Computer Science · 2022 30th European Signal Processing Conference (EUSIPCO)
- 2022

This paper proposes GraphTNC, a framework for unsupervised learning of joint representations of a graph and its time series using a contrastive learning strategy, and shows it to be beneficial for classification on real-world datasets.

### Iterative Bilinear Temporal-Spectral Fusion for Unsupervised Time-Series Representation Learning

- Computer Science · ArXiv
- 2022

This paper proposes a unified framework, Bilinear Temporal-Spectral Fusion (BTSF), which first applies instance-level augmentation (a simple dropout on the entire time series) to maximally capture long-term dependencies, and then devises a novel iterative bilinear temporal-spectral fusion to explicitly encode the affinities of abundant time-frequency pairs.

### Unsupervised Time-Series Representation Learning with Iterative Bilinear Temporal-Spectral Fusion

- Computer Science · ICML
- 2022

A novel iterative bilinear temporal-spectral fusion explicitly encodes the affinities of abundant time-frequency pairs, iteratively re-encoding representations in a fusion-and-squeeze manner with Spectrum-to-Time (S2T) and Time-to-Spectrum (T2S) aggregation modules.

### Cross Reconstruction Transformer for Self-Supervised Time Series Representation Learning

- Computer Science · ArXiv
- 2022

This paper learns representations for time series from a new perspective, proposing the Cross Reconstruction Transformer (CRT) to solve the aforementioned problems in a unified way, and shows that CRT consistently achieves the best performance over existing methods.

### Contrastive Learning for Unsupervised Domain Adaptation of Time Series

- Computer Science · ArXiv
- 2022

CLUDA, a novel framework for unsupervised domain adaptation (UDA) of time series, uses contrastive learning to learn contextual representations of multivariate time series that preserve label information for the prediction task.

### Decoupling Local and Global Representations of Time Series

- Computer Science · AISTATS
- 2022

This paper proposes a novel generative approach for learning representations for the global and local factors of variation in time series, and introduces counterfactual regularization that minimizes the mutual information between the two variables.

### Utilizing Expert Features for Contrastive Learning of Time-Series Representations

- Computer Science · ICML
- 2022

The proposed ExpCLR is a novel contrastive learning approach built on an objective that utilizes expert features to encourage both properties for the learned representation and surpasses several state-of-the-art methods for both unsupervised and semi-supervised representation learning.

### TS2Vec: Towards Universal Representation of Time Series

- Computer Science · AAAI
- 2022

TS2Vec performs contrastive learning in a hierarchical way over augmented context views, enabling a robust contextual representation for each timestamp and achieving significant improvement over existing state-of-the-art methods for unsupervised time-series representation.
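The hierarchical scheme can be sketched as below. This is a minimal illustration of the idea (contrast matched timestamps across two augmented views, then max-pool along time and repeat), not TS2Vec's full objective, which also includes an instance-wise term and learned encoders:

```python
import numpy as np

def timestamp_contrast(z1, z2, tau=1.0):
    # z1, z2: (T, d) representations of two augmented views of one series;
    # the same timestamp in the other view is the positive pair.
    sim = (z1 @ z2.T) / tau
    sim -= sim.max(axis=1, keepdims=True)            # numerical stability
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))            # positives on the diagonal

def hierarchical_loss(z1, z2):
    # Apply the contrastive loss at every time scale, max-pooling the
    # representations along time by a factor of 2 between levels.
    total, levels = 0.0, 0
    while len(z1) > 1:
        total += timestamp_contrast(z1, z2)
        levels += 1
        T = len(z1) // 2 * 2                          # drop an odd tail
        z1 = z1[:T].reshape(-1, 2, z1.shape[1]).max(axis=1)
        z2 = z2[:T].reshape(-1, 2, z2.shape[1]).max(axis=1)
    return total / levels

rng = np.random.default_rng(0)
z = rng.standard_normal((16, 8))                      # toy view 1
loss = hierarchical_loss(z + 0.1 * rng.standard_normal((16, 8)), z)
```

Pooling before re-applying the loss is what makes the resulting representation consistent across time scales, from individual timestamps up to the whole series.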

### Self-Supervised Contrastive Pre-Training For Time Series via Time-Frequency Consistency

- Computer Science · ArXiv
- 2022

A decomposable pre-training model is defined, where the self-supervised signal is provided by the distance between time and frequency components, each individually trained by contrastive estimation.

### Learning Timestamp-Level Representations for Time Series with Hierarchical Contrastive Loss

- Computer Science · ArXiv
- 2021

This paper presents TS2Vec, a universal framework for learning timestamp-level representations of time series; it performs timestamp-wise discrimination, learning a contextual representation vector directly for each timestamp.

## References

Showing 1–10 of 43 references

### Learning Representations for Time Series Clustering

- Computer Science · NeurIPS
- 2019

A novel unsupervised temporal representation learning model, Deep Temporal Clustering Representation (DTCR), integrates temporal reconstruction and a K-means objective into a seq2seq model, leading to improved cluster structures and cluster-specific temporal representations.

### Representation Learning with Contrastive Predictive Coding

- Computer Science · ArXiv
- 2018

This work proposes a universal unsupervised learning approach to extract useful representations from high-dimensional data, which it calls Contrastive Predictive Coding, and demonstrates that the approach is able to learn useful representations achieving strong performance on four distinct domains: speech, images, text and reinforcement learning in 3D environments.
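As a minimal sketch of the InfoNCE objective that Contrastive Predictive Coding optimizes, using the paper's bilinear score f_k(z, c) = z^T W_k c (the names and shapes here are illustrative, and a real model learns W jointly with the encoders):

```python
import numpy as np

def info_nce(c_t, z_pos, z_neg, W):
    # c_t: context vector summarizing the past; z_pos: true future latent
    # (the positive); z_neg: (N, d) latents from other times/sequences;
    # W: the learned step-specific matrix of the bilinear score.
    pred = W @ c_t                                   # predicted future latent
    scores = np.concatenate(([z_pos @ pred], z_neg @ pred))
    scores -= scores.max()                           # numerical stability
    return -(scores[0] - np.log(np.exp(scores).sum()))  # -log softmax of positive

rng = np.random.default_rng(0)
d = 4
W = rng.standard_normal((d, d))
loss = info_nce(rng.standard_normal(d), rng.standard_normal(d),
                rng.standard_normal((8, d)), W)
```

Minimizing this loss across prediction steps is what drives the context vector to retain information predictive of the future, which is the "useful representation" the summary above refers to.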

### TimeNet: Pre-trained deep recurrent neural network for time series classification

- Computer Science · ESANN
- 2017

TimeNet, a deep recurrent neural network trained on diverse time series in an unsupervised manner using sequence-to-sequence (seq2seq) models, extracts features from time series and attempts to generalize representations across domains by ingesting time series from several domains simultaneously.

### Similarity Preserving Representation Learning for Time Series Clustering

- Computer Science · IJCAI
- 2019

An efficient representation learning framework converts a set of time series of various lengths into an instance-feature matrix while guaranteeing that pairwise similarities between time series are well preserved after the transformation, making the learned feature representation particularly suitable for the time-series clustering task.

### Unsupervised Feature Extraction by Time-Contrastive Learning and Nonlinear ICA

- Computer Science · NIPS
- 2016

This work proposes a new intuitive principle of unsupervised deep learning from time series, time-contrastive learning (TCL), which uses the nonstationary structure of the data, and shows how TCL relates to a nonlinear ICA model when ICA is redefined to include temporal nonstationarities.

### Representation Learning: A Review and New Perspectives

- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2013

Recent work in the area of unsupervised feature learning and deep learning is reviewed, covering advances in probabilistic models, autoencoders, manifold learning, and deep networks.

### A review of unsupervised feature learning and deep learning for time-series modeling

- Computer Science · Pattern Recognit. Lett.
- 2014

### Attentive State-Space Modeling of Disease Progression

- Computer Science · NeurIPS
- 2019

The attentive state-space model is developed, a deep probabilistic model that learns accurate and interpretable structured representations for disease trajectories that demonstrates superior predictive accuracy and provides insights into the progression of chronic disease.

### Computational Phenotype Discovery Using Unsupervised Feature Learning over Noisy, Sparse, and Irregular Clinical Data

- Computer Science · PLoS ONE
- 2013

From episodic, longitudinal sequences of serum uric acid measurements in 4368 individuals, this work produced continuous phenotypic features that suggest multiple population subtypes, and that accurately distinguished the uric-acid signatures of gout vs. acute leukemia despite not being optimized for the task.

### A Theoretical Analysis of Contrastive Unsupervised Representation Learning

- Computer Science · ICML
- 2019

This framework allows provable guarantees on the performance of the learned representations on an average classification task comprising a subset of the same set of latent classes, and shows that learned representations can reduce (labeled) sample complexity on downstream tasks.