Corpus ID: 7922388

Deep Symbolic Representation Learning for Heterogeneous Time-series Classification

Shengdong Zhang, Soheil Bahrampour, Naveen Ramakrishnan, Mohak Shah
In this paper, we consider the problem of event classification with multivariate time-series data consisting of heterogeneous (continuous and categorical) variables. The complex temporal dependencies between the variables, combined with the sparsity of the data, make the event classification problem particularly challenging. Most state-of-the-art approaches address this either by designing hand-engineered features or by breaking up the problem over homogeneous variates. In this work, we propose and…
1 Citation
Feature Selection for Improving Failure Detection in Hard Disk Drives Using a Genetic Algorithm and Significance Scores
A two-tier approach is presented to select the most effective precursors of a failing HDD. It uses fewer SMART attributes, which reduces the required training time for the classifier, and it does not require tuning any parameters or thresholds.

References


Multimodal Task-Driven Dictionary Learning for Image Classification
This paper proposes a multimodal task-driven dictionary learning algorithm under a joint sparsity constraint (prior) to enforce collaboration among multiple homogeneous/heterogeneous sources of information, and presents an extension of the proposed formulation using a mixed joint and independent sparsity prior, which facilitates more flexible fusion of the modalities at the feature level.
Semi-supervised Sequence Learning
Two approaches to using unlabeled data to improve sequence learning with recurrent networks are presented, and it is found that long short-term memory recurrent networks become more stable to train and generalize better after being pretrained with the two approaches.
Task-Driven Dictionary Learning
  • J. Mairal, F. Bach, J. Ponce
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2012
This paper presents a general formulation for supervised dictionary learning adapted to a wide variety of tasks, together with an efficient algorithm for solving the corresponding optimization problem.
Sequence to Sequence Learning with Neural Networks
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences markedly improved the LSTM's performance, because doing so introduced many short-term dependencies between the source and target sentences, which made the optimization problem easier.
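The source-reversal trick summarized above can be sketched in a few lines. This is a toy illustration with hypothetical token lists, not the paper's encoder-decoder pipeline: the point is simply that the first source words end up adjacent to the start of the target sentence.

```python
# Minimal sketch of the source-reversal trick (illustrative tokens only;
# the original work applies this when feeding an encoder-decoder LSTM).
def reverse_source(src_tokens):
    """Feed the source sentence to the encoder in reverse order, so the
    first source words sit closest to the first target words."""
    return list(reversed(src_tokens))

src = ["the", "cat", "sat"]
print(reverse_source(src))  # ['sat', 'cat', 'the']
```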
A Simple Way to Initialize Recurrent Networks of Rectified Linear Units
This paper proposes a simpler solution that uses recurrent neural networks composed of rectified linear units, which is comparable to the LSTM on four benchmarks: two toy problems involving long-range temporal structure, a large language-modeling problem, and a benchmark speech recognition problem.
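The "simple initialization" in that line of work is, as best recalled, an identity recurrent weight matrix combined with ReLU hidden units, so that at initialization the hidden state is copied forward unchanged. A hedged sketch of one recurrent step, with illustrative names and sizes:

```python
import numpy as np

# Hedged sketch of a ReLU RNN step with identity recurrent initialization
# (the "IRNN" idea): with W_hh = I and zero input contribution, a
# non-negative hidden state is preserved exactly across the step.
def irnn_step(h, x, W_hh, W_xh, b):
    return np.maximum(0.0, h @ W_hh + x @ W_xh + b)

n_hid, n_in = 4, 3
W_hh = np.eye(n_hid)             # identity recurrent weights at init
W_xh = np.zeros((n_in, n_hid))   # input weights kept small (zero here)
b = np.zeros(n_hid)

h = np.array([1.0, 2.0, 0.5, 0.0])
h_next = irnn_step(h, np.zeros(n_in), W_hh, W_xh, b)
print(h_next)  # state preserved: [1.0, 2.0, 0.5, 0.0]
```

The design intuition is that the identity recurrence mimics the LSTM's ability to carry information forward unchanged, without gates.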
ImageNet classification with deep convolutional neural networks
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into 1000 different classes, employing a recently developed regularization method called "dropout" that proved to be very effective.
Speech recognition with deep recurrent neural networks
This paper investigates deep recurrent neural networks, which combine the multiple levels of representation that have proved so effective in deep networks with the flexible use of long-range context that empowers RNNs.
Classification of patterns of EEG synchronization for seizure prediction
The authors' best machine learning technique applied to spatio-temporal patterns of EEG synchronization outperformed previous seizure prediction methods on the Freiburg dataset.
Sparse Representation for Signal Classification
The proposed approach combines the discrimination power of discriminative methods with the reconstruction property and sparsity of the sparse representation, which enables one to deal with signal corruptions: noise, missing data, and outliers.
Long Short-Term Memory
A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
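The "constant error carousel" above refers to the cell state, which is updated additively through gates so that error can flow across long time lags. A minimal single-step LSTM cell sketch, with illustrative names and random weights rather than the original formulation's exact notation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Minimal single-step LSTM cell sketch. The cell state c plays the role of
# the "constant error carousel": it is updated additively through the input
# and forget gates, which is what lets error signals persist over time.
def lstm_step(x, h, c, W, U, b):
    z = x @ W + h @ U + b                # pre-activations for all four gates
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c_new = f * c + i * np.tanh(g)       # additive cell-state update
    h_new = o * np.tanh(c_new)
    return h_new, c_new

n_in, n_hid = 3, 2
rng = np.random.default_rng(42)
W = 0.1 * rng.standard_normal((n_in, 4 * n_hid))
U = 0.1 * rng.standard_normal((n_hid, 4 * n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(np.ones(n_in), h, c, W, U, b)
print(h.shape, c.shape)  # (2,) (2,)
```

Because c is updated by addition rather than repeated matrix multiplication, its gradient path avoids the vanishing that plagues plain RNNs.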