Corpus ID: 5549418

The Neural Hawkes Process: A Neurally Self-Modulating Multivariate Point Process

@inproceedings{Mei2017TheNH,
  title={The Neural Hawkes Process: A Neurally Self-Modulating Multivariate Point Process},
  author={Hongyuan Mei and Jason Eisner},
  booktitle={NIPS},
  year={2017}
}
Many events occur in the world. […] This generative model allows past events to influence the future in complex and realistic ways by conditioning future event intensities on the hidden state of a recurrent neural network that has consumed the stream of past events. The model has desirable qualitative properties, and it achieves competitive likelihood and predictive accuracy on real and synthetic datasets, including under missing-data conditions.
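The core idea in the abstract — per-type event intensities read off the hidden state of an RNN that has consumed the event stream — can be illustrated with a minimal sketch. This is a toy, discrete-update caricature (the paper itself uses a continuous-time LSTM whose hidden state decays between events), and all weights here are random, untrained placeholders:

```python
import numpy as np

def softplus(x):
    # Maps R -> (0, inf): keeps intensities positive while still
    # allowing the hidden state to push them toward zero (inhibition).
    return np.log1p(np.exp(x))

class NeuralHawkesSketch:
    """Toy illustration: event intensities conditioned on an RNN state.

    Hypothetical, simplified stand-in for the neural Hawkes process;
    the real model evolves its state in continuous time between events.
    """

    def __init__(self, num_types, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(size=(hidden, num_types))       # event-type input weights
        self.W_hh = rng.normal(size=(hidden, hidden)) * 0.1    # recurrent weights
        self.W_out = rng.normal(size=(num_types, hidden))      # per-type readout
        self.h = np.zeros(hidden)

    def consume(self, event_type):
        # Update the hidden state after observing an event of this type,
        # so that past events modulate all future intensities.
        x = np.zeros(self.W_in.shape[1])
        x[event_type] = 1.0
        self.h = np.tanh(self.W_in @ x + self.W_hh @ self.h)

    def intensities(self):
        # lambda_k = softplus(w_k . h): one positive rate per event type.
        return softplus(self.W_out @ self.h)

model = NeuralHawkesSketch(num_types=3)
for ev in [0, 2, 1]:          # a short stream of past events
    model.consume(ev)
lams = model.intensities()    # three positive per-type intensities
print(lams)
```

Because the readout passes through a softplus rather than an exponential of a non-negative sum, past events can lower as well as raise future intensities — the "self-modulating" property the title refers to.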

Citations of this paper

Variational Neural Temporal Point Process
A variational neural temporal point process (VNTPP) is proposed that outperforms other deep-neural-network-based models and statistical processes on synthetic and real-world datasets and can generalize the representations of various event types.

Self-Adaptable Point Processes with Nonparametric Time Decays
Proposes SPRITE, a Self-adaptable Point pRocess that can decouple the influences between every pair of events and capture various time decays of the influence strengths, deriving a general construction that covers all possible time-decaying functions.

Self-Attentive Hawkes Processes
The proposed method adapts self-attention to fit the intensity function of Hawkes processes; it is better able to identify complicated dependency relationships between temporal events and to capture longer historical information.

Mutually Regressive Point Processes
This paper introduces the first general class of Bayesian point process models extended with a nonlinear component that allows both excitatory and inhibitory relationships in continuous time, and derives a fully Bayesian inference algorithm for these processes using Polya-Gamma augmentation and Poisson thinning.

Nonparametric Regressive Point Processes Based on Conditional Gaussian Processes
This work proposes and develops a new nonparametric regressive point process model based on Gaussian processes that can better represent many commonly observed real-world event sequences and capture dependencies between events that are difficult to model using existing nonparametric Hawkes process variants.

A Multi-Channel Neural Graphical Event Model with Negative Evidence
This work uses a novel multi-channel RNN that optimally reinforces the negative evidence of no observable events, introducing fake event epochs within each consecutive inter-event interval to estimate the underlying intensity functions.

Deep Attention Spatio-Temporal Point Processes
A novel attention-based sequential model for mutually dependent spatio-temporal discrete event data, offering a versatile framework for capturing the nonhomogeneous influence of events; superior performance compared to the state of the art is demonstrated on both synthetic and real data.

Attentive Neural Point Processes for Event Forecasting (Gu, 2021)
This paper proposes ANPP, an Attentive Neural Point Processes framework that leverages a time-aware self-attention mechanism to explicitly model the influence between every pair of historical events, resulting in more accurate event predictions and better interpretability.

Neural Spectral Marked Point Processes
This paper introduces a novel and general neural-network-based non-stationary influence kernel with high expressiveness for handling complex discrete event data while providing theoretical performance guarantees, and demonstrates superior performance compared with the state of the art on synthetic and real data.

Graph Hawkes Neural Network for Future Prediction on Temporal Knowledge Graphs (2020)
The Graph Hawkes Neural Network is proposed to capture the dynamics of evolving graph sequences and predict the occurrence of a fact at a future time; it is effective on large-scale temporal relational databases.
…

References

Showing 1-10 of 38 references
Joint Modeling of Event Sequence and Time Series with Attentional Twin Recurrent Neural Networks
This paper uses the rich framework of (temporal) point processes to model event data, updating the intensity function in a timely way via synergic twin recurrent neural networks (RNNs), and demonstrates the superiority of the model on synthetic data and three real-world benchmark datasets.

Learning Network of Multivariate Hawkes Processes: A Time Series Approach
Studies the problem of recovering the causal structure in a network of multivariate linear Hawkes processes, which encodes the causal factorization of the joint distribution of the processes, and shows that the resulting causal influence network is equivalent to the Directed Information Graph (DIG).

Recurrent Marked Temporal Point Processes: Embedding Event History to Vector
The Recurrent Marked Temporal Point Process is proposed to simultaneously model event timings and markers, using a recurrent neural network to automatically learn a representation of influences from the event history; an efficient stochastic gradient algorithm is developed for learning the model parameters.

Wasserstein Learning of Deep Generative Point Process Models
An intensity-free approach to point process modeling that transforms nuisance processes into a target one and is trained with a likelihood-free objective based on the Wasserstein distance between point processes.

Isotonic Hawkes Processes
Shows that Isotonic-Hawkes processes can fit a variety of nonlinear patterns which cannot be captured by conventional Hawkes processes, and achieve superior empirical performance in real-world applications.

Learning Triggering Kernels for Multi-dimensional Hawkes Processes
Focuses on nonparametric learning of the triggering kernels and proposes an algorithm, MMEL, that combines decoupling the parameters via a tight upper bound on the objective function with the application of Euler-Lagrange equations for optimization in an infinite-dimensional functional space.

Visualizing and Understanding Recurrent Networks
Uses character-level language models as an interpretable testbed to analyze LSTM representations, predictions, and error types, revealing the existence of interpretable cells that keep track of long-range dependencies such as line lengths, quotes, and brackets.

Hawkes Processes with Stochastic Excitations
This work generalizes a recent algorithm for simulating draws from Hawkes processes whose levels of excitation are stochastic processes, and proposes a hybrid Markov chain Monte Carlo approach for model fitting.
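Several of the references above (including the simulation algorithm just mentioned and the Poisson thinning used by Mutually Regressive Point Processes) build on Ogata-style thinning. As background, here is a minimal sketch of thinning for a plain univariate Hawkes process with an exponential kernel — a textbook illustration, not the stochastic-excitation sampler of the paper above, and all parameter values are made up:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Ogata-style thinning for a univariate Hawkes process with
    intensity lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i)).
    Assumes alpha < beta so the process does not explode."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < T:
        # The current intensity upper-bounds lambda on (t, next event),
        # because the exponential kernel only decays between events.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)   # candidate from rate-lam_bar Poisson
        if t >= T:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lambda(t) / bound
            events.append(t)                  # accepted events excite the future
    return events

ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, T=50.0)
print(len(ev))
```

The neural Hawkes process replaces the fixed exponential kernel with intensities read from a recurrent state, but the same thinning idea (sample from an upper bound, accept with probability lambda(t)/bound) still applies as long as a valid upper bound is available.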
Dirichlet-Hawkes Processes with Applications to Clustering Continuous-Time Document Streams
This model establishes a previously unexplored connection between Bayesian nonparametrics and temporal point processes: the number of clusters grows to accommodate the increasing complexity of online streaming content, while the model adapts to the ever-changing dynamics of continuous arrival times.

SEISMIC: A Self-Exciting Point Process Model for Predicting Tweet Popularity
Builds on the theory of self-exciting point processes to develop a statistical model that allows accurate predictions of the final number of reshares of a given post, and demonstrates a strong improvement in predictive accuracy over existing approaches.
…