Recursive max-linear models with propagating noise

@article{Buck2021RecursiveMM,
  title={Recursive max-linear models with propagating noise},
  author={Johannes Buck and Claudia Kl{\"u}ppelberg},
  journal={Electronic Journal of Statistics},
  year={2021}
}
Recursive max-linear vectors model causal dependence between node variables by a structural equation model, expressing each node variable as a max-linear function of its parental nodes in a directed acyclic graph (DAG) and some exogenous innovation. For such a model, there exists a unique minimum DAG, represented by the Kleene star matrix of its edge weight matrix, which identifies the model and can be estimated. For a more realistic statistical modeling we introduce some random observational… 
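The Kleene star construction mentioned in the abstract can be sketched in code. The following is a minimal illustration (not the paper's implementation) assuming the max-times convention of Gissibl and Klüppelberg: `C[i, j]` is the weight of edge `j -> i`, the matrix product is `(A ⊙ B)[i, j] = max_k A[i, k] * B[k, j]`, and for a DAG on `d` nodes the star terminates after `d - 1` powers.

```python
import numpy as np

def max_times(A, B):
    """Max-times matrix product: (A ⊙ B)[i, j] = max_k A[i, k] * B[k, j]."""
    return np.max(A[:, :, None] * B[None, :, :], axis=1)

def kleene_star(C):
    """Kleene star C* = I ∨ C ∨ C⊙C ∨ ... of an edge weight matrix.

    For a DAG, paths have at most d - 1 edges, so the series
    stabilizes after d - 1 max-times powers.
    """
    d = C.shape[0]
    star = np.eye(d)   # the identity accounts for the empty path
    power = np.eye(d)
    for _ in range(d - 1):
        power = max_times(power, C)
        star = np.maximum(star, power)
    return star

# Hypothetical chain DAG 0 -> 1 -> 2 with edge weights 0.5 and 0.8:
C = np.zeros((3, 3))
C[1, 0] = 0.5   # edge 0 -> 1
C[2, 1] = 0.8   # edge 1 -> 2
B = kleene_star(C)
# B[2, 0] is the weight of the path 0 -> 1 -> 2, i.e. 0.5 * 0.8 = 0.4
```

Entry `B[i, j]` of the star is the maximal product of edge weights over all paths from `j` to `i`, which is why it identifies the minimum DAG representing the model.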


Estimating an extreme Bayesian network via scalings
Conditional independence in max-linear Bayesian networks (Feb 2020)
TLDR
This work uses tropical linear algebra to derive a compact representation of the conditional distribution given a partial observation, and exploits this to obtain a complete description of all conditional independence relations.
Graphical models for extremes
  • Sebastian Engelke, Adrien Hitz
  • Computer Science, Mathematics
    Journal of the Royal Statistical Society: Series B (Statistical Methodology)
  • 2020
TLDR
A general theory of conditional independence for multivariate Pareto distributions is introduced that enables the definition of graphical models and sparsity for extremes and it is shown that, similarly to the Gaussian case, the sparsity pattern of a general extremal graphical model can be read off from suitable inverse covariance matrices.
Estimating a Latent Tree for Extremes
TLDR
QTree is given, a simple and efficient algorithm to solve the Latent River Problem that outperforms existing methods, and under a Bayesian network model for extreme values with propagating noise, it is shown that the QTree estimator returns for n → ∞ a.s. the correct tree.
Causal Discovery of a River Network from its Extremes
TLDR
QTree is provided, a new and simple algorithm to solve the Hidden River Problem that outperforms existing methods and relies on qualitative aspects of the max-linear Bayesian network model.
Markov equivalence of max-linear Bayesian networks
TLDR
The parallel between the two theories via tropicalization is established, and the surprising result that the Markov equivalence classes for max-linear Bayesian networks coincide with the ones obtained by regular CI is established.
Gibbs posterior convergence and the thermodynamic formalism
TLDR
This work establishes tight connections between Gibbs posterior inference and the thermodynamic formalism, which may inspire new proof techniques in the study of Bayesian posterior consistency for dependent processes.
Sparse Structures for Multivariate Extremes
TLDR
The different forms of extremal dependence that can arise between the largest observations of a multivariate random vector are described and identification of groups of variables which can be concomitantly extreme is addressed.

References

Showing 1-10 of 36 references
Max-linear models on directed acyclic graphs
TLDR
All max- linear models which are generated by a recursive structural equation model are characterized, and it is shown that its max-linear coefficient matrix is the solution of a fixed point equation.
Estimating an extreme Bayesian network via scalings
Identifiability and estimation of recursive max‐linear models
TLDR
This work addresses the identifiability and estimation of recursive max‐linear structural equation models represented by an edge‐weighted directed acyclic graph (DAG) and develops a simple method for identifying the structure of the DAG.
Bayesian Networks for Max-Linear Models
TLDR
It is argued that the structure of a minimal network asymptotically can be identified completely from observational data.
Causal discovery with continuous additive noise models
TLDR
If the observational distribution follows a structural equation model with an additive noise structure, the directed acyclic graph becomes identifiable from the distribution under mild conditions, which constitutes an interesting alternative to traditional methods that assume faithfulness and identify only the Markov equivalence class of the graph, thus leaving some edges undirected.
Graphical models for extremes
  • Sebastian Engelke, Adrien Hitz
  • Computer Science, Mathematics
    Journal of the Royal Statistical Society: Series B (Statistical Methodology)
  • 2020
TLDR
A general theory of conditional independence for multivariate Pareto distributions is introduced that enables the definition of graphical models and sparsity for extremes and it is shown that, similarly to the Gaussian case, the sparsity pattern of a general extremal graphical model can be read off from suitable inverse covariance matrices.
A continuous updating weighted least squares estimator of tail dependence in high dimensions
Likelihood-based procedures are a common way to estimate tail dependence parameters. They are not applicable, however, in non-differentiable models such as those arising from recent max-linear…
Basic properties and prediction of max-ARMA processes
A max-autoregressive moving average (MARMA(p, q)) process {X_t} satisfies the recursion X_t = φ_1 X_{t−1} ∨ ⋯ ∨ φ_p X_{t−p} ∨ Z_t ∨ θ_1 Z_{t−1} ∨ ⋯ ∨ θ_q Z_{t−q} for all t, where φ_i, θ_j ≥ 0 and {Z_t} is i.i.d. with common Fréchet distribution function Φ_{1,σ}(x) = exp{−σ x^{−1}} for x > 0.
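The MARMA recursion can be illustrated with a short simulation. This is a sketch for the MARMA(1, 1) special case, assuming unit-scale parameters of my own choosing and Fréchet innovations sampled by inverting Φ_{1,σ}(x) = exp(−σ/x); it is not code from the referenced paper.

```python
import numpy as np

def simulate_marma11(n, phi, theta, sigma=1.0, seed=0):
    """Simulate a MARMA(1, 1) path X_t = phi*X_{t-1} ∨ Z_t ∨ theta*Z_{t-1}
    with i.i.d. Fréchet(1, sigma) innovations Z_t."""
    rng = np.random.default_rng(seed)
    # Inverse-CDF sampling: U ~ Uniform(0, 1)  =>  Z = -sigma / log(U)
    Z = -sigma / np.log(rng.uniform(size=n + 1))
    X = np.empty(n)
    # Start the recursion without the (unavailable) term phi * X_{-1}.
    X[0] = max(Z[1], theta * Z[0])
    for t in range(1, n):
        X[t] = max(phi * X[t - 1], Z[t + 1], theta * Z[t])
    return X
```

By construction every simulated value satisfies X_t ≥ φ X_{t−1}, which is the one-sided structure that max-linear time series models share with the recursive max-linear DAG models above.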
Adaptive function estimation in nonparametric regression with one-sided errors
We consider the model of nonregular nonparametric regression where smoothness constraints are imposed on the regression function $f$ and the regression errors are assumed to decay with some sharpness…
Moving-maximum models for extrema of time series