Learning sparse structural changes in high-dimensional Markov networks

@article{Liu2017LearningSS,
  title={Learning sparse structural changes in high-dimensional Markov networks},
  author={Song Liu and Kenji Fukumizu and Taiji Suzuki},
  journal={Behaviormetrika},
  year={2017},
  volume={44},
  pages={265-286}
}
Abstract

Recent years have seen increasing popularity of learning the sparse changes in Markov Networks. Changes in the structure of Markov Networks reflect alterations of interactions between random variables under different regimes and provide insights into the underlying system. While each individual network structure can be complicated and difficult to learn, the overall change from one network to another can be simple. This intuition gave birth to an approach that directly learns the…
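
The abstract is truncated, but the core identity behind direct change learning is compact. For pairwise Markov networks, the normalizing constants and all unchanged interactions cancel in the density ratio, so sparsity can be imposed directly on the parameter difference (a sketch in the paper's spirit; f denotes pairwise feature functions):

\[
p(x;\theta) \propto \exp\Big(\sum_{u \ge v} \theta_{uv}^{\top} f(x_u, x_v)\Big),
\qquad
\frac{p(x;\theta^{p})}{p(x;\theta^{q})} \propto \exp\Big(\sum_{u \ge v} \big(\theta^{p}_{uv} - \theta^{q}_{uv}\big)^{\top} f(x_u, x_v)\Big),
\]

so an \ell_1 penalty on \delta_{uv} = \theta^{p}_{uv} - \theta^{q}_{uv} can recover a sparse change without estimating either network separately.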

Fast and Scalable Learning of Sparse Changes in High-Dimensional Gaussian Graphical Model Structure

A novel method, DIFFEE, is proposed for estimating DIFFerential networks via an Elementary Estimator in the high-dimensional setting; surprisingly, it achieves the same asymptotic convergence rates as state-of-the-art estimators that are much more difficult to compute.
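
Elementary estimators of this kind typically reduce to closed-form thresholding. A plausible rendering of the DIFFEE program (a hedged reconstruction, not quoted from the paper; T_v denotes elementwise soft-thresholding and \hat{\Sigma}_c, \hat{\Sigma}_d are the two sample covariances):

\[
\hat{\Delta} = \arg\min_{\Delta}\ \|\Delta\|_1
\quad \text{s.t.} \quad
\big\|\, \Delta - \big( [T_v(\hat{\Sigma}_d)]^{-1} - [T_v(\hat{\Sigma}_c)]^{-1} \big) \big\|_\infty \le \lambda_n ,
\]

which is solved by soft-thresholding the backward-mapped difference entrywise, hence the scalability.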

Limits on Testing Structural Changes in Ising Models

Novel information-theoretic limits on detecting sparse changes in Ising models are presented, suggesting that testing of graphical models may not be amenable to concepts such as restricted strong convexity leveraged for sparsity pattern recovery, and algorithm development instead should be directed towards detection of large changes.

Adding Extra Knowledge in Scalable Learning of Sparse Differential Gaussian Graphical Models

A novel method, KDiffNet, is proposed that incorporates Additional Knowledge in identifying Differential Networks via an Elementary Estimator, and designs a novel hybrid norm as a superposition of two structured norms guided by the extra edge information and the additional node-group knowledge.

Differential Network Learning Beyond Data Samples

A novel differential parameter estimator is proposed that simultaneously allows flexible integration of multiple sources of information, scales to a large number of variables, and achieves a sharp asymptotic convergence rate.

Scalable Inference of Sparsely-changing Gaussian Markov Random Fields

This work introduces a new class of constrained optimization problems for the inference of sparsely-changing Gaussian MRFs, and derives sharp statistical guarantees in the high-dimensional regime, showing that such problems can be learned with as few as one sample per time period.
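
One plausible shape for such a constrained program (an illustration only, not necessarily the authors' exact formulation): fit each time period by Gaussian maximum likelihood while directly bounding how many precision-matrix entries may change between consecutive periods,

\[
\min_{\Theta_1,\dots,\Theta_T \succ 0}\ \sum_{t=1}^{T} \Big( \mathrm{tr}\big(\hat{\Sigma}_t \Theta_t\big) - \log\det \Theta_t \Big)
\quad \text{s.t.} \quad
\|\Theta_t - \Theta_{t-1}\|_0 \le d \ \ \text{for all } t .
\]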

Direct Estimation of Differences in Causal Graphs

This work provides the first provably consistent method for directly estimating the differences in a pair of causal DAGs without separately learning two possibly large and dense DAG models and computing their difference.

Two-Sample Testing can be as Hard as Structure Learning in Ising Models: Minimax Lower Bounds

This work extends the previously developed framework to consider this problem, and shows that, in a certain parameter regime, large changes do not provide any significant improvement in the number of necessary samples for reliable two-sample testing.

Beyond Data Samples: Aligning Differential Networks Estimation with Scientific Knowledge

A novel differential network estimator is proposed that integrates various sources of knowledge beyond data samples, scales to a large number of variables, and achieves a sharp asymptotic convergence rate.

Lower bounds for two-sample structural change detection in Ising and Gaussian models

The change detection bounds inherit partial tightness from the structure learning schemes in previous literature, demonstrating that in certain parameter regimes, the naive structure learning based approach to change detection is minimax optimal up to constant factors.

DCI: Learning Causal Differences between Gene Regulatory Networks

This work proposes an algorithm that efficiently learns the differences in gene regulatory mechanisms between conditions through difference causal inference (DCI), and provides a user-friendly Python implementation that lets the user learn the most robust difference causal graph across tuning parameters via stability selection.
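
The stability-selection step is easy to illustrate generically. The sketch below is not the DCI package's API; estimate_difference_graph is a hypothetical stand-in for any differential-graph estimator, and the rule shown is the usual one: keep edges that persist across subsamples and tuning parameters.

import numpy as np

def stability_select(X1, X2, estimate_difference_graph, alphas,
                     n_subsamples=50, threshold=0.6, seed=0):
    """Keep edges whose selection frequency across random subsamples
    and tuning parameters `alphas` exceeds `threshold`."""
    rng = np.random.default_rng(seed)
    p = X1.shape[1]
    counts = np.zeros((p, p))
    total = 0
    for alpha in alphas:
        for _ in range(n_subsamples):
            i1 = rng.choice(len(X1), size=len(X1) // 2, replace=False)
            i2 = rng.choice(len(X2), size=len(X2) // 2, replace=False)
            # the estimator is assumed to return a boolean p x p adjacency
            counts += estimate_difference_graph(X1[i1], X2[i2], alpha)
            total += 1
    return counts / total >= threshold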

References

Showing 1-10 of 42 references

Support Consistency of Direct Sparse-Change Learning in Markov Networks

Sufficient conditions are given for successful change detection with respect to the sample sizes n_p and n_q, the data dimension m, and the number of changed edges d.

Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation

A new method for detecting changes in Markov network structure between two sets of samples is proposed, which directly learns the network structure change by estimating the ratio of Markov network models.
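
A minimal numerical sketch of this density-ratio idea (a KLIEP-style objective with an \ell_1 penalty on the pairwise change parameters, solved by proximal gradient; illustrative code, not the authors' implementation):

import numpy as np

def pairwise_features(X):
    # f(x) = upper-triangular entries of x x^T (pairwise interactions)
    iu = np.triu_indices(X.shape[1])
    return np.einsum('ni,nj->nij', X, X)[:, iu[0], iu[1]]

def sparse_change_kliep(Xp, Xq, lam=0.1, lr=1e-2, n_iter=2000):
    """Maximize mean_p[delta^T f] - log mean_q[exp(delta^T f)] - lam*||delta||_1;
    nonzero entries of delta flag changed pairwise interactions."""
    Fp, Fq = pairwise_features(Xp), pairwise_features(Xq)
    delta = np.zeros(Fp.shape[1])
    for _ in range(n_iter):
        s = Fq @ delta
        w = np.exp(s - s.max())
        w /= w.sum()                      # softmax weights over q-samples
        grad = Fp.mean(axis=0) - w @ Fq   # gradient of the smooth part
        delta += lr * grad                # ascent step
        # proximal step: soft-thresholding enforces a sparse change
        delta = np.sign(delta) * np.maximum(np.abs(delta) - lr * lam, 0.0)
    return delta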

Estimating networks with jumps.

A procedure is proposed that estimates the structure of a graphical model by minimizing a temporally smoothed, L1-penalized regression, which allows jointly estimating the partition boundaries of the VCVS model and the coefficients of the sparse precision matrix on each block of the partition.

Node-based learning of multiple Gaussian graphical models

This work takes a node-based approach to estimation of high-dimensional Gaussian graphical models corresponding to a single set of variables under several distinct conditions, and derives a set of necessary and sufficient conditions that allows the problem to decompose into independent subproblems so that the algorithm can be scaled to high-dimensional settings.

Learning Structural Changes of Gaussian Graphical Models in Controlled Experiments

An effective learning strategy to extract structural changes in Gaussian graphical model using l1-regularization based convex optimization is reported and an efficient implementation by the block coordinate descent algorithm is introduced.

Direct estimation of differential networks.

In this paper, each condition-specific network is modeled using the precision matrix of a multivariate normal random vector, and a method is proposed to directly estimate the difference of the precision matrices.
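
The algebra that makes direct estimation possible is worth recording. With precision matrices \Omega_p = \Sigma_p^{-1} and \Omega_q = \Sigma_q^{-1}, the difference \Delta = \Omega_q - \Omega_p satisfies \Sigma_p \Delta \Sigma_q = \Sigma_p - \Sigma_q, which suggests a Dantzig-type program built only from sample covariances (a sketch of the construction):

\[
\hat{\Delta} = \arg\min_{\Delta}\ \|\Delta\|_1
\quad \text{s.t.} \quad
\big\| \hat{\Sigma}_p \Delta \hat{\Sigma}_q - (\hat{\Sigma}_p - \hat{\Sigma}_q) \big\|_\infty \le \lambda ,
\]

so neither precision matrix has to be estimated on its own.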

Generalized Direct Change Estimation in Ising Model Structure

A norm-regularized estimator is presented and analyzed for directly estimating the change in structure, without having to estimate the structures of the individual Ising models, and can be generalized to other graphical models under mild assumptions.

High-dimensional graphs and variable selection with the Lasso

It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs; each neighborhood is estimated by a separate regression, making the problem equivalent to variable selection for Gaussian linear models.
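
Neighborhood selection is simple to state in code: regress each variable on all the others with the Lasso and read the graph off the nonzero coefficients. A minimal sketch using scikit-learn (the AND rule used for symmetrizing is one common choice):

import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.1):
    """Meinshausen-Buhlmann estimate: keep edge (j, k) only if the
    regressions for both j and k select each other (the AND rule)."""
    m = X.shape[1]
    selected = np.zeros((m, m), dtype=bool)
    for j in range(m):
        others = [k for k in range(m) if k != j]
        fit = Lasso(alpha=alpha).fit(X[:, others], X[:, j])
        selected[j, others] = fit.coef_ != 0
    return selected & selected.T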

Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses

For certain graph structures, the support of the inverse covariance matrix of indicator variables on the vertices of a graph reflects the conditional independence structure of the graph, and nonasymptotic guarantees for graph selection methods are provided.

Statistical Learning with Sparsity: The Lasso and Generalizations

Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data and extract useful and reproducible patterns from big datasets.