DANR: Discrepancy-aware Network Regularization

Hongyuan You, Furkan Kocayusufoglu, Ambuj K. Singh
Network regularization is an effective tool for incorporating structural prior knowledge to learn coherent models over networks, and has yielded provably accurate estimates in applications ranging from spatial economics to neuroimaging studies. Recently, there has been an increasing interest in extending network regularization to the spatio-temporal case to accommodate the evolution of networks. However, in both static and spatio-temporal cases, missing or corrupted edge weights can compromise… 

Subnetwork Mining with Spatial and Temporal Smoothness

A novel algorithm for mining a succinct set of subnetworks that are predictive and evolve with the progression of global network states, designed in a logistic regression framework that fits a model over multiple states of network samples.

Graph Laplacian Regularization for Image Denoising: Analysis in the Continuous Domain

This paper interprets neighborhood graphs of pixel patches as discrete counterparts of Riemannian manifolds and performs analysis in the continuous domain, providing insights into several fundamental aspects of graph Laplacian regularization for image denoising.
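To make the discrete side of this idea concrete, here is a minimal sketch (an illustrative toy, not the paper's continuous-domain analysis) of graph Laplacian regularization for denoising: minimize ||x - y||^2 + lam * x^T L x over a small path graph by plain gradient descent; the optimum satisfies (I + lam*L) x = y.

```python
# Minimal sketch of discrete graph Laplacian regularization for denoising.
# Toy example: a 3-node path graph with a noisy signal y; we minimize
# ||x - y||^2 + lam * x^T L x by gradient descent. The optimum solves
# the linear system (I + lam*L) x = y.

edges = [(0, 1), (1, 2)]          # path graph 0 - 1 - 2
y = [0.0, 1.0, 0.0]               # noisy observed signal
lam = 1.0                         # smoothness weight
n = len(y)

# Build the combinatorial graph Laplacian L = D - A as a dense matrix.
L = [[0.0] * n for _ in range(n)]
for j, k in edges:
    L[j][j] += 1.0
    L[k][k] += 1.0
    L[j][k] -= 1.0
    L[k][j] -= 1.0

x = y[:]                          # initialize at the noisy signal
step = 0.1                        # small enough for this quadratic
for _ in range(500):
    # gradient of ||x - y||^2 + lam * x^T L x
    Lx = [sum(L[i][j] * x[j] for j in range(n)) for i in range(n)]
    grad = [2.0 * (x[i] - y[i]) + 2.0 * lam * Lx[i] for i in range(n)]
    x = [x[i] - step * grad[i] for i in range(n)]

print(x)  # converges to the solution of (I + L) x = y, i.e. [0.25, 0.5, 0.25]
```

The spike at the middle node is pulled toward its neighbors, which is exactly the smoothing behavior the Laplacian penalty encodes.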

Network Lasso: Clustering and Optimization in Large Graphs

The network lasso is introduced: a generalization of the group lasso to a network setting that allows simultaneous clustering and optimization on graphs, together with an algorithm based on the Alternating Direction Method of Multipliers (ADMM) that solves the problem in a distributed, scalable manner.
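The nonsmooth edge penalty (a weighted sum of ||x_j - x_k||_2 terms) is what drives neighboring node variables to merge; the core primitive in ADMM-style solvers for it is the proximal operator of the l2 norm, a group soft-threshold. A minimal sketch of that primitive (illustrative only, not the full distributed solver):

```python
# Minimal sketch (an illustration, not the full network lasso solver):
# the proximal operator of v -> lam * ||v||_2, the "group soft-threshold"
# used in ADMM-style edge updates. When ||v||_2 <= lam the whole block
# collapses to zero, which is what merges neighboring node variables
# into clusters.

import math

def group_soft_threshold(v, lam):
    """prox_{lam * ||.||_2}(v) = max(0, 1 - lam / ||v||_2) * v."""
    norm = math.sqrt(sum(c * c for c in v))
    if norm <= lam:
        return [0.0] * len(v)
    scale = 1.0 - lam / norm
    return [scale * c for c in v]

print(group_soft_threshold([3.0, 4.0], 1.0))  # shrunk toward 0, approx [2.4, 3.2]
print(group_soft_threshold([0.3, 0.4], 1.0))  # block collapses to [0.0, 0.0]
```

Unlike element-wise soft-thresholding, the whole vector is either shrunk as a block or zeroed as a block, so the edge difference becomes exactly zero and the two endpoints fuse.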

Sparsistency of the Edge Lasso over Graphs

This paper investigates the sparsistency of the fused lasso for general graph structures, i.e., its ability to correctly recover the exact support of piecewise-constant graph-structured patterns asymptotically (for large-scale graphs), and refers to it as the Edge Lasso in the (structured) normal means setting.

In Response to Comment on "Network-constrained regularization and variable selection for analysis of genomic data"

A network-constrained regularization procedure that efficiently utilizes the known pathway structures in identifying the relevant genes and the subnetworks that might be related to phenotype in a general regression framework is introduced.
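For intuition, the network-constrained smoothness penalty beta^T L beta is simply a sum of squared coefficient differences across graph edges (since L = B^T B for the incidence matrix B), which is what allows the problem to be rewritten as a lasso on augmented data. A toy check of that identity (an illustrative assumption, not the paper's code):

```python
# Sketch (toy illustration) of the network-constrained penalty
# beta^T L beta. For an unweighted graph, the Laplacian factors as
# L = B^T B with B the edge incidence matrix, so the smoothness penalty
# equals a sum of squared coefficient differences across edges -- the
# fact behind the augmented-lasso reformulation.

edges = [(0, 1), (1, 2), (0, 2)]   # toy triangle "pathway" graph
n = 3
beta = [1.0, 2.0, 4.0]             # toy coefficient vector

# Penalty computed via the Laplacian quadratic form ...
L = [[0.0] * n for _ in range(n)]
for j, k in edges:
    L[j][j] += 1.0; L[k][k] += 1.0
    L[j][k] -= 1.0; L[k][j] -= 1.0
quad = sum(beta[i] * L[i][j] * beta[j] for i in range(n) for j in range(n))

# ... equals the sum of squared edge differences ||B beta||^2.
edge_sq = sum((beta[j] - beta[k]) ** 2 for j, k in edges)

print(quad, edge_sq)  # both 14.0: (1-2)^2 + (2-4)^2 + (1-4)^2 = 1 + 4 + 9
```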

Graph Laplacian Regularization for Large-Scale Semidefinite Programming

This paper shows how to solve very large semidefinite programs by a matrix factorization that leads to much smaller SDPs than those previously studied, and illustrates the approach on localization in large-scale sensor networks, where optimizations involving tens of thousands of nodes can be solved in just a few minutes.

Evolving Cluster Mixed-Membership Blockmodel for Time-Evolving Networks

This paper develops a network model featuring a state space mixture prior that tracks complex actor latent role changes through time, and demonstrates its utility as a network analysis tool, by applying it to United States Congress voting data.

Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples

A semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner is proposed, and properties of reproducing kernel Hilbert spaces are used to prove new Representer theorems that provide a theoretical basis for the algorithms.
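A minimal sketch of the idea in an assumed one-dimensional linear special case (not the full RKHS framework): unlabeled points never appear in the loss, yet they enter the objective through the graph smoothness term and thereby shrink the fitted model.

```python
# Minimal sketch (assumed 1-D linear special case, not the full RKHS
# framework) of manifold-regularized least squares. Labeled pairs fit a
# linear model w*x; unlabeled points enter only through the neighborhood
# graph, which penalizes w^2 * (x_j - x_k)^2 across edges. The objective
# is quadratic in w, so the minimizer is closed-form.

labeled = [(1.0, 1.0), (2.0, 2.0)]     # (x, y) pairs
unlabeled = [3.0]                      # unlabeled x only
xs = [x for x, _ in labeled] + unlabeled
edges = [(0, 1), (1, 2)]               # neighborhood graph over all points
gamma_a = 1.0                          # ambient (ridge) regularization
gamma_i = 1.0                          # intrinsic (manifold) regularization

# Minimize  sum_labeled (w*x_i - y_i)^2 + gamma_a * w^2
#         + gamma_i * sum_edges w^2 * (x_j - x_k)^2
num = sum(x * y for x, y in labeled)
den = (sum(x * x for x, _ in labeled) + gamma_a
       + gamma_i * sum((xs[j] - xs[k]) ** 2 for j, k in edges))
w = num / den

print(w)  # 5 / (5 + 1 + 2) = 0.625
```

Without the unlabeled point the denominator would be 7 instead of 8, so the unlabeled data alone changes the solution, which is the essence of the semi-supervised framework.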

Manifold Regularized Discriminative Nonnegative Matrix Factorization With Fast Gradient Descent

Manifold regularization and margin maximization are introduced into NMF, yielding the manifold-regularized discriminative NMF (MD-NMF), which addresses the limitations of standard NMF.

Temporal Regularized Matrix Factorization for High-dimensional Time Series Prediction

This paper develops novel regularization schemes and uses scalable matrix factorization methods suited for high-dimensional time-series data with many missing values, and makes interesting connections to graph regularization methods in the context of learning dependencies in an autoregressive framework.
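For intuition, a sketch of the temporal regularizer such models use (an assumed toy form with a fixed AR weight, not the paper's full alternating solver): each latent time series is penalized by its squared autoregressive residuals, so a series that exactly follows the learned dynamics incurs zero penalty.

```python
# Sketch (toy illustration) of a TRMF-style temporal regularizer: each
# latent time series x_t is encouraged to follow an autoregressive
# model, penalizing (x_t - sum_l w_l * x_{t-l})^2. A series that exactly
# follows the AR dynamics incurs zero penalty; deviations are charged
# quadratically, which also lets missing values be imputed coherently.

def ar_penalty(series, weights):
    """Sum of squared AR residuals; weights[l] multiplies x_{t-1-l}."""
    p = len(weights)
    total = 0.0
    for t in range(p, len(series)):
        pred = sum(weights[l] * series[t - 1 - l] for l in range(p))
        total += (series[t] - pred) ** 2
    return total

print(ar_penalty([1.0, 2.0, 4.0, 8.0], [2.0]))   # exact doubling: penalty 0.0
print(ar_penalty([1.0, 2.0, 4.0, 9.0], [2.0]))   # one residual of 1: penalty 1.0
```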