Publications
A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
TLDR
This paper studies an alternative inexact BCD approach which updates the variable blocks by successively minimizing a sequence of approximations of f which are either locally tight upper bounds of f or strictly convex local approximations of f.
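The block-successive scheme described above can be sketched on a toy quadratic; this is a minimal illustration, not the paper's algorithm, and it uses one of the approximation classes mentioned (a strictly convex proximal local approximation):

```python
import numpy as np

# Minimal block successive minimization sketch: each block x_i is updated
# by minimizing f plus a strictly convex proximal term around the current
# iterate (one approximation class the unified analysis covers).
def block_successive_min(A, b, blocks, rho=1.0, iters=200):
    """Minimize f(x) = 0.5 x'Ax - b'x block by block."""
    x = np.zeros(len(b))
    for _ in range(iters):
        for idx in blocks:
            others = np.setdiff1d(np.arange(len(b)), idx)
            # argmin over x_i of f(x_i, x_-i) + (rho/2)||x_i - x_i^k||^2
            rhs = b[idx] - A[np.ix_(idx, others)] @ x[others] + rho * x[idx]
            x[idx] = np.linalg.solve(A[np.ix_(idx, idx)] + rho * np.eye(len(idx)), rhs)
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = block_successive_min(A, b, blocks=[np.array([0]), np.array([1])])
# x approaches the solution of Ax = b
```

Because the proximal approximation is a locally tight, strictly convex model of f at each iterate, the block updates monotonically decrease f and the iterates converge to the stationary point of this strongly convex example.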
Towards K-means-friendly Spaces: Simultaneous Deep Learning and Clustering
TLDR
A joint DR and K-means clustering approach is proposed in which DR is accomplished by learning a deep neural network (DNN), exploiting the DNN's ability to approximate any nonlinear function.
Learning to Optimize: Training Deep Neural Networks for Interference Management
Numerical optimization has played a central role in addressing key signal processing (SP) problems. Highly effective methods have been developed for a large variety of SP applications such as …
On the linear convergence of the alternating direction method of multipliers
TLDR
This paper establishes the global R-linear convergence of the ADMM for minimizing the sum of any number of convex separable functions, assuming that a certain error bound condition holds true and the dual stepsize is sufficiently small.
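The setting above can be illustrated with a minimal scaled-form ADMM on two toy quadratics; this is a hedged sketch under invented problem data, not the paper's analysis, but it is an instance of the "sum of convex separable functions" the result covers:

```python
import numpy as np

# Minimal scaled-form ADMM sketch for min f(x) + g(z) s.t. x = z,
# with f = 0.5||x - a||^2 and g = 0.5||z - b||^2 (both have
# closed-form proximal steps; the true minimizer is (a + b) / 2).
def admm(a, b, rho=1.0, iters=100):
    x = np.zeros_like(a); z = np.zeros_like(a); u = np.zeros_like(a)
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1 + rho)   # prox step for f
        z = (b + rho * (x + u)) / (1 + rho)   # prox step for g
        u = u + (x - z)                       # scaled dual update
    return x, z

a = np.array([1.0, 3.0]); b = np.array([3.0, 1.0])
x, z = admm(a, b)
# x and z both approach the minimizer (a + b) / 2
```

On this strongly convex example the iterates contract linearly toward the solution, which is the qualitative behavior (R-linear convergence) the paper establishes under its error bound condition.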
Convergence analysis of alternating direction method of multipliers for a family of nonconvex problems
TLDR
It is shown that, in the presence of a nonconvex objective function, classical ADMM is able to reach the set of stationary solutions for these problems if the stepsize is chosen large enough.
On the Convergence of A Class of Adam-Type Algorithms for Non-Convex Optimization
TLDR
A set of mild sufficient conditions is provided that guarantees convergence for the Adam-type methods, and it is proved that under these conditions the methods achieve a convergence rate of order O(log T / √T) for nonconvex stochastic optimization.
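The Adam-type update family analyzed above can be sketched as follows; this is a generic single-variable Adam recursion on an invented quadratic, shown only to make the first/second-moment structure concrete:

```python
import numpy as np

# Minimal Adam-style update sketch: exponential moving averages of the
# gradient (m) and squared gradient (v) with bias correction. The
# paper's conditions constrain how the effective stepsize
# alpha * m_hat / sqrt(v_hat) evolves on nonconvex problems.
def adam(grad, x0, alpha=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g        # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        x = x - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return x

x = adam(lambda x: 2.0 * (x - 3.0), x0=0.0)
# x approaches the stationary point at 3.0
```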
Multi-Agent Distributed Optimization via Inexact Consensus ADMM
TLDR
Low-complexity algorithms are proposed that reduce the overall computational cost of consensus ADMM by an order of magnitude for certain large-scale problems.
Topology Attack and Defense for Graph Neural Networks: An Optimization Perspective
TLDR
A novel gradient-based attack method is presented that overcomes the difficulty of tackling discrete graph data; the corresponding defense yields higher robustness against both gradient-based and greedy attack methods without sacrificing classification accuracy on the original graph.
A Unified Algorithmic Framework for Block-Structured Optimization Involving Big Data: With applications in machine learning and signal processing
TLDR
In this article, various features and properties of the BSUM are discussed from the viewpoint of design flexibility, computational efficiency, parallel/distributed implementation, and the required communication overhead.
Joint Base Station Clustering and Beamformer Design for Partial Coordinated Transmission in Heterogeneous Networks
TLDR
This paper proposes an efficient algorithm based on iteratively solving a sequence of group LASSO problems; it performs BS clustering and beamformer design jointly, rather than separately as in existing approaches to partial coordinated transmission.
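The group-LASSO mechanism behind the joint design can be sketched via its proximal operator; this is a minimal generic illustration with invented data (the per-group zeroing is what switches off entire BS–user links in sparsity-based clustering):

```python
import numpy as np

# Minimal group soft-thresholding sketch: the proximal operator of the
# group-LASSO penalty lam * sum_g ||x_g||_2. Groups whose norm falls
# below lam are driven exactly to zero, pruning the whole block.
def group_soft_threshold(x, groups, lam):
    out = np.zeros_like(x)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * x[g]  # shrink the whole group
        # else: the entire group is set to zero
    return out

x = np.array([3.0, 4.0, 0.3, 0.4])
y = group_soft_threshold(x, groups=[np.array([0, 1]), np.array([2, 3])], lam=1.0)
# first group (norm 5) shrinks; second group (norm 0.5) is zeroed
```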