Abstract: Coordinate descent (CD) methods have seen a resurgence of interest because of their applicability in machine learning and large-scale data analysis, as well as their superior empirical performance. CD methods come in two variants: cyclic coordinate descent (CCD) and randomized coordinate descent (RCD), the deterministic and randomized versions of the method. In light of recent results in the literature, there is a large gap between the theory and practice of CCD methods; in particular, existing theoretical guarantees for CCD are far worse than those for RCD, despite the fact that CCD often works well in practice. In the first part of the talk, we provide problem classes in quadratic optimization for which CCD (or CD with any deterministic order) is provably faster than RCD in terms of asymptotic worst-case convergence, and we quantify the improvement over RCD as a function of the order chosen to update the coordinates in CCD. We also provide lower bounds on the performance of CCD by characterizing the best order, and discuss applications to Gaussian Belief Propagation and Kaczmarz algorithms for solving Laplacian linear systems.
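As a minimal sketch of the two variants being compared (not the analysis from the talk), the following minimizes a quadratic f(x) = (1/2) xᵀAx − bᵀx by exact single-coordinate updates, where the only difference between CCD and RCD is how the coordinate index is chosen at each step:

```python
import numpy as np

def coord_descent(A, b, n_iters=1000, order="cyclic", seed=0):
    """Minimize (1/2) x^T A x - b^T x by single-coordinate updates.

    A is assumed symmetric positive definite, so the exact minimizer
    along coordinate i is x_i <- x_i - (A x - b)_i / A_ii.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = np.zeros(n)
    for t in range(n_iters):
        if order == "cyclic":
            i = t % n                     # CCD: deterministic sweep 0, 1, ..., n-1, 0, ...
        else:
            i = rng.integers(n)           # RCD: uniformly random coordinate
        x[i] -= (A @ x - b)[i] / A[i, i]  # exact line minimization along e_i
    return x

# Toy SPD system: both variants converge to the solution of A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_ccd = coord_descent(A, b, order="cyclic")
x_rcd = coord_descent(A, b, order="random")
```

Both iterates converge to A⁻¹b here; the talk's results concern how the *rate* of convergence depends on the (deterministic vs. random) choice of update order.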
In the second part of the talk, we focus on distributed algorithms for computing effective resistances in an undirected graph and their applications to continuous-time averaging over networks and to solving Laplacian linear systems. The effective resistance between a pair of nodes in a weighted undirected graph is defined as the potential difference induced between them when a unit current is injected at the first node and extracted at the second, treating edge weights as the conductance values of the edges. The effective resistance is a key quantity of interest in many applications and fields, including the solution of linear systems, Markov chains, and continuous-time averaging networks. We develop an efficient, linearly convergent distributed algorithm for computing effective resistances and demonstrate its performance through numerical studies. In particular, we apply our algorithm to several problems, including consensus optimization and the distributed solution of Laplacian linear systems, where we show that the distributed algorithm we developed for effective resistances can be used to accelerate the convergence of the classical algorithms by a factor depending on the network structure.
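The definition above has a standard closed form that a centralized sketch can illustrate (this is not the distributed algorithm of the talk): with L the weighted graph Laplacian, the effective resistance between nodes u and v is R_uv = (e_u − e_v)ᵀ L⁺ (e_u − e_v), where L⁺ is the Moore-Penrose pseudoinverse.

```python
import numpy as np

def effective_resistance(L, u, v):
    """R_uv = (e_u - e_v)^T L^+ (e_u - e_v) for weighted graph Laplacian L."""
    L_pinv = np.linalg.pinv(L)   # pseudoinverse; L is singular (constant nullspace)
    e = np.zeros(L.shape[0])
    e[u], e[v] = 1.0, -1.0       # inject unit current at u, extract at v
    return e @ L_pinv @ e        # resulting potential difference between u and v

# Path graph 1-2-3 with unit edge weights: resistances add in series,
# so R between the endpoints is 1 + 1 = 2.
L = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
```

On this path graph, `effective_resistance(L, 0, 1)` gives 1 and `effective_resistance(L, 0, 2)` gives 2, matching the series-resistance intuition from circuit theory.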
Biography: Mert Gürbüzbalaban is an assistant professor at Rutgers University. Previously, he was a postdoctoral associate at the Laboratory for Information and Decision Systems (LIDS) at MIT. He is broadly interested in optimization and computational science driven by applications in large-scale information and decision systems and networks. He received his B.Sc. degrees in Electrical Engineering and Mathematics as a valedictorian from Boğaziçi University, Istanbul, Turkey, the "Diplôme d'ingénieur" degree from École Polytechnique, France, and the M.S. and Ph.D. degrees in Applied Mathematics from the Courant Institute of Mathematical Sciences, New York University.