SAYAS NUMERICS SEMINAR
March 23, 2021 at 3:30pm (Eastern Time)

Column Partition based Distributed Algorithms for Coupled Convex Sparse Optimization: Dual and Exact Regularization Approaches

Eswar Kumar Hathibelagal Kammara
University of Maryland, Baltimore County

In this talk, we discuss column-partition-based distributed schemes for a class of large-scale convex sparse optimization problems, e.g., basis pursuit (BP), LASSO, basis pursuit denoising (BPDN), and their extensions, e.g., fused LASSO. We are particularly interested in the case where the number of (scalar) decision variables is much larger than the number of (scalar) measurements, and each agent has limited memory or computing capacity, so that it knows only a small number of columns of the measurement matrix. The problems under consideration are densely coupled and cannot be formulated as separable convex programs by column partition alone.
    To overcome this difficulty, we consider their dual problems, which are separable or locally coupled. Once a dual solution is attained, we show that a primal solution can be recovered from the duals of the corresponding regularized BP-like problems under suitable exact regularization conditions. A wide range of existing distributed schemes can be exploited to solve the resulting dual problems.
    This yields two-stage column-partition-based distributed schemes for LASSO-like and BPDN-like problems, whose overall convergence is established using sensitivity analysis techniques. Numerical results illustrate the effectiveness of the proposed schemes.
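As a concrete (if centralized) illustration of why column partitioning interacts well with the dual, the NumPy sketch below solves a toy LASSO instance with ISTA and then checks the column-wise dual feasibility condition |a_j^T y| ≤ λ at the solution. Each block of this condition involves only one column of A, so an agent holding a subset of columns could verify its own block locally. This is an illustrative sketch under made-up problem data, not the distributed algorithm from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy LASSO instance: min_x 0.5*||A x - b||^2 + lam*||x||_1,
# with far more columns (decision variables) than rows (measurements).
m, n, lam = 20, 200, 0.5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Solve the primal with ISTA (proximal gradient); step = 1/L, L = ||A||_2^2.
L = np.linalg.norm(A, 2) ** 2
x = np.zeros(n)
for _ in range(20000):
    g = A.T @ (A @ x - b)
    z = x - g / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

# Dual variable (residual): y = b - A x*.  The KKT conditions for LASSO imply
# |a_j^T y| <= lam for every column a_j, with equality on the support of x*.
y = b - A @ x
corr = np.abs(A.T @ y)
print(f"max |a_j^T y| = {corr.max():.6f}  vs  lam = {lam}")
```

Note that `corr` is a vector of n per-column quantities; in a column-partitioned setting, each agent would compute only the entries corresponding to its own columns.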

Quantum-accelerated multilevel Monte Carlo methods for stochastic differential equations in mathematical finance

Jin-Peng Liu
University of Maryland, College Park

Inspired by recent progress in quantum algorithms for ordinary and partial differential equations, we study quantum algorithms for stochastic differential equations (SDEs). First, we provide a quantum algorithm that gives a quadratic speed-up for multilevel Monte Carlo methods in a general setting. As an application, we use it to compute expectation values determined by classical solutions of SDEs, with improved dependence on precision. We demonstrate the use of this algorithm in a variety of applications arising in mathematical finance, such as the Black-Scholes and Local Volatility models, and Greeks. We also provide a quantum algorithm based on sublinear binomial sampling for the binomial option pricing model with the same improvement.
arXiv article
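For readers unfamiliar with the classical baseline being accelerated, here is a minimal NumPy sketch of a multilevel Monte Carlo estimator for E[S_T] under Black-Scholes (geometric Brownian motion) dynamics, using Euler-Maruyama paths coupled across levels by shared Brownian increments. The parameters and per-level sample counts are illustrative choices, not taken from the talk; the quantum algorithm targets a quadratic improvement over the sampling cost of estimators of exactly this kind.

```python
import numpy as np

rng = np.random.default_rng(1)

# GBM: dS = r*S dt + sigma*S dW, with exact mean E[S_T] = S0 * exp(r*T).
S0, r, sigma, T = 1.0, 0.05, 0.2, 1.0

def euler_pair(n_samples, level):
    """Coupled Euler-Maruyama payoffs at level l (fine) and l-1 (coarse)."""
    nf = 2 ** level                      # number of fine time steps
    hf = T / nf
    dW = rng.standard_normal((n_samples, nf)) * np.sqrt(hf)
    Sf = np.full(n_samples, S0)
    for k in range(nf):
        Sf = Sf * (1 + r * hf + sigma * dW[:, k])
    if level == 0:
        return Sf, None
    # Coarse path reuses the same Brownian increments, summed in pairs,
    # which keeps the variance of the level correction P_l - P_{l-1} small.
    dWc = dW[:, 0::2] + dW[:, 1::2]
    hc = T / (nf // 2)
    Sc = np.full(n_samples, S0)
    for k in range(nf // 2):
        Sc = Sc * (1 + r * hc + sigma * dWc[:, k])
    return Sf, Sc

# Telescoping MLMC estimator: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
estimate = 0.0
for level, n_samples in [(0, 200000), (1, 50000), (2, 20000), (3, 10000)]:
    Sf, Sc = euler_pair(n_samples, level)
    estimate += np.mean(Sf if Sc is None else Sf - Sc)

print(f"MLMC estimate: {estimate:.4f}  (exact E[S_T] = {S0 * np.exp(r * T):.4f})")
```

The key design point is that most samples are drawn at the cheap coarse levels, while the expensive fine levels only correct the small level-to-level differences.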

RCHOL: Randomized Cholesky Factorization for Solving SDD Linear Systems

Chao Chen
University of Texas at Austin

We introduce a randomized algorithm, namely RCHOL, to construct an approximate Cholesky factorization for a given sparse Laplacian matrix (a.k.a. a graph Laplacian). The (exact) Cholesky factorization of such a matrix introduces a clique in the associated graph each time a row/column is eliminated. By randomization, RCHOL samples only a subset of the edges in that clique. We prove that RCHOL is breakdown-free and apply it to solving linear systems with symmetric diagonally dominant matrices. In addition, we parallelize RCHOL based on the nested-dissection ordering for shared-memory machines. Numerical experiments demonstrate the robustness and scalability of RCHOL. For example, our parallel code scales up to 64 threads on a single node when solving the 3D Poisson equation discretized with the 7-point stencil on a 1024-by-1024-by-1024 grid, i.e., about one billion unknowns.
arXiv article
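The clique phenomenon mentioned in the abstract can be seen on a toy example: eliminating the center of a star graph makes all of its previously disconnected neighbours mutually adjacent in the Schur complement. The NumPy sketch below performs one exact elimination step and counts the resulting fill-in edges; it illustrates the fill-in that RCHOL controls by sampling, and is not the RCHOL algorithm itself.

```python
import numpy as np

# Laplacian of a star graph: center 0 connected to leaves 1..5, no other edges.
n = 6
L = np.zeros((n, n))
for leaf in range(1, n):
    L[0, 0] += 1
    L[leaf, leaf] += 1
    L[0, leaf] = L[leaf, 0] = -1

# One step of Cholesky elimination on vertex 0 = Schur complement on 1..5.
S = L[1:, 1:] - np.outer(L[1:, 0], L[0, 1:]) / L[0, 0]

# Before elimination the leaves were mutually disconnected (off-diagonals 0);
# afterwards every pair is connected: a 5-clique of fill-in edges.
off_diag = S - np.diag(np.diag(S))
print("fill-in edges:", np.count_nonzero(off_diag) // 2)   # 10 = C(5,2)
```

The Schur complement is itself a (scaled) graph Laplacian, so its rows still sum to zero; a randomized variant would keep only a sampled subset of these 10 clique edges to preserve sparsity.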