In this talk, we discuss column partition based distributed schemes for a class of
large-scale convex sparse optimization problems, e.g., basis pursuit (BP), LASSO, basis
pursuit denoising (BPDN), and their extensions, e.g., the fused LASSO. We are particularly
interested in the cases where the number of (scalar) decision variables is much larger
than the number of (scalar) measurements, and each agent has limited memory or computing
capacity such that it only knows a small number of columns of a measurement matrix. These
problems under consideration are densely coupled and cannot be formulated as separable
convex programs under a column partition.
To overcome this difficulty, we consider their
dual problems, which are separable or locally coupled. Once a dual solution is attained,
we show that a primal solution can be recovered from the duals of the corresponding
regularized BP-like problems under suitable exact regularization conditions. A wide range
of existing distributed schemes can be exploited to solve the obtained dual problems.
This yields two-stage column partition based distributed schemes for LASSO-like and
BPDN-like problems; the overall convergence of these schemes is established using
sensitivity analysis techniques. Numerical results illustrate the effectiveness of the
proposed schemes.
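As a toy illustration of why passing to the dual helps under a column partition (a hypothetical sketch, not the schemes of the talk): for the LASSO min_x (1/2)||Ax - b||^2 + lam*||x||_1, the dual is the projection of -b onto the intersection of the per-column slabs {y : |a_j^T y| <= lam}, so each agent can enforce the constraints belonging to its own columns only. The sketch below solves this dual by Dykstra's alternating projections and checks the primal-dual link y* = A x* - b against a plain coordinate-descent LASSO solve; the sizes and the value of lam are made up for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, lam = 20, 50, 0.5          # illustrative sizes: n > m
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Dual of LASSO  min_x 0.5||Ax-b||^2 + lam||x||_1  is the projection
#   min_y 0.5||y + b||^2   s.t.  |a_j^T y| <= lam  for every column a_j:
# the densely coupled primal turns into n one-column constraints.

def proj_slab(y, a, lam):
    """Euclidean projection onto the slab {y : |a^T y| <= lam}."""
    t = a @ y
    if abs(t) <= lam:
        return y
    return y - (t - np.sign(t) * lam) / (a @ a) * a

def dual_by_dykstra(A, b, lam, sweeps=3000):
    """Project -b onto the intersection of the column slabs (Dykstra)."""
    m, n = A.shape
    y = -b.copy()
    incr = np.zeros((n, m))          # one Dykstra correction per constraint
    for _ in range(sweeps):
        for j in range(n):
            z = y + incr[j]
            y_new = proj_slab(z, A[:, j], lam)
            incr[j] = z - y_new
            y = y_new
    return y

def lasso_cd(A, b, lam, sweeps=5000):
    """Plain cyclic coordinate descent for the primal LASSO."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x
    col_sq = (A * A).sum(axis=0)
    for _ in range(sweeps):
        for j in range(n):
            aj = A[:, j]
            rho = aj @ r + col_sq[j] * x[j]
            xj = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r += aj * (x[j] - xj)
            x[j] = xj
    return x

y_dual = dual_by_dykstra(A, b, lam)
x_star = lasso_cd(A, b, lam)
# At optimality the dual solution equals the primal residual: y* = A x* - b.
gap = np.max(np.abs(y_dual - (A @ x_star - b)))
print("max |y* - (Ax* - b)|:", gap)
```

The point of the sketch is the constraint structure: the primal objective couples all columns through Ax, while each dual constraint touches a single column, which is what makes column partition based distributed schemes natural on the dual side.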
Inspired by recent progress in quantum algorithms for ordinary and partial differential
equations, we study quantum algorithms for stochastic differential equations (SDEs).
First, we provide a quantum algorithm that achieves a quadratic speed-up for multilevel
Monte Carlo methods in a general setting. As an application, we use it to compute
expectation values determined by classical solutions of SDEs, with improved dependence on
precision. We demonstrate the use of this algorithm in a variety of applications arising
in mathematical finance, such as the Black-Scholes and Local Volatility models, and
Greeks. We also provide a quantum algorithm based on sublinear binomial sampling for the
binomial option pricing model with the same improvement.
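For readers unfamiliar with the multilevel structure being accelerated, here is a purely classical MLMC sketch (no quantum part; all parameter values are illustrative) for a European call under Black-Scholes, using Euler-Maruyama paths. The telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}] couples fine and coarse paths through shared Brownian increments; the quadratic speed-up in the abstract concerns estimating each expectation in such a sum.

```python
import math
import numpy as np

# Geometric Brownian motion dS = r S dt + sigma S dW, European call payoff.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # illustrative parameters
rng = np.random.default_rng(1)

def payoff(ST):
    return math.exp(-r * T) * np.maximum(ST - K, 0.0)

def euler_paths(dW, h):
    """Euler-Maruyama for GBM given Brownian increments dW (paths x steps)."""
    S = np.full(dW.shape[0], S0)
    for k in range(dW.shape[1]):
        S = S + r * S * h + sigma * S * dW[:, k]
    return S

def mlmc_estimate(L=6, N=40000):
    """Telescoping MLMC sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}];
    fine and coarse paths on each level share the same Brownian noise.
    (A fixed N per level keeps the sketch short; real MLMC varies N_l.)"""
    est = 0.0
    for l in range(L + 1):
        nf = 2 ** l                       # fine steps on level l
        hf = T / nf
        dWf = rng.standard_normal((N, nf)) * math.sqrt(hf)
        Pf = payoff(euler_paths(dWf, hf))
        if l == 0:
            est += Pf.mean()
        else:
            # coarse path: sum consecutive pairs of fine increments
            dWc = dWf.reshape(N, nf // 2, 2).sum(axis=2)
            Pc = payoff(euler_paths(dWc, 2 * hf))
            est += (Pf - Pc).mean()
    return est

def black_scholes_call():
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S0 * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

price = mlmc_estimate()
print("MLMC estimate:", price, " closed form:", black_scholes_call())
```

Because the level-l correction P_l - P_{l-1} has small variance, most samples can be spent on the cheap coarse levels; this variance structure is exactly what the quantum mean-estimation subroutine exploits to improve the dependence on precision.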
We introduce a randomized algorithm, RCHOL, that constructs an approximate Cholesky
factorization of a given sparse Laplacian matrix (a.k.a. graph Laplacian). The
(exact) Cholesky factorization of the matrix introduces a clique in the associated graph
after each row/column is eliminated. By randomization, RCHOL samples a subset of the
edges in the clique. We prove that RCHOL is breakdown-free and apply it to solving
linear systems with symmetric diagonally-dominant matrices. In addition, we parallelize
RCHOL based on the nested dissection ordering for shared-memory machines. Numerical
experiments demonstrate the robustness and scalability of RCHOL. For example,
our parallel code scales to 64 threads on a single node when solving the 3D Poisson
equation discretized with the 7-point stencil on a 1024-by-1024-by-1024 grid,
i.e., about one billion unknowns.
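To convey the clique-sampling idea on a single elimination step (an illustrative sketch; the actual RCHOL sampler and its weight choices may differ): eliminating a vertex whose incident edges have weights w_1, ..., w_d adds an exact clique among its neighbors with edge weights w_i w_j / W, where W = sum_i w_i. The sketch below samples one partner per neighbor, keeping only d edges while matching the exact clique in expectation.

```python
import numpy as np

rng = np.random.default_rng(2)

def exact_clique(w):
    """Exact Schur complement of eliminating a vertex with edge weights w:
    clique edge (i, j) gets weight w_i * w_j / W."""
    W = w.sum()
    C = np.outer(w, w) / W
    np.fill_diagonal(C, 0.0)
    return C

def sampled_clique(w):
    """Randomized replacement: for each neighbor i, sample one partner j
    with probability w_j / W and add a single edge of weight
    w_i * w_j / (w_i + w_j). Only d edges are kept instead of d*(d-1)/2,
    and a short calculation shows E[C[i, j]] = w_i * w_j / W."""
    d = len(w)
    W = w.sum()
    C = np.zeros((d, d))
    p = w / W
    for i in range(d):
        j = rng.choice(d, p=p)
        if j != i:
            wt = w[i] * w[j] / (w[i] + w[j])
            C[i, j] += wt
            C[j, i] += wt
    return C

# Eliminating the hub of a weighted star creates a clique on the leaves;
# averaging many sampled cliques should reproduce the exact one.
w = rng.uniform(1.0, 2.0, size=8)
avg = sum(sampled_clique(w) for _ in range(20000)) / 20000
print("max deviation from exact clique:", np.max(np.abs(avg - exact_clique(w))))
```

Matching the Schur complement in expectation is what makes the sparse sampled factor an unbiased surrogate for the exact one; the full algorithm repeats this step along an elimination ordering, and the resulting factor can serve as a preconditioner for the SDD systems mentioned above.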