The distributed subgradient (DSG) method is a widely used algorithm for coping with large-scale distributed optimization problems in machine-learning applications. Most existing works on DSG focus on ...
In this paper, we present a new ellipsoid-type algorithm for solving nonsmooth problems with convex structure. Examples of such problems include nonsmooth convex minimization problems, convex-concave ...
Abstract: One-bit compressive sensing theory shows that sparse signals can be almost exactly reconstructed from a small number of one-bit quantized linear measurements. This paper presents the ...
Abstract: We study a distributed computation model for optimizing a sum of convex objective functions corresponding to multiple agents. For solving this (not necessarily smooth) optimization problem, ...
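The distributed subgradient scheme described in abstracts like the one above typically has each agent average its neighbors' iterates through a doubly stochastic mixing matrix and then take a local subgradient step with a diminishing stepsize. The sketch below is illustrative only, not the exact scheme of any single paper listed here; the toy cost functions, the complete-graph mixing matrix `W`, and the `1/sqrt(k)` stepsize are all assumptions chosen for the example.

```python
import numpy as np

def distributed_subgradient(local_subgrads, mixing, x0, steps, alpha0):
    """Generic distributed subgradient iteration (illustrative sketch):
    x <- W x - alpha_k * g(x), where W is a doubly stochastic mixing
    matrix and g(x) stacks each agent's local subgradient."""
    x = np.array(x0, dtype=float)
    for k in range(steps):
        alpha = alpha0 / np.sqrt(k + 1)  # diminishing stepsize (assumed)
        g = np.array([g_i(x_i) for g_i, x_i in zip(local_subgrads, x)])
        x = mixing @ x - alpha * g       # consensus step + local subgradient step
    return x

# Toy nonsmooth problem (assumed for illustration): agent i holds
# f_i(x) = |x - c_i|, so the sum is minimized at the median of the c_i.
# A subgradient of f_i is sign(x - c_i).
c = [1.0, 2.0, 10.0]
subgrads = [lambda x, ci=ci: np.sign(x - ci) for ci in c]
W = np.full((3, 3), 1.0 / 3.0)           # complete-graph uniform averaging
x_final = distributed_subgradient(subgrads, W, [0.0, 0.0, 0.0], 2000, 1.0)
# All agents end up close to the median, 2.0.
```

With the uniform mixing matrix every agent reaches consensus in one averaging step, so the iteration reduces to a centralized subgradient method on the average objective; sparser communication graphs only slow consensus, they do not change the limit.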
We present an inexact subgradient projection type method for solving a nonsmooth Equilibrium Problem in a finite-dimensional space. The proposed algorithm has a low computational cost per iteration.
and is provided officially by the authors of the paper. All numerical examples presented in the paper use this implementation. Permission is hereby granted, free of charge, to any person ...
This course is focused on learning to recognize, understand, analyze, and solve unconstrained and constrained convex optimization problems arising in engineering. The course shall focus on the ...
The log-determinant optimization problem with general matrix constraints arises in many applications. The log-determinant term hampers the scalability of existing methods. This paper proposes a highly ...
Under the assumption that the link failures are independent and identically distributed over time (possibly correlated across links), we provide almost sure convergence results for our subgradient ...