### About Split Proximal Algorithms for the Q-Lasso

#### Abstract

Numerous problems in signal processing and imaging, statistical learning and data mining, or computer vision can be formulated as optimization problems consisting in minimizing a sum of convex functions, not necessarily differentiable, possibly composed with linear operators. Each function is typically either a data-fidelity term or a regularization term enforcing some properties on the solution; see for example [5, 6] and the references therein. In this note we are interested in the general form of the Q-Lasso introduced in [1], which generalizes the well-known Lasso of Tibshirani [9]. Here $Q$ is a closed convex subset of a Euclidean $m$-space, for some integer $m\geq1$, that can be interpreted as the set of errors within a given tolerance level when linear measurements are taken to recover a signal/image via the Lasso. Only the unconstrained case was discussed in [1]; here we discuss some split proximal algorithms for solving the general case. It is worth mentioning that the Lasso models a number of applied problems arising in machine learning and signal/image processing, due to the fact that it promotes the sparsity of a signal.
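To illustrate the sparsity-promoting effect mentioned above, here is a minimal proximal-gradient (ISTA) sketch for the classical unconstrained Lasso $\min_x \tfrac12\|Ax-b\|^2 + \gamma\|x\|_1$, the special case that the Q-Lasso generalizes. This is our own illustrative sketch, not the split proximal algorithms of the paper; the function names, step-size choice, and iteration count are assumptions for demonstration.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximity operator of tau * ||.||_1: componentwise shrinkage toward zero.
    # This is what drives small coefficients exactly to zero (sparsity).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_lasso(A, b, gamma, n_iter=2000):
    """Proximal-gradient (ISTA) iteration for min_x 0.5*||Ax-b||^2 + gamma*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth term's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)        # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, gamma / L)
    return x

# Toy demo: recover a 3-sparse signal from 40 random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 20, 50]] = 1.0
b = A @ x_true
x_hat = ista_lasso(A, b, gamma=0.1)
```

Most entries of `x_hat` are exactly zero, while the three true spikes survive (slightly shrunk by the $\ell_1$ penalty), which is precisely the sparsity behaviour the abstract alludes to.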
