About Split Proximal Algorithms for the Q-Lasso

Abdellatif Moudafi


Numerous problems in signal processing and imaging, statistical learning and data mining, or computer vision can be formulated as optimization problems consisting in minimizing a sum of convex functions, not necessarily differentiable, possibly composed with linear operators. Each function is typically either a data-fidelity term or a regularization term enforcing some properties on the solution; see for example [5, 6] and the references therein. In this note we are interested in the general form of the Q-Lasso introduced in [1], which generalizes the well-known Lasso of Tibshirani [9]. Here $Q$ is a closed convex subset of a Euclidean $m$-space, for some integer $m\geq1$, that can be interpreted as the set of errors within a given tolerance level when linear measurements are taken to recover a signal/image via the Lasso. Only the unconstrained case was discussed in [1]; here we discuss some split proximal algorithms for solving the general case. It is worth mentioning that the Lasso models a number of applied problems arising in machine learning and signal/image processing because it promotes the sparsity of a signal.
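To fix ideas, a minimal proximal-gradient sketch for a Q-Lasso-type model is given below. It is not the authors' split proximal algorithm from the paper, only an illustration of the underlying model $\min_x \tfrac{1}{2}\,\mathrm{dist}(Ax, Q)^2 + \gamma\|x\|_1$, using the fact that the gradient of $\tfrac{1}{2}\,\mathrm{dist}(Ax,Q)^2$ is $A^\top(Ax - P_Q(Ax))$ and that the proximal map of $\gamma\|\cdot\|_1$ is soft-thresholding. The box-shaped set $Q$, the step size, and the parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: component-wise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def q_lasso_proximal_gradient(A, proj_Q, gamma, n_iter=500, step=None):
    """Proximal-gradient sketch for min_x 0.5*dist(Ax, Q)^2 + gamma*||x||_1.

    proj_Q : projection onto the closed convex set Q (assumed available).
    The smooth part has gradient A^T (Ax - proj_Q(Ax)), with Lipschitz
    constant ||A||^2, so step = 1/||A||^2 is a safe default.
    """
    m, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # spectral norm squared
    x = np.zeros(n)
    for _ in range(n_iter):
        r = A @ x
        grad = A.T @ (r - proj_Q(r))            # gradient of the distance term
        x = soft_threshold(x - step * grad, step * gamma)
    return x

# Illustrative example: Q is a box [b - eps, b + eps] around measurements b,
# i.e. errors within a given tolerance level eps (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.5, -2.0, 1.0]          # sparse ground truth
b = A @ x_true
eps = 0.1
proj_Q = lambda y: np.clip(y, b - eps, b + eps)  # projection onto the box Q
x_hat = q_lasso_proximal_gradient(A, proj_Q, gamma=0.1)
```

With $Q = \{b\}$ (tolerance zero) the model reduces to the classical Lasso, which is the sense in which the Q-Lasso generalizes it.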




The Thai Journal of Mathematics is organized and supported by the Mathematical Association of Thailand, the Thailand Research Council, and the Center for Promotion of Mathematical Research of Thailand (CEPMART).

Copyright 2020 by the Mathematical Association of Thailand.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission of the Mathematical Association of Thailand.

|ISSN 1686-0209|