TY - GEN
T1 - A MAP approach for ℓq-norm regularized sparse parameter estimation using the EM algorithm
AU - Carvajal, Rodrigo
AU - Aguero, Juan C.
AU - Godoy, Boris I.
AU - Katselis, Dimitrios
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/11/10
Y1 - 2015/11/10
AB - In this paper, Bayesian parameter estimation based on the Maximum A Posteriori (MAP) criterion is revisited through the prism of the Expectation-Maximization (EM) algorithm. By incorporating a sparsity-promoting penalty term into the cost function of the estimation problem via an appropriate prior distribution, we show how the EM algorithm can be used to solve the corresponding optimization problem efficiently. To this end, we rely on variance-mean Gaussian mixtures (VMGM) to describe the prior distribution and incorporate several attractive features of these mixtures into our estimation problem. The corresponding MAP estimation problem is expressed entirely in terms of the EM algorithm, which allows for handling nonlinearities and hidden variables that cannot easily be treated with traditional methods. For comparison, we also develop a Coordinate Descent algorithm for the ℓq-norm-penalized problem and present performance results via simulations.
KW - Convergence
KW - Maximum likelihood estimation
KW - Optimization
KW - Parameter estimation
KW - Probability density function
KW - Signal processing algorithms
UR - http://www.scopus.com/inward/record.url?scp=84960926883&partnerID=8YFLogxK
U2 - 10.1109/MLSP.2015.7324321
DO - 10.1109/MLSP.2015.7324321
M3 - Conference contribution
AN - SCOPUS:84960926883
T3 - IEEE International Workshop on Machine Learning for Signal Processing, MLSP
BT - 2015 IEEE International Workshop on Machine Learning for Signal Processing - Proceedings of MLSP 2015
A2 - Erdogmus, Deniz
A2 - Kozat, Serdar
A2 - Larsen, Jan
A2 - Akcakaya, Murat
PB - IEEE Computer Society
T2 - 25th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2015
Y2 - 17 September 2015 through 20 September 2015
ER -
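
The abstract describes an EM-based MAP estimator in which an ℓq-norm sparsity prior is represented as a (variance-mean) Gaussian mixture, so that each EM iteration reduces to a reweighted ridge-regression update. The sketch below is a minimal illustration of that general structure, not the authors' exact algorithm: it assumes a linear-Gaussian measurement model y = Phi @ theta + e, a prior proportional to exp(-lam*|theta_i|^q), and the standard reweighting w_i = lam*q*|theta_i|^(q-2); the function name em_map_lq and all parameter values are illustrative assumptions.

import numpy as np

def em_map_lq(y, Phi, sigma2=1.0, lam=1.0, q=0.5, n_iter=50, eps=1e-8):
    """Illustrative EM-style MAP estimate for y = Phi @ theta + e, e ~ N(0, sigma2*I),
    with a sparsity prior p(theta_i) ~ exp(-lam * |theta_i|^q), 0 < q <= 1.

    Writing the prior as a Gaussian (scale) mixture, the E-step gives
    per-coefficient weights w_i = lam * q * |theta_i|^(q-2), and the M-step
    becomes a weighted ridge (iteratively reweighted least-squares) update.
    Assumed formulation for illustration; not taken from the cited paper.
    """
    theta = np.linalg.lstsq(Phi, y, rcond=None)[0]  # least-squares initialization
    for _ in range(n_iter):
        # E-step: expected inverse mixing variances given the current estimate
        w = lam * q * (np.abs(theta) + eps) ** (q - 2.0)
        # M-step: weighted ridge regression
        A = Phi.T @ Phi / sigma2 + np.diag(w)
        b = Phi.T @ y / sigma2
        theta = np.linalg.solve(A, b)
    return theta

# Toy usage: recover a sparse coefficient vector from noisy linear measurements.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((100, 20))
theta_true = np.zeros(20)
theta_true[[2, 7, 11]] = [1.5, -2.0, 0.8]
y = Phi @ theta_true + 0.05 * rng.standard_normal(100)
theta_hat = em_map_lq(y, Phi, sigma2=0.05**2, lam=5.0, q=0.8)

The reweighted-ridge form is the reason the EM view is convenient here: each iteration is a closed-form linear solve, and the same machinery extends to models with additional hidden variables, which is the setting the abstract emphasizes.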