Librería Portfolio

SPARSE MODELING: THEORY, ALGORITHMS, AND APPLICATIONS

Title: SPARSE MODELING: THEORY, ALGORITHMS, AND APPLICATIONS
Author: RISH, I
Publisher: CRC
Year of publication: 2014
Subject: ARTIFICIAL INTELLIGENCE - GENERAL
ISBN: 978-1-4398-2869-4
Pages: 253
Price: 77,50 €

Synopsis

Features

Presents an introduction to the key concepts and major results in sparse modeling and signal recovery
Covers basic theoretical aspects of sparse modeling, state-of-the-art algorithmic approaches, and practical applications
Describes popular sparsity-enforcing approaches, such as l0- and l1-norm minimization (see the sketch after this list)
Explores several fast-developing subareas of sparse modeling, such as sparse Gaussian Markov random fields, structured sparsity, dictionary learning, and sparse matrix factorizations
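To make the l1-norm minimization mentioned above concrete, here is a minimal sketch of sparse recovery via an l1-penalized least-squares (LASSO) fit. It assumes scikit-learn is available; the synthetic data, problem sizes, and regularization strength alpha are illustrative choices, not taken from the book.

    # Minimal sketch of l1-norm (LASSO) sparse recovery, assuming scikit-learn.
    # The synthetic data and the regularization strength alpha are illustrative.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p, k = 50, 200, 5                    # observations, features, true nonzeros
    A = rng.standard_normal((n, p))         # design (sensing) matrix
    x_true = np.zeros(p)
    x_true[:k] = rng.standard_normal(k)     # sparse ground-truth coefficients
    y = A @ x_true + 0.01 * rng.standard_normal(n)

    model = Lasso(alpha=0.01).fit(A, y)     # l1-penalized least squares
    print("nonzero coefficients:", np.count_nonzero(model.coef_))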
Summary

Sparse models are particularly useful in scientific applications, such as biomarker discovery in genetic or neuroimaging data, where the interpretability of a predictive model is essential. Sparsity can also dramatically improve the cost efficiency of signal processing.

Sparse Modeling: Theory, Algorithms, and Applications provides an introduction to the growing field of sparse modeling, including application examples, problem formulations that yield sparse solutions, algorithms for finding such solutions, and recent theoretical results on sparse recovery. The book gets you up to speed on the latest sparsity-related developments and will motivate you to continue learning about the field.

The authors first present motivating examples and a high-level survey of key recent developments in sparse modeling. The book then describes optimization problems involving commonly used sparsity-enforcing tools, presents essential theoretical results, and discusses several state-of-the-art algorithms for finding sparse solutions.
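One classical algorithm of the kind surveyed in the book is iterative soft thresholding (ISTA) for the noisy l1-regularized problem. The following NumPy-only sketch is illustrative, with an assumed step size, penalty lam, and iteration count; it is not the book's own code.

    # Minimal NumPy sketch of ISTA for  min_x 0.5*||Ax - y||_2^2 + lam*||x||_1.
    # Penalty lam and the iteration count are illustrative.
    import numpy as np

    def soft_threshold(z, t):
        """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ista(A, y, lam=0.05, n_iter=500):
        L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)        # gradient of the smooth (quadratic) part
            x = soft_threshold(x - grad / L, lam / L)
        return x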

The authors go on to address a variety of sparse recovery problems that extend the basic formulation to more sophisticated forms of structured sparsity and to different loss functions. They also examine a particular class of sparse graphical models and cover dictionary learning and sparse matrix factorizations.
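As an example of the sparse graphical models covered in the later chapters, a sparse Gaussian MRF can be estimated with the graphical lasso. The sketch below assumes scikit-learn; the random data and the penalty alpha are illustrative only.

    # Minimal sketch of estimating a sparse Gaussian graphical model via the
    # graphical lasso, assuming scikit-learn; data and alpha are illustrative.
    import numpy as np
    from sklearn.covariance import GraphicalLasso

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))        # 200 samples, 10 variables

    model = GraphicalLasso(alpha=0.2).fit(X)  # l1-penalized precision estimation
    precision = model.precision_              # zeros encode conditional independence
    print("nonzero off-diagonal entries:",
          np.count_nonzero(precision - np.diag(np.diag(precision))))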



Table of Contents

Introduction
Motivating Examples
Sparse Recovery in a Nutshell
Statistical Learning versus Compressed Sensing

Sparse Recovery: Problem Formulations
Noiseless Sparse Recovery
Approximations
Convexity: Brief Review
Relaxations of (P0) Problem
The Effect of lq-Regularizer on Solution Sparsity
l1-norm Minimization as Linear Programming
Noisy Sparse Recovery
A Statistical View of Sparse Recovery
Beyond LASSO: Other Loss Functions and Regularizers

Theoretical Results (Deterministic Part)
The Sampling Theorem
Surprising Empirical Results
Signal Recovery from Incomplete Frequency Information
Mutual Coherence
Spark and Uniqueness of (P0) Solution
Null Space Property and Uniqueness of (P1) Solution
Restricted Isometry Property (RIP)
Square Root Bottleneck for the Worst-Case Exact Recovery
Exact Recovery Based on RIP

Theoretical Results (Probabilistic Part)
When Does RIP Hold?
Johnson-Lindenstrauss Lemma and RIP for Subgaussian Random Matrices
Random Matrices Satisfying RIP
RIP for Matrices with Independent Bounded Rows and Matrices with Random Rows of Fourier Transform

Algorithms for Sparse Recovery Problems
Univariate Thresholding is Optimal for Orthogonal Designs
Algorithms for l0-norm Minimization
Algorithms for l1-norm Minimization (LASSO)

Beyond LASSO: Structured Sparsity
The Elastic Net
Fused LASSO
Group LASSO: l1/l2 Penalty
Simultaneous LASSO: l1/l∞ Penalty
Generalizations
Applications

Beyond LASSO: Other Loss Functions
Sparse Recovery from Noisy Observations
Exponential Family, GLMs, and Bregman Divergences
Sparse Recovery with GLM Regression

Sparse Graphical Models
Background
Markov Networks
Learning and Inference in Markov Networks
Learning Sparse Gaussian MRFs

Sparse Matrix Factorization: Dictionary Learning and Beyond
Dictionary Learning
Sparse PCA
Sparse NMF for Blind Source Separation

Epilogue

Appendix: Mathematical Background

Bibliography

Index

A Summary and Bibliographical Notes appear at the end of each chapter.