
$R_\alpha(A,z) = \arg\min_{u \geq 0} \{ T_0(Au;z) + \alpha J(u) \},$

where $T_0(Au;z)$ is the negative log of the Poisson likelihood functional, and $\alpha>0$ and $J$ are the regularization parameter and functional, respectively. Our goal in this paper is to determine general conditions which guarantee that $R_\alpha$ defines a *regularization scheme* for $z=Au+\gamma$. Determining the appropriate definition of *regularization scheme* in this context is important: not only will it serve to unify previous theoretical arguments in this direction, it will also provide a framework for future theoretical analyses. To illustrate the latter, we end the paper by applying the general framework to a case in which such an analysis has not previously been done.
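The estimator above can be illustrated with a minimal numerical sketch. The following is not the paper's method: it simply evaluates the Poisson negative-log likelihood $T_0(Au;z) = \sum_i [(Au)_i - z_i \log (Au)_i]$ (constants in $z$ dropped) and minimizes $T_0(Au;z) + \alpha J(u)$ over $u \geq 0$ with an off-the-shelf bound-constrained solver, using a Tikhonov penalty $J(u) = \|u\|^2$ purely for illustration, since the paper leaves $J$ general. All variable names and the test problem are invented for this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def poisson_negloglik(Au, z):
    # T_0(Au; z) = sum(Au - z * log(Au)), with z-dependent constants dropped.
    Au = np.maximum(Au, 1e-12)  # guard against log(0)
    return np.sum(Au - z * np.log(Au))

def R_alpha(A, z, alpha, J, n):
    # R_alpha(A, z) = argmin_{u >= 0} { T_0(Au; z) + alpha * J(u) },
    # with the nonnegativity constraint handled via box bounds.
    obj = lambda u: poisson_negloglik(A @ u, z) + alpha * J(u)
    res = minimize(obj, x0=np.ones(n), bounds=[(0.0, None)] * n)
    return res.x

# Toy test problem: a random nonnegative operator and Poisson-distributed data.
rng = np.random.default_rng(0)
n = 20
A = np.abs(rng.normal(size=(n, n))) / n + np.eye(n)
u_true = np.abs(rng.normal(size=n)) + 1.0
z = rng.poisson(A @ u_true).astype(float)

# Illustrative Tikhonov penalty J(u) = ||u||^2 (the paper's J is general).
u_hat = R_alpha(A, z, alpha=0.1, J=lambda u: np.sum(u**2), n=n)
print(u_hat.min() >= 0)  # the constraint u >= 0 is enforced by the bounds
```

In practice $A$ is typically a blurring or projection operator and the choice of $\alpha$ and $J$ is exactly what the regularization-scheme analysis addresses; the solver here is a stand-in for the specialized methods used in this literature.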

In the Bayesian point of view taken in this paper, a negative-log prior (or regularization) function is added to the negative-log likelihood function, and the resulting sum is minimized. We focus on the case where the negative-log prior is the well-known total variation function and give it a statistical interpretation. Regardless of whether the least squares or Poisson negative-log likelihood is used, the total variation term yields a minimization problem that is computationally challenging. The primary result of this work is an efficient computational method for the solution of such problems, together with its convergence analysis. With the computational method in hand, we then present experiments indicating that the Poisson negative-log likelihood yields a more computationally efficient method than the least squares function. We also present results indicating that this may hold even when the data noise is i.i.d. Gaussian, suggesting that, regardless of noise statistics, the Poisson negative-log likelihood can yield a more computationally tractable problem when total variation regularization is used.
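The computational difficulty mentioned above comes largely from the nonsmoothness of the total variation term. A standard device (assumed here for illustration; the paper's exact formulation is not shown) is the smoothed 1D discrete total variation $J(u) = \sum_i \sqrt{(u_{i+1}-u_i)^2 + \beta}$ with small $\beta > 0$, which is differentiable and so usable in gradient-based solvers:

```python
import numpy as np

def tv_smooth(u, beta=1e-6):
    # Smoothed 1D total variation: sum_i sqrt((u_{i+1} - u_i)^2 + beta).
    # beta > 0 removes the kink of |.| at zero, making J differentiable.
    du = np.diff(u)
    return np.sum(np.sqrt(du**2 + beta))

def tv_smooth_grad(u, beta=1e-6):
    # Gradient of the smoothed TV term: dJ/du_k = w_{k-1} - w_k,
    # where w_i = du_i / sqrt(du_i^2 + beta).
    du = np.diff(u)
    w = du / np.sqrt(du**2 + beta)
    g = np.zeros_like(u)
    g[:-1] -= w
    g[1:] += w
    return g

# A step signal: its exact total variation is 2 (one jump up, one jump down).
u = np.array([0.0, 0.0, 1.0, 1.0, 0.0])
print(tv_smooth(u))  # close to the exact TV value 2
```

Because $J$ depends only on differences of $u$, its gradient sums to zero, and piecewise-constant signals like the step above incur almost no penalty, which is why total variation regularization favors edge-preserving, blocky reconstructions in both the least squares and Poisson settings.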
