[1]

L. Ardizzone, J. Kruse, C. Rother and U. Köthe, Analyzing inverse problems with invertible neural networks, In International Conference on Learning Representations, 2019, https://openreview.net/forum?id=rJed6j0cKX.

[2]

M. Asim, M. Daniels, O. Leong, A. Ahmed and P. Hand, Invertible generative models for inverse problems: Mitigating representation error and dataset bias, In Proceedings of the 37th International Conference on Machine Learning, (eds. H. Daumé Ⅲ and A. Singh), Proceedings of Machine Learning Research, PMLR, 119 (2020), 399–409.

[3]

A. Beskos, M. Girolami, S. Lan, P. E. Farrell and A. M. Stuart, Geometric MCMC for infinite-dimensional inverse problems, J. Comput. Phys., 335 (2017), 327–351.
doi: 10.1016/j.jcp.2016.12.041.

[4]

H. Bölcskei, P. Grohs, G. Kutyniok and P. Petersen, Optimal approximation with sparsely connected deep neural networks, SIAM J. Math. Data Sci., 1 (2019), 8–45.
doi: 10.1137/18M118709X.

[5]

S. Borak, W. Härdle and R. Weron, Stable distributions, In Statistical Tools for Finance and Insurance, (2005), 21–44.
doi: 10.1007/3-540-27395-6_1.

[6]

T. Bui-Thanh, O. Ghattas, J. Martin and G. Stadler, A computational framework for infinite-dimensional Bayesian inverse problems Part Ⅰ: The linearized case, with application to global seismic inversion, SIAM J. Sci. Comput., 35 (2013), A2494–A2523.
doi: 10.1137/12089586X.

[7]

N. K. Chada, S. Lasanen and L. Roininen, Posterior convergence analysis of $\alpha$-stable sheets, 2019, arXiv: 1907.03086.

[8]

N. K. Chada, L. Roininen and J. Suuronen, Cauchy Markov random field priors for Bayesian inversion, Stat. Comput., 32 (2022), 33.
doi: 10.1007/s11222-022-10089-z.

[9]

A. Chambolle, M. Novaga, D. Cremers and T. Pock, An introduction to total variation for image analysis, In Theoretical Foundations and Numerical Methods for Sparse Recovery, 2010.

[10]

V. Chen, M. M. Dunlop, O. Papaspiliopoulos and A. M. Stuart, Dimension-robust MCMC in Bayesian inverse problems, 2019, arXiv: 1803.03344.

[11]

S. L. Cotter, M. Dashti and A. M. Stuart, Approximation of Bayesian inverse problems for PDEs, SIAM J. Numer. Anal., 48 (2010), 322–345.
doi: 10.1137/090770734.

[12]

S. L. Cotter, G. O. Roberts, A. M. Stuart and D. White, MCMC methods for functions: Modifying old algorithms to make them faster, Statist. Sci., 28 (2013), 424–446.
doi: 10.1214/13-STS421.

[13]

M. Dashti, S. Harris and A. Stuart, Besov priors for Bayesian inverse problems, Inverse Probl. Imaging, 6 (2012), 183–200.
doi: 10.3934/ipi.2012.6.183.

[14]

A. G. de G. Matthews, J. Hron, M. Rowland, R. E. Turner and Z. Ghahramani, Gaussian process behaviour in wide deep neural networks, In International Conference on Learning Representations, 2018, https://openreview.net/forum?id=H1nGgWC.

[15]

R. Der and D. Lee, Beyond Gaussian processes: On the distributions of infinite networks, In Advances in Neural Information Processing Systems, (eds. Y. Weiss, B. Schölkopf and J. C. Platt), MIT Press, (2006), 275–282, http://papers.nips.cc/paper/2869-beyond-gaussian-processes-on-the-distributions-of-infinite-networks.pdf.

[16]

J. N. Franklin, Well-posed stochastic extensions of ill-posed linear problems, J. Math. Anal. Appl., 31 (1970), 682–716.
doi: 10.1016/0022-247X(70)90017-X.

[17]

B. V. Gnedenko and A. N. Kolmogorov, Limit Distributions for Sums of Independent Random Variables, Addison-Wesley Publishing Co., Inc., Cambridge, Mass., 1954.

[18]

G. González, V. Kolehmainen and A. Seppänen, Isotropic and anisotropic total variation regularization in electrical impedance tomography, Comput. Math. Appl., 74 (2017), 564–576.
doi: 10.1016/j.camwa.2017.05.004.

[19]

M. Hairer, A. M. Stuart and S. J. Vollmer, Spectral gaps for a Metropolis–Hastings algorithm in infinite dimensions, Ann. Appl. Probab., 24 (2014), 2455–2490.
doi: 10.1214/13-AAP982.

[20]

A. Immer, M. Korzepa and M. Bauer, Improving predictions of Bayesian neural nets via local linearization, In AISTATS, (2021), 703–711, http://proceedings.mlr.press/v130/immer21a.html.

[21]

J. Kaipio and E. Somersalo, Statistical and Computational Inverse Problems, Applied Mathematical Sciences, 160. Springer-Verlag, New York, 2005, https://cds.cern.ch/record/1338003.

[22]

J. Kaipio and E. Somersalo, Statistical inverse problems: Discretization, model reduction and inverse crimes, J. Comput. Appl. Math., 198 (2007), 493–504.
doi: 10.1016/j.cam.2005.09.027.

[23]

B. Lakshminarayanan, A. Pritzel and C. Blundell, Simple and scalable predictive uncertainty estimation using deep ensembles, In Proceedings of the 31st International Conference on Neural Information Processing Systems, NIPS'17, (2017), 6405–6416.

[24]

M. Lassas, E. Saksman and S. Siltanen, Discretization-invariant Bayesian inversion and Besov space priors, Inverse Probl. Imaging, 3 (2009), 87–122.
doi: 10.3934/ipi.2009.3.87.

[25]

M. Lassas and S. Siltanen, Can one use total variation prior for edge-preserving Bayesian inversion?, Inverse Problems, 20 (2004), 1537–1563.
doi: 10.1088/0266-5611/20/5/013.

[26]

M. Markkanen, L. Roininen, J. M. J. Huttunen and S. Lasanen, Cauchy difference priors for edge-preserving Bayesian inversion, J. Inverse Ill-Posed Probl., 27 (2019), 225–240.
doi: 10.1515/jiip-2017-0048.

[27]

R. M. Neal, Priors for infinite networks, Bayesian Learning for Neural Networks, 118 (1996), 29–53.
doi: 10.1007/978-1-4612-0745-0_2.

[28]

J. Nocedal and S. J. Wright, Numerical Optimization, 2$^{nd}$ edition, Springer Series in Operations Research and Financial Engineering. Springer, New York, 2006.

[29]

R. Rahaman and A. H. Thiery, Uncertainty quantification and deep ensembles, 2020, arXiv: 2007.08792.

[30]

C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning), MIT Press, Cambridge, MA, 2006.

[31]

V. K. Rohatgi, An Introduction to Probability and Statistics, Wiley, New York, 1976.

[32]

C. Schillings, B. Sprungk and P. Wacker, On the convergence of the Laplace approximation and noise-level-robustness of Laplace-based Monte Carlo methods for Bayesian inverse problems, Numer. Math., 145 (2020), 915–971.
doi: 10.1007/s00211-020-01131-1.

[33]

A. M. Stuart, Inverse problems: A Bayesian perspective, Acta Numer., 19 (2010), 451–559.
doi: 10.1017/S0962492910000061.

[34]

T. J. Sullivan, Well-posed Bayesian inverse problems and heavy-tailed stable quasi-Banach space priors, Inverse Probl. Imaging, 11 (2017), 857–874.
doi: 10.3934/ipi.2017040.

[35]

C. K. I. Williams, Computing with infinite networks, In Proceedings of the 9th International Conference on Neural Information Processing Systems, NIPS'96, MIT Press, Cambridge, MA, USA, (1996), 295–301.

[36]

Z.-H. Zhou, J. Wu and W. Tang, Ensembling neural networks: Many could be better than all, Artificial Intelligence, 137 (2002), 239–263.
doi: 10.1016/S0004-3702(02)00190-X.
