eISSN: 2639-8001


Foundations of Data Science invites submissions focusing on advances in mathematical, statistical, and computational methods for data science. Results should significantly advance the current understanding of data science through algorithm development, analysis, and/or computational implementation that demonstrates the behavior and applicability of the algorithm. Fields covered by the journal include, but are not limited to, Bayesian Statistics, High Performance Computing, Inverse Problems, Data Assimilation, Machine Learning, Optimization, Topological Data Analysis, Spatial Statistics, Nonparametric Statistics, Uncertainty Quantification, and Data Centric Engineering. Expository and review articles are welcome. Papers that focus on applications in science and engineering are also encouraged; however, the method(s) used should be applicable beyond one specific application domain.



We are delighted to announce a special issue on the topic of "Sequential Monte Carlo Methods", in honour of the upcoming meeting SMC 2020. The guest editors will be Arnaud Doucet, Víctor Elvira, Fredrik Lindsten, and Joaquín Míguez. Submissions will open soon; please check back for details.


Stochastic gradient descent algorithm for stochastic optimization in solving analytic continuation problems
Feng Bao and Thomas Maier
2020, 2(1): 1-17. doi: 10.3934/fods.2020001
Semi-supervised classification on graphs using explicit diffusion dynamics
Robert L. Peach, Alexis Arnaudon and Mauricio Barahona
2020, 2(1): 19-33. doi: 10.3934/fods.2020002
Bayesian inference for latent chain graphs
Deng Lu, Maria De Iorio, Ajay Jasra and Gary L. Rosner
2020, 2(1): 35-54. doi: 10.3934/fods.2020003
Bayesian inference of chaotic dynamics by merging data assimilation, machine learning and expectation-maximization
Marc Bocquet, Julien Brajard, Alberto Carrassi and Laurent Bertino
2020, 2(1): 55-80. doi: 10.3934/fods.2020004
Corrigendum to "Cluster, classify, regress: A general method for learning discontinuous functions [1]"
David E. Bernholdt, Mark R. Cianciosa, Clement Etienam, David L. Green, Kody J. H. Law and Jin M. Park
2020, 2(1): 81. doi: 10.3934/fods.2020005
Quantum topological data analysis with continuous variables
George Siopsis
2019, 1(4): 419-431. doi: 10.3934/fods.2019017
Accelerating Metropolis-Hastings algorithms by Delayed Acceptance
Marco Banterle, Clara Grazian, Anthony Lee and Christian P. Robert
2019, 1(2): 103-128. doi: 10.3934/fods.2019005
EmT: Locating empty territories of homology group generators in a dataset
Xin Xu and Jessi Cisewski-Kehe
2019, 1(2): 227-247. doi: 10.3934/fods.2019010
Consistent manifold representation for topological data analysis
Tyrus Berry and Timothy Sauer
2019, 1(1): 1-38. doi: 10.3934/fods.2019001
On the incorporation of box-constraints for ensemble Kalman inversion
Neil K. Chada, Claudia Schillings and Simon Weissmann
2019, 1(4): 433-456. doi: 10.3934/fods.2019018
Editorial
Ajay Jasra, Kody J. H. Law and Vasileios Maroulas
2019, 1(1): i-iii. doi: 10.3934/fods.20191i
Power weighted shortest paths for clustering Euclidean data
Daniel Mckenzie and Steven Damelin
2019, 1(3): 307-327. doi: 10.3934/fods.2019014
Partitioned integrators for thermodynamic parameterization of neural networks
Benedict Leimkuhler, Charles Matthews and Tiffany Vlaar
2019, 1(4): 457-489. doi: 10.3934/fods.2019019
Randomized learning of the second-moment matrix of a smooth function
Armin Eftekhari, Michael B. Wakin, Ping Li and Paul G. Constantine
2019, 1(3): 329-387. doi: 10.3934/fods.2019015
Issues using logistic regression with class imbalance, with a case study from credit risk modelling
Yazhe Li, Tony Bellotti and Niall Adams
2019, 1(4): 389-417. doi: 10.3934/fods.2019016
Particle filters for inference of high-dimensional multivariate stochastic volatility models with cross-leverage effects
Yaxian Xu and Ajay Jasra
2019, 1(1): 61-85. doi: 10.3934/fods.2019003
Modelling dynamic network evolution as a Pitman-Yor process
Francesco Sanna Passino and Nicholas A. Heard
2019, 1(3): 293-306. doi: 10.3934/fods.2019013
Approximate Bayesian inference for geostatistical generalised linear models
Evangelos Evangelou
2019, 1(1): 39-60. doi: 10.3934/fods.2019002
Learning by active nonlinear diffusion
Mauro Maggioni and James M. Murphy
2019, 1(3): 271-291. doi: 10.3934/fods.2019012
General risk measures for robust machine learning
Émilie Chouzenoux, Henri Gérard and Jean-Christophe Pesquet
2019, 1(3): 249-269. doi: 10.3934/fods.2019011
