eISSN: 2639-8001


Foundations of Data Science invites submissions on advances in mathematical, statistical, and computational methods for data science. Results should significantly advance the current understanding of data science through algorithm development, analysis, and/or computational implementation that demonstrates the behavior and applicability of the algorithm. Fields covered by the journal include, but are not limited to, Bayesian Statistics, High Performance Computing, Inverse Problems, Data Assimilation, Machine Learning, Optimization, Topological Data Analysis, Spatial Statistics, Nonparametric Statistics, Uncertainty Quantification, and Data-Centric Engineering. Expository and review articles are welcome. Papers focusing on applications in science and engineering are also encouraged; however, the methods used should be applicable beyond a single application domain.

A special issue on Data Assimilation will be featured in Foundations of Data Science. Its aim is to collect a set of high-level papers presenting state-of-the-art ideas and techniques for assimilating data into models in the many areas of science where it has become an essential tool. Submission is by invitation. If you are interested in submitting to the issue, please contact Chris Jones (ckrtj@renci.org).
The guest editors are Marc Bocquet, Jana de Wiljes, John Harlim, Chris Jones, Matthias Morzfeld, Elaine Spiller and Xin T. Tong.

We are delighted to announce a special issue on the topic of "Sequential Monte Carlo Methods", in honour of the upcoming SMC 2020 meeting.
The guest editors will be Arnaud Doucet, Víctor Elvira, Fredrik Lindsten, and Joaquín Míguez.
The special issue on SMC has been postponed and will open for submission in January 2021.


Stochastic gradient descent algorithm for stochastic optimization in solving analytic continuation problems
Feng Bao and Thomas Maier
2020, 2(1): 1-17. doi: 10.3934/fods.2020001

Semi-supervised classification on graphs using explicit diffusion dynamics
Robert L. Peach, Alexis Arnaudon and Mauricio Barahona
2020, 2(1): 19-33. doi: 10.3934/fods.2020002

Bayesian inference for latent chain graphs
Deng Lu, Maria De Iorio, Ajay Jasra and Gary L. Rosner
2020, 2(1): 35-54. doi: 10.3934/fods.2020003

Bayesian inference of chaotic dynamics by merging data assimilation, machine learning and expectation-maximization
Marc Bocquet, Julien Brajard, Alberto Carrassi and Laurent Bertino
2020, 2(1): 55-80. doi: 10.3934/fods.2020004

Corrigendum to "Cluster, classify, regress: A general method for learning discontinuous functions [1]"
David E. Bernholdt, Mark R. Cianciosa, Clement Etienam, David L. Green, Kody J. H. Law and Jin M. Park
2020, 2(1): 81-81. doi: 10.3934/fods.2020005

Quantum topological data analysis with continuous variables
George Siopsis
2019, 1(4): 419-431. doi: 10.3934/fods.2019017
Accelerating Metropolis-Hastings algorithms by Delayed Acceptance
Marco Banterle, Clara Grazian, Anthony Lee and Christian P. Robert
2019, 1(2): 103-128. doi: 10.3934/fods.2019005

EmT: Locating empty territories of homology group generators in a dataset
Xin Xu and Jessi Cisewski-Kehe
2019, 1(2): 227-247. doi: 10.3934/fods.2019010

Consistent manifold representation for topological data analysis
Tyrus Berry and Timothy Sauer
2019, 1(1): 1-38. doi: 10.3934/fods.2019001

On the incorporation of box-constraints for ensemble Kalman inversion
Neil K. Chada, Claudia Schillings and Simon Weissmann
2019, 1(4): 433-456. doi: 10.3934/fods.2019018

Editorial
Ajay Jasra, Kody J. H. Law and Vasileios Maroulas
2019, 1(1): i-iii. doi: 10.3934/fods.20191i

Power weighted shortest paths for clustering Euclidean data
Daniel Mckenzie and Steven Damelin
2019, 1(3): 307-327. doi: 10.3934/fods.2019014

Partitioned integrators for thermodynamic parameterization of neural networks
Benedict Leimkuhler, Charles Matthews and Tiffany Vlaar
2019, 1(4): 457-489. doi: 10.3934/fods.2019019

Randomized learning of the second-moment matrix of a smooth function
Armin Eftekhari, Michael B. Wakin, Ping Li and Paul G. Constantine
2019, 1(3): 329-387. doi: 10.3934/fods.2019015

Issues using logistic regression with class imbalance, with a case study from credit risk modelling
Yazhe Li, Tony Bellotti and Niall Adams
2019, 1(4): 389-417. doi: 10.3934/fods.2019016
Particle filters for inference of high-dimensional multivariate stochastic volatility models with cross-leverage effects
Yaxian Xu and Ajay Jasra
2019, 1(1): 61-85. doi: 10.3934/fods.2019003

Modelling dynamic network evolution as a Pitman-Yor process
Francesco Sanna Passino and Nicholas A. Heard
2019, 1(3): 293-306. doi: 10.3934/fods.2019013
Learning by active nonlinear diffusion
Mauro Maggioni and James M. Murphy
2019, 1(3): 271-291. doi: 10.3934/fods.2019012
General risk measures for robust machine learning
Émilie Chouzenoux, Henri Gérard and Jean-Christophe Pesquet
2019, 1(3): 249-269. doi: 10.3934/fods.2019011
