Foundations of Data Science, September 2019, 1(3): 293-306. doi: 10.3934/fods.2019013

Modelling dynamic network evolution as a Pitman-Yor process

Francesco Sanna Passino and Nicholas A. Heard

Department of Mathematics, Imperial College London, 180 Queen's Gate, London SW7 2AZ, United Kingdom

Published August 2019

Dynamic interaction networks frequently arise in biology, communications technology and the social sciences, representing, for example, neuronal connectivity in the brain, internet connections between computers and human interactions within social networks. The evolution and strengthening of the links in such networks can be observed through sequences of connection events occurring between network nodes over time. In some of these applications, the identity and size of the network may be unknown a priori and may change over time. In this article, a model for the evolution of dynamic networks based on the Pitman-Yor process is proposed. This model explicitly admits power-laws in the number of connections on each edge, often present in real world networks, and, for careful choices of the parameters, power-laws for the degree distribution of the nodes. A novel empirical method for the estimation of the hyperparameters of the Pitman-Yor process is proposed, and some necessary corrections for uniform discrete base distributions are carefully addressed. The methodology is tested on synthetic data and in an anomaly detection study on the enterprise computer network of the Los Alamos National Laboratory, and successfully detects connections from a red-team penetration test.
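For reference, the sequential predictive rule of the two-parameter (Pitman-Yor) Chinese restaurant process on which the proposed model builds, with discount $ d\in[0,1) $ and concentration $ \alpha>-d $ (references [22] and [23] below), can be stated as follows; this is standard background rather than the article's corrected specification for a uniform discrete base distribution.

\[
x_{n+1}\mid x_1,\dots,x_n =
\begin{cases}
x_j^\star & \text{with probability } \dfrac{n_j - d}{\alpha + n}, \quad j = 1,\dots,K_n,\\[1ex]
x^{\mathrm{new}} \sim \mathbb{P}_0 & \text{with probability } \dfrac{\alpha + d\,K_n}{\alpha + n},
\end{cases}
\]

where $ K_n $ is the number of distinct values observed so far, $ x_1^\star,\dots,x_{K_n}^\star $ are those distinct values, $ n_j $ is the number of previous observations equal to $ x_j^\star $, and $ \mathbb{P}_0 $ is the base distribution. For $ d>0 $, $ K_n $ grows at rate $ n^d $ and the distribution of the counts $ n_j $ has a power-law tail, which is the source of the power-law behaviour referred to in the abstract. When $ \mathbb{P}_0 $ is a uniform discrete distribution over a finite node set, a draw of $ x^{\mathrm{new}} $ can coincide with a previously observed value; this is the case requiring the corrections studied in the article.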

Citation: Francesco Sanna Passino, Nicholas A. Heard. Modelling dynamic network evolution as a Pitman-Yor process. Foundations of Data Science, 2019, 1 (3) : 293-306. doi: 10.3934/fods.2019013
References:
[1] D. Aldous, Exchangeability and related topics, in École d'Été de Probabilités de Saint-Flour XIII-1983, 1117 (1985), 1–198. doi: 10.1007/BFb0099421.
[2] A. L. Barabási, The origin of bursts and heavy tails in human dynamics, Nature, 435 (2005), 207-211.
[3] Y. Bengio, R. Ducharme, P. Vincent and C. Janvin, A neural probabilistic language model, Journal of Machine Learning Research, 3 (2003), 1137-1155.
[4] W. Buntine and M. Hutter, A Bayesian view of the Poisson-Dirichlet process, preprint, arXiv: 1007.0296.
[5] C. Chen, L. Du and W. Buntine, Sampling table configurations for the hierarchical Poisson-Dirichlet process, in Machine Learning and Knowledge Discovery in Databases (eds. D. Gunopulos, T. Hofmann, D. Malerba and M. Vazirgiannis), Springer Berlin Heidelberg, 2011, 296–311. doi: 10.1007/978-3-642-23780-5_29.
[6] T. S. Ferguson, A Bayesian analysis of some nonparametric problems, The Annals of Statistics, 1 (1973), 209-230. doi: 10.1214/aos/1176342360.
[7] R. A. Fisher, Statistical Methods for Research Workers, 14th edition, Hafner Publishing Co., New York, 1973.
[8] A. Goldenberg, A. X. Zheng, S. E. Fienberg and E. M. Airoldi, A survey of statistical network models, Foundations and Trends in Machine Learning, 2 (2009), 129-233.
[9] S. Goldwater, T. L. Griffiths and M. Johnson, Interpolating between types and tokens by estimating power-law generators, in Proceedings of the 18th International Conference on Neural Information Processing Systems, MIT Press, 2005, 459–466.
[10] N. A. Heard and P. Rubin-Delanchy, Network-wide anomaly detection via the Dirichlet process, in Proceedings of the IEEE Workshop on Big Data Analytics for Cyber-security Computing, 2016.
[11] N. A. Heard and P. Rubin-Delanchy, Choosing between methods of combining $p$-values, Biometrika, 105 (2018), 239-246. doi: 10.1093/biomet/asx076.
[12] H. Ishwaran and L. F. James, Gibbs sampling methods for stick-breaking priors, Journal of the American Statistical Association, 96 (2001), 161-173. doi: 10.1198/016214501750332758.
[13] D. Jurafsky, J. H. Martin, P. Norvig and S. Russell, Speech and Language Processing, Pearson Education, 2014.
[14] A. D. Kent, Cybersecurity data sources for dynamic network research, in Dynamic Networks and Cyber-Security, World Scientific, 2016.
[15] H. Lancaster, Statistical control of counting experiments, Biometrika, 39 (1952), 419-422.
[16] Y. Lv and C. X. Zhai, Positional language models for information retrieval, in Proceedings of the 32nd International ACM SIGIR Conference on Research and Development in Information Retrieval, ACM, 2009, 299–306. doi: 10.1145/1571941.1571994.
[17] C. Matias and V. Miele, Statistical clustering of temporal networks through a dynamic stochastic block model, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 79 (2017), 1119-1141. doi: 10.1111/rssb.12200.
[18] T. Mikolov, M. Karafiát, L. Burget, J. Černocký and S. Khudanpur, Recurrent neural network based language model, in Proceedings of the 11th Annual Conference of the International Speech Communication Association (INTERSPEECH 2010), International Speech Communication Association, 2010, 1045–1048.
[19] M. E. J. Newman, Power laws, Pareto distributions and Zipf's law, Contemporary Physics, 46 (2005), 323-351. doi: 10.1080/00107510500052444.
[20] K. Pearson, On a method of determining whether a sample of size $n$ supposed to have been drawn from a parent population having a known probability integral has probably been drawn at random, Biometrika, 25 (1933), 379-410.
[21] P. O. Perry and P. J. Wolfe, Point process modelling for directed interaction networks, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (2013), 821-849. doi: 10.1111/rssb.12013.
[22] J. Pitman, Combinatorial Stochastic Processes, Lecture Notes in Mathematics, 1875, Springer-Verlag, Berlin, 2006.
[23] J. Pitman and M. Yor, The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator, Annals of Probability, 25 (1997), 855-900. doi: 10.1214/aop/1024404422.
[24] M. Price-Williams and N. A. Heard, Nonparametric self-exciting models for computer network traffic, Statistics and Computing, 2019, 1–12. doi: 10.1007/s11222-019-09875-z.
[25] R. Rosenfeld, A maximum entropy approach to adaptive statistical language modelling, Computer Speech & Language, 10 (1996), 187-228. doi: 10.1006/csla.1996.0011.
[26] P. Rubin-Delanchy, N. A. Heard and D. J. Lawson, Meta-analysis of mid-$p$-values: Some new results based on the convex order, Journal of the American Statistical Association, 2018. doi: 10.1080/01621459.2018.1469994.
[27] B. W. Silverman, Density Estimation, Chapman and Hall, London, 1986.
[28] S. A. Stouffer, E. A. Suchman, L. C. DeVinney, S. A. Star and R. M. Williams, The American Soldier: Adjustment During Army Life, Princeton University Press, Princeton, New Jersey, 1949.
[29] Y. W. Teh, A hierarchical Bayesian language model based on Pitman-Yor processes, in Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics, 2006, 985–992. doi: 10.3115/1220175.1220299.
[30] L. H. C. Tippett, The Methods of Statistics, 4th edition, John Wiley & Sons, Inc., New York; Williams & Norgate, Ltd., London, 1952.
[31] H. M. Wallach, S. T. Jensen, L. Dicker and K. A. Heller, An alternative prior process for nonparametric Bayesian clustering, in Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS 2010), 2010, 892–899.

Figure 1.  Cartoon example of the Chinese Restaurant metaphor for the Pitman-Yor process, where $ C_j $ represents the $ j $th customer and $ x_j^\star $ is the unique dish served at table $ T_j $. The vector $ \boldsymbol x_n = (x_1, \dots, x_n) $ denotes the observed sequence of dishes eaten by each customer. Customer $ C_i $ being seated at table $ T_j $ is denoted $ C_i\to T_j $. Then, for example, the tenth customer will sit at $ T_1 $ with probability $ (3-d)/(\alpha+9) $
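As an illustration of the seating rule described in the caption, the following Python sketch computes the table-assignment probabilities of the two-parameter Chinese restaurant process and reproduces the $ (3-d)/(\alpha+9) $ probability for the tenth customer. The function name, the values of $ \alpha $ and $ d $, and the occupancies of the other tables are illustrative, not taken from the paper.

```python
import numpy as np

def crp_probabilities(counts, alpha, d):
    """Seating probabilities for the next customer in a two-parameter
    (Pitman-Yor) Chinese restaurant process.

    counts : occupancies (c_1, ..., c_K) of the currently occupied tables.
    Returns a length-(K+1) array: the probability of joining each existing
    table, followed by the probability of opening a new table.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()                          # customers already seated
    k = len(counts)                           # occupied tables
    p_existing = (counts - d) / (alpha + n)
    p_new = (alpha + d * k) / (alpha + n)
    return np.append(p_existing, p_new)

# Nine customers seated, three of them at table T_1; the remaining
# occupancies and the values of alpha and d are purely illustrative.
alpha, d = 1.0, 0.25
occupancies = [3, 4, 2]
probs = crp_probabilities(occupancies, alpha, d)
print(probs[0], (3 - d) / (alpha + 9))        # both give the caption's probability
print(probs.sum())                            # the probabilities sum to one
```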
Figure 2.  Kernel density estimates of the parameter estimates from 10000 simulations, $ |V| = {1000} $
Figure 3.  Kernel density estimates of the parameter estimates from 10000 simulations, $ |V| = {16230} $
Figure 4.  Uniform Q-Q plot for the Pitman-Yor process fitted to six destination nodes
Figure 5.  Plot of the corrected $ K_n $ (a), $ H_{1n} $ (b) and their ratio $ H_{1n}/K_n $ (c) as a function of $ n $ for the connections to the destination computer $ {\mathtt C1438} $, and sample paths averaged over $ 100 $ simulations of a Pitman-Yor process, obtained using different estimates of the parameters. The grey lines correspond to 10 realised trajectories of simulated Pitman-Yor processes, obtained using the corrected estimate (5)
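A minimal sketch of how sample paths such as those in Figure 5 can be simulated: the code below draws from the standard two-parameter Chinese restaurant process and records $ K_n $ (number of distinct values) and $ H_{1n} $ (number of values observed exactly once) along the path. It uses the uncorrected process with a non-atomic base measure, whereas the figure is based on the article's corrected estimates for a discrete base distribution; the parameter values and variable names are illustrative.

```python
import numpy as np

def py_sample_path(n, alpha, d, rng=None):
    """Simulate n draws from a two-parameter Chinese restaurant process and
    return the paths of K_n (distinct values) and H_1n (values seen once)."""
    rng = np.random.default_rng(rng)
    counts = []                               # occupancy of each existing cluster
    singletons = 0                            # clusters currently of size one
    K_path = np.empty(n, dtype=int)
    H1_path = np.empty(n, dtype=int)
    for i in range(n):
        k = len(counts)
        p_new = 1.0 if k == 0 else (alpha + d * k) / (alpha + i)
        if rng.random() < p_new:
            counts.append(1)                  # open a new cluster
            singletons += 1
        else:
            w = np.asarray(counts, dtype=float) - d
            j = rng.choice(k, p=w / w.sum())  # join cluster j w.p. prop. to c_j - d
            if counts[j] == 1:
                singletons -= 1
            counts[j] += 1
        K_path[i] = len(counts)
        H1_path[i] = singletons
    return K_path, H1_path

# Average 100 simulated paths, as in the figure (parameter values illustrative).
alpha, d, n_events = 230.0, 0.07, 5000
paths = [py_sample_path(n_events, alpha, d, rng=seed) for seed in range(100)]
K_mean = np.mean([K for K, _ in paths], axis=0)
H1_mean = np.mean([H for _, H in paths], axis=0)
print(K_mean[-1], H1_mean[-1], H1_mean[-1] / K_mean[-1])
```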
Table 1.  Estimated Pitman-Yor parameters for 6 destination nodes
Destination | $ N $ | $ \tilde K_n $ | $ \hat K_n $ | $ \tilde H_{1n} $ | $ \hat H_{1n} $ | $ \hat\alpha $ | $ \hat d $
$\mathtt C5716$ | $ 113987 $ | $ 3401 $ | $ 3816.418 $ | $ 144 $ | $ 182.164 $ | $ 651.519 $ | $ 0.047 $
$\mathtt U7$ | $ 138286 $ | $ 2700 $ | $ 2952.989 $ | $ 350 $ | $ 419.819 $ | $ 302.093 $ | $ 0.142 $
$\mathtt C2525$ | $ 204532 $ | $ 1555 $ | $ 1634.571 $ | $ 97 $ | $ 107.272 $ | $ 183.319 $ | $ 0.066 $
$\mathtt C1877$ | $ 342766 $ | $ 5095 $ | $ 6114.758 $ | $ 226 $ | $ 329.390 $ | $ 866.520 $ | $ 0.054 $
$\mathtt C395$ | $ 518058 $ | $ 5957 $ | $ 7422.437 $ | $ 442 $ | $ 698.259 $ | $ 841.357 $ | $ 0.094 $
$\mathtt C423$ | $ 2426512 $ | $ 2705 $ | $ 2958.988 $ | $ 166 $ | $ 199.188 $ | $ 230.040 $ | $ 0.067 $
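As an illustration of how $ \alpha $ and $ d $ can be recovered from summaries such as $ K_n $ and $ H_{1n} $ (interpreted here as the number of distinct values and the number of values observed exactly once after $ n $ draws), the sketch below matches them to their closed-form expectations under the uncorrected two-parameter model and solves the resulting two equations numerically. This is a hypothetical method-of-moments stand-in, not the estimator or the finite-base correction proposed in the article; the function names and starting values are illustrative.

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import fsolve

def expected_Kn(alpha, d, n):
    # E[K_n] = (alpha/d) * [Gamma(alpha+d+n) Gamma(alpha) / (Gamma(alpha+d) Gamma(alpha+n)) - 1]
    log_ratio = (gammaln(alpha + d + n) + gammaln(alpha)
                 - gammaln(alpha + d) - gammaln(alpha + n))
    return (alpha / d) * np.expm1(log_ratio)

def expected_H1n(alpha, d, n):
    # E[H_1n] = n * Gamma(alpha+d+n-1) Gamma(alpha+1) / (Gamma(alpha+d) Gamma(alpha+n))
    return np.exp(np.log(n) + gammaln(alpha + d + n - 1) + gammaln(alpha + 1)
                  - gammaln(alpha + d) - gammaln(alpha + n))

def moment_match(K_target, H1_target, n, start=(np.log(10.0), 0.0)):
    """Solve E[K_n] = K_target and E[H_1n] = H1_target for (alpha, d),
    working on the scale (log alpha, logit d) to stay in the valid region."""
    def equations(theta):
        alpha, d = np.exp(theta[0]), 1.0 / (1.0 + np.exp(-theta[1]))
        return (expected_Kn(alpha, d, n) - K_target,
                expected_H1n(alpha, d, n) - H1_target)
    sol = fsolve(equations, start)
    return np.exp(sol[0]), 1.0 / (1.0 + np.exp(-sol[1]))

# Round-trip check on synthetic targets: recover the generating parameters.
n, alpha_true, d_true = 100000, 50.0, 0.2
K_target = expected_Kn(alpha_true, d_true, n)
H1_target = expected_H1n(alpha_true, d_true, n)
print(moment_match(K_target, H1_target, n))   # approximately (50.0, 0.2)
```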
Table 2.  Anomaly rankings for the four red-team source computers
All results below use Tippett as the edge level combiner; within each column group, the three entries correspond to the node level combiners Fisher / Pearson / Stouffer.

Source computer | Events $ x_{n+1}\vert y_{n+1} $ only, standard $ p $-values $ p_{n+1} $ | Events $ x_{n+1}\vert y_{n+1} $ only, mid-$ p $-values $ q_{n+1} $
${\mathtt C17693}$ | $ 5 $ / $ 2 $ / $ 4 $ | $ 2 $ / $ 1 $ / $ 5 $
${\mathtt C18025}$ | $ 138 $ / $ 75 $ / $ 78 $ | $ 151 $ / $ 74 $ / $ 105 $
${\mathtt C19932}$ | $ 3831 $ / $ 8870 $ / $ 8877 $ | $ 3571 $ / $ 2754 $ / $ 3151 $
${\mathtt C22409}$ | $ 3767 $ / $ 15773 $ / $ 15764 $ | $ 3450 $ / $ 6984 $ / $ 3756 $

Source computer | Events $ y_{n+1} $ only, mid-$ p $-values $ q_{n+1} $ | Event level combiner Tippett, mid-$ p $-values $ q_{n+1} $
${\mathtt C17693}$ | $ 6 $ / $ 5 $ / $ 5 $ | $ 6 $ / $ 5 $ / $ 5 $
${\mathtt C18025}$ | $ 2806 $ / $ 1536 $ / $ 1674 $ | $ 142 $ / $ 96 $ / $ 107 $
${\mathtt C19932}$ | $ 5407 $ / $ 8882 $ / $ 8914 $ | $ 3813 $ / $ 2264 $ / $ 3232 $
${\mathtt C22409}$ | $ 12126 $ / $ 15808 $ / $ 15878 $ | $ 3803 $ / $ 6516 $ / $ 4196 $

Source computer | Event level combiner Fisher, standard $ p $-values $ p_{n+1} $ | Event level combiner Fisher, mid-$ p $-values $ q_{n+1} $
${\mathtt C17693}$ | $ 3 $ / $ 5 $ / $ 5 $ | $ 5 $ / $ 2 $ / $ 5 $
${\mathtt C18025}$ | $ 151 $ / $ 88 $ / $ 101 $ | $ 155 $ / $ 90 $ / $ 106 $
${\mathtt C19932}$ | $ 6339 $ / $ 3818 $ / $ 4879 $ | $ 4937 $ / $ 3017 $ / $ 3996 $
${\mathtt C22409}$ | $ 6120 $ / $ 14799 $ / $ 5379 $ | $ 4451 $ / $ 6695 $ / $ 5236 $
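Table 2 ranks the red-team source computers by combining $ p $-values across events, edges and nodes. The combiners named in the column headings are classical meta-analysis rules due to Tippett [30], Fisher [7], Pearson [20] and Stouffer [28]; see [11] for a comparison. A minimal sketch of one common convention for each combiner is given below; the construction of the standard and mid-$ p $-values being combined, and the exact combination scheme across levels, follow the article and are not reproduced here. The example inputs are illustrative.

```python
import numpy as np
from scipy.stats import chi2, norm

def fisher(pvals):
    """Fisher's method: small combined p-value when several p-values are small."""
    pvals = np.asarray(pvals, dtype=float)
    stat = -2.0 * np.sum(np.log(pvals))
    return chi2.sf(stat, df=2 * len(pvals))

def pearson(pvals):
    """Pearson's method (one common convention): based on the complements 1 - p."""
    pvals = np.asarray(pvals, dtype=float)
    stat = -2.0 * np.sum(np.log1p(-pvals))
    return chi2.cdf(stat, df=2 * len(pvals))

def stouffer(pvals):
    """Stouffer's method: average of the normal scores of the p-values."""
    pvals = np.asarray(pvals, dtype=float)
    z = norm.isf(pvals)                       # Phi^{-1}(1 - p)
    return norm.sf(z.sum() / np.sqrt(len(pvals)))

def tippett(pvals):
    """Tippett's method: based on the smallest p-value."""
    pvals = np.asarray(pvals, dtype=float)
    return 1.0 - (1.0 - pvals.min()) ** len(pvals)

pvals = [0.011, 0.21, 0.37, 0.52]             # illustrative p-values for one node
for combine in (fisher, pearson, stouffer, tippett):
    print(combine.__name__, combine(pvals))
```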