doi: 10.3934/jimo.2022079
Online First

Statistical analysis of a linear regression model with restrictions and superfluous variables

a. School of Economics and Management, Jiangxi University of Finance and Economics, Nanchang, Jiangxi, China
b. College of Business and Economics, Shanghai Business School, Shanghai, China

*Corresponding author: Ruixia Yuan

Received: December 2021. Revised: April 2022. Early access: May 2022.

In the statistical analysis of regression models, we frequently encounter situations in which two or more competing statistical models are assumed for the same data, while the corresponding statistical inference results are not necessarily the same. In this paper, we consider some comparison problems in the statistical inference of two given competing models, where one is the true model and the other is misspecified by the inclusion of some superfluous variables. We derive the best linear unbiased estimators (BLUEs) of the unknown parameter vectors under the two competing models using precise matrix-analytic tools, discuss various statistical properties of these estimators, and analyze the connections among the estimators under the two models.
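
To fix ideas, the pair of competing models studied in such comparisons can be sketched as follows; the notation is our own illustration and is not taken from the paper.

$$
\mathcal{M}_1:\ y = X_1\beta_1 + \varepsilon,
\qquad
\mathcal{M}_2:\ y = X_1\beta_1 + X_2\beta_2 + \varepsilon,
\qquad
A\beta_1 = b,\quad \mathrm{E}(\varepsilon) = 0,\quad \mathrm{Cov}(\varepsilon) = \sigma^{2}\Sigma.
$$

Here $X_2$ collects the superfluous variables, so the data are in fact generated under $\mathcal{M}_1$ (equivalently, with $\beta_2 = 0$ in $\mathcal{M}_2$), while $A\beta_1 = b$ stands for the linear restrictions referred to in the title; the comparison then concerns the BLUEs of $\beta_1$ (or of estimable functions of it) computed under the two models.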

Citation: Wenming Li, Yongge Tian, Ruixia Yuan. Statistical analysis of a linear regression model with restrictions and superfluous variables. Journal of Industrial and Management Optimization, doi: 10.3934/jimo.2022079