Book Introduction

Linear Models and Generalized Linear Models, 3rd Edition (English) 2025 | PDF | EPUB | MOBI | Kindle e-book editions | Baidu Cloud download

Linear Models and Generalized Linear Models, 3rd Edition (English)
  • Author: Rao (USA)
  • Publisher: World Book Publishing Company, Beijing
  • ISBN: 9787510086342
  • Publication year: 2014
  • Listed page count: 572
  • File size: 75 MB
  • File page count: 594
  • Subject: linear models (English)

PDF Download


Click here for the online PDF download of this book [recommended: cloud extraction, quick and convenient]. Downloads the book directly in PDF format; works on both mobile and desktop.
Torrent download [BT, faster]. Tip: please use the BT client FDM for this download (see the software download page). | Direct download [convenient but slow] | [Read the book online] | [Get the unzip code online]

Download Notes

Linear Models and Generalized Linear Models, 3rd Edition (English): PDF e-book download

The download is a RAR archive; unzip it with extraction software to obtain the PDF.

We recommend downloading with Free Download Manager (FDM), a free, ad-free, cross-platform BT client. All resources on this site are packaged as BT torrents, so a dedicated BT client is required, such as BitComet, qBittorrent, or uTorrent. Thunder (Xunlei) is currently not recommended, since this site's resources are not popular there; once a resource becomes popular, Thunder will also work.

(The file page count should be greater than the listed page count, except for multi-volume e-books.)

Note: all archives on this site require an unzip code. Click here to download an extraction tool.

Table of Contents

1 Introduction  1
1.1 Linear Models and Regression Analysis  1
1.2 Plan of the Book  3
2 The Simple Linear Regression Model  7
2.1 The Linear Model  7
2.2 Least Squares Estimation  8
2.3 Direct Regression Method  10
2.4 Properties of the Direct Regression Estimators  12
2.5 Centered Model  14
2.6 No Intercept Term Model  15
2.7 Maximum Likelihood Estimation  15
2.8 Testing of Hypotheses and Confidence Interval Estimation  17
2.9 Analysis of Variance  20
2.10 Goodness of Fit of Regression  23
2.11 Reverse Regression Method  24
2.12 Orthogonal Regression Method  24
2.13 Reduced Major Axis Regression Method  27
2.14 Least Absolute Deviation Regression Method  29
2.15 Estimation of Parameters when X Is Stochastic  30

3 The Multiple Linear Regression Model and Its Extensions  33
3.1 The Linear Model  33
3.2 The Principle of Ordinary Least Squares (OLS)  35
3.3 Geometric Properties of OLS  36
3.4 Best Linear Unbiased Estimation  38
3.4.1 Basic Theorems  38
3.4.2 Linear Estimators  43
3.4.3 Mean Dispersion Error  44
3.5 Estimation (Prediction) of the Error Term ε and σ²  45
3.6 Classical Regression under Normal Errors  46
3.6.1 The Maximum-Likelihood (ML) Principle  47
3.6.2 Maximum Likelihood Estimation in Classical Normal Regression  47
3.7 Consistency of Estimators  49
3.8 Testing Linear Hypotheses  51
3.9 Analysis of Variance  57
3.10 Goodness of Fit  59
3.11 Checking the Adequacy of Regression Analysis  61
3.11.1 Univariate Regression  61
3.11.2 Multiple Regression  61
3.11.3 A Complex Example  65
3.11.4 Graphical Presentation  69
3.12 Linear Regression with Stochastic Regressors  70
3.12.1 Regression and Multiple Correlation Coefficient  70
3.12.2 Heterogeneous Linear Estimation without Normality  72
3.12.3 Heterogeneous Linear Estimation under Normality  73
3.13 The Canonical Form  76
3.14 Identification and Quantification of Multicollinearity  77
3.14.1 Principal Components Regression  77
3.14.2 Ridge Estimation  79
3.14.3 Shrinkage Estimates  83
3.14.4 Partial Least Squares  84
3.15 Tests of Parameter Constancy  87
3.15.1 The Chow Forecast Test  88
3.15.2 The Hansen Test  91
3.15.3 Tests with Recursive Estimation  92
3.15.4 Test for Structural Change  93
3.16 Total Least Squares  96
3.17 Minimax Estimation  98
3.17.1 Inequality Restrictions  98
3.17.2 The Minimax Principle  101
3.18 Censored Regression  105
3.18.1 Overview  105
3.18.2 LAD Estimators and Asymptotic Normality  107
3.18.3 Tests of Linear Hypotheses  108
3.19 Simultaneous Confidence Intervals  110
3.20 Confidence Interval for the Ratio of Two Linear Parametric Functions  112
3.21 Nonparametric Regression  112
3.21.1 Estimation of the Regression Function  114
3.22 Classification and Regression Trees (CART)  117
3.23 Boosting and Bagging  121
3.24 Projection Pursuit Regression  124
3.25 Neural Networks and Nonparametric Regression  126
3.26 Logistic Regression and Neural Networks  127
3.27 Functional Data Analysis (FDA)  127
3.28 Restricted Regression  130
3.28.1 Problem of Selection  130
3.28.2 Theory of Restricted Regression  130
3.28.3 Efficiency of Selection  132
3.28.4 Explicit Solution in Special Cases  133
3.29 LINEX Loss Function  135
3.30 Balanced Loss Function  137
3.31 Complements  138
3.31.1 Linear Models without Moments: Exercise  138
3.31.2 Nonlinear Improvement of OLSE for Nonnormal Disturbances  139
3.31.3 A Characterization of the Least Squares Estimator  139
3.31.4 A Characterization of the Least Squares Estimator: A Lemma  140
3.32 Exercises  140

4 The Generalized Linear Regression Model  143
4.1 Optimal Linear Estimation of β  144
4.1.1 R1-Optimal Estimators  145
4.1.2 R2-Optimal Estimators  149
4.1.3 R3-Optimal Estimators  150
4.2 The Aitken Estimator  151
4.3 Misspecification of the Dispersion Matrix  153
4.4 Heteroscedasticity and Autoregression  156
4.5 Mixed Effects Model: Unified Theory of Linear Estimation  164
4.5.1 Mixed Effects Model  164
4.5.2 A Basic Lemma  164
4.5.3 Estimation of Xβ (the Fixed Effect)  166
4.5.4 Prediction of Uξ (the Random Effect)  166
4.5.5 Estimation of ε  167
4.6 Linear Mixed Models with Normal Errors and Random Effects  168
4.6.1 Maximum Likelihood Estimation of Linear Mixed Models  171
4.6.2 Restricted Maximum Likelihood Estimation of Linear Mixed Models  174
4.6.3 Inference for Linear Mixed Models  178
4.7 Regression-Like Equations in Econometrics  183
4.7.1 Econometric Models  186
4.7.2 The Reduced Form  190
4.7.3 The Multivariate Regression Model  192
4.7.4 The Classical Multivariate Linear Regression Model  195
4.7.5 Stochastic Regression  196
4.7.6 Instrumental Variable Estimator  197
4.7.7 Seemingly Unrelated Regressions  198
4.7.8 Measurement Error Models  199
4.8 Simultaneous Parameter Estimation by Empirical Bayes Solutions  209
4.8.1 Overview  209
4.8.2 Estimation of Parameters from Different Linear Models  211
4.9 Supplements  215
4.10 Gauss-Markov, Aitken, and Rao Least Squares Estimators  216
4.10.1 Gauss-Markov Least Squares  216
4.10.2 Aitken Least Squares  217
4.10.3 Rao Least Squares  218
4.11 Exercises  220

5 Exact and Stochastic Linear Restrictions  223
5.1 Use of Prior Information  223
5.2 The Restricted Least-Squares Estimator  225
5.3 Maximum Likelihood Estimation under Exact Restrictions  227
5.4 Stepwise Inclusion of Exact Linear Restrictions  228
5.5 Biased Linear Restrictions and MDE Comparison with the OLSE  233
5.6 MDE Matrix Comparisons of Two Biased Estimators  236
5.7 MDE Matrix Comparison of Two Linear Biased Estimators  242
5.8 MDE Comparison of Two (Biased) Restricted Estimators  243
5.9 Stein-Rule Estimators under Exact Restrictions  251
5.10 Stochastic Linear Restrictions  252
5.10.1 Mixed Estimator  252
5.10.2 Assumptions about the Dispersion Matrix  254
5.10.3 Biased Stochastic Restrictions  257
5.11 Stein-Rule Estimators under Stochastic Restrictions  261
5.12 Weakened Linear Restrictions  262
5.12.1 Weakly (R, r)-Unbiasedness  262
5.12.2 Optimal Weakly (R, r)-Unbiased Estimators  262
5.12.3 Feasible Estimators—Optimal Substitution of β in β1(β, A)  266
5.12.4 RLSE instead of the Mixed Estimator  268
5.13 Exercises  269

6 Prediction in the Generalized Regression Model  271
6.1 Introduction  271
6.2 Some Simple Linear Models  271
6.2.1 The Constant Mean Model  271
6.2.2 The Linear Trend Model  272
6.2.3 Polynomial Models  273
6.3 The Prediction Model  274
6.4 Optimal Heterogeneous Prediction  275
6.5 Optimal Homogeneous Prediction  277
6.6 MDE Matrix Comparisons between Optimal and Classical Predictors  280
6.6.1 Comparison of Classical and Optimal Prediction with Respect to the y Superiority  283
6.6.2 Comparison of Classical and Optimal Predictors with Respect to the Xβ Superiority  285
6.7 Prediction Regions  287
6.7.1 Concepts and Definitions  287
6.7.2 On q-Prediction Intervals  289
6.7.3 On q-Intervals in Regression Analysis  291
6.7.4 On (p, q)-Prediction Intervals  292
6.7.5 Linear Utility Functions  294
6.7.6 Normally Distributed Populations: Two-Sided Symmetric Intervals  296
6.7.7 One-Sided Infinite Intervals  298
6.7.8 Utility and Length of Intervals  298
6.7.9 Utility and Coverage  300
6.7.10 Maximal Utility and Optimal Tests  300
6.7.11 Prediction Ellipsoids Based on the GLSE  302
6.7.12 Comparing the Efficiency of Prediction Ellipsoids  305
6.8 Simultaneous Prediction of Actual and Average Values of y  306
6.8.1 Specification of Target Function  307
6.8.2 Exact Linear Restrictions  308
6.8.3 MDEP Using Ordinary Least Squares Estimator  309
6.8.4 MDEP Using Restricted Estimator  309
6.8.5 MDEP Matrix Comparison  310
6.8.6 Stein-Rule Predictor  310
6.8.7 Outside Sample Predictions  311
6.9 Kalman Filter  314
6.9.1 Dynamical and Observational Equations  314
6.9.2 Some Theorems  314
6.9.3 Kalman Model  317
6.10 Exercises  318

7 Sensitivity Analysis  321
7.1 Introduction  321
7.2 Prediction Matrix  321
7.3 Effect of Single Observation on Estimation of Parameters  327
7.3.1 Measures Based on Residuals  328
7.3.2 Algebraic Consequences of Omitting an Observation  329
7.3.3 Detection of Outliers  330
7.4 Diagnostic Plots for Testing the Model Assumptions  334
7.5 Measures Based on the Confidence Ellipsoid  335
7.6 Partial Regression Plots  341
7.7 Regression Diagnostics for Removing an Observation with Graphics  343
7.8 Model Selection Criteria  350
7.8.1 Akaike's Information Criterion  351
7.8.2 Bayesian Information Criterion  353
7.8.3 Mallows' Cp  353
7.8.4 Example  355
7.9 Exercises  356

8 Analysis of Incomplete Data Sets  357
8.1 Statistical Methods with Missing Data  358
8.1.1 Complete Case Analysis  358
8.1.2 Available Case Analysis  358
8.1.3 Filling in the Missing Values  359
8.1.4 Model-Based Procedures  359
8.2 Missing-Data Mechanisms  360
8.2.1 Missing Indicator Matrix  360
8.2.2 Missing Completely at Random  360
8.2.3 Missing at Random  360
8.2.4 Nonignorable Nonresponse  360
8.3 Missing Pattern  360
8.4 Missing Data in the Response  361
8.4.1 Least-Squares Analysis for Filled-up Data—Yates Procedure  362
8.4.2 Analysis of Covariance—Bartlett's Method  363
8.5 Shrinkage Estimation by Yates Procedure  364
8.5.1 Shrinkage Estimators  364
8.5.2 Efficiency Properties  365
8.6 Missing Values in the X-Matrix  367
8.6.1 General Model  367
8.6.2 Missing Values and Loss in Efficiency  368
8.7 Methods for Incomplete X-Matrices  371
8.7.1 Complete Case Analysis  371
8.7.2 Available Case Analysis  371
8.7.3 Maximum-Likelihood Methods  372
8.8 Imputation Methods for Incomplete X-Matrices  373
8.8.1 Maximum-Likelihood Estimates of Missing Values  373
8.8.2 Zero-Order Regression  374
8.8.3 First-Order Regression  375
8.8.4 Multiple Imputation  377
8.8.5 Weighted Mixed Regression  378
8.8.6 The Two-Stage WMRE  382
8.9 Assumptions about the Missing Mechanism  384
8.10 Regression Diagnostics to Identify Non-MCAR Processes  384
8.10.1 Comparison of the Means  384
8.10.2 Comparing the Variance-Covariance Matrices  385
8.10.3 Diagnostic Measures from Sensitivity Analysis  385
8.10.4 Distribution of the Measures and Test Procedure  385
8.11 Treatment of Nonignorable Nonresponse  386
8.11.1 Joint Distribution of (X, Y) with Missing Values Only in Y  386
8.11.2 Conditional Distribution of Y Given X with Missing Values Only in Y  388
8.11.3 Conditional Distribution of Y Given X with Missing Values Only in X  389
8.11.4 Other Approaches  390
8.12 Further Literature  391
8.13 Exercises  391

9 Robust Regression  393
9.1 Overview  393
9.2 Least Absolute Deviation Estimators—Univariate Case  394
9.3 M-Estimates: Univariate Case  398
9.4 Asymptotic Distributions of LAD Estimators  401
9.4.1 Univariate Case  401
9.4.2 Multivariate Case  402
9.5 General M-Estimates  403
9.6 Tests of Significance  407

10 Models for Categorical Response Variables  411
10.1 Generalized Linear Models  411
10.1.1 Extension of the Regression Model  411
10.1.2 Structure of the Generalized Linear Model  413
10.1.3 Score Function and Information Matrix  416
10.1.4 Maximum-Likelihood Estimation  417
10.1.5 Testing of Hypotheses and Goodness of Fit  420
10.1.6 Overdispersion  421
10.1.7 Quasi Loglikelihood  423
10.2 Contingency Tables  425
10.2.1 Overview  425
10.2.2 Ways of Comparing Proportions  427
10.2.3 Sampling in Two-Way Contingency Tables  429
10.2.4 Likelihood Function and Maximum-Likelihood Estimates  430
10.2.5 Testing the Goodness of Fit  432
10.3 GLM for Binary Response  435
10.3.1 Logit Models and Logistic Regression  435
10.3.2 Testing the Model  437
10.3.3 Distribution Function as a Link Function  438
10.4 Logit Models for Categorical Data  439
10.5 Goodness of Fit—Likelihood-Ratio Test  440
10.6 Loglinear Models for Categorical Variables  441
10.6.1 Two-Way Contingency Tables  441
10.6.2 Three-Way Contingency Tables  444
10.7 The Special Case of Binary Response  448
10.8 Coding of Categorical Explanatory Variables  450
10.8.1 Dummy and Effect Coding  450
10.8.2 Coding of Response Models  453
10.8.3 Coding of Models for the Hazard Rate  455
10.9 Extensions to Dependent Binary Variables  457
10.9.1 Overview  458
10.9.2 Modeling Approaches for Correlated Response  460
10.9.3 Quasi-Likelihood Approach for Correlated Binary Response  460
10.9.4 The GEE Method by Liang and Zeger  462
10.9.5 Properties of the GEE Estimate βG  463
10.9.6 Efficiency of the GEE and IEE Methods  465
10.9.7 Choice of the Quasi-Correlation Matrix Rt(α)  465
10.9.8 Bivariate Binary Correlated Response Variables  466
10.9.9 The GEE Method  467
10.9.10 The IEE Method  468
10.9.11 An Example from the Field of Dentistry  469
10.9.12 Full Likelihood Approach for Marginal Models  474
10.10 Exercises  486

A Matrix Algebra  489
A.1 Overview  489
A.2 Trace of a Matrix  491
A.3 Determinant of a Matrix  492
A.4 Inverse of a Matrix  494
A.5 Orthogonal Matrices  495
A.6 Rank of a Matrix  495
A.7 Range and Null Space  496
A.8 Eigenvalues and Eigenvectors  496
A.9 Decomposition of Matrices  498
A.10 Definite Matrices and Quadratic Forms  501
A.11 Idempotent Matrices  507
A.12 Generalized Inverse  508
A.13 Projectors  516
A.14 Functions of Normally Distributed Variables  517
A.15 Differentiation of Scalar Functions of Matrices  520
A.16 Miscellaneous Results, Stochastic Convergence  523
B Tables  527
C Software for Linear Regression Models  531
C.1 Software  531
C.2 Special-Purpose Software  536
C.3 Resources  537
References  539
Index  563
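As a small taste of the material in Chapters 2 and 3 (and of the software surveyed in Appendix C), the closed-form least squares estimators of Section 2.2 can be sketched in a few lines of Python. This is an illustrative sketch only, not code from the book; the function name is our own.

```python
# Minimal sketch of simple linear regression (Chapter 2):
# closed-form least squares estimates of intercept b0 and slope b1.

def fit_simple_ols(x, y):
    """Return (b0, b1) minimizing the sum of squared residuals of y on x."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # Sxy and Sxx are the centered cross-product and sum of squares.
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx          # slope estimate
    b0 = ybar - b1 * xbar   # intercept estimate
    return b0, b1

# Usage: data lying exactly on y = 2x + 1 recovers those coefficients.
b0, b1 = fit_simple_ols([0, 1, 2, 3], [1, 3, 5, 7])
print(b0, b1)  # 1.0 2.0
```

The same estimates fall out of the multiple-regression machinery of Chapter 3 as the special case of one regressor plus an intercept.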
