EXAMPLE: Two-Variable Regression Example | No Intercept
From J. Johnston (1984) p178.

I. Data and Summary Stats 
Two-Variable Regression Example
Observations :  n=5
Independent Variables :  k=2
No Intercept


Data Table

obs        yi     xi,1   xi,2
1          3      3      5
2          1      1      4
3          8      5      6
4          3      2      4
5          5      4      6
sum        20     15     25
mean       4      3      5
StD ≡ σ    2.646  1.581  1
Means and Standard Deviations

       Mean            Var                                 StD
       Mx = Σxi / n    Varx ≡ σx^2 = Σ(x - Mx)^2 / (n-1)   StDx ≡ σx = Varx^(1/2)
y      4               7.000                               2.646
x1     3               2.500                               1.581
x2     5               1                                   1
Covariance Matrix -- Cov(xi,xj) = Σ[(xi - Mxi)(xj - Mxj)] / (n-1)
NOTE: be careful of MS Excel's COVAR() function,
which divides by n instead of n-1.

       y      x1     x2
y      7      4      2.250
x1     4      2.500  1.500
x2     2.250  1.500  1
Correlation Matrix -- Corr(xi,xj) = Σ[(xi - Mxi)(xj - Mxj)] / [(n-1) σi σj]

       y      x1     x2
y      1.000  0.956  0.850
x1     0.956  1.000  0.949
x2     0.850  0.949  1.000
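These summary statistics, the covariance matrix and the correlation matrix can be reproduced with NumPy (an illustrative sketch, not part of Johnston's text; note ddof=1, which gives the n-1 divisor used here, unlike Excel's COVAR()):

    import numpy as np

    # Data from the table above (rows: y, x1, x2)
    y  = np.array([3, 1, 8, 3, 5], dtype=float)
    x1 = np.array([3, 1, 5, 2, 4], dtype=float)
    x2 = np.array([5, 4, 6, 4, 6], dtype=float)
    data = np.vstack([y, x1, x2])

    print(data.mean(axis=1))         # means               = [4, 3, 5]
    print(data.var(axis=1, ddof=1))  # variances (n-1)     = [7, 2.5, 1]
    print(data.std(axis=1, ddof=1))  # std devs            ≈ [2.646, 1.581, 1]
    print(np.cov(data, ddof=1))      # covariance matrix (n-1 divisor)
    print(np.corrcoef(data))         # correlation matrix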
The basic input matrices are:

  y (5x1) =   3
              1
              8
              3
              5

  X (5x2) =   3  5
              1  4
              5  6
              2  4
              4  6

  X' (2x5) =  3  1  5  2  4
              5  4  6  4  6
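As a sketch (assuming NumPy; the array names are ours, not Johnston's), the same input matrices can be set up as:

    import numpy as np

    y = np.array([[3], [1], [8], [3], [5]], dtype=float)   # (5x1) dependent variable
    X = np.array([[3, 5],
                  [1, 4],
                  [5, 6],
                  [2, 4],
                  [4, 6]], dtype=float)                    # (5x2), no intercept column

    print(y.shape, X.shape, X.T.shape)                     # (5, 1) (5, 2) (2, 5)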



II. Regression Calculations

yi = b1 xi,1 + b2 xi,2 + ui

The basic equation in matrix form is:

    y = Xb + e

where
    y (dependent variable) is (nx1), here (5x1)
    X (independent vars)   is (nxk), here (5x2)
    b (betas)              is (kx1), here (2x1)
    e (errors)             is (nx1), here (5x1)

Minimizing the sum of squared errors using calculus results in the OLS equation:
    b = (X'X)^-1 X'y

To minimize the sum of squared errors of a k-dimensional line that describes the
relationship between the k independent variables and y, we find the set of slopes
(betas) that minimizes

    Σ(i=1..n) ei^2

Re-written in linear algebra, we seek to minimize e'e.

Rearranging the regression model equation gives e = y - Xb, so

    e'e = (y - Xb)'(y - Xb) = y'y - 2b'X'y + b'X'Xb     (see Judge et al (1985) p14)

Differentiating with respect to b and setting the derivative to zero:

    0 = -2X'y + 2X'Xb   ->   2X'Xb = 2X'y

Dividing both sides by 2 and rearranging:

    b = (X'X)^-1 X'y

So to obtain the elements of the (kx1) vector b we need the elements of the (kxk) matrix
(X'X)^-1 and of the (kx1) matrix X'y. Calculating X'y is easy (see (1) below), but
(X'X)^-1 requires first calculating X'X, then finding the cofactors -- see (4) -- and
the determinant -- see (3) -- in order to invert.
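Before walking through steps (1) to (7) by hand, here is a minimal NumPy sketch (ours, not part of the original text) that computes b directly:

    import numpy as np

    y = np.array([3, 1, 8, 3, 5], dtype=float)
    X = np.array([[3, 5], [1, 4], [5, 6], [2, 4], [4, 6]], dtype=float)

    # b = (X'X)^-1 X'y, solved as a linear system rather than via an explicit inverse
    b = np.linalg.solve(X.T @ X, X.T @ y)
    print(b)                                               # ≈ [ 1.8258, -0.3015]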
(1) X'y Matrix (2x1)
    76
    109

(2) X'X Matrix (2x2)
    55 81
    81 129
(3) Determinant Det(X'X) ≡ |X'X|
    i.e. the determinant of the matrix X'X
    Det(X'X) = 55*129 - 81*81 = 534

(4) Cofactors(X'X), i.e. the cofactor matrix of X'X (2x2)
    129  -81
    -81   55
(5) Adj(X'X), i.e. the adjugate matrix of X'X; this is just the
    transpose of the cofactor matrix. (2x2)
    For a symmetric matrix, it is the same as the cofactor matrix.
    129  -81
    -81   55

(6) Inverse Matrix, inv(X'X) ≡ (X'X)^-1
    = adj(X'X)/|X'X| = adj(X'X)/534 (2x2)
     0.2416  -0.1517
    -0.1517   0.1030
(7) Beta Matrix (β): b = [(X'X)^-1][X'y], which is (2x1).
    Finally we can calculate b through matrix multiplication.

    Betas                      (X'X)^-1                  X'y
    β1 =  1.826                 0.2416  -0.1517           76
    β2 = -0.3015        =      -0.1517   0.1030    x     109
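Steps (1) through (7) can be checked numerically; the following NumPy sketch (ours, not part of the original page) mirrors the determinant/adjugate inversion of the 2x2 matrix above:

    import numpy as np

    y = np.array([3, 1, 8, 3, 5], dtype=float)
    X = np.array([[3, 5], [1, 4], [5, 6], [2, 4], [4, 6]], dtype=float)

    Xty = X.T @ y                                      # (1) X'y = [76, 109]
    XtX = X.T @ X                                      # (2) X'X = [[55, 81], [81, 129]]

    det = XtX[0, 0]*XtX[1, 1] - XtX[0, 1]*XtX[1, 0]    # (3) |X'X| = 534

    adj = np.array([[ XtX[1, 1], -XtX[0, 1]],          # (4)/(5) adjugate of a 2x2 matrix
                    [-XtX[1, 0],  XtX[0, 0]]])         #         = [[129, -81], [-81, 55]]

    XtX_inv = adj / det                                # (6) ≈ [[0.2416, -0.1517], [-0.1517, 0.1030]]
    b = XtX_inv @ Xty                                  # (7) ≈ [1.8258, -0.3015]
    print(XtX_inv)
    print(b)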

Yhat1 = 1.826*3 + (-0.3015)*5 = 3.9700
Yhat2 = 1.826*1 + (-0.3015)*4 = 0.6199
Yhat3 = 1.826*5 + (-0.3015)*6 = 7.3202
Yhat4 = 1.826*2 + (-0.3015)*4 = 2.4457
Yhat5 = 1.826*4 + (-0.3015)*6 = 5.4944


ESS = (3.970 - 4)^2 + (0.6199 - 4)^2 + (7.320 - 4)^2 + (2.446 - 4)^2 + (5.494 - 4)^2
    = 27.0992509363296

REPORT

obs  yhatobs = Σ βi xi,obs        yhatobs   yobs  My   (yobs-yhatobs)^2   (yhatobs-My)^2   (yobs-My)^2   eobs = yobs - yhatobs   eobs^2
1    1.826*3 + (-0.3015)*5        3.970     3     4    0.9410             0.000898         1             3 - 3.970  = -0.9700    0.9410
2    1.826*1 + (-0.3015)*4        0.6199    1     4    0.1445             11.43            9             1 - 0.6199 =  0.3801    0.1445
3    1.826*5 + (-0.3015)*6        7.320     8     4    0.4621             11.02            16            8 - 7.320  =  0.6798    0.4621
4    1.826*2 + (-0.3015)*4        2.446     3     4    0.3073             2.416            1             3 - 2.446  =  0.5543    0.3073
5    1.826*4 + (-0.3015)*6        5.494     5     4    0.2444             2.233            1             5 - 5.494  = -0.4944    0.2444

                                                       RSS =              ESS =            TSS =                                 e'e =
                                                       Σ(yobs-yhatobs)^2  Σ(yhatobs-My)^2  Σ(yobs-My)^2                          Σ eobs^2
sum                                                    2.099              27.10            28                                    2.099
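The fitted values, residuals and the RSS/ESS/TSS sums in the report above can be reproduced with a short NumPy sketch (ours, not Johnston's):

    import numpy as np

    y = np.array([3, 1, 8, 3, 5], dtype=float)
    X = np.array([[3, 5], [1, 4], [5, 6], [2, 4], [4, 6]], dtype=float)
    b = np.linalg.solve(X.T @ X, X.T @ y)

    y_hat = X @ b                              # fitted values ≈ [3.970, 0.620, 7.320, 2.446, 5.494]
    e = y - y_hat                              # residuals

    RSS = np.sum(e**2)                         # Σ(yobs - yhatobs)^2 ≈ 2.0993
    ESS = np.sum((y_hat - y.mean())**2)        # Σ(yhatobs - My)^2   ≈ 27.0993
    TSS = np.sum((y - y.mean())**2)            # Σ(yobs - My)^2      = 28
    print(RSS, ESS, TSS)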


(11) Betas and their t-Stats
     From the covariance matrix of b, Covar(b) = σ^2 (X'X)^-1:
     Var(βi) = σ^2 vii, where vii is the ith diagonal element of (X'X)^-1
     and σ^2 = e'e / (n-k)  (k = number of independent variables, plus 1 for the intercept if present).
     Std(βi) = square root of Var(βi)
     TStat(βi) = βi / Std(βi)
     Estimate of σ^2 = 2.0993 / 3 = 0.699750312109863
    Coef   value                                   StD(β)                            tStat(β)
    β1     0.2416*76  + (-0.1517)*109 =  1.826     (0.6998 * 0.2416)^1/2 = 0.4111     1.826 / 0.4111   =  4.441
    β2    -0.1517*76  +   0.1030*109  = -0.3015    (0.6998 * 0.1030)^1/2 = 0.2685    -0.3015 / 0.2685  = -1.123
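The σ^2 estimate, standard errors and t-stats in (11) can be reproduced as follows (a NumPy sketch under the same n-k degrees-of-freedom convention; variable names are ours):

    import numpy as np

    y = np.array([3, 1, 8, 3, 5], dtype=float)
    X = np.array([[3, 5], [1, 4], [5, 6], [2, 4], [4, 6]], dtype=float)

    n, k = X.shape                             # n = 5 observations, k = 2 regressors (no intercept)
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ (X.T @ y)

    e = y - X @ b
    sigma2 = (e @ e) / (n - k)                 # σ^2 = e'e/(n-k) ≈ 0.69975
    se = np.sqrt(sigma2 * np.diag(XtX_inv))    # Std(βi)   ≈ [0.4111, 0.2685]
    t = b / se                                 # TStat(βi) ≈ [4.441, -1.123]
    print(sigma2, se, t)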
(12) Table of Outputs:
    yobs =  β1 Xobs,1  +  β2 Xobs,2  +  eobs
            1.826         -0.3015
           (4.441)       (-1.123)       <- t-stats
    r2 = 0.925027 | adj r2 = 0.850054
(13) Sums of squares:
     RSS    = Σ(y - y_hat)^2 = 2.09925093632959
     TSS    = Σ(y - y_avg)^2 = 28
     ESS(a) = Σ(y_hat - y_avg)^2 = 27.0992509363296
     We use ESS(b) below because the usual ESS decomposition breaks down when there is no intercept.
     ESS(b) = TSS - RSS = 25.9007490636704
     Note: TSS = ESS + RSS (this identity holds exactly only when the regression includes an intercept).

(14) r2 = ESS/TSS = 0.925026752273943

(15) adjusted r2 = 0.850053504547887

(16) F-stat = [ESS/(k-1)] / [RSS/(n-k)] = 18.507136485281
     See Johnston (1984) p186. F measures the joint significance of all explanatory variables.
     Alternatively: F-stat = [r2/(k-1)] / [(1-r2)/(n-k)]

(17) Durbin-Watson Statistic (DW or d) measures autocorrelation.
     DW = 1.18890207242953

________________________________________________________

Note: RSS, ESS and TSS stand for the Residual Sum of Squares (RSS), the Explained Sum of
Squares (ESS), and the Total Sum of Squares (TSS). ESS is sometimes referred to as the
Regression Sum of Squares, and RSS is sometimes referred to as the Sum of Squared Residuals.

Note: an alternative way of calculating TSS and ESS is
     TSS = y'Ay
     ESS = bv'Xv'Ay, where bv' and Xv' are b' and X' without the intercept row/column
     RSS = TSS - ESS

Bibliography

J. Johnston (1984), Econometric Methods, 3rd ed.
Judge et al (1985), The Theory and Practice of Econometrics, 2nd ed., Wiley, New York.
Donald F. Morrison (1990), Multivariate Statistical Methods, 3rd ed., McGraw-Hill, New York.
A. H. Studenmund (1997), Using Econometrics: A Practical Guide, 3rd ed., Addison-Wesley, Reading.