Shared Flashcard Set


FRM - Schweser - Topic 18
Multiple regression: estimation and hypothesis testing
7
Finance
Professional
04/17/2010

Cards

Term
Distinguish between simple and multivariate regression:
Definition

Simple regression is a two-variable regression with one dependent variable, Yi, and one independent variable, Xi.

 

A multivariate regression has more than one independent variable.

Term
What are the assumptions of the multiple linear regression model?
Definition

*** The assumptions are the same as for the two-variable linear regression model, with the added assumption that the independent variables are not collinear.

 

- a linear relationship exists between the dependent and independent variables

 

- The independent variables are not correlated with the error term

 

- The expected value of the error term, conditional on the independent variables, is 0

 

- Homoskedasticity (the variance of the error terms is constant for all observations)

 

- The error term for one observation is not correlated with that of another observation

 

- The model only includes the appropriate independent variables and does not omit variables.

 

- The independent variables are not collinear (the independent variables are not random, and there is no exact linear relation between any two or more independent variables).
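
As an illustration of fitting such a model (hypothetical data, using NumPy's least-squares solver — not an example from the Schweser notes), a multiple regression with two independent variables can be estimated as follows:

```python
import numpy as np

# Hypothetical data: 6 observations, two independent variables.
X = np.array([
    [1.0, 2.0],
    [2.0, 1.0],
    [3.0, 4.0],
    [4.0, 3.0],
    [5.0, 6.0],
    [6.0, 5.0],
])
y = np.array([3.1, 3.9, 7.2, 7.8, 11.1, 11.9])

# Prepend an intercept column, then solve for b0, b1, b2 by least squares.
X1 = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(coef)  # [b0, b1, b2]
```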

Term
What is multicollinearity?
Definition

It refers to the condition in which two or more of the independent variables, or linear combinations of the independent variables, in a multiple regression are highly correlated with each other.

 

This condition distorts the standard error of estimate and the coefficient standard errors, resulting in a greater chance of Type II errors (failing to reject the false null hypothesis that a coefficient equals zero).
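
One common diagnostic for multicollinearity is the variance inflation factor (VIF). This is a sketch using NumPy on simulated data; the `vif` helper is illustrative, not from the notes:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (no intercept column).
    VIF_j = 1 / (1 - R2_j), where R2_j comes from regressing column j
    on the remaining columns (plus an intercept)."""
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ coef
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)               # independent of the others
X = np.column_stack([x1, x2, x3])
print(vif(X))  # x1 and x2 show inflated VIFs; x3 stays near 1
```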

Term
Interpreting regression results: talk about the coefficient of multiple correlation
Definition

The coefficient of multiple correlation is the square root of R². R² is defined as:

 

R² = ESS / TSS

   = Σ(Ŷᵢ − Ȳ)² / Σ(Yᵢ − Ȳ)²

   = 1 − (RSS / TSS)
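
As a quick numerical check of the equivalent forms above (hypothetical data, fit with NumPy):

```python
import numpy as np

# Hypothetical data and fitted values from an OLS fit.
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X1 = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
y_hat = X1 @ coef

TSS = np.sum((y - y.mean()) ** 2)       # total sum of squares
ESS = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
RSS = np.sum((y - y_hat) ** 2)          # residual sum of squares

r2_a = ESS / TSS          # first form
r2_b = 1 - RSS / TSS      # third form -- the two agree
R = np.sqrt(r2_a)         # coefficient of multiple correlation
print(r2_a, r2_b, R)
```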

Term
What's the F-statistic used for, and how do you calculate it?
Definition

The F-statistic is used to test the joint hypothesis that all slope coefficients equal 0.

 

F = (ESS / k) / (RSS / (n − k − 1))

 

where k = number of slope coefficients (numerator df) and n − k − 1 = denominator df.
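
The calculation can be sketched on simulated data (hypothetical, with k = 2 slope coefficients; a strong true relationship should give a large F):

```python
import numpy as np

# Simulated data: n observations, k = 2 slope coefficients.
rng = np.random.default_rng(1)
n, k = 50, 2
X = rng.normal(size=(n, k))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.3 * rng.normal(size=n)

X1 = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
y_hat = X1 @ coef

ESS = np.sum((y_hat - y.mean()) ** 2)
RSS = np.sum((y - y_hat) ** 2)

# F tests H0: all slope coefficients are jointly zero.
F = (ESS / k) / (RSS / (n - k - 1))
print(F)
```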

Term
How do you calculate adjusted R², and what is it used for?
Definition

R² will always go up if a new variable with any explanatory power is added to the regression. Thus R² may reflect the impact of a large set of independent variables rather than how well the set explains the dependent variable.

 

Adjusted R² corrects for degrees of freedom as follows:

 

adjusted R² = 1 − (1 − R²) × [(n − 1) / (n − k − 1)]
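
A short worked example of the formula (the numbers are hypothetical):

```python
import numpy as np

def adjusted_r2(r2, n, k):
    # Penalizes R2 for the number of slope coefficients k,
    # given n observations.
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# With R2 = 0.80, n = 30 observations, k = 5 regressors:
print(adjusted_r2(0.80, 30, 5))  # ~0.758, below the raw R2 of 0.80
```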
