Term
| null hypothesis |
Definition
| the null hypothesis refers to a general or default position: that there is no relationship between two measured phenomena |
|
|
Term
| p-value |
Definition
| the p-value is the probability of obtaining a test statistic at least as extreme as the one that was actually observed, assuming that the null hypothesis is true |
|
|
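As a worked expression (not part of the original card; T denotes the test statistic, t_obs its observed value, and a two-sided test is assumed for illustration):
$p = P(|T| \ge |t_{\text{obs}}| \mid H_0)$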
Term
| mean sum of squares (MSS) |
Definition
| The mean sum of squares, MSS, is found in each case by dividing the sum of squares, SS, by the corresponding degrees of freedom. |
|
|
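One common worked form (one-way ANOVA notation with k groups and N total observations, assumed here for illustration):
$MSS = SS/df$, e.g. $MSS_{between} = SS_{between}/(k-1)$ and $MSS_{within} = SS_{within}/(N-k)$.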
Term
| F-statistic (Excel's Significance F) |
|
Definition
| The test statistic, F, is the ratio of the mean sum of squares due to the differences between the group means to that due to the errors. In Excel's regression output, "Significance F" is the p-value associated with this F statistic for the overall model. |
|
|
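A sketch of the ratio in symbols (same ANOVA notation as above, assumed for illustration):
$F = MSS_{between}/MSS_{within}$, with $(k-1, N-k)$ degrees of freedom.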
Term
| Durbin-Watson test statistic |
|
Definition
| a test statistic used to detect the presence of autocorrelation (a relationship between values separated from each other by a given time lag) in the residuals (prediction errors) from a regression analysis. |
|
|
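One standard form of the statistic (e_t denotes the residual at time t and T the number of observations; notation assumed, not from the card):
$d = \sum_{t=2}^{T}(e_t - e_{t-1})^2 \big/ \sum_{t=1}^{T} e_t^2$
Values near 2 suggest no first-order autocorrelation; values toward 0 suggest positive, and toward 4 negative, autocorrelation.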
Term
| autocorrelation (aka. serial correlation or cross-autocorrelation) |
|
Definition
| The cross-correlation of a signal with itself at different points in time (hence the "cross" in cross-autocorrelation). Informally, it is the similarity between observations as a function of the time lag between them. |
|
|
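A worked expression for the lag-k autocorrelation of a (weakly) stationary series y_t (notation assumed for illustration):
$\rho_k = Cov(y_t, y_{t+k})/Var(y_t)$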
Term
| R-squared (aka Coefficient of Determination) |
|
Definition
| In a multiple regression model, the proportion of the total sample variation in the dependent variable that is explained by the independent variables. |
|
|
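In symbols, using this card set's sum-of-squares labels (SST total, SSE explained, SSR residual):
$R^2 = SSE/SST = 1 - SSR/SST$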
Term
| elasticity |
Definition
| The percentage change in one variable given a 1% ceteris paribus increase in another variable. |
|
|
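As a worked illustration (a log-log specification, assumed here rather than stated on the card):
$\log y = \beta_0 + \beta_1 \log x + u$, where $\beta_1$ is the elasticity of y with respect to x.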
Term
| Error term aka disturbance |
|
Definition
| The variable in a simple or multiple regression equation that contains unobserved factors which affect the dependent variable. The error term may also include measurement errors in the observed dependent or independent variables. |
|
|
Term
| Explained sum of squares (SSE) |
|
Definition
| The total sample variation of the fitted values in a multiple regression model. |
|
|
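In symbols (with $\hat{y}_i$ the fitted values and $\bar{y}$ the sample mean of y, notation assumed):
$SSE = \sum_{i=1}^{n}(\hat{y}_i - \bar{y})^2$, and $SST = SSE + SSR$.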
Term
| Explanatory variable (aka Control Variable or Covariate or Independent Variable or Predictor Variable or Regressor) |
|
Definition
| In regression analysis, a variable that is used to explain variation in the dependent variable. |
|
|
Term
| first order conditions (normal equations) |
Definition
| The set of linear equations used to solve for the OLS estimates. |
|
|
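A sketch for the simple regression case (notation assumed for illustration):
$\sum_{i}(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0$ and $\sum_{i} x_i(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0$;
in matrix form, $X'X\hat{\beta} = X'y$.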
Term
| error variance |
Definition
| The variance of the error term in a multiple regression model. |
|
|
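In symbols (a common unbiased estimator, assumed here; n observations and k slope parameters):
$\sigma^2 = Var(u)$, estimated by $\hat{\sigma}^2 = SSR/(n - k - 1)$.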
Term
| degrees of freedom (df) |
Definition
| In multiple regression analysis, the number of observations minus the number of estimated parameters. |
|
|
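As a worked expression (for a model with an intercept and k slope coefficients, assumed here):
$df = n - (k + 1) = n - k - 1$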
Term
| dependent variable (aka predicted or explained variable) |
|
Definition
| The variable to be explained in a multiple regression model (and a variety of other models). |
|
|
Term
| fitted values |
Definition
| The estimated values of the dependent variable when the values of the independent variables for each observation are plugged into the OLS regression line. |
|
|
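In symbols (multiple regression with k explanatory variables, notation assumed):
$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \dots + \hat{\beta}_k x_{ik}$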
Term
| heteroskedasticity |
Definition
| The variance of the error term, given the explanatory variables, is not constant. |
|
|
Term
| homoskedasticity |
Definition
| The errors in a regression model have constant variance conditional on the explanatory variables. |
|
|
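In symbols, contrasting this card with the previous one (notation assumed): homoskedasticity is $Var(u \mid x_1, \dots, x_k) = \sigma^2$ (constant), while heteroskedasticity is $Var(u_i \mid x_i) = \sigma_i^2$ (varying across observations).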
Term
| classical assumptions |
Definition
| 1. Regression model is linear, correctly specified, and has an additive error term. 2. Error term has a zero population mean. 3. All explanatory variables are uncorrelated with the error term. 4. Observations of the error term are uncorrelated with each other (no serial correlation). 5. Error term has a constant variance (no heteroskedasticity). 6. No explanatory variable is a perfect linear function of any other explanatory variable(s) (no perfect multicollinearity). 7. Error term is normally distributed (optional but usually invoked). |
|
|
Term
| ordinary least squares (OLS) |
|
Definition
| A method for estimating the parameters of a multiple linear regression model. The ordinary least squares estimates are obtained by minimizing the sum of squared residuals. |
|
|
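A sketch of the minimization in symbols (notation assumed for illustration):
$(\hat{\beta}_0, \dots, \hat{\beta}_k) = \arg\min_{b_0, \dots, b_k} \sum_{i=1}^{n}(y_i - b_0 - b_1 x_{i1} - \dots - b_k x_{ik})^2$,
which in matrix form gives $\hat{\beta} = (X'X)^{-1}X'y$.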
Term
| semi-elasticity |
Definition
| The percentage change in the dependent variable given a one-unit increase in an independent variable. |
|
|
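As a worked illustration (a log-level specification, assumed here rather than stated on the card):
$\log y = \beta_0 + \beta_1 x + u$, where $100 \cdot \beta_1$ is approximately the percentage change in y for a one-unit increase in x.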
Term
| classical error term |
Definition
| error term satisfying Assumptions 1-5 |
|
|
Term
| classical normal error term |
|
Definition
| An error term that satisfies Assumptions 1-5 and is also normally distributed (Assumption 7). |
|