Term
Advantages of factorial design

Definition
(1) Uses half the number of subjects (compared with running separate experiments); (2) maintains the same rate of Type 1 error; (3) allows interactions to be examined.
        
        
Term
Assumptions of two-factor ANOVA

Definition
(1) Each score is influenced by both independent variables; (2) these influences can be separated out by the margins; (3) the variance is not affected by the treatments.
        
        
Term
What does an interaction measure?

Definition
The degree to which the effect of one independent variable depends on the level of the other independent variable.
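A minimal sketch of a two-factor (factorial) ANOVA that reports both main effects and the interaction, assuming Python with pandas and statsmodels (the deck names no software); the data frame and the column names factor_a, factor_b, and score are made up for illustration.

```python
# Minimal two-factor (factorial) ANOVA sketch; data and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Toy 2x2 factorial data: 2 levels of A x 2 levels of B, 3 scores per cell.
data = pd.DataFrame({
    "factor_a": ["a1"] * 6 + ["a2"] * 6,
    "factor_b": (["b1"] * 3 + ["b2"] * 3) * 2,
    "score":    [3, 4, 5, 6, 7, 8, 4, 5, 6, 10, 11, 12],
})

# C() marks the predictors as categorical; '*' expands to main effects + interaction.
model = ols("score ~ C(factor_a) * C(factor_b)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)  # one row per main effect, one for the interaction, one for residual
```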
        
        
Term
Advantages of Within ANOVA

Definition
The standard deviation (error term) will be smaller than in a between-subjects design.
        
        
Term
Size of the denominator of F: effects on power

Definition
If the denominator of F is smaller, then F is bigger and power is greater (and vice versa).
        
        
Term
What is SSw in Within ANOVA?

Definition
0, because there is only one score per cell.
        
        
Term
Within ANOVA: SSw is replaced with what?

Definition
The subject × treatment interaction term, which serves as the error term (the denominator of F).
        
        
Term
Does Within ANOVA increase power?

Definition
Yes, because it reduces intrinsic variability.
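A sketch of a repeated-measures (within-subjects) ANOVA, assuming Python with pandas and statsmodels' AnovaRM; the subject, treatment, and score columns are hypothetical. Subject-to-subject differences are removed from the error term, which is the source of the power gain described above.

```python
# Repeated-measures (within-subjects) ANOVA sketch; column names are made up.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# 4 subjects, each measured under 3 treatment conditions (one score per cell).
data = pd.DataFrame({
    "subject":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "treatment": ["t1", "t2", "t3"] * 4,
    "score":     [5, 7, 9, 4, 6, 10, 6, 8, 9, 5, 7, 11],
})

# Differences between subjects are pulled out as their own factor,
# leaving a smaller error term for the treatment F test.
result = AnovaRM(data, depvar="score", subject="subject", within=["treatment"]).fit()
print(result)
```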
        
        
Term
True or False: Interactions reflect nonlinear effects of independent variables.

Definition
True.
        
        
Term
True or False: Main effects reflect the linear summation of the effects of independent variables.

Definition
True.
        
        
Term
What does the column factor represent in Within ANOVA?

Definition
The effect of the treatment averaged across subjects.
        
        
Term
What does the row factor reflect in regular two-factor ANOVA?

Definition
The effect of one independent variable averaged across the other.
        
        
Term
What does the row factor reflect in Within-subjects ANOVA?

Definition
The overall differences between subjects, averaged across treatments.
        
        
Term
What does the interaction term measure in Within ANOVA?

Definition
Any differences in the effects of specific treatment conditions on specific subjects.
        
        
Term
If subject variability increases in regular ANOVA, what happens to SSw and SSbet?

Definition
SSw increases; SSbet decreases.
        
        
Term
If subject variability increases in Within ANOVA, what happens to SSbet and SSw?

Definition
SSw decreases; SSbet increases.
        
        
Term
What are the three types of ANOVA?

Definition
(1) One-factor (one-way) ANOVA; (2) factorial ANOVA; (3) repeated measures (within-subjects) ANOVA.
        
        
Term
The chi-square test is used for what?

Definition
(1) Experiments with nominal variables; (2) to analyze proportional results.
        
        
Term
What information does the chi-square tell us?

Definition
Whether the frequencies observed in our experiment are significant or due to chance.
        
        
Term
The distribution of the chi-square under the null depends on what?

Definition
The number of categories in the experiment.
        
        
Term
What are the two types of chi-square test?

Definition
Goodness of fit and test of independence.
        
        
Term
What does the Goodness of Fit test measure?

Definition
Whether the observed frequencies of the experiment are consistent with a specific distribution.
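A goodness-of-fit sketch assuming Python with scipy; the observed counts and the equal-frequency null are invented for illustration.

```python
# Goodness of fit: are the observed counts consistent with a specific
# expected distribution (here, equal proportions across categories)?
from scipy import stats

observed = [18, 22, 40]        # hypothetical counts in three nominal categories
expected = [80 / 3] * 3        # expected counts under the null (equal frequencies)

chi2, p = stats.chisquare(f_obs=observed, f_exp=expected)
print(chi2, p)                 # df = number of categories - 1 = 2
```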
        
        
Term
What does a chi-square test for independence measure?

Definition
Whether two nominal variables are independent of each other or actually dependent (associated).
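A test-of-independence sketch assuming Python with scipy; the 2×2 contingency table is hypothetical.

```python
# Test of independence: do two nominal variables (e.g., condition x response)
# depend on each other?
from scipy import stats

# Hypothetical 2x2 contingency table of observed frequencies.
table = [[30, 10],
         [20, 25]]

chi2, p, df, expected = stats.chi2_contingency(table)
print(chi2, p, df)    # df = (rows - 1) * (cols - 1) = 1
print(expected)       # frequencies expected if the variables were independent
```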
        
        
Term
When is the chi-square test for independence used?

Definition
When there is more than one nominal-level independent variable.
        
        
Term
True or False: The Goodness of Fit test is really a test for interactions.

Definition
False. The chi-square test for independence is a test for interactions.
        
        
Term
What are the properties of the chi-square distribution (under the null)?

Definition
(1) The mean of the distribution = the number of degrees of freedom; (2) the variance = 2 × df; (3) as the df gets bigger, the chi-square distribution approaches a normal distribution.
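A quick numerical check of those properties, assuming Python with scipy.

```python
# Mean = df, variance = 2*df, and the shape grows more normal (less skewed)
# as df increases.
from scipy import stats

for df in (2, 10, 100):
    mean = stats.chi2.mean(df)                    # equals df
    var = stats.chi2.var(df)                      # equals 2*df
    skew = stats.chi2.stats(df, moments="s")      # shrinks toward 0 as df grows
    print(df, mean, var, skew)
```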
        
        
Term
How do the chi-square's critical values differ from the F and t critical values?

Definition
Chi-square: as the df increases, the critical value increases. t and F: as the df increases, the critical value decreases.
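A sketch showing how the critical values move with df, assuming Python with scipy (alpha = .05).

```python
# Chi-square critical values grow with df; t critical values shrink toward
# the z value (about 1.96) as df grows.
from scipy import stats

for df in (1, 5, 30, 100):
    chi2_crit = stats.chi2.ppf(0.95, df)   # upper-tail critical value
    t_crit = stats.t.ppf(0.975, df)        # two-tailed critical value
    print(df, round(chi2_crit, 2), round(t_crit, 2))
```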
        
        
Term
Assumptions of ANOVA and the t test are?

Definition
(1) The population is normal; (2) groups are sampled randomly; (3) the treatment only changes the means, not the variances.
        
        
Term
What happens if the assumptions of ANOVA and the t test are violated?

Definition
An increase in Type 2 error, and sometimes in Type 1 error.
        
        
Term
What are the possible reasons for normality being violated?

Definition
(1) The dependent variable is not distributed normally; (2) floor/ceiling effects; (3) the distribution is skewed; (4) the sample comes from more than one population; (5) the dependent variable is influenced by direct random effects; (6) the noise is not normally distributed.
        
        
Term
How do we deal with non-normal distributions?

Definition
(1) Data transformation; (2) nonparametric tests.
        
        
Term
What are all the non-parametric tests?

Definition
(1) Rank-order tests; (2) permutation/randomization tests; (3) Mann-Whitney U; (4) Kruskal-Wallis H; (5) Wilcoxon signed-rank test.
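A sketch of the named rank-based tests, assuming Python with scipy; the three groups of scores are made up.

```python
# Rank-based nonparametric tests on made-up data.
from scipy import stats

group_a = [3, 5, 6, 9, 12]
group_b = [7, 8, 10, 14, 15]
group_c = [2, 4, 4, 6, 7]

print(stats.mannwhitneyu(group_a, group_b))      # Mann-Whitney U: two independent groups
print(stats.kruskal(group_a, group_b, group_c))  # Kruskal-Wallis H: three or more groups
print(stats.wilcoxon(group_a, group_b))          # Wilcoxon signed-rank: paired scores
```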
        
        
Term
What data transformation should you use for positively skewed data?

Definition
Logarithmic, square root, or other root transformations.
        
        
Term
What are the data transformations you should use for negatively skewed data?

Definition
Square or other power transformations.
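A sketch of the transformations from the two cards above, assuming Python with numpy; the skewed samples are simulated for illustration.

```python
# Transformations for skewed data (simulated examples).
import numpy as np

rng = np.random.default_rng(0)
pos_skew = rng.exponential(scale=2.0, size=500)        # long right tail
neg_skew = 10 - rng.exponential(scale=2.0, size=500)   # long left tail

log_transformed  = np.log(pos_skew + 1)   # log (or sqrt/root) pulls in the right tail
sqrt_transformed = np.sqrt(pos_skew)

# Shift to be non-negative first so the squaring preserves rank order,
# then square (a power transform) to reduce the negative skew.
squared = (neg_skew - neg_skew.min()) ** 2

# Plot histograms (e.g., with matplotlib) to judge which result looks most normal.
```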
        
        
Term
How do we estimate the appropriate data transformation to use on our data?

Definition
(1) Operate on the dependent variable; (2) plot a histogram; (3) guess which data transformation you should use; (4) apply the transformation; (5) plot the distribution again; (6) check that the signal and the noise are normal.
        
        
Term
Constraints on transformations

Definition
(1) The same transformation must be applied to all groups in the experiment; (2) it must be applied before hypothesis testing; (3) the transformation can't affect the variance; (4) the transformation cannot change the ranks.
        
        
Term
When are rank-order tests used?

Definition
When the data are non-normal or ordinal.
        
        
Term
Ranks preserve the order of scores, but do not preserve __________.

Definition
The magnitude of the scores.
        
        
Term
True or False: Converting to ranks is done across scores.

Definition
        
        
Term
If the assumptions of a parametric test are met, will using a nonparametric test instead increase or decrease power?

Definition
Decrease. If the assumptions were not met, it would increase power.
        
        
Term
Advantages of randomization

Definition
(1) It doesn't assume your population is normal; (2) it will give the correct significance value no matter the population distribution; (3) if the population is normal, randomization and a conventional test will give the same level of significance.
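A minimal randomization (permutation) test on a two-group mean difference, assuming Python with numpy; the scores are invented.

```python
# Randomization test: build the null distribution by shuffling group labels,
# with no normality assumption about the population.
import numpy as np

rng = np.random.default_rng(0)
group_a = np.array([4.1, 5.0, 6.2, 7.3, 5.5])
group_b = np.array([6.8, 7.9, 8.1, 9.0, 7.4])

observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])

count = 0
n_perm = 10_000
for _ in range(n_perm):
    shuffled = rng.permutation(pooled)                 # reassign scores to groups at random
    diff = shuffled[5:].mean() - shuffled[:5].mean()
    if abs(diff) >= abs(observed):
        count += 1

print(count / n_perm)   # two-tailed p-value from the randomization distribution
```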
        
        
Term
Disadvantages of randomization

Definition
(1) If n is low, you can't do many permutations; (2) we can only make conclusions about the sample, NOT the population.
        
        
Term
Why do we use bootstrapping?

Definition
To get a better estimate of the population parameter.
        
        
Term
What is the main assumption of bootstrapping?

Definition
That our sample is the best estimate of the population, if you don't have other information.
        
        
Term
Bootstrap requirements for the estimated parameter

Definition
(1) It is normally distributed; (2) it is unbiased; (3) its standard deviation across resamples is a good estimate.
        
        
Term
The difference between randomization and bootstrapping?

Definition
Bootstrapping samples with replacement; randomization shuffles the existing scores without replacement.
        
        
Term
What is the difference between sampling with replacement in bootstrapping and shuffling scores?

Definition
If you sample with replacement, some scores will repeat; shuffling your scores uses the exact same scores in a different order.
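A sketch of that distinction, assuming Python with numpy: one draw with replacement (bootstrap), one shuffle (randomization), and a bootstrap standard-error estimate.

```python
# Bootstrapping resamples WITH replacement (scores can repeat);
# randomization shuffles the same scores into a new order.
import numpy as np

rng = np.random.default_rng(0)
sample = np.array([4, 7, 7, 9, 12, 15])

bootstrap_draw = rng.choice(sample, size=sample.size, replace=True)  # repeats allowed
shuffled = rng.permutation(sample)                                   # same scores, new order
print(bootstrap_draw, shuffled)

# Bootstrap estimate of the standard error of the mean from many resamples.
boot_means = [rng.choice(sample, size=sample.size, replace=True).mean()
              for _ in range(5000)]
print(np.std(boot_means))
```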
        
        
Term
What is the difference between multiple regression and linear regression?

Definition
In multiple regression there is more than one independent (predictor) variable.
        
        
Term
Assumptions of multiple regression

Definition
(1) The x variables are independent of one another; (2) the coefficients are interpreted the same way as in linear regression; (3) the noise is normally distributed; (4) the variance is the same for all the data points.
        
        
Term
What test do we use to see if the coefficients in a regression equation are statistically significant?

Definition
A t test (one for each coefficient).
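A multiple-regression sketch assuming Python with numpy and statsmodels; the predictors x1 and x2 and the outcome y are simulated. The fitted summary reports one t test per coefficient.

```python
# Multiple regression with one t test per coefficient (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 2.0 * x1 + 0.5 * x2 + rng.normal(size=100)   # made-up data with normal noise

X = sm.add_constant(np.column_stack([x1, x2]))    # intercept + two predictors
model = sm.OLS(y, X).fit()
print(model.summary())   # the 't' and 'P>|t|' columns test each coefficient
```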
        
        
Term
What is partial correlation?

Definition
Partial correlation is the relationship between two variables with the effects of the other variables removed.
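A partial-correlation sketch assuming Python with numpy: correlate the residuals of the two variables after regressing each on the control variable. The variable names are hypothetical.

```python
# Partial correlation of y and x1, controlling for x2, via residuals.
import numpy as np

def partial_corr(y, x1, x2):
    def residuals(target, control):
        # Remove the linear effect of the control variable.
        slope, intercept = np.polyfit(control, target, 1)
        return target - (slope * control + intercept)
    return np.corrcoef(residuals(y, x2), residuals(x1, x2))[0, 1]

rng = np.random.default_rng(0)
x2 = rng.normal(size=200)
x1 = x2 + rng.normal(size=200)
y = x2 + rng.normal(size=200)        # y and x1 are related mostly through x2

print(np.corrcoef(y, x1)[0, 1])      # sizable simple correlation
print(partial_corr(y, x1, x2))       # much smaller once x2 is partialled out
```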
        
        
Term
What is the standardized regression coefficient?

Definition
The coefficient you would get if the variables were standardized (z-scored); in simple regression it equals r.
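A sketch of the standardized coefficient assuming Python with numpy: z-score the variables, refit, and compare the slope to Pearson's r (with a single predictor they match).

```python
# Standardized regression coefficient on simulated data.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + rng.normal(size=100)

zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()

beta = np.polyfit(zx, zy, 1)[0]   # standardized slope
r = np.corrcoef(x, y)[0, 1]       # Pearson correlation
print(beta, r)                    # identical for a single predictor
```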
        
        
Term
How do you turn F into t?

Definition
t = √F (equivalently, F = t²), when the numerator of F has one degree of freedom (e.g., two groups).
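A quick check of the F = t² relationship for two groups, assuming Python with scipy; the group scores are made up.

```python
# With two groups, the one-way ANOVA F equals the squared independent-samples t.
import numpy as np
from scipy import stats

group_a = [4.0, 5.5, 6.1, 7.2, 5.8]
group_b = [6.9, 8.0, 7.5, 9.1, 8.4]

t, _ = stats.ttest_ind(group_a, group_b)
f, _ = stats.f_oneway(group_a, group_b)
print(t**2, f)             # these agree
print(np.sqrt(f), abs(t))  # sqrt(F) recovers |t|
```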
        
        
Term
If F is significant, will a t test and an r also be significant?

Definition
Yes, if there are only two groups.
        
        
Term
What is the general linear model?

Definition
It refers to the way ANOVA can be turned into regression.
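A sketch of the general linear model idea assuming Python with pandas, scipy, and statsmodels: a one-way ANOVA and a regression on dummy-coded group membership give the same F.

```python
# One-way ANOVA re-expressed as a regression with group membership as the predictor.
import pandas as pd
from scipy import stats
from statsmodels.formula.api import ols

data = pd.DataFrame({
    "group": ["a"] * 5 + ["b"] * 5 + ["c"] * 5,
    "score": [3, 4, 5, 4, 6, 7, 8, 6, 9, 7, 2, 3, 2, 4, 3],
})

# Classic one-way ANOVA.
f, p = stats.f_oneway(*[g["score"] for _, g in data.groupby("group")])

# The same model as a regression on dummy-coded group predictors.
reg = ols("score ~ C(group)", data=data).fit()
print(f, p)
print(reg.fvalue, reg.f_pvalue)   # identical F and p
```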
        
        
Term
Assumptions of the general linear model...

Definition
(1) The treatment only shifts the means, not the variance; (2) samples are randomly selected; (3) the variance is the same for all the predictors (coefficients).