Statistics Primer

(To help in understanding statistics found in readings; does not include the rules for reporting, etc. Some information has been drawn from Williams, F. (1986). Reasoning with statistics (3rd ed.). New York: Holt, Rinehart & Winston.)

Parameter: characteristic of a population

Statistic: characteristic of a sample

Statistical Inference: estimating parameters from statistics.

Probability—How likely is it that the result found is due to chance?

Significance    p < .10    p < .05    p < .01    p < .001
Confidence      90%        95%        99%        99.9%
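
For example, a reported p of .03 passes the .05 cutoff (95% confidence) but not .01. A rough sketch of reading a p value against these cutoffs (hypothetical value; plain Python):

    # Check a reported p value against the conventional cutoffs (hypothetical example).
    p = 0.03
    for cutoff, confidence in [(0.001, "99.9%"), (0.01, "99%"), (0.05, "95%"), (0.10, "90%")]:
        if p < cutoff:
            print(f"p < {cutoff}: significant at the {confidence} confidence level")
            break
    else:
        print("not significant at conventional levels")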

 

Descriptive Statistics

Number/frequency N = 245

Mean M = 3.45

Standard Deviation SD = 4.18

Median Mdn = 22

Degrees of Freedom (df): the number of values in a calculation that are free to vary (often the size of the sample minus 1, n - 1, or the number of categories minus 1). It is used to determine what value of a statistic is needed for significance.
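
For instance, the descriptive statistics above could be computed from a set of raw scores like this (hypothetical data; Python's standard statistics module):

    import statistics

    scores = [2, 4, 3, 5, 4, 3, 2, 5]    # hypothetical raw scores
    n = len(scores)                       # N, the number of cases
    m = statistics.mean(scores)           # M, the mean
    sd = statistics.stdev(scores)         # SD, the sample standard deviation (divides by n - 1)
    mdn = statistics.median(scores)       # Mdn, the median
    print(f"N = {n}, M = {m:.2f}, SD = {sd:.2f}, Mdn = {mdn}")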

 

Two Basic Analyses: Differences and Relationship

Differences

T-Test—Tests the difference between means of two groups.

t (df of participants) = value, p < value

t (153) = -1.83, p < .05

This means that in a sample of 154 respondents, a t value of –1.83 was obtained in calculating the difference between two means, and that a value that large would occur by chance less than 5% of the time.
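
A rough sketch of the same kind of test, assuming two small hypothetical groups of scores and the SciPy library:

    from scipy import stats

    group_a = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]    # hypothetical scores, group A
    group_b = [3.6, 3.9, 3.4, 3.8, 3.7, 3.5]    # hypothetical scores, group B

    result = stats.ttest_ind(group_a, group_b)   # independent-samples t-test
    df = len(group_a) + len(group_b) - 2         # degrees of freedom for two independent groups
    print(f"t({df}) = {result.statistic:.2f}, p = {result.pvalue:.3f}")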

Analysis of Variance (ANOVA)—tests the difference among three or more groups by comparing the variance of the group means around the grand mean (between groups) with the variance within each group (within groups), on one independent variable (factor)

F (df of categories, df of participants) = value, p < value

(sometimes eta squared is also reported to indicate the strength, or effect size, of the difference)

F (2, 154) = 3.47, p < .05
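
A rough one-way ANOVA sketch, assuming three small hypothetical groups and SciPy:

    from scipy import stats

    low = [2.1, 2.4, 2.0, 2.6]       # hypothetical scores for three groups
    medium = [3.0, 3.2, 2.9, 3.4]
    high = [3.8, 4.1, 3.9, 4.3]

    f_value, p_value = stats.f_oneway(low, medium, high)
    df_between = 3 - 1                # number of groups minus 1
    df_within = 12 - 3                # total cases minus number of groups
    print(f"F({df_between}, {df_within}) = {f_value:.2f}, p = {p_value:.3f}")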

Multiple-Factor (Factorial) Analysis of Variance—the same as ANOVA except that it can handle more than one independent variable (factor) and can determine the interaction effects among those factors. (MANOVA, multivariate analysis of variance, properly refers to designs with more than one dependent variable.)

Reported as F tests for the various combinations of factors, as in the examples below; a brief code sketch follows them.

Interaction Effects

Sex x Age x Education on Affectionate Communication

F (3, 65) = …

Sex x Age on Affect. Comm.

F (2, 65) =

Sex x Education on AC

F (2, 65) =

Age x Education on AC

F (2, 65) =

Main Effects

Sex on Affectionate Communication F (1, 65) =

Age on Affect. Comm. F (5, 65) =

Education on AC F (7, 65) =
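
One way to obtain main effects and interaction effects like those above is a formula-based factorial ANOVA. A sketch assuming hypothetical sex, age, and ac (affectionate communication) scores and the pandas and statsmodels libraries:

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Hypothetical data: one row per respondent.
    data = pd.DataFrame({
        "sex": ["M", "M", "F", "F", "M", "F", "M", "F"],
        "age": ["young", "old", "young", "old", "young", "old", "young", "old"],
        "ac":  [3.1, 2.8, 3.9, 3.5, 3.0, 3.6, 2.9, 3.8],
    })

    # C() marks a categorical factor; '*' requests both main effects and the interaction.
    model = ols("ac ~ C(sex) * C(age)", data=data).fit()
    print(sm.stats.anova_lm(model, typ=2))   # one F test per main effect and one for Sex x Age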

 

 

Tests of Relationship

Correlation (Pearson’s Product-Moment Correlation)—tests whether two variables vary together either positively or negatively.

The coefficient of correlation (r) is a number between –1.0 and +1.0 (it shows magnitude and direction)

+1.0 means there is a perfect positive relationship

0.0 means there is no relationship

-1.0 means there is a perfect negative relationship

The square of the correlation coefficient (r2, the coefficient of determination) indicates how much of the variance in one variable is accounted for by the other.

r = .40 produces r2 = .16, meaning 16% of the variance is accounted for (84% is due to other factors)

r = .70 produces r2 = .49, meaning 49% of the variance is accounted for (51% by other factors)

r = -.20 produces r2 = .04, meaning only 4% of the variance is accounted for (96% by other factors)

A correlation can be statistically significant but account for very little variance between the two variables.
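
For example, computing r and r2 for hypothetical paired scores (a rough sketch using SciPy):

    from scipy import stats

    x = [1, 2, 3, 4, 5, 6, 7, 8]                      # hypothetical scores on variable X
    y = [2.1, 1.8, 2.9, 3.4, 3.1, 4.0, 4.2, 4.5]      # paired scores on variable Y

    r, p = stats.pearsonr(x, y)
    r_squared = r ** 2                                 # coefficient of determination
    print(f"r = {r:.2f}, p = {p:.3f}, r2 = {r_squared:.2f}")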

Multiple Correlation—a correlation between one variable and the combination of two or more other variables (three or more variables in all). R ranges from 0 to 1.0.
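
A sketch of multiple correlation as the correlation between one variable and the best-fitting combination of two hypothetical predictors (NumPy only):

    import numpy as np

    x1 = np.array([1, 2, 3, 4, 5, 6], dtype=float)    # hypothetical predictor 1
    x2 = np.array([2, 1, 4, 3, 6, 5], dtype=float)    # hypothetical predictor 2
    y = np.array([2.0, 2.5, 3.9, 4.1, 6.2, 5.8])      # hypothetical criterion

    X = np.column_stack([np.ones_like(x1), x1, x2])   # add an intercept column
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)     # least-squares fit
    y_hat = X @ coefs                                  # predicted values
    R = np.corrcoef(y, y_hat)[0, 1]                    # multiple correlation R (0 to 1.0)
    print(f"R = {R:.2f}, R2 = {R ** 2:.2f}")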

Partial Correlation—a correlation between two variables (X & Y) in which the effect of one or more other variables (Z) is removed from the relationship between X and Y.
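
A sketch of the idea using the residual method: correlate what is left of X and Y after each has been predicted from the control variable Z (hypothetical data, NumPy only):

    import numpy as np

    x = np.array([2.0, 3.1, 4.2, 5.0, 6.1, 7.2])   # hypothetical scores on X
    y = np.array([1.5, 2.9, 3.8, 4.9, 5.7, 7.0])   # hypothetical scores on Y
    z = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # control variable Z

    def residuals(a, b):
        # What is left of a after removing its linear relationship with b.
        B = np.column_stack([np.ones_like(b), b])
        coefs, *_ = np.linalg.lstsq(B, a, rcond=None)
        return a - B @ coefs

    r_partial = np.corrcoef(residuals(x, z), residuals(y, z))[0, 1]
    print(f"partial r between X and Y, controlling for Z: {r_partial:.2f}")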

Factor Analysis—a method for determining which sets of variables are most closely related to one another and which are not.
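
A rough sketch, assuming a hypothetical respondents-by-items score matrix and scikit-learn's FactorAnalysis (one of several possible approaches):

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    scores = rng.normal(size=(100, 6))                # hypothetical 100 respondents x 6 items
    fa = FactorAnalysis(n_components=2).fit(scores)   # look for 2 underlying factors
    print(fa.components_)                             # loadings: how each item relates to each factor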

 

Regression Analysis