Thursday, November 7, 2019
Example of ANOVA Calculation
One-factor analysis of variance, also known as ANOVA, gives us a way to make multiple comparisons of several population means. Rather than doing this in a pairwise manner, we can look simultaneously at all of the means under consideration. To perform an ANOVA test, we need to compare two kinds of variation: the variation between the sample means and the variation within each of our samples. We combine all of this variation into a single statistic, called the F statistic because it uses the F-distribution. We do this by dividing the variation between samples by the variation within each sample. This is typically handled by software, but there is some value in seeing one such calculation worked out.

It is easy to get lost in what follows, so here is the list of steps that we will follow in the example below:

1. Calculate the sample means for each of our samples as well as the mean for all of the sample data.
2. Calculate the sum of squares of error. Here, within each sample, we square the deviation of each data value from the sample mean. The sum of all of the squared deviations is the sum of squares of error, abbreviated SSE.
3. Calculate the sum of squares of treatment. We square the deviation of each sample mean from the overall mean and multiply each of these squared deviations by the size of the corresponding sample. The sum of these terms is the sum of squares of treatment, abbreviated SST.
4. Calculate the degrees of freedom. The overall number of degrees of freedom is one less than the total number of data points in our sample, or n - 1. The number of degrees of freedom of treatment is one less than the number of samples used, or m - 1. The number of degrees of freedom of error is the total number of data points minus the number of samples, or n - m.
5. Calculate the mean square of error. This is denoted MSE = SSE/(n - m).
6. Calculate the mean square of treatment. This is denoted MST = SST/(m - 1).
7. Calculate the F statistic. This is the ratio of the two mean squares that we calculated, so F = MST/MSE.

Software does all of this quite easily, but it is good to know what is happening behind the scenes. In what follows we work out an example of ANOVA following the steps listed above.

Data and Sample Means

Suppose we have four independent populations that satisfy the conditions for single-factor ANOVA. We wish to test the null hypothesis H0: μ1 = μ2 = μ3 = μ4. For purposes of this example, we will use a sample of size three from each of the populations being studied. The data from our samples are:

Sample from population #1: 12, 9, 12. This has a sample mean of 11.
Sample from population #2: 7, 10, 13. This has a sample mean of 10.
Sample from population #3: 5, 8, 11. This has a sample mean of 8.
Sample from population #4: 5, 8, 8. This has a sample mean of 7.

The mean of all of the data is 9.

Sum of Squares of Error

We now calculate the sum of the squared deviations from each sample mean. This is called the sum of squares of error.

For the sample from population #1: (12 - 11)² + (9 - 11)² + (12 - 11)² = 6
For the sample from population #2: (7 - 10)² + (10 - 10)² + (13 - 10)² = 18
For the sample from population #3: (5 - 8)² + (8 - 8)² + (11 - 8)² = 18
For the sample from population #4: (5 - 7)² + (8 - 7)² + (8 - 7)² = 6

We then add all of these sums of squared deviations and obtain 6 + 18 + 18 + 6 = 48.
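As a quick check on the arithmetic so far, here is a minimal sketch in plain Python (not part of the original article; variable names such as samples and sse are ours) that reproduces the sample means, the overall mean, and the sum of squares of error for the data above.

    # Sample data from the four populations above.
    samples = [
        [12, 9, 12],   # population #1, sample mean 11
        [7, 10, 13],   # population #2, sample mean 10
        [5, 8, 11],    # population #3, sample mean 8
        [5, 8, 8],     # population #4, sample mean 7
    ]

    sample_means = [sum(s) / len(s) for s in samples]
    n_total = sum(len(s) for s in samples)                      # 12 data points in all
    grand_mean = sum(x for s in samples for x in s) / n_total   # 9.0

    # Sum of squares of error: squared deviation of each value from its own sample mean.
    sse = sum((x - m) ** 2 for s, m in zip(samples, sample_means) for x in s)
    print(sample_means, grand_mean, sse)   # [11.0, 10.0, 8.0, 7.0] 9.0 48.0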
Sum of Squares of Treatment

Now we calculate the sum of squares of treatment. Here we look at the squared deviation of each sample mean from the overall mean and multiply each of these by the size of the corresponding sample. Since every sample here contains three values, this amounts to multiplying the sum of the squared deviations by three: 3[(11 - 9)² + (10 - 9)² + (8 - 9)² + (7 - 9)²] = 3[4 + 1 + 1 + 4] = 30.

Degrees of Freedom

Before proceeding to the next step, we need the degrees of freedom. There are 12 data values and four samples. Thus the number of degrees of freedom of treatment is 4 - 1 = 3. The number of degrees of freedom of error is 12 - 4 = 8.

Mean Squares

We now divide our sums of squares by the appropriate numbers of degrees of freedom in order to obtain the mean squares.

The mean square for treatment is 30 / 3 = 10.
The mean square for error is 48 / 8 = 6.

The F-statistic

The final step is to divide the mean square for treatment by the mean square for error. This is the F-statistic from the data. Thus for our example F = 10/6 = 5/3 ≈ 1.667.

Tables of values or software can be used to determine how likely it is to obtain a value of the F-statistic as extreme as this value by chance alone.
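Continuing the sketch above (again, an illustration rather than part of the original article), the remaining steps of the calculation look like this:

    # Sum of squares of treatment: each squared deviation of a sample mean from the
    # grand mean is weighted by that sample's size (three observations per sample here).
    sst = sum(len(s) * (m - grand_mean) ** 2 for s, m in zip(samples, sample_means))  # 30.0

    df_treatment = len(samples) - 1    # m - 1 = 3
    df_error = n_total - len(samples)  # n - m = 8

    mst = sst / df_treatment           # 30 / 3 = 10.0
    mse = sse / df_error               # 48 / 8 = 6.0
    f_stat = mst / mse                 # about 1.667
    print(sst, mst, mse, f_stat)

If SciPy happens to be available, scipy.stats.f_oneway(*samples) should report the same F statistic together with its p-value, which is the usual way to judge how extreme this value is.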