At 3:40, SK is talking about the "n-1 degrees of freedom" because we "know" the mean. But in calculating the mean itself, "n" independent samples were used. If any of those "n" random variables were different, the mean itself would be different, and a change in any one of those "n" variables shows up in all of the values on SK's blackboard above.** Now, that being said, when SK talks about "n-1 degrees of freedom", he appears to be talking about the error only FROM the mean. As in, we calculated the variance (a measure of error), and in calculating that "error", we used the mean as an intermediate value. GIVEN that value, there are only "n-1" degrees of freedom. However, how can we lose sight of the fact that calculating the mean ITSELF required "n" independent variables? To summarize, there are really "n" independent variables / degrees of freedom in the WHOLE calculation. Why do we restrict the degrees of freedom in "knowing" the mean, when the mean itself was calculated with a set of values with "n" degrees of freedom, and NOT "n-1" degrees of freedom?

** I challenge anyone to tell me how they can keep all the final values (e.g. SST = 30) by holding only "n-1" of the values the same in the 3x3 matrix without any other constraints. If I take the last variable (which you didn't hold) and change it, the mean itself and all the other following stats would change.

To get this, look at what we do for SSwithin. For every point, we subtract the group mean from each value, square it, and add them all up. For notation, I'm going to use Mi as the group means. The outside sum is just going across the groups, so let's look at each group separately: SUM( (Xij - Mi)^2 ). This looks very close to being a variance, doesn't it? In fact, all we need to do is divide by (n-1) and we'd have the variance for that group! But we already have the variance, and we want the sum, so we can just "undo" the division by (n-1) that the variances have. For the ith group, we'd take (n-1)*Si^2 to get this sum. From there, we just have to do the same thing to every group and add up the results. If we have equal sample sizes for each group, they are all "n", so the denominator in the variance is (n-1) for each, meaning we can factor that out and just multiply (n-1) by the sum of all the variances.

This video is about Analysis of Variance (ANOVA), or more specifically, One-Factor ANOVA. In this setting, we are thinking about two variables:

1. Something that we are measuring, like height, weight, MPG of an automobile, etc. These result in the 9 numbers that Sal was working with in this video.

2. A "factor", which is a variable comprising two or more groups. For example, say we want to compare the average MPG of an automobile, and we are looking at several groups: Sedans, Minivans, and SUVs/Trucks. This would be the three groups / columns that Sal had: group 1, 2, and 3 (or Green, Purple, Pink, if you prefer).

It is absolutely possible to add a second factor (i.e. another set of groups) into the analysis. For instance, say we think that in addition to differences in MPG between car types, there may also be differences between Asian, American, and European cars in terms of MPG. We could build that into the model as well, and it would be called "Two-Factor ANOVA". We can extend this as much as we need, though the more factors we add, the more complicated it will be to actually understand the results.

That being said, the situation in your brief description would not be addressed by adding several factors. Since your factors are overlapping time periods (within week, between week, between years), the groups are not what we call independent. One problem is this: a collection of weeks belongs to a certain year. Say year A was during a recession, and year B was not. Then year B will look better, but so will all of the weeks associated with year B. Different weeks "belong" to specific years; the weeks are what we call "nested" within the years.
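The degrees-of-freedom point can be sketched numerically: once the mean of n values is fixed, only n-1 of them are free, because the last one is forced to equal n*mean minus the sum of the others. The numbers below are made up purely for illustration.

```python
# Sketch of the "n-1 degrees of freedom" idea: given the mean, the nth value
# is determined by the other n-1 values. (Sample values are hypothetical.)
values = [2.0, 7.0, 3.0, 8.0, 5.0]
n = len(values)
mean = sum(values) / n               # 5.0

free = values[:-1]                   # choose any n-1 values freely
forced_last = n * mean - sum(free)   # the nth value is then forced
assert forced_last == values[-1]

# Changing a "free" value while keeping the mean fixed forces the last to move:
free_changed = [2.0, 7.0, 3.0, 10.0]
new_last = n * mean - sum(free_changed)   # 25 - 22 = 3.0, no longer 5.0
assert new_last != values[-1]
```

This is exactly the constraint the variance calculation inherits: after the mean is used as an intermediate value, only n-1 deviations can vary independently.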
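The SSwithin identity described above, "undoing" the (n-1) division inside each sample variance, can be checked with a small sketch. The group values are hypothetical, chosen only to make the arithmetic visible.

```python
import statistics

# For each group with sample variance S_i^2 and size n_i, the within-group
# sum of squares is (n_i - 1) * S_i^2. (Group values below are made up.)
groups = [[30.0, 32.0, 34.0], [24.0, 25.0, 26.0], [18.0, 20.0, 19.0]]

# Direct computation: squared deviations from each group's own mean
ss_within_direct = sum(
    sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups
)

# Via sample variances: sum of (n_i - 1) * S_i^2 over the groups
ss_within_from_var = sum(
    (len(g) - 1) * statistics.variance(g) for g in groups
)
assert abs(ss_within_direct - ss_within_from_var) < 1e-9

# With equal group sizes n, the (n-1) factors out of the sum entirely
n = len(groups[0])
ss_equal_n = (n - 1) * sum(statistics.variance(g) for g in groups)
assert abs(ss_within_direct - ss_equal_n) < 1e-9
```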
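As a rough end-to-end illustration of One-Factor ANOVA with three groups of three (the MPG-like numbers below are invented, not Sal's actual data), the sums of squares decompose as SStotal = SSwithin + SSbetween:

```python
# Minimal One-Factor ANOVA sketch on hypothetical MPG data for three groups.
sedans   = [30.0, 32.0, 34.0]
minivans = [24.0, 25.0, 26.0]
suvs     = [18.0, 20.0, 19.0]
groups = [sedans, minivans, suvs]

n_total = sum(len(g) for g in groups)
grand_mean = sum(sum(g) for g in groups) / n_total

# SSwithin: deviations of each value from its own group mean
ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

# SSbetween: deviations of group means from the grand mean, weighted by size
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)

# SStotal: deviations of every value from the grand mean
ss_total = sum((x - grand_mean) ** 2 for g in groups for x in g)

# The decomposition SStotal = SSwithin + SSbetween holds
assert abs(ss_total - (ss_within + ss_between)) < 1e-9

# F statistic: (SSB / (m-1)) / (SSW / (n-m)) with m groups, n observations
m = len(groups)
f_stat = (ss_between / (m - 1)) / (ss_within / (n_total - m))
```

A large F (between-group variation big relative to within-group variation) is what would lead us to suspect the group means really differ.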