So the Georgia DOE released the latest graduation rates, and DeKalb County has seen a 9-point increase in the percentage of students in the four-year cohort who graduated. Make no mistake, a nine-point increase is quite an achievement. This is a number that usually moves no more than a couple of points per year, so this is a big deal.
However, as with all things, it pays to dig beneath the surface to find the hidden forces that moved the numbers so much. And what it uncovers is interesting.
The first comment that jumps to many people’s minds is that there must have been an external cause. One does not simply improve that much that quickly through internal change. And they would be correct. As almost all of the stories on the graduation rates have stated, one really big change was that this was the first class to not require the Georgia High School Graduation Test in order to be granted a diploma. Even the various press releases from the school systems mention that as a factor.
But how much of a factor is the question, and what does the answer to that question tell us?
Let’s take a look at the numbers in DeKalb County Schools.
The schools that were doing well before, and by that I mean schools that had graduation rates higher than 75% in 2014 (which is hardly what I would call “doing well”, but that’s another topic), showed almost no increase in the latest graduation rates. Chamblee went from 84% to 86%. Redan went from 80% to 81%. Lakeside from 77% to 80%. Arabia Mountain and Dunwoody both dropped a percentage point. What that tells us is that the end-of-year test wasn’t really a factor before in these schools, and whatever internal factors exist move the needle only a few percentage points. This is consistent with prior years’ behavior.
The big changes occurred in those schools that were not performing as well in the past. Leaving aside some of the schools where small numbers of students can cause statistical anomalies, we still have Tucker jumping 18 percentage points, from 69% to 87%. We see Columbia increasing by 14 points, from 62% to 76%, and Towers by 16 points, from 54% to 70%. Miller Grove went from 60% to 80%, an increase of 20 points!
The Administration says that the main reasons for the increases are better data reporting to the Georgia DOE and comprehensive graduation services such as individualized support programs. If that were so, we would expect to see similar gains across all schools, not just some of them. We would expect Stephenson High School to have increased its graduation rate by more than just 1%. If the gains were due to comprehensive changes in support services, we would expect Cedar Grove High School to improve by more than 4%.
What the data shows is that the gains are very uneven. When we look at the data, removing those schools with 50 or fewer graduating students, the average change in graduation rate across DeKalb schools in 2015 was 6.5 percentage points, with a standard deviation of 6.4 points. Miller Grove’s increase of 20 points lies more than two standard deviations above the mean. That is outside the bounds of what we would normally expect from random variation. Something different occurred at Miller Grove than at the rest of the high schools in DeKalb.
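As a quick sanity check, that claim can be reproduced with a few lines of arithmetic. This is just a sketch using the summary figures already cited (a mean change of 6.5 points and a standard deviation of 6.4 points); the per-school data behind those figures is not shown here.

```python
# Standardized score (z-score) for Miller Grove's gain, using the
# summary statistics quoted in the text (not recomputed from raw data).
mean_change = 6.5    # average change in graduation rate, in points
std_dev = 6.4        # standard deviation of that change, in points
miller_grove = 20.0  # Miller Grove's gain: 60% -> 80%

z = (miller_grove - mean_change) / std_dev
print(round(z, 2))  # prints 2.11, i.e. more than two standard deviations
```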
In the absence of a statewide (or even countywide) test, the decision of who graduates becomes much more subjective. Who decides whether a student graduates? How is that decision reached? What criteria are used, and is the same set of criteria applied across the county? The analysis strongly suggests it is not, and that other, hidden forces are responsible for the gains.
The upcoming release of school-level results from the Georgia Milestones test will be revealing: it will show whether the unusual gains in graduation rates are supported by corresponding scores relative to other schools in the district.
My prediction is they will not.
Thurmond made a big announcement about this during the 2014-15 school year. The cohort rate calculation hurt DeKalb because when a student transferred, nobody was tracking them. During the last school year DeKalb high schools started tracking students. The new numbers reflect two years of better data in addition to the absence of a graduation test. There should be another, smaller, increase next year.
First, thanks for commenting and contributing to the discussion. I appreciate it.
You’re absolutely correct that the cohort rate calculation can impact the results, in some cases quite significantly. Removing the kids who leave from the calculation may cause the rate to increase, all else being equal. And I think we see that in the numbers for Miller Grove if we look back.
In 2013, Miller Grove graduated 259 students out of a cohort of 434, for a graduation rate of 59.7%.
In 2014, Miller Grove graduated 226 students out of a cohort of 380, for a graduation rate of 59.5%.
In 2015, Miller Grove graduated 296 students out of a cohort of 372, for a graduation rate of 79.6%.
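The rates in those three lines follow directly from the graduate and cohort counts; a minimal sketch of the calculation, using the figures quoted above:

```python
# Recompute the Miller Grove graduation rates from the
# (graduates, cohort size) pairs quoted in the comment above.
cohorts = {2013: (259, 434), 2014: (226, 380), 2015: (296, 372)}

for year in sorted(cohorts):
    grads, size = cohorts[year]
    print(year, round(100 * grads / size, 1))
# 2013 59.7
# 2014 59.5
# 2015 79.6
```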
There was a significant drop in the size of the cohort between 2013 and 2014, which may have been caused by more accurate tracking. Of interest is the fact that the graduation rate didn’t noticeably change, even though the size of the cohort dropped by 12%. One interpretation is that the size of the cohort actually shrank that much even without taking transfers into account. Another is that the transfers were representative of the population as a whole, so removing them from the calculation had no real impact.
Between 2014 and 2015, the size of the cohort dropped by 8 students, or 2%. 70 more students graduated, causing the graduation rate to increase by 20 points.
It’s entirely possible that the students in that cohort simply performed better. My concern is that we don’t know, and we can’t know unless there is some sort of consistent reference we can use to measure.