Friday, September 1, 2023

Five Questions and Answers on the 2023 Milestones

Prior to testing pauses during the Covid-19 pandemic, I used Georgia Milestones data to analyze performance and answer five questions that interest me about Atlanta Public Schools (see 2018 and 2019 analyses).  I find it helpful to use consistent metrics over time, so below I have tried to answer the same questions this year, using the same approaches.

Because testing data from 2020 and 2021 were impacted by the pandemic, the averages are unreliable.  Therefore, I use data from 2011 through 2019 and 2022 through 2023 in my analysis.

      1. Has achievement in Atlanta Public Schools (APS) changed?

Not really.  

Through twelve years and three superintendents, APS has performed similarly relative to the state.  Back in 2011, the average student in the district scored 0.62 grade levels behind the state.  In 2023, the average student scored 0.57 grade levels behind the state.  The highest score occurred in 2015 and the lowest in 2022. 

Compared to a year ago, when scores were slightly below the historical average, scores rose by 0.18 grade levels this year.  

      

      2. Is achievement in APS more equitable than it used to be?

No.

Outcomes in APS are not more equitable today than they were a few years ago.  Across the country, huge gaps exist between poor students and their wealthier peers.  For the nation as a whole, students at the wealthiest schools score about 6 grade levels above students at the poorest schools.

Consider the graph below.  In a perfectly equitable district, student outcomes would be independent of how wealthy or poor a school is.  That would mean a flat trend line.  In APS, the slope is similar to the country as a whole.  

Students at the poorest schools would need to go to school 6 years longer to catch up with students at the wealthiest schools.  Because achievement plays a very limited role in how long students are educated, most of the students who are behind will never receive the additional schooling needed.  Instead, they will receive diplomas and enter the world with weaker math and reading skills.  At that point, they will either have to pay to receive remediation in college or forego higher education.  Sadly, many will take out student loans to pay for remediation but ultimately fail out of college.

The equity slope has not changed over the past few years.  The district has implemented a funding system that provides more resources to schools with lower-income students; however, that initiative has not improved student achievement at the schools receiving the additional resources.  The downward slope of achievement is the same as it has been for the past eight years.
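For readers who want to see concretely what the "slope" means, here is a minimal sketch, in Python, of how such an equity slope could be estimated from school-level data.  The file name and column names (a score in grade levels relative to the state, a challenge index) are hypothetical; this illustrates the approach, not the exact code behind the graph.

```python
import numpy as np
import pandas as pd

# Hypothetical school-level file: one row per school per year, with the school's
# average score (in grade levels relative to the state) and its challenge index.
schools = pd.read_csv("aps_school_scores.csv")  # columns: year, school, score, challenge_index

def equity_slope(df):
    """Fitted slope of school achievement on the challenge index.
    A flat (near-zero) slope would indicate equitable outcomes; a steep
    negative slope means needier schools score far lower."""
    slope, _intercept = np.polyfit(df["challenge_index"], df["score"], deg=1)
    return slope

# Compare the slope year by year to see whether equity is improving.
print(schools.groupby("year")[["challenge_index", "score"]].apply(equity_slope))
```

A slope near zero would correspond to the flat trend line described above; the steeper the negative slope, the larger the gap between the poorest and wealthiest schools.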


      3. Are the turnaround partnerships increasing achievement?

It is not clear that the turnaround partnerships are having an impact on student achievement.  It is possible they are having a very modest positive effect.

APS launched a turnaround plan a few years ago.  One component of the plan transfers management of several district schools to charter operators (Purpose Built, Kindezi, and KIPP).  The partnerships were phased in from 2016 through 2019.

One way to analyze the impact of these partnerships is to look at how the schools were doing before the partnership and compare that to how they are doing after the partnership.  The three graphs below do that for partners based on the timing of the intervention.      

2016 Partnership (Thomasville Heights - closed after 2022 Milestones)



2017 Partnerships (Slater, Gideons, and Price)



2019 Partnership (Woodson Park Academy)


Three of the four schools operating in 2023 scored higher than they did prior to the partnership.  On average, the schools scored 0.1 grades higher, so the increase is extremely modest and may just be noise.  The light grey lines in each graph are other schools in the district and these lines illustrate how school scores fluctuate over time for any number of reasons.  Unless an intervention leads to a change greater than normal fluctuation, we can't be certain it had an impact at all.
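As a rough illustration of that comparison (not the exact method behind the graphs), one could compute every school's change over the same window and ask where a partner school's change falls in that distribution.  The sketch below, in Python, assumes a hypothetical long-format file of school scores; the school name is also hypothetical.

```python
import pandas as pd

# Hypothetical long-format file: one row per school per year.
scores = pd.read_csv("aps_school_scores.csv")  # columns: school, year, score

def change_percentile(df, school, base_year, end_year):
    """Where a school's score change over a window falls among the changes of
    all district schools over the same window (0 = smallest, 1 = largest)."""
    wide = df.pivot(index="school", columns="year", values="score")
    changes = (wide[end_year] - wide[base_year]).dropna()
    return (changes < changes[school]).mean()

# Example with a hypothetical partner school: was its 2016-to-2023 change
# larger than the normal fluctuation seen across the district?
print(change_percentile(scores, "Partner Elementary", 2016, 2023))
```

A change that lands above, say, the 90th or 95th percentile of district fluctuation is much harder to dismiss as noise than one near the middle of the distribution.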

I did the same analysis above, controlling for changes in the challenge index.  Over a period of seven years, a neighborhood and its students can change.  The result may be that a school's average student in 2023 faces more or fewer challenges than the average student did in 2016.  After controlling for the challenge index, the schools still scored 0.1 grades higher, again a very small change and not clearly anything other than noise.  Click these links to see those analyses for 2016, 2017, and 2019 partners.
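The challenge-index adjustment can be sketched in the same spirit: fit the district-wide relationship between scores and the challenge index in each year, and compare a school's residual (actual minus predicted score) before and after the partnership rather than its raw score.  Again, the file, columns, and school name below are hypothetical.

```python
import numpy as np
import pandas as pd

scores = pd.read_csv("aps_school_scores.csv")  # columns: school, year, score, challenge_index

def residual_vs_expectation(df, school, year):
    """How far a school scored above or below what its challenge index would
    predict, given the district-wide relationship in that year."""
    yr = df[df["year"] == year]
    slope, intercept = np.polyfit(yr["challenge_index"], yr["score"], deg=1)
    row = yr[yr["school"] == school].iloc[0]
    return row["score"] - (slope * row["challenge_index"] + intercept)

# Compare a partner school's adjusted position before and after the intervention:
before = residual_vs_expectation(scores, "Partner Elementary", 2016)
after = residual_vs_expectation(scores, "Partner Elementary", 2023)
print(after - before)
```

This way, a school only gets credit for improvement beyond what a change in its student population would predict.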


      4. Are targeted interventions increasing achievement?

At the same time APS launched its partner school program, it identified certain schools to receive district-led interventions.  Performance at the targeted schools in 2023 was 0.3 grade levels higher than 2016 performance.  This is slightly higher than the increases seen at partner schools.


If one controls for changes in the challenge index, the 2023 results are up 0.2 grade levels since the 2016 intervention.


      5. Are charter operators beating the odds?

Drew is, but 85% of charter students attend a school or a network that is not beating the odds.  On average charter students are 0.3 grade levels behind.  

About twenty-five percent of APS K-8 students now attend a locally-approved charter school.  In 2023, students at Drew scored 0.6 grade levels above what would be expected of them based on the needs of the students who attend.  
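Building on the residual idea sketched under question 3, operator-level "beating the odds" figures like these can be approximated by averaging school residuals within each network, weighted by enrollment.  The operator and enrollment columns below are hypothetical additions to the same illustrative file.

```python
import numpy as np
import pandas as pd

# Hypothetical file with an operator label ("Drew", "KIPP", "Kindezi", ...) and
# K-8 enrollment for each school-year.
schools = pd.read_csv("aps_school_scores.csv")  # columns: school, operator, year, score, challenge_index, enrollment

def beating_the_odds_by_operator(df, year):
    """Enrollment-weighted average residual (actual minus challenge-index-predicted
    score) for each operator in a given year."""
    yr = df[df["year"] == year].copy()
    slope, intercept = np.polyfit(yr["challenge_index"], yr["score"], deg=1)
    yr["residual"] = yr["score"] - (slope * yr["challenge_index"] + intercept)
    return yr.groupby("operator").apply(
        lambda g: np.average(g["residual"], weights=g["enrollment"])
    )

print(beating_the_odds_by_operator(schools, 2023))
```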

All other charter operators performed worse than would be expected based on the students who attend.  

Broadly, charter achievement has declined in recent years.  Networks that were previously exceeding expectations no longer are.  Kindezi students went from scoring 0.9 grade levels ahead in 2015 to 0.6 grade levels behind in 2023.  KIPP students went from scoring 0.3 grade levels ahead of what would be expected in 2015 to 0.3 grade levels behind in 2023. 

Atlanta Classical is one exception to the trend of declining charter achievement.  The school has shown steady improvement over the past 5 years.  Students at the school went from scoring 1.2 grade levels behind what the challenge index would predict to only 0.1 grade levels behind.  In math, which is believed to be more responsive to school-based instruction, the school now meets expectations.

The graph below shows a history of performance by charter operator.   






Monday, September 9, 2019

My Take on Carstarphen's Exit

Imagine you are responsible for managing an employee. One day, you tell that employee their time in the company is coming to an end and you would like to organize a thoughtful transition which will involve searching for a replacement. You communicate that there are a variety of reasons for the choice, but one key reason is the toxicity of the employee’s working relationship with half of your co-managers. The employee responds by saying this isn’t a good time to announce such a move publicly because of an upcoming product launch. She asks that you not make any public announcements for one-to-two months. You understand the request and agree to abide by it. You and your co-managers keep the conversation in the room to be revived after the launch. Once out of the room, the employee starts frantically sending emails. She contacts the local newspaper and requests an interview. She contacts the local NPR station to request an interview. She contacts the business press. The employee goes on all of these interviews and talks about how successful she has been in the company. She says that you and your co-managers may come and go, but as long as *she* sticks around the company will flourish. She corrals every aunt, uncle, and cousin with a pulse and a recognizable name to praise her and pressure you to keep her in the company. You are at a loss. By holding off an announcement, you were trying to do what was in the company’s best interest. Rather than keeping her future in the company out of the press for a month, which she said was the goal, this employee has made it the number-one topic of conversation. All driven from her perspective. You decide to move forward with your original decision to transition this employee out of the company. You make the announcement.

That is more or less what I took away from the Atlanta Board of Education’s (ABOE's) statement announcing Dr. Meria Carstarphen would not receive a contract extension and the Jason Esteves / Eshe Collins interview with WABE’s Rose Scott that followed.

A few weeks ago, when I learned there was a real possibility that the ABOE would not renew Carstarphen's contract, I felt concerned for the uncertainty it would introduce into the system. I still have those concerns, but I also understand the board’s decision, and I hope that time will reveal it to be the right one.

Carstarphen deserves credit for her dedication to the work of leading APS. Even her most ardent critics would have to concede that the superintendent puts an incredible amount of energy into her work. She also deserves credit for taking chances. There are few easy decisions in school-district leadership, and Carstarphen has tackled some of the toughest. She closed schools to correct under-enrollment problems the district has persistently battled since Alonzo Crim was superintendent in the 1970s and 1980s. That's the kind of decision that really upsets some people and excites virtually no one. She's moved the district toward school-level budgets. She brought in partners to manage seven of the district's worst-performing schools. Most of those partnerships seem to have increased achievement.

But along the way Carstarphen has proven to sometimes be about the business of promoting Meria J. Carstarphen. Carstarphen is a master at orchestrating photo-ops, but principals have described her visits as similar to a tornado arriving and leaving. She uses her blog and press team to craft a narrative of dramatic successes, even when the reality is not so newsworthy. When rigorous work is done but results are less favorable, they get swept under the rug. A first-year Mathematica evaluation of the turnaround was posted quietly to the APS website. Early evaluations suggesting the Target 2021 initiative didn't work were scrapped from board presentations.

Some of the narrative-building and ego-affirmation would be easier to overlook if not for what occurs when Carstarphen's ego gets wounded. The result is not pretty. In a number of cases, the resulting rage has led to irreparable damage in professional relationships. I need multiple hands to count the number of unique people who I have heard say "I'm so glad I don't have to work with her anymore" after leaving the district. Depending on whether board members are currently in Carstarphen's good graces, they are either allowed to travel around to photo-ops on her bus or required to drive themselves separately. It is hard to build human capital in a district when the tone at the top is volatile and sometimes, frankly, petty.

I for one ignited Carstarphen's rage early on by writing that her new slate of principal hires didn't look transformational. After hearing from several parents involved in the principal selection process and looking into the issues they raised independently, I felt the crop of hires didn't look as amazing as she suggested they would be. They looked about like the people APS had hired in the past. My suggestion was that APS should consider hiring more leaders from top colleges. In hindsight we can see that most of those principal hires didn't work out and are no longer at the district. However, of the hires who came from top colleges, 80% were still around in 2018. Only 29% of the hires coming from non-selective colleges were still principals at that point (one was still at the district in another role). The anger that I generated by sharing my opinion on the hires was ridiculous. It was a bruise to her ego.

So, I get where the board members who decided not to renew are coming from. She's torture to work with. Or as the ABOE more eloquently put it, we need a superintendent to work "in a spirit of continuous collaboration." I do remain concerned about the prospect of hiring another leader right now. If we were entering a board election, candidates could talk about their policy ideas, and hopefully a board with a consensus about the system's direction would emerge. Instead, I worry that different board members (though united in their desire to end Carstarphen's tenure) have divergent views on where to head. That will make a superintendent search challenging. I hope that a candidate will come forward who can continue the positive elements of Carstarphen's legacy with the relational stability she struggled to demonstrate.

Saturday, August 3, 2019

Five Questions and Answers on the 2019 Georgia Milestones

Last year, I used Georgia Milestones data to analyze performance and answer five questions that interest me about Atlanta Public Schools.  I find it helpful to use consistent metrics over time, so below I have tried to answer the same questions this year, using the same approaches.

Overall, things look similar to the conclusions I reached a year ago.

      1. Has achievement in Atlanta Public Schools (APS) changed?

Not really.  

For each of the past eight years, APS has performed similarly relative to the state.  Back in 2011, the average student in the district scored 0.62 grade levels behind the state.  In 2019, the average student scored 0.54 grade levels behind the state.  The highest score occurred in 2015 and the lowest in 2012. 

Compared to a year ago, scores went up by 0.01 grade levels.  




      2. Is achievement in APS more equitable than it used to be?

No.

Outcomes in APS are not more equitable today than they were a few years ago.  Across the country, huge gaps exist between poor students and their wealthier peers.  For the nation as a whole, students at the wealthiest schools score about 6 grade levels above students at the poorest schools.

Consider the graph below.  A perfectly equitable district would expect to see student outcomes be independent of how wealthy or poor a school is.  In APS, the slope is similar to the country as a whole.  The slope has not changed over the past few years.

Equity in the district is about the same as it has been.




      3. Are the turnaround partnerships increasing achievement?

On average, they seem to be, though not all are.

APS launched a turnaround plan a few years ago.  One component of the plan transfers management of 6 district schools to charter operators (Purpose Built, Kindezi, and KIPP).  The partnerships are being phased in over several years.  Results are available now to allow us to look at the 2016 and 2017 partnerships to see how they’ve done.

One way to analyze the impact of these partnerships is to look at how the schools were doing before the partnership and compare that to how they are doing after the partnership.  The graph below does that for Thomasville Heights, the only partnership rolled out in the fall of 2016.    

The results show that scores at Thomasville were 0.4, 0.3, and 0.4 grade levels higher in 2017, 2018, and 2019, respectively, than they were in 2016, the last year before the partnership.  Overall the results suggest a bump in the first year that has been maintained.  However, scores have not continued to rise much as students in the school are exposed to additional years of the intervention.



In the fall of 2017, three additional partnerships began at Gideons, Slater, and Price.  A similar analysis is shown for these schools below.  Two saw scores improve, while one saw scores decline.  The greatest first-year increase was at Gideons.  Price saw its scores rise by 0.3 grade levels in 2018 and an additional 0.3 grade levels in 2019.  This overall increase is higher than 93 percent of schools in the district over the same two-year period.  Slater saw its scores decrease the first year and rebound in the second year.



One might wonder if the improvement in scores at partnership schools is driven by changing student populations.  Maybe wealthier families who would not have considered the school before are willing to give it a shot under new management.  I did the same analysis above, controlling for changes in the challenge index, and the results were less favorable for Price and a bit more favorable for Slater.  Click these links to see those analyses for 2016 and 2017 partners.


      4. Are targeted interventions increasing achievement?

Performance at these schools for 2019 is 0.1 grade levels below 2015 performance and 0.2 grade levels higher than 2016 performance. 

Overall, there were slightly more of these schools that improved from 2018 to 2019 than did not.  Seven schools saw scores increase.  Four schools saw scores decline.  Two schools saw scores remain the same.  If one controls for changes in the challenge index, the 2019 results show that about half the schools are up and half the schools are down from their 2016 performance prior to the intervention.




      5. Are charter operators beating the odds?

KIPP and Drew are.  Others are not.

About twenty percent of APS K-8 students now attend a locally-approved charter school.  In 2019, students at KIPP scored 0.2 grade levels above what would be expected of them based on the needs of the students who attend.  Drew students also performed better than would be expected by 0.7 grade levels.  

All other charter operators performed worse than would be expected based on the students who attend.  Atlanta Classical improved from the prior year, but its performance remained the worst among charters.  Students at that school scored 1.0 grade level behind what would be expected.  Performance in math improved substantially.  Math scores matched the 2015 performance, which was the highest historical year.  Large gaps still remain with peer schools.  Students at the traditional public school, Jackson Elementary, have a similar challenge index.  However, in math, students at Jackson score 1.5 grade levels ahead of Atlanta Classical.  The school would need to sustain improvements of a similar magnitude to this year for a few more years to close this gap.

The graph below shows a history of performance by charter operator.  For some, performance has varied from year to year.  For others, it has been consistently above or below expectations. 






Saturday, July 28, 2018

Five Questions and Answers on the 2018 GA Milestones

It’s Georgia Milestones scores time again!  I always look forward to the state releasing scores on the annual exams.  It gives a good reason to check in on whether different initiatives are having their intended effect.  As I explained a few years ago, the score releases have so many numbers and are hard enough to interpret that they tend to provide lots of fodder for misguided newspaper headlines, tweets, and emoji high-fives.  In general, the conversation overstates how much fundamentals in a district or school change from one year to the next.    

Below I’ve tried to take a relatively rigorous look at five questions that interest me about Atlanta Public Schools.

It’s worth caveating that all the comments below are based on test scores.  If you believe test scores are effective measures of what we want schools to teach students, you may find these conclusions useful.  If not, nothing below will likely be of interest to you because this is about the Georgia Milestones.


      1. Has achievement in Atlanta Public Schools (APS) changed?

Not really. 

For each of the past seven years, APS has performed similarly relative to the state.  Back in 2011, the average student in the district scored 0.62 grade levels behind the state.  In 2018, the average student scored 0.55 grade levels behind the state.  The highest score occurred in 2015 and the lowest in 2012.

In general, district scores jump around a bit from one year to the next.  The graph below shows how changes in APS scores compare to changes for all the other districts in the state.  From this graph it becomes clear that what's remarkable about APS is how little scores have moved.

One reason APS scores haven’t moved much is that it’s a large district.  Small districts tend to see more variation in their scores from one year to the next.  However, the stability in APS scores is noticeable even relative to similarly sized districts.  Among large metro districts, Fulton, Gwinnett, Cobb, Clayton and Forsyth all saw score changes that were more substantial than those in APS (both up and down).

A second reason that APS scores haven’t changed much is that the needs of students served by the district remain relatively stable.  The district is slowly gentrifying.  Over the past 4 years, APS has produced a consistent measure of needs for test takers in the district.[i]  That measure is called the challenge index and you can read more about it here.  It fell slightly from 60 to 58 over the past four years.

Overall, performance in APS is about the same.  It has perhaps ticked up very slightly, and its students have become slightly less needy, but overall the changes are incredibly modest.



      2. Is achievement in APS more equitable than it used to be?

No.

Outcomes in APS are not more equitable today than they were a few years ago.  Across the country, huge gaps exist between poor students and their wealthier peers.  For the nation as a whole, students at the wealthiest schools score about 6 grade levels above students at the poorest schools.

Consider the graph below.  A perfectly equitable district would expect to see student outcomes be independent of how wealthy or poor a school is.  In APS, the slope is similar to the country as a whole.  The slope has not changed over the past few years.

Equity in the district is about the same as it has been.



      3. Are the turnaround partnerships increasing achievement?

Possibly.

APS launched a turnaround plan a few years ago.  One component of the plan transfers management of 6 district schools to charter operators (Purpose Built, Kindezi, and KIPP).  The partnerships are being phased in over several years.  Results are available now to allow us to look at the 2016 and 2017 partnerships to see how they’ve done.

One way to analyze the impact of these partnerships is to look at how the schools were doing before the partnership and compare that to how they are doing after the partnership.  The graph below does that for Thomasville Heights, the only partnership rolled out in the fall of 2016.    

The results show that scores at Thomasville were 0.4 and 0.3 grade levels higher in 2017 and 2018, respectively, than they were in 2016, the last year before the partnership.  This is encouraging, but scores at schools change for all kinds of reasons, so it's helpful to think about how these changes compare to other schools in the district (shown in grey).  We can’t say for sure that changes from one year to the next aren’t just noise unless we see that the changes are outside of the normal range that schools experience.

The change from 2016 to 2017 is greater than the change that occurred for 96% of other APS schools.  The change from 2016 to 2018 is greater than the change that occurred for 70% of other APS schools.  So, I would say that the best evidence is that the turnaround at Thomasville has had a positive impact, but the improvements are not so dramatic that they are unheard of.  For example, Burgess (a traditional public school) saw scores rise faster. 


In the fall of 2017, three additional partnerships began at Gideons, Slater, and Price.  A similar analysis is shown for these schools below.  Two saw scores improve, while one saw scores decline.  The greatest increase was at Gideons.  Scores increased by 0.6 grade levels from 2017.  This improvement was greater than 96% of APS schools.  Hope-Hill was the only traditional public school that saw a similarly large increase in scores.


One might wonder if the improvement in scores at partnership schools is driven by changing student populations.  Maybe wealthier families who would not have considered the school before are willing to give it a shot under new management.  I did the same analysis above, controlling for changes in the challenge index, and the results were consistent.  Click these links to see those analyses for 2016 and 2017 partners.


      4. Are targeted interventions increasing achievement?

Not much.

A second component of the APS turnaround strategy was to provide additional resources to several schools that had performed poorly in the past.  The plan included small-group tutoring and mentoring of principals. 

Overall, there is not much evidence that these investments are paying off.  In 2017 about half the schools saw scores rise while the other half saw scores fall.  In 2018, results look a little better, but that picture doesn’t hold up once you control for changes in the challenge index.


It seems like year-to-year changes at these schools may be driven by noise and school-specific things (for example leadership or staffing changes) rather than the targeted intervention strategy.  The biggest increase among the targeted schools occurred at BAMO.  The biggest decline occurred at Scott.


      5. Are charter operators beating the odds?

KIPP and Drew are.  Others are not.

About twenty percent of APS K-8 students now attend a locally-approved charter school.  In 2018, students at KIPP scored a half a grade level above what would be expected of them based on the needs of the students who attend.  Drew students also performed better than would be expected by 0.4 grade levels.

All other charter operators performed worse than would be expected based on the students who attend.  The worst performance occurred at Atlanta Classical.  Students at that school scored 1.2 grade levels behind what would be expected.  In math skills, which are often thought to change more with school inputs, the performance was 1.8 grade levels behind.  Even though Atlanta Classical has one of the wealthiest student populations in the state, about half the schools in the state score better at math.  Students at the traditional public school, Jackson Elementary, have a similar challenge index.  However, in math, students at Jackson score 2.5 grade levels ahead of Atlanta Classical.  

The graph below shows a history of performance by charter operator.  For some, performance has varied from year to year.  For others, it has been consistently above or below expectations. 




For more analysis and to investigate specific schools, visit APS Insights.    



[i] Before this, a different measure (Free and Reduced Lunch) was generally used.  Over time, this measure became less and less accurate due to changes in federal policy, so districts and researchers moved toward measuring student needs in more accurate ways.

Tuesday, April 10, 2018

2017 NAEP Data

Today, scores were released on the 2017 National Assessment of Education Progress (NAEP), an exam that is administered to randomly selected students in states and large cities around the US.

Students in Charlotte, NC scored the highest while students in Detroit scored the lowest.  Without considering income differences, the raw scores are essentially useless as measures of school quality across cities.

In Charlotte, the typical student has parents who earn $57k a year.  In Detroit, the typical student has parents who earn $27k a year.  So, it isn't at all surprising that Charlotte would score higher than Detroit, even if we knew nothing about the school quality in these cities.  Unfortunately, much of the coverage of the NAEP scores leaves out this context.

Below are 2017 NAEP scores plotted against median income for parents of students attending school in the district.  Differences in median income explain 61% of the variation in scores on the NAEP.



Clicking through the first four tabs of the Tableau graphic above shows that this relationship holds true within race as well.
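For readers curious where a figure like the 61% comes from, here is a hedged sketch in Python: regress city-level NAEP scores on median parental income and report the R-squared.  The file and column names are hypothetical stand-ins for the data plotted above.

```python
import numpy as np
import pandas as pd

# Hypothetical city-level file: one row per district with its 2017 NAEP score
# and the median income of its students' parents.
cities = pd.read_csv("naep_2017_city_income.csv")  # columns: district, naep_score, median_income

x, y = cities["median_income"], cities["naep_score"]
slope, intercept = np.polyfit(x, y, deg=1)
predicted = slope * x + intercept

# R-squared: the share of score variation explained by income differences.
r_squared = 1 - ((y - predicted) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(round(r_squared, 2))
```

The same approach, with the black/white score gap as the outcome and the black/white income gap as the predictor, is what underlies the 74% figure discussed below.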

Income is also important to consider when looking at racial gaps.  An AJC article summarizing findings from the release suggested that Atlanta had a particularly large black/white gap in scores.  This is true.  It is also true that Atlanta has a particularly wide black/white gap in income.  In fact, it is the largest in the country, and no city other than DC even comes close.  The typical white student in Atlanta Public Schools has parents who earn $167k.  The typical black student in Atlanta has parents who earn $24k.  So the racial gap in parent income is $143k.  No other city in the country has a gap so large.  The graph below plots this income gap along with the score gap for all cities.



Seventy-four percent of the black/white gap in average district scores is explained by the size of the income gap in the district.  Notably, the score gap in Atlanta is less than would be expected based on the income gap.

Black students in Atlanta actually score better than would be expected based on their parents' income.  Their scores are comparable to black students in Denver whose parents earn about $11k more.



White students in Atlanta score the second highest of any city in the country (DC is slightly ahead).  The score is slightly below what would be expected based on their income; however, it is tough to draw many conclusions since most cities are nowhere close in income profile. 



The income data used above came from the Stanford Education Data Archive and is based on an American Community Survey supplement which reports data for public school and private school children.



   

Tuesday, February 20, 2018

HBCUs are Equally Effective at Graduating Students When Compared to Other Colleges & Universities

A team of Atlanta Journal Constitution reporters recently wrote a series of articles on Historically Black Colleges and Universities (HBCUs).  The collection tackles a wide range of topics from competition with majority-white institutions to financial troubles to claims about college effectiveness.  The articles were met with criticism by some, including Spelman College President Mary Schmidt Campbell who labeled them “a concerted and prolonged assault on HBCUs” in an open letter to the AJC Editors.

Among the authors’ assertions is a claim that the graduation rate is “a standard means of evaluating college effectiveness.”  They then discuss graduation rates at a number of HBCUs which they suggest compare unfavorably to the national average. It may be true that graduation rates are often cited when discussing college quality; however, no expert trained in education policy would assert that these are useful measures of college effectiveness.[i]

The reason for this is quite simple: colleges don’t start out with the same students.

It is almost taken for granted today in discussions of K-12 policy that it would be foolhardy to assert that differences in a school’s outcomes result from the school being more or less effective without considering characteristics of the students who attend. Because evaluations of college effectiveness are less common, it is perhaps less widely known that measures of college effectiveness require the same consideration to be credible.

Yale has a graduation rate of 98% while the University of West Georgia has a graduation rate of 43%.  If the 18-year-olds walking onto the campuses in New Haven and Carrollton were randomly selected, it might be fair to attribute this difference to the schools’ relative effectiveness.[ii]  But no one would be willing to make such a silly assumption.  We all know that Yale freshmen arrive at school with backgrounds, skills, and experiences that differ markedly from the typical student at the University of West Georgia.
    
Even if we consider just one characteristic about students (the SAT score they got prior to applying to college), seventy-two percent of the variation in college graduation rates is explained.  The graph below shows the relationship between average SAT score and Cohort Graduation Rate for U.S. colleges and universities.[iii]


While some colleges graduate more students than one would expect based on average SAT score and others graduate fewer, most schools’ graduation rates are pretty well predicted by the SAT scores of the students who attend.  

For this reason, if we want to make claims about school effectiveness, we need to compare the graduation rates of schools that serve similar students.

In this light, HBCU graduation rates are just about what one would expect given the students served.  In fact, the schools graduate about one percent more students than would be expected if SAT scores are used to predict graduation rates.  The graph below shows the average amount by which graduation rates exceed or fall short of expectations, separated by HBCU and non-HBCU schools (i.e. how much the schools are “beating the odds”).
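A minimal sketch of that calculation, assuming a hypothetical institution-level file (the real figures come from the 2013-14 data credited in the notes), would fit graduation rates on average SAT scores and then compare average residuals for HBCUs and non-HBCUs:

```python
import numpy as np
import pandas as pd

# Hypothetical institution-level file: graduation rate, average SAT of entering
# students, and an HBCU indicator.
colleges = pd.read_csv("college_grad_rates.csv")  # columns: school, grad_rate, avg_sat, is_hbcu

slope, intercept = np.polyfit(colleges["avg_sat"], colleges["grad_rate"], deg=1)
colleges["predicted"] = slope * colleges["avg_sat"] + intercept

# "Beating the odds": how much each school's graduation rate exceeds (or falls
# short of) what its students' SAT scores alone would predict.
colleges["beats_odds_by"] = colleges["grad_rate"] - colleges["predicted"]

print(colleges.groupby("is_hbcu")["beats_odds_by"].mean())
```

A positive average residual for HBCUs corresponds to the roughly one percent by which they exceed expectations.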


In addition to this summary information, you can also use the graph below to look at how the graduation rates at specific schools differ from what would be expected based on students’ performance on the SAT.


It is important to remember that SAT scores are only one way that students differ from each other.  Students also differ in other ways not picked up by SAT scores (non-cognitive skills, family background, ability to pay tuition, essay writing skills, etc.).  In addition, each school has different standards for graduation.[iv]  For these reasons, the “beating the odds” measure is best interpreted with caution when attempting to measure school effectiveness.  That said, it is certainly an improvement over a comparison of graduation rates that pays no attention to which students attend.

Legitimate public policy questions remain with respect to low graduation rates at colleges and universities.  In particular, it is not clear that it is in the best interest of students to provide student loans to those whose pre-college record suggests they are very unlikely to complete their course of study, leaving them with debt but no degree.  However, this is not an HBCU-specific issue.  Once SAT scores are considered, and schools are compared to others serving similar populations, there is no evidence that HBCUs are less effective than non-HBCUs at getting their students to graduate.



Disclosure: I do research and teach Mathematical Economics at Spelman College, an Atlanta HBCU.  


[i] Graduation rates may be useful for other purposes.  For example, when seeking to hire candidates, some companies focus on schools with high graduation rates.  This choice is different in that companies do not care whether candidates gained their skills at the school or prior to enrolling.  They simply care about the total skills accumulated upon graduation.  This is quite different from using graduation rates to make claims about college effectiveness.
[ii] In some ways this would still be problematic because it means something different to graduate at one institution vs. another because standards vary from school to school.
[iii] This visual was adapted from work done by John Keltz for his Numbers Box blog.  It uses 2013-14 data.  Visit the blog for more info on these schools including Pell Grant eligibility and individual school profiles.
[iv] The model used to predict scores is a linear model for simplicity.  In a small number of schools at the right side of the distribution, this results in predictions that exceed the possible graduation rate of 100%.