What Is The Challenge Index And How It Came To Be

Oct 4, 2018


The list for America’s Most Challenging High Schools began in 1998 as a way to dramatize a deep, unreported flaw I had discovered in the way high schools treat average students.

Jay Mathews

I had written a book about Garfield High School in East Los Angeles where two teachers, Jaime Escalante and Ben Jimenez, had persuaded as many students as they could to take Advanced Placement Calculus. Those students had no better than average grades and SAT scores. Yet in 1987, Garfield produced 27 percent of all Mexican-Americans in the country who passed an AP Calculus exam.

I learned that what those two teachers did was revolutionary. Most schools told average students they weren’t good enough for AP. Yet, I had seen such students — even impoverished ones — do well on AP exams and go on to college success when given extra time and encouragement to learn.

I decided one way to draw attention to this issue was to rank high schools in a new way, which I called the Challenge Index. Instead of measuring schools by standardized test scores, the usual method, I tried ranking them by their success in getting less-than-stellar students into the most challenging courses and tests — AP, International Baccalaureate and Cambridge. I used a simple ratio: the number of tests given in May divided by the number of seniors who graduated in May or June.
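The ratio is simple enough to sketch in a few lines of code. The function name and the numbers in the example are illustrative, not drawn from the article; the qualifying threshold of 1.0 reflects the rule, described below, that a school makes the list when it gives at least as many tests as it has graduating seniors.

```python
def challenge_index(tests_given: int, graduating_seniors: int) -> float:
    """Challenge Index: college-level tests (AP, IB, Cambridge) given in May,
    divided by the number of seniors graduating in May or June."""
    if graduating_seniors <= 0:
        raise ValueError("graduating_seniors must be positive")
    return tests_given / graduating_seniors


# Hypothetical school: 1,200 tests given, 400 graduating seniors.
ratio = challenge_index(1200, 400)
print(ratio)                 # 3.0
print(ratio >= 1.0)          # True -- this school would qualify for the list
```

A ratio of at least 1.0 means the school, on average, gave every graduating senior at least one college-level test.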

This is a way to identify which schools are most challenging, not which are best. “Best” is a word usually too hard to qualify, or too subjective.

In 1998, I listed all the schools I could find that gave at least as many AP tests as they had graduating seniors in 1996. There were only 243 of them, just 1 percent of all U.S. high schools. Newsweek listed the top 100 schools in the magazine. I continued to do that in Newsweek until 2010 when The Washington Post Co. sold Newsweek and I moved the annual list to The Post newspaper. In 2012, I added a sampling of as many private schools as I could find that provided the data.

Nearly all researchers on Advanced Placement agree that high school students who pass AP tests do better in those subjects in college than students who did not pass AP tests or did not take them at all. A 2013 study by College Board researchers Krista D. Mattern, Jessica P. Marini and Emily J. Shaw, based on a sample of 678,305 students, found that “regardless of what score was earned on the AP Exam, students who took an AP Exam were more likely to graduate in four years or fewer than students who took no AP Exams.” So even those who got the lowest grade, a 1, on an AP exam were more likely to complete college in four years, buttressing what many AP teachers and students have told me: AP makes students better prepared for college.

But a student must take an AP exam to have an effective AP experience. Taking the exam, written and graded by outside experts, makes it more likely that both the teacher and the students will take the course seriously. Students who take an AP course but not the exam do no better in college than similar students who did not take AP at all. That is why my list measures exam participation, not course participation.

Over the years, the list has revealed the startling impact of the growing number of teachers giving average students college-level classes and tests. Only 1 percent of U.S. schools made the first list in 1998. That number has increased to 12 percent, and I am certain it will go higher.

Jay Mathews is an education columnist for The Washington Post, his employer for nearly 50 years. He is the author of nine books, including five about high schools. His 2009 book “Work Hard. Be Nice.” about the birth and growth of the KIPP charter school network was a New York Times best-seller. He created and supervises the annual Challenge Index rankings known as America’s Most Challenging High Schools.