How We Crunched The Rankings’ Numbers


Business school deans have a love/hate relationship with rankings. They publicly love them when their schools are rated highly. They privately loathe them when their institutions lag behind. But when it comes to rankings, one thing they pretty much agree on is that the methodologies employed in ranking schools and programs are severely flawed.

We agree. In the world of undergraduate business education, only two rankings have really counted: U.S. News & World Report and Bloomberg Businessweek, the latter of which produced its final list this year.

The U.S. News list is merely a subset of data from its overall university rankings. Largely a popularity contest, it is solely based on a poll of deans and senior faculty members, most of whom have little to no knowledge of the programs at rival schools. Asking deans to rate other schools is less a measure of a school’s reputation than it is a collection of prejudices partly based on the self-fulfilling prophecy of U.S. News’ own rankings.

SEVERAL DEANS ASKED US TO CREATE THE RANKING

This year’s final Businessweek ranking is no better. It is largely based on employment outcomes, a standard that would seem reasonable except that the single biggest factor in the methodology is a badly executed survey of corporate employers. Neither ranking pays any attention to the incoming quality of the students or to the full academic experience.

In creating our inaugural ranking of undergraduate business schools, we invested considerable time and effort into creating a well-balanced approach that was both fair and thorough. We equally weight admission standards, the full academic experience, and employment outcomes. Those three components of a business program are critical factors of the quality of the educational experience. Excluding any one of them would result in a disingenuous effort to rank the very best schools.

The methodology, moreover, was developed not in isolation but in collaboration with numerous business school deans and administrators. In fact, we only decided to create this ranking after being asked to do so by several business school deans who were frustrated and deeply disappointed by earlier ranking efforts from other organizations.

A METHODOLOGY DEVELOPED IN COLLABORATION WITH B-SCHOOL DEANS

Conference calls were held with deans from many leading business schools. A robust exchange of ideas occurred via a flurry of back-and-forth emails and conversations. Once we put together a draft methodology and questions for both the school and alumni surveys, we shared the draft with business school administrators in a Google document so that they could comment on it openly. Their opinions led to substantial changes in our approach. We’re grateful to those who shared their views because the result is a sound and credible inaugural list.

We did not, however, always agree with what the deans proposed. That may well be one reason why not all of the 80 schools we originally approached to create our first top 50 ranking agreed to cooperate with our effort. In many cases, deans sought changes that would have benefited their institutions. For example, several deans believed we should adjust starting salaries for the cost of living in their geographic areas. But given the many far-flung locales where graduates choose to work, including many international locations, it was nearly impossible to devise a fair system for taking cost of living into account.

Besides, we discovered that, contrary to popular belief, the power of a higher education brand often exceeds either geography or industry choice as a factor in compensation. Prestige also makes a degree travel better, retaining its value far beyond any one geographic region. Fordham University may be based in New York City, and more of its graduates may go into finance, for example, but the starting pay of its undergrads is significantly below that of many Midwestern schools, including Washington University in St. Louis, the University of Michigan in Ann Arbor, and (for that matter) Carnegie Mellon University’s Tepper School in Pittsburgh, as well as Notre Dame’s Mendoza College of Business and Indiana University, both in Indiana.

ADMISSIONS STANDARDS (33.3%)

A vital factor in judging any higher education effort is the quality of the incoming students. After all, a tremendous amount of learning, both academic and social, occurs as a result of the quality of your classmates. We agree with the old cliché about playing a sport with people who are as good as or better than you: if you only ever play someone you can beat, you’ll never reach a level of personal excellence.

We relied on three metrics to measure admissions: the average SAT score for the latest entering class, given a 35% weight in the admissions category; the percentage of the entering class in 2016 who finished in the top 10% of their high school class, accounting for another 35% of the category; and finally the acceptance rate for the business school program, weighted at 30%.

In several instances, prospective students must pass two hurdles to get into a business school program: First, the university admissions standards and then the business school’s own admissions criteria. We used both acceptance rates to calculate the actual odds of admission for students entering a two-year business program in their junior year.

This data was gathered through a survey sent to some 80 business schools that had ranked among the top 50 on either the U.S. News or Bloomberg Businessweek lists.
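For readers who want to see the arithmetic, here is a simplified Python sketch of how the admissions category can be scored. The 35/35/30 weights and the use of the combined acceptance rate come from the description above; the min-max scaling of each metric to a 0-to-100 index is only an illustrative assumption, not the exact normalization behind our published index scores.

```python
# Illustrative sketch of the admissions category (35/35/30 weights).
# The combined acceptance rate follows the two-hurdle logic described above;
# the min-max scaling to a 0-100 index is an assumption for illustration.

def normalize(value, worst, best):
    """Scale a raw metric to a 0-100 index between a worst and best value."""
    return 100 * (value - worst) / (best - worst)

def admissions_score(avg_sat, pct_top10, univ_accept_rate, bschool_accept_rate,
                     sat_range=(1000, 1600)):
    # Two-hurdle programs: the actual odds of admission are the product of
    # the university's acceptance rate and the business school's own rate.
    combined_accept = univ_accept_rate * bschool_accept_rate

    sat_idx = normalize(avg_sat, *sat_range)   # average SAT score
    top10_idx = pct_top10                      # % from top 10% of HS class
    select_idx = 100 * (1 - combined_accept)   # lower acceptance = more selective

    return 0.35 * sat_idx + 0.35 * top10_idx + 0.30 * select_idx

# Example: 1400 average SAT, 60% of the class from the top 10% of their high
# school class, a 30% university acceptance rate, and a 50% business school
# rate (combined odds of admission: 15%).
print(round(admissions_score(1400, 60, 0.30, 0.50), 1))  # -> 69.8
```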

ACADEMIC EXPERIENCE (33.3%)

No study of undergraduate business programs would be complete without an assessment of the academic and extracurricular elements of the educational experience. So we sent surveys to alumni of each school to determine how satisfied they were with every aspect of that experience. Deans suggested that we choose alumni who had been away from their schools for two years. That would give them ample time to road-test their education and offer an accurate assessment of how well prepared they were for the world of work.

Our survey asked a dozen core questions of graduates, each rated on a one-to-ten scale of satisfaction. For the full list of questions and the graded results, see “2016 Rankings Report Card: How Alums Grade Their Schools.” We also asked alumni whether they had a “signature experience,” defined as a major consulting project, thesis, or other program feature instrumental to their professional development, or a meaningful global immersion. Some 75% of the weight in our academic experience category was based on the answers to the 12 core questions, while 25% was based on whether students had either a signature experience or a global immersion.

Schools that either declined to allow their Class of 2014 alumni to be surveyed or failed to meet a 10% threshold for the response rate have been given the average scores of peer institutions. Of the 37 schools that allowed us to send our survey to alumni, 28 met the minimum 10% participation. Among those schools, 13,605 alumni received our survey and 1,871 completed it for a response rate of 13.8%. The College of New Jersey had the highest response rate, 51.8%, while Minnesota’s Carlson School of Management had the lowest acceptable rate at 10.3%.
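The same kind of arithmetic applies to the academic experience category. In the sketch below, the 75/25 split comes from the methodology described above, while the rescaling of the one-to-ten survey answers and the signature-experience share onto a common 0-to-100 index is only an illustrative assumption.

```python
# Illustrative sketch of the academic experience category (75/25 split).
# Schools that declined the survey or fell below the 10% response-rate
# threshold instead receive the average score of peer institutions.

def academic_experience_score(core_question_means, signature_share):
    """core_question_means: average 1-10 satisfaction score on each of the
    12 core questions; signature_share: fraction of alumni reporting a
    signature experience or global immersion (0.0-1.0)."""
    core_avg = sum(core_question_means) / len(core_question_means)
    core_idx = 100 * (core_avg - 1) / 9      # rescale 1-10 onto 0-100
    signature_idx = 100 * signature_share    # share reporting the experience

    return 0.75 * core_idx + 0.25 * signature_idx

# Example: a school averaging 8.2 across the 12 core questions, with 70% of
# respondents reporting a signature experience or global immersion.
print(round(academic_experience_score([8.2] * 12, 0.70), 1))  # -> 77.5
```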

EMPLOYMENT OUTCOMES (33.3%)

Students who go to business school expect to get a job not long after graduating. Summer internships are a key way to open the door to a full-time job opportunity. So our employment outcomes category is based on three metrics: the percentage of the latest graduating class (the Class of 2016) to gain jobs within 90 days, weighted at 50%; the average salary and bonus for the latest graduating class, adjusted by the percentage of graduates awarded a bonus, weighted at 30%; and finally, the percentage of the Class of 2016 that had internships before their senior year, weighted at 20%.
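A simplified sketch of the employment outcomes category follows. The 50/30/20 weights come from the description above; the way the salary-and-bonus figure is adjusted by the share of graduates receiving a bonus, and the pay range used to index it, are illustrative assumptions only.

```python
# Illustrative sketch of the employment outcomes category (50/30/20 weights).
# The bonus adjustment and the pay range used for indexing are assumptions.

def employment_outcomes_score(pct_employed_90_days, avg_salary, avg_bonus,
                              pct_with_bonus, pct_interned,
                              pay_range=(40_000, 90_000)):
    # Adjust pay by the share of graduates actually receiving a bonus,
    # then scale it to a 0-100 index against an assumed pay range.
    adjusted_pay = avg_salary + avg_bonus * pct_with_bonus
    low, high = pay_range
    pay_idx = 100 * (adjusted_pay - low) / (high - low)

    return (0.50 * pct_employed_90_days   # % with jobs within 90 days
            + 0.30 * pay_idx              # bonus-adjusted pay index
            + 0.20 * pct_interned)        # % with internships before senior year

# Example: 92% employed within 90 days, a $62,000 average salary, an $8,000
# average bonus received by 55% of graduates, and 85% with prior internships.
print(round(employment_outcomes_score(92, 62_000, 8_000, 0.55, 85), 1))  # -> 78.8
```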

THE FINAL RANKING

The results of all three categories were then combined with equal weight to determine an overall ranking. In each category, index scores were created to reflect the size of one school’s lead over the others. We publish the numerical ranking along with the underlying index scores so that readers can see how meaningful the gap between any two schools on the list actually is. It’s important to note that there are nearly 700 undergraduate business programs accredited by the AACSB. The 50 business schools singled out in our debut ranking are all extraordinary, each among roughly the top 7% of accredited schools.
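As a rough illustration of that final step, the sketch below averages the three category index scores with equal weight and sorts the schools on the result; rescaling the leader to a 100-point index for presentation is an illustrative assumption, and the school names are hypothetical.

```python
# Illustrative sketch of the final step: the three category index scores are
# weighted equally, schools are sorted on the combined score, and the leader
# is rescaled to a 100-point index for presentation (an assumption here).

def final_ranking(schools):
    """schools: dict mapping school name -> (admissions, academic, employment)
    index scores. Returns (rank, name, index) tuples, best school first."""
    combined = {name: sum(scores) / 3 for name, scores in schools.items()}
    top = max(combined.values())
    ordered = sorted(combined.items(), key=lambda kv: kv[1], reverse=True)
    return [(rank, name, round(100 * score / top, 1))
            for rank, (name, score) in enumerate(ordered, start=1)]

# Example with three hypothetical schools.
example = {
    "School A": (88.0, 81.5, 90.2),
    "School B": (75.4, 86.0, 82.1),
    "School C": (92.3, 70.8, 79.5),
}
for rank, name, index in final_ranking(example):
    print(rank, name, index)
```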

Though it was not all that difficult to get data on schools that declined to participate in the ranking, many of these institutions failed to provide timely and full disclosure of information that we consider extremely important for prospective students and their parents. That lack of transparency is unforgivable at a time when these schools are asking students to effectively mortgage their futures by borrowing large sums of money to fund their education. Ultimately, this ranking and the mountains of data we are publishing are an effort to give prospective students and their parents the fullest possible information to make the best educational decisions they can.