How We Crunched The Rankings’ Numbers


Business school deans have a love/hate relationship with rankings. They publicly love them when their schools are rated highly. They privately loathe them when their institutions lag behind. But when it comes to rankings, one thing they pretty much agree on is that the methodologies employed in ranking schools and programs are severely flawed.

We agree. In the world of undergraduate business education, only two rankings have counted: U.S. News & World Report and Bloomberg Businessweek, the latter of which produced its final list this year.

The U.S. News list is merely a subset of data from its overall university rankings. Largely a popularity contest, it is solely based on a poll of deans and senior faculty members, most of whom have little to no knowledge of the programs at rival schools. Asking deans to rate other schools is less a measure of a school’s reputation than it is a collection of prejudices partly based on the self-fulfilling prophecy of U.S. News’ own rankings.

SEVERAL DEANS ASKED US TO CREATE THE RANKING

This year’s final Businessweek ranking is no better. It is largely based on employment outcomes, a standard that would seem reasonable except that the single biggest factor in the methodology is a badly executed survey of corporate employers. Neither survey pays any attention whatsoever to the incoming quality of the students or the full academic experience.

In creating our inaugural ranking of undergraduate business schools, we invested considerable time and effort into creating a well-balanced approach that was both fair and thorough. We equally weight admission standards, the full academic experience, and employment outcomes. Those three components of a business program are critical factors in the quality of the educational experience. Excluding any one of them would result in a disingenuous effort to rank the very best schools.

The methodology, moreover, was developed not in isolation but in collaboration with numerous business school deans and administrators. In fact, we only decided to create this ranking after being asked to do so by several business school deans, frustrated and deeply disappointed by earlier ranking efforts from other organizations.

A METHODOLOGY DEVELOPED IN COLLABORATION WITH B-SCHOOL DEANS

Conference calls were held with deans from many leading business schools. A robust exchange of ideas occurred via a flurry of back-and-forth emails and conversations. Once we put together a draft methodology and questions for both the school and alumni surveys, we shared it with business school administrators in a Google document so that they could openly comment on the draft. Their opinions led to substantial changes in our approach. We’re grateful to those who shared their views because it resulted in a sound and credible inaugural list.

We did not, however, always agree with what the deans proposed. That may well be one reason why not all of the 80 schools we originally approached to create our first top 50 ranking agreed to cooperate with our effort. In many cases, deans sought changes that would have benefited their institutions. For example, several deans believed we should adjust starting salaries by cost-of-living in their geographic areas. But given the many far-flung locales where graduates choose to work, including many international locations, it was nearly impossible to devise a fair system of taking cost of living into account.

Besides, we discovered that contrary to popular belief, the power of a higher education brand often exceeds either geography or industry choice as a factor in compensation. Prestige also results in a degree that more effectively travels, having value far beyond a geographic region. Fordham University may be based in New York City and more of its graduates may go into finance, for example, but the starting pay of its undergrads is significantly below that of many Midwestern schools, including Washington University in St. Louis, the University of Michigan in Ann Arbor, Michigan, or (for that matter) Carnegie Mellon University’s Tepper School in Pittsburgh, or Notre Dame’s Mendoza School and Indiana University, both in Indiana.

ADMISSIONS STANDARDS (33.3%)

A vital factor in judging any higher education effort is the quality of the incoming students. After all, a tremendous amount of learning, both academic and social, occurs as a result of the quality of your classmates. We agree with the old cliché about playing a sport with people who are as good as or better than you. If you play with someone you can always beat, you’ll never reach a level of personal excellence.

We relied on three metrics to measure admissions: The average SAT scores for the latest entering class, given a 35% weight in the admissions category; the percentage of the entering class in 2016 who finished in the top 10% of their high school class, accounting for another 35% of the category; and finally the acceptance rate for the business school program, weighted at 30%.

In several instances, prospective students must pass two hurdles to get into a business school program: First, the university admissions standards and then the business school’s own admissions criteria. We used both acceptance rates to calculate the actual odds of admission for students entering a two-year business program in their junior year.
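
Roughly speaking, the weights and the two-hurdle odds combine as follows. This is an illustrative sketch only, not our production math: the function names are invented here, and each input is assumed to be pre-scaled to a 0-to-1 range.

```python
def combined_acceptance_rate(university_rate, b_school_rate):
    """A student must clear two hurdles: university admissions first,
    then the business school's own admissions criteria."""
    return university_rate * b_school_rate

def admissions_score(sat_norm, top_10_share, acceptance_rate):
    """Admissions category: average SAT 35%, share of class in the top
    10% of high school 35%, acceptance rate 30%. Inputs are assumed to
    be normalized to 0-1; a lower acceptance rate earns more credit."""
    selectivity = 1.0 - acceptance_rate  # reward greater selectivity
    return 0.35 * sat_norm + 0.35 * top_10_share + 0.30 * selectivity

# Illustrative figures: a 45% university admit rate and a 60% b-school
# admit rate yield 27% actual odds of admission.
odds = combined_acceptance_rate(0.45, 0.60)
```

The multiplication reflects that a student admitted to the university still faces the business school’s own screen, so the true odds are lower than either rate alone.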

This data was gathered through a survey sent to some 80 business schools that had ranked among the top 50 on either the U.S. News or Bloomberg Businessweek lists.

ACADEMIC EXPERIENCE (33.3%)

No study of undergraduate business programs would be complete without an assessment of the academic and extracurricular elements of the educational experience. So we sent surveys to alumni of each school to determine how satisfied they were with every level of that experience. Deans suggested that we choose alumni who had been away from their schools for two years. That would give them ample time to road test their education and to assess accurately how well prepared they were for the world of work.

Our survey asked a dozen core questions of graduates, each rated on a one-to-ten scale of satisfaction. For the full list of questions and the graded results, see “2016 Rankings Report Card: How Alums Grade Their Schools.” We also asked alumni whether they had a “signature experience,” defined as a major consulting project, thesis, or other program feature instrumental to their professional development, or a meaningful global immersion. Some 75% of the weight in our academic experience category was based on the answers to the 12 core questions, while 25% was based on whether students had either a signature experience or a global immersion.
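
In schematic terms, the category score combines those two pieces like this. This is a sketch, not our exact formula: mapping the one-to-ten satisfaction scale onto a 0-1 range is an assumption made here for illustration.

```python
def academic_experience_score(core_avg, signature_rate):
    """75% from the average of the 12 core satisfaction questions
    (asked on a 1-10 scale), 25% from the share of alumni reporting a
    signature experience or global immersion (expressed as 0-1)."""
    core_norm = (core_avg - 1.0) / 9.0  # maps 1 -> 0.0 and 10 -> 1.0
    return 0.75 * core_norm + 0.25 * signature_rate
```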

Schools that either declined to allow their Class of 2014 alumni to be surveyed or failed to meet a 10% threshold for the response rate have been given the average scores of peer institutions. Of the 37 schools that allowed us to send our survey to alumni, 28 met the minimum 10% participation rate. Among those schools, 13,605 alumni received our survey and 1,871 completed it for a response rate of 13.8%. The College of New Jersey had the highest response rate, 51.8%, while Minnesota’s Carlson School of Management had the lowest acceptable rate at 10.3%.

EMPLOYMENT OUTCOMES (33.3%)

Students who go to business school expect to get a job not long after graduating. Summer internships are a key way to open the door to a full-time job opportunity. So our employment outcomes category is based on three metrics: the percentage of the latest graduating class (the Class of 2016) to gain jobs within 90 days, weighted at 50%; the average salary and bonus for the latest graduating class, adjusted by the percentage of graduates awarded a bonus, weighted at 30%; and finally, the percentage of the Class of 2016 that had internships before their senior year, weighted at 20%.
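
The arithmetic behind those three metrics can be sketched as follows. This is illustrative only: normalizing pay against a cohort benchmark figure is an assumption, not a description of our actual index calculation.

```python
def employment_score(placed_90d, salary, bonus, bonus_rate, benchmark_pay, interned):
    """Employment category: placement within 90 days 50%; salary and
    bonus, adjusted by the share of grads awarded a bonus, 30%;
    pre-senior-year internships 20%. Rate inputs are 0-1 fractions."""
    # Count the bonus only as often as graduates actually receive one.
    adjusted_pay = salary + bonus * bonus_rate
    pay_norm = min(adjusted_pay / benchmark_pay, 1.0)
    return 0.50 * placed_90d + 0.30 * pay_norm + 0.20 * interned
```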

THE FINAL RANKING

The results of all three categories measured were then combined equally to determine an overall ranking. In each category, index scores were created to credit the size of one school’s lead over another. We publish the numerical ranking with the underlying index scores so that readers can judge how meaningful a school’s rank is relative to the other schools on the list. It’s important to note that there are nearly 700 undergraduate business programs accredited by the AACSB. The 50 business schools singled out in our debut ranking are all extraordinary, each in the top 7% of accredited schools.
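
Schematically, the final step looks like this. The leader-relative scaling shown here is one plausible reading of “index scores,” offered as a sketch rather than our exact computation.

```python
def index_scores(raw):
    """Scale a category so the leading school gets 100 and every other
    school is credited in proportion to its distance from the leader."""
    top = max(raw.values())
    return {school: 100.0 * value / top for school, value in raw.items()}

def overall_score(admissions, academics, employment):
    """Combine the three category indexes with equal (33.3%) weight."""
    return {s: (admissions[s] + academics[s] + employment[s]) / 3.0
            for s in admissions}
```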

Though it was not all that difficult to get data on schools that declined to participate in the ranking, many of these institutions failed to provide timely and full disclosure of information that we consider extremely important for prospective students and their parents. That lack of transparency is unforgivable at a time when these schools are asking students to effectively mortgage their futures by borrowing large sums of money to fund their education. Ultimately, this ranking and the mountains of data we are publishing are an effort to more fully inform prospective students and their parents so they can make the best educational decisions possible.


  • One aspect of the methodology sent to schools deferring participation this year: “For the alumni survey, your school will be given the median scores for the institutions that agreed to cooperate. We will make all of these details clear to our readers.” With only 28 schools meeting the alumni survey requirement (13 opting out entirely), I suspect this means that for nearly half the schools, a third of their final score is not representative.

    • Actually it seems that two groups of alumni scores were arbitrarily assigned for those schools not opting in – “15” for several, and “40” for others. This kind of thing is exactly why some schools opted to see the methods prior to endorsing. The hope was that in seeing a strong method applied, schools might be convinced to participate in future years. This unfortunately seems simply to punish those concerned with methods (while doing nothing to address those concerns).

    • JohnAByrne

      Unfortunately, we believe your school benefited from our use of peer group statistics. Ohio State’s employment outcome numbers were among the lowest we collected. The employment rate of your graduates three months after graduation showed that 28% of your students were still without jobs. That put Ohio State 46th out of our 50 schools. The grads who had jobs three months after commencement reported an average salary of $53,641, which is actually below the average reported by business schools in the National Association of Colleges & Employers’ latest survey and is 43rd out of our 50 schools.

        • Though sadly, this isn’t about any one school. The fact that P&Q refuses to even attempt to scale salary by cost of living domestically (where the majority of students are placed), or by discipline (with full knowledge that different fields pay different base salaries), makes the use of your salary data fundamentally misleading. Any prospective student or parent looking at these figures is being misled. It’s false news. Something we as a society could do with less of. The schools that opted out this year had similar concerns, but P&Q chose to take the low road instead to help demonstrate acceptance of poor assumptions. And even in the case of P&Q’s filling in of the blanks, at least as it has been described to schools, the approach seems to be “Admissions” (not placement) predicts “Alumni satisfaction.” {Your placement figures in the case below are off, by the way.} What is the implied assumption here? And ultimately, if some schools were on the outskirts of a peer range on admissions… why was that still the chosen peer group (so many 15s and 40s assigned… the cutoffs of these peer sets look like nothing more than data manipulation)? We teach students not to manipulate numbers toward priors. Anything else is simply unethical.

        • JohnAByrne

          Our reasons for not adjusting salary are well explained in this story. We strongly believe that salary should not be adjusted because the value of a school’s brand has as much or more impact on those figures as geography or industry choice. Brand, moreover, allows the degree to travel more successfully outside the boundaries of a “regional” school. Rather than manipulate those salary numbers, we prefer to show our readers the actual amount of money a graduate gets in his or her first job. That is entirely ethical. Pretending that brand has no impact on a salary is not.

  • Kenneth G. Brown

    I am disappointed by the language Mr. Byrne uses to close this post — “That lack of transparency is unforgivable…” Quite a few of the schools that opted out (including my own) are state institutions. We provide data to governing bodies within our state and are subject to and respond to open record laws. We work with AACSB to provide industry-wide data. And we already participate in other surveys. Why should we be disparaged for making a strategic decision to not participate in one particular survey? The net costs to participate in yet another ranking, even one that has merits like P&Q, are not negligible. We chose to spend our limited resources in other places, specifically on student services like advising and placement.

    • JohnAByrne

      It’s worth noting that providing information to “governing bodies” is not being transparent with your tuition-paying students or their parents. That information goes into an abyss. The net cost of participating in our survey was very low. Any school that does not have the requested data readily at hand is, to my way of thinking, a poorly managed school. I am curious, however, about what other surveys your business school is completing and what information you are willing to disclose to other organizations.

  • Kate in CA

    If you cannot get an adequate sample, why did you publish this list?

    • JohnAByrne

      Not sure why you think we did not get an adequate sample. We have every confidence that the sample is more than adequate and think this is the best evaluation of undergraduate business education that has ever been made.