The latest episode delves into the intricacies of the U.S. News & World Report’s MBA rankings, a much-anticipated event in the business education landscape. John Byrne, Maria Wich-Vila, and Caroline Diarte Edwards explore the surprising twists, particularly Harvard Business School’s unchanging sixth-place tie with Dartmouth Tuck and NYU Stern. While the rankings are highly influential—three out of four survey respondents acknowledged their impact on perceptions—the hosts question some unusual placements, especially given the high prestige and success rates of institutions like Harvard.
Maria sheds light on the effectiveness of yield rates as a more telling measure of a school’s desirability. Despite Harvard’s sixth-place tie, its yield rate is nearly 90%, closely rivaling Stanford’s, demonstrating its persistent allure. John underscores that yield rates reflect the choices of discerning applicants, despite them being closely guarded by schools. For instance, Wharton’s significant drop in yield compared to Harvard highlights the impact of applicants’ final decisions over traditional rankings.
Caroline provides an insider’s look into how schools manage such figures, with some leveraging scholarship offers to sway decisions. Interestingly, at INSEAD, yield rates aren’t publicized, and the focus remains less on gaming the numbers and more on maintaining quality. This episode underscores a critical insight for MBA hopefuls: rankings offer a snapshot, but factors like yield and school culture ultimately shape real-world reputations.
Episode Transcript
Note: This transcript was generated by AI and may contain minor inaccuracies.
[00:00:06] – John
Hi, everyone. This is John Byrne with Poets & Quants. Welcome to Business Casual, our weekly podcast with my co-hosts, Caroline Diarte Edwards and Maria Wich-Vila. The big news: the U.S. News & World Report MBA ranking is out. Just as surely as spring comes along, so does this ranking. We recently did a survey in which we asked, what ranking is most influential in shaping the perceptions of your school's quality? And 74%, three out of four, said it was U.S. News. Here it is again. Now, in sole possession of the number one spot is Wharton. The bigger news to me, however, is not that Stanford and Kellogg are tied at two. Stanford had been tied with Wharton the previous year. The bigger news is, why does U.S. News hate the Harvard Business School? Harvard languishes in sixth place, exactly where it was last year, but this time it has to share sixth with Dartmouth College's Tuck School of Business and NYU Stern. I can tell you, I was on the NYU Stern campus on Monday and Tuesday of last week when the ranking came out, and I can't tell you how happy people were at Stern about their sixth-place finish, not merely because they landed in a high spot, but more importantly, because they tied with the Harvard Business School.
[00:01:39] – John
Now, Maria, you’re our Harvard expert. What do you make of this?
[00:01:45] – Maria
Why does U.S. News hate Harvard? I don't know. But at the end of the day, look, the proof is in the pudding when it comes to yield rate. For a school that is, quote unquote, only sixth, Harvard tends to get almost 90% of the people who are offered a spot to take them up on that offer. I believe it's either the highest or usually neck and neck with Stanford. You can knock us down all you want, but the proof is in the pudding when students actually have to enroll and submit those deposits and make that choice. The rankings don't really seem to have impacted Harvard too badly in that regard, and so I am not too concerned.
[00:02:33] – John
True. You mentioned yield rates, and I think that's really a good thing to talk about, because after all, yield rates, to a great extent, reflect the wisdom of a very discerning crowd. Yield is the percentage of accepted applicants who actually enroll in the program, meaning they're voting with their feet and choosing one program over another. That's incredibly valuable information that, incidentally, schools track very religiously but never share with the public. You'll never see a yield number in a class profile or on any business school website. But let me tell you what they are. Stanford is number one, but only slightly. Last year, 85.1% of the applicants accepted by Stanford enrolled in the class. Harvard is just a tiny bit below it, at 84.5%. And to Maria's point, the third-place school on yield, among the top 30 at least, is Wharton. Listen to the gap: 57.8% of the applicants accepted by Wharton go. That's 57.8 versus 84.5 for Harvard above it. Big gap. Then what about NYU Stern, and what about Dartmouth Tuck? At Dartmouth Tuck, 34.8% of the accepted applicants go, which also, incidentally, means that over 65% turned them down after being accepted.
[00:04:06] – John
And NYU Stern, 30.9%, meaning that roughly 69 to 70% of the people accepted at Stern actually prefer not to go there. Now, these numbers are managed, and they're managed in the sense that admissions officers will do two things. Caroline can attest to this, having been the head of admissions at INSEAD. You manage the number by throwing scholarship money at people and enticing them to accept your offer, even when they may have been accepted to a higher-ranked school that's not giving them as much money, or no money at all. The other way you manage this number, incidentally, is you basically only accept people who you have a real feeling are going to come. If their stats are way above your school's class stats and you suspect they could turn you down, you basically reject them before you give them the chance to say no to you. Caroline, is that right? Is that how you manage yield?
[00:05:12] – Caroline
Actually, not at INSEAD. INSEAD doesn't publish yield, and so they are not concerned about gaming it in that way. As you know, INSEAD doesn't publish any data on application volume or yield.
[00:05:26] – John
It's a matter of pride.
[00:05:27] – Caroline
What I can say is that it is a very high yield, actually. It's much closer to what you mentioned for Harvard and Stanford than to Wharton's yield. It's definitely a number that they can be proud of. But yeah, I'm sure some schools are doing that when the number is public and when it's such a low number. It must be so frustrating to admit so many candidates knowing that two-thirds of them are going to turn you down. So of course, there's the temptation to say, okay, that candidate, I'm quite sure, is going to go to a rival school, so why waste my offer on them? I'm sure that's a temptation. And we also see that with waitlisted candidates, where schools will sound out a waitlisted candidate to get a sense of whether they might accept the offer before they actually extend it, because they don't want to make lots of offers to waitlisted candidates who then turn them down and drag down their yield. So there definitely are some games going on there. But yes, I agree that it is a very good signal of real perceptions about a school.
[00:06:36] – Caroline
Regardless of the rankings, it reflects how highly a school is regarded by students and by recruiters, and the sense of the long-term value of a program, which is, I think, distinct and separate from the year-to-year fluctuations of a ranking.
[00:06:56] – John
The other thing about yield, and why you could even argue that it's a great proxy for a ranking in and of itself, is, look: Stanford, Harvard, and Wharton are at the top. Isn't that what we always think of in terms of the top US MBA programs? But here's another back-of-the-envelope exercise you can do that's a lot of fun. You can rank the schools by yield and then compare that rank with their U.S. News ranking. You can make an argument that some schools are overvalued and other schools are undervalued. One thing that pops out to me as the most undervalued school, based on this analysis, is Duke Fuqua. Duke is number four on yield, right behind Wharton. In fact, its yield number is almost equal to Wharton's; it's only 0.4 off. 57.4% of the applicants accepted at Duke take up the offer and enroll. I've got to say that is a tribute to Shari Hubert, the head of admissions there, who is definitely doing something right, because that is a big surprise. That is 10 places up from the U.S. News rank in terms of the yield numbers, which is incredible.
[00:08:17] – John
And then if you look the other way, what's one school where there's a 10-position gap between their U.S. News ranking and where they rank on yield? It would be Northwestern Kellogg. Kellogg's yield rate is 35.1%, which puts them 12th on the list of the top 30, while they're number two overall in the U.S. News ranking. Anyway, it's an interesting, fun exercise to do. If you look at our story on the 10 biggest surprises in the U.S. News ranking, you'll find it. Maria, any other observations about this year's ranking?
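The back-of-the-envelope exercise John describes can be sketched in a few lines of Python. The yield figures are the approximate ones quoted in the episode; Duke's U.S. News rank of 14 is an assumption inferred from the "10 places up" remark, and the yield rank here is computed only among the seven schools mentioned, so it differs from the top-30 yield ranks John cites (Kellogg is 12th among 30, but 5th in this short list).

```python
# Mini version of the exercise: rank schools by yield, then compare
# with their U.S. News rank. Yield figures are the approximate ones
# quoted in the episode; Duke's U.S. News rank of 14 is an assumption.
yields = {  # % of admitted applicants who enroll
    "Stanford": 85.1, "Harvard": 84.5, "Wharton": 57.8, "Duke Fuqua": 57.4,
    "Kellogg": 35.1, "Dartmouth Tuck": 34.8, "NYU Stern": 30.9,
}
usnews = {"Wharton": 1, "Stanford": 2, "Kellogg": 2, "Harvard": 6,
          "Dartmouth Tuck": 6, "NYU Stern": 6, "Duke Fuqua": 14}

# Sort descending by yield to produce a yield-based rank (1 = highest yield)
yield_rank = {school: i + 1
              for i, (school, _) in enumerate(
                  sorted(yields.items(), key=lambda kv: -kv[1]))}

for school in yields:
    # Positive gap = the school does better on yield than on the ranking,
    # i.e. it looks "undervalued" by U.S. News on this crude measure
    gap = usnews[school] - yield_rank[school]
    print(f"{school:15s} yield rank {yield_rank[school]:2d}  "
          f"U.S. News {usnews[school]:2d}  gap {gap:+d}")
```

On these numbers, Duke Fuqua shows the largest positive gap, matching John's observation that it is the most undervalued school by this measure.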
[00:08:58] – Maria
Well, I think one thing that's interesting is that, if I'm not mistaken, they modified the methodology slightly to actually compare salaries by profession. What they did is they took the graduates and divided them into buckets: are you doing consulting, are you doing operations, et cetera. Then they compared the salaries across those buckets. I think that is a very welcome change, because in the past, looking only at that salary number would punish schools that might have more of a social enterprise focus, and it would overly reward schools that had a Wall Street investment banking focus. I was pleased to see that. One thing I wish U.S. News would do, if they ever make a methodology change, and admittedly I don't think this was a major one, but it was still a change nonetheless: in years when they make a methodology change, I would love for them to also publish an unofficial list of, okay, using the old methodology, what would the list be this year? Because when they change the methodology, and therefore there may be some resultant swings one way or the other, I do worry that it could shape or tarnish an applicant's opinion of a school, perhaps unfairly.
[00:10:18] – Maria
If a school, for whatever reason, suddenly drops seven or 10 places, right now, we are recording this at the beginning of April, which is when a lot of students are deciding where to enroll and this year's applicants are deciding where to apply. I wish they would have an asterisk or a footnote that says, by the way, if we had kept the methodology the same, this school that dropped eight places maybe didn't really drop eight places, because I think these rankings come out at a very sensitive time in the life cycle of many applicants.
[00:10:51] – John
Yeah, that's a really good point. The other thing worth noting, and this is important, is that how rankings turn out is entirely a function of what gets measured, and also of what doesn't get measured. What U.S. News measures is heavily weighted toward placement and pay. Half of the ranking is based on those two factors. They look at employment rates at graduation and three months after graduation, and then they look at average starting salary and bonus in the latest graduating class. That alone is 20%. Then the point that Maria was making about average salary by profession is 10%, but fully 50% of the entire ranking is all about placement and pay. That's actually where Harvard Business School fell behind rivals. You can argue how legit this is, but it is based on data provided by the schools. Starting pay for Harvard MBAs, that would be salary and sign-on bonus primarily, was $13,450 below Stanford MBAs, who earned the most at $206,955. At Harvard, it was $193,505. That is surprising to me. But when you look into the data, what you find is that Stanford puts more people in private equity and venture capital as a percentage of the class than any other school.
[00:12:26] – John
Those two fields pay among the highest salaries for MBAs. Then you look at employment, and this is something we covered in a previous podcast. Employment rates for Harvard MBAs were among the worst of any top MBA program: 78.4% three months after graduation, more than a dozen percentage points below Dartmouth Tuck, which was at 90.7%. Now, these rates were lower than they had been for a number of years, due to the uncertain economy last year and the pullback in hiring by consulting and tech firms. But it seemed to affect Harvard more than others. We think it's because Harvard MBAs left jobs that paid them a lot more money than many of the MBAs at other schools. They tend to be very choosy and picky about where they go and what jobs they want. Many of the jobs they want are not jobs where companies come to campus and hire by the boatload. These are one-off and two-off placements at PE firms, VC firms, hedge funds, and other wealth management positions, where a company doesn't come in and just hire 20 to 50 MBAs in one fell swoop. That's why we think those numbers are lower at Harvard.
[00:13:53] – John
Then in a ranking like this, which puts so much emphasis on pay and placement, Harvard gets hurt. Caroline, any of your thoughts on some of the results of this ranking?
[00:14:06] – Caroline
Yeah, I mean, I think you can go back to Harvard and pick through the reasons why Harvard ranked slightly lower, and I think you can explain all of them, as you said. The fact that their employment rate is lower doesn't mean that Harvard MBAs have worse opportunities than graduates coming out of other schools. It's just that they are not taking the same opportunities; they're holding out a bit, and they have the confidence to do that. I think that's definitely quite clear. And recruitment in some of the industries that you mentioned, for example private equity and venture capital, has been quite weak over the past year, so it's probably taking them longer to find those positions than it might have a year earlier. And then starting pay. I mean, poor guys, they're rubbing along on $194,000 a year. I don't know how they manage. But okay, Stanford MBAs earn a bit more. Well, I know that a huge chunk of a Stanford class is staying here in the Bay Area, where the cost of living is extraordinarily high, and many of them are going to work at Silicon Valley companies and in PE and venture capital, where they get great salaries and sign-on bonuses, and probably a much higher percentage of the class is doing that than at HBS.
[00:15:30] – Caroline
At the GSB, it's a much smaller class. At HBS, I'm sure there's more diversity in the positions that students take when they graduate compared to the positions that GSB students take. I don't believe for a moment that someone coming out of HBS and going into private equity or venture capital is earning less than someone coming out of the GSB going into the same industry. It's just that I think there's more diversity in the positions that HBS graduates take up.
[00:15:59] – John
Yeah, a really good point. The geography thing does matter. I mean, NYU Stern's pay, for example, in New York, was over 200 grand, about 7,000 more than Harvard. Dartmouth Tuck is a real consulting school, and consulting firms pay a heck of a lot of money, usually 175 to start and a $35,000 sign-on bonus. Tuck's pay was also over 200,000 to start, about 7,000 above Harvard, because of the vast numbers of people who go straight into consulting. And that Tuck alumni network, I've got to admit, is one of the best in the world. Those people are really loyal to each other, and that network helps greatly, even when consulting is pulling back on some of its hiring. You can argue whether one school should be higher or lower than another, and you can have all kinds of fights over bragging rights. But one thing I do like about this ranking, frankly, is the data that comes out. It's pretty standardized data, so you can compare it across schools. I think they ask for the data in a fairly tight way, so schools can't interpret it in a way that benefits them.
[00:17:24] – John
The data is probably the cleanest data you'll find that has comparative value. That goes for admissions statistics, the standardized test scores, GPAs, and of course the salary and bonus figures, the placement rates, and the acceptance rates. Incidentally, even though salaries were mostly lower and placement was lower in the year measured, which was last year, what you found is that acceptance rates were also lower, so it was harder to get into any of these schools. At Stanford, it was 6.8%. At Harvard, it was 11.2%. At MIT Sloan, it was 14.1%. Those are the lowest acceptance rates for any of the top schools. They're pretty impressive rates that show a high degree of selectivity, which is why, in fact, so many people employ folks like Caroline and Maria to help them get in. Because you've got to know that, let's say at Harvard, where 11% of the people are getting in and 89% are not, there are a lot of people in that 89% getting rejected who are fully qualified to attend. They're exceptional candidates who are as good as, or in some cases even better than, some of the people getting in, because there is a randomness to elite MBA admissions that you can't really account for.
[00:19:01] – John
But it does put all of this into some perspective. Maria, don't you think? I mean, you counsel candidates and you see incredible quality. I bet there are times when you scratch your head because they get dinged and turned away. You've probably thought, oh my God, this is as good a candidate as I've ever seen apply to Harvard or Stanford or Wharton.
[00:19:27] – Maria
Absolutely. At the end of the day, it's a human process run by humans who are evaluating other humans. There's quite a bit of subjectivity that goes into it, and at certain points there will be an element of what appears to be randomness. That happens, I think, in any situation, whether it's job hunting or even dating. At a certain point, some of it is going to come down to luck. I wanted to quickly touch on the great point that Caroline made earlier about the cost of living. It was interesting: if you look at some of the lower-ranked schools, say those ranked 10 to 15 or 10 to 20, they do tend to be more in the South. I can't help but wonder, and maybe if we start throwing this out there U.S. News might pick it up: since they have started to divvy up the salaries into buckets by what people are doing, maybe in subsequent years they could also divvy up the salaries by geography. The schools will often publish, perhaps not super granular statistics, but rough areas where graduates end up after graduation.
[00:20:35] – Maria
Similar to how the Financial Times tries to introduce an element of purchasing power parity into their salary calculations, maybe that's something U.S. News could do next: looking at, roughly, okay, if 35% of, let's say, Vanderbilt Owen's class, and I'm making up that number, I actually don't know, but let's say a certain chunk ends up in the Southeast, well, the cost of living in a place like Atlanta or Nashville is going to be lower. Are those salary numbers accounting for that? I think that would be an interesting thing for them to dissect. One other nit to pick: the GPA counts for 10% of their calculation, and the GMAT or GRE test score is 13%. They are saying that the GPA is essentially identical in weight to the standardized test score, which I've always felt is a little bit unfair, because if there's one thing that isn't standard, it's GPAs. GPAs vary tremendously from one undergraduate institution to another and from one major to another. A nuclear engineer from the Naval Academy with a 3.2 perhaps underwent much more rigorous undergraduate training than someone who studied something else at a less rigorous institution.
[00:21:50] – Maria
I would also love for them to, and I don't think this would be a ton of extra work: at a minimum, we know what the incoming class majored in; that is a statistic that is freely available. Maybe they could ask for GPAs by major somehow. That way, if you are admitting people from science or engineering backgrounds who maybe didn't have as high a GPA, we're not going to punish the schools for accepting people who majored in something where there wasn't as much of a curve, et cetera.
[00:22:21] – John
Yeah, that’s a legitimate point for sure. Caroline, looking at this ranking, do you think some schools got a raw deal?
[00:22:29] – Caroline
Well, I think that when there are big leaps, either upwards or downwards, it doesn't actually reflect any real change in what's going on at the school. To me, what is interesting is the longer-term trend in how schools do, not this shuffling around year by year. And then I think the data is very useful, as you said. That's a really useful source for candidates to look at. But I do feel sorry for schools that have experienced drops of 20 places, because I know from my experience at INSEAD that people, stakeholders, get very upset when these things happen. INSEAD, of course, cares a lot about the FT ranking, which carries more weight in the international market. We don't even appear in the U.S. News ranking, because U.S. News has a very US-focused view of the world. But if INSEAD goes down a bit in the FT ranking, then suddenly the alumni are upset and sending messages to the dean and the board, and the students are concerned that the value of their degree is going to plunge. So people do get upset about these things, and it's very hard on the administrators who have to field those questions and explain what's happened.
[00:23:50] – Caroline
Then it'll shuffle around again the next year. It is quite stressful for the schools to go through those ranking swings, and there's not much they can do about it. Often, as you also point out in your article, there's very little to choose between many of the schools on a lot of these data points; they're often very, very close. A small shift either way in any individual element can mean you go up or down 10 places, which sounds like quite a dramatic change in the value you offer to students, but in reality there is no fundamental change. Unfortunately, I think the rankings carry more weight than they should. I feel sorry for the people who are having to manage the reaction of the market right now if they've gone down in this particular ranking.
[00:24:49] – John
That's true. Small differences have an outsized influence on the ranking because the data is so closely clustered and the differences are statistically meaningless, to underline the point Caroline just made. I think there are two schools in particular that got a real raw deal in this ranking, and I want to call them out. I think UNC Chapel Hill's Kenan-Flagler Business School got a really raw deal. They're down eight places to 28. They dropped out of the top 25. Traditionally, they're in the top 20: they were 19 in '22, 20 in '24, and 21 in '20, and now they're 28. That is an exceptional program, well run, well managed, with great faculty and a lot of smart bets placed on things like real estate and investment banking. That is a ranking change that I think really does a disservice to the school and to the business school community. Not having analyzed the numbers, I can't tell you exactly why they dropped eight places, but I can tell you that this is a school that is definitely in the top 20 year after year and should remain in the top 20.
[00:26:16] – John
The other school that I think got a raw deal here is UC Berkeley. UC Berkeley is almost always in the top 10. It dropped four places: it was seventh last year, and this year it's 11th. So it dropped right out of the top 10, where I believe it deserves to be. It was number seven, incidentally, in 2020, number seven in '21, and number eight in '22. As I mentioned, last year it was seven. The one aberration was two years ago, when it fell to 11. But Berkeley is getting a raw deal in this ranking as well. And of course, Harvard is too, as we pointed out. Harvard is not a sixth-place school. And you could use different metrics to rank the schools, like yield, or like the size of a school's endowment, which obviously gives a school far greater resources to attract the best faculty and the best students, and, more than that, funds the infrastructure that underlies a program. Endowment is never measured. Another thing that I think counts for a lot is a school's generosity in offering scholarship aid to its students. That is not included. If it were, no one could beat Harvard, because Harvard is the most generous school in the world to its MBA students.
[00:27:46] – John
We're talking tens of millions of dollars given to their MBA students every single year. No school comes close in grants per student. If you included yield, endowment, and scholarship funds, the whole ranking would change, obviously. Again, this underlines the point that what gets measured determines what the result turns out to be. And what gets measured is basically decided by a few editors who have little to no knowledge of business education, who have been to few, if any, business school campuses, and who have been exposed to few, if any, MBA graduates and students. It's a bunch of people in a room making decisions that are, frankly, not all that informed by reality. That's how rankings get done, sadly. Maria, would you agree?
[00:28:46] – Maria
I absolutely agree. As we've said many times before on this podcast, these rankings are an interesting starting point. They're a great source of data, but they should not be driving your ultimate decision. Your mention of UNC was absolutely right. I also noticed that, and I thought, oh, it's shocking and such a disappointment, because I think Kenan-Flagler is a far, far superior program to what its showing in the rankings this year would indicate. It's really interesting, because I had a client reach out to me who is deciding between UNC and another school that in previous years was neck and neck with it in the rankings. They were in a panic, and they were like, oh my gosh, does this mean anything? Should I turn UNC down and go to this other school? It was an almost surreal conversation for me, because, let's say they're interested in real estate. They're not, but let's just say for the sake of argument. I'm like, okay, UNC has a major in the thing you want to do. When you look at the career report, something like three and a half times the percentage of the class goes into the field you want to go into compared with this other school.
[00:29:53] – Maria
It is clearly the superior program for you in terms of the classes and the career outcomes. Everything about it is clearly superior. So please do not let this blip sway you. I don't know what happened, but I would hope that next year UNC will be in a ranking placement more commensurate with its quality. This specific example is literally why I said earlier that I wish they would put a side-by-side explanation out there when a school does drop, because we are at a time of year when people are making these decisions. It would be a real shame if this person let this blip in the rankings funnel them to a school that candidly does not have the same focus or the same resources in their very specific niche interest.
[00:30:47] – John
Really true. That's the problem. People do make these decisions based on these rankings, as opposed to what's the best fit for me. Instead of using them as a starting point, just to understand the landscape of options and then to explore those options, people say, okay, I got into this school, it's ranked higher, so therefore I'm going there, which is the wrong thing to do. To your point about the ups and downs, it's almost inevitable that the schools that increased the most in one year decreased the most in the previous year. I'll give you a good example. The school whose MBA program rose more than any other this year is the Kogod School of Business at American University. Their full-time MBA program jumped 27 places, more than any other, to rank 58th this year. But last year, they were 85th. If you look a little longer term, two years ago they ranked 122nd. In '22, they ranked 76th. Five years ago, in 2020, they ranked somewhere between 99 and 131; they didn't even have an actual numerical rank. It shows you that it's all over the place. And yes, the school has done better.
[00:32:11] – John
Its peer assessment survey improved; they got a better rating from their peers. They had better employment rates three months after graduation, in fact better than Harvard, Stanford, MIT, Columbia, and Yale. They made some high-profile faculty hires and had some record fundraising, and that helped the peer assessment survey. But come on: up 27 places in one year, after being 85th the previous year and 122nd the year before that? You've got to know that the actual experience of a student at that school didn't change all that much over the last two or three years. You really do have to take these rankings with a very large grain of salt. Caroline, last words on this. I know you prefer the Financial Times ranking because it's a true global ranking, right?
[00:33:04] – Caroline
I do, but I think Maria has something she wanted to add first.
[00:33:08] – Maria
I just had a funny observation: the 27-place jump for Kogod is actually more than, I believe, the number of students they have per year in their program. I think they have something like 25 students a year. That's an interesting observation, right?
[00:33:26] – John
Right.
[00:33:27] – Maria
Look, I love American University. I grew up in Baltimore, so I love the DC schools. But if you've only got 25 kids a year and you're trying to navigate these rankings in a certain way, I think it's a lot easier to engineer the 25 people you accept every year with an eye to what the rankings are looking for. When there's such a small sample, it's not even statistically significant; isn't 30, I think, the minimum number? With such a small sample size, I suspect the smaller programs are able to jump higher or lower pretty dramatically, because one person not getting hired all of a sudden has a huge impact on those statistics.
[00:34:19] – John
Yeah, and that's another interesting point, right? How do you legitimately account for the big schools that graduate 700, 800, 900 to a thousand MBAs a year versus those that have fewer than 50 graduates? It's an oddball thing, because you're really dealing with apples and oranges, and it really is unfair to make some of these comparisons. But it is what it is. If you want to see the full results and our analysis, take a look at our story. The headline on it is Wharton Claims Sole Possession of First in the U.S. News MBA Ranking. We also have analysis of the part-time MBA ranking, where there was a three-way tie: Kellogg, Berkeley, and NYU Stern topped that part of the ranking. Then there are also the specializations, where schools are ranked by discipline, whether it be finance, accounting, marketing, operations, and whatnot. We have a story on that, as well as on the 10 biggest surprises in the ranking, where we highlight some of the weird shocks and some of the valuable insights that may not be apparent just by looking at where a school ranks. Well, thanks for listening. This is John Byrne with Poets & Quants.
