Perceptions of Prestige: It Doesn’t Always Matter Where You Go to College

When counselors task students with crafting a preliminary college list, many factors come into play, and students weigh those considerations differently. Size, location, programs offered, internship opportunities, study abroad options, cost, diversity, services offered, acceptance rate: all of these characteristics help students filter colleges into a more manageable list. Any counselor can tell you that they’ve seen college lists with upwards of 25 schools; however, the hope is that students will ultimately apply to a balanced list of 7–10. Narrowing that list by the criteria most important to the student can be difficult. Amid all of those filters, one looming factor remains prevalent in many students’ and families’ list-building processes: prestige, reputation, rank.

Somehow, over decades of college counseling and admissions operations, rank and perceived prestige have become occasionally and unnecessarily married to the concept of “a good school”. While plenty of highly ranked schools certainly are phenomenal institutions, there is a persistent understanding that the highly ranked ones, or those with strikingly selective acceptance rates, are the only truly “good” ones. You know, the ones you want to put on your resume or on the rear windshield of your car.

How do school rankings work? We can start with US News, the media company that publishes a list of the year’s “best colleges” annually. US News uses 16 different measures to assess colleges, falling into the following categories: social mobility; undergraduate academic reputation (including peer assessment and, for some, high school counselors’ ratings); graduation and retention rates; faculty resources; student selectivity (average admissions test scores and GPAs of incoming students); financial resources; alumni giving; and graduation rate performance. Certainly, it’s understandable why some of these factors would be part of the equation for assigning a score to a university in order to rank it against other schools. Understanding that these are the indicators, though, is critical, especially because one seemingly significant category of a college’s value is missing (and, oddly enough, the main reason we have recommended college to young adults more broadly): job placement and career readiness. (Important note: “graduation rate performance” does not refer to how graduates perform post-commencement in terms of career success; this indicator compares a school’s actual six-year graduation rate against its expected rate.)

But we’ll get back to that.

The Princeton Review also ranks colleges; in their case, they choose 385 colleges annually based on figures and information obtained from school administrators. Only the top 20 schools in each of 62 categories are “ranked”; the others do not fall within a hierarchy or numerical list. Their methodology then continues within that list of 385 to include survey results from students at those colleges. As the Princeton Review notes in its own explanation of the process, the most recent list synthesizes data from 140,000 students’ responses to those survey questions. The 80 questions span a multitude of categories, including questions about “1) their school’s academics/administration, 2) life at their college, 3) their fellow students, and 4) themselves.” Most responses fall within one of two typical Likert scales: level of agreement/disagreement, or a choice among statistical ranges, such as 0–20%. Certainly, student perspective is incredibly important to understanding the value of a college, especially since the Princeton Review presents the student data in lists such as “Best Science Labs”, “Best Campus Food” (not so trivial when you’re on your 4th day of chewy, cold pizza), “Party Schools”, “Lots of Race/Class Interaction”, “Best Financial Aid”, and, thankfully, “Career Services”. Also included in the Princeton Review’s assessment is Return on Investment (ROI) data that comes strictly from alumni reporting through PayScale.com, a software and data analytics company whose product employers may purchase to handle employee compensation. It also offers a survey for employees to determine their own “worth” in terms of pay. It’s used by 8,000 employers, so make of it what you will. This information is not available for all schools on the overall Princeton Review list, though; just those that made the “Best Value” rankings.
This ranking (which also incorporates the academic rating, the financial aid rating, and college cost) could be a good place for college searchers to start. It can take students beyond the top-20 fascination, but students and parents alike should also conduct their own research beyond these self-reported polls.

(Side note on PayScale: when asking an anonymous hiring manager about this software company’s product, the response was telling: “What they do is pretty freaking smart, if you can get people to admit what they make.”)

Job placement statistics can be interesting and certainly helpful in trying to anticipate graduates’ future success. The absence of that information on US News, or the self-reported nature of some information on the Princeton Review, does not mean that these high-ranking schools lack resources (career services, job fairs, helpful alumni networks), but it should give parents, families, and counselors (private or school-based) pause, and encouragement to do more research. For example, if you’re visiting a school in the top ten on US News’ list (as of today, that would include: Princeton University, Harvard University, Columbia University, Massachusetts Institute of Technology, University of Chicago, Yale University, Stanford University, Duke University, the University of Pennsylvania, and Johns Hopkins University), you don’t need to ask about average GPAs or SAT/ACT scores; you already know where they stand. However, one might want to ask questions about the availability of internships across programs, whether career services remain available to students after they graduate, whether there is data from graduates about job placement and readiness (and whether that data is filtered according to major), and what the tour guide or admission representative has found beneficial about any career-related services so far. It doesn’t hurt to look at graduation and retention rates either and inquire about any red flags (but remember: college cost affects retention, too).

Asking those same questions at any college is an essential part of the process. What is perhaps even more important is developing the flexibility of mindset to realize that not all hiring managers put much stock in the school listed under “Education” on one’s resume. I imagine that some readers will find that hard to believe, or that many will have an anecdotal experience of a time when it truly did matter, in some shape or form. And it’s wonderful if one’s degree from a particular institution has some bearing on their future success, but it should be widely understood that the vast majority of studies on this subject show that no matter where four-year college graduates went to school, “job outcomes were unaffected in terms of earnings” between elite and “moderately selective schools”. In “Estimating the Payoff to Attending a More Selective College: An Application of Selection on Observables and Unobservables” (1999), Stacy Berg Dale and Alan B. Krueger found that “…students who attended more selective colleges do not earn more than other students who were accepted and rejected by comparable schools but attended less selective colleges.” However, and perhaps more interestingly, their study concluded that the rate of return on attending a school with higher tuition is “substantial”. Their study also reported that the rate of return for attendance at elite schools “appears to be greater for students from more disadvantaged family backgrounds,” which is not shocking, but should cause reflection about access to opportunity and socioeconomic advantages as they relate to equity.

Then there’s the other issue with rankings and prestige that has come to the forefront in the last few months: colleges that supply false information, both intentionally and accidentally, to the parties responsible for granting official rankings. In May of this year, the University of Oklahoma lost its ranking from US News after two decades of providing exaggerated, misleading information about the university. In this case, it was determined that the false information had been deliberately supplied since 1999. That’s 20 years of inaccurate information fueling a university’s ranking, which helped the university break into the top 100 schools. Given just how many colleges and universities there are in the United States (2,618 accredited four-year institutions), a spot in the top 100 is more desirable than it may sound. Oklahoma is not alone in this regard: Temple University was also found to have submitted inaccurate information related to its MBA program for three years leading up to 2018. The list goes on from there, both intentional and reportedly inadvertent, including Boston University (inaccurate representation of research spending for its school of education: roughly $7,000,000 versus the reported $12,000,000); the University of Akron (extensive erroneous data about its business school); and the University of California, Riverside (more inaccurate data about research spending), to name a few.

So, if a highly selective, highly ranked college does not guarantee greater earnings or better job placement and some data may be skewed or inaccurate, why are we so attracted to them?

While we’ll never truly be able to track every graduate, monitor their employability, track their earnings, and stalk every facet of their careers, we can look to the people who read the resumes. Inherently, and despite one’s best intentions toward impartiality, there are biases when hiring or even when deciding who to schedule for a phone screen.

For one reason or another, two similar resumes may be seen differently and one may move forward. In my conversations (professional and personal) with hiring managers and supervisors responsible for writing job descriptions and vetting resumes, I’ve come to learn that the bias rarely falls on the name of the college (or the GPA from that college, unless it’s one’s first time entering an education profession, though even that appears to be waning in importance). The times it has had more to do with the school’s name usually involved alumni connections, and not solely alumni of elite schools. The consensus, though? It’s about the experience: internships, paid positions on campus, summer jobs, presentations, and research.

Is the information associated with college rankings helpful? Sure thing. Is the prestige of the college according to ranking (or other perceptions generated by the American public) of great importance to hiring managers? Not really.

In 2013, Gallup, an analytics and advisory company, conducted two polls with 623 American business leaders regarding perceptions of higher education. In the study, 84% of those business leaders regarded “the amount of knowledge a candidate has in a particular field” as “very important”, and 79% of respondents asserted that “the candidate’s applied skills in the field” were “important to managers making hiring decisions for organizations”. On the other hand, when asked the same question about where the candidate received their degree, only 9% found it “very important”. The candidate’s college major (28%) also outweighed the degree-granting institution’s reputation. So, knowledge base in the specific field or career area, applied skills in that field, and, to a smaller extent, one’s specific major were considered of much more importance than a job seeker’s alma mater.

Of course, there are times when it can matter, and there are some disciplines where studies have shown a connection between the school and future employment, such as business-related majors. Granted, it should be understood that the scenarios in which that “advantage” may occur are more closely related to the earning of an MBA and the benefits of that alumni network as an older student with more experience. There’s no doubt that being part of the Harvard Business School has its real benefits. The same goes for certain medical schools after the completion of undergraduate studies.

Not so surprisingly, though, the American public (not the employers) in the same study viewed this differently, placing quite the emphasis on the origin of the degree. Even though employers are plainly stating that one’s mastery of the content and skills associated with the field at hand is of far greater importance than where a candidate received their degree, the general public remains rigid in its thinking about the perception of a school versus the skill set of the job seeker. Why is that?

One can argue, easily, that society has been made endlessly aware of the most prestigious schools; we can all rattle off Harvard, Yale, Stanford, Columbia, MIT, etc., when asked to name the best colleges in the United States, and we’ve therefore come to assume that they produce the smartest, brightest, and most career-ready future thought-leaders. Another argument can be made that individuals who attended universities of that stature can’t imagine that their investment wouldn’t be recognized as a factor in the job search process. In both cases, it’s difficult to abandon those preconceived notions (much like those about community colleges’ value).

Can college rankings be helpful in understanding what a college offers, what students think about the college experience there, and what “type” of student often attends that particular college? Certainly. Both sets of rankings provide information about colleges in a succinct, readily available, and digestible format. But “good fit”, a student’s work ethic, and one’s ability to master the skills necessary for a particular field are critical factors in finding employment success.

It might be cliché, or perhaps it’s a sugar-coated counselor comment, but I continue to tell my students that college is what you make of it. I refuse to take student agency out of the equation, and no ranking procedure can alter that intangible.

Editorial Side Note: data shared in this article focus strictly on college rankings/predicted earnings in the United States. What’s going on in the UK is a different story (literally).
