Prism Magazine September 2001

Last Word
Rankings We Could Live With

- By Gerald Holder

"Personally, I dislike judging schools that I know virtually nothing about."

Like it or not, rankings of engineering schools are here to stay. The public's demand for this information gives tremendous clout to publications such as U.S. News & World Report. These rankings may be imperfect, perhaps even seriously flawed, but they are widely read, fervently discussed, and seemingly growing in influence.

Educators have a vested interest in national rankings, and most deans at research universities spend some time figuring out how to improve the ranking of their institutions. As a result, a serious argument can be made for understanding and improving the ranking system, or even for creating our own.

How does one get highly ranked? First, it helps to be really big. Virtually all of the mega-engineering schools (4,000 or more undergraduates) are ranked in the top 50, and being big implies more total research expenditures and more Ph.D. students.

Second, being private seems to offer an advantage; you need not be nearly as big if you are private.

Third, it helps to be in California (7 of the top 23 schools in the U.S. News rankings are located there). Unfortunately, doubling enrollment, going private, and moving to California are not options that any of us can exercise, at least not without a lot of planning.

Other strategies for moving up include rejecting as many applicants as possible. If you can persuade a lot of unqualified students to apply, and then reject them, your selectivity numbers improve. Or you could decrease your faculty size: because the same expenditures and doctoral enrollment are then divided among fewer people, this could improve your standing on several of U.S. News' ranking measures, including the Ph.D. student/faculty ratio and research expenditures per faculty member.

Less cynically, rankings do give students a list of schools where they can get a great education, and they surface the names of some schools that might otherwise be overlooked.

How could this or other surveys be improved? I offer a few suggestions.

First, the survey that goes out to deans and department heads could be improved. Personally, I dislike judging schools that I know virtually nothing about. Rather than asking respondents to score every school, I would suggest that each be asked to provide a list of the top 25 and the second 25 schools in the nation.

Second, I believe all school and department rankings, not just the top 50, should be published, at least on the Web. For example, few of you probably know that the University of Pittsburgh's graduate school of engineering is 59th in the news magazine's rankings. The reason you don't know is that U.S. News doesn't make this information available to the general public; it is not shared with interested students, corporations, or academic peers. If the full rankings were listed on the Web, our strong features (faculty resources and research, which are in the U.S. News top 50) could be recognized.

Third, ASEE data should be used to obtain research expenditures, enrollments, and degrees produced. This would provide more consistency and better accuracy overall.

Fourth, I'd vote to eliminate GRE scores and acceptance ratios as measures of quality. I would add research expenditures per Ph.D. student, since this ratio more accurately reflects the resources from which students will benefit in their graduate education.

In addition, for the non-academic (industry-based) aspect of the survey, U.S. News should sample a cross section of industry representatives, not just the recruiters at the top schools. Sampling only those recruiters produces potentially biased rankings, especially for schools outside the top 25.

Recently, I had an opportunity to survey CEOs and other top executives from such companies as USX, Alcoa, Alcatel, BP, Westinghouse, and EOG Resources, who serve on the Board of Visitors for the school of engineering at the University of Pittsburgh. Their rankings, in order, included MIT, Carnegie Mellon, Georgia Tech, Stanford, Purdue, Rensselaer, Cal Tech, Michigan, Lehigh, Princeton, Case Western, Cornell, and Berkeley; the University of Pittsburgh was ranked 23rd. This survey samples a highly qualified group, and its results are not too different from those of U.S. News. But it is clear that the higher rankings for Pitt and other geographically nearby schools reflect a bias these individuals have because of their strong affiliation with those institutions. I suggest that the current U.S. News survey is equally biased, because it too surveys people with affiliations to particular schools.

One recommendation to U.S. News might be to provide additional qualitative rankings that highlight outstanding engineering education at the undergraduate level. For example, which schools offer the largest co-op programs, the best study-abroad opportunities, or the strongest focus on student diversity?

Also, it would be more useful to provide a variety of rankings using different measures rather than a single overall composite. Currently, only the top 50 can be compared using various statistics.

Finally, if U.S. News cannot respond to our needs, we as engineering faculty should consider developing a new ranking system that can be accessed from our own Web sites. If this were done, our rankings would likely supplant those of U.S. News as the primary source of information for prospective students. I could live with that.

Gerald Holder is the USX dean of engineering at the University of Pittsburgh.


prism@asee.org