Sunday, March 14, 2010
Shanghai Rankings Best
It is always instructive to watch videos of - or even attend - meetings of the University of Minnesota Board of Regents. These are Kabuki events where administrators can put their own spin on things in a setting where they are unlikely to be called to account. (Once a year the groundlings are allowed a three-minute comment.) A good example of such spin lawyering is what our Provost had to say at approximately 1:40 of the video of the March 12 Board Meeting:
Board of Regents Meeting - March 12, 2010
"...we also see substantial progress and momentum...So, ah, Tom - how about USNews rankings? What makes Shanghai rankings superior to the USNews ranking?
Absolutely and relative to our peer institutions as well.
We can always see that in the best measure, the best ranking which I think is the Shanghai ranking, and one can always quibble with methodology, but we think it's the best ranking that's there and it still has the university somewhere between seven or eighth or ninth best public research university in the US and in the top twenty public research universities in the world."
And where are we in USNews? Somewhere around sixty-one, if I recall. Right in there with Clemson, Fordham, Purdue, and Texas A&M!
And where are we in Forbes? We won't even go there.
And then of course there is the Times Higher Education Supplement. Where are we in that one? Around one hundred, down eighteen places from the year before. Are people who disagree with you just quibbling over methodology, Tom?
How about:
A Critique of Shanghai's school rankings
Some Problems with the Shanghai school rankings
Some people follow the world school rankings published in Shanghai blindly. You'd be foolish to do so, and here's an article (in French) explaining why.
http://www.lemonde.fr/societe/article/2009/11/16/le-classement-de-shangh...
[From Le Monde, 16 Nov. 2009: "Le classement de Shanghaï, étude mal menée, calcul mal fait" ("The Shanghai ranking: a poorly conducted study, a poorly done calculation"), by Jean-Charles Billaut, Denis Bouyssou, and Philippe Vincke]
This article highlights some very important points about the Shanghai rankings. As the article puts it, "the devil is in the details."
1) The way they define an institution automatically creates a heavy bias in favor of larger schools. Indeed, if the Sorbonne in Paris hadn't been split into 13 different schools in 1968, it would rank among the top 10 schools in the world, whereas at the moment it isn't even in the top 100. (See the sketch below the list.)
2) One of the "objective" criteria used to rank a school is how many Nobel Prizes and/or Fields Medals its professors have received. Sounds fair enough. However, prizes won more than 100 years ago count as much as those won last year. Why should a school keep benefiting from the success of a professor who was once there but is no longer (and is probably deceased)?
3) Obviously, publishing articles in quality journals is important. However, the Shanghai ranking gives longer articles more weight than shorter ones. In fact, articles published in economics journals carry twice the weight of articles published in chemistry journals, simply because economics articles are generally twice as long. Longer articles are not necessarily better articles.
4) The people who make these rankings are human, and humans make mistakes. That is why the underlying data needs to be made public, so that others can check the work. It is not, and the results are therefore completely unverifiable.
Using such poorly constructed and unverifiable rankings as a primary source of decision making would thus be foolhardy.
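Point 1 above is easy to demonstrate numerically. Here is a minimal sketch in Python, with invented counts rather than the actual ARWU data or scoring formula, assuming a purely additive, count-based indicator such as total publications. Splitting one large institution into thirteen independent schools leaves total output unchanged but drops every piece out of the top of the table:

```python
# Minimal sketch of the size bias in a count-based ranking.
# All numbers are invented for illustration; this is NOT the
# actual ARWU data or scoring formula.

def rank_table(scores):
    """Return (institution, count) pairs sorted best-first."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A toy world: one very large university plus ten mid-sized peers,
# scored on a purely additive indicator (e.g., total publications).
peers = {f"Peer {i}": 900 - 50 * i for i in range(10)}  # 900, 850, ..., 450
unified = {"Big U (unified)": 2600, **peers}

print(rank_table(unified)[0])  # ('Big U (unified)', 2600) -- easily first

# Now split Big U into 13 equal, independent schools, as the Sorbonne
# was in 1968. Total output is unchanged: 13 * 200 == 2600.
split = {f"Big U school {i}": 200 for i in range(1, 14)}
divided = {**split, **peers}

top10 = [name for name, _ in rank_table(divided)[:10]]
print(any(name.startswith("Big U") for name in top10))  # False: all gone
```

A size-normalized (per-capita) indicator would not behave this way; it is the purely additive definition of "an institution" that builds in the bias.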
And then of course there is data readily available to you, Tom - you presented it last fall to the Board of Regents - that is not in agreement with your claim about progress relative to our peers:
2009 University Plan, Performance, and Accountability Report
Table 2-2. Percentage of freshmen in top 10 percent of high school class for U of M-Twin Cities and comparative group institutions, 2008-09.
Rank Institution 2008-09
1 University of California - Berkeley 98%
2 University of California - Los Angeles 97%
3 University of Michigan - Ann Arbor 92%
4 University of Washington - Seattle 87%
5 University of Florida 75%
6 University of Texas - Austin 75%
7 University of Wisconsin - Madison 58%
8 University of Illinois - Urbana-Champaign 55%
9 Ohio State University - Columbus 53%
10 University of Minnesota - Twin Cities 45%
11 Pennsylvania State University - Univ. Park 43%
Source: Common Data Set Initiative, 2008-09.
What is really funny about the following data is that in the U report the table is listed in alphabetical order, neatly hiding the fact that we are at the bottom. Accidental?
Table 2-3. SAT and ACT scores of new, entering freshmen at comparative group institutions, 2008.
1 University of California - Berkeley
2 University of Michigan - Ann Arbor
3 University of California - Los Angeles
4 University of Wisconsin - Madison
5 University of Illinois - Urbana-Champaign
6 University of Florida
7 University of Texas - Austin
8 Ohio State University - Columbus
9 Pennsylvania State University - University Park
10 University of Washington - Seattle
11 University of Minnesota - Twin Cities
Source: Common Data Set Initiative, 2008-09.
Table 2-7. First-, second-, and third-year retention rates of U of M-Twin Cities’ and comparative group institutions’ students in 2005, 2006, and 2007 entering class cohorts (ranked by 2nd-year rate).
We are dead last (10 of 10) in this category. Florida is not included, or we would probably have been 11 of 11.
1 University of California - Berkeley
2 University of Michigan - Ann Arbor
3 University of California - Los Angeles
4 Pennsylvania State University - Univ. Park
5 Ohio State University - Columbus
6 University of Wisconsin - Madison
7 University of Illinois - Urbana-Champaign
8 University of Washington - Seattle
9 University of Texas - Austin
10 University of Minnesota - Twin Cities
University of Florida: Not Available
So Tom, where is this relative improvement you are claiming?
Table 2-21. National Academy members: U of M-Twin Cities and comparative group institutions, 2007.
Rank Institution Members (2007) 1-Yr % Change 5-Yr % Change
1 University of California - Berkeley 214 1.4% 5.9%
2 University of Washington - Seattle 90 4.7% 13.9%
3 University of Michigan - Ann Arbor 77 1.3% 10.0%
4 University of California - Los Angeles 73 -3.9% 21.7%
5 University of Wisconsin - Madison 72 1.4% 4.3%
6 University of Texas - Austin 59 -3.3% 11.3%
7 University of Illinois - Urbana-Champaign 56 0.0% 9.8%
8 University of Minnesota - Twin Cities 36 0.0% -5.3%
9 Pennsylvania State University - Univ. Park 26 -13.3% 4.0%
10 Ohio State University - Columbus 21 -4.5% 23.5%
10 University of Florida 21 5.0% 31.3%
Source: The Top American Research Universities: The Center for Measuring University Performance, 2008.
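For anyone who wants to check the arithmetic in those last two columns, the percent changes can be inverted to recover the implied earlier membership counts. A quick sketch in Python; the 2007 counts and percentages come straight from Table 2-21, while the back-calculated 2002 and 2006 figures are my own rounded arithmetic:

```python
# Back out implied earlier National Academy membership counts from
# the 2007 counts and percent changes reported in Table 2-21.
# Derived figures are rounded and therefore approximate.

rows = {
    # institution:              (count_2007, pct_1yr, pct_5yr)
    "Minnesota - Twin Cities":  (36,  0.0,  -5.3),
    "Washington - Seattle":     (90,  4.7,  13.9),
    "Ohio State - Columbus":    (21, -4.5,  23.5),
}

for name, (c2007, p1, p5) in rows.items():
    c2006 = c2007 / (1 + p1 / 100)  # implied count one year earlier
    c2002 = c2007 / (1 + p5 / 100)  # implied count five years earlier
    print(f"{name}: 2002 ~ {c2002:.0f}, 2006 ~ {c2006:.0f}, 2007 = {c2007}")

# Minnesota: roughly 38 members in 2002 down to 36 in 2007 -- the only
# school in the comparison group with a negative five-year change.
```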
Tom, the numbers above are very important. I won't deconstruct them because you are a smart man and can figure out the implications. So can the rest of the world.
Time to get real and stop talking about rankings? It is a game for losers.