Re: University ranking (and repository content)

From: Armbruster, Chris <Chris.Armbruster_at_EUI.EU>
Date: Sat, 9 Aug 2008 18:53:24 +0200

It would help if those critical of rankings would make the effort to understand the aim and methodology of the rankings that they criticize. And that is only the minimum required. It would also help if scholars would consider the rise of the internet in the context of the information society. For example, search engines (like Google) are ultimately also rankings and not much else. Should we go back to address books? Moreover, it is science and scientists that are keen on rankings, to the point of advising all junior scholars that they must publish only in journals of the highest rank, etc.

I think the critics here have simply missed the rise of research information services that are based on metrics. They have also missed that the webometric ranking undertaken by the Spanish Council for Scientific Research (CSIC) does actually provide some information on how accessible the information (science, knowledge) of universities and repositories is. Anybody interested in Open Access, or simply in enhanced accessibility of scientific information, should note that the ranking does help institutions understand what they must do to enhance access...

I recommend a look at the repositories ranking:
http://repositories.webometrics.info/

More is available from this 2008 paper:
Armbruster, Chris, Access, Usage and Citation Metrics: What Function for Digital Libraries and Repositories in Research Evaluation?
Available at SSRN: http://ssrn.com/abstract=1088453

Chris Armbruster


-----Original Message-----
From: American Scientist Open Access Forum on behalf of Guédon Jean-Claude
Sent: Sat 8/9/2008 14:03
To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG
Subject: Re: University ranking
 
The criticism of the university rankings in terms of measuring "what" is quite correct. However, it is also somewhat irrelevant. What is important in the end, whether we like it or not (and I certainly do not like it any more than the previous commentators do), is that it creates a benchmark that sticks, so to speak, and is used. If there ever was a good example of the social construction of "reality", this is it. What is at stake here is not quality measurement; rather, it is "logo" building for a globalized knowledge economy. If administrators, the press and governmental bureaucracies pay attention - and they obviously have begun to do so - then the strategy works.

The solution? In the absence of an institutionally effective critique, all we can hope for is the existence of competing and somewhat divergent indices... But I fear that, in the end, they will converge in one way or another and will create all the distortions that such artificial and, ultimately, manipulative metrics produce.

Jean-Claude Guédon


-----Original Message-----
From: American Scientist Open Access Forum on behalf of Dana Roth
Sent: Fri 8/8/2008 6:17 PM
To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG
Subject: Re: University ranking
 
I agree ... ranking Caltech #38 seems odd ...

Dana L. Roth
Millikan Library / Caltech 1-32
1200 E. California Blvd. Pasadena, CA 91125
626-395-6423 fax 626-792-7540
dzrlib_at_library.caltech.edu
http://library.caltech.edu/collections/chemistry.htm

-----Original Message-----
From: American Scientist Open Access Forum [mailto:AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG] On Behalf Of Yves Gingras
Sent: Friday, August 08, 2008 8:30 AM
To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG
Subject: Re: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM Digest - 6 Aug 2008 to 7 Aug 2008 (#2008-151)


Hello all,

This type of ranking is, to me, clearly a case of crank ranking...

What is THAT supposed to mean? Quality? But of what? In fact it means very little in terms of the academic realities that should be the basic focus of universities and of indicators related to their missions. The accompanying text of the message seems to imply that the changes in position reflect some kind of improvement in a university, which is clearly NOT the case. Worse still, in terms of interpretation, the text notes that "the UNAM climbs up to the position number 51, a relevant change from previous number 59" (emphasis added). Well... In fact, this change of 8 positions among a thousand is in all probability (99.99...) due to random year-to-year fluctuation, given the large variance of the sample.
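
The point is easy to check with a small Monte Carlo sketch. The snippet below is purely illustrative and not the CSIC methodology: the log-normal score distribution and the 10% year-to-year noise level are my assumptions. Under that noise alone, an institution ranked 59th among 1000 typically drifts by several positions, roughly the size of the celebrated 8-place move, with no underlying change at all.

    # Illustrative sketch (assumed data, NOT the CSIC methodology): how far does
    # pure measurement noise move an institution ranked 59th out of 1000?
    import random

    random.seed(1)
    N = 1000
    NOISE = 0.10  # assumed relative year-to-year fluctuation in the web metric

    # Hypothetical heavy-tailed "web presence" scores; sorted descending,
    # so index 58 holds rank 59.
    scores = sorted((random.lognormvariate(0, 1) for _ in range(N)), reverse=True)

    def rank(value, values):
        # 1-based rank of `value` among `values`, in descending order
        return 1 + sum(1 for w in values if w > value)

    shifts = []
    for _ in range(200):  # 200 simulated "next years"
        noisy = [s * (1 + random.gauss(0, NOISE)) for s in scores]
        shifts.append(abs(rank(noisy[58], noisy) - 59))

    shifts.sort()
    print("median rank shift under pure noise:", shifts[len(shifts) // 2])

A back-of-envelope check agrees: near rank 59 the assumed score distribution packs roughly 24 institutions per unit of score, and 10% noise on a score of about 4.8 moves an institution by about 0.5, i.e. by several ranks in either direction.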

Let us hope that those who like constructing indicators will use their time to find meaningful ones, instead of trying to interpret small variations in pseudo-common-sense indicators (here, web hits) which are not really connected to universities' missions ...

Have a nice day


Yves Gingras
Received on Sun Aug 10 2008 - 10:50:12 BST
