Re: University ranking

From: Isidro F. Aguillo <isidro_at_cindoc.csic.es>
Date: Tue, 19 Aug 2008 12:17:33 +0200


Dear All:

Sorry for the delay due to the summer holidays. As editor of the Web Ranking,
I wish to thank you for your critical comments.

One of the first points to tackle here is feasibility. A complete and precise
description of the world's universities would need several dozen variables and
a high level of standardization across them (what counts as a professor in
different countries?). Moreover, for many of these variables there is no
reliable source, not even the university itself. This is the reason other
rankings usually focus on the Top 200 or Top 500 institutions.

When checking the ARWU (Shanghai) or THES (Times) rankings, some of the
variables are not very useful for ranking large sets. Consider that only a few
universities have two or more Nobel Prizes, so ranking on such low numbers is
not fair. This can explain the lack of reliability of positions below 200 in
the cited rankings.

So, why a web ranking? First, the numbers involved are far larger: millions
instead of hundreds (reaching 51st from 59th means a large electronic
publishing effort by the UNAM, involving millions of web pages or documents).
Second, it is easier to obtain data for institutions worldwide: Webometrics
ranks 16,000 (sixteen thousand!) universities, including most of the
developing countries not covered by other rankings. Third, the Web reflects
(or at least should, in the near future) the full set of activities of an
academic institution (teaching, research, transfer, community involvement),
not only the number of papers published. There is probably a lot of "noise",
but parts of this noise matter greatly to prospective students (sports!) or to
other citizens (general knowledge; half of all faculty members come from the
Humanities and Social Sciences, fields not very prone to publication). Fourth,
the Web reaches hundreds of millions of people (paper editions, thousands at
best). Electronic publication should be mandatory, or at least an evaluation
system should take it very seriously. Webometrics is paving that road.

Regarding methodology, it is possible to use crawlers instead of search
engines, but that is unfeasible for such a large task. Moreover, most users
rely on Google and company for retrieving information, so if your information
is not in those databases it is invisible; it does not exist. To avoid
inconsistencies we choose specific data centers rather than the central web
servers, collect the figures twice, correct biased results, and select the
most relevant formats. Of course there are problems with Scholar, but we have
contacted them and they will provide new tools in the expected non-beta
version.
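As a rough sketch of the "collect the figures twice" idea (purely
illustrative; the actual Webometrics correction procedure is not described
here, and `query_fn` stands in for a hypothetical search-engine count API):

```python
def stable_count(query_fn, site):
    """Query a count source twice for the same site and keep the larger
    figure, damping transient fluctuations in reported hit counts."""
    first = query_fn(site)
    second = query_fn(site)
    return max(first, second)

# Hypothetical counts standing in for two successive engine responses.
counts = iter([1180, 1240])
print(stable_count(lambda site: next(counts), "example.edu"))  # 1240
```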

Finally, not to forget the results: Webometrics correlates highly with THES &
ARWU (0.6-0.7), provides useful insights about crazy web policies (Imperial
College, Caltech, Paris & Catalonian universities, Johns Hopkins, ...) and
highlights great repository initiatives (Penn State, Linkoping, Complutense).
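Figures like 0.6-0.7 typically come from a rank correlation; assuming a
Spearman-style coefficient (the message does not specify which), a minimal
sketch with made-up rank positions, not real ranking data:

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman rank correlation for two lists of distinct ranks:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    n = len(ranks_a)
    d_sq = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d_sq / (n * (n ** 2 - 1))

# Hypothetical positions of five universities in two different rankings.
web_ranks = [1, 2, 3, 4, 5]
other_ranks = [2, 1, 4, 3, 5]
print(round(spearman_rho(web_ranks, other_ranks), 2))  # 0.8
```

A value near 1 means the two rankings order institutions almost identically;
0.6-0.7 indicates strong but imperfect agreement.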

Please, take the Web seriously.


Guédon Jean-Claude wrote:
> The criticism of the university rankings in terms of measuring "what" is
> quite correct. However, it is also somewhat irrelevant. What is important
> in the end, whether we like it or not (and I certainly do not like it any
> more than the previous commentators) is that it creates a benchmark that
> sticks, so to speak, and is used. If there ever was a good example of
> social construction of "reality", this is it. What is at stake here is not
> quality measurement; rather, it is "logo" building for a globalized
> knowledge economy. If administrators, the press and governmental
> bureaucracies pay attention - and they obviously have begun to do so -
> then the strategy works.
>
> The solution? In the absence of an institutionally effective critique, all
> we can hope for is the existence of competing and somewhat divergent
> indices... But I fear that, in the end, they will converge in one way or
> another and will create all the distortions that such artificial and,
> ultimately, manipulative metrics produce.
>
> Jean-Claude Guédon
>
>
> -----Original Message-----
> From: American Scientist Open Access Forum on behalf of Dana Roth
> Sent: Fri 8/8/2008 6:17 PM
> To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG
> Subject: Re: University ranking
> I agree ... ranking Caltech #38 seems odd ...
>
> Dana L. Roth
> Millikan Library / Caltech 1-32
> 1200 E. California Blvd. Pasadena, CA 91125
> 626-395-6423 fax 626-792-7540
> dzrlib_at_library.caltech.edu<mailto:dzrlib_at_library.caltech.edu>
> http://library.caltech.edu/collections/chemistry.htm
> From: American Scientist Open Access Forum
> [mailto:AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG] On
> Behalf Of Yves Gingras
> Sent: Friday, August 08, 2008 8:30 AM
> To: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM_at_LISTSERVER.SIGMAXI.ORG
> Subject: Re: AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM Digest - 6 Aug 2008 to 7
> Aug 2008 (#2008-151)
>
>
> Hello all,
>
> This type of ranking is to me clearly a case of crank ranking....
>
> What is THAT supposed to mean? Quality? But of what? In fact it means not
> much of anything in terms of the academic realities that should be the basic
> focus of universities and of indicators related to their missions. The
> accompanying text of the message seems to imply that the changes in
> positions are related to some kind of improvement at a university, which
> is clearly NOT the case. Worse still, in terms of interpretation, the
> text notes that "the UNAM climbs up to position number 51, a relevant
> change from the previous number 59" (emphasis added). Well... In fact, this
> change of 8 positions among a thousand is in all probability (99.99...)
> due to random year-to-year variation, given the large variance of the
> sample.
>
> Let us hope those who like constructing indicators will use their time to
> find meaningful ones instead of trying to interpret small variations in
> pseudo common-sense indicators (here, web hits) which are not really
> connected to universities' missions ...
>
> Have a nice day
>
>
> Yves Gingras
>
>

-- 
****************************
Isidro F. Aguillo
Laboratorio de Cibermetría
Cybermetrics Lab
CCHS - CSIC
Joaquin Costa, 22
28002 Madrid. Spain
isidro _at_ cindoc.csic.es
+34-91-5635482 ext 313
****************************
Received on Tue Aug 19 2008 - 19:02:02 BST

This archive was generated by hypermail 2.3.0 : Fri Dec 10 2010 - 19:49:26 GMT