Re: Future UK RAEs to be Metrics-Based

From: Stevan Harnad <>
Date: Thu, 13 Apr 2006 23:37:55 +0100

The following is a comment on an article that appeared in today's
Independent about the RAE and Metrics (followed by a response to another
piece in the Independent about Web Metrics).

Re: Hodges, L. (2006) The RAE is dead - long live metrics
    The Independent April 13 2006

Absolutely no one can justify (on the basis of anything but superstition)
holding onto an expensive, time-wasting assessment system such as the RAE,
which produces rankings that are almost perfectly correlated with, hence
almost exactly predictable from, inexpensive objective metrics such as
prior funding, citations and research student counts.
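The "almost perfectly correlated" claim is a claim about rank correlation, which is easy to check once the data are in hand. A minimal sketch, using Spearman's rho with invented numbers (five hypothetical departments, one metric):

```python
# Hedged sketch: Spearman rank correlation between a hypothetical RAE
# panel ranking and a single metric (citation counts). All data invented.

def to_ranks(values, descending=True):
    """Map each value to its rank (1 = best). Assumes no ties."""
    order = sorted(values, reverse=descending)
    return [order.index(v) + 1 for v in values]

def spearman(rank_a, rank_b):
    """Spearman rho via the classic 1 - 6*sum(d^2)/(n*(n^2-1)) formula."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

rae_rank  = [1, 2, 3, 4, 5]            # panel ranking of five departments
citations = [950, 800, 820, 400, 150]  # invented citation counts

rho = spearman(rae_rank, to_ranks(citations))
print(round(rho, 2))  # 0.9
```

Harnad's point is that with real RAE data this coefficient comes out close enough to 1.0 that the panel exercise adds almost no information.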

Hence the only two points worth discussing are (1) which metrics to use
and (2) how to adapt them to each discipline.

The web has opened up a vast and rich universe of potential
metrics that can be tested for their validity and predictive power:
citations, downloads, co-citations, immediacy, growth-rate, longevity,
interdisciplinarity, user tags/commentaries and much, much more. These
are all measures of research uptake, usage, impact, progress and
influence. They have to be tested and weighted according to the
unique profile of each discipline (or even subdiscipline). Prior
funding is highly predictive, but it also generates a Matthew Effect:
a self-fulfilling, self-perpetuating prophecy.
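The "tested and weighted according to the unique profile of each discipline" step can be sketched as a per-discipline regression: fit a weight vector so that a battery of metrics best reproduces an existing panel score, and repeat the fit for each discipline. All names and figures below are invented for illustration:

```python
# Hedged sketch of per-discipline metric weighting: least-squares
# weights mapping a (departments x metrics) matrix onto panel scores.
# One such fit would be run separately per discipline. Data invented.
import numpy as np

def fit_weights(metrics, panel_scores):
    """Least-squares weight vector for one discipline's metric battery."""
    w, *_ = np.linalg.lstsq(metrics, panel_scores, rcond=None)
    return w

# columns: citations, downloads, prior funding (normalised, invented)
physics = np.array([[0.9, 0.8, 0.95],
                    [0.6, 0.5, 0.70],
                    [0.3, 0.4, 0.20]])
physics_panel = np.array([0.92, 0.61, 0.33])

weights = fit_weights(physics, physics_panel)
print(weights)  # discipline-specific weights for the three metrics
```

The same metric (downloads, say) would receive a different weight in a humanities fit than in a physics fit, which is exactly the discipline-profiling the paragraph above calls for.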

I would not for a moment believe, however, that any (research) discipline
lacks predictive metrics of research performance altogether. Even less
credible is the superstitious notion that the only way (or the best)
to evaluate research is for RAE panels to re-do, needlessly, locally,
the peer review that has already been done, once, by the journals in which
the research has already been published.

The urgent feeling that this human re-review is necessary has
nothing to do with the RAE or metrics in particular; it is
just a generic human superstition (and irrationality) about
population statistics versus my own unique, singular case.


Re: Diary (13 April 2006, other article, same issue)

> 'A new international university ranking has been launched and
> the UK has 25 universities in the world's top 300. The results
> are based on the popularity of the content of their websites on
> other university campuses. The G Factor is the measure of how
> many links exist to each university's website from the sites
> of 299 other research-based universities, as measured by 90,000
> google searches. No British university makes it into the Top 10;
> Cambridge sits glumly just outside at no 11. Oxford languishes at
> n.20. In a shock Southampton University is at no.25 and third in
> Britain. Can anyone explain this? Answers on a postcard. The rest
> of the UK Top 10, is UCL, Kings, Imperial, Sheffield, Edinburgh,
> Bristol and Birmingham.'

The reasons for the University of Southampton's extremely high overall
webmetric rating are four:

    (1) U. Southampton's university-wide research performance

    (2) U. Southampton's Electronics and Computer Science (ECS)
    Department's involvement in many high-profile web projects and
    activities (among them the semantic web work of the web's inventor,
    ECS Prof. Tim Berners-Lee, the Advanced Knowledge Technologies
    (AKT) work of Prof. Nigel Shadbolt, and the pioneering web linking
    contributions of Prof. Wendy Hall)

    (3) The fact that since 2001 U. Southampton's ECS has had a
    mandate requiring that all of its research output be made Open
    Access on the web, and that Southampton has a university-wide
    self-archiving policy (soon to become a mandate) too

    (4) The fact that maximising access to research (by self-archiving
    it free for all on the web) maximises research usage and impact
    (and hence web impact)

This all makes for an extremely strong Southampton web presence, as
reflected in metrics such as the "G Factor", which places Southampton
3rd in the UK and 25th among the world's top 300 universities, and
another webmetric ranking, which places Southampton 6th in the UK, 9th
in Europe, and 80th among the top 3000 universities it indexes.
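The G Factor described in the quoted Diary piece is, at bottom, an inter-site link count: a university's score is the number of other universities whose sites link to it. A minimal sketch with an invented link map (the real ranking was compiled from some 90,000 Google searches across 300 university sites):

```python
# Hedged sketch of a G-Factor-style count: score(site) = number of
# *other* university sites that link to it. Link data invented.

links = {  # site -> set of university sites it links out to
    "cam.ac.uk":   {"ox.ac.uk", "soton.ac.uk"},
    "ox.ac.uk":    {"cam.ac.uk", "soton.ac.uk"},
    "soton.ac.uk": {"cam.ac.uk"},
    "ucl.ac.uk":   {"cam.ac.uk", "soton.ac.uk", "ox.ac.uk"},
}

def g_factor(site, link_map):
    """Count distinct other sites that link to `site`."""
    return sum(1 for src, outs in link_map.items()
               if src != site and site in outs)

ranking = sorted(links, key=lambda s: g_factor(s, links), reverse=True)
print([(s, g_factor(s, links)) for s in ranking])
```

Note that the score depends only on inlinks, not on a site's own outlinks, which is why a strong web presence (point 3 above, the self-archived research output) translates directly into a high rank.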

Of course, these are extremely crude metrics, but Southampton itself
is developing more powerful and diverse metrics for all universities
in preparation for the newly announced metrics-only Research Assessment
Exercise.

Stevan Harnad
American Scientist Open Access Forum


    Some references:

        Harnad, S. (2001) Why I think that research access, impact and
        assessment are linked. Times Higher Education Supplement 1487:
        p. 16.

        Hitchcock, S., Brody, T., Gutteridge, C., Carr, L., Hall, W.,
        Harnad, S., Bergmark, D. and Lagoze, C. (2002) Open Citation
        Linking: The Way Forward. D-Lib Magazine 8(10).

        Harnad, S. (2003) Why I believe that all UK research output should
        be online. Times Higher Education Supplement. Friday, June 6 2003.

        Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated
        online RAE CVs Linked to University Eprint Archives: Improving
        the UK Research Assessment Exercise whilst making it cheaper
        and easier. Ariadne 35.

        Berners-Lee, T., De Roure, D., Harnad, S. and Shadbolt, N. (2005)
        Journal publishing and author self-archiving: Peaceful Co-Existence
        and Fruitful Collaboration.

        Brody, T., Harnad, S. and Carr, L. (2006) Earlier Web Usage
        Statistics as Predictors of Later Citation Impact. Journal of
        the American Society for Information Science and Technology.

        Shadbolt, N., Brody, T., Carr, L. & Harnad, S. (2006) The Open
        Research Web: A Preview of the Optimal and the Inevitable. In:
        Jacobs, N. (Ed.) Open Access: Key Strategic, Technical and
        Economic Aspects. Chandos.

        Citebase impact ranking engine

        Beans and Bean Counters

        Bibliography of Findings on the Open Access Impact Advantage

Received on Fri Apr 14 2006 - 02:01:07 BST
