Re: [SIGMETRICS] UUK report looks at the use of bibliometrics

From: Jonathan Adams <Jonathan.adams_at_EVIDENCE.CO.UK>
Date: Thu, 8 Nov 2007 11:49:44 -0000

Administrative info for SIGMETRICS (for example unsubscribe):
http://web.utk.edu/~gwhitney/sigmetrics.html

Dear Stephen

Thank you for your informed and interesting comments on our report to
UUK, though I should say that some of your soundbites are addressed
in the body of the report.

I am sure UUK would appreciate receiving your extended commentary.


Jonathan Adams

Evidence Ltd


> From: UNIVERSITIES UK PRESSOFFICES
> EMBARGO 00.01hrs 8 November 2007
>
> "This report will help Universities UK to formulate its position on the
> development of the new framework for replacing the RAE after 2008."
> Some of the points for consideration in the report include:
>
> * Bibliometrics are probably the most useful of a number of variables
>   that could feasibly be used to measure research performance.

What metrics count as "bibliometrics"? Do downloads? Hubs/authorities?
Interdisciplinarity metrics? Endogamy/exogamy metrics? Chronometrics?
Semiometrics?

> * There is evidence that bibliometric indices do correlate with
>   other, quasi-independent measures of research quality - such as RAE
>   grades - across a range of fields in science and engineering.

Meaning that citation counts correlate with panel rankings in all
disciplines tested so far. Correct.

> * There is a range of bibliometric variables as possible quality
>   indicators. There are strong arguments against the use of (i)
>   output volume (ii) citation volume (iii) journal impact and (iv)
>   frequency of uncited papers.

The "strong" arguments are against using any of these variables alone,
or without testing and validation. They are not arguments against
including them in the battery of candidate metrics to be tested,
validated and weighted against the panel rankings, discipline by
discipline, in a multiple regression equation.
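To make the proposal concrete, here is a minimal sketch in Python/NumPy
of validating a battery of candidate metrics against panel rankings via
multiple regression. Everything in it is invented for illustration: the
metric names, the numbers, and the "panel ranking" (which is simulated,
not real RAE data).

```python
import numpy as np

rng = np.random.default_rng(0)
n_depts = 40

# Candidate metrics per department (columns): output volume,
# citations per paper, journal impact, fraction of uncited papers.
# All values are simulated for this sketch.
X = rng.random((n_depts, 4))

# Simulated panel ranking, standing in for RAE grades: mostly driven
# by citations per paper, plus noise.
panel_rank = (0.2 * X[:, 0] + 1.5 * X[:, 1] + 0.3 * X[:, 2]
              - 0.4 * X[:, 3] + 0.1 * rng.standard_normal(n_depts))

# Least-squares fit: the beta weights say how much each candidate
# metric contributes to predicting the panel ranking.
A = np.column_stack([np.ones(n_depts), X])  # add intercept column
beta, *_ = np.linalg.lstsq(A, panel_rank, rcond=None)

names = ["intercept", "output_volume", "cites_per_paper",
         "journal_impact", "uncited_fraction"]
for name, b in zip(names, beta):
    print(f"{name}: {b:+.2f}")
```

The point of the exercise: run one such regression per discipline, and
the fitted beta weights (not any single metric taken alone) become the
validated weighting for that discipline.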

> * 'Citations per paper' is a widely accepted index in international
>   evaluation. Highly-cited papers are recognised as identifying
>   exceptional research activity.

Citations per paper is one (strong) candidate metric among many, all of
which should be co-tested, via multiple regression analysis, against
the parallel RAE panel rankings (and other validated or face-valid
performance measures).

> * Accuracy and appropriateness of citation counts are a critical
>   factor.

Not clear what this means. ISI citation counts should be supplemented
by other citation counts, such as Scopus, Google Scholar, Citeseer and
Citebase: each can be a separate metric in the metric equation.
Citations from and to books are especially important in some
disciplines.

> * There are differences in citation behaviour among STEM and non-STEM
>   as well as different subject disciplines.

And probably among many other disciplines too. That is why each
discipline's regression equation needs to be validated separately.
This will yield a different constellation of metrics, as well as of
beta weights on the metrics, for different disciplines.

> * Metrics do not take into account contextual information about
>   individuals, which may be relevant.

What does this mean? Age, years since degree, discipline, etc. are all
themselves metrics, and can be added to the metric equation.

> They also do not always take into account research from across a
> number of disciplines.

Interdisciplinarity is a measurable metric. There are self-citations,
co-author citations, small citation circles, specialty-wide citations,
discipline-wide citations, and cross-disciplinary citations. These are
all endogamy/exogamy metrics. They can be given different weights in
fields where, say, interdisciplinarity is highly valued.
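As a hypothetical illustration of how such endogamy/exogamy categories
might be operationalised (the data structures, field names and category
labels below are invented, not taken from the report):

```python
def classify_citation(citing, cited):
    """Bucket one citation on an endogamy/exogamy scale.

    citing/cited are dicts with 'authors' (a set), 'specialty' and
    'discipline' -- a made-up minimal record format for this sketch.
    """
    if citing["authors"] & cited["authors"]:
        return "self_or_coauthor"       # most endogamous
    if citing["specialty"] == cited["specialty"]:
        return "specialty_wide"
    if citing["discipline"] == cited["discipline"]:
        return "discipline_wide"
    return "cross_disciplinary"         # most exogamous

paper_a = {"authors": {"smith"}, "specialty": "scientometrics",
           "discipline": "information science"}
paper_b = {"authors": {"jones"}, "specialty": "condensed matter",
           "discipline": "physics"}

print(classify_citation(paper_a, paper_b))  # cross_disciplinary
```

Counts per bucket can then enter the regression equation as separate
variables, each free to take a different weight per discipline.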

> * The definition of the broad subject groups and the assignment of
>   staff and activity to them will need careful consideration.

Is this about RAE panels? Or about how to distribute researchers by
discipline or other grouping?

> * Bibliometric indicators will need to be linked to other metrics on
>   research funding and on research postgraduate training.

"Linked"? All metrics need to be considered jointly in a multiple
regression equation with the panel rankings (and other validated or
face-valid criterion metrics).

> * There are potential behavioural effects of using bibliometrics
>   which may not be picked up for some years.

Yes, metrics will shape behaviour (just as panel rankings shaped
behaviour), sometimes for the better, sometimes for the worse. Metrics
can be abused -- but abuses can also be detected, named and shamed, so
there are deterrents and correctives.

> * There are data limitations where researchers' outputs are not
>   comprehensively catalogued in bibliometrics databases.

The obvious solution for this is Open Access: All UK researchers
should deposit *all* their research output in their Institutional
Repositories (IRs). Where it is not possible to set access to a
deposit as OA, access can be set as Closed Access, but the
bibliographic metadata will be there. (The IRs will not only provide
access to the texts and the metadata, but they will generate further
metrics, such as download counts, chronometrics, etc.)
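A minimal sketch of the kind of extra metrics an IR could generate
alongside citation counts. The log format, paper identifiers and dates
below are all invented for illustration:

```python
from datetime import date

# Invented IR access log: one record per download event.
downloads = [
    {"paper": "p1", "when": date(2007, 1, 10)},
    {"paper": "p1", "when": date(2007, 3, 2)},
    {"paper": "p2", "when": date(2007, 2, 5)},
]
# Invented deposit dates per paper.
deposited = {"p1": date(2007, 1, 1), "p2": date(2007, 2, 1)}

# Download counts per paper: one more candidate metric.
counts = {}
for d in downloads:
    counts[d["paper"]] = counts.get(d["paper"], 0) + 1

# A simple chronometric: elapsed time from deposit to first download.
first_download_latency = {
    p: min(d["when"] for d in downloads if d["paper"] == p) - deposited[p]
    for p in deposited
}

print(counts)                                        # {'p1': 2, 'p2': 1}
print({p: t.days for p, t in first_download_latency.items()})
                                                     # {'p1': 9, 'p2': 4}
```

Both quantities could then be fed into the same per-discipline
regression as the citation-based metrics.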

> The report comes ahead of the HEFCE consultation on the future of
> research assessment expected to be announced later this month.
> Universities UK will consult members once this is published.

Let's hope both UUK and HEFCE are still open-minded about ways to
optimise the transition to metrics!

Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated online
RAE CVs Linked to University Eprint Archives: Improving the UK Research
Assessment Exercise whilst making it cheaper and easier. Ariadne 35.
http://www.ecs.soton.ac.uk/~harnad/Temp/Ariadne-RAE.htm

Brody, T., Kampa, S., Harnad, S., Carr, L. and Hitchcock, S. (2003)
Digitometric Services for Open Archives Environments. In Proceedings of
European Conference on Digital Libraries 2003, pp. 207-220, Trondheim,
Norway. http://eprints.ecs.soton.ac.uk/7503/

Harnad, S. (2006) Online, Continuous, Metrics-Based Research
Assessment. Technical Report, ECS, University of Southampton.
http://eprints.ecs.soton.ac.uk/12130/

Harnad, S. (2007) Open Access Scientometrics and the UK Research
Assessment Exercise. In Proceedings of 11th Annual Meeting of the
International Society for Scientometrics and Informetrics 11(1), pp.
27-33, Madrid, Spain. Torres-Salinas, D. and Moed, H. F., Eds.
http://eprints.ecs.soton.ac.uk/13804/

Brody, T., Carr, L., Harnad, S. and Swan, A. (2007) Time to Convert to
Metrics. Research Fortnight pp. 17-18.
http://eprints.ecs.soton.ac.uk/14329/

Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and Swan, A.
(2007) Incentivizing the Open Access Research Web:
Publication-Archiving, Data-Archiving and Scientometrics. CTWatch
Quarterly 3(3). http://eprints.ecs.soton.ac.uk/14418/

Stevan Harnad
AMERICAN SCIENTIST OPEN ACCESS FORUM:
http://amsci-forum.amsci.org/archives/American-Scientist-Open-Access-Forum.html
     http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/

UNIVERSITIES and RESEARCH FUNDERS:
If you have adopted or plan to adopt a policy of providing Open Access
to your own research article output, please describe your policy at:
     http://www.eprints.org/signup/sign.php
     http://openaccess.eprints.org/index.php?/archives/71-guid.html
     http://openaccess.eprints.org/index.php?/archives/136-guid.html

OPEN-ACCESS-PROVISION POLICY:
     BOAI-1 ("Green"): Publish your article in a suitable toll-access
     journal
     http://romeo.eprints.org/
OR
     BOAI-2 ("Gold"): Publish your article in an open-access journal
     if/when a suitable one exists.
     http://www.doaj.org/
AND
     in BOTH cases self-archive a supplementary version of your article
     in your own institutional repository.
     http://www.eprints.org/self-faq/
     http://archives.eprints.org/
     http://openaccess.eprints.org/

> Notes
>
> 1. The report, The use of bibliometrics to measure research quality
>    in UK higher education institutions, will be available to download
>    from the Universities UK website from 9am on Thursday November 8
>    2007 at <http://bookshop.universitiesuk.ac.uk/latest/>.
>
> 2. For further press enquiries, please contact the Universities UK
>    press office or email pressunit_at_universitiesuk.ac.uk
>    <mailto:pressunit_at_universitiesuk.ac.uk>.
>
> 3. Universities UK is the major representative body and membership
>    organisation for the higher education sector. It represents the
>    UK's universities and some higher education colleges.
>
>    Its 131 members http://www.UniversitiesUK.ac.uk/members/ are the
>    executive heads of these institutions.
>
>    Universities UK works closely with policy makers and key education
>    stakeholders to advance the interests of universities and to
>    spread good practice throughout the higher education sector.
>
>    Founded in 1918 and formerly known as the Committee of
>    Vice-Chancellors and Principals (CVCP), Universities UK will
>    celebrate its 90th anniversary in 2008.
Received on Thu Nov 08 2007 - 12:09:17 GMT
