Re: Does the arXiv lead to higher citations and reduced downloads?

From: Stevan Harnad <harnad_at_ecs.soton.ac.uk>
Date: Fri, 24 Mar 2006 16:36:27 EST

On Wed, 22 Mar 2006, Peter Banks wrote:

> I am unsure what type of "wishful thinking" I am alleged to have engaged
> in. My journals Diabetes Care and Diabetes are freely available after 3
> months, and papers accepted in the journals may be posted on acceptance
> in any institutional repository--making them, at least by Stevan's
> own criteria, open access. Thus I have no interest in disproving--and,
> in fact, an interest in proving--an open access advantage. As I said,
> I think it exists, but doubt that some of the data supporting it is of
> sufficient rigour to accurately measure its magnitude.

My apologies, Peter! You are quite right on both counts: Your
journals have an exemplary green policy, so you have no reason
at all for wishing the OA effect away. (I must have mixed
you up with another journal editor who was opposing author
self-archiving. Please forgive me!)

> For example, Stevan suggests that Antelman's data and that of
> his colleagues "show the same thing." Actually, they don't. The
> Antelman data show an OA advantage that is quite modest, if it
> exists at all; the Harnad data show one that is quite large in
> some disciplines. When I was in graduate school, it was
> expected that one would try to explain a magnitude of order
> difference between one's own data and those of another
> investigator, not to paper over the differences and call it a
> day simply because they trended the same way.

Fair enough. The differences between the two studies lie in (1)
the sample sizes, (2) the samples, (3) the fields, and (4) the
fact that one of the studies was based on within-journal (same
journal and issue) comparisons while the other was not.

The overall outcome pattern for both is the same, however: OA
articles have higher average citation impact than non-OA (NOA)
articles, and this is true in every (non-zero) citation range.

Frankly, for authors in any field weighing whether
self-archiving is worthwhile, that is all they need to know.
(For the detailed chronometric/scientometric analysis, I agree
that there remains a lot to study and report!)

> It would help to see not only the relative increase in citation
> through OA, but also the absolute increase. What does a 100% or
> 200% increase represent? If it's an increase from 0.1 average
> citations per paper to 0.2 or 0.3, the effect on the
> dissemination of knowledge is much less significant than if one
> is speaking of an increase from 2 citations to 4 or 6. Because
> very few papers have even one citation, I suspect we're talking
> much more about the former case than the latter.

Ah, now I understand what you meant by "significant": You didn't
mean "statistically significant" (which Antelman's reported
results certainly are), but "practically significant," i.e.,
whether the advantage is big enough to matter to researchers,
their institutions and their funders.

The answer is yes, and here are three ways to see it, from the
standpoint, respectively, of (1) the researcher, (2) the research
institution, and (3) the research funding agency (hence,
ultimately, the tax-payer).

(1) Take the lowest end of the 25%-250% range reported for the
OA advantage (25%) and the lowest end of Diamond's (1986)
estimate of what a citation is worth in salary increases
($50-$1300). Adjusting that $50 for inflation (prices are now
about 170% of their 1986 level) gives roughly $85, so the
expected gain is $85 x 25% = $21.25 per citation the article
would otherwise have received. That alone is perhaps not
"significant" enough to impel researchers to do or delegate the
few keystrokes (about 5 minutes' worth) it takes to deposit each
of their annual papers.

     Diamond, A. M. (1986) What is a Citation
     Worth? Journal of Human Resources 21: 200-15.
     http://www.garfield.library.upenn.edu/essays/v11p354y1988.pdf
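
For concreteness, here is a minimal sketch of that arithmetic in
Python. All inputs are the low-end assumptions stated above, not
measured values:

    # Sketch of the per-citation salary-value arithmetic above.
    # All inputs are the low-end assumptions from the text.
    citation_worth_1986 = 50.0   # low end of Diamond's $50-$1300 estimate
    inflation_multiplier = 1.7   # prices ~170% of their 1986 level
    oa_advantage = 0.25          # low end of the 25%-250% OA advantage

    citation_worth_today = citation_worth_1986 * inflation_multiplier  # $85.00
    gain_per_citation = citation_worth_today * oa_advantage            # $21.25
    print(f"Extra salary value per citation: ${gain_per_citation:.2f}")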

But researchers don't just publish for salary increases, and
that's not the only reason they care about their findings being
used and cited. They are trying to make a contribution to
knowledge, and the uptake of their work matters to them. It also
matters to their institutions:

(2) The UK Research Assessment Exercise (RAE) ranks and rewards
(with substantial top-sliced research funding) the research
performance of each department in each university. The RAE rank
is highly correlated with citation counts (even though citations
are not directly counted). I leave it as an exercise for the
reader to calculate, for an arbitrary department in the 2001 RAE
outcomes, how much its RAE rank would have been raised, and hence
how much more RAE funding it would have received, with a 25%
increase in its citation counts (relative to the competing
ranks); a toy version of the calculation is sketched after the
link below. The news is that future RAEs will be explicitly
"metrics-based":

     "Online, Continuous, Metrics-Based Research Assessment"
     http://openaccess.eprints.org/index.php?/archives/75-guid.html
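
Here is a purely hypothetical Python sketch of that exercise.
The linear funding-per-citation relation and every figure in it
are illustrative assumptions, not the actual RAE formula or the
2001 RAE data:

    # Toy model: assume (hypothetically) that a department's
    # rank-driven RAE funding scales linearly with its citation
    # count relative to its competitors.
    def projected_funding(citations, funding_per_citation, boost=0.25):
        """Funding after a hypothetical citation boost (default 25%),
        under the toy linear assumption above."""
        return citations * (1 + boost) * funding_per_citation

    # Placeholder figures, not real RAE values:
    print(projected_funding(citations=2000, funding_per_citation=500.0))
    # -> 1,250,000 (vs 1,000,000 with no boost)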

(3) If a country spends R billion dollars a year funding
research, and currently receives C million citations for that
investment, then it is losing at least 21% of its potential
research impact until it mandates self-archiving: the spontaneous
self-archiving rate is only about 15%, so the remaining 85% of
articles forgo the 25% advantage, and 25% x 85% = 21.25% of
potential citations are lost.
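
The same back-of-the-envelope arithmetic in Python (the rates are
the assumptions above; C is a placeholder, not a measured
national total):

    # Back-of-the-envelope sketch of the lost-impact arithmetic.
    oa_advantage = 0.25    # low end of the OA citation advantage
    archiving_rate = 0.15  # approximate spontaneous self-archiving rate

    lost_fraction = oa_advantage * (1 - archiving_rate)
    print(f"Potential citations lost: {lost_fraction:.2%}")  # 21.25%

    C = 10.0  # million citations per year (placeholder)
    print(f"Citations forgone: {C * lost_fraction:.2f} million per year")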

I would say we are talking about effects that are not just
statistically significant but practically significant in all
three cases (researchers, institutions and funders) --
significant enough to make the optimality (and inevitability) of
mandating self-archiving self-evident for all concerned. (It just
sometimes takes a while for people to *see* the self-evident!)

Stevan Harnad