Tools and sites used for impact measurement (some preliminary results)
For the 2:AM Altmetrics conference on October 7-8, 2015 in Amsterdam, we looked at some of our preliminary survey
results on tools and sites used for impact measurement. For this, we
carried out a brief, non-statistical analysis of responses received up to October 1
from researchers (PhD students, postdocs and faculty, n=3481) and
librarians (n=638), comparing their answers to the survey question on
which tools they use for impact measurement.
The percentage of respondents who selected the various tools offered
as preselected options in the survey question differed notably between
researchers and librarians. Among researchers, traditional tools such as
JCR (impact factor), Web of Science and Scopus are used considerably more
often than altmetrics tools (Altmetric, ImpactStory and PLOS
article-level metrics). Librarians, however, selected Altmetric about as
often as the more traditional metrics tools.
It should be noted here that in the survey, respondents who support researchers (rather than actively carrying out research themselves) are asked to indicate which tools they recommend (rather than which they actively use). This means that the results for librarians indicate endorsement rather than active use.
The fact that librarians selected altmetrics tools like Altmetric and
ImpactStory much more often than researchers did could reflect either
greater awareness of, or greater enthusiasm for, these tools.
To investigate possible differences between fields in the use of
altmetrics tools, we broke down researchers’ responses according to
discipline, and looked at the share of altmetrics tools among all
metrics tools mentioned by these groups. This analysis included tools
mentioned in the ‘others’ category.
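To illustrate the kind of calculation involved, here is a minimal sketch of how such a per-discipline share could be computed. This is not the actual analysis code used for the survey; the column names, tool names and example data are purely illustrative assumptions.

```python
import pandas as pd

# Hypothetical long-format data: one row per (respondent, tool) mention.
responses = pd.DataFrame({
    "discipline": ["Arts & Humanities", "Arts & Humanities", "Life Sciences",
                   "Life Sciences", "Physical Sciences", "Physical Sciences"],
    "tool": ["Altmetric", "Web of Science", "Scopus",
             "ImpactStory", "JCR", "Web of Science"],
})

# Tools counted as altmetrics tools (assumption, following the post:
# ResearchGate counted as an altmetrics tool, Google Scholar not).
ALTMETRICS_TOOLS = {"Altmetric", "ImpactStory", "PLOS article-level metrics",
                    "ResearchGate"}

responses["is_altmetrics"] = responses["tool"].isin(ALTMETRICS_TOOLS)

# Share of altmetrics tools among all metrics tools mentioned, per discipline.
share = responses.groupby("discipline")["is_altmetrics"].mean()
print(share)
```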
Perhaps surprisingly, these preliminary results indicate a relatively
large share of altmetrics tools mentioned by researchers in the Arts
& Humanities, compared to many other disciplines. It should be noted
that the sample size for this group was comparatively small, and we have
not yet carried out any statistical analysis. Having said that, it could be
hypothesized that because Arts & Humanities scholars traditionally
have limited use for citation databases like Scopus and Web of Science,
owing to the limited coverage of these databases and the absence of the
humanities from JCR, they may more readily embrace the opportunity to measure
the impact of their research output via altmetrics, insofar as this
output can indeed be identified by altmetrics providers.
As yet we have no explanation for the observed difference between
the physical and life sciences, other than the tentative guess that
researchers in the physical sciences do not yet take altmetrics as
seriously.
Finally, what other tools were mentioned by participants, in addition
to the seven preselected tools mentioned above? Across disciplines, by far the
most frequently mentioned ‘other’ tool was Google Scholar (which was not
counted as an altmetrics tool), followed by ResearchGate (which was).
Strikingly, almost nobody mentioned using alternative journal rankings.
Disclaimer: the results shown here are based on
preliminary data, and are to be treated as such. No claims are made as
to the statistical significance of the results, or lack of bias in the
data. Our survey is running until February 2016, with many institutional partners yet to start their distribution. In addition, we are currently rolling out translations of the survey to increase participation in non-Western countries.
The poster addendum with these results, shown at the 2:AM Altmetrics conference, is also available on Figshare:
http://dx.doi.org/10.6084/m9.figshare.1572175