required to factor into their policies and practices the conditions by
which publicly funded research must be made publicly available.
But in the struggle for competitive funding, how can researchers
provide tangible evidence that their outputs have not only been made
publicly available, but that the public is using them? Or how can they
demonstrate that their research outputs have reached and influenced
those whose tax dollars have helped fund the research?
Traditional impact metrics
The number of raw citations per paper, or an aggregate measure such as
the h-index, is an indicator of scholarly impact, in that it reveals
the credit that scholarly works attribute to prior scholarship. This
attribution is normally made by scholars in peer-reviewed journals and
counted by citation databases. But such metrics give no indication of
public reach and influence. Nor do they indicate the impact of
non-traditional research outputs, such as datasets or creative
productions, or of non-journal outputs, such as books and media
coverage.
Public impact for all types of research outputs could always be
communicated as narrative or case studies. These forms of evidence can
be extremely useful, perhaps even necessary, in building a case of past
impact as an argument for future funding. However, impact narratives and
case studies require sources of evidence to support their impact
claims. One example of how this can be achieved is the guidelines for completing case studies in the recent Australian Technology Network of universities (ATN) / Group of Eight (Go8) Excellence in Innovation in Australia impact assessment trial.
One promising source of evidence is the new suite of alternative
metrics or altmetrics that have been developed to gauge the academic and
public impact of digital scholarship, that is, any scholarly output
that has a digital identifier or online location and that is accessible
by the web-public.
The advent of altmetrics
Altmetrics (or alternative metrics) is a term aptly coined in a tweet
by Jason Priem (co-founder of ImpactStory). Altmetrics measure the
number of times a research output is cited, tweeted about, liked,
shared, bookmarked, viewed, downloaded, mentioned, favourited, reviewed,
or discussed. These numbers are harvested from a wide variety of open
web services that count such instances, including open access
journal platforms, scholarly citation databases, web-based research
sharing services, and social media.
The numbers are harvested almost in real time, providing researchers
with fast evidence that their research has made an impact or generated a
conversation in the public forum. Altmetrics are quantitative
indicators of public reach and influence.
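The harvesting step described above amounts to merging per-source event counts into one tally per research output. The sketch below illustrates the idea; the source names and numbers are made up for illustration, not drawn from any real service:

```python
from collections import Counter

def aggregate_counts(sources):
    """Merge per-source event counts into a single tally for one research output."""
    total = Counter()
    for counts in sources.values():
        total.update(counts)  # sums overlapping keys, keeps distinct ones
    return total

# Hypothetical counts harvested from three kinds of sources (illustrative only):
harvested = {
    "journal_platform": {"views": 320, "downloads": 45},
    "reference_manager": {"bookmarks": 12},
    "social_media": {"tweets": 30, "shares": 7},
}

print(dict(aggregate_counts(harvested)))
# {'views': 320, 'downloads': 45, 'bookmarks': 12, 'tweets': 30, 'shares': 7}
```

A real altmetrics service would do this continuously against live feeds, which is what makes the "almost in real time" reporting possible.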
The monitoring of one’s impact on the social web is not an exercise
in narcissism. Altmetrics enable the creation of data-driven stories for
funding providers and administrators. Being web-native, they also
facilitate the fleshing out of those stories, by providing links to the
sources of the metrics. Researchers can see who is talking about
their research, what they are saying about it, and even how they intend
to use it for various scholarly, industry, policy and public purposes.
In this way, researchers can find potential collaborators and partners,
and gain constructive feedback from those interacting with the research.
Altmetrics also provide a democratic process of public review, in
which outputs are analysed and assessed by as many students,
researchers, policy makers, industry representatives, and members of the
public as wish to participate in the discussion. Altmetrics thus provide
a more comprehensive understanding of impact across sectors, including
the public impact of publicly funded research.
Altmetrics and open access
There is an interesting relationship between altmetrics and open
access. One could even refer to altmetrics as open metrics. This is
firstly because altmetrics draw on open data sources.
Altmetrics services access and aggregate the impact of a research
artefact, normally via an application programming interface (API)
made available by the source. Altmetrics services in turn provide APIs
for embedding altmetrics into institutional repositories or third-party
systems. Secondly, open access research outputs that are themselves
promoted via social web
applications enjoy higher visibility and accessibility than those
published within the commercial scholarly communication model,
increasing the prospect of public consumption and engagement.
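As a sketch of how such an API is typically queried: the endpoint below follows the shape of Altmetric's public REST API, but the DOI and the JSON fields in the sample record are illustrative assumptions, and the live HTTP request (via `urllib.request.urlopen`, say) is left out:

```python
import json
from urllib.parse import quote

ALTMETRIC_API = "https://api.altmetric.com/v1/doi/"  # Altmetric's public API base

def altmetric_url(doi):
    """Build the lookup URL for a DOI; percent-encode everything except '/'."""
    return ALTMETRIC_API + quote(doi)

def summarise(record):
    """Pull a few headline numbers out of an altmetrics JSON record."""
    return "score {0}, {1} tweeters".format(
        record.get("score", 0),
        record.get("cited_by_tweeters_count", 0),
    )

# An illustrative (made-up) response in the shape such APIs return:
sample = json.loads('{"score": 12.5, "cited_by_tweeters_count": 8}')

print(altmetric_url("10.1234/example.doi"))  # hypothetical DOI
print(summarise(sample))
```

The same pattern, in reverse, is what lets repositories embed altmetrics: the repository calls the service's API per record and renders the returned counts alongside the item.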
Altmetrics (also known as article level metrics or ALMs) are seen as complementary to open access. The PLOS Article Level Metrics for Researchers page lists some of these complementarities:
- Researchers can view and collect real-time indicators of the reach
and influence of outputs, and share that data with collaborators,
administrators and funders
- Altmetrics empower researchers to discover impact-weighted trends and innovations
- Researchers can discover potential collaborators based on the level of interest in their work
- High impact datasets, methods, results and alternative interpretations are discoverable
- Dissemination strategies and outlets can be tracked, evaluated and reported on
- Evaluation of research is based on the content, as opposed to the container (or journal)
- Research recommendations are based on collective intelligence indicators
Several articles in a recent special section on altmetrics also touch
on the complementarity between altmetrics and open access. These
articles note that altmetrics:
- Provide open source social impact indicators that can be embedded into CVs
- Enable a public filtering system and track social conversations around research
- Provide evidence of access by countries that cannot afford expensive journals
- Provide authors with a more comprehensive understanding of their readership
- Offer repository managers additional metrics for demonstrating the impact of open access
- Provide additional usage data for collection development and resource planning exercises
- Provide supplementary impact indicators for internal reviews and funding applications
- May be used as quantitative evidence of public impact for research evaluation exercises
- Provide a better reflection of the usage and impact of web-native outputs
This open scholarly communication model is one of sharing findings as they occur,
interaction and evaluation by interested parties, and subsequent
conversations leading to future collaborations and revised or new
findings. And altmetrics provide us with an understanding of the impact
received at each point in the cycle.
Providers of altmetrics
The following services are good places to start monitoring your
altmetrics. Both Altmetric and ImpactStory can be embedded into
repositories, and ImpactStory has the further advantage that impact
“badges” can be embedded into CVs. Altmetric also offers a free
bookmarklet that can be added to your bookmarks and used to get
altmetrics on articles with Digital Object Identifiers (DOIs) or
identifiers in open databases such as PubMed Central or arXiv. The
bookmarklet works only in Chrome, Firefox or Safari. Plum Analytics
probably has the widest coverage of altmetrics sources, but is a paid
service. Both Altmetric and Plum Analytics offer commercial tools that
provide comparative and group reports.
The best way to engage with altmetrics is to jump right in and have a
play. You will be amazed at how quick and easy it is to use the tools
and start generating metrics for your research outputs.
Repository administrators can embed altmetrics at the article level
within institutional repositories to complement traditional metrics
such as views and downloads. Some research information management
systems that are capable of generating reports on publication activity
and impact, such as Symplectic Elements, also include article-level
altmetrics alongside traditional metrics.
Pat Loria is the Research Librarian at the University of Southern Queensland.
His twitter handle is @pat_loria.
Altmetrics and open access: a measure of public interest – Australasian Open Access Strategy Group