From bibliometrics to altmetrics
A changing scholarly landscape
- Author Affiliations
- Robin Chin Roemer is communication librarian at American University, e-mail: robincr@american.edu, and
- Rachel Borchardt is science librarian at American University, e-mail: borchard@american.edu
When future Science Citation Index founder Eugene Garfield first came up with the idea of journal impact factor in 1955, it
never occurred to him “that it would one day become the subject of widespread controversy.”1
Today, techniques for measuring scholarly impact—traditionally known as bibliometrics—are well known for generating conflict and concern, particularly as tenure-track scholars reach beyond previously set boundaries of discipline, media, audience, and format. From the development of more nuanced academic specialties to the influence of blogs and social media, questions about the scope of scholarly impact abound, even as the pressure to measure such impact continues unabated or even increases.
As faculty at universities around the world struggle to find new ways of providing evidence of their changing scholarly value, many librarians have stepped forward to help negotiate the landscape of both traditional impact metrics, such as the h-index and journal impact factor, and emerging Web-based alternatives, sometimes called altmetrics, cybermetrics, or webometrics. With interest in online venues for scholarly communication on the rise, and the number of tools available for tracking online influence growing steadily, librarians are in a key position to take the lead in bolstering researchers’ knowledge of current trends—and concerns—in the new art and science of impact measurement.
General resources
Google Scholar Citations. This free Google service allows authors to create profiles that manage, calculate, and track citation data such as the h-index and i10-index (i.e., the number of articles with at least ten citations). Using a statistical model based on author and article metadata to identify relevant citations, the service offers the option of automatically adding new articles to users’ public or private profiles. Google also recently launched a related service, Google Scholar Metrics, which gauges the “visibility and influence” of articles and publications from 2007 to 2011, based on Google Scholar citation data. Access: http://scholar.google.com/intl/en/scholar/citations.html.
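Both indexes are straightforward to compute from a list of per-paper citation counts. A minimal sketch in Python (the sample counts below are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least ten citations."""
    return sum(1 for cites in citations if cites >= 10)

papers = [45, 32, 12, 9, 9, 7, 3, 1, 0]
print(h_index(papers))    # 6: the six most-cited papers each have >= 6 citations
print(i10_index(papers))  # 3: three papers have at least ten citations
```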
SCImago Journal and Country Rank. SCImago is a free Web site that runs on Scopus data to calculate two metrics: SCImago Journal Rank (SJR) and Source Normalized Impact per Paper (SNIP), which compare directly to Web of Knowledge’s Impact Factor. SJR is based on times cited, but also uses an algorithm similar to Google’s PageRank to calculate article influence, which it uses to create rankings. Using SCImago’s online interface, users can compare rankings of up to ten journals at a time, display top journals, and even display countries with influential journals in a discipline. Access: http://www.SCImagojr.com.
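The PageRank-style idea behind SJR, namely that citations from prestigious journals count for more than citations from obscure ones, can be sketched as an iteration over a journal-to-journal citation matrix. The three-journal matrix and damping factor below are invented for illustration and are not SCImago’s actual data or parameters:

```python
# Simplified PageRank-style prestige iteration over a journal citation
# matrix. cites[i][j] = citations from journal i to journal j.
cites = [
    [0, 10, 2],   # journal A
    [4, 0, 8],    # journal B
    [1, 3, 0],    # journal C
]
n = len(cites)
damping = 0.85
prestige = [1.0 / n] * n

for _ in range(50):  # iterate until scores stabilize
    new = []
    for j in range(n):
        # each citing journal passes on prestige in proportion
        # to where its outgoing citations point
        inflow = sum(
            prestige[i] * cites[i][j] / sum(cites[i])
            for i in range(n) if sum(cites[i]) > 0
        )
        new.append((1 - damping) / n + damping * inflow)
    prestige = new

print(prestige)  # citations from high-prestige journals weigh more
```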
Scopus. Scopus is a subscription database known primarily as an alternative to Web of Knowledge, as it offers similar article-, author-, and journal-level metrics, but uses slightly different algorithms to calculate them. Metrics include standard options such as times cited and h-index, as well as SCImago’s alternative offerings, SJR and SNIP. Scopus recently launched “Altmetric for Scopus,” a third-party application that runs within the sidebar of Scopus pages to track mentions of papers across social media sites, science blogs, media outlets, and reference managers. Access: http://www.scopus.com.
Web of Knowledge. This Thomson Reuters subscription database, which grew out of Eugene Garfield’s Science Citation Index, helped usher in modern bibliometrics. Web of Knowledge includes Web of Science, for article and author queries, and Journal Citation Reports, for journal queries. Its metrics include times cited, h-index, impact factor, Eigenfactor, and field-based journal rankings. While many of these metrics have been criticized for not fully representing scholarly value in certain disciplines, they are still considered the gold standard in traditional bibliometrics. Access: http://www.webofknowledge.com.
Altmetric resources
Altmetrics.org. This free Web site is a central hub for information about the growing altmetrics movement, which it defines as “the creation and study of new metrics based on the Social Web for analyzing and informing scholarship.” Cofounded by prominent figures in the world of bibliometrics, such as Jason Priem and Heather Piwowar, altmetrics.org maintains links to new online tools for calculating impact. Other prominent features include an altmetrics “manifesto” that argues that altmetrics can improve existing scholarly filters. Access: http://altmetrics.org.
Impact Story. Formerly known as Total Impact, Impact Story is a free, open source tool designed to support URL-based publishing through the aggregation of online altmetrics. Users create collections of materials through online identifiers, such as Google Scholar Profiles, DOIs, and PubMed IDs. Impact Story uses more than a dozen APIs to search for metrics on these collected items, with sources ranging from popular social media to scholarly tools like Mendeley and PLoS. Items are subsequently assigned impact categories, such as generally/highly “saved,” “cited,” “recommended,” or “discussed.” This resource is most useful for researchers publishing in nontraditional venues or with scholarship too new to have accumulated traditional citations; it is not, however, a comprehensive source for tracing Web impact. Access: http://impactstory.it/.
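The categorization step can be pictured as mapping each metric source to an engagement type and flagging unusually active items. The mapping, threshold, and sample values below are hypothetical, not Impact Story’s actual rules:

```python
# Hypothetical sketch of Impact Story-style categories: each metric is
# mapped to an engagement type, and items above an assumed percentile
# cutoff are flagged "highly" rather than "generally" active.
CATEGORY = {
    "mendeley_readers": "saved",
    "citeulike_bookmarks": "saved",
    "crossref_citations": "cited",
    "twitter_mentions": "discussed",
    "facebook_shares": "discussed",
}

def classify(metric, value, percentile):
    """Return e.g. 'highly saved' for an item's metric, or None if zero."""
    kind = CATEGORY.get(metric, "other")
    level = "highly" if percentile >= 0.75 else "generally"  # assumed cutoff
    return f"{level} {kind}" if value > 0 else None

print(classify("mendeley_readers", 42, 0.9))  # highly saved
print(classify("twitter_mentions", 3, 0.4))   # generally discussed
```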
PLoS Article Level Metrics. Public Library of Science (PLoS) has emerged as the leading open access journal publisher, in part due to its high traditional impact factors. However, PLoS offers an alternative to traditional impact in the form of Article Level Metrics, which track the influence of individual PLoS articles, from times downloaded to mentions in social media and blogs. PLoS also tracks internal article metrics, including comments, notes, and ratings. While it is a valuable resource for measuring impact, only PLoS articles benefit from its metrics. Nevertheless, this resource represents an important new avenue for metrics, which future publishers will likely replicate. Available for free online. Access: http://article-level-metrics.PLoS.org/.
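Article Level Metrics data are also exposed through a public API, so the numbers can be pulled programmatically. A minimal sketch, assuming the v3 REST endpoint, parameter names, and response fields shown here (all of which should be checked against PLoS’s ALM documentation; the API key is a placeholder):

```python
# Sketch of querying PLoS Article Level Metrics over HTTP.
# Endpoint, parameters, and response fields are assumptions based on
# the ALM v3 API; consult the ALM documentation before relying on them.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; PLoS issues keys on request
doi = "10.1371/journal.pbio.0000057"

resp = requests.get(
    "http://alm.plos.org/api/v3/articles",
    params={"api_key": API_KEY, "ids": doi, "info": "summary"},
)
resp.raise_for_status()
article = resp.json()[0]  # one record per requested identifier
print(article["title"], article["views"], article["citations"])
```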
Publish or Perish. Anne-Wil Harzing created Publish or Perish (PoP) to assist faculty looking for more diverse bibliometrics. PoP is a free, downloadable program that harvests data from Google Scholar based on author name. Users can manually remove records to refine the data, similar to what is now offered by Google Scholar Citations. PoP can also calculate numerous metrics, including alternatives to the h-index. However, because few people are familiar with these alternative calculations, it is up to users to explain such metrics to larger audiences; one such alternative, the g-index, is sketched below. Access: http://www.harzing.com/pop.htm.
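Egghe’s g-index, one of the alternatives PoP reports, is the largest number g such that an author’s g most-cited papers together have at least g² citations; it rewards a handful of very highly cited papers more than the h-index does. A minimal sketch (sample counts invented):

```python
def g_index(citations):
    """Largest g such that the g most-cited papers together
    have at least g*g citations (Egghe's g-index)."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(counts, start=1):
        total += cites  # running total of the top `rank` papers
        if total >= rank * rank:
            g = rank
    return g

papers = [45, 32, 12, 9, 9, 7, 3, 1, 0]
print(g_index(papers))  # 9, versus an h-index of 6 for the same list
```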
ReaderMeter. ReaderMeter is a free tool that “crowdsources” impact by processing readership data from Mendeley. Created by Dario Taraborelli of the Wikimedia Foundation, it contrasts with traditional bibliometric tools in its focus on readership, not citation. The site functions by compiling reports based on authors’ names, which are subsequently processed through the Mendeley API. Each report highlights information such as an author’s “HR-Index,” “GR-Index,” “Total Bookmarks,” and “Top Publications by Readership.” ReaderMeter has been criticized by some in the altmetrics community for drawing data exclusively from Mendeley.2 However, plans exist to integrate data from multiple reference management sites, such as CiteULike. Access: http://readermeter.org/.
Scholarly peer networks
Academia.edu. Academia.edu is a free online paper-sharing platform that encourages academics to increase their visibility and monitor research within and across its scholarly network. With nearly 2 million profiles and 1.5 million uploaded papers, academia.edu has become a popular player in the world of online repositories. Impact metrics for the site are similar to those offered by many blogs, and include profile views, document views, and country-based page traffic. Reflecting another growing trend among scholarly networks, the site also offers features geared toward social interaction, such as user statuses and an “ask a question” tool. Access: http://www.academia.edu/.
Mendeley. Mendeley is a relatively recent startup from some of the people behind Last.fm. It combines a citation manager with a scholarly social network to create a comprehensive research portal. Researchers with profiles can chart views and downloads of their research through the portal, join groups, and view popular articles within their fields. Mendeley has gained particular traction in the sciences, from which most of its users hail. However, with the integration of Mendeley data into more altmetrics tools, it will likely become popular with other disciplines, too. Mendeley is free, with for-cost storage upgrades, and is available both online and as a download. Access: http://www.mendeley.com.
Social Science Research Network (SSRN). SSRN is an online article repository, recently listed number one in the Ranking Web of World Repositories for 2012. It encompasses three key features: a database of more than 400,000 abstracts, a large electronic paper collection, and 20 specialized subject networks through which registered users can promote their work and connect to free abstracts and articles. Though praised for its ability to facilitate discovery of scholarship, SSRN has also been criticized for the strictness of its policies, which some see as stifling in comparison to emerging scholarly networks. Still, its site-specific metrics for “top papers,” “top authors,” and “top institutions” remain key to social science faculty. Access: http://www.ssrn.com.
VIVO. VIVO is a free, downloadable semantic Web application designed to facilitate research collaboration both within and between institutions. Originally developed at Cornell, it invites institutions to upload data related to faculty profiles, which it crawls in order to draw meaningful connections between researchers. VIVO doesn’t directly support user-centered metrics, but has the potential to be a powerful tool in collecting university-level research metrics. To date, only a few large institutions have implemented VIVO, as it requires significant programming knowledge and commitment. Access: http://vivoweb.org.
Blogs and media
Citation Culture. This two-year-old blog is the creation of Paul Wouters, director of the Centre for Science and Technology Studies at Leiden University (LU). Authored by Wouters and a fellow LU professor, the blog is dedicated to discussion of academic impact, from citation analysis to the broader evaluation of research across universities. Recent multipart posts have touched on topics such as humanities bibliometrics and scholarly altmetrics. While information on the site is excellent and detailed, posts are published sparingly, at a rate of one to two per month. Access: http://citationculture.wordpress.com/.
Jason Priem’s Web site. Jason Priem is a Ph.D. candidate at the University of North Carolina-Chapel Hill’s School of Information and Library Science and the cofounder of Impact Story. Priem has emerged as one of the strongest advocates for altmetrics, and a champion for library involvement. His interests touch on a variety of altmetrics topics, including the future of scientific communication, the open data movement, and authors’ rights. As the emerging altmetrics landscape continues to move forward, expect Priem to be at the front. Access: http://jasonpriem.org/.
Scholarly Kitchen. Established by the Society for Scholarly Publishing, Scholarly Kitchen is a moderated blog that presents ideas on current topics in scholarly publishing and communication. While not strictly focused on bibliometrics, many of the site’s “chefs” boast expertise in the intersection between impact and publishing. The site also offers useful category filters, such as “Metrics & Analytics,” which includes more than 280 posts and counting. Access: http://scholarlykitchen.sspnet.org/.
Bibliometrics research support
Elsevier Bibliometrics Research Program (EBRP). EBRP was designed by Elsevier as a way for bibliometrics researchers to gain access to large amounts of data for free. Available data includes publication metadata from Scopus, usage data, and full-text data from ScienceDirect. Researchers apply for the data, and successful applicants receive a dataset specifically designed for their project by Elsevier. Examples of successful projects on the site are especially useful to those who are interested in current altmetrics topics, such as the relationship between article downloads and citations. Access: http://ebrp.elsevier.com/index.asp.
OII Toolkit for the Impact of Digitised Scholarly Resources. This JISC-funded toolkit was developed by the Oxford Internet Institute to help authors, publishers, and librarians learn more about measuring the impact of digital scholarship. The Web site is divided into three sections: case studies, quantitative methods, and qualitative methods. The latter two sections define and discuss methodological subcategories, such as bibliometrics/scientometrics and content analysis. Contributions to the toolkit are encouraged in the form of articles and comments, which can be submitted after creating a free user account. Access: http://microsites.oii.ox.ac.uk/tidsr/welcome.
Organizations, conferences, and electronic lists
ACM Web Science Conference. The Web Science Conference is dedicated to the study of socio-technical relationships that shape and engage with the Web. An official ACM conference since 2011, Web Science brings together computer scientists with researchers from the social sciences, humanities, and law. Each conference has included a major workshop on the impact of the Web on scholarly communication—including this year’s “Altmetrics12” workshop, run by affiliates of altmetrics.org. Access: http://www.websci12.org/.
ASIST SIGMETRICS. This electronic list covers bibliometrics and altmetrics from a LIS perspective. Posts are equal parts announcements and discussions of issues related to bibliometrics, such as open access or the “gaming” of metrics systems. This electronic list is a great option for those interested in bibliometrics culture or in networking with bibliometrics specialists. It includes a searchable archive. Access: http://web.utk.edu/~gwhitney/sigmetrics.html.
International Society for Scientometrics and Informetrics (ISSI). ISSI is a major society dedicated to the study of bibliometrics, particularly in the sciences. Highlighted features include a biennial conference, abstracts of bibliometric journals, and an electronic list. Librarians interested in detailed analyses of bibliometrics should look to this site for a wealth of information. Access: http://www.issi-society.info.
- © 2012 Robin Chin Roemer and Rachel Borchardt