Impact metrics for researchers
Stacy Konkiel is the Director of Marketing & Research at Impactstory, a not-for-profit service that tracks altmetrics. A former academic librarian, Stacy has written and spoken most often about the potential for altmetrics in academic libraries.
Stacy has been an advocate for Open Scholarship since the
beginning of her career, but credits her time at Public Library of
Science (PLOS) with sparking her interest in altmetrics and other
revolutions in scientific communication.
Prior to PLOS, she earned dual master's degrees in Information Science and Library Science at Indiana University (2008). You can connect with Stacy on Twitter at @skonkiel.
How do you quantify your “impact” as a researcher?
Most of us use citation counts and, in some fields, journal impact
factors, but these measures only capture a very specific type of
scholarly impact. For many researchers, they’re not useful for
understanding their diverse impacts: on the public, on policy, and on
practice.
For many researchers, quantifying impact can also be a scary
proposition since you often have no control over how others interpret
the numbers you provide. Moreover, citation counts don’t provide the
all-important context behind the numbers: What are people saying about
my work? And who’s saying it?
It’s time for researchers to supplement traditional measures with newer measures – altmetrics – that capture the diverse impacts of their scholarship.
This is the first of two posts about altmetrics. For this first post,
I argue that the impact metrics we’re currently using are good but too
limited in scope to be useful for 21st century researchers.
We should supplement these traditional metrics with ones that help us
understand how scholarship is used and shared on the social web
(altmetrics). In my second post, I’ll give a brief guide to the most
popular altmetrics aggregators and how to get started on collecting and
sharing your own impact metrics.
Traditional metrics miss important types of impact
Citations and journal impact factors have been the traditional ways to understand scholarly impact. For a long time, they were the appropriate
tools for the job.
In the era of the print journal, citations were the best way to
understand a work's academic impact. However, citations have many
weaknesses: they are slow to accumulate, cannot express the often-complex motivations behind a citation, and measure only a narrow type of scholarly impact (leaving other impacts, such as citations in public policy documents, unexplored).
Journal impact factors were also good at doing the job they were
created to do: helping librarians evaluate the quality of scientific
journals. Unfortunately, academics began to adopt and abuse the measure,
using a journal-level metric to approximate article-level impact.
The impact factor consequently fell into disrepute and has been
repeatedly criticised over the years (e.g. 1997, 2009, 2010, 2012).
We’re now in the era of web-native research: journal articles are now online more than ever before, along with the related data, figures, software, slides, and other research outputs.
The traces of conversations surrounding those outputs are now
discoverable in ways they weren’t before. These take the form of blog
posts, comments on preprints and articles, tweets, adaptations of
software code, tags assigned on social bookmarking sites like Mendeley
and Delicious, and Wikipedia mentions, among others.
Many academics now participate in web-native research to some extent.
Citations and journal impact factors cannot measure the impact of most
web-native research products.
Why haven’t our means of quantifying impacts evolved at the same pace of research?
Altmetrics allow you to build a more complete impact profile
Altmetrics – measurements of how scholarship is used, discussed, and shared on the social web – are a good way to measure the impacts of
web-native research. Altmetrics move beyond the very blunt tool that is
citation counting to document the many ways that a variety of research
outputs can have impact, as measured by the traces their use leaves
online.
Altmetrics are faster to accumulate than citations and journal impact
factors. Because they're sourced from website data, you can now know in
days what once took years to learn: that your data analysis code is
well written and useful to others (otherwise, it wouldn't be "forked" on
GitHub so often, would it?); that your
experiment was flawed (back to the drawing board!); or that you are being
read most often by graduate students, who are saving your work because
of its detailed description of your study's design (Mendeley reference library tags sure can tell you a lot!).
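To make the GitHub example concrete, here is a minimal sketch (my illustration, not part of the original post) of reading those reuse signals through GitHub's public REST API. The repository name is a hypothetical placeholder; `forks_count` and `stargazers_count` are standard fields in the API's repository response.

```python
# A minimal sketch: reading reuse signals for a (hypothetical)
# research-code repository via GitHub's public REST API.
import requests

REPO = "your-username/your-analysis-code"  # hypothetical placeholder

resp = requests.get(
    f"https://api.github.com/repos/{REPO}",
    headers={"Accept": "application/vnd.github+json"},
    timeout=10,
)
resp.raise_for_status()
repo = resp.json()

# Forks are a rough proxy for adaptation and reuse of your code;
# stars are closer to a bookmarking or "saves" signal.
print(f"Forks: {repo['forks_count']}")
print(f"Stars: {repo['stargazers_count']}")
```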
Altmetrics are also better at helping us understand the impact of research products beyond the journal article.
| Type | Description | Examples |
|---|---|---|
| Shares | Posted publicly in order to share news of a research article or other output. | Twitter, Topsy, Facebook, reddit, news articles, blog posts, Google+, YouTube, Figshare, Mendeley |
| Saves | Saved on social bookmarking sites or favourited on social media and social coding websites. | Mendeley, CiteULike, Delicious, GitHub, Twitter, Slideshare |
| Reviews | Discussed with additional commentary added. | Faculty of 1000 (F1000), blog posts, article comments, Facebook comments |
| Adaptations | Creation of derivative works from an article or other output. | GitHub |
| Social usage statistics | Downloads or views on web services and social media sites. | Figshare, Slideshare, Dryad, Facebook, YouTube |
- Altmetrics have many flavors: Many altmetrics occur in clusters, and those clusters can give us insights as to how research is being used.
- Scholarly blog posts and Mendeley bookmarks for papers most often correlate with traditional impact metrics: Several studies have found that these measures correlate, to varying degrees, with later citations and high-impact journals.
- Correlations with citations aren’t the point: Altmetrics don’t only “work” if they demonstrate scholarly impact or correlate with scholarly impact metrics. The power in altmetrics data comes from the varied impact flavors they uncover, and also the wealth of qualitative data they bring together in one place, for users to dig into themselves.
- Altmetrics is still a relatively young field of study: We know that certain metrics cluster together, but not why those clusters occur. Of all the great research on altmetrics done to date, no reliable qualitative studies definitively answer questions like, "Why do researchers bookmark one study over another on Mendeley?" So while you can generally say that your work is being discussed on Twitter and bookmarked by other academics on Mendeley, you can't say for certain why.
- Altmetrics are not 100% comprehensive: No altmetrics service tracks all mentions of all scholarly outputs across the entire social web (yet), and altmetrics aren't always as good as citations at helping you understand traditional scholarly impact.
- No altmetrics reporting service is perfect: The altmetrics aggregators discussed in our next post are better at collecting comprehensive metrics for some products than for others. Take extreme care when comparing products or people: treat the numbers as minimums, and keep in mind that some of these metrics can be easily gamed. This is one reason altmetrics advocates believe having many metrics is valuable. (The sketch after this list shows one way to pull these counts yourself.)
- No metrics, including altmetrics, are a substitute for personal judgement of quality: Metrics are only one part of the story, and no metric can accurately measure the quality of an output. The best way to understand quality is the same as it has been for over 100 years: look at the research product yourself and discuss it with informed colleagues.
How do you plan to use altmetrics?
Do you practice web-native research? Are you interested in using altmetrics to help document your research impacts, or will you stick
with more traditional measures? Let’s chat in the comments below or on Twitter.