Tuesday 14 November 2017

Thing 23: Altmetrics – 23 Research Things (2017)

 Source: http://blogs.unimelb.edu.au/23researchthings/2017/11/13/thing-23-altmetrics/


Thing 23: Altmetrics

Image: “altmetrics” by AJC1 via Flickr (CC BY-SA 2.0)

The rise of Web 2.0 technologies is
linked to non-traditional scholarly publishing formats such as reports,
data sets, blogs, and outputs on other social media platforms. But how
do you track impact when traditional measures such as citation counts
don’t apply? Altmetrics to the rescue!



Getting Started

The term “altmetrics” (alternative metrics) was coined in a tweet in 2010, and the field has gone from strength to strength since, even producing a manifesto. With no absolute definition, the term can refer to:




  • impact measured on the basis of online activity, mined or gathered from online tools and social media (e.g. tweets, mentions, shares, links, downloads, clicks, views, comments, ratings, followers and so on);
  • metrics for alternative research outputs, for example citations to datasets;
  • alternative ways of measuring research impact.
Benefits of altmetrics include that they can:


  • provide a faster method of accumulation than the more traditional citation-based metrics;
  • complement traditional citation-based metrics, providing a more diverse view of research impact and engagement;
  • offer an opportunity to track the increasing availability of data, presentations, software, policy documents and other research outputs in the online environment.
Below we introduce some of the tools available to University of Melbourne staff and students to start collecting altmetrics.




Altmetric Explorer

Altmetric Explorer
has been monitoring online sources since 2012, collating data from Web
of Science, Scopus, Mendeley, Facebook, Twitter, Google+, LinkedIn,
Wikipedia (English language), YouTube, public policy documents, blogs
and more. Altmetric Explorer uses a donut graphic to visually identify the type and quantity of attention a research output has received:


The University of Melbourne Library’s subscription to Altmetric Explorer provides access to institutional data, as well as data for individual researchers and their outputs. Consider installing the Altmetric bookmarklet
on your toolbar to view Altmetric data for your publications. (Note: this is only available for PubMed, arXiv, or pages containing a DOI with Google Scholar-friendly citation metadata.)




PlumX Metrics

PlumX brings together research metrics for all types of scholarly research output, categorised as follows:  


  • Usage: clicks, downloads, views, library holdings, video plays…
  • Captures: bookmarks, code forks, favorites, readers, watchers…
  • Mentions: blog posts, comments, reviews, Wikipedia links, news media…
  • Social Media: tweets, Facebook likes, shares…
  • Citations: citation indexes (Crossref, Scopus, PubMed Central, etc.), patent citations, clinical citations, policy citations…
PlumX Metrics for individual articles, conference proceedings, book chapters and other resources are available via the University of Melbourne Library’s Discovery search service – look for the Plum Print in the results list for your search, and hover your cursor over it to expand details of the metrics:


Example of PlumX Metrics for an article retrieved through Discovery

Scopus also displays PlumX Metrics for articles where available, offering an interesting opportunity to view altmetrics alongside “traditional” citation metrics and the Scopus field-weighted citation impact.


Impactstory

Impactstory is an
open source, web-based researcher profile that provides altmetrics to
help researchers measure and share the impacts of their research outputs
for traditional outputs (e.g. journal articles), as well as alternative
research outputs such as blog posts, datasets and software. Researchers
can register an Impactstory profile for free via their Twitter account,
then link other profiles such as ORCID and Google Scholar, as well as Pubmed IDs, DOIs, Webpage URLs, Slideshare and Github usernames. Impactstory
then provides an overview of the attention these connected collections
have received. Information from Impactstory can be exported for private
use. 





Have a look at this Impactstory example profile to find out more.


Public Library of Science – Article Level Metrics (PLOS ALMs)

If you publish research in the life sciences, you can use PLOS ALMs to help understand the influence and impact of work before academic citations accrue.


  • All PLOS
    journal articles display PLOS ALMs – quantifiable measures that
    incorporate both academic and social metrics to document the many ways
    in which both scientists and the general public engage with published
    research.  
  • PLOS ALMs are presented on the metrics tab on every published article. 
  • Use ALM reports to guide you to the most important and influential work published. 

Minerva Access

The University of Melbourne’s institutional repository, Minerva Access,
allows research higher degree students and research staff to safely
promote and self-publish their research. There are a number of incentives for including your work in the repository:




  • Minerva Access is harvested by Google Scholar, which in turn provides exposure and the potential for follow-on citations.
  • Minerva Access provides usage statistics for each item in the repository, as well as for each collection and sub-collection. At the lower left of each page, click the Statistics icon/link to see how many times each record has been viewed (and from which countries) and, if applicable, how many times any associated PDF has been downloaded. Data is available by month and by year.

Considerations

  • Altmetrics have several advantages over traditional citation counts: they accumulate more quickly, they document non-scholarly attention and influence, and they can track attention to non-traditional research outputs. However, they say nothing about the quality of the research. You need both types of metrics – traditional and alternative – to get the full picture of research impact.
  • Manual work is needed to assess the underlying qualitative data that makes up the metrics (who is saying what about the research).
  • While altmetrics are good at identifying ‘trending’ research, they have not yet been proven to be a good indicator of lasting, long-term impact.
  • Researchers seeking to evaluate non-English-language sources will find that altmetrics coverage is currently limited for these outputs.

Learn More

  • For guidance around the tools, including useful summaries and tips, have a look at the University of Melbourne Library’s Altmetrics Subject Research Guide.

  • The Altmetric Explorer website provides a range of case studies showing how researchers and institutions have tracked societal attention to their research.
  • In this blog post,
    Prof. Jonathan A. Eisen at the University of California, Davis,
    describes how he used Impactstory to look at the altmetrics of his
    research papers and data. 
  • Dip into the readings of the PLOS Altmetrics Collection
    and gather understanding on the statistical analysis of altmetrics data
    sources, validation of models of scientific discovery and qualitative
    research describing discovery based on altmetrics. 
  • The London School of Economics Impact Blog regularly runs features on Altmetrics.
This post was written by Georgina Binns (Faculty Librarian, VCA and MCM) and Fransie Naudé (Liaison Librarian, Education).









Thursday 9 November 2017

4 New things about Google Scholar - UI, recommendations, and citation networks



 Source: http://musingsaboutlibrarianship.blogspot.my/2017/10/4-new-things-about-google-scholar-ui.html



I'm actually a pretty big fan of Google Scholar, which in some ways is better than our library discovery service. But even if you aren't a fan, given its popularity it's important for librarians to keep up with the latest developments.



In any case, I'm happy to see that Google continues to enhance Google
Scholar with new features. These are some of the new features and things
I've learnt about Google Scholar lately.





1. Google Scholar's new UI 

The new interface is a lot cleaner, particularly on mobile, and most of the changes aren't major (e.g. replacing the text of "save" and "cite" with icons), but I miss the easy access to advanced search the old interface had.
Click the down-arrow button to get access to advanced search in the old Google Scholar


In the new Google Scholar interface, it is now tucked under the "hamburger" menu, where more people might notice it.
On the plus side, very few people knew Google Scholar had an advanced search or even how to access it, so overall it might be an OK trade-off, though it now takes two clicks instead of one to reach the advanced search.
The change also doesn't seem to have affected link resolvers or the various extensions that rely on scraping Google Scholar, such as Publish or Perish and the Google Scholar Button, so this is still a relatively minor layout change.

2. Get recommendations of works related to other scholars' works

Official change announcement. 



For a long time Google Scholar had an odd gap. As arguably the largest scholarly index in the world, with perhaps the largest number of users of any scholarly search engine, it was well placed to use all this data to create a fantastic recommendation system. Add Google's famed machine learning and it looked like a no-brainer.



But it was only in 2012, nearly eight years after launch, that Google Scholar added a recommendation system.
And as you might expect, the recommendations are excellent. While other recommendation systems for scholarly material exist (e.g. the bX recommender, Mendeley's, various publisher-based ones), none, in my opinion, is as broad-ranging or timely as Google Scholar's, for the reasons already mentioned.





Google Scholar recommended articles


Still, there was a curious gap. The recommendation system only gave recommendations based on the works already in your Google Scholar profile.



The flaw here is obvious: what if you are working in a new area you haven't formally published in yet? Arguably this is exactly when you have the greatest need for a recommendation system.



I wanted a feature where I could put a set of articles into Google Scholar and it would recommend related articles over time. One crazy idea I had at the time was to create a brand-new fake Google Scholar profile, load it up with articles I'm interested in, keep the profile private, and leverage the recommendations provided.



Unfortunately this doesn't work, because the Google Scholar profile has to be public before recommendations appear.



Another way that probably works is to exploit the fact that papers deposited in ResearchGate or preprint servers do appear in Google Scholar and hence can be added to your profile fairly quickly. You could, for example, create a quick working paper (with citations to works you know of) and deposit it in an institutional repository or preprint server that is indexed by Google Scholar, add it to your Google Scholar profile, and wait for recommendations to appear. But this still seems forced, and do you really want to mess up your profile just to get a few recommendations?



So the new feature added by Google is much appreciated. While you still can't add an arbitrary set of articles, you can go to any Scholar profile and choose to follow the author's new works, citations and, most importantly, articles related to the author's research.








Follow Harzing's profile to get recommended articles similar to her research publications in Google Scholar

It's not super clear to me whether it just sends new articles via email or whether it updates the recommended-articles list you get within Google Scholar. I suspect the former (technically, articles shown this way are alerts, not recommendations), but it's still useful.

3. Google Scholar citation profile improvements - allows one-time export to ORCID

This isn't a new feature in Google Scholar but a fairly new feature in ORCID. I often find that many researchers have their Google Scholar profile fully populated with their works (no doubt partly because Google makes it so easy, particularly with automatic or semi-automatic updates, and partly because they reason the profile increases their visibility), but are reluctant to spend the time populating their ORCID profile.
Exporting works via BibTeX
This of course only works as a one-time upload, and you will have to keep adding future works yourself, hopefully by other automated means (e.g. via journal Crossref links, or from CRIS/RIM systems).
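To get a feel for what such a BibTeX export contains, here is a toy Python sketch that pulls the titles out of a couple of invented entries. A real importer, such as ORCID's BibTeX import, handles far more than this naive regex does:

```python
import re

# A tiny sample of the kind of BibTeX a Google Scholar profile exports
# (entries invented for illustration).
bibtex = """
@article{ng2017scholar,
  title={Four new things about Google Scholar},
  year={2017}
}
@article{smith1981citation,
  title={Citation analysis},
  year={1981}
}
"""

# Naive parse: grab the title field of each @-entry. Real BibTeX needs
# a proper parser (nested braces, escapes), so treat this as a sketch.
titles = re.findall(r"title=\{([^}]*)\}", bibtex)
# titles == ['Four new things about Google Scholar', 'Citation analysis']
```

Even this crude pass shows why the one-time export is attractive: the profile's whole publication list arrives as structured fields rather than something you re-key by hand.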
Another fairly new feature in Google Scholar Citations is that it now tries to group authors by institution. So, for example, when you search for the name of an institution in Google Scholar, you get something like this.
Searching by institutions in Google Scholar
Clicking on the link gets you this.
Top cited profiles from the institution
There's a study on how accurate this institution matching is, but what are the practical implications for ordinary librarians who aren't doing advanced bibliometrics?
For one, it allows you to fairly easily get the top 10, 20, etc. most-cited authors at your institution, to complement the lists you get from Web of Science/InCites or Scopus/SciVal.
You can't jump to the end of the results to gauge how many authors your institution has on this platform.

It's unfortunate that for this set of results Google doesn't list the number of results; you also can't gauge it from the number of pages in the results list, and you can only go forward page by page (see above).
I don't know of a way around this; even altering the URL parameters "&after_author" or "&astart=30" doesn't work.


4. Scraping of Google Scholar to create network diagrams/ Bibliometric networks

It basically works as follows. First, the system allows you to search via Google Scholar for papers to add. Below, I searched for the term "open access" and then added some of the papers to the system. You can of course search for specific papers by title. Once you are done with a set of papers, you can click on "Check Citations" and it will use Google Scholar's "search within citing articles" feature to see whether the articles in your set are connected.
It took me a while to understand how it works, but here's a specific example. Say you have the following two papers:
"Eysenbach, G. (2006). Citation advantage of open access articles. PLoS biology, 4(5), e157."
and
"Harnad, S., & Brody, T. (2004). Comparing the impact of open access
(OA) vs. non-OA articles in the same journals. D-lib Magazine, 10(6)."
The system will automatically go to the Google Scholar list of citations for Harnad, S., & Brody, T. (2004) and, using "Search within citing articles", check whether Eysenbach, G. is included.
It will do this automatically for all pairs of papers in your set of reference articles. All these searches are done in a popup window; if the volume is too big, Google Scholar will throw up a CAPTCHA for you to solve, and it will then continue.
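The pairwise checking described above can be sketched in a few lines of Python. This is a toy model, not the actual tool: the hand-made `CITATIONS` table stands in for the results of Google Scholar's "search within citing articles" queries:

```python
from itertools import permutations

# Toy stand-in for the scraper's data: which papers cite which.
# In the real tool this information comes from Google Scholar's
# "search within citing articles" feature, not a local table.
CITATIONS = {
    "Harnad & Brody (2004)": {"Eysenbach (2006)"},  # Eysenbach cites it
    "Eysenbach (2006)": set(),
}

def cites(citing, cited):
    """True if `citing` appears among the citing articles of `cited`."""
    return citing in CITATIONS.get(cited, set())

def citation_edges(papers):
    """Check every ordered pair of papers and collect citation links."""
    return [(a, b) for a, b in permutations(papers, 2) if cites(a, b)]

papers = ["Eysenbach (2006)", "Harnad & Brody (2004)"]
edges = citation_edges(papers)
# edges == [("Eysenbach (2006)", "Harnad & Brody (2004)")]
```

Because every ordered pair is checked, the number of queries grows quadratically with the size of your paper set, which is presumably why large sets trigger Google Scholar's CAPTCHA.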
You can then export a basic visualization of the author network, which shows co-authorships and citations. Here's my first toy example, using papers I cited in a recent working paper.
It isn't too impressive, probably because I don't have enough papers, but it does show the structure I expected, with two main clusters: one around L.C. Smith (1981) (an old paper on citation analysis for library collection evaluation) and one around Eysenbach (2006), a well-cited early paper on the citation advantage. I would have thought they would not be connected at all (particularly since I removed Eugene Garfield's seminal publications), but they still seem to be linked indirectly through an author who cited both.
You can export the network for further study into the open-source Gephi network visualization tool, and I have spent some time playing with more complicated networks, such as publication and author networks, using modularity to identify clusters of works. I'll probably cover this in a separate blog post, but for now I'm very intrigued.

How useful are such networks for researchers? 

Could doing such network graphs be useful for researchers, particularly those new to a field, to help them see how their research fits into existing research and spot connections they wouldn't otherwise have seen?
Could this be auto-generated from the references of existing papers, to help the reader get a sense of where the current article sits?
Can such network graphs provide improved recommendations (or do recommendations from Google Scholar etc. already do that implicitly)?
How big a network (or set of articles) is needed before this becomes useful? Is it useful only for theses with over 50 references (or, better yet, everything in your reference manager, not just what you cited)? Would most researchers find that these network graphs only reproduce clusters they already know intuitively, or would they provide some unexpected insights?
In a future post I will talk about my experiments with these three scenarios:
a) visualizing networks between publications that cite my old 2009 article;
b) visualizing networks between publications cited in my old 2009 article and my newest paper;
c) visualizing networks between publications cited in an article not in my field (to see if it helps orient me in an area I'm not familiar with).
Would I learn anything from doing such visualizations?
Of course, this idea isn't new; I'm guessing there is already research out there on this.
Existing tools like Web of Science have limited citation-map capabilities, and the newer InCites and SciVal also provide mapping capabilities, though often at the higher level meant for research administrators.
On the free side, VOSviewer also provides the ability to visualize citation networks. The newest version, 1.6.6, adds the ability to generate networks not just from Scopus and Web of Science citations but also from Crossref.







VOSviewer 1.6.6 supports Web of Science, Scopus, PubMed and Crossref 
So one can generate similar networks using DOIs from VOSviewer. Still, I suspect scraping Google Scholar might give richer results, given its much larger scale compared to, say, Scopus. Also, given the popularity of Google Scholar as a discovery tool, relying on other tools such as Scopus to create networks might risk missing too many works found via Google Scholar.
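As a rough sketch of what a DOI-based citation network looks like under the hood, the toy Python below builds a directed edge list from per-DOI reference lists. The reference data here is invented; a tool like VOSviewer would derive it from Crossref metadata:

```python
# Hypothetical reference lists keyed by DOI, of the kind Crossref
# metadata provides. The DOIs below are invented for illustration.
references = {
    "10.1000/a": ["10.1000/b", "10.1000/x"],  # paper a cites b and x
    "10.1000/b": ["10.1000/c"],
    "10.1000/c": [],
}

def doi_network(refs):
    """Build directed citation edges, keeping only edges whose both
    endpoints are in our set of papers (references to outside works,
    like 10.1000/x above, are dropped)."""
    papers = set(refs)
    return sorted((src, dst)
                  for src, cited in refs.items()
                  for dst in cited if dst in papers)

edges = doi_network(references)
# edges == [('10.1000/a', '10.1000/b'), ('10.1000/b', '10.1000/c')]
```

The resulting edge list is exactly the kind of input a network tool such as Gephi or VOSviewer visualizes; the hard part in practice is the coverage of the underlying citation source, which is the point made above about Google Scholar versus Scopus.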


Conclusion

Hope you found some of this useful.



It's good to see Google continue to improve Google Scholar. While we may not know when Google might decide to abandon it, the recent spate of improvements is a good sign that it won't be anytime soon.


Tuesday 7 November 2017

Thing 20: Avoiding Deceptive, Unethical, Predatory and Vanity Publishing – 23 Research Things (2017)

 Source: http://blogs.unimelb.edu.au/23researchthings/2017/11/06/thing-20-avoiding-deceptive-unethical-predatory-and-vanity-publishing/





Thing 20: Avoiding Deceptive, Unethical, Predatory and Vanity Publishing

Image: “Banner header attention caution” by geralt via Pixabay (CC0)

Some argue that the breakdown of trusted information sources is one of the major challenges of the 21st century (Gray, 2017). This view is influenced by the growth in deceptive, unethical and predatory publishing practices occurring online. As victims, academics and their institutions often experience financial and reputational damage from unethical scholarly publishing.



Getting Started

When the time comes to consider suitable scholarly publishing outlets for your research, we highly recommend undertaking due diligence to select quality outlets. Being vigilant and regularly updating your knowledge of scholarly publishing outlets so you can assess their quality is a means to avoid publishing traps and pitfalls.


A predatory publisher has been defined as a type of scholarly publishing company established primarily to collect Article Processing Charges (APCs) and provide very fast publication without peer review or even checks of grammar and spelling. They often spam academics with requests for submissions and reviews, and with requests to join their editorial boards (Shen & Björk, 2015).


Warning Signs

Key characteristics of deceptive, unethical, predatory and vanity publishing practices can comprise:


  • spelling and grammar errors, along with distorted images, on the website;
  • advertising of fake metrics, e.g. the Global Impact Factor (GIF);
  • a journal website with an overly broad scope;
  • language that targets authors rather than readers;
  • promises of rapid publication;
  • a lack of information about retraction policies, manuscript handling or digital preservation;
  • manuscript submission by e-mail;
  • taking copyright ownership of material (usually theses);
  • hijacking journal titles, establishing duplicate websites and using business names similar to those of respected publishers;
  • organising conferences to collect funds from presenters and participants without peer review or a formal program;
  • promoting non-existent conferences;
  • adding academics to editorial boards without permission; and
  • unexpected fees after accepting submissions.
The following site monitors problematic publishers: Retraction Watch.


“Characteristics
of predatory publishing practices” – original graphic by Tanja
Ivacic-Ramljak (Liaison Librarian (Learning & Teaching), Veterinary
& Agricultural Sciences). Click image for full screen.

Considerations/Risks

There is no single, absolute authority that determines the best or worst scholarly publishing outlets.


There are many useful resources to help evaluate suitable publishing
outlets for your scholarly research. The usefulness of serials
directories such as UlrichsWeb, Scimago and SHERPA/RoMEO will depend upon your scholarly publishing requirements and field.


The Committee on Publication Ethics (COPE) has members worldwide from all academic fields. Membership is open to editors of academic journals and others interested in publication ethics. If you find the COPE logo on a journal’s website, it is an indication that the journal has been critiqued by COPE as a prerequisite for membership. Together with the Open Access Scholarly Publishers Association (OASPA), the Directory of Open Access Journals (DOAJ) and the World Association of Medical Editors (WAME), COPE has set minimum criteria that journals are assessed against when they apply for membership of the respective organisations; here is a link to the full criteria on principles of transparency and best practice.


Try This

We recommend the following sources for critiquing any publisher that
has approached you with an invitation to publish your research:


  • Cabell’s Scholarly Analytics includes a whitelist of over 11,000 journals and a blacklist of “likely deceptive or fraudulent academic journals” for selected disciplines.

  • PubsHub is
    a database of submission criteria for peer-reviewed medical journals
    and congresses. The database contains information on 6,000 medical
    journals.
  • The openly available resource Think Check Submit. Follow this checklist to make sure you choose trusted journals for your research.
Other useful criteria are available from:


We encourage you to contact your Liaison Librarian for further advice.


Learn More

Author Mills

Stromberg, J. (2014). I sold my undergraduate thesis to a print content farm: A trip through the shadowy, surreal world of an academic book mill. Slate.


Growth in Predatory Publishing

Clark J., & Smith R. (2015) Firm action needed on predatory journals. BMJ. 350 (Jan16_1): h210


Beall J. (2012) Predatory publishers are corrupting open access. Nature. 489(7415): 179-180.


Shen, C., & Björk, B.-C. (2015). ‘Predatory’ open access: A longitudinal study of article volumes and market characteristics. BMC Medicine.


Xia J. (2015) Predatory journals and their article publishing charges. Learn Pub. 28(1): 69–74.


Predatory Conferences

Pai, M., & Franco, E. (2016, updated 2017). Predatory conferences undermine science and scam academics. Huffington Post Blog.


Byard, R. W. (2016). The forensic implications of predatory publishing. Forensic Science, Medicine and Pathology, 12(4), 391-393.


Hijacking

Dadkhah, M., Maliszewski, T., & Teixeira da Silva, J. A. (2016). Hijacked journals, hijacked websites, journal phishing, misleading metrics, and predatory publishing: actual and potential threats to academic integrity and publishing ethics. Forensic Science, Medicine, and Pathology, 12(3), 353-362.


Fake News

Gray, R. (2017). Lies, propaganda and fake news: A challenge for our age. [online] BBC.com.


This post was written by Lisa Kruesi (Faculty Librarian, Health
& Life Sciences), Satu Alakangas (Liaison Librarian (Research), Law)
and Sarah Charing (Liaison Librarian (Research), Architecture, Building
& Planning).
Original graphic by Tanja Ivacic-Ramljak (Liaison Librarian (Learning & Teaching), Veterinary & Agricultural Sciences).








Thing 19: Open Access and Your Thesis – 23 Research Things (2017)



 Source: http://blogs.unimelb.edu.au/23researchthings/2017/11/06/thing-19-open-access-and-your-thesis/

Thing 19: Open Access and Your Thesis

Image: “Open Access (storefront)” by Gideon Burton via Flickr (CC BY-SA 2.0)

“By
making my PhD thesis Open Access, I hope to inspire people around the
world to look up at the stars and not down at their feet; to wonder
about our place in the universe and to try and make sense of the
cosmos.” (
Stephen Hawking on the release of his 1966 PhD thesis)

Making your thesis publicly accessible requires consideration of a number of issues: institutional policy; the attitudes of prospective publishers; third-party copyright; and indexing in search engines, including the effect on the citation and impact of your work. This installment of 23 Research Things aims to shed some light on these considerations.
 


Institutional Policies

All universities have policies and procedures for enabling public
access to higher degree theses. The specifics of the University of
Melbourne policy are laid out at My Thesis in the Library and Preparation of Graduate Research Theses Rules. Advice elsewhere will reflect particular institutional requirements.  


Over the last 20 years universities have supplemented public access
to print copies of theses on library shelves with online access via
institutional repositories, such as Minerva Access. 


How Are Theses Discovered?

Theses are a link in the scholarly communications chain and the
provision of public access to them is long-standing university practice.
Discovery of print theses in university libraries has been facilitated
by discovery services like Google Books, Google Scholar and Trove. Online open access extends this discovery and access, but comes with particular challenges and opportunities.  


Why Make Your Thesis OA – Impact, Engagement and Profile

Why is OA important?  An online thesis is one way you can establish your profile in a subject area, bringing you to the attention of potential collaborators, colleagues and employers. Theses indexed in Google Scholar will include citation data if referred to in other publications. Repositories provide counts for downloads and views, indicating both volume and location of your readership. Some institutions
have begun to track “alt-metric” counts for theses, providing further
indications of impact and engagement. You may never reach the dizzying
heights of Stephen Hawking’s thesis download or altmetric count but you can always dream!  


Should You Embargo Your Thesis?

So, if OA theses are both personally rewarding and a social good, why
would you choose an embargo? In fact, embargoes are a legitimate
response to institutional and individual concerns around immediate OA. 


While there can be commercial and legal issues, or issues of cultural
sensitivity which demand embargo, most often the concern is around the
perceived threat to subsequent publication from the thesis. Is such
concern warranted? A 2014 survey of science publishers
found that over 80% would, with some qualification, accept article
submissions from work based on OA theses. A 2017 in-house survey of 50 key business and economics journals
found none would outright exclude a publication stemming from an OA thesis. Two of the 50 commented that they would reserve the right to refuse if there was considerable duplication between the thesis and the submitted journal article; however, it was also noted that it would be rare for a thesis simply to be repurposed as an article without substantial changes!


What about books? Some publishers, based on their public statements,
see OA theses as advantageous, allowing for the early identification of
viable new publications. Another large-scale publisher survey
found that 50% of university presses in social sciences and humanities
would accept submissions based on OA theses. However, a substantial
minority would not or would do so on a case-by-case basis.  


Copyright and Your Thesis

In most parts of the world, including Australia, an OA thesis is considered to be “published”, which means that using copyright material created by other people (“third-party copyright” such as text, images, graphs, etc.) requires not only explicit acknowledgment but may also require permission from the copyright owner. However, there are some circumstances where permission may not be required, for example for the purposes of criticism or review, or for satire or parody. For more information about these circumstances, see here.


Reusing your own, already-published work in your online
“thesis-by/with-publication” may also require permission from other
copyright owners. Publisher author rights policies generally support the
use of the author-accepted-manuscript (see Sherpa/Romeo). However, for OA papers published with a Creative Commons Licence, and subscription papers from some publishers like Elsevier, there is no problem using the final published version. 


Learn More

This post was written by Stephen Cramond (Manager, Institutional Repository) and Jenny McKnight (Research Consultant (Open Access)).
