Friday, 28 April 2017

Publishing Research Support Documents in Open Access Platform




by Nader Ale Ebrahim

Unpublished papers, white papers, data sets, and teaching materials can all increase an author's visibility. Getting an author's documents (the full range of work produced by scholars and researchers) under control is a key driver for enhancing research visibility and impact. With document and data publishing tools, authors can put all of their key research outputs online, where they are immediately accessible to the researchers who need them. Previous studies have found that papers with publicly available data sets receive more citations than similar studies without available data. In addition, new research suggests that putting your research data online can make your work up to 30% more highly cited than if the data were kept hidden. In this workshop I will elaborate on the advantages of sharing research data and introduce some relevant "Research Tools" for document publishing.



Thursday, 27 April 2017

Traditional and altmetrics - Research Impact & Visibility - LibGuides at Utrecht University

 Source: http://libguides.library.uu.nl/researchimpact

Traditional and alternative metrics sites compared


Besides traditional citation counts, there are many other ways of tracking research impact. These measures try to capture presence in new scholarly venues, presence and impact in social media, and other forms of online engagement such as views, downloads, and bookmarks. Collectively, we refer to these as altmetrics, as opposed to traditional citation measurement using Web of Science, Scopus, and other citation-enhanced databases.




Tools compared: Journal Citation Reports, Scopus, Web of Science, Google Scholar, Google Scholar Citations, Microsoft Academic Search, Mendeley, ImpactStory, PLoS, Altmetric and Plum Analytics.

Criteria compared:
metrics for: papers, individuals, institutions, countries, journals
traditional metrics: citations
altmetrics: views/downloads, readers/bookmarks/tags, comments, news media, blogs, Facebook, Twitter
coverage: transparency, multidisciplinary
access: free access, registration necessary, paid service
advanced options: data download/management, data standardization/cleaning, normalization, API possibilities

Notes:
a. Only items/persons/users included in the system (depends on data collected/uploaded by the users)
b. Paid services: Mendeley Institutional Edition / Altmetric Institutional Edition / Altmetric Explorer
c. With restrictions/limitations
d. Article-level metrics (Mendeley, Altmetric) and author profiles (ImpactStory) free to view




based on: Users, narcissism and control: tracking the impact of scholarly publications in the 21st century (SURFfoundation, 2012); last adapted Oct 2014.



Metrics in Scopus


Author metrics

Open Scopus and search your name using the 'Search author'-tab. Select your name from the results list and scroll down to 'Research' to see various metrics like citations and h-index. You can click on 'View Author Evaluator' or 'View h-graph' to see a visual representation of author metrics.


Tip: In calculating metrics, Scopus only uses data going back to 1995.



Tip: Are there errors in your Scopus listing, e.g. missing documents or multiple author listings (because of spelling variants)? You can request corrections by clicking 'Request author detail corrections' at the top of the page with author information.
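To make these numbers concrete: an author has h-index h when h of their papers have each been cited at least h times. The following is a minimal sketch in Python of how the index is computed from a list of per-paper citation counts; the counts used here are invented for the example.

def h_index(citation_counts):
    # Largest h such that h papers have at least h citations each.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical per-paper citation counts, e.g. read off a Scopus author page.
papers = [25, 18, 12, 9, 7, 4, 4, 2, 1, 0]
print(h_index(papers))  # -> 5 (five papers have at least 5 citations each)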



Article metrics

Open Scopus and search an article or a subject using the 'Document search'-tab. In the results list, the number of citations the article has received is visible in the last column ('Cited by'). Click on this number to see a list of citations; on this page, there is also the option to click 'Analyze results' (top of page) to see a visual representation of article metrics.


Tip: In calculating metrics, Scopus only uses data going back to 1995.
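For batch work, the same 'Cited by' counts can be retrieved programmatically through Elsevier's Scopus Search API. The sketch below is illustrative only: it assumes you have obtained a free API key from the Elsevier Developer Portal, and the DOI is a placeholder.

import requests

API_KEY = "YOUR_ELSEVIER_API_KEY"        # assumption: key from dev.elsevier.com
doi = "10.1016/j.example.2012.01.001"    # placeholder DOI

resp = requests.get(
    "https://api.elsevier.com/content/search/scopus",
    params={"query": f"DOI({doi})", "field": "dc:title,citedby-count"},
    headers={"X-ELS-APIKey": API_KEY, "Accept": "application/json"},
)
resp.raise_for_status()
for entry in resp.json()["search-results"].get("entry", []):
    print(entry.get("dc:title"), "- cited by:", entry.get("citedby-count"))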



Institutional metrics

Open Scopus and search your affiliation using the 'Affiliation search'-tab. In the results list, click on your organization to get an overview of collaborating institutions, subject areas and journals in which your organization has published. To view citation information on all papers from your affiliation, click on the number of documents from your affiliation. In the following screen you can limit these to specific years. At the top of the results list, tick the checkbox to select all documents (use the dropdown menu to select all documents instead of only the current page) and then press 'View citation overview' to view citation information for these documents.
NB. Comprehensive institutional metrics are available from Elsevier's separate product SciVal.

Tip: If the number of documents is too large to show the citation information on screen, you can download the citation information as a .csv file. Citation information can be exported this way for a maximum of 20,000 documents.
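Once a citation overview has been exported as a .csv file, a few lines of Python can summarize it. This is only a sketch: the exact layout of Scopus exports varies (there may be metadata rows above the header), so the code simply sums every numeric column it finds, which in a citation overview corresponds to citations per year; the file name is a placeholder.

import pandas as pd

# Placeholder file name; adjust skiprows if the export starts with metadata rows.
df = pd.read_csv("CitationOverview.csv", skiprows=0)

print(len(df), "documents in the export")

# Sum each numeric column (typically one column per citation year).
print(df.select_dtypes("number").sum())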


Tip: Scopus might have separate affiliations listed for e.g.
research institutes or research schools within a university. You will
see these listed in the Affiliation search results. To include papers
from these separate affiliations, tick the checkboxes of all relevant
affiliations and choose 'Show documents' at the top of the results list.



Metrics in Web of Science


Author metrics Web of Science

Open Web of Science and search your name using the 'Author search' option. Enter your author name, and optionally proceed to select your research domain(s) and organization(s). In the results list, you can opt to view all results, or look at the 'Record sets' tab to distinguish between different authors with the same name and/or multiple entries for your own name (tick the boxes of the appropriate record sets and select 'View records').


You will now see a list with all your publications listed in Web of Science. Click 'Create citation report' (top right) to view author metrics (citations and h-index).


Tip: Web of Science uses ResearcherID to manage author
names/citations. If you have a ResearcherID, you can manually add papers
authored by you and correct any mistakes. More information on creating
ResearcherID is available in workshop 1: Researcher profiles.



Article metrics

Open Web of Science and search an article or a subject using the 'Search' or 'Cited reference search'-options. In the results list, the number of citations the article has received is visible underneath each article ('Times cited'). Click on this number to see a list of citations; on this page, there is also the option to click 'Create citation report' (top right) to see more detailed article metrics.


Tip: To view a visual representation of backwards and
forwards referencing of a given article ('cited in/cited by'), click on
the title of the article in the results list and choose 'Citation map' in the 'Cited References' box in the right sidebar.



Institutional metrics

Extensive institutional metrics are available through Thomson Reuters' separate product InCites, but some institutional metrics can be derived directly from Web of Science. Search the institution's name using Web of Science's 'Basic search' function, choosing 'Organization - enhanced' from the drop-down menu on the right. Alternatively, use the 'Select from index' option underneath the drop-down menu to search for the organization's name as used in Web of Science.


Searching for the organization results in a list of papers that have
the organization listed as affiliation in Web of Science. You can limit
the results to e.g. specific years using the options on the left
sidebar. Then click 'Create citation report' to see aggregated and detailed article metrics for these papers.


Tip: The Citation Report feature is not available from a
search containing more than 10,000 records. You can limit the number of
results by restricting results to specific years of publication or other
criteria.



Metrics in Google Scholar / Google Scholar Citations


Author metrics Google Scholar

Open Google Scholar and search your name or that of a colleague. If a (public) Google Scholar Citations profile exists, it will show up at the top of the results list. Click on the profile to see various metrics like citations, h-index and i10-index (the number of publications with at least 10 citations).


Tip: More information on creating a Google (Scholar) account and activating Google Scholar Citations is available in workshop 1: Researcher profiles.
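Google Scholar has no official API, but the same profile metrics can be read programmatically with the community-maintained 'scholarly' Python package (pip install scholarly). A minimal sketch, assuming the package's current interface; the author name is only an example, and heavy use may be rate-limited or blocked by Google.

from scholarly import scholarly  # third-party package, not affiliated with Google

# Take the first matching public Google Scholar Citations profile.
author = next(scholarly.search_author("Nader Ale Ebrahim"))
author = scholarly.fill(author, sections=["basics", "indices"])

print(author.get("name"))
print("citations:", author.get("citedby"))
print("h-index:  ", author.get("hindex"))
print("i10-index:", author.get("i10index"))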



Article metrics

Open Google Scholar and
search an article or subject. In the results list, the number of
citations the article has received is visible underneath each article ('Cited by'). Click on this number to see a list of all citations.


Tip: When you access Google Scholar through the website of Utrecht University Library, you will have full-text access to all articles from journals Utrecht University subscribes to (recognizable by 'Fulltext@UBULink').



Metrics in Microsoft Academic Search


Author metrics Microsoft Academic Search

Open Microsoft Academic Search
and search your name. A link to your profile will appear at the top of
the results list; alternatively, click on your name in one of the
publications listed to bring it up. In your profile, various metrics are
displayed, including citations, h-index, g-index (modified form of the
h-index based on average number of citations per article) and
information on co-authors.
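The g-index is commonly defined as the largest number g such that an author's g most-cited papers have together received at least g² citations (equivalently, the top g papers average at least g citations each). A small illustrative sketch in Python, using invented citation counts:

def g_index(citation_counts):
    # Largest g such that the g most-cited papers together have >= g*g citations.
    counts = sorted(citation_counts, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(counts, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

print(g_index([25, 18, 12, 9, 7, 4, 4, 2, 1, 0]))  # -> 9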


Tip: An interesting option in Microsoft Academic Search is the 'Co-author graph'
(available in the left sidebar of each author profile): an interactive
visual representation of connections between scientists based on
co-authorship.


Tip: You can edit information in your user profile by clicking the 'Edit'
button at the top right of your profile. A Microsoft Live ID is
required, and edits are pending approval/verification by Microsoft
Academic Search.



Article metrics

Open Microsoft Academic Search
and search an article or subject. In the results list, the number of
citations the article has received is visible following the title of
each publication ('Citations'). Click on this number to see a list of all citations.


 Tip: Microsoft Academic Search offers the option
to see the context of citations, that is, where in a document your
article is cited. To view this from the list of citations, click on 'Citation context' in the left sidebar.



Institutional metrics

Open Microsoft Academic Search
and search for your organization, using the Advanced Search option. If
the organization is recognized, Microsoft Academic Search will show the
organization's profile page, listing number of publications and
citations, top research areas and most cited authors.


Tip: It is also possible to compare two institutions using the 'Comparison' option at the top of the organization's profile page, and to view a visualization of domain trends, available in the left sidebar of the organization's profile page.



Metrics in Mendeley


Article metrics Mendeley

Open Mendeley
(login not required) and search an article or subject using the search
bar in the tab 'Papers'. For each article, the number of Mendeley users
that have added this paper to their Mendeley library ('readers')
is shown underneath the information about the article. When you click
on the article's title, more information on readership statistics can be
found in the right sidebar.
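Readership counts can also be retrieved through the Mendeley API. This is only a sketch under two assumptions: that you have registered an application at dev.mendeley.com and obtained an OAuth access token, and that the catalog endpoint's 'stats' view exposes a reader_count field; the DOI is a placeholder.

import requests

ACCESS_TOKEN = "YOUR_MENDELEY_OAUTH_TOKEN"  # assumption: token from dev.mendeley.com
doi = "10.1371/journal.pone.0000308"        # placeholder DOI

resp = requests.get(
    "https://api.mendeley.com/catalog",
    params={"doi": doi, "view": "stats"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
for doc in resp.json():
    print(doc.get("title"))
    print("Mendeley readers:", doc.get("reader_count"))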



Institutional metrics

Academic institutions can subscribe to Mendeley's Institutional Edition
which offers, among other features, information on research production
(papers from the institution present in Mendeley) and detailed
readership information (which papers Mendeley users from that
institution are reading/bookmarking).



Metrics in ImpactStory


Author metrics ImpactStory

Open ImpactStory and click 'Try it for free'. In the subsequent window, you are asked to create an account, which is free for the first 30 days. After creating an account, you can import the research output connected to your Google Scholar ID or ORCID. If needed, you can add articles, datasets, etc. by filling out the respective boxes.


(NB. Importing from Google Scholar does not seem to work in Internet Explorer.)


Instead of making your own impact report, you can also click on 'See an example profile'
at the bottom of the main website. You are then shown a sample page
containing links to articles, a dataset, slides and a webpage.


Each item in your collection will have information added on how often it is viewed, saved, cited, discussed, or recommended by scholars (blue boxes) and by the public (green boxes). To view details on these metrics (including their sources), click either on one of the blue/green boxes or on the title of the item. Each metric also carries a percentile range, measured against a reference set of all papers indexed in Web of Science in the same year.


Tip: More information on creating a Google
(Scholar) account and activating Google Scholar Citations, as well as on
creating an ORCID, is available in the LibGuide Researcher profiles.


Article metrics

To view article level metrics in ImpactStory you need to make an impact report as described above under Author metrics. It is not possible to search for individual articles on ImpactStory.


Tip: After the first 30 days, ImpactStory charges $60 a year to maintain your profile. It should be noted that ImpactStory is fully committed to remaining open, independent and non-commercial. More information on the subscription model can be found in the ImpactStory FAQ.



Metrics in PLOS One


Article metrics PLOS One

Open PLOS One (or any of the other PLOS journals) and search an article or subject. In the results list, underneath each article it is indicated whether the article has any views (both HTML views and downloads, in PLOS One and PubMed Central), citations (in Scopus, Web of Science, CrossRef, PubMed Central or Google Scholar), saves (in Mendeley or CiteULike) or shares/discussions (on Twitter, Facebook, blogs, or in the comments on PLOS One itself). Click on any of these categories to see the metrics in more detail.
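Some of these article-level metrics can also be queried programmatically. A minimal sketch, assuming PLOS's public Solr-based search API at api.plos.org and its counter_total_all field (total article views); the DOI is an example PLOS ONE article.

import requests

doi = "10.1371/journal.pone.0000308"  # example PLOS ONE DOI

resp = requests.get(
    "http://api.plos.org/search",
    params={"q": f'id:"{doi}"', "fl": "id,counter_total_all", "wt": "json"},
)
resp.raise_for_status()
for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"), "- total views:", doc.get("counter_total_all"))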



Metrics in Altmetric


Article metrics

Another commercial provider of altmetrics data is Altmetric. Its distinctive 'Altmetric donut', with data on coverage of articles in social media, news outlets and blogs, as well as Mendeley readers, is included in various databases such as Scopus.


As a demo, they have developed PLOS Impact Explorer,
a PLOS-mashup that shows altmetrics for PLOS papers that have recently
received coverage. It is not possible to search for specific papers
using this tool.


Altmetric also offers a free bookmarklet you can add to your browser, which gives altmetrics data for any DOI it detects on a webpage you are viewing.
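The same per-DOI data behind the bookmarklet is exposed through Altmetric's public REST API, which needs no key for basic, rate-limited use. A short sketch; the DOI is a placeholder, and the fields returned vary from paper to paper.

import requests

doi = "10.1371/journal.pone.0000308"  # placeholder DOI

resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
if resp.status_code == 404:
    print("No Altmetric data recorded for this DOI")
else:
    resp.raise_for_status()
    data = resp.json()
    print("Altmetric score: ", data.get("score"))
    print("Tweets:          ", data.get("cited_by_tweeters_count"))
    print("News stories:    ", data.get("cited_by_msm_count"))
    print("Mendeley readers:", data.get("readers", {}).get("mendeley"))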



Institutional metrics

Altmetric offers subscriptions to two analytical tools: Altmetric for Institutions, which allows you to see detailed metrics for papers at the institutional, departmental and author level, and Altmetric Explorer, which allows you to make a selection of papers from a specific journal, topic or PubMed search and to see and download Altmetric data for these papers. Free access to Altmetric is available for librarians and institutional repository managers.



Metrics in Plum Analytics


Plum Analytics is a commercial product owned by EBSCO. In addition to author metrics and article metrics, Plum Analytics offers aggregated institutional metrics to subscribing institutions. To see some examples and view sample profiles, go to plu.mx and click 'Groups' at the top of the page.





Increasing Impact of Scholarly Journal Articles

 Source: http://southernlibrarianship.icaap.org/content/v09n01/mullen_l01.html

Electronic Journal of Academic and Special Librarianship

v.9 no.1 (Spring 2008)



Increasing Impact of Scholarly Journal Articles: Practical Strategies Librarians Can Share

Laura Bowering Mullen, Behavioral Sciences Librarian

Library of Science and Medicine, Rutgers University, Piscataway, New Jersey, USA

lbmullen@rci.rutgers.edu

Abstract

Researchers are extremely interested in increasing the impact of
their individual scholarly work, and may turn to academic librarians
for advice and assistance. Academic librarians may find new roles as
consultants to authors in methods of self-archiving and citation
analysis.  Librarians can be proactive in this new role by
disseminating current information on all citation analysis tools and
metrics, as well as by offering strategies to increase Web visibility
of scholarship to interested faculty. Potential authors of journal
articles, especially those faculty seeking greater research impact,
such as those seeking promotion and tenure, will find practical
suggestions from librarians invaluable. Citation analysis tools
continue to improve in their coverage of social and behavioral science
fields, and emerging metrics allow more flexibility in demonstrating
impact of published journal articles.




Increasing Impact of Scholarly Journal Articles: Practical Strategies Librarians Can Share

Academic librarians are always seeking new ways to use their
expertise to assist faculty and students. Faculty and other researchers
are interested in learning practical tips to increase Web visibility
of their publications, thereby hoping to increase the impact of their
own scholarship by reaching more readers on the internet. The
traditional paradigms are changing, and librarians may be well
positioned for new roles in consulting with clients about methods of
increasing research impact of published articles. This type of reference
service may be especially valuable to faculty seeking promotion and
tenure, or to others wishing to take advantage of developments in open
access for personal gain. By keeping certain strategies in mind when
writing for publication, authors can realize greater impact of their
articles. Academic librarians can disseminate information about
strategies that authors can use when choosing publications, and
provide information on new methods of proving impact in different ways.


There have been many new developments with citation analysis of
late, and librarians need to be able to educate clientele about
emerging tools and metrics. Impressive new citation analysis tools
allow a researcher to package and demonstrate impact textually and
graphically. New metrics such as the “h-index,” and “eigenfactor” are
providing alternate ways of looking at the impact of citations,
authors, and individual journals.1
Librarians will need to be conversant in these and other emerging
metrics in order to remain relevant to discussions about citation
analysis, especially in STM areas.  New research guides and finding aids
should be made available from the library Website to assist faculty
and others in keeping up with the most current strategies about open
access, and then assisting them in quantitatively demonstrating the
increased impact that may result. There are some concerns about the
costs of providing all of the necessary citation analysis tools within
stretched library budgets. However, some tools are Web-based and free.
Some question whether it should be the province of the library to teach
classes in citation searching and analysis for purposes of promotion
and tenure, or whether it is appropriate for librarians to assist
faculty and other researchers in maximizing their impact through
self-archiving and other means.


By now, it has become fairly well accepted that open access
associated with greater Web visibility increases research impact. A
plethora of quantitative studies are available as part of a helpful
Webliography that librarians may share with researchers. This
Webliography, published by the “Open Citation Project” is updated
regularly, and is a one-stop shop for anyone looking to bolster the
argument that “open access increases research impact.”2
Librarians can offer advice to constituents on strategies to increase
visibility of their peer-reviewed journal articles. Subject specialist
librarians can prepare discipline-specific information on
self-archiving and matters of impact. This information can be
disseminated from the library via the Website, or through personal
consultation between librarian and researcher. Faculty and other
researchers may now be seeking this type of information, and the time
may now be opportune for reference and faculty liaison librarians to
get involved in proactively disseminating practical information. Much
information discussed previously on these topics has largely been
theoretical, or scattered in a variety of library publications and
Websites.


For more than a decade, many librarians and scientists have
persistently made the case that self-archiving is the open access
strategy that would prove most effective for the rapid and widespread
dissemination of peer-reviewed scholarly journal articles. Stevan
Harnad, first in his “subversive proposal” and still today, continues
to advocate for self-archiving of preprints and postprints in
repositories as a mechanism to increase Web visibility. This has often
been called the “green” road to open access.3 
This mechanism of increasing visibility is outside of the traditional
publishing system, and only requires authors to retain rights, and to
deposit their own work in a digital repository of their choice.
Librarians must understand the potential of self-archiving to transform
the scholarly communication system for many disciplines.


Peter Suber has also published many Weblists and articles for
librarians who would like to remain current with open access
initiatives and trends.4
Depending on the university, librarians might not only be expected to
lead the discussion on self-archiving, but also to assist researchers
with the actual process of depositing scholarly work in appropriate
digital repositories. Those working at libraries developing
institutional repositories will also take on the task of encouraging
faculty to participate in the population of the institutional
repository.


There are many other types of open access models. Open access
journals, “born digital” on the Web, also offer promise for authors
seeking impact. Open access journals are included in traditional
indexing and abstracting sources, and many have gained prestige in
their respective fields. As with any journal, authors should make sure
the open access journal is one of quality in the traditional sense.
Peer review status, stature of editors and reviewers, and other measures
of quality have transitioned well to this new publishing model.
Librarians may also be asked to help in choosing an open access
publication outlet for a researcher looking to submit peer-reviewed
scholarship to a journal that would be free to all on the Web. Also,
many traditional journals have liberalized policies and changed
business models to accommodate some aspects of open access. Some of the
largest commercial publishers may have liberal policies when it comes
to self-archiving of postprints.


However it is shared and promulgated, information on open access
journals, self-archiving, choosing between different models offered by
traditional journals, and the most current citation analysis methods 
must be discussed and offered to library clientele.  Who will be
responsible for continuous education of librarians in these areas, and
for making decisions about what services will be offered to various
groups?  Librarians may have broken ranks on some of these issues, not
wanting to be responsible for any negative outcome to researchers, or
not agreeing with some of the open access strategies currently being
trumpeted by library advocacy organizations.


Many have heard of open access, but do not know how to apply the
principles and reap the benefits in a strictly practical sense. Open
access is a ubiquitous topic in the library world at the moment, and is
well-established in some STM disciplines. Those in humanities and some
social sciences areas, which have been slower to adopt changes in
scholarly communications, may be more apt to need background
information on the movement. Many are not sure how open access will
affect them. However, information on any strategy for increasing impact
through greater Web visibility will be welcomed by researchers.  This
is information that faculty members and other research clientele of
academic libraries will undoubtedly find compelling and useful.
Librarians may want to share the following strategies with all library
users in person, from the desk, or through the library Website. The
following is an example of a list that academic librarians may want to
disseminate widely. This type of list is targeted not to librarians, but
to faculty and researchers they work with.


What practical steps can authors take to increase the impact of scholarly journal articles?


  • Self-archive/deposit publications (preprints and/or postprints)
    in disciplinary archives. These subject- based repositories allow
    researchers to archive electronic documents through a simple deposition
    process. Examples of disciplinary repositories are:
    CogPrints (cognitive science and psychology), arXiv (physics), and E-LIS
    and dLIST (librarianship). These subject repositories are crawled by
    search engines, and many readers using services such as Google, Google
    Scholar, or OAIster readily find and cite these full-text open access
    materials. Many more readers will see articles than if they are only
    available in traditional journals. Articles may appear in traditional
    journals as publisher PDFs while also appearing in other versions
    (postprints such as final Word document copies) in subject-based
    disciplinary repositories. Subject archives do not guarantee
    sustainability or preservation of publications.  Self-archiving is
    effective for current Web dissemination of work to all potential
    readers. It is up to authors to make sure that signed copyright
    transfer agreements (CTAs) allow self-archiving of scholarly
    peer-reviewed work. Self-archiving in repositories crawled by search
    engines really gets an article out on the Web for all to find and read.
  • To see what a publisher allows in terms of self-archiving, check the publisher or journal name on the SHERPA/RoMEO Website.

    http://www.sherpa.ac.uk/romeo.php

    This Website describes the kind of archiving the publisher allows; for
    anything beyond what's presented, researchers may need to email the
    publisher or editor. Many journals do not make their copyright transfer
    agreements publicly available. Many only mention permission to
    self-archive on personal Web pages, or in institutional repositories,
    not mentioning subject archives. Researchers may have to seek
    permission to self-archive in disciplinary/subject repositories. (A
    short programmatic lookup sketch appears after this list.)
    • If signing a restrictive copyright transfer agreement, authors may need to get a
      copy of SPARC's “Author's Addendum” to retain more personal rights to
      self-archive. There are other examples of added language from many
      universities that can be found on the Web. These statements may serve
      to extend author rights. Authors must be aware of the importance of
      retaining rights to use of their own work, rather than just signing
      their copyright away to publishers.
    • Deposit all work in an institutional repository.  A repository
      will preserve scholarly output, and pulls together all of an author's
      interdisciplinary work in one location. Permanent digital
      preservation/archiving of an author's work, especially if it has not
      been published in print is very important. The institution's repository
      offers this security, as well as a convenient place to direct others
      to find the entire corpus of an author's work. Personal Web pages may
      be subject to a lack of quality control. Some repositories are crawled
      by Google, aiding discovery by many Web searchers outside the
      institution. Institutional repositories have many other benefits to all
      researchers in the academy.  The visibility of interdisciplinary
      research initiatives in progress or completed, the discovery of
      potential collaborators across the institution, the ability to archive
      datasets, the total research production of an institution displayed in
      one place, and the possibility of integration with courseware are just a
      few of the many benefits. A few libraries mandate deposition of
      faculty work in the institutional repository, but for most,
      participation is voluntary.
    • Make sure when submitting work to traditional commercial or
      society journal publishers that they are participating with Google
      Scholar so Google can crawl the content. Most publishers are now
      “partners” with Google Scholar but some are still only participating in
      a limited way, or not at all. If an author's work cannot be found in a
      search of Google Scholar, it is best to contact any publishers that
      are non-partners and ask them to participate with Google Scholar.  You
      will want your publications to appear in Google Scholar with all of
      their versions, both free and subscription. Many libraries link their
      subscribed collections with Google Scholar for enhanced access, drawing
      more readership to an author's work. Those articles appearing in
      Google Scholar will then benefit from the citation analysis that
      results.
    • Seek to publish work in peer-reviewed open access journals.
      Articles published in these scholarly online journals will go quickly
      to the Web to be found by searchers.  Open access journals are free to
      readers, and most are free to authors, so there are no subscription
      barriers. Don't dismiss “author pays” models if research funding is
      available. Make sure open access journals, those “born digital,” have
      high level editorial boards, and prestige in the field.  Authors
      should make sure that the open access journal, as well as any other
      journal of interest is included in as many subject and citation indexes
      as possible to ensure discovery by more searchers of library
      subscription databases. Open access journals are subject to the same
      coverage criteria as any other print or electronic journal when
      applying for coverage by the subject and aggregator indexes. If
      publishing in any journal, make sure that journal is indexed in all
      appropriate subject indexes and databases. Searchers of subject indexes
      will discover these articles, and consider them vetted for scholarly
      value.
    • If an author plans to publish work in a traditional, high impact
      journal, it helps to know that some make their older issues open access
      free on the Web after a short “embargo” period. In this case, the
      journal is not open access per se, but all older issues get wide
      circulation on the Web. An example is “College & Research
      Libraries,” which is free on the Web after a six month embargo period. 
      Many of the publishers of these journals allow self-archiving of
      postprints during the embargo period. Even those that do not convert
      their journals to open access still may allow self-archiving of
      postprints. Elsevier is an example of a commercial publisher that allows
      self-archiving of postprints. Authors should ensure that the business
      models of the publications they submit to will eventually allow some
      version of the article(s) to be discoverable via the open Web.
    • Make sure any journal that you publish in has an electronic
      version. If you find one that doesn't, ask if they are considering
      adding this format, and let them know that it matters to you. Journals
      available only in paper format that don't allow self-archiving are seen
      by limited readers in the age of the internet. Articles in some books
      may suffer from this same lack of Web visibility.
    • Authors should send out information about their articles on
      listservs, personal Websites, blogs, and other online communication
      channels, to increase downloads. In the future, number of downloads may
      also have some significance as far as impact. Some publishers
      advertise their “most downloaded articles.” These articles are featured
      on publishers' Websites, and then downloaded more.
    •  As far as increasing impact, it is advantageous if journals,
      open access or not, are indexed by Web of Science and Scopus.  If the
      journal isn't part of Web of Science, it is less likely to be
      considered “prestigious” by some faculty bodies.  If it is not included
      in Web of Science, it will not have a published “impact factor.”
      Journals with high impact factors are cited more often, and considered
      more prestigious. This is especially true for STM areas, less likely
      for some other disciplines, especially in the humanities. For authors
      trying to demonstrate impact, journals covered by these indexes would
      be important.
    • Follow citation impact in Google Scholar, Web of Science, and
      Scopus to get a more comprehensive picture. Web of Science, although
      the traditional “gold standard” of citation analysis, is especially
      lacking in humanities areas. Scopus has much greater coverage of titles
      in both sciences and social sciences, and some additional features for
      citation analysis. Google Scholar uses an automatic algorithm, and
      therefore returns some interesting results. Still, Google Scholar will
      uncover citations from a different set of materials, and will provide
      some indication of impact for authors. When using Google Scholar, and
      the new metrics based on it, such as Harzing’s “Publish or Perish,”5
      researchers need to be reminded that librarians are not sure what
      publications are being covered, and the algorithm used to do the
      analysis still remains somewhat unknown. The other citation indexes
      publish coverage lists, and are clear about their algorithms.  Still,
      Google Scholar should not be discounted for citation analysis due to
      its heavy use in academia.
    • Consider putting non-refereed materials in repositories also. To
      deposit, material doesn't have to be peer-reviewed. Preprints are
      allowed in many repositories because the material hasn't been
      “published previously.” Preprints can give scholarly articles Web
      visibility prior to certification in a peer-reviewed journal. This
      practice can vary by discipline, and some publishers may not accept
      articles that have been made available on the Web before submission.
      Some fields such as high energy physics have been using a preprint
      model for quite some time. Other fields do not have such a preprint
      culture.
    • Sharing research data has been shown to increase citation impact.
      Depositing supplementary data in a repository, or publishing it
      alongside an article in an open access journal has been shown to gather
      more citations to the accompanying article. One recent study of cancer
      clinical trials shows that sharing data may increase impact in some
      fields by as much as 70%.6
    • Authors may use a combination of many of the above; there is no
      limit as to where a particular work may be self-archived. Rather than
      the traditional practice of simply signing away copyright to a
      scholarly publisher, a copy can also be deposited in the institutional
      repository, archived in subject/disciplinary archives, and on personal
      Webpages if the publisher allows. This deposited article is usually the
      postprint (often in the form of a final Word document), or a preprint
      (often already accepted by a publisher). Branded publisher PDFs, in the
      case of commercial or society journals, usually face restrictions as
      far as archiving.
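As noted in the SHERPA/RoMEO item above, publisher self-archiving policies can also be checked programmatically. This is only a sketch, assuming the SHERPA/RoMEO v2 JSON API at v2.sherpa.ac.uk, which requires a free API key; the journal title is an example, and the nested response is printed rather than fully parsed.

import json
import requests

API_KEY = "YOUR_SHERPA_API_KEY"                  # assumption: free key from v2.sherpa.ac.uk
journal_title = "College & Research Libraries"   # example journal

resp = requests.get(
    "https://v2.sherpa.ac.uk/cgi/retrieve",
    params={
        "item-type": "publication",
        "format": "Json",
        "api-key": API_KEY,
        "filter": json.dumps([["title", "equals", journal_title]]),
    },
)
resp.raise_for_status()
items = resp.json().get("items", [])
print(len(items), "matching publication record(s)")
# Inspect the publisher_policy blocks (permitted versions, embargoes, locations)
# before deciding which fields to extract for your own report.
print(json.dumps(items, indent=2)[:2000])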
The main point is for academic librarians to offer faculty authors and other researchers some proven strategies to get their peer-reviewed articles seen by more people on the Web. This will potentially raise the profile and impact of published work. This impact can then be quantitatively demonstrated with both traditional and new citation analysis tools. Librarians can compile lists of tips and strategies to assist authors and researchers in these areas. These lists can be published as Web guides, or shared with faculty and researchers in other ubiquitous ways. Appropriate places for this information would be the “faculty services” area of the library Website, the scholarly communications committee Website, in brochures distributed at desks, and as part of research consultations and fora with faculty and other researchers. Librarian expertise in these areas will have great value to researchers in the academy, and enhance the suite of services that the library can provide in a new and changing research environment. Librarians must prepare for, and welcome the conversation.



Notes

1. Hirsch, J.E. "An Index to Quantify an Individual's Scientific Research Output." Proceedings of the National Academy of Sciences, USA 102, no. 46 (2005): 16569-72; West, Jevin, Ben Althouse, Carl Bergstrom, and Ted Bergstrom. "Eigenfactor.org: Ranking and Mapping Scientific Knowledge." http://www.eigenfactor.org/.

2. "The Open Citation Project - Reference Linking and Citation Analysis for Open Archives." http://opcit.eprints.org/oacitation-biblio.html.

3. Okerson, Ann Shumelda, and James J. O'Donnell, eds. Scholarly Journals at the Crossroads: A Subversive Proposal for Electronic Publishing. Washington, D.C.: Association of Research Libraries, Office of Scientific and Academic Publishing, 1995; Harnad, Stevan. "Fast-Forward on the Green Road to Open Access: The Case against Mixing up Green and Gold." Ariadne 42 (2005).

4. Suber, Peter. "Lists Related to the Open Access Movement." http://www.earlham.edu/~peters/fos/lists.htm#incomplete; Suber, Peter. "Open Access Overview: Focusing on Open Access to Peer-Reviewed Research Articles and Their Preprints." http://www.earlham.edu/~peters/fos/overview.htm.

5. Harzing.com: Research in International and Cross-cultural Management. "Publish or Perish." http://www.harzing.com/index.htm.

6. Piwowar, Heather A., Roger S. Day, and Douglas B. Fridsma. "Sharing Detailed Research Data is Associated with Increased Citation Rate." PLoS ONE 2, no. 3 (2007): e308.






Spring 2016 Teaching Tips | University of West Florida

 Source: https://secure.uwf.edu/offices/cutla/teaching-tips/spring-2016-teaching-tips/increase-the-visibility-and-impact-of-your-scholarly-work-using-orcid-and-researchid.html

Increase the visibility and impact of your scholarly work using ORCID and ResearcherID

March 29, 2016





When faculty attempt to document the impact of their work, they must
be able to clearly identify citations for their work and separate these
from citations of work by authors with similar names. If you have ever
run a Google search on your name and found a collection of hits that
include your work and the work of several other people, you are well
aware of the problem created when many scholars have similar names, when
a scholar publishes with various forms of his/her name (e.g., with and
without middle initial), or when a scholar’s last name changes
mid-career (e.g., adopting a new last name or creating a hyphenated name
with a marriage or for other legal reasons).


Scholars now have two options for unambiguously claiming ownership of their work: the Open Researcher and Contributor ID (ORCID, http://orcid.org) and ResearcherID (http://www.researcherid.com/).


ORCID


ORCID is an open, non-profit, and worldwide community that includes
individual researchers, universities, national laboratories, commercial
research organizations, research funders, publishers, national science
agencies, data repositories, and international professional societies.
Registration is independent of membership, which means researchers may
use the identifier throughout their career, irrespective of changes in
discipline, location, name, or affiliation. ORCID provides a persistent
digital identifier that distinguishes you from every other researcher
and ensures that your work is recognized.


The ORCID registry is free. The unique identifier unambiguously
identifies the work of specific researchers. Researchers use their ORCID
identifier to update, maintain, and share research objects (data sets,
articles, media stories, patents, etc.) with collaborators and to
clearly distinguish their research activities from the work of others.
Researchers may use their identifier when they submit a paper or a
dataset to authorize CrossRef or DataCite to formalize the ORCID
identifier-DOI connection when the work is published and to update their
ORCID record. Citations for publications can be imported from many
sources, including Google Scholar. ORCID can be linked to Scopus' Author ID or Thomson Reuters' ResearcherID, and to the NLM SciENcv tool used to create NIH and NSF Biosketches.
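Both identifier systems expose public APIs, so the link between an ORCID record and DOI-registered works can be explored programmatically. A minimal sketch, assuming the ORCID public API (pub.orcid.org) and the CrossRef REST API; the ORCID iD is a placeholder, and only works that carry a DOI are looked up.

import requests

orcid_id = "0000-0000-0000-0000"  # placeholder ORCID iD

works = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/works",
    headers={"Accept": "application/json"},
)
works.raise_for_status()

for group in works.json().get("group", []):
    summary = group["work-summary"][0]
    title = summary["title"]["title"]["value"]
    ext_ids = (summary.get("external-ids") or {}).get("external-id", [])
    doi = next((e["external-id-value"] for e in ext_ids
                if e["external-id-type"].lower() == "doi"), None)
    if doi is None:
        continue
    # CrossRef records how often other CrossRef-registered works cite this DOI.
    cr = requests.get(f"https://api.crossref.org/works/{doi}")
    if cr.ok:
        count = cr.json()["message"].get("is-referenced-by-count")
        print(f"{title} (doi:{doi}) - cited {count} times in CrossRef")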


ResearcherID


ResearcherID offers a free virtual space to manage and share your
professional information. Each member is assigned a unique identifier to
enable researchers to manage their publication lists, track their times
cited counts and h-index, identify potential collaborators, and avoid
author misidentification. By assigning a unique identifier to each
author who participates, ResearcherID standardizes and clarifies author
names and citations and makes your information search more
straightforward and accessible. It also helps to identify any changes in
institutional affiliations during your career. In addition,
ResearcherID information fully integrates with the Web of Science and is
ORCID compliant, allowing you to increase the visibility of your publications from a single account.


Faculty who register with ORCID and ResearcherID will have an easier
task when they attempt to document the impact of their work. They can
gather information about how often their work has been cited without
having to scrub the names and citations for researchers with similar
names from preliminary citation searches.


Resources


ORCID web site. http://orcid.org/


ResearcherID web site: http://www.researcherid.com





Thanks to Bob Dugan, Dean of Libraries, University of West Florida, for this teaching tip.




Wednesday, 26 April 2017

Impactstory: Nader Ale Ebrahim




100 most recent publications (excerpt):

2014 - International Education Studies
2014 - University of Malaya Research Bulletin
2013 - Research Tools in Education Series
2012 - Mechanical and Aerospace Engineering, Pts 1-7
2013 - Research Tools in Education Series
2012 - Proceedings of the Asia Pacific Industrial Engineering & Management Systems Conference 2012
2013








