Thursday, 23 February 2017

Recent progress on emergy research: A bibliometric analysis

 Source: https://www.scopus.com/

Renewable and Sustainable Energy Reviews, Volume 73, 1 June 2017, Pages 1051-1060

Recent progress on emergy research: A bibliometric analysis  (Review)


Affiliations:
  • School of Environmental Science and Engineering, Shanghai Jiao Tong University, Shanghai, China
  • Key Lab on Pollution Ecology and Environmental Engineering, Institute of Applied Ecology, Chinese Academy of Sciences, Shenyang, China
  • Department of Environmental Engineering Sciences, University of Florida, Box 116350, Gainesville, FL, United States

Abstract

Emergy-related studies have been conducted worldwide to evaluate the total environmental support and sustainability of a system from both the natural and the economic side. Aiming to depict the characteristics of the emergy-related literature, recognize global research foci, and forecast future research directions, this study performed a complete review of the related research progress using a bibliometric analysis approach. The h-index is applied to evaluate the influence of the most productive journals, countries/territories, and institutions in emergy-related fields. Social network analysis is also performed to evaluate the interaction among different countries/territories and institutions. A holistic picture of the primary performance of the emergy-related literature published from 1999 to 2014 is presented. Co-word analysis reveals that emergy-based sustainability research and the integration of emergy synthesis with other methods (especially life cycle assessment) will be future research directions in emergy-related fields. The results of this study can provide valuable information for researchers to better identify future hotspots in emergy-related fields. © 2017 Elsevier Ltd

Author keywords

Bibliometric analysis; Co-word analysis; Emergy; H-index; Social network analysis; Web of Science


ISSN: 1364-0321

CODEN: RSERF
Source Type: Journal
Original language: English


DOI: 10.1016/j.rser.2017.02.041
Document Type: Review
Publisher: Elsevier Ltd



Wednesday, 22 February 2017

Qualitative and quantitative analysis of solar hydrogen generation literature from 2001 to 2014 | SpringerLink

 Source: http://link.springer.com/article/10.1007/s11192-015-1730-3

Scientometrics, Volume 105, Issue 2, pp 759–771

Qualitative and quantitative analysis of solar hydrogen generation literature from 2001 to 2014

  • Mohammad Reza Maghami (1)
  • Shahin Navabi Asl (2)
  • Mohammad Esmaeil Rezadad (3)
  • Nader Ale Ebrahim (4)
  • Chandima Gomes (1)

  1. Department of Electrical and Electronic Engineering, Faculty of Engineering, Universiti Putra Malaysia, Serdang, Malaysia
  2. Department of Electrical and Electronic Engineering, Faculty of Engineering, Islamic Azad University, Damghan Branch, Damghan, Iran
  3. Department of Mechanical Engineering, Faculty of Engineering, University of Malaya (UM), Kuala Lumpur, Malaysia
  4. Research Support Unit, Centre of Research Services, Institute of Research Management and Monitoring (IPPP), University of Malaya (UM), Kuala Lumpur, Malaysia
Open Access Article
DOI: 10.1007/s11192-015-1730-3

Cite this article as:
Maghami, M.R., Navabi Asl, S., Rezadad, M.E., et al. Scientometrics (2015) 105: 759. doi:10.1007/s11192-015-1730-3

Abstract

Solar hydrogen generation is one of the new topics in the field of renewable energy. Recently, research on hydrogen generation has grown dramatically in many countries, and many studies have examined hydrogen generation from natural resources such as wind, solar, and coal. In this work we evaluated the global scientific production of solar hydrogen generation papers from 2001 to 2014 in all journals across all subject categories of the Science Citation Index compiled by the Institute for Scientific Information (ISI), Philadelphia, USA. "Solar hydrogen generation" was used as the keyword to search titles, abstracts, and keywords. The analysis of published output showed that research on hydrogen generation from the sun steadily increased over the past 14 years, and the annual paper production in 2013 was about three times that of 2010. The number of papers considered in this research is 141, published from 2001 to the date of the study. There are clear distinctions among the author keywords used in publications from the five most prolific countries (the USA, China, Australia, Germany, and India) in solar hydrogen studies. Quantitative and qualitative analysis methods were used to evaluate the development of global scientific production in this research field. The analytical results provide several key findings and an overview of hydrogen production from solar hydrogen generation.


Research Impact Measurement

by Nader Ale Ebrahim
Did you know that over 43% of ISI papers have never received any citations (nature.com/top100, 2014)? Publishing a high-quality paper in a scientific journal is only halfway towards receiving citations in the future. The rest of the journey depends on disseminating the publication through proper use of the "Research Tools". Proper tools allow researchers to increase the research impact of, and citations to, their publications. This workshop series will provide you with various techniques for increasing the visibility, and hence the impact, of your research work.
 

Tuesday, 21 February 2017

Enrich Research Visibility and Impact by Citation Tracking



by Nader Ale Ebrahim
Citation tracking is used to discover how many times a particular article has been cited by other articles. Citation counts are not perfect; they are influenced by a number of factors. Review articles are sometimes cited more often than their quality would warrant, and poor-quality papers can be cited while being criticized or refuted. In this workshop, I will explain the advantages of "Citation Tracking" and introduce some "Research Tools" for improving research visibility, impact, and citations through citation tracking.

Citation Alerts - Who's Citing Me? Measuring Your Research Impact - Research Guides at University of Kansas Medical Center

 Source: http://guides.library.kumc.edu/c.php?g=451739&p=3084603

Who's Citing Me? Measuring Your Research Impact

A guide to bibliometrics, altmetrics, and citation/journal analysis.

Why Track WHO cited me?


Tracking your publication citations is not just about numbers; it's about WHO is citing your work. Benefits of tracking who has cited your publications include:



  • Learn which researchers or institutions are following your work
  • Identify possible collaborators
  • Identify similar research projects
  • Confirm that research findings were properly attributed and credited
  • Determine if research findings were duplicated, confirmed, corrected, improved or repudiated
  • Determine if research findings were extended (different human populations or animal models/species), etc.
  • Quantify return on research investment
  • Justify future requests for funding
  • Tenure/Promotion

Tracking via Citation Alerts

Use the citation alerts function in databases to be notified when someone cites your work. 
This allows you to follow who is citing you and when you have been
cited.  Alerts can be created for authors or specific articles and can
be sent via email or RSS feed on a specified frequency (daily, weekly,
monthly). 
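For readers comfortable with a little scripting, a rough do-it-yourself complement to database alerts is to poll a citation count yourself. The minimal sketch below uses Crossref's public REST API and its is-referenced-by-count field; it is only an illustration of the idea, not the Web of Science or Scopus alert feature described above, and the DOI and file name are placeholders.

import json
import urllib.request

DOI = "10.1007/s11192-015-1730-3"   # placeholder: substitute one of your own articles
STATE_FILE = "citation_count.json"  # where the last-seen count is stored

def current_count(doi):
    # Crossref's works endpoint reports 'is-referenced-by-count' for a DOI.
    with urllib.request.urlopen("https://api.crossref.org/works/" + doi) as resp:
        record = json.load(resp)
    return record["message"].get("is-referenced-by-count", 0)

def check_for_new_citations():
    try:
        with open(STATE_FILE) as fh:
            last_seen = json.load(fh).get(DOI, 0)
    except FileNotFoundError:
        last_seen = 0
    now = current_count(DOI)
    if now > last_seen:
        print("%s: %d new citation(s), %d in total" % (DOI, now - last_seen, now))
    with open(STATE_FILE, "w") as fh:
        json.dump({DOI: now}, fh)

if __name__ == "__main__":
    check_for_new_citations()  # run periodically, e.g. from cron

Unlike a database alert, this only reports a count, not who cited you, so it is best treated as a lightweight supplement.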








Web of Science Citation Alerts



You can create an alert for an author or a specific article:



Link to help for creating alerts or view the Web of Science tutorial.







Google & Google Scholar Alerts







PubMed Commons Comments



PubMed Commons
enables authors to share opinions and information about scientific
publications indexed in PubMed.  As an author of an indexed publication,
you can create an alert to be notified when someone posts a comment to
one of your articles. Create a search for yourself as the author and
articles that have comments, as in the example below:




Example: Olivero M [author] AND has_user_comments [filter]
Then create an alert for this search.  View the brief PubMed Tutorial for details on creating an alert.
See the PubMed Commons Guide for more examples of searching for comments in PubMed Commons.
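The same example search can also be run programmatically through NCBI's E-utilities, which may be convenient if you want to re-check for new comments on a schedule. The sketch below is an illustration under assumptions: it passes the query string given above to the esearch endpoint, and whether the has_user_comments filter returns results depends on PubMed itself.

import json
import urllib.parse
import urllib.request

QUERY = "Olivero M [author] AND has_user_comments [filter]"  # the example above
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = urllib.parse.urlencode({"db": "pubmed", "term": QUERY, "retmode": "json"})
with urllib.request.urlopen(ESEARCH + "?" + params) as resp:
    result = json.load(resp)["esearchresult"]

print(result["count"] + " commented article(s) found")
for pmid in result.get("idlist", []):
    print("https://www.ncbi.nlm.nih.gov/pubmed/" + pmid)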











Track Altmetrics



Use the free Altmetric bookmarklet to track other forms of metrics (non-citations) for your published journal articles. Drag the bookmarklet to your browser's bookmarks bar and use it on any journal article page to learn of any social media activity for the selected article.
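If you prefer a script to the bookmarklet, Altmetric also exposes a free public details endpoint keyed by DOI. The sketch below is a hedged illustration: the endpoint URL and the response fields (the score and the cited_by_*_count counters) are assumptions based on Altmetric's public API and should be checked against the current documentation, and the DOI is just a placeholder.

import json
import urllib.error
import urllib.request

def altmetric_summary(doi):
    # Altmetric's public details endpoint; a 404 means no attention data exists yet.
    url = "https://api.altmetric.com/v1/doi/" + doi
    try:
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            print("No Altmetric attention recorded for " + doi)
            return
        raise
    print("Altmetric score for %s: %s" % (doi, data.get("score")))
    # Any 'cited_by_*_count' field is a per-source mention count (tweets, Facebook walls, ...).
    for key in sorted(data):
        if key.startswith("cited_by_") and key.endswith("_count"):
            print("  %s: %s" % (key, data[key]))

altmetric_summary("10.1007/s11192-016-2086-z")  # placeholder DOI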









Publications Authored by Dr. Nader Ale Ebrahim | PubFacts.com

Biography

Nader Ale Ebrahim is currently working as a visiting research fellow with the Research Support Unit, Centre for Research Services, Institute of Research Management and Monitoring (IPPP), University of Malaya. Nader holds a PhD degree in Technology Management from the Faculty of Engineering, University of Malaya. He has over 23 years of experience in the field of technology management and new product development in different companies. His current research interests include e-skills, research tools, bibliometrics, and managing virtual R&D teams for new product development.

Nader developed a new method using the "Research Tools", which helps researchers reduce search time by expanding their knowledge of the tools available on the Internet. Research Tools consists of a hierarchical set of nodes. It has four main nodes: (1) Searching the literature, (2) Writing a paper, (3) Targeting suitable journals, and (4) Enhancing visibility and impact.

He was the winner of the Refer-a-Colleague Competition and has received prizes from renowned establishments such as Thomson Reuters. Nader is well known as the creator of the "Research Tools" Box and the developer of "Publication Marketing Tools". He has so far conducted over 260 workshops within and outside of the University of Malaya, and his more than 100 papers/articles have been published and presented in various journals and conferences.

Primary Affiliation: Research Support Unit, Centre for Research Services, Institute of Research Management and Monitoring (IPPP) - Kuala Lumpur, Wilayah Persekutuan, Malaysia
Specialties: Technology Management, Research Tools
Research Interests: Virtual R&D Teams, Bibliometrics, Research Visibility



Sunday, 19 February 2017

Nader Ale Ebrahim (@aleebrahim) | Twitter



Literature Review from Search to Publication, Part 2: Finding proper articles

Source: https://doi.org/10.6084/m9.figshare.4668241.v1

Literature Review from Search to Publication, Part 2: Finding proper articles

by Nader Ale Ebrahim
"Research Tools" can be defined as vehicles that broadly facilitate research and related activities. "Research Tools" enable researchers to collect, organize, analyze, visualize, and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. They are assembled as an interactive web-based mind map, titled "Research Tools", which is updated periodically. "Research Tools" consists of a hierarchical set of nodes. It has four main nodes: (1) Searching the literature, (2) Writing a paper, (3) Targeting suitable journals, and (4) Enhancing visibility and impact of the research. This workshop continues the previous one, and some further tools from part 1 (Searching the literature) will be described. The e-skills learned from the workshop are useful across various research disciplines and research institutions.



Literature Review from Search to Publication, Part 1: Systematic Review

 Source: https://doi.org/10.6084/m9.figshare.4668232.v1

Literature Review from Search to Publication, Part 1: Systematic Review

by Nader Ale Ebrahim
"Research Tools" can be defined as vehicles that broadly facilitate research and related activities. "Research Tools" enable researchers to collect, organize, analyze, visualize, and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. They are assembled as an interactive web-based mind map, titled "Research Tools", which is updated periodically. "Research Tools" consists of a hierarchical set of nodes. It has four main nodes: (1) Searching the literature, (2) Writing a paper, (3) Targeting suitable journals, and (4) Enhancing visibility and impact of the research. In this workshop, some tools from part 1 (Searching the literature) will be described as examples. The e-skills learned from the workshop are useful across various research disciplines and research institutions.



Using citation analysis to measure research impact | Editage Insights

Source: http://www.editage.com/insights/using-citation-analysis-to-measure-research-impact

Using citation analysis to measure research impact

Measuring research impact

The landscape of science and research is rapidly evolving. Gone are the days when all members of a university department would celebrate the successful publication of a colleague's paper [1]. Earlier, scientists would simply consider the number of papers they had published as a measure of their academic standing. Today, the focus is increasingly shifting from whether a researcher has published a paper to where he/she has published it and the impact that piece of research has on the scientific community and the world at large [2].

How
can you measure the quality of a research paper? More importantly, how
can you determine whether your research is making an impact and is
considered important? An objective way is through citation analysis. 



Citation analysis

Why count citations in the first place? The list of references directing readers to prior relevant research is considered a fundamental part of any research paper [3]. A reference or citation is a form of acknowledgment that one research paper gives to another. Research is additive: scientists build on past work to discover new knowledge. To identify gaps in existing research and choose a research topic, researchers read the relevant published research and use this existing material as a foundation for arguments made in their own research papers.


11 reasons to cite previous work

  1. To direct readers to an authentic source of relevant information
  2. To help other researchers trace the genealogy of your ideas
  3. To acknowledge pioneers and peers
  4. To direct readers to previously used methods and equipment
  5. To criticize or correct previous work
  6. To substantiate your claims and arguments with evidence
  7. To show that you have considered various opinions in framing your arguments
  8. To highlight the originality of your work in the context of previous work
  9. To guide other researchers in their work
  10. To build your credibility as an author
  11. Finally, because not citing sources can amount to plagiarism [4]
What are the various citation-based metrics?

Citation analyses can be grouped according to some broad types based on who/what is being evaluated.

  1. Ranking journals:
    Journals are ranked by counting the number of times their papers are
    cited in other journals. Journal-level metrics are generally meant to
    serve as an indicator of journal prestige. The most well known of these
    is the journal impact factor, from Journal Citation Reports® (a
    product of Thomson Reuters). The journal impact factor is calculated as
    the average number of citations all articles in a journal receive over a
    specific period of time [5] (see the sketch after this list).
  2. Ranking researchers:
    Various citation metrics are now used for this purpose. Researchers are
    ranked by counting the number of times their individual papers are
    cited in other published studies. These metrics are also used to
    evaluate researchers for hiring, tenure, and grant decisions. A
    researcher-level metric that is gaining popularity is the h-index [6],
    which is calculated by considering a combination of the number of papers
    published by a researcher and the number of citations these papers have
    received (also illustrated in the sketch after this list).
  3. Ranking articles:
    Article-level citation counts may provide an accurate evaluation of the
    quality and impact of a specific piece of work, regardless of the
    author. Unfortunately though, such metrics are rarely considered because
    obtaining these data is tedious and time-consuming [7].
  4. Ranking universities and countries:
    There are databases that rank universities and countries by considering
    their overall research output through criteria such as citable
    documents, citations per document, and total citations. These metrics
    help determine which universities and countries have the most and/or
    best scientific output. For example, Scimago Research Group (
    http://www.scimago.es/ ) releases annual reports of institution- and country-wise rankings.
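As a concrete illustration of the first two metrics above, the short sketch below computes a rough two-year journal impact factor and an author h-index from invented toy numbers. It is a didactic example only, not a reproduction of how Journal Citation Reports or any database actually calculates these figures.

def impact_factor(citations_to_prior_two_years, items_in_prior_two_years):
    # Citations received this year to items published in the previous two years,
    # divided by the number of citable items published in those two years.
    return citations_to_prior_two_years / float(items_in_prior_two_years)

def h_index(citation_counts):
    # Largest h such that the author has h papers with at least h citations each.
    counts = sorted(citation_counts, reverse=True)
    return sum(1 for rank, cites in enumerate(counts, start=1) if cites >= rank)

# Toy numbers: 420 citations in 2016 to the 150 articles published in 2014-2015.
print(impact_factor(420, 150))          # -> 2.8
# Toy numbers: an author whose six papers were cited 10, 8, 5, 4, 3 and 0 times.
print(h_index([10, 8, 5, 4, 3, 0]))     # -> 4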
How can citation analysis help you?

Researchers
today are faced with increasing pressure to get published. Academic
departments are expected to meet specific levels of publication output.
Clearly, there is a lot at stake in the assessment of research quality
for both individuals and institutions. Given this, governments, funding
agencies, and tenure and promotion committees are looking toward simple
and objective methods to assess increasing research volumes in the least
possible time. To this end, they are turning more and more to citation
analysis for objective parameters of impact assessment. 




Pitfalls of citation analysis

When using citation analysis, it is important to bear in mind some of its limitations [3,7]:

  • It overlooks the disparity in discipline-wise citation rates, that is, the
    fact that citation patterns differ among disciplines and over time.
  • It ignores the fact that certain manuscript types such as letters and case
    reports offer inadequate scope for citation and typically have short
    reference lists.
  • The sentiment of the citation is not considered; that is, a negative
    citation (one used to refute a prior claim) is given as much merit as a
    positive citation (one used to further the claim being made). So even a
    paper that has been cited simply to discredit it can work to the
    author's advantage in citation analysis.
  • It does not account for author contribution on papers with multiple
    authors: such citations are as meritorious as those to single-author
    papers. Citation analysis attributes equal importance to all authors of a
    paper, regardless of their individual contribution.
Thus, sole reliance on citation data provides an incomplete understanding of research. Although citation analysis may be simple to apply, it should be used with caution to avoid it coming under disrepute through uncritical use [3]. Ideally, citation analysis should be performed to supplement, not replace, a robust system of expert review to determine the actual quality and impact of published research [8].

Future of citation analysis

Given
the shift to online interfaces by more and more journals and
repositories, digital information is now available at a few clicks. With
the advent of linking tools and digital archives of research papers,
scientific literature is more easily retrievable than ever before.
Therefore, it is only to be expected that the population of researchers
turning to citation data will continue to grow. In such a scenario,
researchers cannot afford to undermine the importance of citation
analysis. 


So
next time you are preparing for a promotion or applying for a new
position, consider using citation analysis as a means to bolster your
eligibility. Use the citation count feature offered by online databases
like Web of Science to compile your citation data and employ multiple
citation metrics to highlight your research output.


Bibliography
  1. Dodson MV (2008). Research paper citation record keeping: It is not for wimps. Journal of Animal Science, 86: 2795-2796.
  2. Thomson Reuters. History of citation indexing. Essay in Free Scientific Resources. [http://thomsonreuters.com/products_services/science/free/essays/history_of_citation_indexing/]
  3. Smith L (1981). Citation analysis. Library Trends, 30: 83-106.
  4. Garfield E (1979). Citation Indexing: Its Theory and Application in Science, Technology, and Humanities. New York: Wiley.
  5. Garfield E (2006). The history and meaning of the journal impact factor. The Journal of the American Medical Association, 295: 90-93.
  6. Hirsch JE (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences USA, 102: 16569-16573.
  7. Neylon C and Wu S (2009). Article-level metrics and the evolution of scientific impact. PLoS Biology, 7: 1-6.
  8. Moed HF (2007). The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review. Science and Public Policy, 34: 575-583.





Tuesday, 14 February 2017

Impact of Social Sciences – Tracking the digital footprints to scholarly articles: the fast accumulation and rapid decay of social media referrals

Source: http://blogs.lse.ac.uk/impactofsocialsciences/2017/02/14/tracking-the-digital-footprints-to-scholarly-articles-the-fast-accumulation-and-rapid-decay-of-social-media-referrals






Tracking the digital footprints to scholarly articles: the fast accumulation and rapid decay of social media referrals

Academics are increasingly encouraged to share their scholarly articles via social media, as part of a wider drive to maximize their dissemination and engagement. But what effect does this have? Xianwen Wang has studied the referral data of academic papers, with particular focus on social media referrals and how these change over time. Referrals from social media do indeed account for a significant number of visits to articles, especially in the days immediately following publication. But this fast initial accumulation soon gives way to a rapid decay.
PeerJ,
an open access, peer reviewed scholarly journal, provides data on the
referral source of visitors to all of its article pages. This is quite
unique as such data is not available on other publisher or journal
websites. These metrics are updated on a daily basis following an
article’s publication, meaning for the first time we are able to track
the digital footprints to scholarly articles and explore people’s
visiting patterns.
In our previous study examining referral data collected from PeerJ,
social network platforms were proven to be among the top referral
sources. Social media directs many visitors to scholarly articles. In
our more recent study, we used the daily updated referral data of 110 PeerJ articles collected over 90 days (22 January – 20 April 2016) to track the temporal trend of visits directed by social media.
Image credit: 20070912-16 by Matt Binns. This work is licensed under a CC BY 2.0 license.
Twitter and Facebook account for most social media referrals
During our observation period, 19 February
was the first day on which all 110 sample articles had visiting data,
with 20 April being the last day of the research period and the point at
which all papers in our sample had been published for at least 60 days.
According to the findings of our study, article visits directed by
social referrals account for more than 12% of all visits (as shown in
Figure 1). Twitter and Facebook are the two most important social
referrals directing people to scholarly articles; between them
accounting for more than 95% of all social referrals. Individually
Twitter and Facebook were roughly equivalent to one another, each
falling within the 42-54% range.
Figure 1: The proportion of article visits from social referrals on two specific days. Source: Wang et al. (2016), Tracking the digital footprints to scholarly articles from social media, Scientometrics. © Akadémiai Kiadó and republished here with permission.
Attention from social media: “easy come, easy go”
To track temporal trends in what
percentages of total visits to articles could be accounted for by social
media referrals, the daily visiting data of each article were grouped
according to the publish–harvest interval days (the number of days from
publication to data being recorded). The visiting dynamics analysis
(Figure 2) shows an obvious overall downward temporal trend in the
proportion of all visits originating from social media. Where papers had
been published for just one day, social referrals accounted for 20% of
all visits. After 90 days, this percentage falls to only 9%.
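For readers who want to reproduce this kind of grouping on their own referral data, the minimal sketch below shows the basic step in Python with pandas. The table layout and column names are hypothetical (they are not the PeerJ export format): one row per article per day, with the publication date, the day the visits were recorded, and the social and total visit counts.

import pandas as pd

def social_share_by_age(daily):
    # Proportion of visits coming from social referrals, indexed by the
    # publish-harvest interval (days from publication to the recorded day).
    daily = daily.copy()
    daily["age_days"] = (daily["date"] - daily["published"]).dt.days
    grouped = daily.groupby("age_days")[["social_visits", "total_visits"]].sum()
    return grouped["social_visits"] / grouped["total_visits"]

# Toy usage with invented numbers: the social share falls as the article ages.
df = pd.DataFrame({
    "published": pd.to_datetime(["2016-02-02"] * 3),
    "date": pd.to_datetime(["2016-02-03", "2016-02-04", "2016-05-01"]),
    "social_visits": [40, 25, 2],
    "total_visits": [200, 150, 60],
})
print(social_share_by_age(df))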
Overall, during the initial period
following a scholarly article’s publication, social attention comes very
quickly. In most cases, visits from social media are much faster to
accumulate than visits from other referrals, with most of those visits
directed by social referrals being concentrated in the few days
immediately following publication. About 77% of the visits from social
media are generated in the first week after publication. However – “easy
come, easy go” – social buzz around scholarly articles doesn’t last
long, leading to a rapid decay in the article visits from social
referrals.
Figure 2: Temporal trend of the proportion of visits from social media in the total visits. Source: Wang et al. (2016), Tracking the digital footprints to scholarly articles from social media, Scientometrics. © Akadémiai Kiadó and republished here with permission.
The role of social buzz in directing
people to scholarly articles can be illustrated by a specific example.
As shown in Figure 2, a small but noticeable increase occurs at the
middle part of the curve. We reviewed the data and discovered that this
small burst is attributable to a jump in visits from Twitter to paper 1605.
Paper 1605 was published on 2 February 2016. By 6 March, the number of
article visitors directed by Twitter had reached 381. On 7 March, a
particularly influential Twitter account
(with 1.97 million followers) tweeted about the paper. That tweet was
retweeted 11 times on the same day and is the reason the number of
article visitors from Twitter rose dramatically from 381 to 751 in only a
few days.
The fluctuation visible towards the end of
the curve is caused by the vast decrease in the number of samples with
sufficiently long time windows (in number of days since publication).
Synchronism between the number of tweets and article visitors from Twitter
Figure 3: Synchronism of the temporal trend of tweets and their procured visits for paper 1605. Source: Wang et al. (2016), Tracking the digital footprints to scholarly articles from social media, Scientometrics. © Akadémiai Kiadó and republished here with permission.
The synchronism between the growth in the number of tweets and the growth in article visitors from Twitter partially testifies that social mentions do direct people to read scholarly articles, although we don't know who is directed by which tweet. Article visitors from social referrals may be researchers, students, or even the general public. However, it does show that public attention on social media can be transformed into real clicks on scholarly articles.
This blog post is based on the author’s co-written article, ‘Tracking the digital footprints to scholarly articles from social media’, published in Scientometrics (DOI: 10.1007/s11192-016-2086-z).
Note: This article gives the views of
the author, and not the position of the LSE Impact Blog, nor of the
London School of Economics. Please review our 
comments policy if you have any concerns on posting a comment below.
About the author
Xianwen Wang is a Professor at WISE Lab, Dalian University of Technology in China and an Associate Editor of Frontiers in Research Metrics and Analytics. His ORCID iD is 0000-0002-7236-9267.



Monday, 13 February 2017

Nuts and Bolts: The Super Long List of Things to Do When Starting a New Journal      - The Scholarly Kitchen

 Source: https://scholarlykitchen.sspnet.org/2016/08/04/nuts-and-bolts-the-super-long-list-of-things-to-do-when-starting-a-new-journal/




Nuts and Bolts: The Super Long List of Things to Do When Starting a New Journal     


Launch of the USS New Jersey in 1942. Image courtesy of the US Government.

This past May, I participated in a session at the Council of Science Editors Annual Meeting about starting a new journal. My role was to discuss the logistics and technical issues, or better titled, the Super Long List of Things to Do. There were two very good presentations that went along with mine. Cara Kaufman of Kaufman, Wills, Fusting, & Co. discussed when and how to decide whether to start a new journal. Katherine Bennett presented a case study for the launch of a new open access journal at the American Society of Radiation Oncology.


The idea of launching a new journal may seem easy with today’s
technology. Some may argue that all you need is a website with a content
management system. This may work for some communities but for a journal
that wants to meet the expectations of the typical journal user and/or
subscriber, there are many, many things that need to be done.


I have launched three journals in the last four years, none of which
are open access (OA) journals. I will try to differentiate between a
subscription journal and an OA journal where necessary but I honestly
think the process is pretty much the same, regardless of the business
model.


So let’s assume that the business case for starting a new journal has
been met and you already have an editor in place. Now you are tasked
with all of the details needed to actually launch a new journal. Here
are some things I have learned along the way.


In order to keep track of everything, I keep an Excel spreadsheet.
This was originally created by an über-organized coworker. The
spreadsheet has been refined and  now serves two purposes: first, to
record and keep track of deadlines and responsibilities; and second, to
share critical information with everyone who needs the information.


In order to maintain the integrity of the data, all questions that
come my way are answered with the spreadsheet — I literally send them
the sheet, not cut and paste information. I have seen too many instances
where retyping information results in errors. Of course this means that
your spreadsheet needs to be correct and updates noted.


Identifiers

The first part of the sheet contains what I call “identifiers.” These
are basic metadata elements that need to be correct and decided
relatively early.


Title — What to call a journal can change as more
people get involved with reviewing information; but, it’s important to
make the decision and stick with it. I did have a journal title change
halfway through launch once and it required that I get new ISSNs, which
was another unnecessary delay. You should also include an abbreviated
title on your spreadsheet. Again, you want the same abbreviation to
appear everywhere. For my program at the American Society of Civil
Engineers (ASCE), we use the abbreviated title in our references and the
same abbreviations everywhere else.


Internal acronyms and codes — All of our journals
have a two-letter acronym. This acronym is part of our manuscript
numbering system and the URL for our manuscript submission sites. You
may also need a code for internal accounting purposes. Remember that you
probably need accounting codes for outgoing payments but also incoming
payments.


ISSN — Serial publications should have an International Standard Serial Number or ISSN.
Every format of the journal requires an ISSN. If you have a print and
an online format, you need to request two ISSNs. For forthcoming print
titles, an ISSN can be requested prior to the first issue being
published if you provide a journal masthead page. Once the first issue
is published, you will need to mail a copy to the Library of Congress in
order for your ISSN to move from provisional to final.


For online-only publications, you cannot request an ISSN until 5
papers have been published. A URL will be required in lieu of the print
masthead page. Note that many of the library holdings systems require
ISSNs so even OA journals should consider having an ISSN for the
libraries.


In the U.S., ISSNs are assigned by the Library of Congress. There are other ISSN granting institutions outside the U.S. An important note — an ISSN must be registered with the International ISSN Registry
in order for Scopus (and possibly others) to index the journal. ISSNs
from the Library of Congress are covered but some international ISSN
granting groups are not so careful about this.


CODEN — A CODEN
is a combination of six letters and numbers assigned by the Chemical
Abstract Services for cataloging serials. At ASCE, we have always had
CODENS, partly because our first online platform required them. We still
use CODENS as a unique journal identifier in places like the URL for
journals and in the DOI. A CODEN is not required and many journals
outside of the physical sciences do not use them.


DOI — Our Digital Object Identifiers,
or DOIs, have evolved over time. Because we have 36 journals, we like
to at least be able to identify the journal by just glancing at the DOI.
In the beginning, we had loads of information in the DOI, then we
switched to including ISSNs in the DOI string. With the delay in getting
an ISSN for online only journals, we were forced into another change
and now use the CODEN followed by the sequential number string. There
are no requirements to include identifying information in a DOI string
and, I would venture to guess, Crossref would probably rather you not do that anyway!
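To make the "CODEN plus sequential number" idea concrete, here is a tiny sketch of how such a DOI suffix could be assembled. The prefix, the zero-padding width, and the punctuation are invented for illustration; this is not ASCE's actual DOI format, and as noted above there is no requirement to encode any of this in the suffix.

PREFIX = "10.99999"  # hypothetical Crossref member prefix, not ASCE's

def next_doi(coden, last_sequence):
    # Build the next DOI for a journal identified by its CODEN: prefix, then
    # the CODEN, then a zero-padded sequential article number.
    return "%s/%s.%07d" % (PREFIX, coden, last_sequence + 1)

print(next_doi("RSERF", 41))  # -> 10.99999/RSERF.0000042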


Format and Design

Frequency and schedule — If you intend to have
“issues,” which is still advantageous for journals that will be indexed
by Abstract and Indexing (A&I) services and others, you will need a
frequency. This information will also be needed if you are selling
subscriptions to the journal. Even if you intend to employ some form of
continuous publication (eFirst, Just Accepted, etc.), you will need to
set a frequency if issues are involved. The schedule for issues may be
fluid for some publications but with 36 journals, we attempt to balance
the number of issues coming out in any given month so as to not
overwhelm the production department.


Cover and interior — “Cover” may not be the correct
word if you have an online-only journal, but you will need some branding
and likely something shaped like a cover. Have you ever wondered why
eBooks or online-only journals have a graphic that looks like a regular
cover? It’s because that’s what people expect to see in marketing
pieces. If it doesn’t have a cover, it’s not real. Also, many of the
“spaces” provided on off-the-shelf online platforms for a publication
image are the shape of a cover thumbnail. The spreadsheet should note
any color considerations for branding, additional logos that need to be
included, and notes about interior design.


Submission and Production Set-Up

Submission site — Note the URL for submissions when
available. This will be important for marketing the journal and the
call-for-papers campaign. This portion of the spreadsheet also includes
information about the review style (EIC, Associate Editors, Editorial
Board, Advisory Board, Single-blind, double-blind, open review, etc.). I
also note on this section whether we can pull information from an
existing site, such as a reviewer pool from another one of ASCE’s
journals that has related content.


Classifications and taxonomy — If you have a
taxonomy, it is important to review the taxonomy against the Aims and
Scope of the new journal to ensure that you have appropriate terms. We
use classifications for people and papers in our submission site so
identifying where those will come from and who will review them (likely
the editor) is important.


Article types and production issues — This section
could be quite extensive and perhaps warrant a whole other worksheet
depending on the journal. At ASCE, we try to keep the journals
standardized so I simply note whether there are any additional article
types that production needs to build into the XML metadata.


Metadata

Crossref and other indexing services — Depositing
DOIs with Crossref is an important step for discoverability. You should
inform Crossref and any other indices that a new title is forthcoming.
In order to deposit a DOI for an article, an ISSN is needed and as
mentioned earlier, you cannot apply for one for online only content
until at least 5 papers have been published. You are permitted to
deposit DOIs with a journal title level DOI
but those will need to be replaced when an ISSN is added. Either way,
it’s important to note that your DOIs will need to be deposited off
cycle and that getting the ISSN as soon as possible is important.


Web of Science — You should be sending Thomson
Reuters (or their apparent successor) a frequency chart each year with
any changes to frequency. New journals should be added even if you
haven’t applied for coverage yet. There is an application for getting a new journal indexed and you can apply immediately once you start publishing content.


Thomson Reuters takes timeliness of issues very seriously. Once you
have applied and have published three issues, you are encouraged to ask
for a status update. This will ensure that someone is actually
evaluating your content. You will need to provide access to Thomson
Reuters for evaluation. If your content is behind a paywall, you will
need to provide them with subscriber access. You can read more about the
evaluation criteria and process here.
Generally speaking, you will be informed if and when your journal is
indexed. This could take years. A journal will not be assigned an Impact
Factor until it is accepted into the appropriate database.


Scopus/Compendex — It is important to note that you
cannot apply for coverage in Elsevier’s databases until the journal has
been published for three years. Once the time has passed, there is an online application and evaluation process. The Scopus database is separate from the other Elsevier databases and as such two separate applications are required. More information can be found here. You will be informed if your journal has been accepted or denied. It can take more than a year to find out.


PubMed/Medline — For print journals, you must supply copies to Medline
for evaluation and you can start as soon as the first issue is out. For
online journals, you cannot apply for coverage until you have published
for 12 consecutive months and you have published 40 articles. Medline
requires access to content for evaluation purposes.


Google Scholar — While it may not be entirely necessary to inform Google Scholar
of a new journal, it certainly doesn’t hurt. Google Scholar is quite
accessible and appreciates it when publishers are proactive about their
plans.


Feed and crawler management — The spreadsheet should
indicate if there are any metadata feeds or crawlers that the new
journal should be excluded from. If not, then you may actually need to
add this new title to the feeds you are managing (see next section on
Website).


Website Set-Up

Landing page — A new journal needs to be added to
the publication platform. All of the information needed in the
administrative tools for set up should be included in the spreadsheet.
You may need to decide when to make a journal landing page live and
whether having a “coming soon” page makes sense. For us, we include
cover art, editor, Aims and Scope, Submission information, and the
ability to sign up for Tables of Content Alerts. Whether on the platform
or not, potential authors will need access to the Aims and Scope as
well as editor information as early as possible.


In house web ads — Identify which other web pages within the platform would be most appropriate for Call for Papers ads and announcements.


Turn feeds on or off — Depending on your platform,
you may need to manually include the journal in routine feeds of
metadata. Sometimes, you may need to suppress a feed until a later date
(like if you don’t have an ISSN yet for Crossref deposits).


Subject categories — If the journal platform has title level subject categories, these should be assigned at set up.


Contract and Notifications

You know you have them, you probably have lots of them. If your
contracts or agreements list the journal titles, you may need to reach
out to those partners with an addendum. You may need to adjust the
contracted number of papers being hosted or typeset depending on the
volume of the new journal. Don't forget to review any agreements with
A&I services as well as archive services like CLOCKSS and Portico.


Marketing

New journals require a serious amount of marketing support. We cover
this in separate meetings between marketing and journals. It is
important for the journals and production teams to know the schedule for
things like annual catalogs and maybe member journal renewals. Annual
meetings or conferences may also be the platform for announcing a new
journal. The marketing schedule should run parallel to the journal
launch schedule to maximize opportunities for promotion. Promotions we
have done for new journals include:


  • Call for Papers PDF flier (can print for conference booths and send to the editors for email distribution)
  • E-mail campaigns to authors or members that may be interested in the new title
  • Editor interview posted to organization website
  • Conference promotions (fliers, posters, etc.)
  • Editor solicitation cards (pocket-sized cards that members of the
    editorial board can use at conferences to solicit submissions from
    presenters)
  • Social media — post early, post often

Internal Communication

There are lots of people within your organization that need to know about new journals. Here is a list that I use:


  • Customer Service — make sure they can answer any questions that come
    in about the new title. You don’t want someone to call with a question
    and the customer service rep says that you don’t have a journal with
    that title.
  • Membership — the new journals should be included on things like a member renewal or services brochure.
  • Website Team — Our corporate website is separate from our
    publication website. It’s important to include the new journal on any
    corporate website pages that focus on publication titles.
  • IT and Accounting — If you pull sales reports on journals or track
    APCs paid per journal, then likely there is a report that needs to have
    the new journal added.
Without a doubt, the hardest part of launching a new journal is
getting the editorial staff or volunteers on board and then soliciting
content. For a subscription journal, constant and steady solicitation is
vitally important to ensure that quality peer-reviewed content is
served to subscribers in a timely fashion. For an OA journal, the
pressure for subscriptions is null but you still want to have a nice
showing of content for the marketing blitz.


There is a ton of competition with new journals being born all the
time. Starting a new journal is not to be taken lightly. Gone are the
days — if they ever existed — to “build it and they will come.” It’s a
lot of work.


In this post, I have tried to outline the more routine details — my “to do” list for starting a new journal. I hope you find the spreadsheet template and PowerPoint slides helpful and I look forward to your comments on how you manage the process.



