Saturday, 31 October 2015

Citation Frequency and Ethical Issue

 Source: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4324277/

Electron Physician. 2014 Apr-Jun; 6(2): 814–815.
Published online 2014 May 10. doi:  10.14661/2014.814-815
PMCID: PMC4324277

Citation Frequency and Ethical Issue

Dear Editor:

I read your publication ethics issue on “bogus impact factors” with great interest (1).
I would like to draw attention to a new trend in manipulating citation
counts. There are several ethical approaches to increasing the number of
citations for a published paper (2). However, it is apparent that some manipulation of citation counts is occurring (3, 4). Self-citations, “those in which the authors cite their own works,” account for a significant portion of all citations (5).
With the advent of information technology, it is easy to identify
unusual citation trends for a paper or a journal. A web application
to calculate the single publication h-index based on (6) is available online (7, 8). A tool developed by Francisco Couto (9)
can measure authors’ citation impact while excluding self-citations.
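The single publication h-index mentioned here is straightforward to compute: it is the ordinary h-index taken over the citation counts of the papers that cite one focal paper. A minimal sketch (the citation counts below are made up for illustration; the web tool of refs. 7-8 pulls real data from Google Scholar):

```python
# Sketch of the "single publication h-index" (refs. 6-8): the ordinary
# h-index computed over the papers that cite one focal paper.

def h_index(citation_counts):
    """Largest h such that h items each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts of the papers citing some focal paper:
citing_paper_counts = [12, 9, 7, 5, 3, 2, 1, 0]
print(h_index(citing_paper_counts))  # 4
```

Here 4 papers citing the focal paper have at least 4 citations each, so the focal paper's single publication h-index is 4.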
Self-citation is ethical when it is a necessity; nevertheless, there is a
threshold for self-citations. Thomson Reuters’ resource the
Web of Science (WoS), which currently lists journal impact factors,
considers self-citation acceptable up to a rate of 20%; anything
over that is considered suspect (10).
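The 20% threshold is easy to operationalise. A minimal sketch, assuming a hypothetical list of citation records; this is only an illustration, not how WoS actually computes the rate:

```python
# Illustrative sketch: flagging a self-citation rate against the ~20%
# threshold the letter attributes to Web of Science. The data model
# below is hypothetical.

def self_citation_rate(citations):
    """citations: list of (citing_authors, cited_authors) pairs,
    each a set of author names for one citation event."""
    if not citations:
        return 0.0
    self_cites = sum(1 for citing, cited in citations if citing & cited)
    return self_cites / len(citations)

def is_suspect(citations, threshold=0.20):
    """True if the self-citation share exceeds the threshold."""
    return self_citation_rate(citations) > threshold

# Example: 3 of 10 citations share an author with the cited work.
cites = [({"A", "B"}, {"A", "C"})] * 3 + [({"D"}, {"A", "C"})] * 7
print(f"self-citation rate: {self_citation_rate(cites):.0%}")  # 30%
print("suspect" if is_suspect(cites) else "acceptable")        # suspect
```

A citation counts as a self-citation here when the citing and cited author sets overlap, matching the definition quoted from (5).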
In some journals, even 5% is considered a high rate of
self-citation. The Journal Citation Reports is a reliable source for
checking the acceptable level of self-citation in any field of study.
The Public Policy Group of the London School of Economics (LSE)
published a handbook, “Maximizing the Impacts of Your Research,” which
describes self-citation rates across different groups of disciplines,
indicating that they vary by up to 40% (11).
Unfortunately,
there is no significant penalty for the most frequent self-citers, and
the effect of self-citation remains positive even at very high rates of
self-citation (5). However, WoS has dropped some journals from its database because of anomalous citation patterns (4).
The same policy should also be applied to the most frequent
self-citers. The ethics of publication should be adhered to by all
who wish to conduct research and publish their findings.

References

1. Jalalian M, Mahboobi H. New corruption detected: Bogus impact factors compiled by fake organizations. Electron Physician. 2013;5(3):685–6. doi: 10.14661/2014.685-686.
2. Ale Ebrahim N, Salehi H, Embi MA, Habibi Tanha F, Gholizadeh H, Motahar SM, et al. Effective Strategies for Increasing Citation Frequency. International Education Studies. 2013;6(11):93–9. doi: 10.5539/ies.v6n11p93. http://opendepot.org/1869/1/30366-105857-1-PB.pdf.
3. Mahian O, Wongwises S. Is it Ethical for Journals to Request Self-citation? Sci Eng Ethics. 2014:1–3. doi: 10.1007/s11948-014-9540-1.
4. Van Noorden R. Brazilian citation scheme outed. Nature. 2013;500:510–1. doi: 10.1038/500510a. http://boletim.sbq.org.br/anexos/Braziliancitationscheme.pdf.
5. Fowler JH, Aksnes DW. Does self-citation pay? Scientometrics. 2007;72(3):427–37. doi: 10.1007/s11192-007-1777-2.
6. Schubert A. Using the h-index for assessing single publications. Scientometrics. 2009;78(3):559–65. doi: 10.1007/s11192-008-2208-3.
7. Thor A, Bornmann L. The calculation of the single publication h index and related performance measures: a web application based on Google Scholar data. Online Inform Rev. 2011;35(2):291–300.
8. Thor A, Bornmann L. Web application to calculate the single publication h index (and further metrics) based on Google Scholar. 2011 [cited 2014 May 3]. Available from: http://labs.dbs.uni-leipzig.de/gsh/
9. Couto F. Citation Impact Discerning Self-citations. 2013 [cited 2014 May 3]. Available from: http://cids.fc.ul.pt/cids_2_3/index.php.
10. Epstein D. Impact factor manipulation. The Write Stuff. 2007;16(3):133–4.
11. LSE Public Policy Group. Maximizing the impacts of your research: a handbook for social scientists. London, UK: London School of Economics and Political Science; 2011. http://www.lse.ac.uk/government/research/resgroups/lsepublicpolicy/docs/lse_impact_handbook_april_2011.pdf.



Friday, 30 October 2015

Workshop On: Improve Research Impact: Enhancing Research Visibility & Improving Citations







INVITATION TO WORKSHOP BY CENTRE FOR RESEARCH SERVICES






Dear Campus Community,

Did you know that over 43% of ISI papers have never received any
citations (nature.com/top100, 2014)? Publishing a
high-quality paper in a scientific journal is only halfway towards receiving
citations. The rest of the journey depends on disseminating
the publication via proper utilization of the
“Research Tools”. Proper tools allow
researchers to increase the research impact of, and citations to, their
publications. This workshop will present various techniques for
increasing the visibility and enhancing the impact of one’s research work.

Research Support Unit, Centre for Research
Services (PPP) would like to invite you to participate in our workshop as
follows:



1. Conducting a Literature Search & Writing Review Paper
   Registration is closed

2. Improve Research Impact: Enhancing Research Visibility & Improving Citations
   Date: 18 & 19 November 2015
   Venue: Computer Lab, Level 2, Kompleks Pengurusan Penyelidikan & Inovasi, IPPP
   Fees: RM 400.00
   Registration deadline: 11 November 2015


Kindly find the brochure of the workshop and
registration form in the attachment. Please submit the registration form and
proof of payment before the deadline. 

------
About Dr. Nader Ale Ebrahim:

Nader Ale Ebrahim is currently a visiting research fellow with the
Research Support Unit, Centre for Research Services, IPPP, University of
Malaya. He holds a PhD in Technology Management. His current research
interests include e-skills, research tools, bibliometrics, university
ranking, and research impact. Nader is well known as the creator of the
“Research Tools” Box and the developer of “Publication Marketing Tools”.
The “Research Tools” Box helps researchers use the tools available on
the Internet effectively and reduce their search time. He was the winner
of the Refer-a-Colleague Competition and has received prizes from
renowned establishments such as Thomson Reuters. Nader has so far
conducted over 200 workshops within and outside of the University of
Malaya. His papers have been published in and presented at several
journals and conferences.
------

For further information please contact us at
(03-7967 7812 / 
uspi@um.edu.my). 



Thank you.

Regards,

ASSOC. PROF. DR. NGOH GEK CHENG


Head of Research Support Unit,

Centre for Research Services,

Institute of Research Management & Monitoring (IPPP),
Level 2, Kompleks Pengurusan Penyelidikan & Inovasi,
University of Malaya.
 







Research Tools By: Nader Ale Ebrahim - MindMeister Mind Map

Monday, 26 October 2015

"A new research impact measuring system" by Nader Ale Ebrahim

A new research impact measuring system

Nader Ale Ebrahim, Research Support Unit, Centre for Research Services, IPPP, University of Malaya (UM), Kuala Lumpur, Wilayah Persekutuan 50603

Abstract

For years, scientists have tried to measure the quality of
scholarly work by the number of times an article is cited in other
articles, or by the impact factor of the journal that published it.
However, citation is a lagging indicator, and the journal impact factor
may be misleading, since a journal's citation count is usually driven by
a small number of articles in that journal. With the rise of the web as
an archiving and interaction platform, there is a need for new
ways to measure the impact of articles and books. Altmetrics attempts to
use online activity to measure the impact, buzz, and word of mouth around
scientific information, and it includes new ways to measure usage at the
citation level. In this workshop, I will explain the application of
altmetrics tools such as Altmetric.com, Impactstory.org,
Plumanalytics.com, and PLoS metrics.











Suggested Citation

Nader Ale Ebrahim. "A new research impact measuring system."
Research Support Unit, Centre for Research Services, Institute of
Research Management and Monitoring (IPPP), University of Malaya,
Malaysia. Oct. 2015.
Available at: http://works.bepress.com/aleebrahim/106


Wikiscientist - IDMARCH - Document Search Engine

Preprint version of: N. Ale Ebrahim, “Create an Online Researcher Profile on Wikiscientist,” University of Malaya Research Bulletin, vol. 3, no. 1, 23 June, 2015.


Source URL: files.figshare.com

Language: English - Date: 2015-04-12 13:07:36


    What kind of "Research Tools" have you used during your PhD journey? Why? - Quora



    What kind of "Research Tools" have you used during your PhD journey? Why?

    I have collected over 700 tools that can help researchers do their
    work efficiently. They are assembled as an interactive web-based mind
    map, titled "Research Tools" (Research Tools By: Nader Ale Ebrahim),
    which is updated periodically. I would like to know about your
    personal experience with any kind of "Research Tools" that you use to
    facilitate your research.

    Aleksey Belikov, professional scientist:

    • DuckDuckGo for finding relevant literature. Surprisingly, it is much better than Google Scholar, Google itself, or PubMed.
    • MS Excel for performing calculations on raw data.
    • GraphPad Prism for making publication-ready graphs and statistical analysis.
    • MS PowerPoint for creating composite figures and line art schemes, as well as presentations, of course.
    • MS Word for writing papers and reports.
    • EndNote for organizing papers and making reference lists.




    Impact of Social Sciences – An antidote to futility: Why academics (and students) should take blogging / social media seriously

     Source: http://blogs.lse.ac.uk/impactofsocialsciences/2015/10/26/why-academics-and-students-should-take-blogging-social-media-seriously/

    An antidote to futility: Why academics (and students) should take blogging / social media seriously

    Blogs are now an established part of the chattersphere/public
    conversation, especially in international development circles, but
    Duncan Green finds academic take-up lacking. Here he outlines the
    major arguments for taking blogging and social media seriously. It
    doesn’t need to become another onerous time commitment: reading a
    blog should be like listening to the person talk, but with links.



    Before I started teaching at LSE in January, I had the impression
    that the academics and researchers around the school were totally social
    media savvy – prolific tweeters like Charlie Beckett and top blogs like LSE Impact are high up on my follow list.


    It turned out the impression was, ahem, a little misleading. A good
    proportion of the people I have come across may be brilliant in their
    field, but when it comes to using the interwebs, tend to sound like the
    querulous 1960s judge asking ‘What is a Beatle?’ (‘I don’t twitter’).
    Much of life is spent within the hallowed paywalls of academic journals
    (when I pointed out that no-one outside academia reads them, the baffled
    response seemed to be along the lines of ‘and your point is?’).


    So why should they rethink? Here are some initial arguments, confined
    to blogs and twitter (the only bits of social media I engage with). I’m
    sure there are lots of others – feel free to add:


    1. Remember that a blog is a ‘web log’, i.e. an online diary. Regular
      blogging builds up a handy, time-saving archive. I’ve been blogging
      daily since 2008. OK, that’s a little excessive, but it means that I
      essentially have a download of my brain activity over the last 7
      years: almost every book and paper I’ve read, plus conversations and
      debates. Whenever anyone wants to consult me, I have a set of links I
      can send (which saves huge amounts of time), and raw material for the
      next presentation, paper or book.
    2. Making sure someone reads your research. Look no further than the
      excellent LSE Impact blog for evidence: here’s a quick search of their
      posts:
    Patrick Dunleavy argues
    blogging and tweeting from multi-author blogs especially is a great way
    to build knowledge of your work, to grow readership of useful articles
    and research reports, to build up citations, and to foster debate across
    academia, government, civil society and the public in general.


    World Bank research on economics blogging (with regressions, natch) concluded:


    Blogging about a paper causes a large increase in the
    number of abstract views and downloads in the same month: an average
    impact of an extra 70-95 abstract views in the case of Aid Watch (now sadly defunct) and Chris Blattman, 135 for Economix, 300 for Marginal Revolution, and 450-470 for Freakonomics and Krugman. [see regression table here and below]


    These increases are massive compared to the typical abstract views
    and downloads these papers get: one blog post on Freakonomics is
    equivalent to 3 years of abstract views! However, only a minority of
    readers click through; we estimate that 1-2% of readers of
    the more popular blogs click on the links to view the abstracts, and 4%
    on a blog like Chris Blattman’s, which likely has a more specialized
    (research-focused) readership.
    Source: Academic blogs are proven to increase dissemination of economic research and improve impact.
    LSE Impact resources for twitter users include:


    Source: Who gives a tweet? After 24 hours and 860 downloads, we think quite a few actually do
    3. It gives you a bit of soft power (let’s not exaggerate this, but check out slide 15 of this research presentation [ppt]
      for some evidence). Blogs are now an established part of the
      chattersphere/public conversation, so you get a chance to put your
      favourite ideas out there, and spin those of others. People in your
      organization may well read your blogs and tweets even if they don’t read
      your emails.
    4. Blogging is a great antidote to that feeling of anticlimax and
      futility that comes after you send off the paper or the book manuscript,
      and suddenly the true indifference of the universe becomes apparent.
      You can keep discussing and communicating with interesting people, and
      keep the existential crisis at bay.
    5. And don’t forget the free books, also known as ‘review copies’.
    6. And the chance to publicly insult your enemies (not relevant in my case, obvs, as I don’t have any).

    “I Don’t Have Time”

    The counter-argument is bound to be ‘we don’t have time’, but if you
    take too long, that probably means the blog won’t be very accessible.
    Reading a blog should be like listening to the person talk, but with
    links. This post took me precisely 30 minutes to write, including the
    ‘research’.


    Maybe Twitter’s apparent time-efficiency explains why Twitter seems
    better represented than blogging (though I only found this out by
    writing this post and circulating it!). In the International Development
    faculty (including honorary fellows and Professors in Practice) we have
    Owen Barder, Pritish Behuria, Mayling Birney, Benjamin Chemouni, Jean Paul Faguet, Danny Quah, Keith Hart, Sohini Kar, Silvia Masiero, Rajesh Venugopal and Kevin Watkins. Did I miss anyone? Oh yes, me.


    If you’re interested in dipping your toe in the social media ocean, here are some tips for bloggers on international development and a previous effort to convince sceptics. But the best thing is just to try it and see. At the very least, follow Chris Blattman to see how it’s done.


    This piece originally appeared on the LSE International
    Development blog and is reposted with permission. Keith
    (k.mcdonald@lse.ac.uk) is currently the Managing Editor of the
    International Development blog. Get in touch if you want to have a go.



    Note: This article gives the views of the author, and not the
    position of the Impact of Social Science blog, nor of the London School
    of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.



    About the Author


    Duncan Green is Senior Strategic Adviser for
    Oxfam GB and Professor in Practice in the department for International
    Development at LSE. He is also author of the book ‘From Poverty to Power’. He can be found on twitter @fp2p.





    Sunday, 25 October 2015

    Impact of Social Sciences – The arXiv cannot replace traditional publishing without addressing the standards of research assessment.

     Source: http://blogs.lse.ac.uk/impactofsocialsciences/2015/10/23/open-repositories-arxiv-scientific-publishing/

    The arXiv cannot replace traditional publishing without addressing the standards of research assessment.

    Jan van den Heuvel
    considers the vital role of discipline-specific repositories in the
    research process. The arXiv came into existence because it provided a
    solution to a very practical problem, namely publication time-lags.
    Recent developments like overlay journals suggest these platforms could
    play a bigger role in the publishing process, but as long as recruitment
    and promotion panels attach value only to papers published in specific
    journals, their role will be limited.



    When a researcher in most areas of Physics, Mathematics or Computer
    Science (and increasingly also Statistics, Quantitative Finance and
    Quantitative Biology) is looking for recent publications in their field,
    one of the first places they will look is the arXiv.
    (Pronounced “archive”, with the “X” standing in for the Greek letter
    chi.) The arXiv was started in 1991 as a simple central repository of
    electronic preprints in physics, based on servers at the Los Alamos
    National Laboratory. Soon it expanded its scope to other areas. In 1999
    it moved to Cornell University Library, which is still its main base.
    The statistics page of the arXiv
    gives a good indication of its size and activity: over 1 million
    submissions since its start, currently between 8,000 and 9,000 new
    submissions per month, and around 10 million downloads per month.


    So why has the arXiv become so important for researchers in these
    particular fields? Why is it now more or less standard that
    any active researcher in these areas will deposit a close-to-final
    version of their publications in the archive? Part of it can be
    explained by the increasing prominence of Open Access and related
    developments in academic publishing. But that can only explain a small
    part of the success of the arXiv. The main reason for its success, in my
    opinion, is a specific feature of these research areas: the very long
    lead time between submission and journal publication of papers in
    those fields, and hence the historic prominence of “preprints” and
    “reports”. I will describe some of that background below, specifically
    for Mathematics (my field), but similar factors play a role in the other
    subjects covered by the arXiv as well.


    Image credit: Wallpoper (Public Domain)
    In Mathematics, a period of one year between submission and
    publication is quite common, while periods of 3-4 years are nothing
    exceptional. A major reason for those long lead times is the thorough
    refereeing that is expected. Most papers in Mathematics consist for a
    large part of one or more detailed proofs of the main result(s). These
    proofs can vary in length from a few paragraphs to several hundred pages
    (although anything over roughly 30 pages is considered long). And it is
    one of the main duties of the referees to convince themselves of the
    correctness of those proofs; a process that involves carefully going
    through the arguments, checking if the logic is correct, checking if old
    results are used correctly, etc. Thoroughly checking one page of a
    proof can easily take more than a day. This means that the refereeing
    process usually takes at least several months, or even years if the
    referees need to find the time to do a proper job. And if errors are
    found, the author(s) might be asked to try to correct them, and a 2nd or
    3rd version of the paper may need a considerable amount of time to be
    scrutinised again. Added to the lengthy refereeing process in the past
    was the specialised typesetting that was required for mathematical
    texts.


    Because of the long time between submission and publication, the
    existence of “preprints” or “reports” was standard in the mathematical
    community. As soon as a version of a new paper was submitted to a
    journal, the author(s) would make a number of hard-copies of it, often
    in the form of a report in a “Reports Series” based in any respectable
    Mathematics department. (The one at the LSE was called CDAM Research Report Series;
    although still accessible online, it stopped accepting new material in
    2009.) When you went to a conference or gave a seminar, you would
    bring a couple of those preprints. And after the presentation,
    interested members of the audience would come forward and ask “do you
    have a preprint of this?”. Note that these preprints were different from
    the “working papers” that exist in some other fields. Where a “working
    paper” is a publication that is still in development, a preprint or
    report would be a (hopefully) close to final version, more or less
    identical to the manuscript that was already submitted to a journal.


    Once the World Wide Web became more prominent, those preprints went
    online, usually via personal homepages of the author(s). At the same
    time, institutional preprint series were going online. And once the
    advantages of having a central repository became clear, most of us
    started uploading our work to one of those, and personal homepages and
    the surviving preprint series just link to the article on the arXiv.


    So the arXiv is not something that came into existence because of the
    move towards Open Access. It’s more that it was the solution to a
    practical problem: “if it will take several years before my paper will
    be published, how do I tell the world about my brilliant work in the
    meantime?”. Of course, the arXiv is now seen as a prime example of Open
    Access: it is completely free to search and download all publications.
    It allows uploading new versions of a paper, while at the same time
    keeping previous versions accessible.


    On the other hand, in its present form the arXiv is not in a position
    to replace traditional journals. The main reason for that is the lack
    of refereeing. There is a group of moderators who can reject
    publications that are not scientific or recategorise off-topic
    submissions. But in general any paper can be a brilliant proof of a
    long-standing conjecture, a piece of high-school Mathematics, or
    something that upon serious reading is clearly wrong. As long as
    academic recruitment panels and promotion committees attach value to
    papers published in specific journals only, repositories such as the
    arXiv can have a limited role in the whole publication process.


    An interesting new development is the appearance of “overlay
    journals”. These are journals that have an independent (online)
    presence, but which use a central repository to host the papers appearing
    in them. In other words, the journal will have editors, an editorial
    board, a review process, etc., but in the end the list of papers in it
    will just be a list of links to the relevant papers in some repository.
    Although these overlay journals have existed for a while, they became a
    lot better known when Timothy Gowers announced on his blog
    that he and a number of extremely eminent collaborators would start an
    arXiv overlay journal in their specialism. Gowers became quite
    well known for calling for a boycott of the
    traditional commercial scientific publishers, in particular Elsevier.
    (See here, here and here
    for more on that.) So anything he does regarding Open Access and the
    use of open repositories immediately makes people sit up and pay
    attention.


    So could we see a more prominent role for completely open repositories
    such as the arXiv in the scientific publication process? Maybe. But two
    main obstacles remain, from my point of view. How do you set up a
    review process that makes it possible to recognise (top-)quality among
    the publications in the repositories? And how do you overcome the
    inbuilt conservatism of academic recruitment panels and promotion
    committees, which look first and mainly at publications in journals they
    recognise? As long as those hurdles are not removed, commercial
    publishers won’t have to worry too much, unfortunately.


    This piece originally appeared on the LSE Library Blog and is reposted under CC BY 4.0


    Note: This article gives the views of the author, and not the
    position of the Impact of Social Science blog, nor of the London School
    of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.



    About the Author


    Professor Jan van den Heuvel teaches and researches in the Department of Mathematics at LSE. He can also be found on Twitter @JanvadeHe.




    DE GRUYTER – Traditional Scholarly Publisher’s Shift Towards Open Access. The Facts Behind the Numbers | Open Science

     Source: http://openscience.com/de-gruyter-traditional-scholarly-publishers-shift-towards-open-access-the-facts-behind-the-numbers/

    DE GRUYTER – Traditional Scholarly Publisher’s Shift Towards Open Access. The Facts Behind the Numbers





    October 23, 2015
    De Gruyter Open published 11,115 papers in fully open access
    journals in 2014. 11,276 articles were published under De Gruyter’s
    brand in hybrid journals (of which a small part was also open access).
    In 2015, the company will probably publish 12,898 papers in fully open
    access journals and 12,673 in hybrid venues. So it is very likely that
    the group will publish more open access than conventional articles this
    year.



    There is a growing discussion about the protracted “transition to
    open” period, during which research institutions have to pay both for
    subscriptions to traditional venues and for publication fees in open
    access journals. Participants in this discussion stress that academic
    publishers benefit from keeping revenues from these two sources, and
    therefore will never be keen to transform all their journals to open
    access. They also use the notion of “double dipping” to describe the
    fact that some publishers may be paid twice for the same content, by
    charging publication fees to keep a particular journal article open
    while still selling subscriptions to the whole journal, which remains
    partially toll access (this is so-called ‘hybrid open access’).


    Open access is a growing trend


    For the Open Access Week 2015 I’ve decided to approach these problems with the example of De Gruyter and
    its open access imprint De Gruyter Open. The transition process from
    toll to open access in the case of the De Gruyter group is tangible and
    evident, and there is no doubt that the company sees open access as an
    increasingly important part of its portfolio.


    The DG Group currently has 544 fully open access journals (DG Open) and
    290 hybrid journals (DG). I say ‘hybrid’ because all of these 290
    journals are subscription based and offer an open access option to
    authors. An author with funding covering an Article Processing Charge
    can publish an open access article in any of the group’s journals and
    doesn’t have to pay more for publishing in the hybrid journals[1], which
    are in most cases older and more established.

    Double dipping? Not here.


    Is De Gruyter paid twice for publications in hybrid journals? No, and it
    probably never has been. At the moment, not many authors use the
    hybrid open access option at De Gruyter. Recently the company introduced
    an anti-double-dipping policy, under which the subscription price of a
    journal is lowered proportionally to the number of open access articles
    published in that venue. The policy applies to every journal that has at
    least 5% of its articles published in open access. But this policy is
    introduced ‘just in case’, to prevent possible future ‘double dipping’,
    since there are just a few hybrid journals at De Gruyter with more than
    2-3% of their articles in open access.




    And fully open access venues at De Gruyter Open are doing very well,
    indeed. The company published 11,115 papers in fully open access
    journals in 2014 versus 11,276 published under De Gruyter’s brand in
    hybrid journals (of which a small part was also open access). In 2015
    the company will probably publish even more articles in fully open
    access journals. According to the current estimates, 12,898 papers will
    be published in De Gruyter Open this year, compared to 12,673 in hybrid
    journals at De Gruyter. So it is very likely that the group will publish
    more open access than conventional articles in 2015.
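The claim that open access output will edge ahead of hybrid output follows directly from the figures quoted; a quick arithmetic check using only the numbers given in the post:

```python
# Quick check of the De Gruyter figures quoted above: fully open
# access vs hybrid article counts, and the OA share of the total.
counts = {
    2014: (11115, 11276),   # (fully OA, hybrid), actual
    2015: (12898, 12673),   # projected
}
for year, (oa, hybrid) in counts.items():
    print(f"{year}: OA share {oa / (oa + hybrid):.1%}")
# 2014: OA share 49.6%
# 2015: OA share 50.4%  (OA output edges ahead of hybrid)
```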


    Large volumes of published content make De Gruyter Open one of the
    top publishers of open access globally. According to the data reblogged
    this week by DOAJ, “traditional publisher De Gruyter has gone from no titles in DOAJ in 2014 to 3rd largest DOAJ publisher”, leaving Elsevier and Springer behind.


    Interestingly, a large number of DG Open’s journals are free of
    publication fees. Some of them are co-published by academic societies,
    which cover the costs of publishing with membership fees or public
    donations.


    [1] – Article Processing Charges at De Gruyter vary from 500 euro to
    1500 euro depending on the discipline, but there are no differences
    between fully open access and hybrid journals. Some journals waive APCs
    altogether. Information about Article Processing Charges is available on
    the website of every journal published by De Gruyter.


    This entry was posted on October 23, 2015 by Witold Kieńć.


    INVITATION TO WORKSHOP on "Conducting a Literature Search & Writing Review Paper" and "Improve Research Impact: Enhancing Research Visibility & Improving Citations"









    INVITATION TO WORKSHOPS BY CENTRE FOR RESEARCH SERVICES










    Dear Campus Community,


    Research Support Unit, Centre for Research Services (PPP) would like
    to invite you to participate in our workshops as follows:
      



    1. Conducting a Literature Search & Writing Review Paper
       Date: 4 & 5 November 2015
       Venue: Computer Lab, Level 2, Kompleks Pengurusan Penyelidikan & Inovasi, IPPP
       Fees: RM 400.00
       Registration deadline: 28 October 2015

    2. Improve Research Impact: Enhancing Research Visibility & Improving Citations
       Date: 18 & 19 November 2015
       Venue: Computer Lab, Level 2, Kompleks Pengurusan Penyelidikan & Inovasi, IPPP
       Fees: RM 400.00
       Registration deadline: 11 November 2015

    Kindly find the brochures of the workshops and the registration form
    in the attachment. Please submit the registration form and proof of
    payment before the deadline.



    Dr. Nader Ale Ebrahim is a visiting research fellow at the Centre for
    Research Services, IPPP, University of Malaya. He is the winner of the
    ‘Refer-a-Colleague Competition’ and has received prizes from renowned
    establishments such as Thomson Reuters. Nader is well known as the
    creator of the “Research Tools” Box and the developer of “Publication
    Marketing Tools”. He has so far conducted over 130 workshops within and
    outside of the University of Malaya.


    For further information please contact us at
    (03-7967 7812 / uspi@um.edu.my). 



    Thank you.

    Regards,

    ASSOC. PROF. DR. NGOH GEK CHENG


    Head of Research Support Unit,

    Centre for Research Services,

    Institute of Research Management & Monitoring (IPPP),
    Level 2, Kompleks Pengurusan Penyelidikan & Inovasi,
    University of Malaya.
     




    Research Tools: INVITATION TO WORKSHOP on "Optimize articles for search engine to improve research visibility"