Monday, 22 November 2021

Determining Your Research Impact



An impact factor is one measure of the relative importance of a journal, an individual publication, or a researcher within the literature.

Journal impact factors, citations to publications, and the h-index of researchers are all used to measure the importance and impact of research.

Informed and careful use of impact and citation data is essential. It is important to ensure the data is being used to compare like with like:

  • The number of times a paper is cited is not a measure of its actual quality.
  • Some tools that measure impact data do not incorporate books, so citations appearing in books are underrepresented.
  • Different disciplines have different publication and citation patterns. When making comparisons, ensure the data has been adjusted to account for these differences; cross-disciplinary comparisons of individual scholars' h-indexes are not valid.
  • A document's age influences the number of citations it has. Because it examines impact over time, the h-index favors established authors.
  • Review articles are cited more often than other article types and can skew results.
  • Self-citation may skew results.

Journal Impact

The journal impact factor is based on the average number of citations per paper published in that journal during the two preceding years.

The immediacy index is based on the number of times articles published in a given year were cited during that same year.

The Eigenfactor Score is based on the number of citations received by articles in a journal, weighted by the rank of the journals in which those citations appear.

The Article Influence Score measures the average influence of a journal's articles over the first five years after publication.

Acceptance and circulation rates can be useful metrics in determining the relative importance of particular journals.

CiteScore is based on the average citations received per document. It is calculated as the number of citations received by a journal in one year to documents published in the three previous years, divided by the number of documents indexed in Scopus that were published in those same three years.
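The arithmetic can be sketched in a few lines (a toy illustration with made-up numbers, not real Scopus data):

```python
def cite_score(citations_in_year, docs_prev_three_years):
    """CiteScore for year Y, as described above: citations received in Y
    to documents published in Y-1..Y-3, divided by the number of
    Scopus-indexed documents published in those same three years."""
    return citations_in_year / docs_prev_three_years

# Hypothetical journal: 600 citations in 2021 to 2018-2020 documents,
# and 240 documents indexed over 2018-2020.
print(cite_score(600, 240))  # 2.5
```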

Citation Analysis

Citation analysis is a way of measuring the relative importance or impact of an author, an article, or a publication by counting the number of times that author, article, or publication has been cited by other works.

Why is citation analysis important?  It provides answers to:

  • What are the best journals in my field?
  • How do I check who is citing my articles?
  • How many times have I been cited?
  • How do I know this article is important?
  • How can I compare the research impact between journals so I know which journal I should publish in?

The results of citation analysis will vary depending on the tool(s) used and the thoroughness of the search.

Why do different databases retrieve different results?

The citation data will relate only to articles indexed within the database. Variations occur because databases:

  • Index different publications
  • Cover different date ranges
  • Include poor-quality data (duplicate records, misspellings, incorrect citations, etc.)

The h-index

The h-index is based on the set of a researcher's cited papers and the number of citations those papers have received in other people's publications.

The h-index is the largest number of articles/books that a researcher has published (N) that have each been cited at least N times.

Example: If a researcher has 6 papers that have been cited 6 or more times, their h-index is 6.
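The rule above can be sketched in code (a minimal illustration, not tied to any particular database):

```python
def h_index(citations):
    """Largest h such that h papers have been cited at least h times each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# The example from the text: six papers, each cited six or more times.
print(h_index([9, 8, 7, 6, 6, 6]))  # prints 6
```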

Three resources include the citation data necessary for calculating an h-index in their respective databases.

  • Web of Science
  • Scopus (UNF does not subscribe)
  • Google Scholar

The h-index of an author will differ between these databases, since each calculates it using only its own indexed journal content.

Alternative Metrics

Alternative Metrics (Altmetrics) measure the impact of articles by counting mentions on social media sites and other web sources that are not considered in traditional bibliometrics such as citation counts and impact factors. Altmetrics measure the impact of articles outside the means of traditional publishing, including:

  • number of 'talkbacks' or amount of discussion an article has received in blogs and on Twitter
  • mentions on social networking sites such as Facebook and bookmarking sites
  • discussions on scholarly networking sites and repositories such as Mendeley

Additional Resources

Interfolio Assistance

Marianne Jaffee

Gordon Rakita

Journal Impact Factor

Citation Tools


Why publish articles in journals?



Publishing articles in journals may help:

  • Receiving comments and earning approval from a prestigious, professional editorial board
  • Reaching a wider audience and enhancing your research visibility
  • Gaining a concrete academic job qualification

How to choose a journal for publication?

Journals vary widely, so think about the following factors when choosing a journal for publishing:


Predatory publishers are publishers whose primary goal is financial gain. They rely on unethical business practices, demonstrate unusually high acceptance rates, and often list high-profile editorial board members who have not agreed to serve as reviewers for the publication.

Some journals appear to have been hijacked, meaning that their websites or branding have been co-opted by a predatory journal or publisher.

Check the List of Predatory Journals and the List of Hijacked Journals by Stop Predatory Journals, and think before you submit your work.




How to prepare and submit articles to a journal?

A journal's editorial policy, instructions to authors, and other pertinent information are very useful when preparing and submitting an article. Make sure you read and understand this information before preparing the final version of your article. Refer to the websites of individual publishers or journals for more details.

Bibliometrics & Research Impact

Author Identifiers & Profiles

If you need more information on setting up the different author profiles, please take a look at the Libraries' guide on Researcher Profiles.

HKU Scholars Hub: Institutional Repository




HKU Scholars Hub aims to enhance the visibility of HKU authors and their research.

Data Hub: HKU data repository




The University of Hong Kong Libraries provides a comprehensive repository for research data and other forms of scholarly output. DataHub is the cloud platform for storing, citing, sharing, and discovering research data and all scholarly outputs. It collects, preserves, and provides stable, long-term, global open access to a wide range of research data and scholarly outputs created by HKU researchers and RPG students in the course of their research and teaching.

To know more, check out the LibGuide on DataHub.

Open Access




Open Access brings you the following benefits:

  • More exposure for your work
  • Practitioners can apply your findings
  • Higher citation rates
  • Your research can influence policy
  • The public can access your findings
  • Compliance with grant rules
  • Taxpayers get value for money
  • Researchers in developing countries can see your work

To know more, check out the Open Access@HKUL guide.


Source: Jisc.

Research Impact Measurement: FAQs




1. What are citation metrics?
Citations are an indicator of an article’s worldwide influence and the means by which researchers acknowledge other researchers. Citation metrics are a quantitative way of measuring and ranking research impact based on citation counts. They measure productivity, influence, efficiency, relative impact and specialization, and can be applied to an individual, group, institution, subject area or geographic region.

Examples of citation metrics:

  • Total number of papers: the number of papers published by a researcher within a stated timeframe.
  • Total number of citations: the sum of the citation counts across all papers.
  • Hirsch's h-index: “a researcher has index h if h of his/her x papers have at least h citations each”

2. What does the h-index measure? Why are researchers so interested in it?
The h-index is a measure of an author's impact based on the citation rates of their published articles. It is calculated by establishing how many of an author's publications have at least that same number of citations, e.g. if an author has an h-index of 8, they have 8 publications that have each been cited at least 8 times.

3. How can I find my h-index?
The h-index can be generated in both the Web of Science and Scopus databases. The h-index may not be the same in both databases because different citation databases cover different publication sources as well as date ranges. Citation data obtained from a particular database is derived from journal titles that are indexed by that particular database.  

4. Why use citation metrics?
Citation metrics can be used as an indication of the importance and impact of an individual researcher or that of a research group, department or university, and their value to the wider research community. Applications for funding, research positions or promotion may require citation metric data. University rankings also take citation metrics into account.

5. Is citation metric data reliable?
Citation metrics have their limitations. Self-citations, differences in metrics depending on the data source, multi-author publications giving equal credit to all authors and certain metrics favouring experienced researchers over early career researchers are just some factors that affect the “reliability” of citation metrics.

6. Which databases can I use to search for citations and generate citation metrics?
The main resources for citation searching include Web of Science, Scopus and Google Scholar.

7. How can I find out how often my publications have been cited?
A number of databases, including Web of Science, Scopus and Google Scholar, include journal article citation information.

You can conduct a search for a published article in any of these databases and you will be able to view a list of all the articles in the database that have cited that published article. This is highly dependent on the journal coverage of that database.

8. Can I be notified each time my publication is cited?
You can set up a citation alert in Web of Science or Scopus so that you receive a notification each time your publication is cited by other sources that are added to the database.

9. What is a Journal Impact Factor?
The Journal Impact Factor (JIF) uses citation data to assess and track the impact of a journal in relation to other journals. The impact factor is recalculated each year: the number of citations received in that year is divided by the total number of articles published in the two previous years.
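The calculation above can be sketched as follows (illustrative figures only; real JIF values come from Journal Citation Reports):

```python
def journal_impact_factor(citations_this_year, citable_items_prev_two_years):
    """JIF for year Y, as described above: citations received in Y to items
    published in Y-1 and Y-2, divided by the number of citable items
    published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 360 citations in 2021 to articles from 2019-2020,
# a window in which it published 160 citable items.
print(journal_impact_factor(360, 160))  # 2.25
```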

10. How can I find the impact factor for a particular journal?
You can find out the Journal Impact Factor for a particular journal using Journal Citation Reports (JCR).

11. How can I find out which journals in my field have the highest impact factor?
You can find out the Journal Impact Factors for journals in a particular field using Journal Citation Reports (JCR).

12. How reliable are Journal Impact Factors?
The Journal Impact Factor (JIF) has been criticised for not accurately reflecting the value of the work published in journals. As the measure is based on the number of citation counts received by articles in a journal it cannot be used to compare journals across disciplines and is biased towards journals that contain more heavily cited publication types, such as review articles or methods papers. It is also open to editorial manipulations. The San Francisco Declaration on Research Assessment (DORA) makes recommendations for improving the way in which the quality of research output is evaluated, including the need to eliminate the use of journal-based metrics, such as the Journal Impact Factor.

13. Are there alternatives to the Journal Impact Factor?
Alternative metrics available to measure the impact of a journal include:

  • Eigenfactor score - This is similar to the Journal Impact Factor, but citations are weighted, with citations from highly ranked journals making a greater impact on the final Eigenfactor score than citations from journals with lower rankings. The Eigenfactor score is based on data from the Web of Science database.
  • CiteScore - similar to Journal Impact Factor, this looks at publications indexed in Scopus and provides the average number of citations that are received in the respective journals.
  • SCImago Journal Rank (SJR) - The SJR is very similar to the Eigenfactor score and also gives more weight to highly ranked journals. The SJR is based on data from the Scopus database.
  • SNIP (Source Normalized Impact Per Paper) - SNIP weights citations according to subject field, with higher value given to a citation if it is from a field where articles tend to be cited less frequently. SNIP is based on the Scopus database.

14. What are Altmetrics?
Altmetrics is an emerging field which aims to measure the impact of published research on the social web. This type of measure can supplement the information gained from traditional Citation metrics. Altmetrics can also be used to gauge the impact of publications that would not be included in traditional citation metrics, for example data sets, software, or presentations.

Altmetrics are based on data such as:

  • Blog cites
  • Twitter cites
  • Mendeley records
  • Online repository records
  • Article views and/or downloads
  • News or media mentions etc

Besides demonstrating impact, Altmetrics can be used to find collaborators, or to provide evidence of engagement with the content of a publication.

More information about Altmetrics can be found on our Altmetrics page.

15. What effect does open access publishing have on citation metrics?
Various studies have been carried out to determine whether open access publishing has an effect on the number of downloads or citations a piece of work receives. The results vary across disciplines. SPARC maintains a list of studies which addresses this question.

16. I would like to know more about predatory journals, publishers or conferences. Where can I find out more?
You can refer to our libguide on predatory publishing for more information about predatory publishing and predatory journals. Resources like Think Check Submit or Think Check Attend are also useful, as they provide checklists for authors to consult when they come across a possible predatory publisher or conference organiser. You can also contact us for more information and advice on the topic.

How to improve research visibility and impact: a Scopus workshop with USM


How to Improve Research Visibility and Impact: Session 5, Online Repository



Saturday, 20 November 2021

Research Visibility and Impact


Research Visibility and Impact

Author Impact

An author's impact on their field or discipline has traditionally been measured using the number of times they have published and the number of times their academic publications are cited by other researchers. Although the simplest way to demonstrate your impact is to create a list of your publications and the number of times they have been cited, numerous algorithms based on publication data have also been created. Below are some of the more common metrics and tools you can use to measure research impact.

Citation Counts

Simply put, citation counts are the number of times an article has been cited in other articles, books, or other sources. However, the exact number is often difficult to determine because different sources (such as databases) index article references differently. As a result, the counts will differ from source to source.


The h-index, or Hirsch index, measures the impact of a particular scientist rather than a journal. "It is defined as the highest number of publications of a scientist that received h or more citations each while the other publications have not more than h citations each." 1 For example, a scholar with an h-index of 5 had published 5 papers, each of which has been cited by others at least 5 times. The links below will take you to other areas within this guide which explain how to find an author's h-index using specific platforms. 

NOTE: An individual's h-index may be very different in different databases. This is because the databases index different journals and cover different years. For instance, Scopus only considers work from 1996 or later, while the Web of Science calculates an h-index using all years that an institution has subscribed to. (So a Web of Science h-index might look different when searched through different institutions.)  


A newer metric, proposed by Leo Egghe in 2006, is the g-index. The g-index is an alternative to the h-index, which does not average the numbers of citations: the h-index only requires a minimum of h citations for the least-cited article in the counted set, and thus ignores the citation counts of very highly cited papers. Roughly, the effect is that h is the number of papers of a quality threshold that rises as h rises; g allows citations from higher-cited papers to bolster lower-cited papers in meeting this threshold. Therefore, g is always at least h, and in most cases higher. However, unlike the h-index, the g-index saturates whenever the average number of citations for all published papers exceeds the total number of published papers. It is worth noting that the g-index is not as widely accepted as the h-index.
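To make the comparison concrete, here is a sketch of the g-index (the saturating variant described above, where g is capped at the number of papers):

```python
def g_index(citations):
    """Largest g such that the top g papers together have at least g**2
    citations. Iterating only over the papers themselves caps g at the
    paper count, i.e. the saturating variant described above."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    total, g = 0, 0
    for rank, count in enumerate(ranked, start=1):
        total += count  # cumulative citations of the top `rank` papers
        if total >= rank * rank:
            g = rank
    return g

# Citation counts [10, 8, 5, 4, 3]: the h-index is 4, but the top 5 papers
# have 30 citations in total and 30 >= 25, so the g-index is 5 (g >= h).
print(g_index([10, 8, 5, 4, 3]))  # 5
```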

Find your h-index in Web of Science

The Citation Report feature displays bar charts for the number of items published each year and the number of citations received each year, along with counts for the average number of citations per item, the average number of citations per year per publication, and the h-index.

For more information on building a Citation Report, read the Citation Report help page.

How to Improve Research Visibility and Impact: Session 4, Online CV



Sunday, 14 November 2021

Maximized Research Impact: Effective Strategies for Increasing Citations



  • Nader Ale Ebrahim
  • Hossein Gholizadeh
  • Artur Lugmayr


The highly competitive environment has forced higher-education authorities to set strategies for improving university rankings. Citations of published papers are among the most widely used inputs for measuring national and global university rankings (accounting for 20% of the QS ranking and 30% of the THE ranking, among others). On the one hand, therefore, improving citation impact is one of university managers' strategies; on the other, researchers themselves are looking for techniques to increase their citation records. By reviewing the relevant literature, this chapter covers 48 different strategies for maximizing research impact and visibility. The results show that some features of an article can help predict its number of views and citation counts. The findings presented in this chapter could be used by university authorities, authors, reviewers, and editors to maximize the impact of articles in the scientific community.



[94] A. M. Pienta, G. C. Alter, and J. A. Lyle, "The enduring value of social science research: the use and reuse of primary research data," 2010.
[95] M. C. Whitlock, M. A. McPeek, M. D. Rausher, L. Rieseberg, and A. J. Moore, "Data Archiving," American Naturalist, vol. 175, pp. E45-146, Feb 2010.
[96] H. A. Piwowar and T. J. Vision, "Data reuse and the open data citation advantage," Peerj, vol. 1, Oct 2013.
[97] M. J. McCabe, "Online Access and the Scientific Journal Market: An Economist’s Perspective," University of Michigan and SKEMA Business SchoolJune 2011.
[98] E. Garfield and R. K. Merton, "Perspective on Citation Analysis of Scientists," in Citation indexing: Its theory and application in science, technology, and humanities. vol. 8, ed: Wiley New York, 1979.
[99] N. Ale Ebrahim, "How to Promote Your Article," University of Malaya Research Bulletin, vol. 1, 23 June 2014.
[100] D. Sahu, "Open Access: Why India Should Brace it?," ed, 2005, pp. 1-49.
[101] Utrecht University. (2014, June 2015). Research Impact & Visibility: Researcher profiles. Available:
[102] A. Aghaei Chadegani, H. Salehi, M. M. Yunus, H. Farhadi, M. Fooladi, M. Farhadi, et al., "A Comparison between Two Main Academic Literature Collections: Web of Science and Scopus Databases," Asian Social Science, vol. 9, pp. 18-26, April 27 2013.
How to Cite
EBRAHIM, Nader Ale; GHOLIZADEH, Hossein; LUGMAYR, Artur. Maximized Research Impact: Effective Strategies for Increasing Citations. International SERIES on Information Systems and Management in Creative eMedia (CreMedia), n. 2017/1, p. 29-52, Dec. 2017. ISSN 2341-5576. Accessed: 14 Nov. 2021.
Factors affecting the frequency of citation of an article

Rafael Repiso
Alicia Moreno-Delgado
Ignacio Aguaded


The relevance of citations is clear, since they constitute a substantial part of most bibliometric indicators. The aims of the present paper are to identify several factors associated with obtaining citations, to explain them, and, finally, to offer authors a number of useful suggestions. The studies that have had the greatest influence on science are also those that are most frequently cited. The essential factor leading to a study being cited is that it makes a significant contribution to the advance of science; that is, the relevance of the research. But other essential dimensions exist: accessibility, dissemination, and scientific authority. Further factors allow the number of citations a document may receive to be predicted: the authors' prior production, the structural context of the work, scientific trends, the validity or obsolescence of results, the quality of formal aspects, the theoretical context of the study, and the type of work. Finally, some ways are suggested for authors to improve the citation of their works and thus contribute to the wider dissemination and development of science.
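The abstract above notes that citation counts are the raw material of most bibliometric indicators. As a concrete illustration, here is a minimal sketch of one widely used indicator, the h-index: the largest h such that h of an author's papers have at least h citations each. The function name and sample citation counts are illustrative only, not taken from the article:

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# An author whose papers have 10, 8, 5, 4 and 3 citations has h = 4:
# four papers are cited at least four times each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note how the indicator rewards sustained citation across many papers rather than a single highly cited one, which is why, as mentioned earlier in this post, it tends to favour established authors.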



Repiso, R., Moreno-Delgado, A., & Aguaded, I. (2020). Factors affecting the frequency of citation of an article. Iberoamerican Journal of Science Measurement and Communication, 1(1), 007.

