Monday, 26 September 2016

Visualizing Citation Cartels | The Scholarly Kitchen

 Source: https://scholarlykitchen.sspnet.org/2016/09/26/visualizing-citation-cartels/

Authority, Controversial Topics, Metrics and Analytics, Research

Visualizing Citation Cartels

A citation cartel or valid study?
By their very nature, citation cartels are difficult to detect.
Unlike self-citation, which can be spotted when there are high levels of
references to other papers published in the same journal, cartels work
by influencing incoming citations from other journals.


In 2012, I reported on the first case of a citation cartel involving four biomedical journals. Later that year, Thomson Reuters suspended three of the four titles from receiving an Impact Factor. In 2014, they suspended six business journals for similar behavior.


This year, Thomson Reuters suspended Applied Clinical Informatics (ACI) for its role in distorting the citation performance of Methods of Information in Medicine (MIM). Both journals are published by Schattauer Publishers in Germany. According to the notice, 39% of 2015 citations to MIM came from ACI.
More importantly, 86% of these citations were directed to the previous
two years of publication — the years that count toward the journal’s
Impact Factor.
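The pattern behind those numbers reduces to two simple ratios: what share of the citations landing in the Impact Factor window comes from a single journal? As an illustration only (the record format and figures below are hypothetical, not Thomson Reuters' actual data model), a minimal sketch in Python:

```python
from collections import Counter

def stacking_shares(citations, target_journal, if_window):
    """Share of Impact-Factor-window citations to `target_journal`
    contributed by each citing journal.

    `citations` is a list of hypothetical records:
    (citing_journal, cited_journal, cited_publication_year).
    `if_window` is the set of publication years that count toward
    the Impact Factor (the previous two years).
    """
    in_window = [c for c in citations
                 if c[1] == target_journal and c[2] in if_window]
    if not in_window:
        return {}
    by_source = Counter(c[0] for c in in_window)
    return {src: n / len(in_window) for src, n in by_source.items()}

# Toy 2015 citation records to MIM, mimicking the reported pattern:
records = ([("ACI", "MIM", 2013)] * 39        # inside the 2013-2014 window
           + [("Other", "MIM", 2014)] * 61
           + [("Other", "MIM", 2010)] * 25)   # outside the window; ignored
shares = stacking_shares(records, "MIM", {2013, 2014})
# shares["ACI"] == 0.39 -- a single journal contributing this large a
# share of window citations is the kind of anomaly that triggers review.
```

A concentration like this says nothing about intent on its own; it only marks the journal pair as worth a closer look.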


Thomson Reuters purposefully avoids using the term “citation cartel,”
which implies a willful attempt to game the system, and uses the more
ambiguous term “citation stacking” to describe the pattern itself.
Ultimately, we never know the intent of the authors who created the
citation pattern in the first place, only that it can distort the
ranking of a journal within its field. This is what Thomson Reuters
wants to avoid.


Schattauer Publishers appealed the suspension,
offering to exclude the offending papers from their Impact Factor
calculation as a concession. Their appeal was denied. Offering some
consolation to its readers, the publisher made all 2015 ACI papers freely available. It has also offered all ACI authors one free open access publication in 2016.


To better understand the citation pattern that led to ACI's suspension, I used VOS Viewer to create a visualization of the citation network of papers published in ACI (blue) and MIM (red) from 2013 through 2015. Each paper is labeled with its first author and year of publication, and links to the papers it cites.


Citation network of papers published in Applied Clinical Informatics (blue) and Methods of Information in Medicine (red), 2013–2015.
From the graph, four papers appear to strongly influence the flow of citations in this network: two MIM papers by Lehmann (red) and two ACI papers by Haux (blue). Each of these papers cites a large number of papers published in the other journal within the previous two years. Does this alone imply an intent to distort the journals' Impact Factors? We need more information.
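On a citation graph, papers like these stand out by their out-degree into the partner journal's Impact Factor window. A minimal sketch of that filter (the edge format and threshold here are illustrative assumptions, not how VOS Viewer or Thomson Reuters actually operate):

```python
from collections import defaultdict

def heavy_cross_citers(edges, partner_journal, if_window, threshold):
    """Flag citing papers that reference at least `threshold` papers
    published in `partner_journal` during the Impact Factor window.

    `edges` are (citing_paper, cited_paper) pairs, where each paper is
    a hypothetical (journal, year, label) tuple.
    """
    counts = defaultdict(int)
    for citing, (journal, year, _label) in edges:
        if journal == partner_journal and year in if_window:
            counts[citing] += 1
    return {paper for paper, n in counts.items() if n >= threshold}

# Toy network: one MIM paper citing 30 recent ACI papers, plus ordinary
# papers citing one or two.
review_paper = ("MIM", 2014, "review")
edges = [(review_paper, ("ACI", 2013, f"p{i}")) for i in range(30)]
edges += [(("MIM", 2014, "other"), ("ACI", 2013, "p0")),
          (("MIM", 2014, "other"), ("ACI", 2010, "p-old"))]  # outside window
flagged = heavy_cross_citers(edges, "ACI", {2012, 2013}, threshold=10)
# flagged == {("MIM", 2014, "review")}
```

The flagged nodes are exactly the ones that dominate the visualization: a handful of papers funneling many citations into the window that counts.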


Both Lehmann and Haux sit on the editorial boards of both journals: Lehmann is the Editor-in-Chief of ACI and also sits on the editorial board of MIM, while Haux is the Senior Consulting Editor of MIM and also sits on the ACI editorial board. This illustrates the close relationship between the two editors, but it is still not enough to imply intent. We need to look at the four offending papers:


  • The 2014 Lehmann paper (coauthored
    by Haux) includes the following methods statement in its abstract:
    “Retrospective, prolective observational study on recent publications of
    ACI and MIM. All publications of the years 2012 and 2013 from these journals were indexed and analysed.”
  • Similarly, the 2014 Haux paper (coauthored
    by Lehmann) includes this methods statement: “Retrospective, prolective
    observational study on recent publications of ACI and MIM. All publications of the years 2012 and 2013 were indexed and analyzed.”
  • The 2015 Lehmann paper states: “We conducted a retrospective observational study and reviewed all articles published in ACI during the calendar year 2014 (Volume 5)…”, and lastly,
  • The 2015 Haux paper states: “We conducted a retrospective, observational study reviewing MIM articles published during 2014 (N=61) and analyzing reference lists of ACI articles from 2014 (N=70).”
What is similar among these four papers written by ACI and MIM
editors is that they are analyzing papers published in their own
journals within the time frame that affects the calculation of their
Impact Factors. Again, this alone does not imply an intent to game their
Impact Factor. Indeed, the publisher explained that
citation stacking was “an unintentional consequence of efforts to
analyze the effects of bridging between theory and practice.”


I can’t dispute what the editors and publisher state was their
intent. However, what is uniformly odd about these papers is that they
cite their dataset as if each datapoint (paper) required a reference.


Why is this odd? If I conducted a brief analysis and summary of
all papers published in a journal, would I need to cite each paper
individually, or merely state in the methods section that my dataset
consists of all 70 research papers published in Journal A in years X
and Y? While ACI and MIM are relatively small journals, if this approach were used to analyze papers published in, say, PNAS, the reference section would run to more than 8,000 citations. Similarly, a meta-analysis of publications in PLOS ONE
would require citing nearly 60,000 papers. Clearly, there is something
about the context of paper-as-datapoint that distinguishes it from
paper-as-reference.


One could play devil’s advocate by assuming that it is normal
referencing behavior in the field of medical informatics to cite one’s
data points, even if they are papers, and unfortunately we’ve seen this
pattern before. In 2012, I took the editor of another medical
informatics journal to task for a similar self-referencing study. The editor conceded by removing all data points from his reference list, acknowledging that this was a “minor error” in a correction statement.
Citing papers-as-datapoints, as in the cases of Lehmann and Haux, is not
standard citation practice. The editors should have known this.


If it was not the intention of the editors to influence their
citation performance, there were other options open to them at the time
of authorship:


  1. They could have simply described their dataset without citing each paper.
  2. If citing each paper was important to the context of their paper,
    they could have worked from a group of papers published outside the
    Impact Factor window. Or,
  3. They could have listed their papers in a footnote, appendix, or provided simple online links instead of formal references.
Suspension from receiving a Journal Impact Factor can be a serious
blow to the ability of a journal to attract future manuscripts. The
editors apologized for their actions in an editorial published soon after ACI's suspension. In the future, they will refrain from publishing these kinds of papers or will place such references in an appendix.


Thanks to Ludo Waltman for his assistance with VOS Viewer.




Measuring Scientific Impact Beyond Citation Counts



 Source: http://www.dlib.org/dlib/september16/patton/09patton.html

"Measuring Scientific Impact Beyond Citation Counts"

Robert M. Patton, Christopher G. Stahl and Jack C. Wells have published "Measuring Scientific Impact Beyond Citation Counts" in D-Lib Magazine.

Here's an excerpt:

The measurement of scientific progress remains a significant
challenge exacerbated by the use of multiple different types of metrics
that are often incorrectly used, overused, or even explicitly abused.
Several metrics such as h-index or journal impact factor (JIF) are often
used as a means to assess whether an author, article, or journal
creates an "impact" on science. Unfortunately, external forces can be
used to manipulate these metrics thereby diluting the value of their
intended, original purpose. This work highlights these issues and the
need to more clearly define "impact" as well as emphasize the need for
better metrics that leverage full content analysis of publications.


Tuesday, 13 September 2016

Conducting a Literature Search & Writing Review Paper

Contribute to Wikipedia: An approach to Increase Research Visibility on...

Visibility and Citation Impact | Ebrahim | International Education Studies

 Source: http://dx.doi.org/10.5539/ies.v7n4p120

Visibility and Citation Impact

Nader Ale Ebrahim, Hadi Salehi, Mohamed Amin Embi, Farid Habibi Tanha, Hossein Gholizadeh, Seyed Mohammad Motahar


Abstract



The number of publications is the first criterion for assessing
a researcher's output. However, the main measure of author
productivity is the number of citations, and citations are typically
related to a paper's visibility. In this paper, the relationship
between article visibility and the number of citations is investigated. A
case study of two researchers who use publication marketing tools
confirmed that article visibility greatly improves citation
impact. Some strategies for making publications available to a larger
audience are presented at the end of the paper.






DOI: http://dx.doi.org/10.5539/ies.v7n4p120








International Education Studies ISSN 1913-9020 (Print), ISSN 1913-9039 (Online)

Copyright © Canadian Center of Science and Education



La participación femenina en publicaciones colombianas de economía y administración indexadas en Scopus (1974 – junio de 2014) | Lis Gutiérrez | Revista Facultad de Ciencias Económicas

Source: http://revistas.unimilitar.edu.co/index.php/rfce/article/view/2219

La participación femenina en publicaciones colombianas de economía y administración indexadas en Scopus (1974 – junio de 2014)

Jenny Paola Lis Gutiérrez, Clorith Angélica Bahos Olivera


Abstract



Based on the construction of (i) gender indicators (horizontal
distribution, vertical distribution, femininity index, masculinity
index, Duncan index, segregation index, contribution-to-sexism index,
and interaction index), (ii) a descriptive and evaluative bibliometric
analysis segmented by author sex, and (iii) thematic and statistical
cartographic representations, this study sought to establish whether a
gender gap exists in Colombian academic output in Economics and
Business Administration, despite the increase in the number of female
authors in the economic sciences in recent years. The results indicate
that articles with women as lead authors did not exceed 27%, and with
women as co-authors only 24%. In turn, the best indicators of citation
recurrence (the H, R, and A indices) are obtained by articles written
by men without female co-authorship. Finally, the gender indicators
calculated with respect to the number of authors confirm the gender gap
and the masculinization of scientific publishing in the economic
sciences; however, there is no evidence of segregation in the three
areas analyzed.


Keywords



Bibliometrics; scientometrics; economics; business administration; bibliometric indices; gender indicators; gender gap; Scopus



DOI: http://dx.doi.org/10.18359/rfce.2219

















Copyright (c) 2016 Revista Facultad de Ciencias Económicas



This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.









ISSN 0121-6805

Online ISSN 1909-7719












Monday, 12 September 2016

Conducting a Literature Search & Writing Review Paper - October 2016




H-index, Citation analysis, Bibliometrics, Impact factor, Performance evaluation, Relations between citations and references

Source: Ale Ebrahim, Nader, et al. "Does a Long Reference List Guarantee More Citations? Analysis of Malaysian Highly Cited and Review Papers." International Journal of Management Science and Business Administration 1.3 (2015): 6-15. http://dx.doi.org/10.18775/ijmsba.1849-5664-5419.2014.13.1001

















Does a Long Reference List Guarantee More Citations? Analysis of Malaysian Highly Cited and Review Papers




International Journal of Management Science and Business Administration

Volume 1, Issue 3, February 2015, Pages 6-16

Nader Ale Ebrahim *1, H. Ebrahimian 2, Maryam Mousavi 3, Farzad Tahriri 3
1 Research Support Unit, Centre of Research Services, Institute of Research Management and Monitoring (IPPP), University of Malaya, Malaysia
2 Institute of Mathematical Sciences, Faculty of Science, University of Malaya, Malaysia
3 Centre for Product Design and Manufacturing, Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, 50603, Kuala Lumpur, Malaysia
* Corresponding author (e-mail): aleebrahim@um.edu.my
 Abstract: Earlier publications have shown that the number of references, as well as the number of received citations, is field-dependent. Consequently, a long reference list may lead to more citations. The purpose of this article is to study the concrete relationship between the number of references and citation counts, for the concrete case of Malaysian highly cited papers and Malaysian review papers, where a Malaysian paper is one with at least one Malaysian affiliation. A total of 2,466 papers, comprising two sets of 1,966 review papers and 500 highly cited articles, are studied. The statistical analysis shows that an increase in the number of references leads to a slight, but not statistically significant, increase in the number of citations. Therefore, a researcher should not try to increase the number of received citations by artificially increasing the number of references.


 Key words: H-index, Citation analysis, Bibliometrics, Impact factor, Performance evaluation, Relations between citations and references




 1. Introduction

 Researchers use citation tracking to find the most influential articles on a particular topic and to see how often their own published papers are cited (Bakkalbasi et al. 2006). Universities, in turn, care about citations because of their influence on university rankings (Ale Ebrahim et al. 2013, Ioannidis 2010, Bornmann, Leydesdorff, and Wang 2014). A citation count is the number of times a research work, such as a journal article, is cited by other works. Citations per paper meaningfully influence a number of metrics, including total citation counts, citation speed, the ratio of external to internal cites, diffusion scores and the h-index (Carley, Porter, and Youtie 2013). Citation counts are still commonly used as a measure of the quality and reputation of research papers (Abt and Garfield 2002); the number of citations an article receives measures its impact on a specific field (Lai, Darius, and Lerut 2012). Citation analysis is one of the most important tools for evaluating research performance (Bornmann et al. 2012), and citation indicators matter to scientists and universities all over the world (Farhadi, Salehi, Yunus, et al. 2013).
The relationship between the number of references and the number of citations a paper receives was investigated as early as 1965 (Uzun 2006, de Solla Price 1965). A long reference list at the end of a research paper may be the key to ensuring that it is well cited (Corbyn 2010, Ball 2008); citation counts are correlated with reference frequencies (Abt and Garfield 2002). Webster, Jonason, and Schember (2009) raised the question "Does the number of references an article contains predict its citation count?" and found that reference counts explained 19% of the variance in citation counts. Lancho-Barrantes, Guerrero-Bote, and Moya-Anegón (2010) found that not only the number but also the citation impact of the cited references correlates with the citation counts of a paper: the higher the impact of the cited references, the higher the later impact of the citing paper (Bornmann et al. 2012). Review articles are usually highly cited compared with other types of papers (Meho 2007).


Review papers synthesize the existing knowledge in a given field and are more likely to be cited (Alimohammadi and Sajjadi 2009). Several bibliometric studies have highlighted that citation counts are a function of many factors besides scientific quality (Bornmann et al. 2012): the length of the paper (Abt and Garfield 2002), its visibility (Ale Ebrahim et al. 2014), optimizing it for academic search engines (Beel, Gipp, and Wilde 2010), adding the name of the study to the title of all publications (Sarli and Holmes 2011), publishing in a journal with a higher impact factor (Vanclay 2013), internet usage (Farhadi, Salehi, Embi, et al. 2013), gross domestic product (GDP) (Gholizadeh et al. 2014), the number of authors (Krause 2009), self-archiving (Gargouri et al. 2010), publishing in an open access journal (Swan 2010), collaborating with international authors (Pislyakov and Shukshina 2012), writing a paper with a Nobel laureate (Ball 2011) and many others (Ale Ebrahim et al. 2013), including writing a review paper (Vanclay 2013) and using more references (Corbyn 2010). In this study, the relationship between the number of references and citation counts is examined. Webster, Jonason, and Schember (2009) noted that "On average, review articles actually showed less of the relationship than standard articles" (Corbyn 2010), so both review and standard articles were investigated here: 2,466 articles, consisting of 1,966 Malaysian review papers and 500 highly cited papers, were selected to examine the relationship between the number of references and citation counts.


2. Materials and methods

 All data were obtained through the Web of Science online academic database provided by Thomson Scientific. This database includes the information needed to examine the relationship between reference and citation counts for every review and highly cited paper published in Malaysia from 1980 to October 2013. The Science Citation Index Expanded, Social Sciences Citation Index and Arts & Humanities Citation Index were searched for reviews and highly cited papers. For each paper, all bibliometric data were collected, especially the number of references and the number of times the paper was cited between the year of publication and 2013. Two sample sets were selected:

1. The first sample consisted of 1,966 review papers in all disciplines from Malaysia, according to the Web of Knowledge classification system. Citation statistics produced over a time frame shorter than three years may not be sufficiently stable (Adams 2005, Uzun 2006), because papers appearing in the Web of Science databases in the last few years have not had enough time to accumulate a stable number of citations (Webster, Jonason, and Schember 2009). The time span was therefore limited to 1980 through November 2010, yielding a subsample of 721 publications (37% of the original sample). Publications with zero citations were removed. To select highly cited papers, a threshold of 10 citations per year was applied. The association between the number of references (independent variable) and citations per year (dependent variable) of the highly cited review papers was investigated with linear and non-linear models.

2. The second sample comprised 500 highly cited publications from Malaysia. According to the Web of Science classification, the results were restricted to the "article" document type, excluding review articles, editorial material, conference papers and book reviews.


3. Results and discussion

 The two data sets, (1) the 1,966 review papers and (2) the 500 highly cited papers, were investigated separately. The results and discussion follow.


Outliers for sample one (1966 review papers)


Because of the effect of an article's age, the raw number of citations cannot serve as the criterion for a highly cited paper; citations per year were used instead. Papers cited at least 10 times per year are considered highly cited. Figure 3-1 shows the number of times cited per year for 660 review papers. A threshold was visually set at 50 citations per year: papers cited more than 50 times per year are called "extremely highly cited" and treated as outliers. Papers with more than 300 listed references were also treated as outliers (Figure 3-2).
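A minimal sketch of this outlier screen, assuming each paper has been reduced to a (citations-per-year, reference-count) pair; the thresholds (50 citations per year, 300 references) come from the text, while the data layout is an assumption for illustration:

```python
# Outlier screen for sample one, as described above: drop "extremely highly
# cited" papers (more than 50 citations per year) and papers listing more
# than 300 references. The pair layout is an assumption for illustration.
def drop_outliers(papers, cites_per_year_cap=50, refs_cap=300):
    """Keep (cites_per_year, n_references) pairs inside both thresholds."""
    return [
        (cpy, refs) for cpy, refs in papers
        if cpy <= cites_per_year_cap and refs <= refs_cap
    ]
```

For instance, `drop_outliers([(12, 40), (60, 40), (12, 350)])` keeps only `(12, 40)`: the second pair exceeds the citations-per-year cap and the third exceeds the reference cap.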


Figure 3-1 Number of times cited per year vs. number of review papers

Figure 3-2 Number of times cited per year vs. number of references in review papers
 Correlation analysis for sample one (1966 review papers)


The correlation between the variables was modeled with a linear regression model, y = αx + β, and an exponential (non-linear) model, y = αe^(βx). The goodness of fit of both models was then measured with Spearman's rho, Kendall's tau and the Pearson correlation coefficient. The results of the correlation analysis are summarized in Table 3-1. The association between the variables is graphically illustrated with scatter plots, with the trend drawn as solid lines. As Figures 3-3 and 3-4 show, although neither the linear nor the non-linear model fits significantly, the trends are positive, which supports the hypothesis that, for a given review paper, an increase in the number of references may result in an increase in the times cited per year.
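This kind of analysis can be reproduced in outline with SciPy: fit the linear model y = αx + β directly, fit the exponential model y = αe^(βx) by linear regression on log(y), and report Spearman's rho, Kendall's tau and Pearson's r. A sketch with synthetic data carrying a deliberately weak positive trend (the data and its parameters are invented, not the paper's):

```python
# Sketch of the correlation analysis: linear fit, exponential fit via
# log-transform, and three correlation coefficients. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
refs = rng.integers(10, 300, size=200)            # number of references (x)
cites = 5 + 0.05 * refs + rng.normal(0, 5, 200)   # cites/year (y), weak trend
cites = np.clip(cites, 0.1, None)                 # keep y positive for log fit

# Linear model y = a*x + b
lin = stats.linregress(refs, cites)

# Exponential model y = a*exp(b*x), fitted by least squares on log(y)
exp_fit = stats.linregress(refs, np.log(cites))
a, b = np.exp(exp_fit.intercept), exp_fit.slope

# Strength of the association
rho, _ = stats.spearmanr(refs, cites)
tau, _ = stats.kendalltau(refs, cites)
r, p = stats.pearsonr(refs, cites)
print(f"linear slope={lin.slope:.3f}, rho={rho:.2f}, tau={tau:.2f}, r={r:.2f}")
```

As in the paper, a weak positive slope with modest correlation coefficients would indicate a positive but not necessarily significant trend; significance would be judged from the p-values the same SciPy calls return.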
Table 3-1 The results of the correlation analysis of highly cited review papers

Figure 3-3 Relationship between number of references and citation counts in review papers (linear model)

Figure 3-4 Relationship between number of references and citation counts in review papers (exponential model)
Outlier detection for sample two (500 highly cited papers)

Papers cited at least 10 times per year are considered highly cited. Papers cited more than 100 times per year are considered extremely highly cited and treated as outliers. Figures 3-5 and 3-6 show the raw and filtered data, respectively.


Figure 3-5 Raw data: number of times cited per year vs. number of references in 500 highly cited papers

Figure 3-6 Filtered data: number of times cited per year vs. number of references in 500 highly cited papers
Correlation analysis for sample two (500 highly cited papers)

The association between the number of references (independent variable) and citations per year (dependent variable) of the 500 highly cited papers was investigated with linear and non-linear correlation analysis. The correlation was modeled with a linear regression model, y = αx + β, and an exponential (non-linear) model, y = αe^(βx). The goodness of fit was then measured with Spearman's rho, Kendall's tau and the Pearson correlation coefficient. The results of the correlation analysis are summarized in Table 3-2.
Table 3-2 The results of the correlation analysis of the 500 highly cited papers
The association between the variables is graphically illustrated with scatter plots, with the trend shown by solid lines. As Figures 3-7 and 3-8 show, although neither the linear nor the non-linear model fits significantly, the positive correlation coefficients still suggest a positive trend between the number of references and the number of times cited per year.

Figure 3-7 Relationship between number of references and citation counts in 500 highly cited papers (linear model)

Figure 3-8 Relationship between number of references and citation counts in 500 highly cited papers (exponential model)

4. Conclusion 

This study shows that, since the trend between citation count and number of references is not statistically significant, we cannot conclude that there is a significant association between the citation counts of Malaysian review papers in the given period and the number of references they contain. The correlation coefficient (r = 0.152, based on the population of 721 articles) is not statistically significant. Malaysian review papers receive more citations than other types of papers, and the number of references in an article has less impact on its citations than the article being a review paper. As this study examined only Malaysian review papers and 500 highly cited articles, it would be necessary to conduct a similar study in other countries and with other types of papers, and to examine whether the relationship investigated here is significantly correlated for those papers. This research considered the general definition of citations; future studies may differentiate between "perfunctory citations" and "organic citations", which Tang and Safer (2008) define, respectively, as references that occur only once and in the introduction, and references cited for "conceptual ideas" or "methodology and data" reasons.

Acknowledgement

Sincere thanks to Dr. Bojan Obrenović and the International Journal of Management Science and Business Administration's board members for their useful advice.

References
  • Abt, Helmut A., and Eugene Garfield. 2002. “Is the relationship
    between numbers of references and paper lengths the same for all
    sciences?” Journal of the American Society for Information Science and
    Technology 53 (13):1106-1112. doi: 10.1002/asi.10151.
  • Adams, Jonathan. 2005. “Early citation counts correlate with
    accumulated impact.” Scientometrics 63 (3):567-581. doi:
    10.1007/s11192-005-0228-9.
  • Ale Ebrahim, Nader, Hadi Salehi, Mohamed Amin Embi, Farid Habibi
    Tanha, Hossein Gholizadeh, and Seyed Mohammad Motahar. 2014. “Visibility
    and Citation Impact.” International Education Studies 7 (4):120-125.
    doi: 10.5539/ies.v7n4p120.
  • Ale Ebrahim, Nader, Hadi Salehi, Mohamed Amin Embi, Farid Habibi
    Tanha, Hossein Gholizadeh, Seyed Mohammad Motahar, and Ali Ordi. 2013.
    “Effective Strategies for Increasing Citation Frequency.” International
    Education Studies 6 (11):93-99. doi: 10.5539/ies.v6n11p93.
  • Alimohammadi, Dariush, and Mahshid Sajjadi. 2009. “Correlation between references and citations.” Webology 6 (2):a71.
  • Bakkalbasi, Nisa, Kathleen Bauer, Janis Glover, and Lei Wang. 2006.
    “Three options for citation tracking: Google Scholar, Scopus and Web of
    Science.” Biomedical Digital Libraries 3 (1):7. doi:
    10.1186/1742-5581-3-7.
  • Ball, Philip. 2008. “A longer paper gathers more citations.” Nature 455 (7211):274-275. doi: 10.1038/455274a.
  • Ball, Philip. 2011. “Are scientific reputations boosted artificially?” Nature, 6 May.
  • Beel, Jöran, Bela Gipp, and Erik Wilde. 2010. “Academic Search
    Engine Optimization (ASEO).” Journal of Scholarly Publishing 41
    (2):176-190. doi: 10.3138/jsp.41.2.176.
  • Bornmann, L., L. Leydesdorff, and J. Wang. 2014. “How to improve the
    prediction based on citation impact percentiles for years shortly after
    the publication date?” Journal of Informetrics 8 (1):175-180. doi:
    10.1016/j.joi.2013.11.005.
  • Bornmann, Lutz, Hermann Schier, Werner Marx, and Hans-Dieter Daniel.
    2012. “What factors determine citation counts of publications in
    chemistry besides their quality?” Journal of Informetrics 6 (1):11-18.
    doi: http://dx.doi.org/10.1016/j.joi.2011.08.004.
  • Carley, S., A. L. Porter, and J. Youtie. 2013. “Toward a more
    precise definition of self-citation.” Scientometrics 94 (2):777-780.
    doi: 10.1007/s11192-012-0745-2.
  • Corbyn, Zoë. 2010. “An easy way to boost a paper’s citations.” Nature 406. doi: 10.1038/news.2010.406
  • de Solla Price, Derek J. 1965. “Networks of Scientific Papers.” Science 149 (3683):510-515. doi: 10.1126/science.149.3683.510.
  • Farhadi, Hadi, Hadi Salehi, Melor Md Yunus, Arezoo Aghaei Chadegani,
    Maryam Farhadi, Masood Fooladi, and Nader Ale Ebrahim. 2013. “Does it
    Matter Which Citation Tool is Used to Compare the h-index of a Group of
    Highly Cited Researchers?” Australian Journal of Basic and Applied
    Sciences 7 (4):198-202. doi: arXiv:1306.0727.
  • Farhadi, Maryam, Hadi Salehi, Mohamed Amin Embi, Masood Fooladi,
    Hadi Farhadi, Arezoo Aghaei Chadegani, and Nader Ale Ebrahim. 2013.
    “Contribution of Information and Communication Technology (ICT) in
    Country’S H-Index.”  Journal of Theoretical and Applied Information
    Technology 57 (1):122-127. doi: 10.5281/zenodo.7715.
  • Gargouri, Yassine, Chawki Hajjem, Vincent Larivière, Yves Gingras,
    Les Carr, Tim Brody, and Stevan Harnad. 2010. “Self-Selected or
    Mandated, Open Access Increases Citation Impact for Higher Quality
    Research.” PLoS ONE 5 (10):e13636. doi: 10.1371/journal.pone.0013636.
  • Gholizadeh, Hossein, Hadi Salehi, Mohamed Amin Embi, Mahmoud Danaee,
    Seyed Mohammad Motahar, Nader Ale Ebrahim, Farid Habibi Tanha, and Noor
    Azuan Abu Osman. 2014. “Relationship among Economic Growth, Internet
    Usage and Publication Productivity: Comparison among ASEAN and World’s
    Best Countries.” Modern Applied Science 8 (2):160-170. doi:
    10.5539/mas.v8n2p160.
  • Ioannidis, J. P. A. 2010. “Is there a glass ceiling for highly cited
    scientists at the top of research universities?” Faseb Journal 24
    (12):4635-4638. doi: 10.1096/fj.10-162974.
  • Krause, Kate. 2009. Increasing your Article’s Citation Rates. Open Access Week. Accessed 28 May 2013.
  • Lai, Quirino, Tom Darius, and Jan Lerut. 2012. “The one hundred most
    frequently cited articles in the field of clinical liver
    transplantation.” Transplant International 25 (6):e76-e77.
  • Lancho-Barrantes, Bárbara S., Vicente P. Guerrero-Bote, and Félix
    Moya-Anegón. 2010. “What lies behind the averages and significance of
    citation indicators in different disciplines?” Journal of Information
    Science 36 (3):371-382. doi: 10.1177/0165551510366077.
  • Meho, Lokman I. 2007. “The rise and rise of citation analysis.” Physics World 20:32–36.
  • Pislyakov, Vladimir, and Elena Shukshina. 2012. “Measuring
    Excellence in Russia: Highly Cited Papers, Leading Institutions,
    Patterns of National and International Collaboration.” Proceedings of
    STI 2012, Montréal.
  • Sarli, Cathy, and Kristi Holmes. 2011. “Strategies for Enhancing the
    Impact of Research.” Washington University School of Medicine, Accessed
    9 May. https://becker.wustl.edu/impact-assessment/strategies.
  • Swan, Alma. 2010.
  • Tang, Rong, and Martin A. Safer. 2008. “Author-rated importance of
    cited references in biology and psychology publications.” Journal of
    Documentation 64 (2):246-272. doi: 10.1108/00220410810858047.
  • Uzun, Ali. 2006. “Statistical relationship of some basic
    bibliometric indicators in scientometrics research.” International
    Workshop on Webometrics, Informetrics and Scientometrics & Seventh
    COLLNET Meeting, Nancy (France), May 10 – 12, 2006.
  • Vanclay, Jerome K. 2013. “Factors affecting citation rates in
    environmental science.” Journal of Informetrics 7 (2):265-271. doi: http://dx.doi.org/10.1016/j.joi.2012.11.009.
  • Webster, Gregory D, Peter K Jonason, and Tatiana Orozco Schember.
    2009. “Hot topics and popular papers in evolutionary psychology:
    Analyses of title words and citation counts in Evolution and Human
    Behavior, 1979–2008.” Evolutionary Psychology 7 (3):348-362

