Wednesday, 22 March 2023

The top list of academic research databases

 Source: https://paperpile.com/g/academic-research-databases/


Whether you are writing a thesis, dissertation, or research paper, surveying prior literature and research findings is a key task. More likely than not, you will be looking for trusted resources, ideally peer-reviewed research articles. Academic research databases make it easy to locate the literature you are looking for. We have compiled a list of trusted academic resources to help you get started with your research:

1. Scopus

Scopus is one of the two big commercial bibliographic databases that cover scholarly literature from almost any discipline. Besides searching for research articles, Scopus also provides academic journal rankings, author profiles, and an h-index calculator.

  • Coverage: approx. 71 million items
  • References: 1.4 billion
  • Discipline: Multidisciplinary
  • Access options: Limited free preview, full access by institutional subscription only
  • Provider: Elsevier
Scopus: a multidisciplinary bibliographic database that covers 71+ million scholarly items

2. Web of Science

Web of Science, also known as Web of Knowledge, is the second big bibliographic database. Academic institutions usually provide free access to either Web of Science or Scopus on their campus network.

  • Coverage: approx. 100 million items
  • References: 1.4 billion
  • Discipline: Multidisciplinary
  • Access options: institutional subscription only
  • Provider: Clarivate (formerly Thomson Reuters)
Web of Science: 100+ million scientific articles

3. PubMed

PubMed is the number one resource for anyone looking for literature in medicine or the biological sciences. PubMed stores abstracts and bibliographic details of more than 30 million papers and provides full-text links to the publisher sites or to free PDFs on PubMed Central (PMC).

  • Coverage: approx. 30 million items
  • References: NA
  • Discipline: Medicine, Biological Sciences
  • Access options: free
  • Provider: NIH
The new PubMed Labs interface: a glimpse into the future of the redesigned PubMed search interface
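PubMed can also be queried programmatically through NCBI's public E-utilities API. As a hedged sketch (the `esearch` endpoint and JSON response layout follow NCBI's E-utilities documentation; the query term and sample response values are illustrative only, so verify against the current docs before relying on them):

```python
# Sketch: querying PubMed via the NCBI E-utilities esearch endpoint.
# Endpoint URL and JSON layout are based on NCBI's public E-utilities docs.
from urllib.parse import urlencode

EUTILS_BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_pubmed_search_url(term: str, retmax: int = 20) -> str:
    """Build an esearch URL that returns matching PubMed IDs (PMIDs) as JSON."""
    params = {"db": "pubmed", "term": term, "retmax": retmax, "retmode": "json"}
    return f"{EUTILS_BASE}?{urlencode(params)}"

def extract_pmids(response: dict) -> list[str]:
    """Pull the PMID list out of an esearch JSON response."""
    return response.get("esearchresult", {}).get("idlist", [])

url = build_pubmed_search_url("machine learning[Title]")
# Shape of a response (values here are made up for illustration):
sample = {"esearchresult": {"count": "2", "idlist": ["12345678", "23456789"]}}
print(extract_pmids(sample))
```

Fetching the URL with any HTTP client yields PMIDs that can then be passed to the companion `efetch` endpoint for full records.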

4. ERIC

For education sciences, ERIC is the number one destination. ERIC stands for Education Resources Information Center, and is a database that specifically hosts education-related literature.

  • Coverage: approx. 1.3 million items
  • References: NA
  • Discipline: Education science
  • Access options: free
  • Provider: U.S. Department of Education
ERIC: there is no better source for education-related literature

5. IEEE Xplore

IEEE Xplore is the leading academic database in the field of engineering and computer science. Not only journal articles, but also conference papers, standards, and books can be searched.

  • Coverage: approx. 5 million items
  • References: NA
  • Discipline: Engineering
  • Access options: free
  • Provider: IEEE (Institute of Electrical and Electronics Engineers)
IEEE Xplore: an academic database specifically for engineering and computer science

6. ScienceDirect

ScienceDirect is the gateway to the millions of academic articles published by Elsevier: 2,500 journals and more than 40,000 e-books can be searched via a single interface.

  • Coverage: approx. 16 million items
  • References: NA
  • Discipline: Multidisciplinary
  • Access options: free
  • Provider: Elsevier
ScienceDirect: a multidisciplinary database featuring articles from one of the largest academic publishers in the world

7. Directory of Open Access Journals (DOAJ)

The DOAJ is a special academic database, since all the articles it indexes are open access and can be accessed free of charge.

  • Coverage: approx. 4.3 million items
  • References: NA
  • Discipline: Multidisciplinary
  • Access options: free
  • Provider: DOAJ
DOAJ: any document you find on this academic database is open access and can be accessed free of charge
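DOAJ also exposes its index through a public search API. The exact endpoint path below is an assumption based on DOAJ's published API documentation (confirm against the current docs before use); the sketch only shows how a query URL could be assembled:

```python
# Sketch: building a DOAJ article-search URL.
# The endpoint path ("/api/search/articles/") is an assumption based on
# DOAJ's public API docs; verify before relying on it.
from urllib.parse import quote

DOAJ_BASE = "https://doaj.org/api/search/articles/"

def doaj_search_url(query: str, page: int = 1, page_size: int = 10) -> str:
    """URL-encode the query and append paging parameters."""
    return f"{DOAJ_BASE}{quote(query)}?page={page}&pageSize={page_size}"

print(doaj_search_url("open access citation advantage"))
```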

8. JSTOR

JSTOR is another great resource for finding research papers. Any article published in the United States before 1924 is available for free, and JSTOR also offers scholarships for independent researchers.

  • Coverage: approx. 12 million items
  • References: NA
  • Discipline: Multidisciplinary
  • Access options: free
  • Provider: ITHAKA
JSTOR: 12 million scientific articles dating back to as early as 1876


Cite seeing: A brief guide for academics to increase their citation count

 Source: https://drmarkgriffiths.wordpress.com/2016/06/24/cite-seeing-a-brief-guide-for-academics-to-increase-their-citation-count/


Apologies to any non-academics reading my blog today but this article will be of more interest to academic researchers than anyone else as it examines the strategies that I have used to get (what some people have claimed as) an “excessive” number of citations to my published work. All academics are aware that the use of bibliometric data is becoming ever more important in academia. Along with impact factors of academic journals, one of the most important bibliometric indicators is citation counts. These are increasingly being used in a number of contexts including internal assessment (e.g., going for a job promotion) and external assessments (e.g., use in the Research Excellence Framework [REF] as a proxy measure of quality and impact).

In June 2016 I reached close to 30,000 citations on Google Scholar, which is good evidence that what I do day-to-day works. I have an h-index of 91 (i.e., at least 91 of my papers have been cited at least 91 times) and an i10-index of 377 (i.e., at least 377 of my papers have been cited at least 10 times).
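Both indices are straightforward to compute from a list of per-paper citation counts. As a small illustrative sketch (the citation counts below are made up, not the author's actual record):

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations: list[int]) -> int:
    """Number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical citation counts for five papers:
papers = [10, 8, 5, 4, 3]
print(h_index(papers), i10_index(papers))  # → 4 1
```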

Citation counts take years to accumulate, but you can boost your citations in a number of different ways. Here are the tips and strategies that I personally use and that I know work. It probably goes without saying that the more you write and publish, the greater the number of citations. However, here are my top ten tips, based on a number of review papers on the topic (see ‘Further reading’ below):

  • Choose your paper’s keywords carefully: In an age of search engines and academic database searching, the keywords in your publications are critical. Keywords and phrases in the paper’s title and abstract are also useful for search purposes.
  • Use the same name on all your papers and use ORCID: I wish someone had told me at the start of my career that name initials were important. I had no idea that there were so many academics called ‘Mark Griffiths’. Adding my middle initial (‘D’) has helped a lot. You can also use an ORCID or ResearcherID and link it to your publications.
  • Make your papers as easily accessible as possible: Personally, I make good use of many different websites to upload papers and articles to (ResearchGate and academia.edu being the two most useful to me personally). Your own university institutional repositories can also be useful in this respect. All self-archiving is useful. It is also especially important to keep research pages up-to-date if you want your most recent papers to be read and cited.
  • Disseminate and promote your research wherever you can: I find that many British academics do not like to publicise their work, but ever since I was a PhD student I have promoted my work in as many different places as possible, including conferences, seminars, workshops and the mass media. More recently I have used social media extensively (such as tweeting links to papers I’ve just published). I also write media releases for work that I think will have mass appeal and work with my university Press Office to ensure dissemination is as wide as possible. I also actively promote my work in other ways, including personal dissemination (e.g., my blogs) as well as sending copies of papers to key people in my field and to interested stakeholder groups (policymakers, gaming industry, treatment providers, etc.). I have a high-profile web presence via my many websites.
  • Cite your previously published papers: Self-citation is often viewed quite negatively by some academics but it is absolutely fine to cite your own work where relevant on a new manuscript. Citing my own work has never hurt my academic career.
  • Publish in journals that you know others in your field read: Although many academics aim to get in the highest impact factor journal that they can, this doesn’t always lead to the highest number of citations. For instance, when I submit a gambling paper I often submit to the Journal of Gambling Studies (Impact factor=2.75). This is because gambling is a very interdisciplinary field and many of my colleagues (who work in disparate disciplines – law, criminology, social policy, economics, sociology, etc.) don’t read psychology journals. Some of my highest cited papers have been in specialist journals.
  • Try to publish in Open Access journals: Research has consistently shown that Open Access papers get higher citation rates than non-Open Access papers.
  • Write review papers: Although I publish lots of empirical papers I learned very early on in my academic career that review papers are more likely to be cited. I often try to write the first review papers in particular areas as everyone then has to cite them! Some types of outputs (especially those that don’t have an abstract) are usually poorly cited (e.g., editorials, letters to editors).
  • Submit to special issues of journals: Submitting a paper to a special issue of a journal increases the likelihood that others in your field will read it (as it will have more visibility). Papers won’t be cited if they are not read in the first place!
  • Publish collaboratively and, where possible, with international teams: Again, research has consistently shown that team-authored papers and international collaboration significantly increase citation counts.
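The ORCID recommended in the second tip embeds a check character (ISO 7064 MOD 11-2) in its final position, which makes iDs easy to validate before linking them to publications. A minimal sketch, using ORCID's documented sample iD:

```python
def orcid_checksum(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the first 15 digits."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a hyphenated ORCID iD like 0000-0002-1825-0097."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    return orcid_checksum(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0002-1825-0097"))  # ORCID's documented sample iD → True
```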

Finally, here are a few more nuggets of information that you should know when thinking about how to improve your citation counts.

  • There is a correlation between number of citations and the impact factor of the journal but if you work in an interdisciplinary field like me, more specialist journals may lead to higher citation counts.
  • The length of the paper and its reference list correlates with citation counts (although this may be connected with review papers, as they are generally longer and more cited than non-review papers).
  • Publish with ‘big names’ in the field. Publishing with the pioneers in your field will lead to more citations.
  • Get your work on Wikipedia: references cited by Wikipedia pages get cited more. In fact, write Wikipedia pages for topics in your areas.
  • Somewhat bizarrely (but true), papers that ask a question in the title have lower citation rates, while titles that contain colons have higher citation rates.

Note: A version of this article was first published in the PsyPAG Quarterly (see below)

Dr Mark Griffiths, Professor of Behavioural Addictions, International Gaming Research Unit, Nottingham Trent University, Nottingham, UK

Further reading

Ball, P. (2011). Are scientific reputations boosted artificially? Nature, May 6. Located at: http://www.nature.com/news/2011/110506/full/news.2011.270.html (last accessed April 27, 2015).

Bornmann, L., & Daniel, H. D. (2008). What do citation counts measure? A review of studies on citing behavior. Journal of Documentation, 64(1), 45-80.

Corbyn, Z. (2010). An easy way to boost a paper’s citations. Nature, August 13. Located at: http://dx.doi.org/10.1038/news.2010.406 (last accessed April 27, 2015).

Ebrahim, N. A. (2012). Publication marketing tools – Enhancing research visibility and improving citations. University of Malaya, Kuala Lumpur, Malaysia. Available at: http://works.bepress.com/aleebrahim/64

Ebrahim, N., Salehi, H., Embi, M. A., Habibi, F., Gholizadeh, H., Motahar, S. M., & Ordi, A. (2013). Effective strategies for increasing citation frequency. International Education Studies, 6(11), 93-99.

Ebrahim, N.A., Salehi, H., Embi, M. A., Habibi, F., Gholizadeh, H., & Motahar, S. M. (2014). Visibility and citation impact. International Education Studies, 7(4), 120-125.

Griffiths, M.D. (2005). Self-citation: A practical guide. Null Hypothesis: The Journal of Unlikely Science (‘Best of’ issue), 15-16.

Griffiths, M.D. (2015). How to improve your citation count. Psy-PAG Quarterly, 96, 23-24.

Jamali, H. R., & Nikzad, M. (2011). Article title type and its relation with the number of downloads and citations. Scientometrics, 88(2), 653-661.

Marashi, S.-A., Amin, H.-N., Alishah, K., Hadi, M., Karimi, A., & Hosseinian, S. (2013). Impact of Wikipedia on citation trends. EXCLI Journal, 12, 15-19.

MacCallum, C. J., & Parthasarathy, H. (2006). Open Access increases citation rate. PLoS Biology, 4(5), e176, http://dx.doi.org/10.1371/journal.pbio.0040176

Swan, A. (2010) The Open Access citation advantage: Studies and results to date. Located at: http://eprints.soton.ac.uk/268516/ (last accessed April 27, 2015).

Vanclay, J. K. (2013). Factors affecting citation rates in environmental science. Journal of Informetrics, 7(2), 265-271.

van Wesel, M., Wyatt, S., & ten Haaf, J. (2014). What a difference a colon makes: How superficial factors influence subsequent citation. Scientometrics, 98(3): 1601–1615.


Broadening your Research Impact

 Source: https://www.universityofgalway.ie/institutionalresearchoffice/publishing-guidelines-for-researchers/broadeningyourresearchimpact/#


Tips for Increasing Research Impact

Increase the impact of your Manuscript

  • Publish where it counts - journals that are indexed by major citation services, e.g. Scopus, help increase recognition for your work.
  • Select the appropriate journal - consider Journal Impact Factor, Scopus ASJC codes, cross-discipline appeal, and where your peers and competitors publish.
  • Aim high - papers in highly cited journals attract more citations, and sooner - see Top Journals in Scopus.
  • Consider the publication timeline - does the journal do preprints? Does it assign a Digital Object Identifier?
  • Title - longer, more descriptive article titles attract more citations.
  • Title (and abstract) words are heavily weighted by search engines, and a keyword-rich title will push your article towards the top.
  • Write a clear abstract and repeat key phrases (search engines search the abstract).
  • Write a review - citation rates of reviews are generally higher than those of other papers.
  • Use more references - there is a strong relationship between the number of references and citations.
  • Provide open access to the underlying research data and materials - this makes your paper very attractive. Check the review period and availability of online preprints.
  • Publish in Open Access journals and an Open Access digital repository - greater access, visibility and digital reach, and some research funders insist on it.

International Collaboration

  • Collaborate with international experts in your field (SciVal can help identify potential collaborators)
  • Aim for multi-author, multi-institute papers
  • Both correlate with higher citation rates

Promotion, Visibility and Accessibility

  • Importance of Self Promotion, Networking and Visibility
  • Participate in conferences and meetings – present your work at every opportunity
  • Offer to give lectures or talk about your research.
  • Build an online presence:
                Create a website that lists your publications - include University of Galway.
                Use social media - Facebook, Twitter, ResearchGate, LinkedIn, blogs, YouTube videos, TED-Ed lessons, etc.
  • Utilize both Institution and publisher press releases and public relations.
  • Distribute reprints to scientists you have cited or to those who may find your work interesting.

Cite and you will be Cited  

  • Cite your colleagues, including those with results contrary to yours
  • Cite leaders in your field and pertinent papers.
  • Self Citations - Cite your own relevant work (limit to 3 or 4, only include Journal Papers) 

And finally - make sure you get the credit for your work - see Publishing Guidelines for Researchers

  • Manage your online identity – Consistent form of your name, ORCID ID 
  • Make sure you include University of Galway address in the correct form.
  • Reclaim any misspelt citations by others – Scopus feedback service.
  • Monitor your output ensuring bibliometric databases accurately capture your work. 

Sources:
  • Effective Strategies for Increasing Citation Frequency: http://eprints.rclis.org/20496/1/30366-105857-1-PB.pdf
  • http://www.jobs.ac.uk/careers-advice/working-in-higher-education/2169/how-to-increase-your-citation-rates-in-10-easy-steps-part-1
  • http://www.aje.com/en/arc/10-easy-ways-increase-your-citation-count-checklist/

How good are AI “Answering Engines” really?

 Source: https://blog.kagi.com/kagi-ai-search#aitest


When implementing a feature of this nature, it is crucial to establish the level of accuracy that users can expect. This can be accomplished by constructing a test dataset of challenging and complex queries that typically require human investigation but can be answered with certainty using the web. It is important to note that AI answering engines aim to streamline exactly this kind of work for the user. To that end, we have developed a dataset of ‘hard’ questions: the most challenging ones we could source from the Natural Questions dataset, Twitter, and Reddit.

The questions included in the dataset range in difficulty, starting from easy and becoming progressively more challenging. We plan to release the dataset with the next update of the test results in 6 months. Some of the questions can be answered “from memory,” but many require access to the web (we wanted a good mix). Here are a few sample questions from the dataset:

  • “Easy” questions like “Who is known as the father of Texas?” - 15 / 15 AI providers got this right (only four other questions in the dataset were answered correctly by all providers).
  • Trick questions like “During world cup 2022, Argentina lost to France by how many points?” - 8 / 15 AI providers were not fooled by this and got it right.
  • Hard questions like “What is the name of Joe Biden’s wife’s mother?” - 5 / 15 AI providers got this right.
  • Very hard questions like “Which of these compute the same thing: Fourier Transform on real functions, Fast Fourier Transform, Quantum Fourier Transform, Discrete Fourier Transform?” that only one provider got right (thanks to @noop_noob for suggesting this question on Twitter).

In addition to testing Kagi AI’s capabilities, we also sought to assess the performance of every other “answering engine” available for our testing purposes. These included Bing, Neeva, You.com, Perplexity.ai, ChatGPT 3.5 and 4, Bard, Google Assistant (mobile app), Lexii.ai, Friday.page, Komo.ai, Phind.com, Poe.com, and Brave Search. It is worth noting that all providers, except for ChatGPT, have access to the internet, which enhances their ability to provide accurate answers. As Google’s Bard is not yet officially available, we opted to test the Google Assistant mobile app, considered state-of-the-art in question-answering on the web just a few months ago. Update 3 / 21: We now include Bard results.

To conduct the test, we asked each engine the same set of 56 questions and recorded whether the correct answer was provided in the response. The answered % rate reflects the share of questions answered correctly, expressed as a percentage (e.g., 75% means that 42 out of 56 questions were answered correctly).
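The answered % is just the correct count over the 56 questions, rounded to one decimal. A quick sketch of the conversion:

```python
TOTAL_QUESTIONS = 56

def answered_pct(correct: int, total: int = TOTAL_QUESTIONS) -> float:
    """Share of questions answered correctly, as a percentage rounded to 0.1."""
    return round(100 * correct / total, 1)

# Reproduce a few rows of the results table:
print(answered_pct(44))  # → 78.6
print(answered_pct(42))  # → 75.0
print(answered_pct(19))  # → 33.9
```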

And now the results.

Answering engine                 Questions Answered   Answered %
Human with a search engine [1]   56                   100.0%
-------------------------------  -------------------  ----------
Phind                            44                   78.6%
Kagi                             43                   76.8%
You                              42                   75.0%
Google Bard                      41                   73.2%
Bing Chat                        41                   73.2%
ChatGPT 4                        41                   73.2%
Perplexity                       40                   71.4%
Lexii                            38                   67.9%
Komo                             37                   66.1%
Poe (Sage)                       37                   66.1%
Friday.page                      37                   66.1%
ChatGPT 3.5                      36                   64.3%
Neeva                            31                   55.4%
Google Assistant (mobile app)    27                   48.2%
Brave Search                     19                   33.9%

AI answering engines accuracy on “hard questions” dataset, March 21 (updated with Bard), 2023

[1] Test was not timed and this particular human wanted to make sure they were right

Disclaimer: Take these results with a grain of salt, as we’ve seen a lot of diversity in the style of answers and mixing of correct answers and wrong context, which made keeping the objective score challenging. The relative strength should generally hold true on any diverse set of questions.

Our findings revealed that the top-performing AI engines exhibited an accuracy rate of approximately 75% on these questions, which means that users can rely on state-of-the-art AI to answer approximately three out of four questions. When unable to answer, these engines either did not provide an answer or provided a convincing but inaccurate answer.

ChatGPT 4 showed improvement over ChatGPT 3.5 and came close to the best answering engines, despite having no internet access. This means that access to the web provided only a marginal advantage to the others and that answering engines still have a lot of room to improve.

On the other hand, three providers (Neeva, Google Assistant, and Brave Search), all of which have internet access, performed worse than ChatGPT 3.5 without internet access.

Additionally, it is noteworthy that the previous state-of-the-art AI, Google Assistant, was outperformed by almost every competitor, many of which are relatively small companies. This speaks to the remarkable democratization of the ability to answer questions on the web, enabled by the recent advancements in AI.

The main limitation of the top answering engines at this time seems to be the quality of the underlying ‘zero-shot’ search results available for the verbatim queries. When humans perform the same task, they will search multiple times, adjusting the query if needed, until they are satisfied with the answer. Such an approach still needs to be implemented in any tested answering engine. In addition, the search results returned could be optimized for use in answering engines, which is currently not the case.
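The human strategy described above - re-searching with adjusted queries until satisfied - can be sketched as a simple loop. Everything here is illustrative: `search`, `answer_from`, and `refine_query` are hypothetical stand-ins, not any engine's actual API:

```python
# Illustrative sketch only: an iterative search-and-answer loop of the kind
# described in the text. The three callables are hypothetical stand-ins.
from typing import Callable, Optional

def iterative_answer(
    question: str,
    search: Callable[[str], list[str]],
    answer_from: Callable[[str, list[str]], Optional[str]],
    refine_query: Callable[[str, list[str]], str],
    max_rounds: int = 3,
) -> Optional[str]:
    """Re-query with refined terms until an answer is found or rounds run out."""
    query = question
    for _ in range(max_rounds):
        results = search(query)
        answer = answer_from(question, results)
        if answer is not None:
            return answer
        query = refine_query(query, results)  # adjust the query, like a human would
    return None
```

The design point is the retry loop itself: none of the tested engines appeared to reformulate failed queries, whereas a human does so routinely.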

In general, we are cautiously optimistic about Kagi's present abilities, but we also see a lot of opportunities to improve. We plan to update the test results and release the questions in 6 months so the progress made by the field can be compared.