Tracking Cited References
Cited references are the articles, books, and other resources listed in a bibliography, a "Works Cited" list, or a "References" list. Cited references are useful for finding additional articles and books on a topic, for identifying the top researchers in a field, and for promotion and tenure decisions.
Databases tracking cited references make it possible to follow the instances where an author has been cited. This technique may be useful to:
- Track the research of an individual
- Track the history of a research idea
- Locate current research based on earlier research
- Find out how many times and where a publication is being cited
- Find out who is citing a particular source
- Find out how a particular research topic is being used to support other research and to analyze its impact
Effective Strategies for Increasing Citation Frequency
Journal Reputation and Impact: publishing a paper in a journal with a strong disciplinary reputation or a high impact factor is the best-known way of getting your paper cited. But there are many other things a scholar can do to promote his or her work and make it easy for others to find.
Utilize Open Access Tools: Open Access journals tend to be cited more than non-open access journals. Deposit your paper in a repository such as Scholars Archive here on campus or a disciplinary repository. Share your detailed research data in a repository.
Standardize Identifying Info: try to use the same form of your name throughout your career, as well as the same name for your affiliated institution. Using the common "official" names allows for consistency and easy retrieval of your work by author or affiliation.
Bring Colleagues on Board: team-authored articles are cited more frequently, as are articles written with international co-authors. Working cross- or interdisciplinarily helps as well.
Beef Up That Paper: use more references and publish a longer paper. Papers that are published elsewhere after having been rejected are also cited more frequently.
Beyond Peer-Reviewed Original Research: Write a review paper. Present a working paper. Write and disseminate web-based tutorials on your topic.
Search Optimization: use keywords in the abstract and assign them to the manuscript. Use descriptive titles that utilize the obvious terms searchers would use to look for your topic, avoiding questions in the title. Select a journal that is indexed in the key library databases for your field.
Market Yourself: create a key phrase that describes your research career and use it. Update your professional web page and publication lists frequently. Link to your latest and greatest article in your professional email signature file.
Utilize Social Media: use author profiles such as ResearcherID and ORCID. Contribute to Wikipedia, start a blog and/or podcast, and join academic social media sites.
From: Ebrahim, N.A., et al. (2013). Effective strategies for increasing citation frequency. International Education Studies, 6(11): 93-99. DOI: 10.5539/ies.v6n11p93
Take the iLearn Workshop!
Come to one of our iLearn sessions for faculty and graduate students on Maximizing your Research Impact.
Academics who publish (or hope to publish) scholarly research find that measuring the impact and influence of their work helps others understand its value within their department and institution, and even throughout the discipline. In this workshop, learn how to generate unique author identifiers using ORCID and ResearcherID, and how they are used. Discover indicators such as the Journal Impact Factor, the h-index, and altmetrics, and their significance. We will also discuss issues such as choosing the best journal for your research and scholarly networking through tools such as Mendeley. The workshop is 1 hour long and is held in LI B14. See the iLearn registration page for details.
Overview of Citation Metrics
What's the Difference Between All of These Tools?
Essential Concepts of Scholarly Metrics
Altmetrics: a newer form of measuring scholarly influence and impact, based on web-based and social media sources.
Bibliometrics: The variety of metrics available based on cited reference data to measure scholarly output, impact, relevance and ranking. Analytics include citation count, impact factor, SNIP, h-index, e-index, and a wide variety of related measurements.
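As one concrete illustration of these metrics, the h-index is simple to compute from a list of citation counts: an author has index h if h of their papers have each been cited at least h times. A minimal sketch in Python (the citation counts below are hypothetical, not drawn from any real author):

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that the author has
    h papers with at least h citations each."""
    # Sort counts from highest to lowest, then find the deepest
    # 1-based rank at which the paper's citation count still
    # meets or exceeds that rank.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for seven papers:
print(h_index([25, 8, 5, 3, 3, 1, 0]))  # -> 3
```

Note how the single highly cited paper (25 citations) does not raise the h-index by itself; the measure rewards a sustained body of cited work, which is one reason it is often quoted alongside a raw citation count.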
Citation Analysis: the process of tracing various patterns of scholarly behavior through analyzing the cited and/or citing references of a body of work. This could be done on an individual article, author, journal, institution, or other group.
Citation Count: The number of times an article, author, journal, institution, etc. has been cited. It is very difficult to locate every single time something or someone has been cited. Commonly accepted citation counts come from Web of Science. Each source which provides citation counts draws from a different base of resources and therefore the results may differ between Web of Science and Google Scholar, for example.
Citation Evaluation: Simply identifying the number of times someone or something has been cited does not account for certain citation patterns. For example, an author may have one or two articles early in his or her career that have very high citation counts, but later articles have substantially fewer. Another author may have a relatively steady number of citations for each article throughout his or her career.
Journal Ranking: There are a number of metrics that seek to measure the influence of a journal based on how it is being cited in other works. One such metric is the Journal Impact Factor. It should be emphasized that the ranking of a journal is not necessarily a reflection of a single specific article within the journal.
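For context, the Journal Impact Factor for a given year is calculated as the number of citations received that year by items the journal published in the two preceding years, divided by the number of citable items it published in those two years. A minimal sketch with hypothetical numbers:

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Journal Impact Factor for a given year: citations received that
    year to items from the previous two years, divided by the number
    of citable items published in those two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 300 citations in 2023 to its 2021-2022 output,
# which comprised 120 citable articles.
print(round(impact_factor(300, 120), 2))  # -> 2.5
```

Because this is a journal-level average, a journal's impact factor says nothing about how often any single article within it is cited, which is the caveat noted above.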
Quality Factors & Caveats
Journal Prestige: There are basically two approaches to assessing journal prestige: (1) Perception/ranking of the journals by experts in the field, and (2) Journal ranking metrics providing analysis of citation rates. Other factors, such as journal submission and acceptance rates are also sometimes considered. Consult your Subject Librarian for assistance in this area.
"Good" Metric Scores (citation count, h-index, journal impact factor, journal ranking, etc.): Due to the varying citation rates from discipline to discipline, and even from specialty to specialty within a discipline, it is not possible to give a blanket statement regarding "good" metrics.
Caveats: There are many reasons why an author will cite previous research in his or her paper, and not all are an endorsement of the previous research. Self-citation, disagreement with or contradiction of previous findings, and other motivations may not accurately reflect the influence of that work. This holds true for altmetrics counts as well.
For more information see: Leydesdorff, L. (2008). Caveats for the use of citation indicators in research and journal evaluations. Journal of the American Society for Information Science and Technology, 59(2): 278-287. DOI: 10.1002/asi.20743