Reporting characteristics of journal infographics: a cross-sectional study

Abstract

Background

Infographics have become an increasingly popular method to present research findings and increase the attention research receives. As many scientific journals now use infographics to boost the visibility and uptake of the research they publish, infographics have become an important tool for medical education. It is unknown whether such infographics convey the key characteristics that are needed to make useful interpretations of the data, such as an adequate description of the study population, interventions, comparators and outcomes; methodological limitations; and numerical estimates of benefits and harms. This study described whether infographics published in peer-reviewed health and medical research journals contain the key characteristics needed to make useful interpretations of clinical research.

Methods

In this cross-sectional study, we identified peer-reviewed journals listed in the top quintile of 35 unique fields of medicine and health research listed in the Journal Citation Reports database. Two researchers screened journals for the presence of infographics. We defined an infographic as a graphical visual representation of research findings. We extracted data from a sample of two of the most recent infographics from each journal. Outcomes were the proportion of infographics that reported key characteristics such as study population, interventions, comparators and outcomes, benefits, harms, effect estimates with measures of precision, between-group differences and conflicts of interest; acknowledged risk of bias, certainty of evidence and study limitations; and based their conclusions on the study’s primary outcome.

Results

We included 129 infographics from 69 journals. Most infographics described the population (81%), intervention (96%), comparator (91%) and outcomes (94%), but fewer contained enough information on the population (26%), intervention (45%), comparator (20%) and outcomes (55%) for those components of the study to be understood without referring to the main paper. Risk of bias was acknowledged in only 2% of infographics, and none of the 69 studies that had declared a conflict of interest disclosed it in the infographics.

Conclusions

Most infographics do not report sufficient information to allow readers to interpret study findings, including the study characteristics, results, and sources of bias. Our results can inform initiatives to improve the quality of the information presented in infographics.

Introduction

‘Infographic’ is an abbreviated term for ‘information graphic’. Infographics generally use images and data visualisations (pie charts, bar graphs, line graphs) to foster knowledge translation by increasing attention, comprehension and recall, and are considered aesthetically appealing and useful for communicating research findings among peers, the media and the public [1, 2]. Infographics have become an increasingly popular method to present research findings to non-academic audiences and increase the attention research receives [2,3,4]. Many scientific journals now use infographics to boost the visibility and uptake of the research they publish [5]. This includes healthcare journals with broad coverage (e.g., New England Journal of Medicine) and those focused on a specific discipline (e.g., JAMA Oncology, British Journal of Sports Medicine).

There is limited guidance on how to appropriately report research findings within infographics. To the best of our knowledge, only one guideline has been developed to inform the design of infographics (7-item GRAPHIC guidelines) [6]. However, this guideline only provides recommendations for infographic formatting.

Infographics that summarise the results of clinical research (e.g., observational studies, randomised trials, reviews) could improve knowledge translation and increase uptake of new evidence in clinical practice. However, it is unknown whether such infographics convey the key characteristics that are needed to make useful interpretations of the data. Such characteristics include but are not limited to: an adequate description of the study population, interventions, comparators and outcomes; methodological limitations; and numerical estimates of benefits and harms.

There is yet to be a systematic assessment of the reporting of key research characteristics in infographics. The aim of this study was to describe the proportion of infographics of clinical research that:

  • describe the study population, interventions, comparators and outcomes (and do so well enough for the infographic to be understood independently of the main paper);

  • report the benefits and harms of an intervention, effect estimates with measures of precision, between-group differences, the relationship of the effect estimates to known thresholds of clinical importance, and clear summary statistics for dichotomous outcomes;

  • acknowledge risk of bias, the certainty of evidence (if applicable), and study limitations;

  • acknowledge risk of bias/certainty of evidence in their conclusion, base conclusions on the correct populations, interventions or outcomes (i.e. no issue with indirectness), and base conclusions on the primary outcome; and

  • report conflicts of interest.

Methods

Data sources

We reported this cross-sectional study following the STROBE guidelines [7]. We defined an infographic as a graphical visual representation of research findings. We only included infographics summarising clinical research studies (i.e., observational studies, randomised and non-randomised trials, systematic reviews). There was no restriction on the population, intervention, or outcomes investigated. We did not consider infographics from in vitro or in silico studies. The search strategy comprised three steps:

Step 1: One researcher selected the 35 unique fields related to health and medical research from the Journal Citation Reports database (Additional file 1: Appendices A,B). Within each field, journals ranked in the top quintile based on journal impact factor using data from the 2019 journal impact factor index were selected (n = 597 journals).

Step 2: Two researchers from a panel of six independently checked each journal’s website (n = 597) for infographics. This was done by searching terms synonymous with “infographic” in the journal’s search box (e.g., “graphic abstract”, “graphical abstract”, “visual abstract”), and by manually checking articles published ahead of print and in all issues from August 2018 to October 2020. We did not consider issues dedicated to conference proceedings, special issues, or supplements. We also searched for special sections within the journal’s website (e.g., the BMJ visual abstract and infographic sections [8]). If no infographics were identified with those procedures, we considered the journal not to have infographics. Due to feasibility constraints, we only checked the journals’ websites; other sources such as Twitter and Facebook were not checked. This step was conducted between September 21st and October 2nd, 2020.

Step 3: Pairs of investigators from a panel of six independently selected the two most recently published and eligible infographics from each journal. If the pair of investigators identified different infographics, they met to discuss and reach consensus.

Data extraction

Using a standardised data extraction form that was pilot tested prior to data extraction, two investigators independently extracted data from the infographics. Investigators were provided with instructions and examples of how to code every item of interest (Additional file 1: Appendix C). Discrepancies were resolved by discussion between the pair of investigators. This method is consistent with that recommended for high-quality Cochrane systematic reviews [9]. When an item was not relevant to the study design, it was recorded as “not applicable”. For example, in an observational study with no fixed intervention, it would not be applicable to report a description of the intervention or an estimate of the between-group difference.

Data analysis

We summarised data from the overall sample using counts and percentages. We also reported outcomes stratified by study design, and calculated differences between proportions of each analysed item across study designs using Pearson’s chi-squared test. We used Stata version 16.1 (StataCorp LLC, Texas, USA) for the analyses.
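To illustrate the stratified analysis described above, Pearson’s chi-squared test of equal proportions across study designs can be sketched as follows. This is a minimal illustration, not the study’s analysis code (the authors used Stata 16.1), and the counts in the example are hypothetical.

```python
def chi_squared(table):
    """Pearson's chi-squared statistic and degrees of freedom
    for an r x c contingency table, given as a list of rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts of infographics reporting an item (yes, no),
# stratified by design: observational, randomised trial, review.
table = [
    [10, 55],  # observational studies
    [12, 33],  # randomised trials
    [11, 9],   # reviews
]
stat, df = chi_squared(table)
# With df = 2, the statistic is compared against the chi-squared
# critical value (5.99 at alpha = 0.05) to judge significance.
```

The test asks whether the proportion reporting the item is the same across the three study designs; the p-value would ordinarily be obtained from the chi-squared distribution with the computed degrees of freedom.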

Wherever “not applicable” was used, that infographic was not counted in the denominator for that item. The following data were extracted:

  • Study design (observational study, randomised trial, or review);

  • Whether the infographic described the study population, interventions, comparators, and outcomes, and whether the description was adequate for the infographic to be understood independently from the article. We considered these descriptions to be adequate when investigators did not need to check the original study report to understand key details. For example, the population needed to include some demographic characteristics (e.g., mean age). The intervention and comparison needed to include some information on the intervention parameters (e.g., drug dose, frequency of treatment). The outcome needed to be specific about the measure used (e.g., all-cause mortality).

  • Whether the infographic reported benefits and harms (e.g., adverse events), effect estimates and measures of precision, and between-group differences; and presented effect estimates in relation to known thresholds of clinical importance, and a clear summary statistic for dichotomous outcomes (e.g., proportions, relative risk (RR), number needed to treat (NNT), or charts commonly used to communicate absolute risk, such as icon arrays) [10].

  • Whether the infographic acknowledged risk of bias, the certainty of evidence (if applicable), and study limitations.

  • Whether the infographic had a conclusion, acknowledged limitations/risk of bias/certainty of evidence in their conclusion, based their conclusions on the correct populations, interventions or outcomes (i.e. no issue with indirectness), and based their conclusions on the primary outcome (i.e. no ‘spin’) [11].

  • Whether the infographic reported conflicts of interest.
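As an illustration of the dichotomous-outcome summary statistics listed above (RR, NNT), these can be derived directly from event counts in two groups. The numbers below are hypothetical and not drawn from any study in our sample.

```python
# Hypothetical 2x2 outcome counts: events / group size (illustrative only).
events_treat, n_treat = 20, 100  # treatment group
events_ctrl, n_ctrl = 40, 100    # control group

risk_treat = events_treat / n_treat  # absolute risk with treatment: 0.20
risk_ctrl = events_ctrl / n_ctrl     # absolute risk with control: 0.40

# Relative risk: risk in the treatment group relative to control.
relative_risk = risk_treat / risk_ctrl

# Absolute risk reduction, and the number needed to treat
# to prevent one additional event.
absolute_risk_reduction = risk_ctrl - risk_treat
number_needed_to_treat = 1 / absolute_risk_reduction
```

Reporting such absolute statistics (rather than only a relative effect) is part of what makes a dichotomous outcome interpretable from the infographic alone.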

Patient and public involvement

Patients or the public were not involved in the design, conduct, reporting, or dissemination plans of our research.

Results

Selection and characteristics of infographics

Within the 35 unique fields related to medicine and health research listed in the Journal Citation Reports database, we identified 597 journals that were listed in the top quintile of these fields. Of these, we identified 69 journals from 18 fields that contained infographics that met our eligibility criteria (Fig. 1). We were able to find two infographics from 60 journals and only one from 9 journals (Additional file 1: Appendix A). Hence, we included 129 infographics in this study.

Fig. 1 Study flow diagram

Fields with the highest number of journals included were Medicine, General & Internal (11 journals), followed by Cardiac & Cardiovascular Systems and Surgery (10 journals each), Gastroenterology & Hepatology and Urology & Nephrology (7 journals each). The other 14 fields contributed fewer journals, ranging from 1 to 5 per field (Fig. 1). Most infographics summarised observational studies (50%), followed by randomised trials (35%) and reviews (16%) (Table 1). Of the 20 reviews included, 65% included randomised controlled trials only.

Table 1 Characteristics of infographics summarising studies evaluating the effects of an intervention (n = 129 unless stated otherwise). P-values are for differences in proportions in each outcome stratified by study design

Main findings

Most infographics described the population (81%), intervention (96%), comparator (91%) and outcomes (94%) of the study. However, fewer infographics contained enough information on the population (26%), intervention (45%), comparator (50%) and outcomes (55%) for these components of the study to be understood without referring to the main paper.

Fewer infographics reported harms (26%) compared to benefits (84%). Only 67% and 22% reported an effect estimate and measures of imprecision around an effect estimate, respectively. Risk of bias was acknowledged in only 2% of infographics, and certainty of evidence was only mentioned by 10% of infographics of systematic reviews. Of the 63 infographics that contained a conclusion, most (92%) did not have issues with indirectness, and most were based on findings from the primary outcome (86%). Only 5% of these conclusions considered risk of bias. None of the 69 studies that declared a conflict of interest disclosed it in the infographics.

Some outcomes differed when data were stratified by study design; these data are displayed in Table 1. A higher proportion of infographics from observational studies and randomised trials described the comparators, outcomes and effect estimates, and clearly labelled dichotomous outcomes, compared to reviews. A higher proportion of randomised trials reported on the benefits of an intervention compared to observational studies and reviews, whereas a higher proportion of reviews reported harms and acknowledged risk of bias compared to observational studies and randomised trials.

Discussion

Summary of findings

Infographics typically presented information on patients, interventions, comparators and outcomes. However, fewer reported sufficient information to allow readers to understand them without reference to the main paper. Critical aspects of results, such as measures of imprecision around the effect estimate or a clear label for the statistic used to summarise dichotomous outcomes, were seldom reported. Sources of bias, certainty of evidence, and study limitations were rarely acknowledged. No infographic disclosed conflicts of interest, even though more than half of the studies in our sample had disclosed at least one conflict of interest in the original publication.

Implications

Infographics have been shown to increase measures of research attention such as engagement on social media and Altmetric scores [5, 12, 13], yet our results indicate that in many cases this increase in attention may be at odds with high-quality information. This is concerning because the absence of key information in the infographic may compromise the reader’s ability to truly understand the study and its findings, limitations, and implications for clinical practice. These limitations could be addressed by reading the full text, provided the full text was reported following best practices (e.g., adherence to reporting guidelines). However, whilst infographics are often made freely available on social media [5] or in dedicated sections on journal websites [14], access to the original studies is often restricted by journal paywalls.

Most infographics that had a conclusion reported findings for the primary outcome of the study (86%). In other words, only 14% of infographics were considered to have some form of spin. The proportion of spin in our sample was much lower than in other samples (26% to 85%, depending on the study design) [15]. This could be explained by the more rigorous assessments of spin used in other studies [15,16,17].

None of the infographics included in our study disclosed conflicts of interest, although more than half of the studies from our sample had some form of conflict of interest declared. Conflicts of interest are an important source of bias in clinical research [18], so this information should be present in every resource designed to disseminate study findings.

What an infographic should look like

Anyone creating an infographic could consider the items assessed in our study and ensure that any items relevant to the study design being summarised are included in the infographic. Reporting checklists for infographics for individual study designs would simplify that process, and our group has commenced preparing these. In proposing such reporting checklists, we acknowledge that some items may be more or less relevant depending on the purpose of the infographic, such as notifying the general public about the existence of a new study versus informing clinicians about the evidence generated by that study. Depending on the purpose and the format in which the infographic will be distributed (e.g., social media, journal website, poster), not all relevant items may be incorporated in every infographic. In general, however, the more items on the checklist that are incorporated, the more informative the infographic will be.

One item that we chose to highlight here is risk of bias. This was reported by only 2% of the infographics we analysed, yet it can be succinctly summarised, as shown in the infographic for the study by van de Leemkolk et al. [19].

The infographics that satisfied more of the relevant items than most were those produced by JAMA [20, 21]. Apart from providing more complete information than others, their layout makes interpretation clear and easy. For example, they seemed to conform well to the GRAPHIC principles recommended for the visual presentation of infographics: restricted colours, aligned elements, prioritised parts, a highlighted heading, good imagery, and careful selection of charts [22]. Of the infographics we analysed, those produced by JAMA had the best combination of reporting completeness and aesthetic appeal.

Our team recently completed a survey (submitted for publication; data not yet available) of consumers of infographics summarising health or medical research (e.g., health professionals, researchers, academics, and patients/the public). We found that 41% used infographics as a substitute for the full text at least half of the time, 55% thought infographics should be detailed enough that they do not have to read the full text, and 64% viewed infographics as tools to reduce the time burden of reading the full text.

Study limitations

Searching for infographics is challenging because nomenclatures vary by journal (e.g., infographic, graphical abstract, visual abstract) and journal policies are unclear about whether infographics are routinely produced for all published papers. Furthermore, there are currently thousands of medical journals, so searching for infographics across all journals is not feasible. Most infographics cannot be located by searching indexing databases like PubMed. To overcome these limitations, we designed a comprehensive multi-step search that allowed us to search a large number of journals (n = 597) across 35 research fields. Our definition of infographics was broad, which allowed us to capture a wide range of infographic types (e.g., visual abstracts, graphical abstracts, infographics). A potential limitation of this approach is that different types of infographics might have been lumped together. However, it is worth noting that the terms currently used by different journals to describe the various types of infographics they produce are not standardised. Future research could develop a classification system for different types of infographics and repeat our analysis.

We limited searches to journals in the top quintile of each research field, which could be considered a limitation. Although such journals typically enforce use of reporting checklists in their published papers [23], the information contained in their infographics was consistently insufficient for a useful interpretation of the data. Because journals in the other quintiles typically use less robust reporting in their papers, we anticipate that their infographics would also be insufficiently detailed for readers to make a useful interpretation of the data. In any case, the implication for clinicians remains the same regardless of the journal’s impact factor: unless an infographic reports the necessary details for clinical decision making in the infographic, users should read the full-text publication.

Conclusion

Most infographics do not report sufficient information to allow readers to interpret study findings, including the study characteristics, results, and sources of bias. Our results can inform initiatives to improve the quality of the information presented in infographics. While infographics could be made more informative, clinicians and other users of clinical research should not rely solely on infographics for decision-making. These decisions should only be made after reading the full-text publication.

Availability of data and materials

All data generated or analysed during this study are included in this published article.

References

  1. Crick K, Hartling L. Preferences of Knowledge Users for Two Formats of Summarizing Results from Systematic Reviews: Infographics and Critical Appraisals. PLoS ONE. 2015;10(10):e0140029.

  2. Thoma B, Murray H, Huang SYM, et al. The impact of social media promotion with infographics and podcasts on research dissemination and readership. CJEM. 2018;20(2):300–6.

  3. Murray IR, Murray AD, Wordie SJ, Oliver CW, Murray AW, Simpson AHRW. Maximising the impact of your work using infographics. Bone Joint Res. 2017;6(11):619–20.

  4. Scott H, Fawkner S, Oliver C, Murray A. Why healthcare professionals should know a little about infographics. Br J Sports Med. 2016;50(18):1104.

  5. Ibrahim AM, Lillemoe KD, Klingensmith ME, Dimick JB. Visual Abstracts to Disseminate Research on Social Media: A Prospective, Case-control Crossover Study. Ann Surg. 2017;266(6):e46–8.

  6. Stones C, Gent M. The G.R.A.P.H.I.C principles of public health infographic design. Leeds: University of Leeds; 2015.

  7. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ. 2007;335(7624):806–8.

  8. BMJ. Infographics. https://www.bmj.com/infographics. Published 2020. Accessed 10 December 2020.

  9. Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA, editors. Cochrane Handbook for Systematic Reviews of Interventions version 6.3. Cochrane; 2022.

  10. Ancker JS, Senathirajah Y, Kukafka R, Starren JB. Design features of graphs in health risk communication: a systematic review. J Am Med Inform Assoc. 2006;13(6):608–18.

  11. Boutron I, Dutton S, Ravaud P, Altman DG. Reporting and Interpretation of Randomized Controlled Trials With Statistically Nonsignificant Results for Primary Outcomes. JAMA. 2010;303(20):2058–64.

  12. Huang S, Martin LJ, Yeh CH, et al. The effect of an infographic promotion on research dissemination and readership: A randomized controlled trial. CJEM. 2018;20(6):826–33.

  13. Barlow B, Barlow A, Webb A, Cain J. “Capturing your audience”: analysis of Twitter engagements between tweets linked with an educational infographic or a peer-reviewed journal article. J Vis Commun Med. 2020;43(4):177–83.

  14. New England Journal of Medicine. Visual abstracts. https://www.nejm.org/multimedia/visual-abstracts. Published 2020.

  15. Chiu K, Grundy Q, Bero L. “Spin” in published biomedical literature: A methodological systematic review. PLoS Biol. 2017;15(9): e2002173.

  16. Mathieu S, Giraudeau B, Soubrier M, Ravaud P. Misleading abstract conclusions in randomized controlled trials in rheumatology: comparison of the abstract conclusions and the results section. Joint Bone Spine. 2012;79(3):262–7.

  17. Patel SV, Chadi SA, Choi J, Colquhoun PH. The use of “spin” in laparoscopic lower GI surgical trials with nonsignificant results: an assessment of reporting and interpretation of the primary outcomes. Dis Colon Rectum. 2013;56(12):1388–94.

  18. Bero L. Addressing Bias and Conflict of Interest Among Biomedical Researchers. JAMA. 2017;317(17):1723–4.

  19. van de Leemkolk FEM, Schurink IJ, Dekkers OM, et al. Abdominal Normothermic Regional Perfusion in Donation After Circulatory Death: A Systematic Review and Critical Appraisal. Transplantation. 2020;104(9):1776–91.

  20. Spinner CD, Gottlieb RL, Criner GJ, et al. Effect of Remdesivir vs Standard Care on Clinical Status at 11 Days in Patients With Moderate COVID-19: A Randomized Clinical Trial. JAMA. 2020;324(11):1048–57.

  21. Fleischer DM, Greenhawt M, Sussman G, et al. Effect of Epicutaneous Immunotherapy vs Placebo on Reaction to Peanut Protein Ingestion Among Children With Peanut Allergy: The PEPITES Randomized Clinical Trial. JAMA. 2019;321(10):946–55.

  22. Scott H, Fawkner S, Oliver CW, Murray A. How to make an engaging infographic? Br J Sports Med. 2017;51(16):1183–4.

  23. Caulley L, Cheng W, Catalá-López F, et al. Citation impact was highly variable for reporting guidelines of health research: a citation analysis. J Clin Epidemiol. 2020;127:96–104.

Acknowledgements

 Dr Ferreira, Dr Cashin, and Dr Zadro are supported by a National Health and Medical Research Council (NHMRC) Emerging Leadership fellowship (APP2009808, APP2010088 and APP1194105).

Funding

None.

Author information

Affiliations

Authors

Contributions

Giovanni Ferreira: conceptualisation, investigation, formal analysis, supervision, project administration, methodology, writing – original draft. Mark Elkins: conceptualisation, methodology, writing – review & editing. Caitlin Jones: investigation, formal analysis, writing – review & editing. Mary O’Keeffe: conceptualisation, methodology, writing – review & editing Aidan Cashin: conceptualisation, investigation, writing – review & editing. Rosa Becerra: investigation, writing – review & editing. Andrew Gamble: investigation, writing – review & editing. Joshua Zadro: conceptualisation, investigation, writing – review & editing. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Giovanni E. Ferreira.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Appendix A.

Fields of medicine and health research. Appendix B. List of included journals and number of infographics included from each journal. Appendix C. Coding instructions.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Ferreira, G.E., Elkins, M.R., Jones, C. et al. Reporting characteristics of journal infographics: a cross-sectional study. BMC Med Educ 22, 326 (2022). https://doi.org/10.1186/s12909-022-03404-9

Keywords

  • Medical education
  • Infographics
  • Information Dissemination
  • Visual abstracts