Effective Presentation of Your Evaluation Results: What, So What, Now What

Deborah Simpson, PhD, is Director of Education, Academic Affairs, Advocate Aurora Health, Adjunct Clinical Professor of Family Medicine, University of Wisconsin School of Medicine and Public Health and Medical College of Wisconsin, and Deputy Editor, JGME

Janet M. Riddle, MD, is Research Assistant Professor of Medical Education, University of Illinois Chicago College of Medicine, and Associate Editor, Journal of Graduate Medical Education (JGME)

Dorene F. Balmer, PhD, is Associate Professor of Pediatrics, Perelman School of Medicine, University of Pennsylvania

Corresponding author: Deborah Simpson, PhD, Aurora Health Care, deb.simpson@aah.org, Twitter @debsimpson3

The Challenge

Your program evaluation team has completed its evaluation of a comprehensive diversity, equity, and inclusion (DEI) workshop series for program residents and faculty. The workshops were designed to foster dialogue about DEI and to build skills, including ways to address microaggressions. You are now scheduled to preview the evaluation results individually with your program director and your vice chair for DEI, and then to present the full report at the next combined resident/fellow and faculty meeting. Each of these stakeholders has a different perspective, and their availability to meet varies from 15 to 45 minutes; the challenge is how to present the evaluation results to each of these groups.

Rip Out Action Items

Reporting the results of a program evaluation must explicitly consider how to:

1. Align the results with the original evaluation questions and stakeholders' inputs.

2. Make it actionable: evaluations are conducted to inform decisions.

3. Adapt the report to the stakeholder audience and present it using multiple formats and media.

4. Follow up to ascertain changes associated with the evaluation.

What Is Known

Systematically designed evaluations yield information about the value of a program, project, or initiative to inform key stakeholders' decisions regarding the program (eg, continuation, revision, expansion). Typically, evaluation reports include the results of the evaluation (“What”), interpretation of the results (“So What”), and recommendations for continuing and improving the educational activity (“Now What”). How evaluation findings are communicated directly influences how stakeholders understand and react to the data, and ultimately their decisions. Evaluators use both comprehensive reports and targeted presentations to address stakeholders' information needs—evidence that matters to them. Targeted presentations can be as short as an “elevator pitch” or as involved as a visual abstract.1 Using data visualizations (graphic or pictorial formats) for quantitative results (graphs, pie charts, diagrams) enables decision-makers to quickly grasp difficult concepts or identify new patterns.2 Qualitative data can be effectively presented through word clouds, photos, and quotations.
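As a concrete illustration of the quantitative point, the sketch below (Python with matplotlib) builds the kind of simple pre/post bar chart an evaluation team might show stakeholders. The workshop names and ratings are hypothetical placeholders, not findings from the DEI series in this scenario.

    import matplotlib.pyplot as plt
    import numpy as np

    # Hypothetical mean self-ratings (1-5 scale) before and after each workshop.
    workshops = ["Dialogue skills", "Microaggressions", "Inclusive teaching"]
    pre_ratings = [2.8, 2.5, 3.0]
    post_ratings = [4.1, 3.9, 4.3]

    x = np.arange(len(workshops))   # bar group positions
    width = 0.35                    # width of each bar

    fig, ax = plt.subplots(figsize=(6, 4))
    ax.bar(x - width / 2, pre_ratings, width, label="Pre-workshop")
    ax.bar(x + width / 2, post_ratings, width, label="Post-workshop")

    ax.set_ylabel("Mean self-rated confidence (1-5)")
    ax.set_title("DEI workshop series: self-rated confidence")
    ax.set_xticks(x)
    ax.set_xticklabels(workshops)
    ax.set_ylim(0, 5)
    ax.legend()

    fig.tight_layout()
    fig.savefig("dei_workshop_ratings.png", dpi=150)  # drop into a slide or snapshot

A side-by-side pre/post layout like this lets a stakeholder with 15 minutes see direction and magnitude of change at a glance, without reading a table.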

How You Can Start TODAY

What: Return to your evaluation questions and evaluation standards. Use the questions generated by key stakeholders to organize your presentation. Consider framing the results by levels of data (reaction, learning, behavior, results). Highlight the key findings for all groups, then focus on areas of interest to each stakeholder. Make sure your findings are accurate and your recommendations are useful; demonstrate integrity by differentiating results from opinion.

So What: Make your report actionable. Make sure recommendations are relevant, useful, aligned with stakeholder values, and actionable.

Now What: Adapt your report to the stakeholder audience. Presentations and reports may differ by stakeholder group. Do you want to inform or persuade? Choose written or verbal reports that best tell the “evaluation story.”2 Use simple, plain writing. Include quotations, specific examples, and/or a case study. Metaphors give statistics practical impact and build the story. Use data visualizations.2 Evaluation snapshots (short summaries) or storybooks3 are useful formats for quick review of key points. Emphasize data specific to your stakeholders' input(s) during the planning stage (Figure). Presentations by resident members of your evaluation team can be powerful.
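For the qualitative side, a word cloud can be generated in a few lines. This minimal sketch assumes the third-party Python wordcloud package (pip install wordcloud); the free-text comments are hypothetical examples, not actual evaluation data.

    from wordcloud import WordCloud  # third-party: pip install wordcloud

    # Hypothetical free-text comments pooled from workshop evaluations.
    comments = """
    respectful dialogue practical scenarios safe space microaggressions
    bystander skills faculty engagement more time for discussion
    practical scenarios respectful dialogue inclusive language
    """

    # Word size in the image reflects how often a term appears in the pooled comments.
    cloud = WordCloud(width=800, height=400, background_color="white").generate(comments)
    cloud.to_file("dei_workshop_comments.png")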

Figure. A Visual Abstract

Now What: Reinforce use of evaluation results. Seek to present the findings in multiple forums to reinforce messages. Identify who needs or wants to see or hear the evaluation findings. Consider existing forums, such as regularly occurring meetings (eg, residency curriculum committee, clinical competency committee, graduate medical education council) and new forums.

What You Can Do LONG TERM

What: Maintain an evaluation master file. Include all the various presentations and reports your team has created. Consider using your evaluation findings to populate the annual program evaluation or self-study and your sponsoring institution's documents.

So What: Revisit your evaluation report in a year. Check the utility of your evaluation with each stakeholder group. Was action taken based on the report? Were your recommendations followed?

Now What: Consider disseminating your evaluation as scholarship. Your evaluation results might contribute to a larger conversation about learning in graduate medical education. Learn about how program evaluations are best presented4 or consult with an evaluation expert.

Resources

1. Ibrahim AM. A Primer on How to Create a Visual Abstract. https://journals.lww.com/journaloftraumanursing/Documents/VisualAbstract_Primerv1.pdf. Accessed January 25, 2021.

2. National Cancer Institute. Making Data Talk: A Workbook. https://www.cancer.gov/publications/health-communication/making-data-talk.pdf. Accessed January 25, 2021.

3. Evaluation Toolbox. How We Ran a Behaviour Change Pilot Program and the Lessons We Learnt. http://evaluationtoolbox.net.au/docs/C500eBook_Mar09.pdf. Accessed January 25, 2021.

4. Li ST, Klein MD, Balmer DF, Gusic ME. Scholarly evaluation of curricula and educational programs: using a systematic approach to produce publishable scholarship. Acad Pediatr. 2020;20(8):1083–1093. doi:10.1016/j.acap.2020.07.005.