Educational outcome measurement (EOM) is a creative process. The ultimate goal is to develop a practical plan to measure changes in CME participants' knowledge, competence, or performance, or even the health of their patients. However, each CME activity comes with a unique set of challenges: specialized content that doesn't fit traditional assessment methods, diverse learning populations that won't be engaged by a one-size-fits-all assessment approach, and/or CME participants who are too busy to respond to assessments. Negotiating these issues requires a creative mindset, one that maintains empirical integrity through evidence-based research methods while responding to the practical realities of everyday CME. Most importantly, the end result of the EOM plan has to make sense. EOM may produce pages of data and statistical tests, but that doesn't matter if it can't answer simple questions: How effective was this CME activity overall? How does its effectiveness compare with that of other CME activities? How do the results inform the development of future CME?
I have 10+ years of experience in EOM, which means a lot of mistakes and enough successes to establish an approach focused on clearly communicated results derived from creative applications of traditional evidence-based methods. Most of my evaluations address outcomes at Level 2 (satisfaction) through Level 5 (performance). My EOM reporting centers on summary measures of effectiveness that allow CME supporters and providers to quickly gauge the overall effectiveness of a CME activity. Recognizing that each contact with a CME participant is another educational opportunity, I also favor EOM methods that reinforce educational messages while evaluating their impact on participants.
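As one illustration of a summary measure of effectiveness, consider a standardized effect size such as Cohen's d computed from pre- and post-activity assessment scores (the approach argued for in the CE Measurement article below). The sketch and the scores are hypothetical, not taken from any actual evaluation:

```python
# Illustrative sketch: Cohen's d as a single summary measure of a CME
# activity's effect on participant knowledge. Scores are hypothetical
# percent-correct values for the same participants before and after.

from statistics import mean, stdev

def cohens_d(pre, post):
    """Effect size for pre/post scores, standardized by the pooled SD."""
    n1, n2 = len(pre), len(post)
    pooled_var = ((n1 - 1) * stdev(pre) ** 2 + (n2 - 1) * stdev(post) ** 2) / (n1 + n2 - 2)
    return (mean(post) - mean(pre)) / pooled_var ** 0.5

pre_scores = [55, 60, 50, 65, 58, 62, 54, 59]
post_scores = [70, 75, 68, 80, 72, 77, 66, 74]

d = cohens_d(pre_scores, post_scores)
print(f"Cohen's d = {d:.2f}")
```

Because d is expressed in pooled standard-deviation units, the same benchmarks (roughly 0.2 small, 0.5 medium, 0.8 large, by Cohen's conventions) apply across activities, which is what makes it useful for comparing one CME activity with another.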
You can learn more about my approach to EOM by following this blog or reviewing some of my related publications (see "Selected Publications" below). If you'd like to discuss, call Jason Olivieri at 312-833-8294 or send an email to firstname.lastname@example.org. I provide a variety of services, from EOM training to full-service assessment and reporting.
- Olivieri JJ, Regala RP. Improving CME: Using Participant Satisfaction Measures to Specify Educational Methods. J Contin Educ Health Prof 2013;33:146-7. (abstract)
- Olivieri J, Harris D, Dietze D, Viereck C. A Call for Calculating Effect Size in CME Educational Outcome Measurement. CE Meas 2012;6:22-25. (abstract)
- Olivieri JJ, Bidus K. Case Report: Web-based Commitment-to-Change Evaluation of an Annual Continuing Medical Education Conference. CE Meas 2009;3:6-8. (abstract)
- Deutsch ES, Olivieri JJ, Hossain J, Sobolewski HL. Medical simulation topic interests in a pediatric healthcare system. Simul Healthc 2010;5:289-94. (abstract)
- Olivieri JJ, Knoll MB, Arn PH. Education format and resource preferences among registrants of a pediatric-focused CME website. Med Teach 2009;31:e333-7. (abstract)
- Knoll M, Olivieri JJ. Audience-specific needs assessment: using a gap analysis survey of CME conference registrants to assess presentation content. J Contin Educ Health Prof 2008;28:284-5. (abstract)