Evaluation and Dissemination of Measurable Education Outcomes
Evaluating the results of your Evidence-Based Practice (EBP) initiative and sharing them widely can greatly increase the impact of your research. Debra Liebig, M.L.A., B.S.N., RN-BC, program manager in the Accreditation and Readiness Department at Children’s Mercy Kansas City, and Cathleen Opperman, DNP, RN, NEA-BC, CPN, nursing professional development specialist at Nationwide Children’s Hospital, shared suggestions for disseminating those results as widely as possible.
Prepare the baseline, and use it
“Estimate the savings and the cost before implementing,” Opperman said. “That’s one thing I learned early on in the EBP world; we get so eager when we see what the evidence says is a better practice and we want to jump in there and start meddling it up with some version of our implementation.”
Establishing a clear baseline for your desired measurements at the start of the process is critical: it lets you monitor progress, pitch initiative ideas to stakeholders (with estimates), evaluate overall results, and select which outcomes to monitor throughout your initiative.
Evaluate the outcomes of change
Some of the outcomes of your implementation may require calculations such as those in the article linked above, but they can be summarized for easier understanding. Preliminary research revealed few previous works exploring the use of return on investment (ROI) methods in hospitals. Potential outcomes of that work included:
- More publications with economic impact reported.
- More comfort calculating financial impact for professional development activities.
Once you have a clear understanding of the quantitative and qualitative results of your effort, it’s time to share that knowledge.
Share, share, share
The results of your efforts hold value only if you share them and allow others, both inside and outside your organization, to benefit from them. Opperman stressed the importance of sharing any economic assessment made during the process, including calculations and results, to help combat the current lack of published information in this area.
“It could be even the simple cost per participant calculation. Then the next organization who needs to implement can use your numbers to help make their changes and the next time you have to make a change and somebody has published a number for you to work off of, you can go to the stakeholders with it. Whether it’s published, podium or poster, please give those calculations.”
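The calculations Opperman mentions can be quite simple. As an illustration only (the article does not give formulas, so the function names and figures below are hypothetical assumptions), cost per participant and a basic ROI percentage might look like this:

```python
# Illustrative sketch; numbers and function names are hypothetical,
# not from the article or any published assessment.

def cost_per_participant(total_cost: float, participants: int) -> float:
    """Total program cost divided by the number of participants."""
    return total_cost / participants

def roi_percent(benefit: float, cost: float) -> float:
    """Simple ROI: net benefit over cost, expressed as a percentage."""
    return (benefit - cost) / cost * 100

# Hypothetical education program: $12,000 total cost, 60 participants,
# $18,000 in estimated savings (e.g., reduced turnover or errors).
print(cost_per_participant(12_000, 60))  # 200.0
print(roi_percent(18_000, 12_000))       # 50.0
```

Even a single number like the $200 cost per participant above, published alongside how it was derived, gives the next organization a baseline to work from.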
Opperman and Liebig shared three strategies for dissemination, all of which can be used both within the organization and the larger medical community.
Publishing articles in relevant industry publications allows colleagues to digest the information at their own pace and increases the chances that those actively searching for related information will find it.
The Education Committee for the Association of Nursing Professional Development (ANPD) ROI research sub-committee team responsible for this initiative published an award-winning two-article series and a third follow-up article in the Journal for Nurses in Professional Development (JNPD). Opperman also contributed a chapter on the subject to the ANPD Core Curriculum.
Creating flyers and posters to raise staff awareness within your organization and sharing with peers at industry conferences can help gain interest and create collaboration opportunities.
Whether in person or virtual, presenting and speaking about your project and results allows a more personal connection with the audience than printed communication. The ANPD team presented a series of webinars from 2016 to 2019, hosted pre-conference workshops at the ANPD Convention in 2017 and 2018, and presented sessions at several conferences.
For all of these methods, the numbers and how you got them are critical pieces of information.
“After all of these literature searches we have done, we only found 16 articles published over 14 years that have demonstrated economic impact for educational programs in hospital settings,” Opperman said. “It’s kind of sad when we do so many, so we need more numbers to work with.”