Humentum offers hundreds of training and learning opportunities to development and humanitarian relief professionals around the world each year. An important goal for our work is to help professionals do their work better, and we would like to think our training and learning offerings support that goal. But do they?
This is not an easy question to answer, but it is an important one. In fact, because training and learning offers tremendous growth opportunities but also requires a significant investment of time and resources—both by Humentum and participants—it's perhaps one of the most important questions we need to answer.
Asking participants to rate the value of a course at the end of a learning event doesn't provide a complete answer. However, course evaluations can offer valuable insights into the relevance of the content, the quality of the training delivery, and the likelihood that the course will lead to results in the workplace.
With that in mind, last year Humentum overhauled its approach to workshop evaluations. With Will Thalheimer's book "Performance-Focused Smile Sheets" as a key resource, we created an approach to evaluations that offers participants clear ways to give direct and meaningful feedback and provides us with rich data that allows us to build on strengths and identify areas for improvement.
The evaluation is deployed electronically, which eliminates paper and streamlines data analysis. Participants receive the evaluation by email on the final day of the training and are encouraged to respond before leaving. Participants who haven’t completed the evaluation are sent a reminder one week later, emphasizing our interest in their feedback so we can deepen our understanding of our learning offerings.
One month after the workshop ends, participants are asked to complete a follow-up evaluation. In that evaluation, we check on their current understanding of the concepts discussed in the course, whether they’re using what they learned in the course, which aspects of the course have proven most useful on the job, and other benefits of workshop participation. We also take the opportunity to encourage participants to continue putting knowledge, skills, tools, and techniques from the workshop into action at work.
This new approach to our workshop evaluations—from better questions to a more efficient delivery system—has improved our understanding and deepened conversations about what’s working well and what can be improved and how.
It has also prompted us to consider how to improve evaluations for events and offerings beyond training and learning. Work is underway to apply what we learned from building and implementing the new workshop evaluations to our other events, offerings, and engagements.
Here's an illustrative sample of two approaches to asking an overall 'satisfaction' question. The second option offers clearer answer choices and opens a richer dialogue about strengths, areas for improvement, and next steps.
Interested in learning how your organization can improve its approach to training and learning evaluation or apply similar techniques to non-training aspects of your work? Do you disagree that the second question option is better than the first? Reach out to me at email@example.com.