Monday, February 05, 2007

Measuring Training Effectiveness - Looking Beyond 'Impression Management'

A few years back, on a wet and cold December afternoon, someone asked me, "Anthony, is asking the participants how they feel about the facilitator, the ambience of the venue, the quality of the refreshments, and the delivery of the content a good way to help me determine whether I should continue running the programme with a particular vendor?"

Without a thought, I said yes. "After all, isn't this how the industry reports on the 'Return on Investment' of the workshops, training courses and development programmes it conducts for its clients?"
A month later, I was not so sure.
I think there is another way to do this - a way that looks beyond 'impression management' and focuses on performance; a way that moves from blaming the trainers for a failed programme to examining the failure holistically, including the role the organisation has to play in enabling the transference of learning.
So, on another wet and cold afternoon, I pulled my enquirer aside and told him, "You have to ask whether specific physical takeaways, psychological outcomes and quick-wins have been delivered by the end of the workshop." I then provided examples to expound on the concept of 'Quality of Outcomes':

In Physical Takeaways, I, as a participant in a programme, ask if the programme delivered the intended breakthroughs. Breakthroughs are defined as tangible (hence physical) outcomes (the takeaways) that come from applying innovative approaches learnt during the programme to create solutions that overcome my current problems.

In Psychological Takeaways, I reflect on whether my mindset and psyche have changed because of the programme. The programme should affect me in such a way that I become cognitively, affectively, and psychomotorically positive about the skill set learnt and the knowledge acquired during my engagement with the trainer.

In Quick-wins, I explore if the 'work-in-progress' or 'WIPs' could be used elsewhere. During the programme, I could have created or developed initial but unintended creative ideas and innovative solutions; these WIPs are additional outputs on top of the physical takeaways, and such quick-wins could be used as solutions to other problems.

As I was developing these concepts, I was also studying Dr. Donald Kirkpatrick's four levels of evaluating training and found them to be a useful addition to my framework. His approach suggests a positive relationship between the time elapsed since the programme and the strength of the competency being measured.

In Reaction, I, as a Training Manager, want to know the initial reactions participants have to the programme.

In Learning, I wish to uncover the extent to which participants change their attitudes as a result of a change in their body of knowledge and skill set.

In Behaviour, I wish to observe the extent to which the participants change their behaviour at their workplace.

In Results, I wish to know the final outcome that occurs as a result of the changes in the participants' attitudes and behaviours at their workplace.
By embedding Kirkpatrick's Four Levels of Evaluation into my three Categories of Performance in the Quality of Outcomes in Training, I produced a 3 x 4 = 12-grid Model of Measuring Returns on Training Investment, which helps me focus my thinking on the kinds of evidence I should be collecting over time to determine if the programme has delivered the intended value to the organisation.
These 12 grids are (see the sketch after this list):

Expectation – Collect evidence to determine the difference between the expectations participants brought to the programme and what they actually received at the end of the engagement.
Sample Question: How well do the skill sets and knowledge I acquired immediately after the programme meet what I expected of it after reading its promotional materials and programme outline?
Inquisition – Collect evidence to gauge how 'hungry' the participants are to learn more about the skill sets and knowledge taught in the programme.
Sample Question: Am I keen to find out more about the topics covered in the programme?

Opportunistic – Collect evidence to determine if the participants have been thinking about the application of the skill sets and knowledge covered in the programme.
Sample Question: Have I been thinking of where at my workplace I could apply the skill sets and knowledge I have obtained from the programme?
Discovery – Collect evidence to measure the amount of awareness and insights the participants obtained from the programme.
Sample Question: How much awareness have I gathered, and how many insights have I gained, by the end of the programme?

Exploration – Collect evidence to find out if the participants have some ideas where to apply the skill sets and knowledge learnt.
Sample Question: Do I have an idea or two about where I could apply the skill sets and knowledge acquired at the programme?
Set-up – Collect evidence to determine if the participants have been using the skill sets and knowledge at the workplace or at home even before the programme ends.
Sample Question: Am I able to kick-start the application of the skill sets and knowledge at the workplace or at home even before the programme comes to an end?
Instrumentation – Collect evidence on how frequently the skill sets and knowledge are used at the workplace.
Sample Question: How often and how extensively are the skill sets and knowledge I acquired from the programme deployed at my workplace?

Subconscious – Collect evidence to uncover how conscious the participants are when they are applying the skill sets and knowledge at the workplace.
Sample Question: Am I aware that I am applying the skill sets and knowledge I acquired at the programme while at the workplace?

Success – Collect evidence to identify the kinds of initial successes and failures arising from the application of the acquired skill sets and knowledge at the workplace.
Sample Question: Are there initial successes I can report after reaching the first two milestones of a task to which the skill sets and knowledge were applied?

Performance – Collect evidence that links the outcomes of a task to the participants who attended the programme.
Sample Question: Could my footprints be found in the performance of the organisation to which I belong because of the awareness and insights gained from the programme I attended?

Advocating – Collect evidence that shows the participants who attended the programme are now 'marketing' the usefulness of the programme and the skill sets learnt.
Sample Question: Have I been sharing the skills and imparting the knowledge to my superiors, peers, and subordinates?

Adaptation – Collect evidence to uncover the number of bigger successes arising from other tasks after the initial successes at the beginning of the original tasks.
Sample Question: Have a number of these successes snowballed into bigger projects with more management support?
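To make the grid concrete, here is a minimal sketch, in Python, of how the twelve grids could be held as a simple lookup table for filing evidence over time. The article does not map each grid to a specific (category, level) cell, so this sketch keeps the grids as a flat list; the record_evidence helper and the sample note are purely illustrative.

    # The three Categories of Performance and Kirkpatrick's four levels
    # that together form the 3 x 4 grid.
    CATEGORIES = ["Physical Takeaways", "Psychological Takeaways", "Quick-wins"]
    LEVELS = ["Reaction", "Learning", "Behaviour", "Results"]

    # The twelve grids, as named in the article.
    GRIDS = [
        "Expectation", "Inquisition", "Opportunistic", "Discovery",
        "Exploration", "Set-up", "Instrumentation", "Subconscious",
        "Success", "Performance", "Advocating", "Adaptation",
    ]

    # Evidence collected over time, keyed by grid name.
    evidence = {grid: [] for grid in GRIDS}

    def record_evidence(grid, note):
        """File a piece of evidence under one of the twelve grids."""
        if grid not in evidence:
            raise ValueError("Unknown grid: " + grid)
        evidence[grid].append(note)

    # Hypothetical example: evidence of frequency of use at the workplace.
    record_evidence("Instrumentation",
                    "Participant applied the techniques twice this week.")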
"Of course, you have to talk to your trainers/facilitators and get the 'reaction' and 'learning' requirements worked out before the delivery of the programme. You also have to take care of the supervisors and managers who sent the participants to the training. They need to know how to deploy and support them after the programme so that they could reach their 'behaviour' and 'result' levels at their workplace.
It is because of these concerns that I also developed, in 2005, an idea called IMPACTs, or the Information Management for Performance in Action System, to measure the performance level of my partners more holistically.
There are three components in IMPACTs. Besides the Quality of Outcomes described above, they are the Quality of Delivery and the Quality of the Partnership.
Quality of Delivery is the effort put into placing the essential elements (or Key Success Factors) that support learning throughout the learning duration.
What is 'essential' is contextual, but the following example provides a good way of visualising the kinds of statements to be created and included in this segment of the evaluation (a small checklist sketch follows the example):
For a simple workshop to help teams create innovative solutions:
  • Before the workshop - Conduct a Team Management Profile to determine if the team has all the required task preferences to perform innovatively. In addition, check the team's current stage of Team Development to identify potential challenges its dynamics pose to learning at the training programme (An example of a Quality of Delivery statement: Has the training vendor identified challenges in learning potentially faced by the participants?).
  • During the workshop - Conduct a series of small team building activities to create relatedness and to bring the team to the performing stage (An example of a Quality of Delivery statement: Has the training vendor conducted team development activities to sustain the performance level of its members?).
  • After the workshop - Participants' managers and supervisors monitor each participant's roadmap and milestones for implementing the learning obtained at the training course (An example of a Quality of Delivery statement: Are the participants armed with a skill transference roadmap and related milestones to support the application of the knowledge and skill set at the workplace?).
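Here is one way to hold such statements, again as an illustrative Python sketch: the statements are grouped by phase and answered as simple yes/no checks. The delivery_score helper and the idea of averaging the answers into a single fraction are my own assumptions, not something the article prescribes.

    # Quality of Delivery statements from the workshop example,
    # grouped by phase of the programme.
    DELIVERY_CHECKLIST = {
        "before": ["Has the training vendor identified challenges in "
                   "learning potentially faced by the participants?"],
        "during": ["Has the training vendor conducted team development "
                   "activities to sustain the performance level of its members?"],
        "after":  ["Are the participants armed with a skill transference "
                   "roadmap and related milestones?"],
    }

    def delivery_score(answers):
        """Fraction of checklist statements answered 'yes' (True)."""
        flat = [ok for phase in answers.values() for ok in phase]
        return sum(flat) / len(flat) if flat else 0.0

    # Hypothetical evaluation: two of the three statements satisfied.
    print(delivery_score({"before": [True], "during": [True], "after": [False]}))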
In addition to Quality of Delivery, I have also introduced a measurement to determine the Quality of the Partnership I have with my long-term suppliers/partners. Here, I consider the following to help me decide if I want to continue working with them in the future (a simple scoring sketch follows this list):
  • Level of Involvement - The amount of effort the partner contributes beyond what is specified in the contract.
  • Degree of Accommodation - The amount of stretch the partner has achieved to increase the value of the contract vis-a-vis your contractual dollar.
  • Willingness to Collaborate - The amount of effort the partner introduces to strengthen the propensity of producing successful outcomes for you.
  • Preparedness to Share - The amount of new intellectual capital created and shared with your stakeholders.
  • Commitment to Co-exist - The degree to which the partner's operations are locked into your business system.
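As with the delivery checklist, these five criteria can be rated and combined. The 1-to-5 scale and the plain average in this Python sketch are illustrative assumptions; the article names the criteria but does not prescribe a scale or weighting.

    # The five Quality of the Partnership criteria from the article.
    PARTNERSHIP_CRITERIA = [
        "Level of Involvement",
        "Degree of Accommodation",
        "Willingness to Collaborate",
        "Preparedness to Share",
        "Commitment to Co-exist",
    ]

    def partnership_quality(ratings):
        """Average an assumed 1-5 rating across the five criteria."""
        missing = [c for c in PARTNERSHIP_CRITERIA if c not in ratings]
        if missing:
            raise ValueError("Missing ratings for: " + ", ".join(missing))
        return sum(ratings[c] for c in PARTNERSHIP_CRITERIA) / len(PARTNERSHIP_CRITERIA)

    # Hypothetical vendor review: a uniform rating of 4 across the board.
    ratings = {c: 4 for c in PARTNERSHIP_CRITERIA}
    print(partnership_quality(ratings))  # 4.0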
Your suppliers/partners may struggle with you, as this is a very radical approach to measuring returns on training, but you have to work consistently with them to see it through. I faced resistance when I first introduced these ideas in 2004, when I was the Manager at the MINDEF Innovation and Transformation Office. "But they eventually understood the importance of measuring training effectiveness," I added at the conclusion of that conversation on that wet and cold afternoon.
This article was first written on 2 May 2007, subsequently updated on 10 May and 11 Nov 2008, and revised on 26 Mar 2009.
Copyright 2007, 2008 and 2009. Anthony Mok. All rights reserved.
