Collaborate to Evaluate

In my last blog, I talked about my personal introduction to the world of evaluation with the New World Kirkpatrick Model. Having learnt all about the Four Levels Model and how to effectively apply this theory to display return on expectations, I was really eager to learn more about the other evaluation methodologies out there.

The second programme I attended took a much more quantitative view of evaluation than I had experienced with the Kirkpatrick Certification, although the Phillips Return on Investment (RoI) programme had a similar 2-day, learning-then-doing format. I’d never thought about learning and development from a purely financial view before; for me, it’s very much about soft skills acquisition, building a team and growing as an organisation, so it was interesting to see it all broken down into numbers and financial returns.

For some areas and in certain industries, RoI works really well and can be a great way to justify that you are getting back the money you’re spending on development. One problem I found with it, though: RoI can’t really capture the inherent value of customer service, branding, and all those other things that are not as easily quantifiable as ‘time saved’. It’s these intangibles that, for me, are the basis of Learning and Development - that essential foundation, not particularly quantifiable but nonetheless vital to success, to up-skilling, and ultimately to the bottom line.

An interesting way to look at successful RoI is through the classic call centre training environment - think more efficient phone calls, time saved, all that good stuff. In a situation like this, there are plenty of things that can be measured, and these things all cost money, so you can do an effective RoI comparison of how much the training has cost vs. the amount of money saved or the increase in revenue (or hopefully, both).

Okay, so let’s think about call centres for a minute. Living in an era of technology, getting to speak to a real person is an increasingly difficult task. And by the time you do get through… you might be a little less than cheerful. So imagine the relief when you’re put through to someone who is just incredibly good at their job. Friendly, helpful, knowledgeable, going the extra mile so that you leave that call with an overwhelmingly positive image of the organisation. You’d mention that to people, wouldn’t you? They might be looking for a new service provider. They remember your conversation and decide to go with the same company as you, because they can be assured that if something goes wrong, it will be resolved. They have a positive experience themselves. They mention it to people in their own circles. And, just like that, it snowballs into a fantastic reputation and great corporate success.

Now, that organisation may well have spent massive amounts of money on training its call centre staff. And it is true that part of the return on this is financially quantifiable efficiency: each employee can now answer more calls, save more time, make the company more money and provide a return on investment for the training. The effect of that training on overall organisational branding, however, isn’t quantifiable. But it is nevertheless massively important. And it can make you a lot of money.
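To put the quantifiable side into perspective: the standard Phillips calculation is simple arithmetic - net programme benefits expressed as a percentage of programme costs, often alongside a benefit-cost ratio. A minimal sketch in Python, using entirely made-up call centre figures:

```python
def phillips_roi(benefits: float, costs: float) -> float:
    """Phillips RoI (%): net programme benefits as a percentage of programme costs."""
    return (benefits - costs) / costs * 100

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR: total programme benefits divided by programme costs."""
    return benefits / costs

# Hypothetical figures: £40,000 spent on call centre training,
# £100,000 in measurable benefits (time saved, extra calls handled).
print(f"RoI: {phillips_roi(100_000, 40_000):.0f}%")   # RoI: 150%
print(f"BCR: {benefit_cost_ratio(100_000, 40_000)}")  # BCR: 2.5
```

The formula is the easy part; the hard part, as above, is deciding which benefits you can credibly put a number on in the first place.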

This was really my main niggle with the Phillips RoI programme, and the gap that the Kirkpatrick programme filled for me: it was great learning how to mathematically demonstrate RoI in certain circumstances, but sometimes you need to justify your learning and development programmes by stepping back and taking a look at the bigger picture. Training and subsequent evaluation should be a consultative process; there should be a collaborative approach to ensuring the right solution is selected and delivered so that there is a genuine return on expectations. For me, interesting as it is to add a mathematical element to evaluation, using RoI exclusively massively underplays the roles of consultation and collaboration in effective evaluation.

Do you evaluate your training?

Which evaluation methodologies deliver for you - and why?

Get in touch, I'd love to hear more about your own experiences of evaluation and evaluation training programmes!


Comments

  • Hi Rosie - Hi Lucy,

    Great conversation.  I missed this back in July so let me catch up now!

    Lucy - I've spent loads of time in call centres and I love the fact that measuring both quality and quantity is quite straightforward. The fact that you can listen in and score any call at any time gives a great quality measure, as does the quantitative nature of the several KPI reports you've probably got.

    Rosie - Mystery shopping is great, isn't it? I work in the casino business and we've had to do a lot of work on this. Basically, anyone playing on a table who has no experience tends to stick out, and when you couple this with the inquisitive questions they are briefed to ask, it's so easy to spot a mystery shopper. Our reception team would spot them a mile off and almost send an alarm call around the building to alert staff to be on their best behaviour! Nowadays, our approach is very different. We use mystery shoppers who have expertise in gaming, so they blend in just nicely. We've also toned down the inquisitive questions that they ask!

    I think the trick in both cases - call centres and environments that use mystery shopping - is to provide the feedback. Most call centres have this well covered now with coaching and feedback sessions on call quality. Too often I hear staff in our current organisation saying they don't get feedback on what our mystery shop and customer survey reports say. My view is to create the most stunning, visually appealing noticeboard in the staff room with this feedback on it. But I guess it would be wise to praise in public and criticise in private!

    How do you share the results of your evaluations in your organisations?

    Ady

  • Hi Rosie,

    Thanks for your feedback, I agree that it would be really interesting to see how different organisations measure learning and implementation at level 3. I think that how best to measure behaviour change might partly depend on the industry an individual is working in? Obviously soft-skill learning is much harder to measure and track change in than hard-skill learning, but it's arguably more important, because there isn't always that clear yes/no with soft skills.

    360s are really useful for this, but I agree, only if they're done in a confidential manner so you don't have to sift out the 'what they want to hear' reactions.

    It's definitely a tricky one! And something worthy of a bit of trial and error until you find the right way to measure level 3 in your specific organisation - hard as behaviour change is to measure, to me it's among the most important parts of evaluation, so worth all the effort!

    Lucy

  • Hi Lucy

    This is a really interesting blog post. I'm currently researching ways to evaluate at level 3. We currently deliver face-to-face training to a national population (while we are based in one place). The core roles are customer-facing and we do sporadic level 3 evaluation through Survey Monkey and mystery shops.

    Mystery shops do provide feedback from a real-life situation and, with the right framework, can give a clear picture of what learning has been adopted, but... it is just a snapshot in time. Is the behaviour consistent? It is also very hard to select the 'right' person to mystery shop, so it can be expensive for little result.

    Survey Monkey does allow 360 follow-up, but it can be completed in a 'what they want to hear' fashion.

    I too would really like to hear how other people are evaluating the learning in terms of financials but also in expectations.

    Rosie

