11 Jan – KnowledgePool launches white paper on how to evaluate learning outcomes
KnowledgePool, the managed learning services company, has launched a white paper outlining a new approach which learning & development practitioners can use to assess their training portfolio and measure the impact of their learning interventions.

Getting the Value out of Evaluation draws a distinction between evaluating the activity of training delivery and measuring the outcomes of the learning – the value or business benefit. It explains a practical and cost-effective way to achieve higher levels of evaluation, in the context of an enterprise-wide managed learning service.

“Current evaluation practice tends to measure factors such as the achievement of course objectives, the trainer’s delivery skills, the support materials and the venue,” said Kevin Lovell, Learning Strategy Director at KnowledgePool. “Most organisations don’t have the resources or time to evaluate learning outcomes and they don’t know how to demonstrate the benefits of learning effectively. This paper outlines a standard methodology which organisations can use to conduct behavioural change and business benefit evaluations – that is, Kirkpatrick levels 3 and 4 – for any learning intervention. It narrows the gap considerably between level 1 ‘reactionnaires’ at one extreme and a quantified return on investment at the other.”

The paper lists ten questions which can be used to evaluate learning outcomes.
These focus on the transfer of learning to the workplace and the business benefit, as indicated by individual performance improvement across seven generic areas, balanced with anecdotal evidence.

KnowledgePool piloted the new evaluation approach with 1,000 learners who undertook different training courses in ten organisations. The white paper provides benchmark statistics against which organisations can compare their own evaluation results. The benchmark figures show that the average performance improvement score for a training course is 35%; 70% of learners transfer their learning to the workplace; 53% of learners get line manager support to apply their learning; and 25% of learners neither apply their learning nor get line manager support.

“These results provide an acceptably accurate statement of what happens when learners return to work after training,” said Kevin Lovell. “Using modern training administration technology, organisations can automate the issue and collation of evaluation data. This can help them measure how learning is impacting on their organisational performance and whether courses are delivering above-average or below-average levels of performance improvement.”

KnowledgePool acknowledges that the evaluation results are based upon the perceptions of the learners. However, the paper explains how the suggested approach uses narrative – the descriptive words of learners – to supplement the statistical data.

“Learners are often clear about whether a particular intervention helped them to do their job better,” said Kevin Lovell. “By asking them specifically about this, you can filter out other factors which may have contributed to their performance improvement. Several months after the training, people can usually tell you how and where the learning helped to improve their performance – or not, as the case may be.”

Click here to get your free copy of Getting the Value out of Evaluation.
“Line managers play a crucial role in facilitating the transfer of learning to the workplace,” said Kevin Lovell. “A key question we ask is how much learners feel their line manager supports them in this endeavour. Where that support is evident, the transfer of learning to the workplace is much higher and that translates into much greater performance improvement.”

For more information, please call KnowledgePool on 0870 320 9000 or e-mail email@example.com.