Measuring impact is a perennial hot topic and a source of anxiety for most of us in learning and development. It has been our profession's Achilles heel for as long as I can remember.
But to fully address the skills gap and meet the needs of our organizations and learners in the future, it is important to communicate our impact story to stakeholders. Without doing so, the efforts of L&D in upskilling and reskilling may be wasted.
In this podcast episode, I speak with L&D experts Kevin M. Yates, Laura Paramoure, and Bonnie Beresford about measuring impact and the practical steps you can take to make your L&D strategy a planned and meaningful one.
Read on to learn about measuring the impact of L&D solutions, the common mistakes professionals make, and recommended tools for demonstrating how your learning solutions contribute to business goals.
To begin, we need to clarify what we mean by measuring impact: a shared definition is crucial for assessing how effectively our learning solutions contribute to business goals.
Luckily, our experts, Bonnie, Laura, and Kevin, can help you out.
As Bonnie explains, when learning and development professionals talk about impact, they mean business impact: business metrics.
“We're striving to show that our training or learning interventions can impact some business metric,” she says. “And to do that, you've got to get into the business to get that data.”
But in Bonnie’s experience, organizations tend to make the mistake of overcomplicating the process of analyzing business metrics. They often believe that they should measure the impact of their entire department, instead of focusing on one program at a time and determining its intended impact.
“Measure that,” Bonnie says, “then if you want to add them all together, go for it. But don't try to boil the ocean because sales have different KPIs than manufacturing or customer satisfaction.”
Next, Laura adds that L&D's impact on business metrics comes from behavior change.
“We've got to deliver on the metrics,” she says, “but for us to deliver on the metrics, we need to understand what we're actually doing in learning, which is changing the behavior in each individual department to be able to change errors or productivity.”
“And if we understand the behavior we're impacting, we can correlate that directly to show how those behaviors change those metrics.”
And third, Kevin adds the dimension of performance impact: the extent to which learning measurably influences behavior, actions, and performance.
“What we're looking to do, through training and learning performance support solutions, is measurably influence human performance.”
“Not only are we looking for the extent to which learning has changed behavior, actions, or performance, but there are also times when training and learning is a great solution to maintain or sustain performance so that it doesn't dip,” he says.
So, with our shared understanding of what measuring impact means, where do we usually go wrong?
As L&D leaders, Kevin explains, our biggest struggle as a community is that we tend to take a postmortem approach when measuring the impact of our learning interventions.
"Our biggest struggle as a community is that we tend to take a postmortem approach when measuring the impact of our learning interventions." - Kevin M. Yates
“If you don't plan for measuring impact in the beginning, you'll have a hard time measuring it at the end. There needs to be intention, specificity, and purpose upfront,” he says. “Before anything is launched, consumed, and utilized, you have to be clear about where you expect to impact people's performance and business results.”
Start by gathering the data that informs decisions for learning solutions that measurably impact human performance and business results. Then, you will be in a great position to measure the impact of learning and training.
Bonnie adds that if you stumble upon good results in postmortem fashion, it's hard to show the efficacy of your program because you cannot demonstrate that you predicted that impact. It's far easier to lay out from the beginning that if you run this training, people's behaviors will change, and you will then see evidence of that change in their performance.
And Laura agrees. “When we communicate what we expect to happen as a result of the learning solution, we start to shift the mindset and our reputation from simply reacting to the change versus driving the change.”
"When we communicate what we expect to happen as a result of the learning solution, we start to shift the mindset and our reputation from simply reacting to the change versus driving the change." - Laura Paramoure
Kevin explains that making measurement a forethought in your L&D strategy starts with having very different conversations with your business partners.
“Here's the reality,” he says. “Those kinds of conversations are not easy because it's a shift in our mindset and our business partnering stakeholder mindset.”
Historically, training, learning, and talent development teams have been seen as fulfillment centers, but Kevin explains that we now need to initiate a different kind of conversation: asking questions about business goals, the performance required to achieve those goals, and the KPIs that reflect human performance.
And in Laura’s experience, what works is speaking your business partner’s language.
“If we can bridge the gap between our conversation of objectives and all of those things into categories that business partners understand, that will help us to get even clearer on those upfront conversations,” she says.
Bonnie provides an example of how these new conversations can be implemented in real-life situations. She recounts her experience working with an organization on improving its leadership development.
“Their goal was to have everybody live the leadership values. And when I asked the senior executives who designed the program what that looked like, they could not articulate what they really wanted their people to do,” she says.
So, the conversation delved into the individual departments because the desired impact would look different in manufacturing than in sales. “It was an uncomfortable discussion until they had their a-ha moment and realized we were making progress, but it took a while to get there,” Bonnie explains.
"It was an uncomfortable discussion until they had their a-ha moment and realized we were making progress, but it took a while to get there." - Bonnie Beresford
The best solution Laura has found is building the ability to measure outcomes directly into the learning program itself.
“If you create a blueprint outlining your objectives and the metrics,” she says, “you can get a corresponding assessment to those objectives. We have ROI by design, we call it measurable instructional design, and the only way to measure it is with a direct connection between assessment and objective.”
“So, if my objective is laid out such that I have the conditions of performance and the criteria for performance, then I can create a corresponding assessment that hits that,” Laura explains. “If you have a tool that does that, then if your objective is not met you can go back and look at the corresponding evaluations to see where it failed.”
In Bonnie’s experience, workplace systems often capture evidence of the desired behaviors as people do what they have been asked to do; this evidence could come from your CRM system’s activity logs or a performance support app.
And Kevin has some good news because some of what you need to measure impact is already collected as data. “We just haven't leveraged it,” he says. “Because if we're using business metrics to signal the effectiveness and impact of training and learning, the business is already measuring what's important to them.”
“If those data points already exist somewhere in the business, we don't have to reinvent the process. We can partner with whoever owns those metrics to understand how that business metric is performing,” Kevin explains.
"If those data points already exist somewhere in the business, we don't have to reinvent the process. We can partner with whoever owns those metrics to understand how that business metric is performing." - Kevin M. Yates
As the discussion concludes, Kevin, Bonnie, and Laura have some expert tips to get you started on showing your impact.
First, Kevin explains that you need to plan for measuring impact from the very beginning.
“Be proactive and intentional before you design anything. Be curious. Have conversations that are discovery-based,” he says. “Talk about performance, outcomes, and metrics already in place that signal human and business performance.”
Next, Bonnie echoes Kevin’s theme of curiosity and recommends starting with something easy and tangible to measure.
“You've got to be curious in this business,” she explains. “Start with something easy and more tangible to measure. Start small and mine the data, and you might be surprised what you find as you start segmenting it.”
And finally, Laura finds that one of the biggest challenges is the fear of data and the feeling that we cannot measure our own work.
“So, I would challenge learning professionals to hold themselves accountable. Be curious and ask, ‘What impact am I making?’ Recognize that other people are out there doing this. It is being done, so it can be done,” she says. “Get your mindset into a place where you know you don't just have to take what the organization's giving you.”
Thanks to our panelists Bonnie, Laura, and Kevin for sharing their insights and experience with us! Keen to learn more from L&D experts? Check out our episode with Michelle Parry-Slater about actionable steps and resources for L&D experts or Navid Nazemian on how learning and development can impact C-suite attrition rates.
Want more peer insights on transforming workplace learning? Sign up to become a member of the L&D Collective, and check out our other #CLOConnect interviews with top L&D leaders on driving growth and scaling culture through Collaborative Learning. Or you can subscribe (below 👇) to our weekly newsletter to receive our latest posts directly in your inbox.