
3 Reasons L&D Fails to Measure Impact (And How to Get It Right): An Interview with Kevin M. Yates

For L&D leaders, making a planned impact remains a persistent challenge.

Even though we are in a mature profession, we still don’t know how to predictably and reliably achieve the desired impact of our training solutions. After all, if we’d cracked it, we wouldn’t still be talking about it.

So, we’re kicking off our mini-series on measuring impact with Kevin M. Yates, author of the eBook ‘L&D Detective Kit for Solving Impact Mysteries,’ to talk about planning for a measurable difference instead of leaving it to chance.

Read on to hear why aligning learning solutions with business goals from the get-go matters, the three reasons L&D still struggles to measure impact, and what data you should chase to prove the ROI of your training interventions.

First up, why are we still struggling to measure impact?

3 reasons L&D still fails to measure impact

For almost 40 years, measuring impact has been a consistent point of discussion, yet L&D continues to struggle with it.

Even though we’ve moved the needle a little, Kevin explains there are three main reasons why learning and development struggles to prove impact.

First is a lack of alignment between our learning strategies and business goals.

1. Lack of alignment

L&D’s learning solutions must intentionally align with performance requirements to help organizations achieve their goals.

“So, if our learning solutions aren't aligned, then they aren't going to produce the types of outcomes that we can measure to the extent they have fulfilled their purpose for measurably influencing human performance and business outcomes,” Kevin says.

Related: The Bigger Picture: Mirjam Neelen's 4 Tips on Aligning Stakeholders When Pivoting to Performance L&D

2. Trying to measure the impact of too many things

There are some learning solutions where impact measurement may be optional.

“We don't need to measure impact for everything because we don't have endless resources, talent, and time,” Kevin explains. “Trying to measure the impact of all the learning solutions in our catalogs is an unrealistic expectation.”

Kevin recommends measuring those training solutions that are specifically and intentionally designed to produce a performance outcome critical for business success.


3. Scarcity of talent or expertise for impact analytics

Finally, L&D faces the evergreen challenge of finding the talent, expertise, and time needed to focus on impact analytics and measurement.

“If you don't have someone devoted to that work, it's going to be difficult to get measurement to take root as part of the L&D foundation,” Kevin explains. “It will be difficult if you don't have someone like me on the team focused on creating and executing the measurement strategy.”

So, how do we get measuring impact right? It all starts with alignment.

Related: The Early Adopter Phase is Over: An L&D Data and Analytics Deep Dive With Trish Uhl

Alignment: Starting with the end in mind

As Kevin explains, proving impact starts with having the end in mind, meaning aligning our training and learning solutions with business goals and having a performance-first mindset. 

Alignment is a conversation with business partners and stakeholders to identify the following:

  1. The organization’s goals and the risks to performance.
  2. The performance requirements that help achieve those goals.
  3. The gap between where performance is and where it needs to be to achieve those goals.

So, where can you start when shifting to a performance-first mindset?

Related: Busting L&D Measurement Worries With Kevin M. Yates, Laura Paramoure, and Bonnie Beresford

See yourself as a performance consultant

If a stakeholder approaches you and requests training, Kevin recommends acknowledging the request and then steering the discussion with a performance-first mindset.

“With a performance-first mindset, you're seeing yourself as a performance consultant first and an L&D practitioner second,” he explains. “By doing so, you're going to engage in a conversation that helps the business partner shift their thinking.”

“And if we lead with performance as the foundation for the conversation,” he says, “we're going to be in a much better position to deliver training and learning solutions aligned to business goals.”


Armed with organizational alignment and a performance consultant mindset, you’ll be well on your way to proving impact. But what else do you need to consider? The answer: data.

Related: Did That Training Really Work? Measuring the Impact of L&D with Kevin M. Yates

What data to chase to show impact

When Kevin considers what data to chase, he thinks about performance, performance, performance: every day, all day.

When measuring performance, you should look for data showing how people behave and use their skills and capabilities to achieve business goals.

Kevin’s advice for determining what data to chase is twofold:

  1. Start with a bullet-point list: Write down the behaviors and skills people need in their role to help them and the business succeed. This list is your North Star.
  2. Design measurable learning solutions: Design training solutions with intention and specificity to influence those bullet points. Now you have clear guidance for the outcomes your learning solutions should produce.

“In other words, you're going to measure the extent to which training and learning fulfilled its purpose,” Kevin explains, “and purpose fulfillment is impact.”

Measuring impact is difficult but possible

In Kevin’s experience, measuring the impact of your learning solutions is difficult but possible. 

“The reality is that there will be times when we can’t measure the impact,” he says. “Does that mean that our training solutions are inefficient? Does that mean that there's no value? No, it doesn't mean that at all.”

As Kevin explains, it can mean one of two things: 

  1. There are times when the complexity of the situation rules out the opportunity to measure impact.
  2. There are instances where we don't have the opportunity to create learning solutions designed for measurable impact.

“The best case scenario is one where we can have a performance-based discussion upfront about measurable outcomes and then design training and learning toward that goal,” says Kevin.

Thanks to Kevin for sharing his experience and insights with us! Keen to learn more from L&D experts? Check out our episode with John Tomlinson about the significance of people-centered learning and development or with Lila Warren about her approaches to impacting performance with learning solutions delivered at the point of need.

Want more peer insights on transforming workplace learning? Sign up to become a member of the L&D Collective, and check out our other #CLOConnect interviews with top L&D leaders on driving growth and scaling culture through Collaborative Learning. Or you can subscribe (below 👇) to our weekly newsletter to receive our latest posts directly in your inbox.
