How to make sure your learning initiatives (rather than advertising, market trends or operational leadership) are having a real impact on business outcomes
By Mike Freel, Ph.D., Senior Research Associate, Bellevue University’s Human Capital Lab, and Bonnie Beresford, VP Client Services, Capital Analytics, Inc.
Any organization worth its salt has a real interest in developing its employees. Most companies employ several people in the training and development department to ensure that employees receive appropriate education and skill-building opportunities. Members of the training and development team assess the work environment and evaluate the need for improving the knowledge, skills and abilities of employees. They know intuitively whether the programs they offer are effective and get at the needs of the organization – particularly its bottom-line performance.
Let’s say you’re a member of that team, and one day you receive a call from the CEO who asks: “How do we know that our training is having the desired impact on the organization?” Are you prepared with the right answer? Of course you are. You’ve taken all of the appropriate steps to ensure that you’ve effectively measured the impact of learning on your organization.
In this case we’re talking about quality issues stemming from your production areas. Quality is something all of us can relate to. You prepare your report for the CEO, culminating with the good news that quality has increased two percent across all production areas. The CEO is happy. Operations and production managers are happy, and you continue to offer the training as you have been over the past several months.
But did the program really have the desired effect? Did it really affect all employees the same? Two percent isn’t a lot, but when it comes to production quality, any improvement flows straight to the bottom line as increased profit. But, again, did the program really do what it was designed to do?
Not many organizations are equipped to conduct in-depth analyses of their learning interventions, let alone connect learning to business outcomes. A more thorough assessment may indicate that the two percent average was actually not an across-the-board improvement. In fact, there was an eight percent increase among baby boomers, a two percent increase among Gen Xers, and a two percent decrease among Millennials. Suddenly, the impact of learning takes on a whole new light!
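The arithmetic behind that masking is easy to sketch. The cohort sizes below are invented for illustration (the article doesn’t give them); the point is that a headcount-weighted average can report the overall two percent gain while hiding an outright decline in one segment:

```python
# Hypothetical quality-change data by generational cohort.
# The percentage changes follow the article's example; the
# headcounts are invented so the weighted average works out.
cohorts = {
    "Baby Boomers": {"headcount": 60, "quality_change_pct": 8.0},
    "Gen X":        {"headcount": 50, "quality_change_pct": 2.0},
    "Millennials":  {"headcount": 90, "quality_change_pct": -2.0},
}

total_headcount = sum(c["headcount"] for c in cohorts.values())

# Headcount-weighted average change: the single number a summary report shows.
overall = sum(
    c["headcount"] * c["quality_change_pct"] for c in cohorts.values()
) / total_headcount

print(f"Overall quality change: {overall:+.1f}%")  # +2.0%
for name, c in cohorts.items():
    print(f"  {name}: {c['quality_change_pct']:+.1f}%")
```

The summary line alone tells the CEO the program worked; only the per-cohort breakdown reveals that one audience segment got worse.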
What do you do now? After all, you did have a two percent increase in quality. But can you leave it at that? Would you just focus on your more “tenured” employees? Would you forget about the younger ones? After all, they weren’t contributing to the quality improvement. Or, does this kind of information offer the opportunity to reconstruct and redeliver a quality program?
If the latter is true for you, you’ve just become part of your organization’s strategic operations team and not just someone in the training department who lectures employees on the importance of improving quality. An eight percent increase with baby boomers means something really clicked with them. Was it the delivery method? Maybe they value quality more than the others. Or maybe there were more baby boomers working in the production areas than any other generational group. What about the Millennials? Don’t they care about quality? Maybe your facilitation was slow and they “checked out.” Are you actually wasting time and money training them…or are you losing money?
This kind of analysis contributes to the evaluation of the impact of learning on business outcomes. More importantly, it allows organizations to make informed decisions about how learning initiatives may be developed so that they target specific measures of performance. In this sense, learning becomes a strategic tool and a quantifiable, strategic contributor to the bottom line.
On the surface, a two percent increase looks great. But as we’ve seen, measurements of the impact of learning on business outcomes can be misleading. You expect that the learning initiative being offered to employees will positively affect business outcomes. Without a deeper dive into the data, you’re missing a tremendous opportunity to drive that two percent even higher. Human capital analytics provide the depth of analysis that is useful to you and the CEO in deciding how to allocate – and reallocate – scarce resources.
So how do you capture actionable analytics? How can you dive deep enough into your organization to make sure that your learning initiatives are having an impact on the business outcomes and that it’s not the advertising, market trends, or operational leadership that’s responsible? This is difficult for most organizations to do. Identifying the necessary data and synthesizing the data across the many information systems can be time consuming. In fact, you probably aren’t even aware of all of the various data systems owned by your organization. Don’t worry. You’re not alone.
In order for actionable analytics to function as a strategic tool, there has to be a way of identifying what factors you want to measure, how those factors are measured, and who is collecting the data. This is where a key performance indicator (KPI) stakeholder meeting contributes to the measurement picture. The purpose of this cross-functional meeting is to bring together the right players, those who have a stake in the outcomes of your training. Most importantly, this is the group that will articulate what is really important to the organization and how they, the business operations people, keep score. The goal is to identify the specific KPIs that contribute to the business outcomes and then determine how your program impacts those KPIs.
When considering KPIs, think of batting averages. Mark Whiteside, CFO for the Defense Acquisition University in Washington, D.C., recently offered a compelling explanation for KPIs. What’s your batting average? In baseball, batting average is a pretty good indicator of whether or not someone is a good batter. You don’t need much more than that. “When we see it, we know immediately if it’s good or bad; if someone is excelling or not,” said Whiteside. For a player, this might be their ultimate “business outcome” metric. For the team, however, the end goal is to win the World Series. Here, your team’s batting average might be a good leading indicator, but it’s not your outcome metric.
Key performance indicators, much like batting averages, offer a measure of the performance of your business. Changes to KPIs are ultimately reflected in bottom-line results. Whiteside notes that organizations sometimes overestimate the number of KPIs that contribute to outcomes. In any given organization, a few core factors drive the majority of its success. It’s important to know which of those factors your program is trying to improve and how your organization measures them. These critical factors may include turnover and retention, productivity, and, as mentioned above, quality. The common problem for organizations isn’t identifying what data to measure; it’s where to find the data.
The HR department is often the first stop when looking for data. HR typically has the demographic information on employees, annual review scores, education level, and even training history. That’s all good data to have, but where is the business data? That’s where the real organizational performance metrics are. Where is the productivity data? What about quality? Who has that? Where’s the data on time to ship or call handling time? What about length of stay? Once you start looking for data, you’ll quickly learn that the data is there, but it’s all over the place!
So where are the hurdles and who can help you get over them? Let’s revisit the KPI meeting. These folks are the champions, the keepers of the keys. They may not have direct access to data sources in their respective departments, but they know who does. Gaining access to these people will help you over the hurdles to obtaining the data. Engaging the right people at the outset and building the right relationships will better position you for access to their scorecards, dashboards, and related data.
It’s probably not that easy. There are several organizational barriers to obtaining the necessary data for actionable analytics. Cultural issues are common within most organizations. For example, you probably have a pretty good idea of whether or not your organization values learning and employee development. Some organizations invest huge amounts of money in their employees. They often have well-funded tuition assistance programs and in-house development opportunities, and they tie learning into employee development plans and annual reviews. These are the organizations that view human capital as a strategic differentiator: an investment and a tool for organizational growth and improved market share. Other organizations see employee learning as a cost, not an investment in growth. Data collection in these organizations may be tougher, not because the data isn’t available, but because the organization doesn’t see human capital as a strategic differentiator or as contributing to performance. Simply put, they won’t understand why you want the data.
Some data is in your HRIS and LMS, sure, but the business data will be available in your sales tracking database, the call center’s call tracking system, shipping and receiving spreadsheets, the finance and accounting system, your project management system, your inventory control system, and more.
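As a minimal sketch of what that synthesis looks like, the snippet below joins HRIS records to production-quality records on a shared employee ID — the kind of cross-system linkage these analyses require. All field names, IDs, and numbers here are hypothetical; real systems will have their own schemas and will usually need more careful matching.

```python
# Hypothetical records from two separate systems.
hris_records = [
    {"emp_id": 101, "cohort": "Baby Boomer", "completed_training": True},
    {"emp_id": 102, "cohort": "Millennial",  "completed_training": True},
    {"emp_id": 103, "cohort": "Gen X",       "completed_training": False},
]

quality_records = [
    {"emp_id": 101, "defect_rate_before": 5.0, "defect_rate_after": 4.2},
    {"emp_id": 102, "defect_rate_before": 5.0, "defect_rate_after": 5.1},
    {"emp_id": 103, "defect_rate_before": 4.8, "defect_rate_after": 4.8},
]

# Index the production data by employee ID, then join it to the HR
# attributes so outcomes can be sliced by cohort and training participation.
quality_by_id = {r["emp_id"]: r for r in quality_records}

joined = []
for hr in hris_records:
    q = quality_by_id.get(hr["emp_id"])
    if q is None:
        continue  # employee missing from the production system
    joined.append({**hr, **q})

for row in joined:
    delta = row["defect_rate_after"] - row["defect_rate_before"]
    print(row["cohort"], f"{delta:+.1f}")
```

The join itself is trivial; the hard organizational work described above is discovering which system holds each record and getting permission to pull it.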
What about political barriers? The truth is that political agendas can often prohibit the collection of data. For example, let’s imagine that during the KPI meeting, critical data sources that track the business outcomes were identified. However, some of the data is located in a small system in a satellite office. The keeper of the data in that small office has maintained that data for several years. It’s her data and now you want her data. Why? What for? If you didn’t include the keeper of the data in the strategic discussions about actionable analytics, how likely do you think she’ll be to share that information? This example, again, stresses the importance of thoroughness when selecting the people you need to engage with from the outset.
Actionable analytics significantly contribute to the depth and quality of decision making within an organization. As organizational learning professionals are increasingly tasked with demonstrating the impact of learning on business outcomes, the criticality of human capital analytics becomes more evident. Analytics provide the basis for proving and improving the investment in human capital.
You proved that the learning initiative increased quality by two percent. So what if you asked that next question: “Is it increasing everyone’s quality?” Would you be prepared to use actionable analytics to increase the degree to which the learning initiative affected quality? What would you do with the money that you invested in a learning initiative that had no measurable effect on a segment of your audience? Will you be content with an overall improvement? Or, will you take the opportunity to redesign or redeploy for even greater impact?
Can you afford not to ask the question?
This article originally appeared under the title ‘How Analytics Can Impact Learning’ in Innovation@Work, a publication of the Human Capital Lab at Bellevue University. More information at www.humancapitallab.org/