    How to Demonstrate ROI to Avoid Slashing the Training Budget

    We’ve all probably seen it: When the IT organization is directed to cut costs, one of the first things to go is the training budget, and learning and development fall by the wayside. The reason is simple: Training tends to be seen as a cost, rather than as an investment.

    It’s difficult to fault business executives for that. Any investment needs to be tied to business objectives and deliver business results, and the return on investment of learning and training can be murky at best. But according to Patti and Jack Phillips, CEO and chairman, respectively, of the ROI Institute, it doesn’t have to be that way.

    In their new book, “The Business Case for Learning: Using Design Thinking to Deliver Business Results and Increase the Investment in Talent Development,” the Phillipses discuss how to demonstrate learning and development ROI to even the most skeptical business executive. Here’s an encapsulation of their eight-step process:

    Start with why: Aligning programs with the business. We’ve heard “start with the end in mind” many times, but the end is not having great programs that participants see as valuable; the end is improving important business measures in the organization. The business measure is the “why” for the program. Whether the payoff need is to accelerate growth, improve a poor safety record, fix inadequate customer service, or something else, it needs to be expressed as a specific measure. The task is to identify the specific business measure that must change so that the program delivers business value.

    Make it feasible: Selecting the right solution. With the business need clearly defined, the next step is to decide on the solution to improve the business measure. What should target employees start doing, or stop doing, that will have the appropriate influence on the business measure? This sometimes requires only a few questions. In other situations, additional analysis is needed, using techniques such as problem-solving, brainstorming, fishbone diagrams, records review, focus groups, nominal group technique, and others, to understand what’s causing the business problem or, if the need is an opportunity rather than a problem, what’s keeping the business measure from being where it should be.

    Expect success: Designing for results. A major outcome from the needs assessment is a clear definition of the objectives. The objectives define the success that’s needed at each level. At the payoff level, the ROI objective is the minimum acceptable return on investment. At the business impact level, it’s the minimum amount of business improvement required to be successful. At the application level, it’s the minimum level of action: the use of knowledge, skills, tools, and processes on the job. At the learning level, it’s the minimum amount of knowledge and skill that must be learned. And at the reaction level, it’s the minimum level of perceived value. Specific objectives are important to the success of the program, and they define the expectations for everyone, from content designers and developers to facilitators, participants, and managers of the participants.

    Make it matter: Designing for input, reaction, and learning. The key to making it matter is to develop a program with content that is relevant, meaningful, important to the individuals and the organization, action-oriented, and something they will use. This requires prospective participants to decide whether this is the right program for them, and to make sure they are attending at the right time and in the right place, with the proper support. It also helps the developers provide examples, activities, and exercises that reflect not only what the participants are learning, but what they will do with what they’ve learned and the impact it will have.

    Make it stick: Designing for application and impact. The reality is that if people don’t use what they’ve learned, then the program has failed for the organization. Just because participants learned something doesn’t necessarily mean they will actually use it. Unfortunately, much research continues to show that a lot of what is learned in formal talent development programs is not used on the job; some studies put this percentage in the range of 60 percent to 90 percent. That is a huge indictment of the learning profession: so much of the budget is wasted because participants don’t use what they have learned. Transfer of learning is a process that occurs over time and involves all the stakeholders. Yet there are some very simple things a company can do that have a big impact. For example, research shows that it takes only about 30 to 60 seconds for a participant’s manager to support the transfer by having a brief discussion to set expectations before the participant attends the program, and another brief discussion when the participant returns to reinforce those expectations and offer support. Making it stick is not as difficult as it seems.

    Make it credible: Measuring results and calculating ROI. This step can be one of the most rewarding parts of the process. The first action is to isolate the effects of the learning program from other influences; simple, easy-to-use techniques are available for this, and it’s where you (and others) clearly see the connection of the program to business measures. If the evaluation is needed at the ROI level, three more actions are required: the impact measures are converted to money, the fully loaded costs are tabulated, and the ROI is calculated. This can be accomplished with fourth-grade mathematics. Net benefits are program benefits minus costs, and the ROI is the net benefits divided by the fully loaded program costs, expressed as a percentage. This formula is essentially the same as the ROI for capital investments. For example, when a firm builds a new plant, the ROI is developed by dividing annual earnings by the investment; the annual earnings are comparable to net benefits (annual benefits minus the cost), and the investment is comparable to fully loaded program costs, which represent the investment in the program. The challenge is to overcome the barriers to moving to this level of evaluation, and to evaluate at this level only when programs are expensive, important, strategic, and attract the interest of top executives. The principal barrier is fear of results, and it should be tackled in a very proactive way. If a program is not successful, you need to understand why it’s not working and correct it. If you are proactive, your stakeholders will accept this easily, even if the results are very negative; but if you wait to be asked for the impact or ROI, you are at a disadvantage. Approach this with a mindset of process improvement, making programs better even when they’re not delivering the desired results.
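    To make that arithmetic concrete, here is a minimal sketch of the calculation using made-up numbers purely for illustration; the figures and the function name are not from the book.

```python
# Minimal sketch of the ROI arithmetic described above. The figures are
# hypothetical; in practice the benefits come from converting impact
# measures to money, and the costs must be fully loaded (analysis,
# development, delivery, participants' time, and evaluation).

def roi_percent(program_benefits: float, fully_loaded_costs: float) -> float:
    """ROI as a percentage: net benefits divided by fully loaded costs."""
    net_benefits = program_benefits - fully_loaded_costs
    return net_benefits / fully_loaded_costs * 100

benefits = 200_000   # hypothetical monetary benefits attributed to the program
costs = 80_000       # hypothetical fully loaded program costs

print(f"Net benefits: ${benefits - costs:,}")        # Net benefits: $120,000
print(f"ROI: {roi_percent(benefits, costs):.0f}%")   # ROI: 150%
```

    In this hypothetical case, an ROI of 150 percent means that every dollar invested in the program is recovered, plus another $1.50 in net benefits.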

    Tell the story: Communicating results to key stakeholders. The presentation of results can range from executive briefings to blogs. The content can range from a detailed report to a one-page summary. The important point is to tell a story with results. Storytelling is very effective, and it’s the best way to get the audience’s attention and have them remember the results. The outcome data represent a compelling story with very credible, executive-friendly evidence and anecdotes.

    Optimize results: Using black box thinking to increase funding. It is helpful to think about the power of the evaluation completed in the previous steps. The results are there, and you know what caused success or failure; if the results are disappointing, you know how to correct them. Black box thinking is needed at this step. In the airline industry, black boxes help investigators pinpoint the cause of a crash. Investigators analyze the data with the goal of preventing the accident from occurring again, and the analysis usually reveals the cause and identifies the actions to be taken to prevent that type of accident in the future. Learning and talent development professionals can take the same approach. In this final step, programs are evaluated and the data are used to make them better. When this happens, results improve, and ultimately the ROI is enhanced.

    A contributing writer on IT management and career topics with IT Business Edge since 2009, Don Tennant began his technology journalism career in 1990 in Hong Kong, where he served as editor of the Hong Kong edition of Computerworld. After returning to the U.S. in 2000, he became Editor in Chief of the U.S. edition of Computerworld, and later assumed the editorial directorship of Computerworld and InfoWorld. Don was presented with the 2007 Timothy White Award for Editorial Integrity by American Business Media, and he is a recipient of the Jesse H. Neal National Business Journalism Award for editorial excellence in news coverage. Follow him on Twitter @dontennant.

