Boosting Performance on a Budget: Tips for Your Learning Team
At the recent eLearning Guild Learning Solutions conference, Humentum team members Gus Curran and Mark Nilles were joined by Paige Winn of member organization FHI 360 to lead a session called "Boost Performance on a Budget: Tips for Innovative Learning and Development." They share their key ideas from that presentation here.
The idea for the session came at the 2017 Learning Solutions conference. After attending several sessions where learning and development professionals primarily from the private sector commented that their limited budgets prevented them from doing everything they’d like to do, the three of us had a hallway discussion about what we do to make the most of limited budgets.
In our sector, we use free and low-cost tools out of necessity all the time, and we find creative ways to deliver effective solutions without breaking the bank. So, we wanted to share some tools, resources, approaches, and concepts at this year’s conference.
We thought you might like to see what we discussed during the session in case you’re also looking for ways to make the most of your limited learning and development budgets. Although they are somewhat artificial divisions, we organized our session into three phases of learning and development work: design, delivery, and evaluation.
We looked at maximizing budgets when approaching design in three different ways. First, by exploring whether the expense of building training is going to deliver the best solution for the problem. Second, by leveraging existing content to meet training needs. And third, by starting with a minimum viable product when a new or untested approach is required.
In 2017, the Association for Talent Development (ATD) estimated that it takes an average of 38 hours to develop one hour of classroom training. ATD also estimates that one hour of finished online learning content takes 42-143 hours to produce, depending on complexity. This means that one hour of finished eLearning content costs $8,880-$28,640 ($18,760 on average) to produce. Whether face-to-face or online, training requires extensive investment.
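For readers who like to check the arithmetic, the dollar figures above are internally consistent: the quoted average is the midpoint of the quoted range, and both endpoints imply a development cost of roughly $200 per hour of work. (The per-hour rate is our inference from the article's own numbers, not a figure ATD publishes.) A minimal sketch:

```python
# Sanity-check of the ATD cost figures quoted above.
# The implied hourly rate below is derived from the article's numbers,
# not an official ATD statistic.

low_cost, high_cost = 8_880, 28_640    # cost range for one hour of eLearning ($)
low_hours, high_hours = 42, 143        # development hours per finished hour

# The quoted average is simply the midpoint of the quoted range.
average_cost = (low_cost + high_cost) / 2
print(average_cost)                    # 18760.0 -- matches the $18,760 in the text

# Both endpoints imply roughly the same cost per development hour (~$200).
print(round(low_cost / low_hours))     # ~211
print(round(high_cost / high_hours))   # ~200
```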
Yet, when a project or unit experiences a challenge, the first conversation with the learning and development team is often a request to provide “training.” But training isn’t the only answer, and is certainly not the answer to everything. It’s important to examine the problem and then determine the appropriate intervention. (You wouldn’t want your doctor to prescribe treatment before completing a diagnosis of issues leading to your problem!)
To determine what intervention is appropriate, FHI 360 refers to their Learning Standards to design solutions with outcomes in mind. The Learning Standards approach consists of a step-by-step needs analysis that examines the current state, the desired state, and the systems and processes in place to support the intervention, whether training or something else. The steps in the Learning Standards approach are to:

1. Clarify the problem.
2. Construct an analysis plan.
3. Collect data on current and desired states.
4. Analyze data to reveal gaps.
5. Validate the origins and causes of gaps.
6. Prescribe solutions for each gap.
7. Share findings and results with stakeholders.
FHI 360 also offers a Learning Design and Delivery course that leads to an internal certification for those with a facilitative, instructional design, or learning role. This approach has saved money, connected learning solutions more closely to the work, and provided valuable and relevant professional development opportunities.
Second, we talked about the importance of curation, recycling, and repurposing existing learning content. Often, we seek to create solutions to problems that other people have already solved, thinking that our circumstances are unique and others’ learning or training solutions are not relevant to our audience. But the research shows that designing new training takes considerable time, energy, and expense.
We can cut design and development costs down considerably by curating, recycling, and repurposing content. Oftentimes, there is no need to reinvent the wheel when it comes to design. There are open resources you can borrow from, either online or at your own organization. Does that TED Talk explain a key point you’re trying to get across? Is there a YouTube video that demonstrates a topic better than you can? Link to it!
It is also important while designing to think about how you can make your content flexible enough to be repurposed in the future. If you are going to invest the money and time in creating your own content, look for opportunities to make modules and job aids flexible.
Finally, we discussed an approach that requires careful planning but a low initial financial investment, and can be especially effective for something new or untested: Minimum Viable Product. A minimum viable product is not something that is hastily put together or done with less thought and consideration—quite the contrary. Minimum viable product requires a thoughtful, intentionally iterative approach.
The minimum viable product is a means to an end that is user-ready; it is basic but functional. It is designed as the first iteration available to end users; as such, it upholds important standards but requires less financial investment than full production. It's important that users of a minimum viable product benefit from the offering but also recognize the future potential of further development. Establishing metrics from the beginning will help you understand whether, and where, it's worth investing more resources in further development. To that point, the minimum viable product can be a proof of concept that helps you convince senior management that additional investment is the right decision.
Next, we turned to the delivery of learning. We first asked participants to share the free and low-cost tools that they regularly use (collecting and sharing input with the interactive tool Mentimeter). We then shared some of our favorite tools.
And finally, we discussed the most pernicious and beguiling aspect of our work: evaluation. We encouraged participants to discuss challenges and solutions before sharing a few ideas.
One evaluation methodology we discussed is Robert Brinkerhoff’s Success Case Method. This is an evidence-based storytelling approach that allows you to explore the following questions for any intervention, including training:
- What is really happening?
- What results, if any, is the program helping to produce?
- What is the value of the results?
- How could the initiative be improved?
This is a lean but powerful approach. It won't tell you the number or percentage of participants who have been successful, but the information that comes out of success cases can be very enlightening. The trick, as with all evaluation, is to build feedback loops that allow you to act on the information you've gathered.
Another recommended resource is Will Thalheimer's excellent book on workshop evaluation, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form. Anyone seeking meaningful feedback following a learning event would benefit from Thalheimer's perspective and approach.
So that's what we shared at the conference. As you might have gathered, we focused on improving performance, not just training, because many times training is not the answer. But when training is the answer, there are free and low-cost ways to design, deliver, and evaluate it (and to convince senior leadership to invest in proven concepts). What tools, resources, and ideas help you improve performance among colleagues, partners, and beneficiaries?