
Webinar Q&A | Using Design Thinking for Designing Learning

Design is design no matter what you’re doing. When it comes to designing learning experiences, applying Design Thinking principles is a no-brainer. Design Thinking is all about getting in the end user’s shoes—the learner! So, why aren’t more learning professionals applying this method? Lack of time, lack of understanding, and lack of identifying the actual problem to solve all come to mind. However, it’s 2018! There is no excuse!

During a recent webinar, I walked attendees through several Design Thinking principles to help jump-start their year by putting the learner at the center of interventions. If you missed the webinar, the recording is available here: Design Thinking to Design Learner Experiences. By the end, you will see how designing learning experiences and Design Thinking go together like peas and carrots. If you're looking for the abbreviated version, here's a quick look at some of the key takeaways:

  • While Design Thinking is a buzzword, most learning professionals are already incorporating elements of it in their day-to-day processes and design sessions. The approach isn’t necessarily new, but it’s focused on three main things: knowing your learner, knowing your problem, and taking action. Then, see if that action created positive business results. Remember, what matters is that any learning experience, training program, or development initiative is based on outcomes so that we can move the needle. Moving the needle can be operational, such as better business results, or even internal, such as improving morale, engagement, or retention.
  • The food/recipe analogy – When a dish is cooked or baked just right with the perfect blend of ingredients, you have a fantastic meal that you want to make again and again. When it comes to crafting training and learning experiences with a Design Thinking mindset, our role is to create a recipe that people LOVE to experience. They love it enough to come back for more! And, they possibly contribute—just as someone who takes a recipe and tweaks it to their liking by adding extra salt, spices, or sugar. In the same way, we want learners to add to and enhance the learning experience so it continues to evolve, improve, and make a difference.
  • When you decide which group to bring together for a Design Thinking exercise, invite diverse perspectives to the table—different backgrounds, disciplines, and demographics. This helps avoid unconscious bias. So, don’t just have decision makers in the room. Have a junior person. A high performer. An average performer from another division. A senior leader. Use Design Thinking sessions as a development experience for your workforce!
  • While the “steps” in Design Thinking can vary from team to team, organization to organization, the mindset is the same—understand the people and problem, and take action.
    • Empathize
    • Define
    • Ideate
    • Prototype
    • Test/Implement

Solving a problem with a Design Thinking approach takes the pressure off. This is not about getting it right before you can move forward. This is about taking action in order to learn. You’re constantly tweaking and sense-checking along the way, gaining insights you might otherwise discover too late if you waited until things were “perfect.”

After the presentation, several great questions came up from the audience and I wanted to share them with you. Below are those questions and my best answers. This is an ongoing conversation, and I encourage you to keep the questions coming in via the comments section at the bottom of this page.

Q: What if you were going through a Design Thinking process for multiple roles/learners?

A: Create separate learner personas rather than trying to generalize your audience. And, personas aren’t always roles or functions (although that could be a factor). They might not map to different levels in your organization (associate, manager, etc.); instead, they could reflect how each learner “shows up” at work. For instance, one segment of employees may be specialists in a particular area but not necessarily motivated to understand how other areas of the business can improve their customer’s expertise, whereas another segment with a similar role or level may have more general expertise and need to hone technical skills. Components of a learning experience might be the same for each persona, but it’s important that each group’s experience is differentiated and personalized.

Q: How long does a Design Thinking process take? Are you in Beta forever?

A: It depends. Just as creating a one-hour classroom training might take a week, building a one-week immersive, simulated experience will take much longer. And, many will debate how much research is needed during the Empathize phase. Do you need a complete role excellence profile, which could take months? A week of observations? Or do you craft a simple persona with built-in assumptions during a one-day workshop? Feasibly, if you take one problem and have identified a target learner population, you could run a Design Thinking workshop with 8–12 people in two days and cover Empathize, Define, and Ideate. Many of our teammates look to create functional prototypes that can be tested in 2–4 weeks (or even less). When you force an early prototype and accept something as simple as a script or story, you reduce investment bias: the temptation to force a solution rather than fail fast.

Beta forever happens, but it’s not a fate you have to accept. What is good enough for the audience? I argue that content has such a short shelf life that even a “final product” requires tweaking and updating. We recently prototyped a coaching app with nothing more than a diagram. We asked users to close their eyes and pretend the visual was appealing and everything functioned, then asked them targeted questions: Would you use this feature? Would you pay for it? Would you use it more for productivity or more for gathering information? That feedback helped us tweak the design and get to the next level: a series of screenshots, where we asked the same questions plus a few more. So, prototyping can emulate the phases of a “waterfall” design (Alpha, Beta, and Gold stages), but the way you test and review is different. If you keep a clear purpose for each prototype, you’ll make progress.

Q: How do you reconcile the time demands of the iterative process with budget restrictions or client-based time restrictions?

A: Ironically, I find getting client feedback is the easier part. We have stopped holding long in-person or virtual reviews (where most people tune out and multitask); instead, we show a segment of an experience and ask key questions via a survey or Google Form. By chunking it down, we can gather feedback more passively, which many of our clients prefer since there are such great demands on their time.

As for budget, we approach it differently depending on the client. For some, we scope only the design session; then, based on the ideas that come out of it, we scope the prototyping or build-out separately. For others, we set up a retainer for a period of time that covers dedicated time from the mix of roles we anticipate needing, such as visual/UX developers, graphic recorders, consultants, and writers.

Q: Any ideas for when the response to “Let us research, investigate, etc.” is NO?

A: I get it. Clients may not have time or money to spend on analysis. Try to go about it a different way. Often, I find the resistance is around what people assume is “analysis”: weeks of interviews with high-cost resources, expensive travel, and lots of time out of the workflow. Position it differently. Interview at dinner. Observe during a meeting. Send out a survey and follow up with a short Q&A. Ask to just do a ride-along for one day. Ask the learner to put themselves on camera for the day and send it to you. Go find job descriptions. Read Glassdoor reviews. Google. Tinker around your internal network and ask questions. Be scrappy.

Q: As a follow-up to the resistance question above: could you use a SWOT analysis as a way to convince the client when the answer is “no”?

A: You could—but don’t make too many assumptions. Perhaps use the SWOT as a way to draft learner personas, and then ask learners if that resonates.

Q: What if the learners are not hungry? Is there a secret recipe?

A: Ditch your prototype and try another idea. The secret recipe: don’t hold on to something that isn’t working.

Q: Do you have a concise definition of Design Thinking or a resource you can point to that provides more detailed information?

A: I really thought about this one, did some research, and cobbled together a good pitch: Design Thinking is an action-oriented approach AND mindset that rallies around a problem and those experiencing it. It’s focused on knowing your learner, defining the problem, and testing solutions until the problem is solved.

Q: Can you provide an example of what a prototype or product would be for testing (such as a document, a short eLearning lesson, a lesson plan, or all three)?

A: My favorite “first” prototype is a written story. We give a fictional learner a name and walk them through a narrative of the experience from start to finish. Through that, we identify all the components of the learning experience, whether it’s meeting a mentor, watching a series of videos, taking part in a MOOC, or completing a work project as a capstone. I make it fun with icons.

Q: How does Design Thinking change Performance Consulting? What considerations should be made?

A: Performance Consulting is a capability, whereas Design Thinking is a way to solve problems with a human-centered mindset. Great performance consultants analyze the work, worker, and workplace through the lens of outcomes that produce results. I would push performance consultants to go further as they empathize with their audience: help define the problem to solve, and then urge them to act on it rather than just reporting out.

Q: How long is “good research” good for in Design Thinking? We have market research that spans the last five years, yet a lot of new people are pushing not to rely on it but to spend the time and money to do the research again.

A: I’m assuming you mean research on your learners/end users. I think the point of Design Thinking is to get to know your learners and care about their lives. Market research can help give context, even if it’s slightly dated, but it won’t tell you your learners’ stories. Don’t just use research and data analytics—be sure to observe and engage!

Q: If you consider the ripple effect, would you “cater” to the learner’s leader, the leader’s leader, or the organization?

A: Cater to the learner. The ripple effect exercise (which I LOVE) helps you identify what impact solving the problem for the learner would have on others. If you find that impact creates a problem for someone else in some way, then reframe the exercise with that individual in the center. As for the organization, ideally the impact would be a positive business gain. But a positive business gain in one area “could” have a detrimental impact on another. For instance, if your learners are call center agents and you are solving for improving the customer experience, then the length of the phone call with a customer may increase because the agent will be building rapport and creating a value proposition with the customer. However, longer phone calls can lead to longer hold times for other customers, which could lead to increasing staff, which could lead to increasing costs. But would those costs be offset by the additional sales or the increase in customer loyalty? Map it out!
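If it helps to make that mapping concrete, here is a purely hypothetical back-of-the-envelope sketch (written in Python only for convenience; every number is invented for illustration, not taken from any real engagement) of how you might weigh the extra handle-time cost against the offsetting gains:

    # Hypothetical ripple-effect math for the call center example above.
    # All figures are invented placeholders; swap in your own data.
    calls_per_agent_per_day = 40
    extra_minutes_per_call = 2.0        # assumed increase from rapport-building
    cost_per_agent_minute = 0.75        # assumed fully loaded labor cost, in dollars

    extra_daily_cost = calls_per_agent_per_day * extra_minutes_per_call * cost_per_agent_minute

    extra_daily_gain = 70.0             # assumed lift from added sales and loyalty, in dollars

    print(f"Extra daily cost per agent: ${extra_daily_cost:.2f}")
    print(f"Extra daily gain per agent: ${extra_daily_gain:.2f}")
    print(f"Net daily impact per agent: ${extra_daily_gain - extra_daily_cost:.2f}")

If the net turns negative once you account for hold times and added staff, that is your cue to reframe the ripple effect exercise around whoever absorbs that cost.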

About the Author

Britney Cole
Britney is a learning leader with experience in organization development, human performance, and corporate learning, and she has worked remotely, managing virtual teams, for more than a decade. Britney lives in Minnesota with her husband and three small children (ages 5, 7, and 8), where she keeps warm with plenty of blankets and cozy hats. She likes to talk, so you might see her at learning conferences as a speaker. Britney has provided consulting for clients in the financial services, pharmaceutical, steel, chemical, media, technology, retail, manufacturing, and aerospace industries. She forms lasting partnerships with her clients, helping them with learning design and architecture, content development, leadership and professional development, performance consulting, technology implementation, and change management. Most recently, she has been helping pioneer new experiential learning methods and define a learning 3.0 taxonomy.
