You create a course, test it on a few of your mastermind friends, make some changes, and then put it on the market. Of those who actually buy it, some folks thank you for the course, some say it’s great, but most don’t say a thing.
If you have a course platform that can report on your students’ progress, you might even see that only 20 or 30% have bothered to get through it.
What’s in the way? Is there anything you can do to help your students to complete your course and get the most out of it?
In this post, I’ll cover 3 techniques you can choose from to get useful, honest feedback on your course. I’ll start with the easiest technique, move to one that requires more effort and preparation on your part (but is still within your reach), and finish with an approach you can actually outsource. Yay!
Let’s start with the easiest:
1) Reaction Evaluations
Have you ever been to a workshop where you’ve been given a one-sheet evaluation at the end? They’re basically assessing your reaction to the training. Was it a high-quality training? Was it relevant?
The good news is that these kinds of evals are easy to create and analyze.
Heads-up though – they’ll only give you so much information, and only a small portion of your students will likely bother to complete them unless you offer an incentive (more on that later).
Still, they give all your students an opportunity to give you feedback.
How many do you need? It depends on how long the course is. If you have a course with 6 modules and each module takes the learner 1 hour, it makes sense to create one eval per module with perhaps only 2 or 3 questions. But if your entire course is only a 1 or 2-hour endeavor, create one eval for the whole thing. You can include more questions in that case.
Here are some tips:
- Get beyond “how’d I do?”
I’ve seen a lot of reaction evaluations that are all about the instructor. Instructor-centered design is “so yesterday.” Try focusing the questions on the learner, not the instructor, to get to what needs to be changed to make the learning better.
- Probe for the perceived impact
There’s a bunch of research showing that learners’ ratings of a course’s relevance correlate more strongly with actual learning than their ratings of how much they learned. The takeaway for you: skip questions about how much the learners thought they learned. Instead, ask how relevant the learners thought the course was, using a scale from “not at all relevant to me and my [job/course/or other]” to “very relevant.”
- Get specific
Think about the particular questions you have for each module. Are you concerned that there was too much material in Module 3? Ask. Wondering if Module 1 was too simplistic? Ask.
How might you do this? My favorite way is to simply create a Google Form. It’s free, quick, and easy. (Just Google it to find out more.)
Once you’ve created your evaluation you’ll want to put some incentives in place for people to actually take the time to fill it out.
Sure, there are always bonuses and bennies to hand out. Regardless of whether you sweeten the pot in some way, do insert a reminder about why they should bother.
Evaluation guru Jim Kirkpatrick suggests that you take some time to tell the learners why the questions were chosen and what you’ve done with past feedback to improve your workshop. Encourage them to be totally honest with you so that you can provide future learners with a far better workshop.
In some cases, altruistic interest in helping others may be the nudge someone needs.
Recognizing that reaction evals are a limited way to get good feedback, you might want to step it up a notch:
2) Mini Focus Groups
If you have the time – and can gather some $$ or other incentives – see if you can entice some of your students to meet together online for a short discussion. I recommend doing this in small groups – 2-4 people at a time – because often what one person mentions sparks input from the others. Small groups also give you an opportunity to hear contradictory feedback that you can later weigh against what you heard in your reaction evals.
When I’ve done this in the past, I’ve had great luck going through each module of the course and asking what worked and what didn’t work in each. Usually, this type of quick-and-dirty evaluation yields many times the insight of my reaction evals.
Lastly, let’s look at the most comprehensive way to get feedback:
3) Get an audit
Okay, so I’m a *bit* biased, but I can say with certainty that the best way to get great feedback on your course is to have an instructional designer give you a course audit.
In some cases you might want an overall audit on the strengths and weaknesses of the course. In other cases you might want a targeted audit exploring a particular issue or challenge you’re having based on initial feedback.
How does that play out?
One course creator I worked with was getting feedback that her learners weren’t clear on how to translate her course content into the changes and actions they should be making in their business. Her audit focused solely on that question and consisted of my recommendations for how she might increase clarity, and in which parts of her course.
Another just gave me access to her course and said, “Tell me what I should be doing differently.” I focused on my top 3 recommendations and gave her the steps to make those improvements herself.
Make sure that you not only understand the issues but also understand what to do about them!
There you have it! 3 techniques to get the feedback you need to improve your course.
But don’t stop there! Keep in mind that what you’re hearing might help you cook up powerful copy for your sales sequence and be the start of some great testimonials. (Repurpose, repurpose, repurpose!)
Check out the SELL IT tag for posts about this.
As usual, if you need any help with these or want to contact me about doing a course audit for you, let me know! (Here’s my contact form.)