A lot of you write test questions for online training (or even for paper-based training).
Maybe you’re doing it with an eLearning authoring tool, such as the ones from Articulate, Adobe, or Lectora. Or maybe you’re doing it with a quiz-making tool built into your learning management system (LMS). Or maybe with pencil and paper. Probably not with chisel and cuneiform, though 🙂
However you’re doing it, you may sometimes find yourself wondering about the best practices for writing standard question types. (By the way, instructional designers often use the wonky phrase “assessment items” for this kind of thing–an assessment “item” is a question).
In this article, we’re going to give you tips about something related to test creation that learning experts call fidelity (no, not THAT fidelity–this is not a juicy blog post). In training talk, fidelity is the extent to which your test or test question mirrors the real task your workers will have to perform on the job.
In describing fidelity and test questions, we’ll cover a few other best practices, too. Hope this helps you with your question writin’.
Introduction to Writing Better Test Questions
Before we get down to fidelity, let’s take a moment to consider some basics.
Create Learning Objectives First
All training activities should include a set of learning objectives. The learning objectives are the reason that you created the training–they are what you want employees to be able to do after the training is over.
If this idea of learning objectives is new to you, take a moment and check out our Guide to Writing Learning Objectives and then come back and finish up here.
Create Tests/Test Questions (also called “items”)
(Welcome back if you’ve just read the guide to learning objectives).
Once you’ve created your learning objectives, you can either:
- Create your training materials and then create your assessments to evaluate how well employees learned, or
- Create your assessments and then create your training materials
I know that second option–creating the assessments before you create the training materials–may sound counter-intuitive if you’ve never heard it before. To be honest, I don’t think it’s necessary to do it that way, which is why I listed both options. But a lot of instructional designers and training experts swear by writing the assessments before writing the training materials. And they’ve got some good points. So it’s worth at least considering and maybe even giving it a try. Why not, right?
Here are those two options again.
Training & Test Creation Option 1 (create materials before assessments):
Training & Test Creation Option 2 (create assessments before materials):
Matching Your Learning Objectives And Your Tests (Learning Experts Call this “Fidelity”)
Whether you write your assessments before your training materials or do it the other way, the first step of writing an assessment item is to go back and look at the learning objective(s) for the training.
Ask yourself: what does the worker have to do to satisfy the learning objective? Once you know that, you can create a proper assessment. The assessment, obviously, must determine if the employee can satisfy the learning objective.
That’s because, when you’re all done, there must be a direct relationship between the learning objective, the training material, and the assessments, as shown below. That’s what we mean by “fidelity.”
Let’s consider this example: the learning objective says a worker has to “perform a machine changeover.” You think about it, and you decide to create a question that presents the worker with each step of the process and asks the worker to “drag” the steps into the correct order, from first to last.
Now, here’s a question for you: is this a proper assessment of the learning objective? If the worker does drag the various steps into the correct order, does that mean the worker can perform a machine changeover?
We’ll give you a little time to think about that….
OK, are you back? What’s your answer? Did you say “this isn’t a proper assessment” and “putting these items in the correct order doesn’t mean the employee can perform a machine changeover”?
If you DID say just that, you’re right. If you didn’t say that, let’s explain why that answer is correct and what you may have overlooked. The learning objective states that the worker has to be able to perform a machine changeover. That means the worker has to actually DO IT in the real world–perform the machine changeover procedure.
Getting the steps in order is a nice start, but that doesn’t mean the worker knows how to perform each of those steps. And so that wouldn’t be a proper test for that learning objective.
Learning experts would tell you this question has a fidelity problem. When learning experts talk about fidelity in the context of assessments, they’re talking about how well the assessment matches the objective. A high-fidelity assessment closely matches what the objective requires the worker to do; a low-fidelity assessment doesn’t.
So, that’s your takeaway here. Before you create an assessment, take a look back at the learning objective, ask yourself what the worker really has to do in order to satisfy the objective, and then figure out what kind of assessment would evaluate if the worker can do it or not.
In the example we just gave, you’d probably be better off creating an assessment in which the employee physically demonstrates how to perform a machine changeover. That assessment would have high–even perfect–fidelity. Or you could create an interactive, simulation-based assessment that would not take place in the real world but would still have high, if not perfect, fidelity.
All of which means that when you’re writing questions for online tests, you’ll have to ask yourself if you CAN accurately assess the worker’s ability to satisfy the learning objective using the kind of questions you can make for a standard online test (things like true/false, multiple choice, matching, drag and drop, sequencing, etc.).
If so, great, go for it.
If not, consider making a more sophisticated scenario-based eLearning test or using a performance assessment that occurs in the “real world.”
If you go with the real-world performance assessment option above, you can even create digital versions to help with administering these (see below).
There are now many mobile tools that help you administer field-based performance assessments for workforce training evaluation.
You can read our related article Testing Employees After Training: Best Practices for Workforce Training Evaluation for more information about creating test questions and “real world” performance assessments.
Conclusion: Workplace Testing and Fidelity
So the takeaway here is to make your test match the real on-the-job performance as closely as possible. If it’s realistic to get a perfect match, great. If not, consider near-perfect matches with scenarios.
For more on workforce training and testing, consider some of these articles:
- Workforce Training & Testing Best Practices
- Writing multiple-choice questions
- Writing true/false, matching, dragging, and other types of questions
- Using scenario-based learning and assessments
- Testing and fidelity
- Training, testing, validity, and reliability
- The “testing effect” and the forgetting curve
Let us know your thoughts on workplace testing in the Comments below.