Matching Items

What is a matching item?

Matching items are presented in groups: a series of stems or prompts that the student must match to one of a shared set of possible answer options. The format is particularly useful when the objective to be measured involves association skills or the ability to recognize, categorize, and organize information. Matching items can be written to measure high levels of understanding, but they are most typically used at the knowledge level and with younger students.

Matching Items

Match each work with its author.
Answer options may be used more than once or not at all.

                                        (Answer Options)

_____ 1. The Great Gatsby               A. Updike
_____ 2. The Grapes of Wrath            B. Salinger
_____ 3. The Sound and the Fury         C. Faulkner
_____ 4. Of Mice and Men                D. Fitzgerald
                                        E. Hemingway
                                        F. Steinbeck


Designing matching items 

As with multiple-choice items, only a small amount of empirical research exists on the characteristics of matching items and how they affect validity or reliability. In addition to the research findings, there is a common set of recommendations found in classroom assessment textbooks. A few of the critical guidelines drawn from both of these sources (Frey, Petersen, Edwards, Pedrotti, & Peyton, 2003; Haladyna & Downing, 1989a, 1989b; Haladyna, Downing, & Rodriguez, 2002) are presented below.

Guideline 1.

There should be more answer options than stems.
As with many item-writing rules, the idea is to provide as many plausible answer options as possible, so that students must actually have the knowledge to answer correctly rather than succeed through elimination.

Guideline 2.

Answer options should be usable more than once.
As with Guideline 1, this increases the number of functional distractors, which improves the validity of the items. With this guideline, it is important that the instructions for the matching section indicate that answer options may be used more than once or not at all, so all students are aware of the rule.

Guideline 3.

Directions should state the basis for matching.
A brief instruction identifying the category of stems and answer options (e.g., leaders and nations, species and phyla) helps students focus on what constitutes a match, so they can concentrate on choosing the correct answer.

Guideline 4.

Number of answer options should be < 7 for elementary age.
It is believed that younger students have a difficult time sorting through more than just a few answer options. While some students may be able to handle many answer options, others cannot, and the ability to process and evaluate many possibilities is likely not what the assessment is meant to measure.

Guideline 5.

Number of answer options should be < 17 for secondary age.
It is believed that older students can handle longer matching sections with many answer options, but too many options can slow down even the quickest of test-takers (especially when Guidelines 1 and 2 are followed). A well-made classroom assessment should not be exhausting for students.

Guideline 6.

Matching stems should be on the left and answer options on the right.
Students are used to reading from left to right, and the process of matching two concepts is similar to the construction and comprehension processes that occur when reading sentences.

How can the use of quality matching items benefit your students, including those with special needs? 

Like multiple-choice items, a section of matching items on a test can cover a large amount of material in a relatively brief period of time. In fact, matching items are even more efficient than multiple-choice items because each stem acts as a separate multiple-choice item with all the answer options as possible answers. Functionally, a matching section containing ten stems operates as ten multiple-choice items. When well-written, all the wrong answer options act as distractors. Guessing is difficult, perhaps the most difficult of any objective test format. Because matching items allow for many items in a short space and make guessing difficult, the validity and reliability of classroom tests are improved, and that helps all students to be assessed fairly and accurately.
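The claim that guessing is difficult can be made concrete with a little probability. The sketch below compares blind guessing on the four-stem, six-option example above against four standalone four-option multiple-choice items; the function name and the no-reuse variant are illustrative additions, not part of the authors' text.

```python
from fractions import Fraction
from math import perm

def p_guess_all(num_stems, num_options, reuse_allowed=True):
    """Chance of getting every stem right by blind guessing.

    When options may be reused, each stem is an independent 1-in-n guess.
    Without reuse, the student picks one of perm(n, k) ordered assignments.
    """
    if reuse_allowed:
        return Fraction(1, num_options) ** num_stems
    return Fraction(1, perm(num_options, num_stems))

# Four stems with six reusable options (as in the example above):
matching = p_guess_all(4, 6)       # 1/1296
# Four standalone four-option multiple-choice items:
four_mc = Fraction(1, 4) ** 4      # 1/256
print(matching, four_mc)
```

Even this small matching section is roughly five times harder to guess through than the equivalent multiple-choice items, and the gap widens with every extra option, which is why Guidelines 1 and 2 pay off.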


Research Articles

Frey, B. B., Petersen, S. E., Edwards, L. M., Pedrotti, J. T., & Peyton, V. (2003, April). Toward a consensus list of item-writing rules. Paper presented at the annual meeting of the American Educational Research Association.
Haladyna, T. M., & Downing, S. M. (1989a). A taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 37-50.
Haladyna, T. M., & Downing, S. M. (1989b). Validity of a taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 51-78.
Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-334.