PARCC defines technology-enhanced items (TEIs) as “tasks administered on a computer and [that] take advantage of the computer-based environment to present situations and capture responses in ways that are not possible on a paper-based test” (PARCC, 2016). Although many item-development agencies include multiple choice and constructed response items when listing possible types of TEIs, these technically are not TEIs under this definition, since they can be used on paper-based tests.
Next-generation assessments do, however, include these item types, and there are still situations that warrant them. Here are six situations where it may be more appropriate to use traditional assessment items rather than TEIs:
1) Recall of Facts
Using a TEI for low-rigor assessment items, such as fact recall, may only increase the time a student spends on an item without increasing the information gleaned about their proficiency at a particular standard. A fact is usually judged as right or wrong. For example, there is no variation in the answer to “During which president’s administration was the Louisiana Purchase made?” Since “Thomas Jefferson” is the only correct response, a multiple choice item is a sufficient means of assessing this knowledge quickly and scoring it appropriately.
2) Subjective Test Items
Even with an expanded range of assessment options, most TEIs do not have the capability of addressing subjective concepts without including some predetermined options as part of the response. Because of this, constructed response items may be more appropriate when asking students to create their own argument. This sort of item can better mimic communication in the real world since it requires students to apply knowledge and use thinking strategies to analyze, synthesize, and evaluate information. Other constructed response question stems may involve verbs such as “discuss,” “explain,” or “compare and contrast.”
3) Assessing Procedural Concepts
If you simply want to assess whether or not a student can obtain the correct answer without assessing the process, a TEI is not necessary. A multiple choice item is a more efficient means of obtaining this information because, as with “recall of facts” items, the answer is valued rather than the procedure. This is especially applicable to items that assess low-rigor math standards (for example, items that merely ask students to multiply two numbers or solve an equation).
4) The Writing Process
It is difficult to evaluate a student’s writing skills using any selected response item type, be it traditional or technology enhanced. The simplest way for an item to assess students’ composition skills, language use, or argument organization is to require them to construct their own written response. Just remember that constructed response items need a clear rubric to ensure objective grading.
5) Multiple Choice Items with High-Quality Distractors
If distractors for multiple choice items are purposefully designed to target key weaknesses, TEIs may not be necessary to get clear insight into student misconceptions. Multiple choice items can have high diagnostic power as long as students are not guessing. Items with stems that ask students to “choose the best option,” or some similar variation, can reveal student misconceptions through the incorrect answer choices students select, provided the distractors are properly written.
6) Items Testing Broad Knowledge of Curriculum or Learning Objectives
Since multiple choice items are generally quicker for students to complete than certain types of TEIs, assessments composed mostly of multiple choice items can include more questions and assess a broader range of standards. This can be especially useful for pre-assessments that revolve around fact recall and other low-level skills. However, distractors should be well-written, as previously mentioned, for the data to be useful to stakeholders.
Balancing well-designed multiple choice and constructed response items with a variety of TEIs in assessments can help evaluate students’ skills, knowledge, and comprehension at all levels.
PARCC. (2016). Partnership for Assessment of Readiness for College and Careers glossary of terms. Retrieved July 13, 2016.