Center for the Enhancement of Learning and Teaching

Evaluating Your Assessment Instruments

In evaluating a student assessment, whether a paper-and-pencil test, a creative production, or a written assignment, the most important consideration is the match between the objectives of the unit/course/lesson being assessed, the teaching/learning activities used, and the assessment tool itself.

In assessing the appropriateness and effectiveness of an assessment tool, consider the following:

  • What are the objectives of the course/unit/lesson that are being assessed?
  • What domain is being assessed: cognitive, affective, psychomotor? Is the domain appropriate given the objectives for the course/unit/lesson?
  • If the domain is cognitive, consider what level of Bloom's taxonomy is being assessed: knowledge, comprehension, application, analysis, synthesis, and/or evaluation. Is the level appropriate given the objectives for the course/unit/lesson?
  • Is the assessment at a level appropriate to the level of the course (freshman, graduate, etc.)?
  • How well does the content of the assessment match the objectives being assessed?
  • How well does the content of the assessment match the learning opportunities presented in the unit/lesson/course (i.e., does the assessment assess what was taught)?
  • How clear are the directions for the assessment (i.e., what response is required of students, the length and form of that response, and the time allowed for completing it)?
  • Is the assessment organized in such a way as to aid clarity and understanding of its requirements?

Paper-and-pencil tests merit some further considerations (see Gage and Berliner, 1998, for more information).

Essay questions:

  1. Are verbs chosen carefully and precisely to clearly indicate what students should do? (For example, explain, list, identify, construct, compare.)
  2. Does the instructor have a model answer or a list of specific points to help make grading more consistent?

Multiple-choice questions:

  1. Is the stem a meaningful part of the question?
  2. Are distractors plausible?
  3. Are all choices of roughly equal length and precision?
  4. Are all choices grammatically consistent with the stem?
  5. Does answering correctly depend on content knowledge rather than on reading ability?
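Some of the checks above can be partially automated. The sketch below is a hypothetical helper, not part of the original article: it flags a multiple-choice item with an empty stem, too few distractors, or choices of very unequal length (which can cue test-wise students to the answer). The function name and the length-ratio threshold are illustrative assumptions.

```python
# Hypothetical audit of a multiple-choice item against the checklist above.
# The threshold (max_length_ratio) is an illustrative assumption.

def audit_item(stem, choices, max_length_ratio=2.0):
    """Return a list of warnings for a multiple-choice item."""
    warnings = []
    if not stem.strip():
        # Check 1: the stem should be a meaningful part of the question.
        warnings.append("Stem is empty; it should carry the question's meaning.")
    if len(choices) < 3:
        # Check 2: enough options are needed for plausible distractors.
        warnings.append("Too few choices to include plausible distractors.")
    lengths = [len(c) for c in choices if c]
    if lengths and max(lengths) > max_length_ratio * min(lengths):
        # Check 3: choices should be of roughly equal length.
        warnings.append("Choice lengths vary widely; the longest may cue the answer.")
    return warnings

# Example: the correct answer is conspicuously longer than the distractors.
for w in audit_item(
    "Photosynthesis primarily occurs in the",
    ["roots", "chloroplasts of leaf cells, where light is absorbed", "stem"],
):
    print(w)
```

Checks such as grammatical consistency with the stem (item 4) still require human review; this kind of script only catches the mechanical problems.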


References

Assessment Item Creation and Review. Retrieved from

Cashin (1987). Improving Essay Tests. Retrieved from

Chism, N. (1999). Peer Review of Teaching: A Sourcebook. Bolton, MA: Anker. Available in the CELT library.

Clegg & Cashin (1986). Improving Multiple Choice Tests. Retrieved from

Coombe, C. & Hubley, N. Creating Effective Classroom Tests.

Gage, N. L. & Berliner, D. C. (1998). Educational Psychology (6th ed.). Boston: Houghton Mifflin.

Hanna & Cashin (1987). Matching Instructional Objectives, Subject Matter, Tests, and Score Interpretations. Retrieved from
