Fairness, validity, and reliability are three critical elements to consider in assessment. The Education Evaluation IPA Cohort of 2013 compiled this chart of definitions and examples.
Fairness
- Assessment should not discriminate (age, race, religion, special accommodations, nationality, language, gender, etc.)
- Consider subjective vs. objective formats
- Have you prepared students sufficiently to demonstrate the skill/knowledge?
- Example: Blackboard assessment (objective, more black and white, right/wrong)
- Non-Example:
Validity
- Does it matter? Is it relevant?
- The degree to which the assessment measures what it claims to measure
- Does it have value or worth?
- How well does it correspond to the real world?
- Example: Student-generated questions used as a review; rubrics shared with students ahead of time (content, critical thinking, communication skills, accountability)
- Non-Example: An opinion poll may lean away from validity
Reliability
- Repeatable: if you give the same assessment to a different group of people, will you achieve the same result?
- Transcends class and instructor
- How does mode impact the assessment?
- Example: Department-wide final exam (same course), using a pool of questions with a similar level of difficulty and distribution of question topics
- Non-Example: Not controlling for question difficulty or for an even distribution of question topics