
July 2003, Volume 53, Issue 7

Original Article

Towards Improved Assessment - 2: General Principles for Writing Test Items

Z. Naqvi  ( Department for Educational Development, The Aga Khan University, Karachi. )
R. Ahmed  ( Department for Educational Development, The Aga Khan University, Karachi. )

Assessment is a critical component of instruction and, used judiciously, can assist in achieving curricular goals. The content of assessments also communicates faculty expectations to students. Furthermore, well-designed assessment tools help fill instructional gaps and motivate students to read broadly. Given these powerful effects, it is very important that testing tools be carefully chosen and formulated to provide constructive feedback to students and teachers about students' competence and deficiencies.
Although several assessment tools are currently being used validly and reliably to assess the progress and competence of medical students, the major issue is to use testing time most efficiently for valid and reliable results. However, writing high-quality test questions that are relevant and valid for medical education is not an easy task.1


While the different types of questions (best choice, short and long essays, true-false, and matching) are constructed differently, the following principles apply to constructing questions and tests in general.
1. Make the instructions for each type of question simple and brief.
2. Use simple and clear language in the questions. If the language is difficult, students who understand the material but who do not have strong language skills may find it difficult to demonstrate their knowledge. If the language is ambiguous, even a student with strong language skills may answer incorrectly if his or her interpretation of the question differs from the instructor's intended meaning.
3. Write items that require the specific understanding or ability developed in that course, not just general intelligence or test-wiseness. The literature shows that creating questions to test complex thinking demands a degree of professional commitment to test design by teachers. Lack of this skill leads to an over-emphasis on questions that test memory and lower-level skills.3 Poorly designed testing tools thus encourage students to focus on easily tested topics at the expense of those requiring higher-order thinking, whose assessment depends on appropriate teacher skills.4
4. Do not suggest the answer to one question in the body of another question. This makes the test less useful, as the test-wise student will have an advantage over the student who has an equal grasp of the material, but who has less skill at taking tests.

Table. Checklist 1.

Why am I asking this question? Because: (major objectives)
(a) It tests an essential fact that all students should know before they can proceed to the next stage or become safe medical practitioners.
(b) It tests an important concept for understanding a common clinical/health problem.
(c) They will be facing this problem every week of their practice.
(d) They should know the contending illnesses, which mimic common problems.
(e) It is an unusual presentation of a common health problem.
(f) It is a preventable cause of a common problem.
(g) It deals with common emergency situations.
(h) It deals with rare emergency situations.

(minor objectives)

(a) They have been taught this in lectures/tutorials.
(b) They have studied it for the foreign examinations.
(c) It is my area of special interest.
(d) They should have read the whole book.
(e) It is a breakthrough piece of new knowledge.
(f) It is a current hot topic in medicine.

5. Your colleagues should be able to provide the correct response to the questions. If the correct answer can be given only by the question writer, and other teachers of the same level cannot achieve the passing marks, it indicates either that the information being asked for will never be used by the learner, or that the question is framed ambiguously or is too difficult.
It is universally known that the nature of assessment tasks influences students' approaches to learning and defines the de facto syllabus,5-7 and hence the subsequent grades.8 Some of the most profoundly depressing research on learning in higher education has demonstrated that successful performance in examinations does not even indicate that students have a good grasp of the subject the examinations are believed to be testing: students who perform well in university examinations can retain fundamental misconceptions about key issues in subjects they have passed.9

Relationship to Course Objectives

For content validity of examinations, questions should be developed in accordance with the approved Table of Specifications (blueprint) for the course and certifying examinations.1-4

Objectives of the Questions

Clear, pre-defined objectives for each question greatly help in framing meaningful items. Before setting out to write an item or a question, the examiner should have a clear idea of what piece(s) of information he or she wants the examinee to provide. After having established the broad content of the question, the examiner should review Checklist 1 (Table).
While answering this 'Why' question, it is important to remember that assessment is defined as the measure of the extent to which the objectives of a given course/period have been achieved.
To summarise, assessment should aim at gauging the extent of student learning, with special emphasis on the major objectives of the course. Meaningful assessment items/tools indicate deficiencies in student learning and in the programme in terms of essential objectives.


References

1. Schultheis NM. Writing cognitive educational objectives and multiple-choice test questions. American Journal of Health-System Pharmacy. 1998;55:2397-401.
2. Childs RA. Constructing classroom achievement tests. ERIC Digest. ERIC Clearinghouse on Tests, Measurement, and Evaluation. Washington, DC: American Institutes for Research, 1998.
3. Black PJ. University examinations. Physics Education, 1969;3: 93-9.
4. Dahlgren LO. Outcomes of learning. In: Marton F, Hounsell D, Entwistle N. (Eds). The experience of learning. Edinburgh: Scottish Academic Press, 1984.
5. Elton L, Laurillard DM. Trends in research in student learning, Studies in Higher Education, 1979;4:87-102.
6. Miller CML, Partlett M. Up to the mark: a study of the examination game. Guildford: SRHE and Open University Press, 1974.
7. Ramsden P. Studying, learning, improving teaching. In: Ramsden P. (Ed). Improving learning: new perspectives. London: Kogan Page, 1988.
8. Naqvi Z, Ahmed R. Learning approaches and academic performance of undergraduate medical students in Pakistan. 2000;50:20-4.
9. Ropka ME, Norback J, Rosenfeld M, et al. Evolving a blueprint for certification: the responsibilities and knowledge comprising American professional oncology nursing practice. Oncology Nursing Forum. 1992;19:745-59.
10. Paul RW. Bloom's taxonomy and critical thinking instruction. Educational Leadership. 1985;42:36-9.
