Designing exams

Designing better exams means applying the same assessment principles as you would in designing any other assessment task.

Check out the Exams quick tips. These will assist academics in writing/preparing exams and preparing for the exam period(s).

When designing an exam, ask yourself the following questions.

Q1

What is the purpose of this exam?

Like other assessment tasks, examinations can be used to measure student learning and the effectiveness of the learning process. So, we must ensure that the skills examined during the examination are constructively aligned with the skills required by the subject learning outcomes. To do this, look at the action verbs in the subject learning outcomes. For example, students may need to be able to explain key principles or make recommendations around a particular problem. These action verbs are key to designing an exam that produces evidence of students demonstrating those skills and abilities.

Another aspect to consider is whether the exam is an assessment of learning, for learning, or as learning. The timing of an exam may dictate its type; for example, a mid-session exam may focus on assessment for learning, while an end-of-session exam is typically an assessment of learning.

The above is about identifying what you want students to learn or demonstrate by the end of the assessment. You can then ‘plan backwards’ to design the exam you need and obtain the evidence you require from it.

Q2

What exam format/method will allow the students to demonstrate the relevant learning outcomes?

What type of response is required?

A fixed response can be explained as an objective response in which students select the correct response to a question or supply a word or short phrase to answer a question or complete a statement. This can be done in multiple-choice, true-false, matching, or fill-in-the-blank questions.


A constructed response is a subjective narrative that challenges students to create an original answer. Examples include short answer, long answer, essay, multimodal responses, reflections, or performance test items.

Open, closed or in between?

Open book exams allow students access to resources including the web as they complete the exam.

Limited resource exams restrict the resources that students can use.

Closed book exams do not allow any resource material.

Have you chosen the best test format for evaluating cognitive ability to meet the learning outcomes?

The verbs in the learning outcome can provide direction towards the choice of question type. Some verbs, such as identify, list and select, clearly indicate that students need to select a response. If the question is written so that a student has to reason in order to select the correct response, actions such as analyse or compare could also be assessed through selected responses. Generally, verbs such as analyse, apply, interpret, compare, infer and predict indicate that a student should construct a response.

For instance, if the subject learning outcome expects the student to be able to synthesise information, then multiple-choice questions (MCQs) would not serve as an appropriate assessment tool. Instead, use a long-answer question that clearly emphasises the student's ability to synthesise information. However, using MCQs does not always mean the questions are meant to evaluate lower-order cognitive skills; MCQ-type questions can be constructed to assess higher-order cognitive skills such as analyse, compare and judge.

To gain a greater understanding of potential student actions for different question types, you could review Bloom's or SOLO Taxonomy.

What exam format should I use?

There are many different exam formats. Some examples are:

  • Automatically marked exams - including multiple choice, fill in the blank and matching question types (a marking sketch follows this list).
  • Short answer exams.
  • Essay exams.
  • Problem or case-based/scenario exams.
  • Oral exams where students may be interviewed and need to justify their answers or demonstrate their clinical reasoning.
  • Practical/Performance exams where students physically demonstrate skills and knowledge in a controlled environment such as a simulation or role-play or practical skills exam.
  • Group exams where students collaborate on a task to create a product or solution to a problem.
  • Computational or calculation exams.
  • Open-book and take-home exams are usually based around higher order thinking skills. They may require synthesising research, forming recommendations, evaluating contexts or justifying decisions.
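
To make the 'automatically marked' idea concrete, here is a minimal sketch of how fixed-response questions can be scored against an answer key. The question identifiers, answer key and mark values are invented for illustration; in practice the LMS test tool does this marking for you.

```python
# Minimal sketch of automatic marking for fixed-response questions.
# The answer key, mark values and student responses are invented for illustration.

ANSWER_KEY = {
    "Q1": {"type": "multiple_choice", "correct": "B", "marks": 1},
    "Q2": {"type": "fill_in_blank", "correct": "constructive alignment", "marks": 2},
    "Q3": {"type": "true_false", "correct": "True", "marks": 1},
}

def mark_exam(responses):
    """Return marks awarded per question plus the total."""
    awarded = {}
    for qid, spec in ANSWER_KEY.items():
        given = str(responses.get(qid, "")).strip()
        # Case-insensitive matching is usually sensible for short text answers.
        is_correct = given.lower() == str(spec["correct"]).lower()
        awarded[qid] = spec["marks"] if is_correct else 0
    awarded["total"] = sum(awarded.values())
    return awarded

if __name__ == "__main__":
    student = {"Q1": "B", "Q2": "Constructive Alignment", "Q3": "False"}
    print(mark_exam(student))  # {'Q1': 1, 'Q2': 2, 'Q3': 0, 'total': 3}
```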

Does the exam cover what is required?

You may want to map the questions against the subject learning outcomes, knowledge level, degree of difficulty, format and so on, to understand how well the outcomes are covered and whether the right level of challenge has been achieved.
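
If it helps, this mapping can be done with a simple script or spreadsheet. The sketch below is a hypothetical exam blueprint: it tallies the marks attached to each subject learning outcome and flags outcomes with no coverage. The outcome codes, Bloom's levels and mark values are assumptions for illustration.

```python
# Hypothetical exam blueprint: each question is tagged with the subject
# learning outcomes (SLOs) it addresses, a Bloom's level and a mark value.
QUESTIONS = [
    {"id": "Q1", "slos": ["SLO1"], "level": "remember", "marks": 5},
    {"id": "Q2", "slos": ["SLO2", "SLO3"], "level": "analyse", "marks": 10},
    {"id": "Q3", "slos": ["SLO2"], "level": "evaluate", "marks": 15},
]
SUBJECT_SLOS = ["SLO1", "SLO2", "SLO3", "SLO4"]

def coverage_report(questions, slos):
    """Summarise how many marks each SLO attracts and flag any gaps."""
    marks_per_slo = {slo: 0 for slo in slos}
    for q in questions:
        for slo in q["slos"]:
            marks_per_slo[slo] += q["marks"]
    gaps = [slo for slo, m in marks_per_slo.items() if m == 0]
    return marks_per_slo, gaps

marks, gaps = coverage_report(QUESTIONS, SUBJECT_SLOS)
print(marks)                       # {'SLO1': 5, 'SLO2': 25, 'SLO3': 10, 'SLO4': 0}
print("Uncovered outcomes:", gaps) # ['SLO4']
```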

Q3

Have you considered...

Use this list to sense check your exam.

  • The time you have to design and build the exam.
  • Length of time for students to complete the exam and how long each question/task may take the student to complete.
  • Length of time you have to mark.
  • Number of students taking the exam.
  • Diversity of students taking the exam.
  • The technology used for exam delivery, and whether students are confident in using it.
  • The spread of marks across the exam.

Reliability, validity and bias

We also talk about exams having the following characteristics:

  • Reliability - is demonstrated when an exam produces outcomes that are consistent over time and reflect a student's actual ability (Banta and Palomba, 2015). Reliable tests are not too long and have clear instructions and marking guidelines.
  • Validity - means that the test has assessed what it aimed to assess (Banta and Palomba, 2015). This means it is aligned to the subject learning outcomes.
  • Free from bias - There are two types of bias:
    • Construct validity bias - Does the exam assess what it is intended to assess (the subject learning outcomes)?
    • Content validity bias - Does the exam favour one group of students over another group?

Q4

How can you improve academic integrity?

Another aspect of designing an exam is considering strategies to improve academic integrity.

Designing for Academic Integrity

Consider the following suggestions:

  • Design questions that are authentic, contextual, reflective and/or open with multiple solutions.
  • Target higher order thinking skills (Bloom's taxonomy), for example through scenarios/case studies. Set questions that require students to make use of their subject material rather than simply locating or rewriting information.
  • Involve multi-step problem solving.
  • Use Socratic questioning.
  • Limit the resource material that can be used, or base questions on provided stimulus/resource materials.
  • Require student responses to be contextualised to their own experience.
  • Assess the process: require students to submit workings, calculations, proofs, research methods or justifications for their answers.

See Designing for academic integrity.

Exam format

Your exam may not have to be in the traditional question/answer format. Alternate formats such as oral, collaborative, simulation/role play, and sequential formats could be considered. The exam format may make use of multimedia elements in questions and answers. Additionally, multi-format exams may be appropriate.

Pre-loading your exam/questions

You may want to see what a Google search or a ChatGPT submission of your exam (or individual questions) returns. Additionally, uploading your exam into Turnitin will assist with the originality score checking that will be conducted during the marking process.
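
As a rough illustration of the ChatGPT check, the sketch below submits each exam question to a model via the official openai Python client and saves the responses for review. The model name, output file name and sample questions are assumptions; pasting questions into the web interface achieves the same end.

```python
# Sketch: submit each exam question to a ChatGPT model and save the answers
# for review. Assumes the `openai` package (v1+) is installed and the
# OPENAI_API_KEY environment variable is set; the model name is an assumption.
from openai import OpenAI

client = OpenAI()

questions = [
    "Explain the key principles of constructive alignment.",     # hypothetical
    "Recommend a sampling strategy for the scenario provided.",  # hypothetical
]

with open("ai_precheck.txt", "w", encoding="utf-8") as out:
    for i, question in enumerate(questions, start=1):
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name; substitute your own
            messages=[{"role": "user", "content": question}],
        )
        answer = response.choices[0].message.content
        out.write(f"Q{i}: {question}\nAI answer:\n{answer}\n\n")
```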

Building student awareness

Actively lift student awareness of the risks associated with cheating, collusion, and the use of generative artificial intelligence. Develop students' media literacy and critical thinking skills, especially around generative artificial intelligence. Promote a sense of student belonging and accountability to each other and to the subject coordinator, and create a culture where cheating is not acceptable. Further information on promoting academic integrity is available via the Academic Integrity webpage.

Generative artificial intelligence: inform yourself

Know the limitations of generative artificial intelligence:

  • answers can sound reasonable on the surface, but the arguments may not make sense or may misrepresent information
  • there can be errors in details and citations, especially when a topic is not frequently covered in online discourse
  • content may contain biases, stereotypes and slanted perspectives
  • content/discourse currency also plays a part, as generative artificial intelligence platforms may be ‘behind the times’.

Assessment Design Principles

Explicit and clear expectations and instructions play a key role in exam design. Communicate with students about what is allowed in terms of instructions and behaviour. The Assessment design principles document contains advice about design principles to inform your exam.

The Learning Environment is important

Providing academic skills support and a scaffolded learning environment for students will reduce the likelihood of breaches in Academic Integrity (Ahsan et al., 2022). Help students feel confident in taking an exam through scaffolding and using a practice exam with feedback.

Potential Breaches in Academic Integrity

Communicate to students about any plagiarism detecting strategies that might be used.

See detection and reporting.

Turnitin and Charles Sturt Academic Integrity Checking Processes.

Assessment Policy: Viva/oral presentations

A subject may require students to be prepared to give an oral presentation or response to the assessor (or other audience) on a particular topic or communicate relevant information from their assessment tasks. The assessment policy provides further information on the oral presentation or response.

LMS test tool strategies

Strategies can include shuffling question answers, randomising question pools, disabling copy/paste, setting time limits and varying questions between exams. Never reuse whole exams from previous sessions; even small changes to case studies or questions can be effective. A sketch of generating randomised variants is shown below.
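
As an illustration of these strategies outside any particular LMS, the sketch below builds per-student exam variants by sampling from a question pool and shuffling both question order and answer options. The pool contents and student numbers are invented for illustration.

```python
import random

# Hypothetical question pool; each entry has one correct answer and several
# distractors. Real LMS test banks provide this functionality for you.
POOL = [
    {"stem": "Which verb signals a constructed response?",
     "options": ["analyse", "list", "select", "identify"], "answer": "analyse"},
    {"stem": "A closed book exam allows...",
     "options": ["no resources", "any resources", "the web only", "notes only"],
     "answer": "no resources"},
    {"stem": "Validity means the exam assesses...",
     "options": ["what it aimed to assess", "speed of writing",
                 "memory only", "handwriting"], "answer": "what it aimed to assess"},
]

def make_variant(pool, n_questions, seed):
    """Build one exam variant: sample questions, shuffle order and options."""
    rng = random.Random(seed)            # seeding keeps each variant reproducible
    questions = rng.sample(pool, n_questions)
    variant = []
    for q in questions:
        options = q["options"][:]
        rng.shuffle(options)
        variant.append({"stem": q["stem"], "options": options, "answer": q["answer"]})
    return variant

# One variant per student, keyed by (hypothetical) student number.
variants = {sid: make_variant(POOL, n_questions=2, seed=sid) for sid in (101, 102, 103)}
```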

Q5

What alternatives are there to an exam?

In some cases, an alternate assessment type may be a preferable way to assess a student's achievement of the learning outcomes.

To discuss an alternate assessment type, please log a service request.

Tips to write effective questions

Writing effective exam questions can take time and lots of redrafting. Consider working with others to write questions or to bounce ideas off.