High stakes vs low stakes testing – key differences to know
Debate persists around the use of the term ‘high-stakes assessment’, especially for large-scale and high-profile exams. We’re likely to come across the term in the media or in academic papers, especially during exam season, or whenever the merits of such testing are being debated.
In this brief guide, we’ll clarify the meaning of ‘high-stakes assessment’ and its invaluable cousin, ‘low-stakes assessment’. What makes a test ‘high stakes’ and why do we hear so much about it?
“High-stakes assessment”: A test that has real-world implications
To understand whether an assessment is low or high stakes, one must consider the consequence of the outcome on the learner.
If the result of the assessment will have a significant impact on a student’s future, it is deemed to be high stakes. For example, the outcome of a university entrance exam or selective school exam will determine a student’s placement into their preferred education program. Because these assessments shape the test-taker’s future life and career choices, they are deemed high-stakes assessments.
Similarly, an assessment that counts for more than 50 per cent of a final grade, such as some school or end-of-year university exams, or testing that leads to a formal professional qualification – such as the Chartered Accountants exams or the Bar exam – is also deemed high stakes.
The contribution of “low-stakes” testing to the education journey
Let’s contrast this with ‘low-stakes’ assessments. Low-stakes testing is a crucial form of assessment that supports the student’s learning journey and informs tailored teaching interventions without its outcomes directly influencing a student’s future.
In a previous article which explored the most common types of school assessments, we explained that low-stakes tests are sometimes called ‘formative’ or ‘diagnostic’ assessments. They provide valuable insights into a learner’s individual strengths and areas for further learning. They inform the teacher, parent, learner and other key stakeholders of how best to support the learner to unlock their full potential and tackle any learning gaps. An example of this is the NSW Department of Education’s popular Check-in assessment for primary and early high school students.
Why “low-stakes” does NOT mean low value, low quality or low importance
All assessments may include the same quality item types and test formats, but it’s their purpose and outcome that determine the stakes. Assessments such as NAPLAN, which assesses Australian school students’ fundamental academic skills; the OECD PISA-based Test for Schools, which informs school improvement plans; and the ICAS exams, which identify academic excellence in students, are excellent examples of low-stakes assessments. They are high-value assessments constructed from quality-assured test items deemed to be valid, reliable and fair. Yet they are low stakes because they don’t directly impact a student’s final grades or directly influence their future choices.
It’s all in a name
So, next time you come across a reference to ‘high stakes’ and ‘low stakes’ testing, you may find it a little easier to navigate the literature or news coverage and be certain about the types of exams being discussed.
About the author
Janison
Unlocking the potential in every learner
But what happens if the servers are nearly full and the load balancer has little to work with? Autoscaling kicks in – a feature that spins up more servers reserved for this moment. With extra servers in play, and more waiting in the wings, the servers running the tests work smoothly throughout the exam.
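The scale-out decision described above can be sketched in a few lines. This is a minimal illustration, not Azure’s or Janison’s actual autoscaling rules – the CPU thresholds and the one-server-at-a-time step are assumed values for the example.

```python
# Illustrative sketch of an autoscaling check: add capacity under heavy
# load, release it when load is light. Thresholds are assumptions.

def servers_needed(current_servers: int, avg_cpu_percent: float,
                   scale_out_threshold: float = 70.0,
                   scale_in_threshold: float = 30.0) -> int:
    """Return the server count after one autoscaling evaluation."""
    if avg_cpu_percent > scale_out_threshold:
        return current_servers + 1   # spin up an extra server
    if avg_cpu_percent < scale_in_threshold and current_servers > 1:
        return current_servers - 1   # release an idle server
    return current_servers           # load is comfortable; no change

print(servers_needed(4, 85.0))  # heavy load -> 5
print(servers_needed(4, 50.0))  # steady load -> 4
```

In a real platform this check runs continuously against live metrics, so capacity grows ahead of demand rather than after students notice slowdowns.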
3. Availability zones
Sometimes entire banks of servers can fail, often due to power outages. If this happens, Azure has another feature to allow students to continue their tests: availability zones.
Say a student is completing their test on a laptop in western Sydney, and the test is running on servers in a Microsoft data centre a few kilometres away. If there’s a power outage in the region, the test would typically fail. But Microsoft prevents this by having separate data centres within a region – “availability zones” with plenty of distance between each. So when the power fails in western Sydney, the system swaps over to another zone – say in eastern Sydney – that still has electricity. The zone has the test application installed and replicated on its servers, so the switch is seamless.
This is another form of server swapping but on a broader scale to protect against entire data centres going offline – something possible when you’re running thousands of tests across a large area. It’s another vital feature that adds to the resiliency of our system.
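The failover behaviour described above can be sketched as a simple zone-selection routine. The zone names and health checks here are hypothetical, purely to illustrate the idea of routing to the next healthy zone that has the test application replicated.

```python
# Hypothetical sketch of availability-zone failover: prefer the
# primary zone, fall back to the first healthy alternative.
# Zone names are illustrative, not real Azure zone identifiers.

ZONES = ["sydney-west", "sydney-east", "sydney-north"]

def pick_zone(zone_health: dict, preferred: str) -> str:
    """Return the preferred zone if healthy, else a healthy fallback."""
    if zone_health.get(preferred):
        return preferred
    for zone in ZONES:
        if zone_health.get(zone):
            return zone
    raise RuntimeError("no healthy availability zone")

# Power outage in the west: sessions move to a zone that still has power.
health = {"sydney-west": False, "sydney-east": True, "sydney-north": True}
print(pick_zone(health, "sydney-west"))  # -> sydney-east
```

Because the application and test data are already replicated in each zone, the switch is a routing decision rather than a redeployment, which is why it can appear seamless to the student.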
4. Exam monitoring
When thousands of students sit the same test at once, it’s possible to fully automate the exam with technology. But for massive exam events like these, we have our human engineers monitor the health of the system, tracking data like server response times and CPU usage to ensure everything is running smoothly and problems are caught before they surface.
At Janison, the bigger the exam event, the more intense the monitoring. Some exams are so big and important we assign entire teams of engineers to monitor the system’s health and anticipate and catch problems. We replicate this approach for any clients that run huge exam events.
For smaller exams or tests with lower stakes, our engineers still monitor the system’s health but rely more on alerts. For example, if the system sees an unusually large traffic spike during an exam that doesn’t correspond with the number of students sitting tests, an engineer is notified to investigate (if they haven’t already noticed). This approach allows them to crush problems quickly before they affect students’ tests.
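An alerting rule like the one just described can be sketched as a comparison between observed traffic and what the active student count predicts. The requests-per-student baseline and the tolerance multiplier below are assumed values for illustration, not Janison’s actual thresholds.

```python
# Illustrative traffic-spike alert: flag load that doesn't match the
# number of students currently sitting tests. Values are assumptions.

EXPECTED_REQUESTS_PER_STUDENT = 10   # assumed baseline requests/minute
TOLERANCE = 2.0                      # alert if traffic exceeds 2x expected

def should_alert(requests_per_minute: int, active_students: int) -> bool:
    """Return True when traffic is anomalously high for the student count."""
    expected = active_students * EXPECTED_REQUESTS_PER_STUDENT
    return requests_per_minute > expected * TOLERANCE

print(should_alert(25_000, 1_000))  # 2.5x expected -> True
print(should_alert(12_000, 1_000))  # within tolerance -> False
```

Tying the threshold to the live student count, rather than a fixed number, is what lets the same rule cover both small quizzes and large exam events without constant retuning.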