

AI and assessment: from detection to documentation

Detection alone was never sufficient. The sector has moved on, and the infrastructure question has changed with it.

AI & Assessment Integrity · 6 min read

The disruption that AI writing tools created for assessment design was genuine. It was also, in retrospect, an accelerant that exposed a pre-existing problem rather than creating a new one. Assessment that relied on a polished written submission as the primary evidence of learning was always vulnerable to the question of whether the student produced it. For decades that question was manageable. AI made it unavoidable. The sector's response was understandable: if students are using AI to produce work, detect it. But detection was never a complete answer, and the consensus position has shifted accordingly.

TEQSA's assessment reform guidance states explicitly that detecting AI use with certainty is all but impossible, and that institutions should emphasise assessment redesign rather than investing primarily in detection mechanisms. That is not a fringe view. It is the position of Australia's primary higher education regulatory body, and it reflects what assessment design practitioners have observed: detection tools produce false positives, miss sophisticated use, and do nothing to address the underlying question of whether a student engaged with the course material and developed their own understanding. The energy spent on detection infrastructure is energy that could have gone into building assessment infrastructure that makes the question of fabrication less relevant by design.

The shift to documentation

What replaced detection as the dominant frame is documentation. If an institution can demonstrate that a student engaged with course content progressively over a semester - that their thinking developed in timestamped increments, anchored to specific content, across multiple weeks - then the question of whether a final polished submission was AI-generated becomes a smaller part of the picture. The evidence of learning exists independently of the submission. The submission is one data point in a much larger record.

This is not just theory. At the University of Newcastle, Psychology courses ran embedded evidence activities with more than 500 students in Semester 1, 2025, and engagement averaged 74 to 80 per cent. When questions arose about academic integrity, educators had a progressive record of student thinking across the semester to draw on. The timestamp trail did not detect anything. It made authentic engagement visible by design, from the first week of the course. That visibility changed the nature of the integrity conversation entirely.

The AI integration paradox

Institutions are navigating a genuine paradox: they must simultaneously embrace AI as a legitimate learning support and ensure that the evidence of learning is authentic. These goals are not in conflict if the infrastructure is right. Embedded, progressive, timestamped evidence collection documents how a student's thinking developed in an AI-enabled environment - not whether they used AI. A student whose thinking evolves demonstrably across weeks, in response to specific content, in a learning environment where fabrication would require extraordinary effort sustained over an entire semester, is a student whose engagement is visible regardless of what supporting tools they used along the way.

The infrastructure question has changed. It is no longer primarily about how institutions catch AI use. It is about how they build the evidence trail that demonstrates learning occurred - progressively, authentically, across the journey that leads to the final submission. That is an assessment design question. And it begins not with what is submitted at the deadline, but with where evidence is collected across the weeks that precede it.


Sean Duffy · Co-founder & CEO

See what becomes visible.

See how progressive evidence collection addresses the assessment integrity challenge in your programs. Thirty minutes with Sean.

Book a Demo