Assessment Evidence

Move beyond the endpoint submission.

Most of what students learn across a semester leaves no trace. Canvas captures the submission. Stackle captures everything that led there, right where it happened, without adding work for educators.

The Problem

A submission tells you a student finished. It doesn't tell you whether they understood.

Canvas has always captured the endpoint. What it cannot capture is what happened across the weeks before it, whether a student engaged with the material, how their thinking shifted, where understanding developed or stalled. For most institutions, that part of the learning journey is invisible. Educators make their best judgement from a single polished document submitted close to a deadline.

In an AI-enabled environment, that single document is no longer sufficient. A well-structured essay, a coherent analysis, a competent reflection: each of these can now be generated without a student having engaged with the course at all. Institutions that rely on endpoint submissions alone cannot demonstrate that learning occurred. They can only demonstrate that something was submitted.

We'd read through 500 submissions and it just didn't feel particularly authentic. There was no way to know.

Dr Tegan Bradley · Lecturer, Psychology · University of Newcastle

What Institutions Can See

Three views. One complete evidence record.

Stackle gives institutions three distinct views of the evidence that their current tools cannot produce. Each one is auditable, exportable, and captured inside the LMS right where learning happened.

See how thinking developed, version by version.

Every time a student saves a response, Stackle creates a new timestamped version. Not an overwrite - a new record. The complete history of every revision is stored and comparable. A coherent development arc across multiple versions, captured over weeks, is extremely difficult to fabricate.

Evidence Value

Demonstrates that thinking developed progressively over time - not that a polished response was submitted at the deadline.

app.stackle.app/responses/comparison
Clinical Reasoning Journal · Sarah Mitchell
Version 3 · Apr 2, 2026 · 09:17 · Current

What struck me most was how Nurse Patel adapted her communication approach for each patient - using visual aids with Mr Thompson, while engaging Mrs Chen in shared decision-making. This wasn't a standardised protocol; it was responsive clinical judgment. I initially assumed good communication was mostly about clarity.

Word count: 187 · Removed: 38 · Added: 94
Version 2 · Mar 21, 2026 · 14:32

My initial thoughts on patient communication were largely theoretical. I understood the frameworks but hadn't yet connected them to...

Version 1 · Feb 24, 2026 · 11:08

Communication in clinical settings seems straightforward - convey information clearly and listen...


3 versions · First submitted Feb 24
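The versioning model described above - every save creating a new timestamped record rather than an overwrite - can be made concrete with a minimal sketch. This is not Stackle's implementation; the class names, fields, and the crude word-level diff are hypothetical, shown only to illustrate the append-only idea:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class ResponseVersion:
    """One immutable saved version of a student's response."""
    number: int
    text: str
    saved_at: datetime


@dataclass
class ResponseHistory:
    """Append-only history: each save adds a version, never overwrites."""
    versions: list[ResponseVersion] = field(default_factory=list)

    def save(self, text: str) -> ResponseVersion:
        version = ResponseVersion(
            number=len(self.versions) + 1,
            text=text,
            saved_at=datetime.now(timezone.utc),
        )
        self.versions.append(version)  # earlier versions stay untouched
        return version

    def diff_counts(self, a: int, b: int) -> tuple[int, int]:
        """Words removed and added between versions a and b (1-indexed).
        A naive word-set comparison, not a real diff algorithm."""
        old = self.versions[a - 1].text.split()
        new = self.versions[b - 1].text.split()
        removed = len([w for w in old if w not in new])
        added = len([w for w in new if w not in old])
        return removed, added
```

The key property for evidence purposes is that `save` only ever appends: every earlier version, with its timestamp, remains comparable afterwards.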

See who engaged, and where they didn't.

For a cohort of 500 students, the progress matrix shows at a glance which students have responded to which questions, without opening a single individual record. A filled dot means answered. An empty circle means not yet. For quality assurance teams, this view answers the question that matters: was engagement consistent across the cohort, or concentrated at the deadline?

Evidence Value

Demonstrates cohort-wide engagement across the learning journey - not just submission counts at the endpoint.

app.stackle.app/activities/matrix

NURS-3204 · Week 8 Reflection

48 students · 4 questions

Student · Progress
Sarah M. · 100%
James T. · 75%
Priya K. · 50%
Liam O. · 100%
Chen W. · 25%
Emily R. · 75%
Nina B. · 100%
Raj P. · 0%

42 of 48 students responded to Q1 · 6 not started
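Conceptually, the matrix above is a simple aggregation over (student, question) response pairs. A minimal sketch, with hypothetical function and field names - Stackle's actual data model is not published:

```python
def progress_matrix(students, questions, responses):
    """For each student, mark each question answered or not and
    compute a completion percentage across all questions.
    `responses` is a set of (student, question) pairs."""
    matrix = {}
    for student in students:
        answered = [(student, q) in responses for q in questions]
        matrix[student] = {
            "answered": answered,  # filled dot / empty circle per question
            "progress": round(100 * sum(answered) / len(questions)),
        }
    return matrix


def question_summary(students, question, responses):
    """How many students have responded to one question, and how many
    have not started it."""
    done = sum(1 for s in students if (s, question) in responses)
    return done, len(students) - done
```

The per-question summary is what produces a line like "42 of 48 students responded to Q1" without opening any individual record.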

Audit-ready documentation at every level.

Excel-based evidence reports can be generated at any point in the semester - by course, collection, activity, question, or individual student. Reports contain complete raw response data, not just summaries. Every version, every timestamp, every change. Generated on demand, delivered by email, stored for historical access.

Evidence Value

Produces the audit-ready documentation that institutional quality assurance processes require - without manual compilation or additional administrative overhead.

app.stackle.app/reports

Generate Evidence Report

Selected Scope

NURS-3204 · Semester 1, 2026 · All collections

All student responses - complete text, all versions
Timestamps for every save and submission
Version comparison data - added and removed per revision
Cohort engagement summary by question

Delivered by email · Stored for historical access
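Structurally, a report like this is a flattening of the version history into audit rows - one row per saved version, each carrying its timestamp and full text. A rough sketch of that flattening, with hypothetical names; CSV stands in here for the Excel output the product describes:

```python
import csv
import io


def report_rows(histories):
    """Flatten every saved version into one audit row each.
    `histories` maps (student, question) -> list of
    (version_number, saved_at, text) tuples."""
    rows = [("student", "question", "version", "saved_at", "text")]
    for (student, question), versions in sorted(histories.items()):
        for number, saved_at, text in versions:
            rows.append((student, question, str(number), saved_at, text))
    return rows


def to_csv(rows):
    """Serialise the rows; a real exporter would write .xlsx instead."""
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()
```

Because the rows carry raw response text and timestamps rather than summaries, the same flattening works at any scope - one student, one question, or a whole course.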

SECTOR DIRECTION

What the sector is asking for.

These are not Stackle's arguments to make. They belong to the regulators, researchers, and sector leaders who have already made them.

Stackle generates the evidence infrastructure institutions use when addressing these requirements. How that evidence is applied within any specific regulatory or accreditation framework is always a matter of institutional assessment design.

TEQSA - Assessment Reform for the Age of AI

From detection to documentation

For several years, the dominant institutional response to assessment integrity was detection. The sector has moved past that position. TEQSA's assessment reform guidance states explicitly that institutions need to emphasise the redesign of assessment rather than investing primarily in detection mechanisms - and describes detecting AI use with certainty as all but impossible.

What has replaced detection as the primary frame is documentation. TEQSA's Threshold Standard 1.4.4 requires institutions to evidence the process of learning over time and in context. That language describes a fundamentally different kind of infrastructure from endpoint submission or AI scanning. It describes progressive, contextual, timestamped records of how learning developed. That is what Stackle generates.

Evidencing the process of learning over time and in context.

TEQSA - Threshold Standard 1.4.4

Torrens University - Assurance of Learning in the AI Era, December 2025

Transparency as a procurement requirement

The Torrens University Assurance of Learning report represents a shift in how the sector thinks about technology selection. Participants across the research explicitly criticised black-box tools that prevent institutions from accessing granular learning data. The report's recommendation is direct: pedagogical evaluation and data transparency should become standard procurement criteria.

This matters because it reframes the technology question. It is no longer sufficient for a platform to collect evidence. Institutions need to own that evidence, access it, export it, and audit it. Stackle's architecture is open and LMS-native by design. Every response, every version, every revision is accessible and exportable. There is no proprietary algorithm mediating what institutions can see.

Pedagogical evaluation and data transparency should become standard procurement criteria.

Torrens University - Assurance of Learning in the AI Era, December 2025

Professional Accreditation Frameworks - Health, Law, Business Education

Longitudinal evidence across the learning journey

Across professional accreditation frameworks in health, law, and business education, a consistent direction is emerging: demonstrating competency development over time, not just at the point of graduation. This is a shift from outcomes documentation to journey documentation - from asking what a student can do, to how that student's capability developed.

The infrastructure question this creates is the same regardless of disciplinary framework or accreditation body: how do you generate a structured, auditable record of professional thinking development across a program, embedded in the learning environment where that development actually occurred? The kind of evidence Stackle generates is directly relevant to that question.

Relevant to ANMAC, APAC, SRA, AACSB, EQUIS, and equivalent frameworks.

The Castlereagh Statement - April 2026

A national commitment to make the learning process visible

The Castlereagh Statement, published in April 2026 and shaped by more than 80 educators, leaders, and students from over 30 Australian organisations, is the most significant cross-sector consensus document on AI and education yet produced in Australia. Its signatories include the Deputy Vice-Chancellor (Education) at the University of Melbourne, the Director of Assessment 2030 at Curtin University, and the researcher whose work defined the two-lane assessment framework.

Principle 3 of the Statement calls for a fundamental reorientation of assessment philosophy to draw on diverse forms of evidence, accumulated over time and across contexts, to verify learner capabilities. It commits explicitly to shifting teaching and assessment design to make the learning process visible. In the near horizon, it calls for phasing out AI detection in favour of responsible and effective use. In the far horizon, it calls for building infrastructure that verifies demonstrated capability and allows seamless movement between learning contexts throughout life.

Shift teaching and assessment design to make the learning process visible.

The Castlereagh Statement - Principle 3, April 2026

Read the Castlereagh Statement →

In Practice

Evidence collection at scale. Without additional overhead.

The University of Newcastle ran a three-course pilot across Psychology, Nursing, and Law, capturing progressive evidence of student learning without adding a new system for students to navigate.

University of Newcastle · Pilot

We wanted reflections to feel genuine and easy for students, while also being meaningful and embedded in their learning journey, and we wanted to know it could work across an entire university, not just one course.

Meegan McHugh · Manager, Learning Technology · University of Newcastle


74–80%

Student engagement rate

1,500+

Learners across the pilot

3

Courses across Nursing, Psychology, and Law

Also working with

University of Melbourne · University of Sydney · UNSW College · Haileybury · Vlerick Business School · OneSchool Global · Australian Christian College

Sean Duffy · Co-founder & CEO

See what becomes visible.

The sector has pointed in a direction. What that looks like inside your institution - your programs, your accreditation context, your current infrastructure - is a different conversation. That's the one worth having.

Book a Demo