Research Partnership
Partner with us on a pilot semester.
Stackle provides the evidence infrastructure. You bring the cohort and the question. Together we find out what becomes visible when learning is documented where it happens.
The Opportunity
A pilot that generates evidence - for your institution and for us.
Most institutions exploring evidence-based assessment are asking the same questions but generating answers in isolation. A pilot with Stackle is a different kind of engagement. You embed evidence collection activities directly inside your Canvas or Brightspace pages for one semester. Students engage right where learning happens - no new system, no additional login. At the end of the semester, you have a progressive, timestamped record of how thinking developed across your cohort. So do we.
What makes this a research partnership rather than a product trial is the question you bring with you. Stackle provides the infrastructure at no cost for the pilot semester. The academic or learning design team defines the cohort, the discipline context, and the evidence question they want to explore. The data that emerges belongs to the institution. The story - with your consent - becomes a case study that helps other institutions understand what is possible.
The University of Newcastle began as exactly this kind of pilot: three courses, one semester, and a specific question about authentic engagement at scale across Psychology, Nursing, and Law. It expanded to sixteen courses across six disciplines in under two months - not because Stackle pushed for expansion, but because the evidence was compelling enough that other academics asked to be involved.
3 → 16
Courses in two months at Newcastle
74–80%
Student engagement across disciplines
1 semester
From pilot to institutional expansion
The Questions
What a pilot could help you find out.
These are not Stackle’s questions to answer alone. They are the questions that learning designers, academics, and institutional researchers across the sector are actively asking - and that a single well-designed pilot semester could begin to address.
01
Does embedded evidence collection change the quality of student engagement?
When activities are placed directly inside the learning content rather than submitted separately, students engage differently. A pilot generates the evidence to understand whether that difference is meaningful - in depth of response, consistency of engagement, and willingness to return and revise.
02
Does progressive evidence reduce reliance on AI-generated content?
A coherent development arc captured across multiple weeks in a live learning environment is extremely difficult to generate artificially. A pilot explores whether timestamped, versioned, contextual evidence shifts the integrity dynamic - not by detecting AI use, but by making authentic engagement visible by design.
03
Does visible learning development change how educators intervene?
When educators can see which students are engaging and where understanding is developing - before the deadline, not after it - their ability to intervene changes. A pilot can surface whether real-time visibility into the learning journey changes teaching practice, not just assessment outcomes.
Where It Fits
The evidence need is clearest in these disciplines.
Each of these contexts has a specific gap between where learning happens and where evidence is currently collected. A pilot addresses that gap directly, inside the LMS environment the institution already uses.
Nursing & Allied Health
Clinical placement documentation
The most formative learning in clinical programs happens in placement environments - patient interactions, clinical decisions, moments where theoretical frameworks meet real situations. Embedded evidence collection captures that reasoning as it occurs, not days later when the detail has faded.
From reconstructed portfolio to in-context evidence trail
Psychology & Social Work
Professional identity development at scale
Psychology cohorts of 500+ students present a specific challenge: how do you maintain authentic reflective practice at a scale that makes individual attention impossible? The University of Newcastle addressed this directly - 74–80% engagement across a cohort that size, with no additional systems for students to navigate.
From compliance-driven reflection to genuine engagement at scale
Law & Professional Practice
Reflective evidence for accreditation
Legal education accreditation frameworks increasingly require longitudinal evidence of professional thinking development - not just what a student can do at graduation, but how that capability developed across the program. Embedded evidence collection generates that record inside the Canvas or Brightspace environment the faculty already uses.
From endpoint submission to progressive professional development record
Education & Teacher Training
Practicum evidence across blended cohorts
Education students split their time between campus and school environments, and their most significant learning often happens in the practicum context that is hardest to document. A single evidence infrastructure that works equally for on-campus and fully online students - with no adaptation required - addresses the visibility gap that blended cohorts create.
From reconstructed practicum reflection to embedded engagement record
The Exchange
What you get. What we get.
Your institution gets
Full Stackle infrastructure at no cost for the pilot semester, configured for your Canvas or Brightspace environment and provisioned through standard LTI 1.3 deployment.
A progressive, timestamped evidence record across your cohort - every response, every revision, every version - exportable at any level from individual student to full course.
Real-time visibility into engagement across your cohort before the deadline arrives, with the complete analytics hierarchy from course-wide overview to individual response version.
The evidence data belongs to your institution. We will never share or reference your institutional data without your explicit consent.
Stackle gets
A real implementation in a disciplinary context we can learn from - the specific evidence needs, the workflow questions, and the moments where the infrastructure needs to adapt.
With your consent, a case study that helps other institutions understand what evidence-based assessment looks like in practice in your discipline.
An ongoing relationship with the academic and learning design community that is doing the most serious thinking about assessment in the AI era.
Start the Conversation
Book a 30-minute conversation with Sean.
Not a demo. Not a sales call. A conversation about your cohort, your discipline context, and whether a pilot semester makes sense for your institution. If it does, we move forward. If it doesn’t, you’ll have a clearer picture of what evidence-based assessment infrastructure looks like and what questions it can help you answer.
Currently working with: University of Melbourne · University of Newcastle · University of Sydney · UNSW College · Vlerick Business School