Project Highlights
Focused snapshots of product design projects and outcomes
Scaling Learning Assessment
Doubling grader efficiency by understanding workflow and reducing cognitive burden
Contextual Inquiry
Cross-functional Synthesis
Process Mapping
Reducing Cognitive Load
The Challenge
As Springboard rapidly scaled, the team recognized an opportunity to optimize the newly launched grading system. With grading averaging 30 minutes per submission (twice our initial target) and grader satisfaction at 20%, there was a clear need to improve both the efficiency and consistency of the grading experience.
The Process
I led the redesign, starting with a two-week video diary study that captured graders in their natural workflow. The sessions revealed significant usability issues: the workflow required more than 25 steps, simultaneous management of 3-6 windows, manual point tracking, and reliance on external tools.
Through collaborative synthesis with cross-functional partners, I identified streamlining opportunities and translated findings into prioritized Jobs-to-be-Done, using RICE scoring to balance business impact with technical feasibility. I focused first on foundational usability: streamlined navigation, optimized use of space, and interfaces that reduced cognitive burden.
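For readers unfamiliar with RICE, the prioritization step can be sketched as a simple calculation. This is a minimal illustration only; the job names, reach figures, and scores below are hypothetical stand-ins, not the actual project data.

```python
from dataclasses import dataclass

@dataclass
class Job:
    """A Job-to-be-Done with RICE inputs (all values illustrative)."""
    name: str
    reach: int        # people affected per quarter
    impact: float     # e.g. 0.25 (minimal) up to 3 (massive)
    confidence: float # 0.0-1.0
    effort: float     # person-weeks

    @property
    def rice(self) -> float:
        # RICE score = (Reach * Impact * Confidence) / Effort
        return self.reach * self.impact * self.confidence / self.effort

# Hypothetical jobs for illustration only
jobs = [
    Job("Streamline navigation", 120, 2.0, 0.8, 2),
    Job("Automate point tracking", 120, 3.0, 0.9, 4),
    Job("Single-window grading", 120, 3.0, 0.5, 12),
]

# Rank highest-leverage work first
for job in sorted(jobs, key=lambda j: j.rice, reverse=True):
    print(f"{job.name}: {job.rice:.0f}")
```

Note how a high-impact idea with heavy effort and low confidence (the single-window concept) can rank below cheaper foundational fixes, which mirrors the prioritization described above.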
The Solution
Our core design principle: remove friction so graders could focus entirely on delivering valuable feedback. Technical constraints ruled out the preferred single-window solution, so I pivoted to optimize within those boundaries.
By streamlining the UI, automating repetitive tasks, and integrating functionality, I delivered a unified solution featuring 1) a fixed-width panel optimized for split-screen grading, 2) criterion-level scoring with visual progress tracking that eliminated manual calculations, and 3) one-click access to reference materials.
Key Outcomes
50% Time Reduction
100% Grader Satisfaction
Eliminated Workarounds
Enhanced Student Experience
Enhancing Feedback Quality at Scale
Improving feedback consistency and educational value through strategic design interventions
Evidence-Based Learning Practices
Behavioral Design
Assistive Design
The Challenge
After successfully optimizing grading efficiency, Springboard identified an opportunity to improve feedback quality. A quality audit revealed that 35% of feedback was unclear or irrelevant, that 22% of submissions needing improvement lacked actionable guidance, and that 32% of passing assignments received minimal feedback. With graders now working efficiently, the focus shifted to ensuring students received consistent, high-quality feedback that supported their learning.
The Process
I led the design phase, building on research from the grading efficiency work. Analysis of grading sessions and feedback audits revealed three core challenges: graders 1) struggled with phrasing and composition, 2) spent significant time off-platform using third-party proofreading tools (e.g., Grammarly), and 3) had difficulty prioritizing what feedback to give on complex submissions.
Drawing on educational best practices, I established a framework for effective feedback: it should be actionable (clear next steps), clear (easy to understand), aligned with rubric criteria, and constructive (acknowledging strengths alongside areas for growth).
I translated these criteria and insights from our research into Jobs-to-be-Done, prioritizing interventions that would guide graders toward better feedback without adding time burden. Using RICE scoring, I focused on three strategic areas: layout optimization to naturally force alignment of feedback with rubric criteria, integrated writing support tools, and embedded logic to ensure appropriate feedback coverage.
The Solution
Our core design principle: make high-quality feedback the path of least resistance. Rather than relying on graders' writing skills or training, I embedded best practices directly into the interface through three strategic interventions.
Strategic layout redesign positioned feedback inputs directly adjacent to each rubric criterion, eliminating the need for graders to repeatedly specify which criteria they were addressing. This tightened the connection between evaluation and feedback while reducing cognitive load.
Integrated writing support provided context-aware sentence starters and a personalized comment library, helping graders compose feedback confidently without starting from scratch each time. Built-in grammar checking eliminated the need to switch to external tools.
Smart feedback logic guided graders to provide feedback where students needed it most, requiring input when criteria weren't mastered while keeping optional feedback easy to add for exemplary work. Pre-submission prompts encouraged graders to balance negative feedback with at least one piece of affirming feedback whenever applicable.
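The gating behavior described above can be sketched as a small pre-submission check. This is an illustrative rule set under assumed data shapes (a list of criteria with `mastered` and `feedback` fields), not the shipped implementation.

```python
def validate_feedback(criteria):
    """Check a grading submission before it can be sent.

    `criteria` is a list of dicts with keys 'mastered' (bool) and
    'feedback' (str). Returns (issues, prompts): blocking issues that
    must be fixed, and soft prompts the grader may dismiss.
    Hypothetical field names for illustration only.
    """
    issues, prompts = [], []

    # Hard requirement: unmastered criteria must have feedback
    for i, c in enumerate(criteria, start=1):
        if not c["mastered"] and not c["feedback"].strip():
            issues.append(f"Criterion {i}: feedback required when not mastered")

    # Soft prompt: balance critical feedback with something affirming
    has_negative = any(not c["mastered"] for c in criteria)
    has_affirming = any(c["mastered"] and c["feedback"].strip() for c in criteria)
    if has_negative and not has_affirming:
        prompts.append("Consider adding at least one piece of affirming feedback")

    return issues, prompts
```

Separating blocking issues from dismissible prompts reflects the design intent: required input where students need it most, gentle nudges everywhere else.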
Key Outcomes
100% Feedback Coverage
100% Actionable Feedback
50% Reduction in Low-Quality Feedback
Maintained Efficiency and Satisfaction
Flexible Call Credits: A Pilot in Student-Driven Learning Support
Designing and testing a credit-based mentoring system that maintained learning outcomes while enabling sustainable scaling
Learning Experience Design
Behavioral Design
Rapid Research
The Challenge
As Springboard scaled, the weekly 1:1 mentor call model was becoming operationally unsustainable. Before committing significant technical investment to evolve from our legacy system, the team needed to validate whether an alternative model could maintain student learning progress and satisfaction. The challenge: quickly design and test a flexible credit-based system where students could schedule support as needed, a model that deviated in both user experience and technical lift from the familiar pattern of standing weekly appointments.
The pilot needed to answer: Could students progress through our courses at the same rate? Without a regular call cadence, would they feel clear about when and how to get support? With only 6 weeks to test and tight design constraints, success required making an unfamiliar system intuitive from day one.
The Process
With only a few weeks to design and develop a solution to pilot, I mined existing user insights and applied proven design principles to quickly create a testable experience.
Drawing on our research repository, I identified the four critical support scenarios where students needed help: struggles with tools, confusion about how to start projects, uncertainty about approach, and difficulty addressing failed-project feedback.
I applied learning experience design principles to scaffold the new concept. Drawing on research about effective onboarding and just-in-time learning, I designed for progressive disclosure (introducing complexity gradually), familiar mental models (comparing to tutoring/office hours), and contextual triggers (surfacing information when students needed it most).
The Solution
Our core design principle: make the unfamiliar feel intuitive by connecting to familiar patterns and providing just-in-time guidance.
Rather than relying on upfront tutorials, I designed a system that taught through use. A visual credit counter (like a data plan) made the system immediately graspable. One-click booking from any course page reduced friction. Pre-filled booking reasons based on context helped students articulate their needs.
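The credit model itself is simple enough to sketch. The class and method names below are hypothetical, chosen only to illustrate the "data plan" mental model of a visible balance that decrements per booking; they do not describe Springboard's actual system.

```python
class CreditWallet:
    """A student's flexible call credits (illustrative model only)."""

    def __init__(self, credits: int):
        self.credits = credits  # visible balance, like a data plan

    def can_book(self, cost: int = 1) -> bool:
        return self.credits >= cost

    def book_call(self, reason: str, cost: int = 1) -> str:
        """Spend credits to book a call; the reason can be pre-filled
        from the page the student booked from."""
        if not self.can_book(cost):
            raise ValueError("Not enough credits")
        self.credits -= cost
        return f"Booked: {reason} ({self.credits} credits left)"


# Usage sketch: a student books help from a course page
wallet = CreditWallet(2)
print(wallet.book_call("Stuck on project setup"))
```

Returning the remaining balance with every booking keeps the counter in front of the student, reinforcing the system's mechanics through use rather than upfront tutorials.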
Strategic reinforcement across the student journey ensured students encountered the Flex Calls concept at key decision points: after failed projects, during progress stalls, and when approaching challenging content. Each touchpoint reinforced when calls would be most valuable.
Familiar anchors reduced cognitive load: positioning Flex Calls as similar to office hours or tutoring sessions gave students existing mental models to work with, making the new system feel less foreign.




