Streamlined autism assessment
About the project
As the company transitioned from a fast-paced startup to a more stable scale-up, its vision of becoming a holistic mental health provider was still at odds with its operational reality. Autism assessments—its core revenue stream—were built on rushed workflows that didn’t reflect clinical needs. This project focused on streamlining those workflows by turning spreadsheet-based tools into integrated product features.
My role in it
I led the research, problem discovery, and product design, and co-implemented front-end components. This included shadowing clinicians, interviewing stakeholders, facilitating workshops, wireframing, and usability testing to shape and validate the solution.
Why it mattered
Autism assessments were central to the business, yet the clinician experience was clunky and manual. Clinicians relied heavily on spreadsheets to collect and score diagnostic evidence, creating inefficiencies and a higher risk of errors. Meanwhile, families often felt unsupported post-diagnosis, and the company’s broader vision of holistic care couldn’t move forward without fixing this foundation.
How we did it
Research problem space
To understand the clinician experience, I gathered insights through informal chats, shadowing sessions, and by reviewing entries in our research repository.
Key insights:
• Clinicians struggled to take structured notes during video assessments. Notes were often chaotic or incomplete, leading to rewatching recordings post-session to refine them.
• To map evidence to diagnostic tools, clinicians relied on makeshift spreadsheets, adding significant manual effort and risk of oversight.
From there, I focused discovery on the clinician workflow and translated insights into a service blueprint and user journey. I facilitated problem-framing and ideation workshops with the team, which led to this problem statement:
Clinicians conducting autism assessments struggle to capture structured notes in real time, often needing to rewatch recordings or refine chaotic notes. Mapping evidence to diagnostic criteria relies on manual spreadsheets. What’s missing is an integrated way to record and organize evidence during the session—reducing cognitive load, risk of error, and time spent on post-session admin.
Define success
We defined success across three layers:
• User outcomes – Do clinicians report reduced cognitive load and higher confidence in capturing structured evidence during sessions?
• Product usage – Are at least 80% of clinicians using the tool during assessments, and does it reduce time spent rewatching recordings or on post-session admin by at least 30%?
• Business impact – Does it shorten the overall time to complete an assessment report and reduce errors or rework flagged in clinical QA?
Ideate
Next, I worked with clinicians to unpack how their spreadsheets functioned and what they were trying to achieve. From that, I translated the process into an automated flow with two parts:
• A note-taking system embedded in live sessions
• Semi-automated mapping of evidence to diagnostic tools
Solution
Live-session note-taking
We added a lightweight note-taking system directly into the video session interface. Since our research showed that most clinical notes were short, the input field was designed to take up minimal space—helping clinicians stay focused on the client. Each note was automatically grouped under the corresponding assessment task, which is designed to elicit specific behaviours relevant to diagnosis. This structure allowed notes from a given task to be easily linked to related diagnostic criteria, helping clinicians assess for indicators of autism more efficiently.
Semi-automated evidence mapping
Session notes were automatically organised under the relevant tasks and timestamped, making it easy to jump back to key video moments. In the diagnostic questionnaire, we mapped each question to the tasks that provide relevant evidence—so clinicians could access the right notes instantly, without needing to scroll or search manually.
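The mapping described above can be sketched as a simple data model: notes carry a task and a timestamp, and each diagnostic question declares which tasks can supply evidence for it. The names and shapes below are illustrative assumptions, not the product's real schema.

```typescript
// Hypothetical data model — field names and structure are illustrative only.
interface SessionNote {
  taskId: string;       // assessment task the note was captured under
  timestampSec: number; // offset into the session recording, for jumping back
  text: string;
}

// Each diagnostic question lists the tasks that can provide relevant evidence.
interface DiagnosticQuestion {
  id: string;
  evidenceTaskIds: string[];
}

// Surface the relevant notes for a question, ordered by when they occurred,
// so clinicians reach the right evidence without scrolling or searching.
function notesForQuestion(
  question: DiagnosticQuestion,
  notes: SessionNote[]
): SessionNote[] {
  return notes
    .filter(n => question.evidenceTaskIds.includes(n.taskId))
    .sort((a, b) => a.timestampSec - b.timestampSec);
}
```

Keeping the question-to-task mapping as data (rather than hard-coding it) meant clinical leads could adjust which tasks feed which criteria without code changes.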
To further streamline the process, we added:
• Best-practice references directly in the flow (no more flipping through manuals)
• A traffic light system to help clinicians quickly scan diagnostic scores
• Pre-written sentence starters (based on the score) to reduce time spent writing reports
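The traffic-light scan and score-based sentence starters can be sketched as a small lookup. The score bands and wording below are placeholder assumptions; the real cut-offs and phrasing were owned by the clinical team.

```typescript
// Hypothetical score bands — real clinical thresholds differed and are not shown here.
type Light = "green" | "amber" | "red";

function trafficLight(score: number): Light {
  if (score <= 1) return "green"; // little or no evidence of concern
  if (score <= 3) return "amber"; // borderline — review the linked notes
  return "red";                   // strong indicator — flag in the report
}

// Pre-written sentence starters keyed by band (illustrative wording only).
const sentenceStarters: Record<Light, string> = {
  green: "No notable indicators were observed during this task; ",
  amber: "Some behaviours during this task warrant further consideration; ",
  red: "Clear indicators were observed during this task; ",
};
```

A clinician scanning the questionnaire sees the colour at a glance, and the matching starter seeds the report section so writing begins from a consistent baseline.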
Finally, all clinical notes were automatically transferred to the report—removing the need for manual copy-pasting and reducing the risk of errors.
What it achieved
After rollout, the tool showed promising impact across key areas:
• Clinician outcomes – Most clinicians reported lower cognitive load and greater confidence capturing structured evidence in real time.
• Usage – Over 70% of clinicians actively used the tool during assessments. While just shy of the 80% target, qualitative feedback showed strong adoption once initial habits were established. The tool also cut down time spent rewatching and reorganizing notes—though not yet by the full 30%.
• Business impact – The solution launched to 340 clinicians and reduced evidence collection time by 20 minutes per session within the first three months. Early signs also pointed to shorter reporting timelines and fewer QA issues linked to missed or mismatched evidence.