UX Design Lead · 0→1 · AI Product
Surgical videos are recorded by default — yet surgeons still write notes from memory, days later. I led the end-to-end design that took a patent-pending algorithm to a video editor turning hours of footage into reviewed, tagged clips surgeons can actually use.
The brief
Dr. Lingga Adidharma built a patent-pending AI algorithm that could compress six-hour surgeries into 15-minute highlight reels — and needed an interface to take it to market. After our first client meeting, my team put it plainly: cart before the horse. We stepped back before touching wireframes to validate the problem and the user. That reframe shaped everything.
"How might digital imagery improve surgeons' recall, collaboration, and time in order to improve patient outcomes?"
— Our design question
Discovery
Literature review, legal scan across HIPAA and patient privacy, competitive benchmark. Technical and regulatory landscape established before speaking to anyone.
On-site at Harborview Medical Center. Shadowed Dr. Adidharma through handoffs and consultations to understand the full workflow — not just the documentation moment.
1:1 interviews with experienced surgeons. Focused on note-taking, memory under pressure, and use of digital imagery. Each session sharpened where the real friction lived.
"I don't have time to be Ernest Hemingway here."
— Experienced surgeon, in-depth interview
What we found
Patient care always interrupts documentation. Notes are finished later, from memory.
"Airway dilated to 2mm" means something different to every reader. A picture of an almost-closed airway doesn't.
Without video, patients undergo repeated scans. Urgency gets lost in text.
Operative notes written days later. Multiple similar surgeries in one day blur together.
"Reading 'the airway is dilated to 2 millimeters' is a lot harder to understand than a picture of the airway almost closed."
— Surgery resident, contextual inquiry
Definition
Before writing a single principle, we brought in Josh Lovejoy — who invented LLM-powered personalization at Google Shopping — to pressure-test our AI assumptions. His critique sharpened what "surgeon autonomy" actually means in practice.
Fit existing workflows. Key tasks in minutes, not hours.
Surgeons in the room at every iteration. Comfort with AI varies — design for the full spectrum.
Function over form. Every feature earns its place clinically.
Ship something credible. Leave a clear roadmap, not speculative features.
Surgeons approve everything. Their corrections train the model. Blind trust is dangerous — structured disagreement is a feature.
My design scope
I mapped the full system, then scoped to the three moments where surgeon judgment is most active:
Design process
My process moved from low-fidelity sketches and storyboards, through three rounds of wireframes and mid-fidelity prototypes, to a fully annotated engineer handoff. Each round was grounded in surgeon feedback, and one concept was killed before it could become a liability.
We sketched independently, then converged to surface conflicting assumptions. AI-generated storyboard panels covered two scenarios: post-op notes and care team handoff.
Three structural variants. I pushed for split-panel based on sponsor feedback: "viewing text and video together keeps me organized."
Introduced a star/person icon system to distinguish AI vs. surgeon-generated tags. Testing revealed the icons weren't landing. Removed them. Redesigned around color and explicit labels. The principle survived; the execution changed.
Measured with the AI Trust Score: understanding (Good), efficiency (Okay), control (Bad). The control gap drove R5.
Confusion over right-rail affordances, unclear clip states, AI decisions not reversible enough. Every R5 change addressed one of those three.
Final design system delivered with component annotations, interaction specs, and a full engineer handoff package. Every element traces back to a research finding. Two team members continue working with Dr. Adidharma as the algorithm moves through the patent process.
From the sketchbook
Early Concept
Before high-fidelity, we validated the core mechanic: a vertical timeline linking clip moments to a video player. Each node maps to a specific surgical timestamp. Clicking a node or a clip card syncs both the playhead and the thumbnail — keeping video and annotation in lockstep.
"Viewing text and video together keeps me organized."
— Dr. Adidharma, mid-fi feedback session
Design decision
Our client loved this concept — but we scrapped it. A limited development timeline made it unrealistic to build. Honest scoping meant shipping something credible over something ambitious. That decision freed us to go deeper on what remained.
Click a timeline node or a clip card to sync the video player and thumbnail. Scrapped due to development-timeline constraints.
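The scrapped sync mechanic can be sketched in a few lines. This is an illustrative model only — the names (`ClipNode`, `PlayerState`, `syncSelection`) are hypothetical, not from the actual prototype. The key design choice is that both entry points (timeline node, clip card) route through one handler, so playhead and thumbnail can never drift apart.

```typescript
// Illustrative sketch of the scrapped timeline-sync concept.
// All names here are hypothetical, not from the real design files.

interface ClipNode {
  id: string;
  timestampSec: number;  // position in the surgical recording
  thumbnailUrl: string;  // still frame shown in the clip card
}

interface PlayerState {
  playheadSec: number;
  activeThumbnail: string;
  selectedClipId: string | null;
}

// Clicking either a timeline node or its clip card calls the same
// handler, keeping video and annotation in lockstep by construction.
function syncSelection(state: PlayerState, node: ClipNode): PlayerState {
  return {
    playheadSec: node.timestampSec,
    activeThumbnail: node.thumbnailUrl,
    selectedClipId: node.id,
  };
}
```

Because selection is the single source of truth, there is no separate "seek" and "highlight" path to fall out of sync — the property the vertical-timeline concept was meant to guarantee.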
Iteration
With the timeline concept set aside, we focused on the surfaces that mattered most. The mid-fidelity mockup below shows how we restructured the interface around four key task areas — video viewing, clip annotation review, operative note editing, and publishing.
The primary workspace pairs the AI highlight reel with a structured clip list, color-coded by surgical phase. Split-panel driven by sponsor feedback: "text and video together keeps me organized." Draft status always visible. One-click publish.
Tags serve two functions: searchability and training signal. ✦ marks AI-generated tags; surgeon edits and additions feed back into the model. Every correction makes the algorithm more accurate. Unlabelled AI output eroded confidence in testing — explicit attribution restored it.
Before any AI-generated content reaches the patient record, the surgeon must actively confirm review. A design principle from week one — not an afterthought. No passive defaults. The surgeon is the author.
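The attribution and review rules above reduce to a small data model. A minimal sketch, assuming a structure like the following (the names `ClipTag`, `correctTag`, and `canPublish` are illustrative, not from the shipped design): every tag carries its source, surgeon edits flip that source, and nothing publishes without an explicit confirmation flag.

```typescript
// Illustrative model of tag attribution and the review gate.
// Names are hypothetical, not from the actual design system.

type TagSource = "ai" | "surgeon"; // the ✦ mark renders only for "ai"

interface ClipTag {
  label: string;
  source: TagSource;
}

interface Clip {
  tags: ClipTag[];
  reviewedBySurgeon: boolean; // must be actively set — no passive default
}

// A surgeon correction re-attributes the tag; the before/after pair
// is the training signal that feeds back into the model.
function correctTag(clip: Clip, oldLabel: string, newLabel: string): ClipTag[] {
  return clip.tags.map((t) =>
    t.label === oldLabel ? { label: newLabel, source: "surgeon" as const } : t
  );
}

// Nothing AI-generated reaches the patient record without confirmation.
function canPublish(clip: Clip): boolean {
  return clip.reviewedBySurgeon;
}
```

The point of the model is that attribution is data, not decoration: the UI can always derive the ✦ mark from `source`, and the publish gate is a hard check rather than a styling convention.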
Responsible AI
✦ marks every AI-generated clip and tag. No ambiguity about what the machine produced.
Active confirmation required before anything AI-generated reaches the patient record.
The highlight reel is a supplement, not a substitute. Designed to resist over-reliance.
HIPAA compliance scoped to the sponsor. We designed around that boundary clearly.
Inclusive design
Inclusive design in a surgical context isn't about edge cases — it's about the baseline. Surgeons using Chronicle are time-pressured, context-switching constantly, and making decisions where mistakes have consequences. Every core interaction was designed with that stress state in mind.
Video and clip list side-by-side so surgeons never lose their place. Switching between screens in a post-op review compounds mental fatigue — keeping both in view was a deliberate cognitive-load decision, not an arbitrary layout choice.
Proposed / Confirmed / All gives surgeons an immediate read on what needs attention without parsing a list. The information hierarchy was built around a surgeon who has three minutes, not thirty.
Before sharing a highlight reel, surgeons see exactly what they're publishing and confirm review. The friction is the feature — a time-pressured surgeon needs one moment of deliberate pause before AI-generated content leaves the system. EMR integration is scoped to a future sprint; right now the goal is making the highlight reel usable and shareable on its own terms.
Outcome
Zero to investor-ready in six months. Dr. Adidharma used the prototype in UW stakeholder and investor presentations. The design made the algorithm pitchable. The AI Trust Score defined readiness — and the control gap became the brief for the next sprint.
Future roadmap
Reflections
Refusing to build before validating the problem was the most valuable thing we did. The cart-before-the-horse moment was the brief, not a blocker.
Week-two principles became actual UI elements. The publish modal, tag system, and AI disclaimers aren't features — they're the ethics made visible.
No secondary research would have surfaced "I don't have time to be Ernest Hemingway here." That line came from being in the hospital.
Dr. Adidharma was user, SME, and critic simultaneously. Weekly syncs kept us honest. Leaning into that dynamic — not just presenting to her — made every iteration sharper.
Academic use as the MVP beachhead wasn't a retreat — it was the right pitch for investors at the pre-startup stage. Knowing what to leave out is as important as knowing what to build.
Used AI to generate storyboard visuals and accelerate iteration — while being explicit about where human review was non-negotiable.
"Having objective information is much more helpful than subjective information."
— Experienced surgeon, in-depth interview. The sentence that grounded the entire product.