Dental Suction Robot: Automatic Dental Assistant Robot
timeline: February 2026 · team: Jacky Li, Emmy Dinh, Yang Yang Zhang
tech stack: Python, LeRobot, ACT (Action Chunking Transformer), Diffusion Policy, SO101 Robotic Arm, LeRobotDataset, Parquet

Overview

Canada is short roughly 5,000 dental assistants. Clinics are understaffed, dentists are burning out, and patients in rural and Indigenous communities are waiting over a month just to be seen. The industry standard — "four-handed dentistry" — literally cannot function without an assistant present.

So we asked: what if a robot could hold the suction tube? Not replace the assistant. Not automate the whole procedure. Just handle the repetitive, non-interactive tasks so that the human can focus on what actually matters.


Inspiration

Yang Yang and I visited 10 local dental clinics and talked to 2 dental students. We wanted to understand the problem firsthand before jumping to a solution.

What we kept hearing was the same thing: dentists love their assistants, and they build real working relationships with them — but a huge chunk of what assistants do during procedures is mechanical. Holding tools. Repositioning suction. Tasks that don't require a human; they just require an arm.

We also met Balbir Sohi, founder of Smiles on Wheels, a mobile dental hygiene clinic. She put it plainly: solo practitioners are exhausted. Holding a suction tube for hours is tiring and inconvenient, and it's pulling focus away from the patient. That was our lightbulb moment. We didn't need to build a dentist robot. We just needed a smart fifth arm.


How It Works

The system runs on a four-stage pipeline.

First, we identify which dental assistant tasks are worth automating — repetitive, non-patient-facing, and low-risk. Suction tube management was the clearest target.

Then, a skilled assistant teleoperates the SO101 robotic arm using LeRobot's Python-native interface, which gives sub-centimeter precision. These sessions get recorded as synchronized video and motor action data in a LeRobotDataset.
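The recording step amounts to logging synchronized observation/action pairs at a fixed rate. A minimal sketch of what one episode looks like, assuming a simple row-per-timestep layout — the real LeRobotDataset handles this internally, and the names below (`Frame`, `record_episode`, the sensor callbacks) are illustrative, not LeRobot's actual API:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One synchronized timestep of a teleoperation episode (illustrative)."""
    timestamp: float               # seconds since episode start
    joint_positions: list[float]   # follower arm state (6 joints on the SO101)
    action: list[float]            # target positions read from the leader arm
    camera_path: str               # path to the video frame captured at this step

def record_episode(read_state, read_leader, n_steps: int, fps: int = 30) -> list[Frame]:
    """Collect state/action pairs at a fixed rate from two sensor callbacks."""
    frames = []
    for i in range(n_steps):
        frames.append(Frame(
            timestamp=i / fps,
            joint_positions=read_state(),
            action=read_leader(),
            camera_path=f"frames/{i:06d}.png",
        ))
    return frames

# Stand-in callbacks in place of the real arm interfaces: a 3-second
# episode at 30 fps yields 90 synchronized frames.
episode = record_episode(lambda: [0.0] * 6, lambda: [0.1] * 6, n_steps=90)
```

In the real pipeline these rows land in Parquet files alongside the encoded video, which is what makes the episodes replayable as training data.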

From there, we run imitation learning — using either an Action Chunking Transformer (ACT) or Diffusion Policy — to train the robot to mimic the timing and positioning of the human operator autonomously, no teleoperator needed.
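The core idea behind ACT is that the policy predicts a short chunk of future actions at every step, and overlapping predictions for the same timestep are blended with exponentially decaying weights (temporal ensembling). A toy sketch of that control loop, with a fake scalar "policy" standing in for the trained transformer — everything here is illustrative, not LeRobot's inference API:

```python
import math

CHUNK = 4   # the policy predicts CHUNK future actions at every step
M = 0.1     # decay rate: older predictions get higher weight, as in ACT

def toy_policy(t):
    """Stand-in for the trained ACT model: returns a chunk of scalar actions."""
    return [float(t + i) for i in range(CHUNK)]

def rollout(policy, n_steps):
    # predictions[t] accumulates every chunked guess made for timestep t
    predictions = {t: [] for t in range(n_steps + CHUNK)}
    executed = []
    for t in range(n_steps):
        for i, a in enumerate(policy(t)):
            predictions[t + i].append(a)
        # Ensemble all available predictions for the current step with
        # weights w_i = exp(-M * i), index 0 being the oldest prediction.
        preds = predictions[t]
        weights = [math.exp(-M * i) for i in range(len(preds))]
        executed.append(sum(w * p for w, p in zip(weights, preds)) / sum(weights))
    return executed

actions = rollout(toy_policy, n_steps=8)  # → [0.0, 1.0, ..., 7.0] for this toy policy
```

Chunking is what lets the robot reproduce the smooth, anticipatory motions of the human demonstrator instead of reacting one jittery step at a time.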

Finally, the trained robot gets deployed in a real clinic environment for validation.


Results

We placed 2nd at SmileHacks 2026, BMEC's annual hackathon at the University of Toronto — the largest biomedical engineering hackathon at UofT with 100+ participants.

We won $400 and a guaranteed interview with the UofT Hatchery NEST program — UofT's engineering startup incubator that has launched 94+ companies since 2012.

Honestly, we're really proud of what we built in under 24 hours. The problem is real, the tech is real, and the path to deployment is clearer than we expected going in. This one's not done yet.
