Designed to mimic the learning curve of a trainee surgeon, the robot responded in real time to instructions such as "grab the gallbladder head," adjusting its actions mid-operation. The machine demonstrated expert-level performance even under variable and unpredictable conditions, maintaining precision and composure throughout all trials.
The work, supported by federal funding, signals a transformative advance in robotic systems capable of interpreting context and adapting dynamically, key capabilities needed for real-world clinical environments.
"This advancement moves us from robots that can execute specific surgical tasks to robots that truly understand surgical procedures," said medical roboticist Axel Krieger. "This is a critical distinction that brings us significantly closer to clinically viable autonomous surgical systems that can work in the messy, unpredictable reality of actual patient care."
Krieger's earlier robot, STAR, completed the first autonomous surgery on a live animal in 2022, but that procedure required ideal conditions and rigid planning. In contrast, the new Surgical Robot Transformer-Hierarchy (SRT-H) is designed to handle complexity and change, using the same AI backbone that powers ChatGPT.
SRT-H can adapt to individual anatomical differences, correct its own actions, and make independent decisions during operations. It even learns from corrections provided by human mentors during training.
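According to the research report title, SRT-H combines this hierarchy with language-conditioned imitation learning. The sketch below is a minimal, purely illustrative Python outline of that kind of split, assuming a high-level planner that issues each step as a natural-language instruction (and can be overridden by a mentor's correction) and a low-level controller that turns an instruction plus the current observation into motion. The class names and all procedure steps other than "grab the gallbladder head" are hypothetical; this is not the authors' code.

```python
# Illustrative sketch of a hierarchical, language-conditioned controller.
# All names here are assumptions for explanation, not the SRT-H implementation.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Observation:
    """Placeholder for what the robot perceives (e.g., an endoscope image)."""
    description: str


class HighLevelPolicy:
    """Assumed high-level planner: steps through the procedure and issues each
    step as a natural-language instruction; a mentor's spoken correction can
    override the planned step, as the article describes."""

    def __init__(self, procedure_steps: List[str]) -> None:
        self.procedure_steps = procedure_steps
        self.current = 0

    def next_instruction(self, obs: Observation, correction: Optional[str] = None) -> str:
        # A real planner would condition on the observation; this sketch does not.
        if correction is not None:  # e.g. "move the left arm a bit to the left"
            return correction
        step = self.procedure_steps[self.current]
        self.current = min(self.current + 1, len(self.procedure_steps) - 1)
        return step


class LowLevelPolicy:
    """Assumed low-level controller: maps (observation, instruction) to robot
    motion, the part trained by imitation from annotated surgical videos."""

    def act(self, obs: Observation, instruction: str) -> str:
        # A real system would output continuous joint and gripper commands here.
        return f"executing motion for '{instruction}' given scene: {obs.description}"


if __name__ == "__main__":
    planner = HighLevelPolicy([
        "grab the gallbladder head",   # instruction quoted in the article
        "clip the duct",               # hypothetical step for illustration
        "cut the duct",                # hypothetical step for illustration
    ])
    controller = LowLevelPolicy()
    obs = Observation("gallbladder exposed, duct visible")
    for _ in range(3):
        instruction = planner.next_instruction(obs)
        print(controller.act(obs, instruction))
```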
"This work represents a major leap from prior efforts because it tackles some of the fundamental barriers to deploying autonomous surgical robots in the real world," said lead author Ji Woong "Brian" Kim. "Our work shows that AI models can be made reliable enough for surgical autonomy-something that once felt far-off but is now demonstrably viable."
Last year, the team trained a robot on basic surgical maneuvers. Now, SRT-H has mastered a much longer and more intricate sequence: the 17 steps involved in gallbladder surgery. The robot learned from annotated videos of Johns Hopkins surgeons operating on pig cadavers and replicated the procedure with perfect accuracy.
Although it worked more slowly than a human surgeon, the robot matched expert-level performance. It handled varying anatomical conditions and adapted to changes such as altered starting positions and shifts in tissue appearance caused by dye injections.
"To me it really shows that it's possible to perform complex surgical procedures autonomously," Krieger said. "This is a proof of concept that it's possible and this imitation learning framework can automate such complex procedure with such a high degree of robustness."
The team plans to expand the system's capabilities to additional procedures, ultimately aiming for complete autonomous surgeries.
Research Report: SRT-H: A Hierarchical Framework for Autonomous Surgery via Language-Conditioned Imitation Learning
Related Links
Johns Hopkins University