
“I believe the technology already exists to produce an autonomous surgical robot using current artificial intelligence programs combined with real time 3D [three-dimensional] ultrasound scanners and current surgical robots,” Smith told iTnews.
With a team of engineers at Duke University, Smith built a rudimentary tabletop robot that navigated using 3D ultrasound technology.
The robot was controlled by an artificial intelligence program that processed real-time ultrasound data and issued specific commands for the robot to carry out.
“We believe that this is the first proof-of-concept for this approach,” Smith said.
“Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots -- without the guidance of the doctor -- can someday operate on people.”
The autonomous robot system successfully performed a simulated needle biopsy: guided by information from a 3D ultrasound transducer, it directed a needle on the end of the robotic arm to touch the tip of another needle inside a blood vessel graft.
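The demonstration amounts to a closed sense-decide-act loop: image the workspace in 3D, locate the target needle tip, and step the robotic needle toward it until the tips meet. The sketch below illustrates that loop only; the scanner, arm, and tip-detection interfaces (`Ultrasound3D`, `RoboticArm`, `locate_needle_tip`, and the tolerance and step sizes) are hypothetical stand-ins and not part of the Duke system.

```python
# Minimal sketch of the closed-loop guidance described above, under assumed
# interfaces: scanner.capture() returns a volumetric frame, locate_needle_tip()
# finds the target tip in that frame, and the arm reports and moves its tool
# tip in the same (pre-registered) coordinate frame, in millimetres.
import numpy as np

TOLERANCE_MM = 1.0   # stop when the tool tip is within 1 mm of the target
STEP_MM = 2.0        # bound each commanded move, re-imaging between steps

def guide_needle(scanner, arm, locate_needle_tip):
    """Repeatedly image the workspace, find the target needle tip, and
    step the robot's needle toward it until the tips effectively touch."""
    while True:
        volume = scanner.capture()              # real-time 3D ultrasound frame
        target = locate_needle_tip(volume)      # target tip position (mm)
        tool = arm.tool_tip_position()          # robot needle tip position (mm)

        error = target - tool
        distance = np.linalg.norm(error)
        if distance <= TOLERANCE_MM:
            return True                         # tips touching within tolerance

        # Command a bounded step along the error direction, then re-image,
        # so the loop corrects for any motion or imaging error each cycle.
        step = error / distance * min(STEP_MM, distance)
        arm.move_relative(step)
```

Bounding each move and re-imaging between steps is the conservative choice here: the robot never commits to a long blind trajectory, which matters when the target sits inside deformable tissue.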
While the research will continue to refine robots' ability to perform procedures independently, the new technology could also have more direct and immediate applications.
“Current surgical robots use optical endoscopes which only image the surface of the internal organs. The 3D ultrasound scanner can image inside of tissue,” Smith said.
“Current surgical robots could use 3D ultrasound probes to guide the surgical tools to the tissue structures within organs without additional dissections or incisions,” he told iTnews.
The use of 3D ultrasound transducers in procedures that involve radiation could give medical staff more information while reducing patients' exposure, the researchers said.