The robotic surgeon will see you now


Sitting on a stool several meters from a robot with long arms, Dr Danyal Fer wrapped his fingers around two metal handles near his chest.

As he moved the handles – up and down, left and right – the robot mimicked each small movement with its own two arms. Then, when he pinched his thumb and forefinger together, one of the robot’s tiny claws did much the same. This is how surgeons like Dr Fer have long used robots to operate on patients: sitting at a computer console across the room, they can, for example, remove a prostate from a patient.

But after this brief demonstration, Dr Fer and his fellow researchers at the University of California, Berkeley, showed how they hope to advance the state of the art. Dr Fer let go of the handles, and a new kind of computer software took over. As he and the other researchers watched, the robot began to move entirely on its own.

With a claw, the machine lifted a tiny plastic ring from an equally small peg on the table, passed the ring from claw to claw, moved it across the table and gently hooked it onto a new peg. Then the robot did the same with several other rings, completing the task as quickly as it had when guided by Dr Fer.

The training exercise was originally designed for humans; it is how surgeons learn to use robots like the one at Berkeley, by moving the rings from one peg to another. Now, an automated robot performing the exercise can match or even surpass a human in dexterity, accuracy and speed, according to a new research paper from the Berkeley team.

The project is part of a much larger effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones, and warehouse robots, researchers are also working to automate surgical robots. These methods are still far from everyday use, but progress is accelerating.

“These are exciting times,” said Russell Taylor, a professor at Johns Hopkins University and former IBM researcher known in academia as the father of robotic surgery. “This is where I hoped we would be 20 years ago.”

The goal is not to get surgeons out of the operating room but to lighten their workload and perhaps even increase success rates – where there is room for improvement – by automating certain phases of the surgery.

Robots can already exceed human precision on some surgical tasks, such as placing a pin in a bone (a particularly risky task in knee and hip replacements). The hope is that automated robots can bring greater precision to other tasks, like incisions or sutures, and reduce the risks of overworked surgeons.

In a recent phone call, Greg Hager, a computer scientist at Johns Hopkins, said surgical automation would progress much like the autopilot software that guided his Tesla down the New Jersey freeway as he spoke. The car was driving on its own, he said, but his wife always had her hands on the wheel, in case something went wrong. And she would take over when it was time to get off the freeway.

“We can’t automate the whole process, at least not without human oversight,” he said. “But we can start to create automation tools that make a surgeon’s life a little bit easier.”

Five years ago, researchers at the Children’s National Health System in Washington, DC, designed a robot capable of automatically suturing a pig’s intestines during surgery. It was a notable step towards the kind of future Dr Hager envisioned. But it came with an asterisk: The researchers had implanted tiny markers in the pig’s intestines that emitted near-infrared light and helped guide the robot’s movements.

The method is far from practical, as the markers are not easily implanted or removed. But in recent years, artificial intelligence researchers have dramatically improved the power of computer vision, which could allow robots to perform surgical tasks on their own, without such markers.

The change is driven by what are called neural networks, mathematical systems that can learn skills by analyzing large amounts of data. By analyzing thousands of photos of cats, for example, a neural network can learn to recognize a cat. Likewise, a neural network can learn from images captured by surgical robots.
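
For readers curious about the mechanics, the sketch below shows, in Python with the PyTorch library, roughly what “learning from data” means: a tiny network is repeatedly adjusted until it can separate cat photos from everything else. The images and labels here are random placeholders rather than real data, and the model is far smaller than anything used in practice.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a labeled photo collection: 64 small RGB
# images, each tagged 1 (cat) or 0 (not a cat).
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 2, (64,))

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 2),                  # two outputs: cat / not cat
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):                # repeatedly adjust the network's weights
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                    # learn from the mistakes on this batch
    optimizer.step()
```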

Surgical robots are equipped with cameras that record three-dimensional video of each operation. The video is displayed in a viewfinder that surgeons peer into while guiding the operation, watching from the robot’s point of view.

Afterward, these videos also provide a detailed road map of how surgeries are performed. They can help new surgeons learn to use the robots, and they can help train robots to handle tasks on their own. By analyzing video that shows how a surgeon guides the robot, a neural network can learn the same skills.

This is how researchers at Berkeley worked to automate their robot, which is based on the da Vinci surgical system, a two-armed machine that helps surgeons perform more than a million procedures a year. Dr Fer and his colleagues collected images of the robot moving the plastic rings under human control. Then their system learned from those images, identifying the best ways to grab the rings, pass them between the claws and move them to new pegs.
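
The Berkeley team’s actual pipeline is more elaborate, but the core idea of learning from recorded demonstrations can be sketched generically: a network is trained to map each camera frame to the command a human operator issued at that moment, a technique often called behavioral cloning. The data shapes and the four-number command (position plus grip) below are illustrative assumptions, not the team’s setup.

```python
import torch
import torch.nn as nn

frames  = torch.randn(256, 3, 64, 64)   # recorded camera frames (placeholder)
actions = torch.randn(256, 4)           # operator commands at each frame: x, y, z, grip

policy = nn.Sequential(
    nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
    nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 4),                   # predicted command for this frame
)
optimizer = torch.optim.Adam(policy.parameters(), lr=3e-4)

for step in range(100):                 # fit the policy to the human demonstrations
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(policy(frames), actions)
    loss.backward()
    optimizer.step()

# At run time, each new camera frame would be fed through `policy`
# to produce the robot's next motion command.
```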

But this process came with its own asterisk. When the system told the robot where to move, the robot often missed the spot by a few millimeters. Over months and years of use, the many metal cables inside the robot’s twin arms had stretched and bent slightly, so its movements were not as precise as they needed to be.

Human operators could compensate for this change, unconsciously. But the automated system couldn’t. This is often the problem with automated technology: it struggles to cope with change and uncertainty. Autonomous vehicles are still far from widespread because they are not yet agile enough to handle all the chaos of the everyday world.

The Berkeley team decided to build a new neural network that analyzed the robot’s errors and learned how much precision it was losing every day. “It learns how the robot’s joints change over time,” said Brijen Thananjeyan, a doctoral student on the team. Once the automated system was able to accommodate this change, the robot was able to grab and move the plastic rings, matching the performance of human operators.
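
As a rough illustration rather than the team’s published model, an error-correcting network of this kind might be set up as follows: it learns the offset between where the arm was told to go and where the claw actually ended up, and future commands are shifted by the predicted offset. The drift values and units here are assumptions.

```python
import torch
import torch.nn as nn

drift = torch.tensor([0.003, -0.002, 0.001])       # assumed systematic offset (millimetre scale if metres)
commanded = torch.randn(500, 3)                    # target positions sent to the arm
observed = commanded + drift + 0.0005 * torch.randn(500, 3)  # where the claw actually ended up

corrector = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))
optimizer = torch.optim.Adam(corrector.parameters(), lr=1e-3)

for step in range(200):                            # learn the systematic error
    optimizer.zero_grad()
    predicted_error = corrector(commanded)
    loss = nn.functional.mse_loss(predicted_error, observed - commanded)
    loss.backward()
    optimizer.step()

def corrected_command(target):
    # Shift the command so the arm lands closer to the intended spot.
    with torch.no_grad():
        return target - corrector(target)
```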

Other labs are trying different approaches. Axel Krieger, a Johns Hopkins researcher who was part of the 2016 pig-suturing project, is working to automate a new kind of robotic arm, one with fewer moving parts that behaves more consistently than the kind of robot used by the Berkeley team. Researchers at Worcester Polytechnic Institute are developing ways for machines to carefully guide surgeons’ hands as they perform particular tasks, such as inserting a needle for a cancer biopsy or burning into the brain to remove a tumor.

“It’s like a car where the lane keeping is autonomous but you still control the throttle and the brakes,” said Greg Fischer, one of the Worcester researchers.

Many obstacles lie ahead, the scientists note. Moving plastic rings is one thing; cutting, moving and suturing flesh is another. “What happens when the camera angle changes?” said Ann Majewicz Fey, an associate professor at the University of Texas, Austin. “What happens when smoke gets in the way?”

For the foreseeable future, automation will work alongside surgeons rather than replace them. But even that could have profound effects, Dr Fer said. For example, doctors could perform surgery across distances far greater than the width of the operating room – miles or more, perhaps, helping wounded soldiers on distant battlefields.

The signal delay is too great to make this possible right now. But if a robot could handle at least some tasks on its own, long-distance surgery could become viable, Dr Fer said: “You could send a high-level plan and the robot could then execute it.”

The same technology would be essential for remote surgery over even longer distances. “When we start operating on people on the moon,” he said, “surgeons will need entirely new tools.”

