    The robot can’t walk. It can’t talk or clean your living room, but it might one day save your life.

    A team of faculty and students at UC Berkeley is attempting to transform the way surgery is performed by inviting robots into the operating room and allowing surgeons to hand off some responsibility to these automated assistants.

    Along with researchers at four other universities, the team has received a $3.5 million grant from the National Science Foundation to develop ways to teach robots to learn from their human counterparts and carry out complex subtasks, such as suturing, that are tedious for human surgeons.

    “The big picture here is that humans and robots have different strengths, and what we’d like to do is exploit what robots are good at and exploit what humans are good at and then somehow combine them in a way that (allows for actions) that neither robot nor human could be doing on their own,” said Pieter Abbeel, an assistant professor of electrical engineering and computer science at UC Berkeley and one of the lead investigators on the study.

    Robots, he said, are excellent at performing precise and repetitive tasks, while humans are better at identifying problems visually and adapting to new tasks and environments.

    Medical robots that assist in surgeries have appeared in operating rooms in the past few years, though not widely because of their enormous cost. The researchers want to take things a step further and program the robots with the capability to learn.

    In other words, instead of a surgeon having to manually guide the robot through each task, he or she could potentially demonstrate these subtasks to a robot and have the robot perform the necessary steps.

    “A human surgeon might indicate by pointing that he or she needs five overhand stitches from this point to this point, and then the robot would perform them under the supervision of the surgeon,” said Ken Goldberg, a professor of industrial engineering and operations research and of EECS and a co-investigator on the study. “Our goal is to train the robot by observing human experts performing sutures.”

    Such “machine learning” has become a hot research topic in the past few years as researchers look for ways to program computers to perform complex motions without directing them to do so outright.

    Laparoscopic surgery is one area where researchers hope to apply this technology. In this minimally invasive procedure, surgeons make two small incisions in the body, insert a camera and a specialized surgical tool through them, and operate while watching a camera monitor. The operation can be exhausting for a surgeon, who is charged with making complex, delicate movements that can mean life or death for a patient.

    “We can monitor or make changes to drive the motion of our robot using our own algorithms,” Goldberg said. “So that gives us much more freedom and flexibility to explore this frontier.”

    The researchers say they hope to demonstrate automated suturing techniques within the next four years and to see limited autonomy in operating rooms within the decade.

    However, your surgeon of the future is likely to resemble McDreamy more than, say, R2-D2.

    “I don’t foresee robots doing surgery without human supervision,” Goldberg emphasized. “We don’t expect that to happen anytime in the future.”