
The Case for Video Cameras in the Operating Room

Surgeons easily predicted which of their peers had the most complications based on video of one surgery.
Photo via the US Army

In a classic Simpsons episode, Homer is forced to go with discount surgeon Dr. Nick Riviera (Hi, everybody!) for a triple bypass surgery. It doesn't take a bookworm like Lisa to realize that a $129.95 heart procedure isn't going to go well—especially when it's revealed that Dr. Nick learned the procedure via a how-to VHS. But even if real-life doctors are far better than anyone in The Simpsons, choosing a competent surgeon can be a bit of a crapshoot—unless they're being judged by their peers.

According to a new study out of the University of Michigan Health System, surgeons were able to correctly predict which of their peers' gastric bypass patients would have complications, based on videos of the operating surgeon's technique. Patients of surgeons rated as "highly skilled" by their peers were significantly less likely to suffer post-surgery bleeding or infections than patients of lower-rated surgeons. They were also less likely to end up back in the hospital.

"Peer assessment of a surgeon's operative skill may be a more practical, more direct, and ultimately more informative test for assessing the surgeon's proficiency than other measures," said John D. Birkmeyer, MD, a surgery professor and lead author of the study, which was published in the New England Journal of Medicine.

Most hospitals have some type of peer review process set up to assess their doctors, but that data is rarely, if ever, given to patients. And if a doctor can keep from getting hit with malpractice suits, they'll likely be able to stay in the operating room.

For the public, doctor ranking is big business—"top doctors" and "top hospitals" issues are regularly among a city magazine's best sellers on newsstands—but the rankings themselves aren't always scientific. Many magazines rely on voluntary surveys of patients or other doctors. Consumer Reports took a recent shot at ranking surgery groups based on optimal surgery technique, patient survival, absence of surgical complications, and "the chance that a patient will get all [of the recommended] prescriptions" post surgery. Much of that data is culled from Medicare claims, health care consulting firms, and other available sources.

But how do those data compare to expert opinion? In Birkmeyer's study, surgeons were able to judge other surgeons' skill based on a single video. The surgeons' gentleness, time and motion, instrument handling, and "overall technical skill" were rated anonymously by their peers on a scale from 1 ("general surgery chief resident"—so Dr. Nick probably falls below this) to 5 ("master surgeon").

Birkmeyer suggests that video cameras in the operating room could be used to weed out medical students who aren't quite up to snuff (or to help bring them up to standard), conduct reviews, and potentially rate the best surgeons.

"The technical skill of practicing surgeons varied widely," Birkmeyer said. "Summary ratings varied from 2.6 to 4.8 and greater skill was associated with fewer postoperative complications and shorter operations."

Those who scored poorly had complication rates of nearly 15 percent and took more than two hours to finish a surgery. Higher-ranked surgeons had complication rates of just 5 percent and finished in about an hour and a half. According to a previous study, the overall complication rate for the surgery is roughly 7 percent. And, for the record, none of them had an eight-year-old girl shouting out instructions from the amphitheater.