Newswise — SAN FRANCISCO: Surgeons are adapting existing computer technologies to enhance their visualization of cancerous tumors and chronic wounds, according to two studies presented this week at the 2014 American College of Surgeons Clinical Congress.
One research team tested the visualization of simulated breast tumors using three-dimensional (3D) ultrasound imaging and specially designed augmented-reality software that allows the surgeon to pinpoint a tumor and measure its volume, including its depth. The other team combined a new 3D sensor, computer algorithms running on a tablet computer, and machine learning, a type of artificial intelligence, allowing surgeons for the first time to precisely measure the area, depth, and tissue type of chronic wounds with a mobile device.
These high-tech imaging techniques, according to their developers, are more accurate than standard methods.
Augmented-reality glasses guide surgeons

Surgical oncologists, or cancer surgeons, usually remove breast cancers by relying on tactile feedback and radiologic images of the tumor, such as mammograms and ultrasound images, said M. Catherine Lee, MD, FACS, coauthor of the first study and associate professor of surgery at H. Lee Moffitt Cancer Center, Tampa, Fla.
“Our goal in a lumpectomy is to get the lump out with a small rim of normal tissue around it,” Dr. Lee said. “But we sometimes find out we did not get all of the tumor.”
Dr. Lee’s team developed an image guidance system designed to minimize the need for repeat operations while sparing more healthy breast tissue. Collaborating with Yanhui Guo, PhD, of St. Thomas University, Miami Gardens, Fla., the researchers designed a software algorithm that works with digital ultrasound technology. Ultrasound images are converted into 3D images on a computer screen. In a simulated surgical procedure, the investigators studied the use of augmented reality to fuse real-world and virtual 3D images. These augmented-reality images can then be transmitted to high-definition 3D glasses or other devices. A surgeon wearing the glasses sees a reconstructed 3D digital image superimposed over the actual tumor.
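The study describes this fusion step only at a high level. As a rough illustration of how a rendered tumor image can be superimposed on a live camera view, the following Python sketch alpha-blends two frames with OpenCV; the synthetic data and the fuse_frames function are hypothetical placeholders, not the Moffitt team’s software.

```python
# Minimal sketch of augmented-reality image fusion: blend a rendered
# 3D tumor projection onto a camera frame so the tumor appears
# "through the skin." Illustrative only; not the study's actual system.
import cv2
import numpy as np

def fuse_frames(camera_frame: np.ndarray, tumor_render: np.ndarray,
                alpha: float = 0.4) -> np.ndarray:
    """Superimpose a rendered tumor image (same size, BGR) on a camera frame."""
    # Pixels where the render is black stay fully transparent.
    mask = cv2.cvtColor(tumor_render, cv2.COLOR_BGR2GRAY) > 0
    fused = camera_frame.copy()
    # Alpha-blend only where the rendered tumor has content.
    blend = cv2.addWeighted(camera_frame, 1 - alpha, tumor_render, alpha, 0)
    fused[mask] = blend[mask]
    return fused

# Example with synthetic data: a gray "skin" frame and a rendered sphere.
frame = np.full((480, 640, 3), 180, np.uint8)
render = np.zeros_like(frame)
cv2.circle(render, (320, 240), 60, (0, 0, 255), -1)  # projected tumor
overlay = fuse_frames(frame, render)
```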
“It gives the impression of X-ray vision. You can see the tumor through the skin,” said lead author Segundo J. Gonzalez, MD, a surgical oncology fellow at Moffitt Cancer Center.
He and his co-investigators analyzed 66 ultrasound images of tumors (32 pictures of a single tumor and 34 of multiple tumors) inside a plastic model of a breast to determine the augmented-reality system’s accuracy in measuring tumor volume. The more closely the 3D ultrasound image overlapped the actual tumor, the more accurate the software was. For detecting a single tumor, the volumetric accuracy was 1.2 cubic millimeters, a minute fraction of a cubic inch (0.00007 cubic inches), which Dr. Gonzalez called “extremely accurate.” Likewise, accuracy for multitumor detection was 5.4 cubic millimeters, or 0.0003 cubic inches.
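For readers who want to check the unit conversion: one inch is 25.4 millimeters, so one cubic inch is about 16,387 cubic millimeters. The short sketch below reproduces the figures quoted above.

```python
# Verify the cubic-millimeter to cubic-inch conversions quoted above.
MM_PER_INCH = 25.4
MM3_PER_IN3 = MM_PER_INCH ** 3          # 16,387.064 mm^3 per cubic inch

for mm3 in (1.2, 5.4):                  # single-tumor and multitumor accuracy
    print(f"{mm3} mm^3 = {mm3 / MM3_PER_IN3:.5f} in^3")
# 1.2 mm^3 = 0.00007 in^3 ; 5.4 mm^3 = 0.00033 in^3 (0.0003 as rounded above)
```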
The investigators hope to study their software using a smartphone camera and, eventually, to test it in patients. Dr. Gonzalez is commercializing the new technology through a start-up company, MedSights Tech Corp., in Tampa.
Automated wound assessment system calculates 3D dimensions on a mobile device

Wound assessment relies on crude visual observation, according to the other technology study’s senior investigator, Peter C. W. Kim, MD, PhD, FACS, of Children’s National Health System, Washington, DC. He is associate surgeon-in-chief at Children’s National and vice president of the health system’s Sheikh Zayed Institute for Pediatric Surgical Innovation.
Chronic, nonhealing wounds, which can result from burns, diabetes, blood flow problems, or excess pressure due to immobility, affect 6.5 million Americans and cost the United States $25 billion annually in medical and surgical care.*
“Despite this significant clinical burden, there is a general lack of objective evidence to guide wound management,” Dr. Kim said.
Visual estimation of wound dimensions can vary among examiners by 30 to 40 percent, according to Dr. Kim. Furthermore, “eyeballing” a wound cannot determine its depth, an important consideration since some chronic wounds extend to the bone.
Traditionally, a wound care specialist manually delineates the wound borders using a transparent film and divides the wound bed into different areas by tissue type. This two-step process is called segmentation. Although several automated wound segmentation applications exist, Dr. Kim said none operates solely on a mobile device and, at the same time, calculates the wound’s physical 3D dimensions.
Dr. Kim and his colleagues created an interactive, automated wound assessment system with the added advantage of a mobile application for easy access to wound images. They implemented computer algorithms on an Apple iPad using an open-source computer vision library (OpenCV). Computer vision is the way computers perceive the world; that view is converted into digital information, such as patterns and images. One of the researchers’ algorithms, based on an existing “graph-cut” algorithm, identifies wound borders from the evaluator’s finger strokes on a touch screen.
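The study does not publish the algorithm itself. OpenCV does, however, ship GrabCut, a graph-cut-based segmenter that refines a user-supplied mask, which gives a sense of how stroke input can seed a wound segmentation. The sketch below is an illustrative analogue, not the team’s code.

```python
# Sketch of interactive graph-cut wound segmentation with OpenCV's GrabCut.
# GrabCut is one graph-cut-based method; the Children's National algorithm
# is not published here, so treat this as an illustrative analogue.
import cv2
import numpy as np

def segment_wound(image: np.ndarray, fg_strokes: np.ndarray,
                  bg_strokes: np.ndarray, iterations: int = 5) -> np.ndarray:
    """Return a binary wound mask from user stroke hints.

    image      : BGR photo of the wound
    fg_strokes : boolean array, True where the user marked wound tissue
    bg_strokes : boolean array, True where the user marked healthy skin
    """
    # Start every pixel as "probably background," then burn in the strokes.
    mask = np.full(image.shape[:2], cv2.GC_PR_BGD, np.uint8)
    mask[fg_strokes] = cv2.GC_FGD   # definite wound (foreground)
    mask[bg_strokes] = cv2.GC_BGD   # definite background
    bgd_model = np.zeros((1, 65), np.float64)
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(image, mask, None, bgd_model, fgd_model,
                iterations, cv2.GC_INIT_WITH_MASK)
    # Keep pixels labeled definite or probable foreground.
    return np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD)).astype(np.uint8)
```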
Using machine learning, the investigators programmed a second algorithm to automatically classify tissue type. The three tissue types were granulation (healthy new connective, or fibrous, tissue), eschar (dry, dead tissue), and slough (moist, dead tissue that separates from the wound). They then tested the new system’s performance speed and consistency, that is, how much its results varied from the traditional manual tracing method (called “ground truth” in machine learning). On an iPad, five wound experts analyzed 60 digital images of different wounds using both the automated method and manual tracing for each image.
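The classifier itself is not described in the study. As a toy illustration of per-pixel tissue labeling, the following sketch assigns each wound pixel to the nearest of three color centroids; the centroid values and the classify_tissue function are assumptions for illustration only.

```python
# Toy tissue-type classifier: label each wound pixel as granulation,
# slough, or eschar by nearest color centroid. The actual system's
# machine-learning model is not described in the study; the centroid
# colors below are illustrative assumptions (red, yellow, dark).
import numpy as np

TISSUE_CENTROIDS = {                          # BGR centroids, hypothetical
    "granulation": np.array([40, 40, 180]),   # reddish new tissue
    "slough":      np.array([60, 200, 220]),  # yellowish dead tissue
    "eschar":      np.array([30, 30, 30]),    # dark, dry dead tissue
}

def classify_tissue(wound_pixels: np.ndarray) -> list[str]:
    """Assign each (B, G, R) pixel row to the nearest tissue centroid."""
    names = list(TISSUE_CENTROIDS)
    centroids = np.stack([TISSUE_CENTROIDS[n] for n in names]).astype(float)
    # Squared Euclidean distance from each pixel to each centroid.
    d = ((wound_pixels[:, None, :].astype(float) - centroids[None]) ** 2).sum(-1)
    return [names[i] for i in d.argmin(axis=1)]

# Example: two pixels, one reddish and one near-black.
print(classify_tissue(np.array([[50, 50, 170], [20, 25, 35]])))
# ['granulation', 'eschar']
```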
Results showed that the experts delineated the wound borders and classified the tissue type 33 percent faster using the automated system, averaging 31.6 seconds per image compared with 47.2 seconds per image for manual tracing. In addition, the automated results were highly consistent with the standard manual method, with an overlap score (a measure of agreement between the automated segmentation and ground truth) above 90 percent.
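The study does not name the exact overlap metric; the Dice coefficient is a standard measure of segmentation agreement and serves as a stand-in in this sketch.

```python
# Dice overlap between an automated mask and a manually traced "ground
# truth" mask. The study reports overlap above 90 percent; the exact
# metric is not specified, so Dice serves as a standard stand-in.
import numpy as np

def dice_score(auto_mask: np.ndarray, truth_mask: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|), ranging from 0 to 1."""
    a, b = auto_mask.astype(bool), truth_mask.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two toy 4x4 masks that disagree on one column of pixels:
auto = np.array([[1, 1, 0, 0]] * 4)
truth = np.array([[1, 1, 1, 0]] * 4)
print(f"Dice overlap: {dice_score(auto, truth):.2f}")   # 0.80
```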
“Our method of wound assessment saves time, which saves money, and is more accurate, which translates to improved patient care,” Dr. Kim said.
Dr. Kim’s research colleagues were Kyle L. Wu, MD; Ozgur Guler, PhD; and Patrick Cheng, MS, MBA. Dr. Kim disclosed that Children’s National Medical Center holds patents on the wound segmentation algorithms and has formed a company, eKare Inc. of Fairfax, Va., to further develop the mobile wound assessment method.
*Source: Sen CK, Gordillo GM, Roy S, et al. Wound Repair and Regeneration. 2009;17:763-771.
# # #
About the American College of Surgeons

The American College of Surgeons is a scientific and educational organization of surgeons that was founded in 1913 to raise the standards of surgical practice and improve the quality of care for all surgical patients. The College is dedicated to the ethical and competent practice of surgery. Its achievements have significantly influenced the course of scientific surgery in America and have established it as an important advocate for all surgical patients. The College has more than 79,000 members and is the largest organization of surgeons in the world. For more information, visit www.facs.org.