Healthcare Meets Human-Centered Computing

May 30, 2019
The Weibel Lab is testing augmented and virtual reality to
help novice surgeons perform complex surgeries.
Photo: Alex Matthews, Qualcomm Institute at UC San Diego


By Joshua Baxt, CSE writer

How do we combine computers and medicine to provide better care? Some answers are obvious, like electronic medical records or computed tomography (CT) scans. Others require more research. Sitting at the confluence of computing, healthcare and design, the Weibel Lab is developing new ways to incorporate technology into patient care.

“We are trying to design and implement solutions to distinct problems in health and healthcare,” says Nadir Weibel, an associate research professor in the Department of Computer Science and Engineering and head of the Human-Centered and Ubiquitous Computing Lab at UC San Diego. “Think about sensors, cameras, wearables, augmented reality. How can we use these technologies in healthcare?”

Leveraging technologies in both the clinic and the Simulation Training Center, the lab is collaborating with physicians at UC San Diego, Kaiser and the U.S. Navy to solve a wide range of problems. For example, ultrasound-guided procedures are common in medicine, but the ultrasound screen may be across the room, forcing clinicians to look away from their patient. The solution is augmented reality.

“My approach to health and technology is to almost give superpowers to people,” he says. “We project the ultrasound image onto the patient with augmented reality using HoloLens goggles. It’s like giving physicians X-ray vision.”


Detecting Stroke

For stroke patients, lost time can mean lost brain tissue. Weibel has been developing ways to speed up diagnoses by tracking eye and body movements, speech and other markers.

One common stroke symptom is hemiparesis, in which one side of the body suddenly weakens. Video cameras can gather information about the patient before the neurologist has even seen them.

“We can train a machine learning algorithm to detect if a patient has hemiparesis while waiting to see the doctor,” says Weibel. “The goal is to deploy a system on ambulances, perhaps at home, to detect strokes earlier.”
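
As a rough illustration of that idea (the lab’s actual pipeline is not described here; the array shapes, names and threshold below are assumptions), a screening system could compare how much each side of the body moves in the keypoint tracks produced by an off-the-shelf pose estimator:

```python
import numpy as np

# Hypothetical sketch: given per-frame 2-D keypoints for the left and right
# wrists (e.g., from an off-the-shelf pose estimator), compare how much each
# side moves. One-sided weakness (hemiparesis) shows up as persistently low
# motion on the affected side.

def motion_energy(track: np.ndarray) -> float:
    """Total frame-to-frame displacement of one keypoint track, shape (n_frames, 2)."""
    return float(np.linalg.norm(np.diff(track, axis=0), axis=1).sum())

def asymmetry_score(left: np.ndarray, right: np.ndarray) -> float:
    """0 = perfectly symmetric movement; approaches 1 as one side stops moving."""
    e_left, e_right = motion_energy(left), motion_energy(right)
    return abs(e_left - e_right) / max(e_left + e_right, 1e-9)

def flag_possible_hemiparesis(left: np.ndarray, right: np.ndarray,
                              threshold: float = 0.6) -> bool:
    """Crude screening rule; a real system would feed features like this
    into a trained classifier rather than apply a fixed cutoff."""
    return asymmetry_score(left, right) > threshold
```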

In some cases, computers are better than doctors at detecting aberrant motions. One standard stroke diagnostic is the finger-to-nose test, in which the patient alternately touches the physician’s finger and their own nose. The neurologist tracks the arm motion to measure coordination. However, while people are good at spotting deviations to the left or right, they have trouble judging other motions, such as up-and-down trajectories. Computers perform these interpretive tasks far better.
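
To make that concrete, here is a minimal sketch, assuming 3-D fingertip, nose and target positions from a depth camera; the coordinate convention and function below are illustrative rather than the lab’s code:

```python
import numpy as np

# Hypothetical sketch: quantify how far a finger-to-nose reach strays from the
# straight line between the nose and the target, split into horizontal and
# vertical components. Coordinate convention assumed for illustration:
# x = left/right, y = up/down, z = toward the target.

def path_deviation(fingertip: np.ndarray, nose: np.ndarray, target: np.ndarray):
    """fingertip: (n_frames, 3) positions; returns mean |horizontal| and
    mean |vertical| deviation from the ideal straight-line path."""
    direction = target - nose
    direction = direction / np.linalg.norm(direction)
    rel = fingertip - nose
    # Project each frame onto the ideal line, then measure the offset from it.
    along = rel @ direction
    offsets = rel - np.outer(along, direction)
    horizontal = np.abs(offsets[:, 0]).mean()  # deviations clinicians spot easily
    vertical = np.abs(offsets[:, 1]).mean()    # deviations that are easy to miss
    return horizontal, vertical
```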

“We could replace standard telemedicine, providing multimodal data,” says Weibel. “Instead of looking at the video, physicians can look at the data we provide. We are now able to move the diagnostics from something that is subjective and experience-based only to being more data-driven.”


By Design

Nadir Weibel observes as a student tests the use of virtual reality to
support surgery.
Photo: Alex Matthews, Qualcomm Institute at UC San Diego

With one foot in the UC San Diego Design Lab, Weibel’s group naturally takes design-centered approaches, closely investigating issues before deciding whether technology can be part of the solution.

When an ICU physician was interested in bringing augmented reality into the department, for example, Weibel was intrigued by the possibilities but did not automatically endorse the proposition. He had one of his students spend a year in the ICU to learn about their issues first-hand.

“Research starts by observing the needs and stepping back to figure out the real problem,” says Weibel. “The interesting thing we found is that augmented reality would not have helped at all in that environment.”

What they really needed was better family support. In the ICU, clinicians are constantly checking on patients, but families don’t always know who they are or what they’re doing.

“Imagine having a phone that can tell families who just came into the room, their expertise and what kinds of questions they can answer,” Weibel wonders.

These are just a few examples of the work being done in the lab. One team is using Twitter to identify areas in San Diego where people may be exhibiting high-risk behaviors that could make them susceptible to HIV. These maps could be used to organize educational outreach.

Another project, called ARTEMIS (augmented reality to enable remote integrated surgery), is using augmented and virtual reality to help novices perform complex surgical procedures. The technology is being co-developed with the U.S. Navy, which can’t always put a trained surgeon in the field.

Using virtual reality goggles, a medic can receive assistance from a surgeon thousands of miles away. The surgeon can point to instruments, show the medic where to cut and provide other help.

“We can bring in experts remotely to help people who have basic skills,” says Weibel. “The expert can show up as a hologram or we can just show their hands appearing on the surgical field.”

Many of these technologies are still in the testing and development phase, but Weibel looks forward to bringing them into hospitals.

“We can use the power of sensors and artificial intelligence to augment perceptions and improve care.”