A recent health tech summit in the US shone a spotlight on trends in the use of data and AI in healthcare that could transform medicine. Photo: Shutterstock

How AI, data are transforming healthcare: three key trends that could change the face of medicine

  • From pinpointing cancer treatments to marrying AI with smartphones and making tech more accessible, a health tech conference puts the spotlight on some great ideas
  • ‘We have taught machines to see what we can see. But it turns out that we also have taught machines to see things that we can’t see,’ a neuroscientist exclaims

Data and artificial intelligence are transforming the healthcare industry in many ways. The recent Stat Health Tech Summit, organised by the health publication Stat and held in San Francisco, California, and online, dived into how technology is changing the face of medicine.

Below are three highlights from the event: the use of AI to mesh patient data with genomic test information to help assess cancer treatment options; the pairing of AI with smartphones to help patients help themselves; and efforts to make technology more accessible for people with disabilities.

How data aids cancer therapy

Billionaire businessman Eric Lefkofsky is a co-founder of Groupon, the US-based coupon discount e-commerce site, but he moved into health tech after his wife was diagnosed with cancer.

Lefkofsky founded his company Tempus in 2015, with the aim of aiding cancer patients’ treatment by developing AI software which could combine patient information from genomic tests with data from clinical records.

New technology combines doctors’ patient notes and DNA test results in a system that helps to assess the best cancer treatment. Photo: Shutterstock

Lefkofsky used the knowledge he had amassed at Groupon about how to structure messy, unstructured data – information, such as free-text notes, that does not come in neat rows and columns – to develop software that could better match individual patients with suitable cancer therapies.

By extracting clinical data from medical records systems, and combining it with molecular data (concerning the sequencing of patients’ DNA) and bioinformatics (computer tools which interpret biological information), Tempus helps doctors assess which treatments are most suitable for a patient.
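To picture the kind of matching described here – clinical details cross-referenced with genomic findings – consider the deliberately simplified sketch below. It is not Tempus’ software or data model: the record fields, gene variants and therapy rules are invented purely for illustration, and a real system weighs far more evidence.

```python
# Hypothetical sketch only - invented fields, variants and rules, not Tempus' system.
from dataclasses import dataclass, field


@dataclass
class PatientRecord:
    diagnosis: str                      # pulled from the clinical record
    prior_therapies: list = field(default_factory=list)  # treatments already tried
    variants: list = field(default_factory=list)         # findings from the genomic test


# Invented rule base mapping a gene variant to a candidate targeted therapy.
TARGETED_THERAPIES = {
    "ERBB2_amplification": "HER2-targeted therapy",
    "EGFR_L858R": "EGFR inhibitor",
    "BRCA1_mutation": "PARP inhibitor",
}


def candidate_therapies(patient: PatientRecord) -> list:
    """Cross-reference genomic findings with the patient's clinical history."""
    options = [TARGETED_THERAPIES[v] for v in patient.variants if v in TARGETED_THERAPIES]
    # Drop anything the patient has already received.
    return [t for t in options if t not in patient.prior_therapies]


patient = PatientRecord(
    diagnosis="breast cancer",
    prior_therapies=["chemotherapy"],
    variants=["ERBB2_amplification"],
)
print(candidate_therapies(patient))  # ['HER2-targeted therapy']
```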

A major part of Tempus’ work is looking for clues in DNA sequences, and the company built its own lab to run tests for that purpose. Combining genetic sequencing and machine learning may change the way tumours are treated.

“I’d served as the CEO for Groupon for two years, and in the middle of that, my wife was diagnosed with breast cancer, and I found myself thrust into a clinic,” Lefkofsky says.

“I had this maddening experience where I felt like I was being teleported back in time. I had a Tesla, I had an iPhone, I felt fully in the 21st century … and you get to the hospital and you feel like you’re in 1980,” he says.

“I felt like I was in a time-warp, and that was true for every part of her care, even though she had great care. The technology and data that existed seven years ago, and in many cases, still exists today, was decades behind other industries. I thought, someone has to fix this,” Lefkofsky says – while noting that the hospital treatment did work for his wife.

Eric Lefkofsky is the founder and CEO of Tempus.
Tempus first made genetic testing more personalised. “I started by trying to make genomic tests intelligent and data-enabled,” Lefkofsky says. “Back then, the genetic profiling of cancer patients was relatively new, and the tests were hard to understand. We thought that if we could contextualise them and make them specific to the patient, we could help physicians interpret them.”

Lefkofsky says that the process revealed more than he expected: “We realised that we had stumbled into something bigger than making smart genomic reports. We had worked out a way to bring intelligent, AI-driven diagnostics to healthcare.”

Tempus still focuses on therapy selection, but is also developing its software to assess the likelihood of cancer recurrence in patients who have recovered. Predicting cancer before it appears is also a possibility, but that is a long way down the road, he says.

The potential marriage of AI and mobile computing may deliver new healthcare solutions, allowing people to gain insight on what’s going on inside their body. Photo: Shutterstock

How artificial intelligence benefits healthcare

Greg Corrado, distinguished scientist and senior research director at Google AI, is interested in the potential marriage of AI and mobile computing to deliver new healthcare solutions.

“This really opens up a whole new set of opportunities to expand our ability to provide care to folks who are in real need,” he says.

Greg Corrado is a senior research director at Google AI.

A project which pairs mobile phones with low-cost portable ultrasound scanners is proving successful. “We are looking at health screening in low- and middle-income countries, where over 90 per cent of the mortality around childbirth occurs,” he says.

“It’s an example of how you can use your phone with a trained healthcare provider, or potentially just on your own, to gain insight on what’s going on inside your body.”

Corrado, a neuroscientist, says that deep learning, a form of AI, is particularly suited to solving medical problems. Deep learning’s artificial neural networks are formed of interconnected nodes which, like the neurons in our brain, have the ability to make their own decisions.

Neural networks are initially “trained” by programmers, but after that, are able to form their own conclusions.

“A neural network is a stack of little decision makers who work together to solve a problem,” Corrado says. “The way that they go about this is not predetermined by the engineer – the system discovers how to solve a problem by learning. It’s how the human brain works.”
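Corrado’s “stack of little decision makers” can be illustrated with a minimal, self-contained example – not Google’s code, just a toy two-layer network in Python that learns a simple rule (XOR) from four examples rather than being told the rule by its programmer.

```python
# Toy illustration of a "stack of little decision makers" - not Google's code.
# A tiny two-layer network learns the XOR rule from examples: the engineer
# supplies the data, and the weights discover the rule themselves.
import numpy as np

rng = np.random.default_rng(0)

# Four training examples and their labels (the XOR rule).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of "decision makers": 2 inputs -> 8 hidden units -> 1 output.
W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


for _ in range(10000):
    # Forward pass: each layer makes its small decisions.
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)

    # Backward pass: nudge every weight to reduce the error (learning).
    err_out = (out - y) * out * (1 - out)
    err_hidden = (err_out @ W2.T) * hidden * (1 - hidden)
    W2 -= 0.5 * hidden.T @ err_out
    b2 -= 0.5 * err_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ err_hidden
    b1 -= 0.5 * err_hidden.sum(axis=0, keepdims=True)

print(out.round(2))  # typically close to [0, 1, 1, 0]: the rule was learned, not programmed
```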

AI applications are trained to spot the patterns and connections in medical data that humans can see. But they can also search through vast amounts of data at high speed, and find answers much faster than a human can.

What fascinates Corrado is the potential for AI to come up with a solution that a human had not thought of at all.

“We have taught machines to see what we can see. But it turns out that we also have taught machines to see things that we can’t see. There is a lot of promise there,” he says.

“We could employ AI to automate something that we already do, but we are selling it short if we imagine that all AI is good for is automating a familiar process. In the future, the evolution of AI will be about enabling us to see more,” he says. “It will allow us to understand more about disease states and more about how disease progression works.”

Making technology accessible to all

Website accessibility is the next big thing in the US, and after a slow start, healthcare companies are catching up. The aim of accessibility is to make websites, and other technology, as easy for people with disabilities to use as they are for everyone else.

Joshua Miele, principal accessibility researcher for Amazon Lab126, has made a career of making technology work for people with disabilities. He currently brings his talents to Amazon devices. Accessibility problems arise not from the tech per se, but from the culture that informs the tech, he says.

For instance, Miele, who is blind, says he is often asked whether blind people need to know what’s happening in videos – in spite of the fact that many instructions for carrying out tasks are given in video form. If developers don’t know that blind people use videos, they won’t bother to make it easy for them to do so.

When it comes to healthcare, people assume that blind people only need to visit medical websites that reference their blindness, forgetting that blind people also have other illnesses for which they are receiving treatment.

“There are incredible barriers to participation and equity for blind people,” Miele says. “Some of them are technical, but most of them are social. My job is to remove those barriers. It’s about trying to get equitable treatment for people with disabilities. I like to come up with tech solutions that make people think about the social issues.”

One of Miele’s initiatives was a crowdsourcing programme which allowed users to provide an audio description for any YouTube video.

“By putting that out there, we made people consider the question of whether blind people need access to video, which of course they do,” he says.

For a blind person, it is difficult to be included in anything at all – the problem of accessibility goes far beyond tech, he notes. “To succeed as a blind person, you need a sense of self-advocacy, otherwise you will be marginalised. To be included, you must demand to be included.”
