Hosted by IDTechEx
Artificial Intelligence Research
Posted on September 11, 2019

Wearable cameras improve quality of life in heart failure patients

The ever-present devices that seem to track all our moves can be annoying, intrusive or worse, but for heart failure patients, tiny wearable cameras could prove life-enhancing, according to research presented today at ESC Congress 2019 together with the World Congress of Cardiology. For more information see the IDTechEx report on Wearable Technology Forecasts 2019-2029.
 
Minute-by-minute images captured by these little "eyes" provide valuable data on diet, exercise and medication adherence, which can then be used to fine-tune self-management.
 
"The cameras bring more information to health professionals to really understand the lived experience of heart failure patients and their unique challenges," stated study first author Susie Cartledge, a registered nurse and dean's postdoctoral research fellow at Deakin University's Institute for Physical Activity and Nutrition in Melbourne, Australia. "This is a level of detail and context that will help us tailor their care."
 
Something as seemingly trivial as drinking too much fluid - which cameras can "see" - can tax an already burdened heart, leading to a potentially deadly hospital stay.
 
Heart failure is a chronic condition in which the heart does not pump as well as it should, so the body does not get enough oxygen. There is no cure and treatments are limited, meaning that self-care is paramount. Healthcare professionals have traditionally gleaned information on patients' daily activities from self-reports, which can be unreliable. Wearable-camera "life-logging" is still in its infancy, but studies have shown that it yields useful data.
 
 
For this feasibility study, Dr Cartledge and her colleagues recruited 30 individuals with advanced (NYHA II-III) heart failure from a Melbourne cardiology practice. Participants' mean age was 73.6, and 60% were male. Patients attached a wide-angle "narrative clip" camera to their clothing at about chest height. The cameras, barely two centimetres square, were worn from morning to night and took still images every 30 seconds.
 
"You can really just see the context of the patient's world from chest height," explained Dr Cartledge. "We saw their bingo score cards, their families, their friends, but we only saw the patients themselves if they stood in front of a mirror. We felt like we had been with the patient for the day."
 
The images revealed no "scandalous" behaviour on the part of the participants, said Dr Cartledge, but they did highlight areas for improvement. Patients in general needed to increase their exercise and reduce sedentary behaviour, which was typically associated with screen time. Participants could also generally improve their diets; for example, one participant could cut back on diet sodas, beers at bingo, and cigarettes.
 
"We can use this information to have a discussion with the patient. Yesterday, one man's pills sat out on his table for ages before he took them," continued Dr Cartledge, who would counsel this patient to take his medication sooner.
 
Almost all of the participants (93%) said they were happy wearing the camera (the remaining two were neutral). Some went so far as to report that they were reassured "someone was watching over them" or that the cameras spurred them to engage in "good behaviour." All participants had the option of deleting photos before the research team saw them.
 
 
But capturing the images and getting consent from patients was the easy part. By the end of the 30-day study period the authors had a library of more than 600,000 photos which they had to sort through and analyse.
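At one frame every 30 seconds, the photo count grows quickly. A minimal back-of-the-envelope sketch, assuming a 14-hour "morning to night" wear day (the article does not state exact wear times, so the hours figure is an assumption):

```python
# Rough arithmetic for how a 30-participant cohort reaches ~600,000 photos.
INTERVAL_SECONDS = 30        # one still image every 30 seconds (from the study)
HOURS_WORN = 14              # assumption: "morning to night"
PARTICIPANTS = 30            # cohort size (from the study)

per_day = HOURS_WORN * 3600 // INTERVAL_SECONDS   # images per person per day
cohort_per_day = per_day * PARTICIPANTS           # images across the cohort per day
days_to_600k = 600_000 / cohort_per_day           # wear-days needed to hit the library size

print(per_day)         # 1680 images per person per day
print(cohort_per_day)  # 50400 images per day across the cohort
print(round(days_to_600k, 1))
```

Under these assumptions, roughly twelve days of cohort-wide wear would be enough to accumulate the reported library.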
 
Machine learning techniques grouped the images into four domains: medication management, dietary intake, meal preparation and physical activity. This process had mixed results. It was most successful in identifying diet-related photos (an average of 49% of the time), followed by information on meals (average 40%) and physical activity (average 31%). Drug adherence was the least precise, with an average of only 6%. This may be because prescriptions come in so many different forms - pill strips, bottles, sprays and puffers - making them hard to recognise.
 
"The sorting is actually extremely difficult," admitted Dr Cartledge. She and her colleagues enlisted the help of artificial intelligence experts at Ireland's Dublin City University to build a more specific platform. Eventually, the team envisions a relatively low-cost venture using a search engine platform and reusable cameras.
 
The sheer number of images was a limitation, the authors acknowledged. Because participants were older, had advanced disease and came from a lower socioeconomic neighbourhood, the heart failure findings may not be applicable to other populations; however, the study methodology could be implemented for other chronic disease populations. Dr Cartledge predicted that the system, once refined, will be most helpful for guiding newly diagnosed patients.
 
 
"This is the first step," Dr Cartledge said. "Patients are happy to wear it. We can see the context of the challenges they face. The next step is to build an artificial intelligence platform to sort the images out in a quick and meaningful way so healthcare practitioners can use it. We're entering a new frontier."
 
Source: European Society of Cardiology
Top image: Johns Hopkins Medicine
Learn more at the next leading event on the topic: Healthcare Sensor Innovations 2019, on 25-26 Sep 2019 in Cambridge, UK, hosted by IDTechEx.