Summary: Researchers have developed an AI tool called FaceAge that uses face photographs to estimate biological age and predict survival outcomes in cancer patients. In a study involving over 6,000 people, those with cancer had FaceAges about five years older than their chronological age, and higher FaceAges were linked to poorer survival.
The tool outperformed clinicians in predicting short-term life expectancy for patients receiving palliative care, particularly when integrated into their decision-making. These results suggest that facial features may serve as powerful, non-invasive markers of aging and disease, opening new doors in precision medicine.
Important Facts:
- FaceAge AI: Estimates biological age and survival outcomes from face photographs.
- Cancer Insight: Patients with cancer appeared ~5 years older than their chronological age.
- Clinical Boost: FaceAge improved doctors’ estimates of life expectancy in palliative care.
Origin: Mass General
Eyes may be the windows to the soul, but a person’s biological age may be reflected in their facial features.
Investigators from Mass General Brigham developed a deep learning algorithm called FaceAge that uses a photo of a person’s face to predict biological age and survival outcomes for patients with cancer.
They found that patients with cancer, on average, had a higher FaceAge than those without and appeared about five years older than their chronological age.
Older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types.
They also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy.
Their results are published in The Lancet Digital Health.
“We can use artificial intelligence (AI) to estimate a person’s biological age from face pictures, and our study shows that information can be clinically meaningful,” said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham.
“This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians.
“How old someone looks compared to their chronological age really matters—individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy.”
When patients walk into exam rooms, their appearance may give physicians clues about their overall health and vitality. Those intuitive assessments combined with a patient’s chronological age, in addition to many other biological measures, may help determine the best course of treatment.
However, like anyone, physicians may have biases about a person’s age that may influence them, fueling a need for more objective, predictive measures to inform care decisions.
With that goal in mind, Mass General Brigham investigators leveraged deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets.
The team tested the algorithm in a cohort of 6,196 cancer patients from two centers, using photographs routinely taken at the start of radiotherapy treatment.
Results showed that cancer patients appear significantly older than those without cancer, and their FaceAge, on average, was about five years older than their chronological age.
In the cancer patient cohort, older FaceAge was associated with worse survival outcomes, especially in individuals who appeared older than 85, even after adjusting for chronological age, sex, and cancer type.
Estimated survival time at the end of life is difficult to pin down but has important treatment implications in cancer care. The team asked 10 clinicians and researchers to predict short-term life expectancy from 100 photos of patients receiving palliative radiotherapy.
While there was a wide range in their performance, overall, the clinicians’ predictions were only slightly better than a coin flip, even after they were given clinical context, such as the patient’s chronological age and cancer status.
Yet when clinicians were also provided with the patient’s FaceAge information, their predictions improved significantly.
Further research is needed before this technology could be considered for use in a real-world clinical setting. The research team is testing this technology to predict diseases, general health status, and lifespan.
Follow-up studies include expanding this work across different hospitals, looking at patients in different stages of cancer, tracking FaceAge estimates over time, and testing its accuracy against plastic surgery and makeup data sets.
“This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age,” said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham.
“As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual’s aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives.”
Authorship: Additional Mass General Brigham authors include Dennis Bontempi, Osbert Zalay, Danielle S. Bitterman, Fridolin Haugg, Jack M. Qian, Hannah Roberts, Subha Perni, Vasco Prudente, Suraj Pai, Christian Guthier, Tracy Balboni, Laura Warren, Monica Krishan, and Benjamin H. Kann.
Disclosures: Mass General Brigham has filed provisional patents on two next-generation facial health algorithms.
Funding: This project received financial support from the National Institutes of Health (HA: NIH-USA U24CA194354, NIH-USA U01CA190234, NIH-USA U01CA209414, and NIH-USA R35CA22052; BHK: NIH-USA K08DE030216-01), and the European Union – European Research Council (HA: 866504).
About this AI, aging, and cancer research news
Author: Ryan Jaslow
Source: Mass General
Contact: Ryan Jaslow – Mass General
Image: The image is credited to Neuroscience News
Original Research: Open access.
“FaceAge, a deep learning system to estimate biological age from face photographs to improve prognostication: a model development and validation study” by Hugo Aerts et al. Lancet Digital Health
Abstract
FaceAge, a deep learning system to estimate biological age from face photographs to improve prognostication: a model development and validation study
Background
As humans age at different rates, physical appearance can yield insights into biological age and physiological health more reliably than chronological age. In medicine, however, appearance is incorporated into medical judgements in a subjective and non-standardised way.
In this study, we aimed to develop and validate FaceAge, a deep learning system to estimate biological age from easily obtainable and low-cost face photographs.
Methods
FaceAge was trained on data from 58 851 presumed healthy individuals aged 60 years or older: 56 304 individuals from the IMDb–Wiki dataset (training) and 2547 from the UTKFace dataset (initial validation).
Clinical utility was evaluated on data from 6196 patients with cancer diagnoses from two institutions in the Netherlands and the USA: the MAASTRO, Harvard Thoracic, and Harvard Palliative cohorts. FaceAge estimates in these cancer cohorts were compared with a non-cancerous reference cohort of 535 individuals.
To assess the prognostic relevance of FaceAge, we performed Kaplan–Meier survival analysis and Cox modelling, adjusting for several clinical covariates. We also assessed the performance of FaceAge in patients with metastatic cancer receiving palliative treatment at the end of life by incorporating FaceAge into clinical prediction models.
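The Kaplan–Meier analysis mentioned above estimates a survival curve while accounting for censored patients (those still alive at last follow-up). A minimal pure-Python sketch of the estimator, for illustration only (not the study's code, which also fit Cox models with clinical covariates):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times  : follow-up time for each patient
    events : 1 if death was observed, 0 if the patient was censored
    Returns a list of (time, survival probability) at each event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        # Group all patients tied at the same time point
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            # Survival drops by the fraction of at-risk patients who died
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])` yields survival estimates at the three observed event times; the censored patients at times 3 and 5 reduce the at-risk count without dropping the curve.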
To evaluate whether FaceAge has the potential to be a biomarker for molecular ageing, we performed a gene-based analysis to assess its association with senescence genes.
Findings
FaceAge showed significant independent prognostic performance in various cancer types and stages.
Looking older was correlated with worse overall survival (after adjusting for covariates, per-decade hazard ratio [HR] 1·151, p=0·013 in a pan-cancer cohort of n=4906; 1·148, p=0·011 in a thoracic cohort of n=573; and 1·117, p=0·021 in a palliative cohort of n=717).
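To make the per-decade hazard ratios concrete: because a Cox hazard ratio scales multiplicatively with the covariate, the equivalent per-year ratio is the 10th root of the per-decade value. A quick check (illustrative arithmetic only):

```python
# A per-decade hazard ratio (HR) from a Cox model scales multiplicatively,
# so the equivalent per-year HR is the 10th root of the per-decade HR.
hr_per_decade = 1.151  # pan-cancer cohort estimate from the abstract

hr_per_year = hr_per_decade ** (1 / 10)
print(round(hr_per_year, 4))  # prints 1.0142
```

So each additional year of FaceAge corresponds to roughly a 1.4% higher hazard of death in the pan-cancer cohort.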
We found that, on average, patients with cancer looked older than their chronological age (mean increase of 4·79 years with respect to a non-cancerous reference cohort, p<0·0001).
We found that FaceAge can improve physicians’ survival predictions in patients with incurable cancer receiving palliative treatments (from area under the curve 0·74 [95% CI 0·70–0·78] to 0·80 [0·76–0·83]; p<0·0001), highlighting the clinical use of the algorithm to support end-of-life decision making.
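The area under the ROC curve (AUC) reported above equals the probability that a randomly chosen patient who died within the prediction window was ranked at higher risk than one who survived. A minimal pure-Python computation via the Mann–Whitney U statistic (an illustrative sketch, not the study's code):

```python
def auc(labels, scores):
    """AUC as the probability that a positive case outranks a negative one.

    labels : 1 for positive (e.g. died within window), 0 for negative
    scores : predicted risk scores; ties count as half a win
    """
    pos = [s for label, s in zip(labels, scores) if label == 1]
    neg = [s for label, s in zip(labels, scores) if label == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For instance, `auc([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.1])` gives 0.75: three of the four positive–negative pairs are correctly ordered. An AUC of 0.5 is chance-level, which is why the clinicians' unaided predictions were described as only slightly better than a coin flip.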
FaceAge was also significantly associated with molecular mechanisms of senescence through gene analysis, whereas age was not.
Interpretation
Our results suggest that a deep learning model can estimate biological age from face photographs and thereby enhance survival prediction in patients with cancer.
Further research, including validation in larger cohorts, is needed to verify these findings in patients with cancer and to establish whether the findings extend to patients with other diseases.
Subject to further testing and validation, approaches such as FaceAge could be used to translate a patient’s visual appearance into objective, quantitative, and clinically valuable measures.
Funding
US National Institutes of Health and EU European Research Council.