
Harvard's 'FaceAge' AI Links Cancer Survival to How Old You Look in a Photo

decrypt.co

11 hours ago

A new artificial intelligence model developed at Harvard University, called FaceAge, estimates biological age by analyzing facial features in photos—and might help predict cancer survival by indicating how old a patient appears relative to their actual age. Trained on images of 58,851 healthy people, FaceAge was later tested on cancer patients to explore whether appearing older than one's chronological age might signal poorer health outcomes.

“We found that, on average, cancer patients look older than their chronological age, and looking older is correlated with worse overall survival,” the report said. “FaceAge demonstrated significant independent prognostic performance in a range of cancer types and stages.”

Chronological age refers to the number of years a person has been alive, while biological age reflects how well—or poorly—their body is functioning relative to that number. According to Harvard researchers, a person’s physical appearance may provide effective biomarkers to determine their biological age.

FaceAge builds on earlier work from ETH Zurich, where researchers created Deep EXpectations (DEX), an open-source deep learning model that estimates apparent age from facial images. The Harvard team also trained FaceAge using images from IMDB-WIKI and UTKFace, two of the largest publicly available facial image datasets.

Since 2006, Harvard has committed significant resources to understanding and reversing biological aging. More recently, the university has expanded its investment in AI-driven research focused on diagnosing and treating cancer, fields that are increasingly converging. In October 2024, developers at Harvard Medical School unveiled a new AI model, known as the Clinical Histopathology Imaging Evaluation Foundation (CHIEF). At the time, researchers noted that the AI outperformed previously tested models with 96% accuracy in cancer detection.

While the FaceAge research centered on biological age and cancer, researchers said it could lead to broader applications. “These findings may extend to diseases beyond cancer, motivating using deep learning algorithms to translate a patient’s visual appearance into objective, quantitative, and clinically useful measures,” the Harvard researchers said.

FaceAge is the latest tool in a growing movement among medical experts to focus on biological age, using facial analysis to identify early signs of decline and shift care toward prevention rather than just treatment. According to experts like Kian Katanforoosh, adjunct professor of deep learning at Stanford University and founder of the skills intelligence company Workera, the shift toward AI in biological age research is about overcoming human limitations.

“AI analyzes thousands of features in a face for things most of us don’t consciously notice, and finds patterns that correlate with biological aging,” Katanforoosh told Decrypt. “It’s similar to how early deep learning models got better than humans at detecting cats in photos. They didn’t use intuition. They were trained on millions of examples and learned what was statistically consistent.”

“Humans are biased and inconsistent,” he added. “AI is trained systematically at a scale we can’t match.”

Edited by Andrew Hayward
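
The core idea, regressing an apparent age from a face photo and then comparing it to the patient's chronological age, can be illustrated with a short sketch. The snippet below is a hypothetical illustration only: it assumes a generic PyTorch ResNet-18 backbone and a made-up face_age_gap helper, and it is not the published FaceAge or DEX architecture or weights.

# Hypothetical sketch of an apparent-age regressor in the spirit of DEX/FaceAge.
# Assumptions: PyTorch + torchvision, a ResNet-18 backbone, and an untrained model;
# the real studies used their own architectures and large face datasets.

import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

class ApparentAgeRegressor(nn.Module):
    """CNN backbone with a single-output head that predicts age in years."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)  # placeholder backbone, no pretrained weights
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # regression head
        self.backbone = backbone

    def forward(self, x):
        return self.backbone(x).squeeze(-1)  # predicted age in years

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def face_age_gap(model, image_path, chronological_age):
    """Hypothetical helper: predicted apparent age minus chronological age.
    A positive gap means the person 'looks older' than their actual age,
    which is the signal the FaceAge study correlates with survival."""
    model.eval()
    with torch.no_grad():
        img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
        apparent_age = model(img).item()
    return apparent_age - chronological_age

In the study's framing, it is a positive gap, looking older than one's chronological age, that was associated with worse overall survival in cancer patients.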

https://decrypt.co/322311/harvard-faceage-ai-links-cancer-photos?utm_source=CryptoNews&utm_medium=app