Ambient intelligence, human impact

Health care providers struggle to catch early signals of cognitive decline. AI researcher and computational neuroscientist Ehsan Adeli’s innovative computer vision tools may offer a solution.


Photo: Jess Alvarenga


Inside eight apartments at a housing community for seniors in Yuma, Arizona, a camera smaller than a sticky note sits on a shelf in the living room. With the consent of residents, it captures their movements, behaviors, and facial expressions throughout the day. On the back end, an algorithm monitors the footage for troubling changes: Have they started watching television for 10 hours straight? Wobbling as they walk? Frowning and waving their hands more often?

Behaviors like these—which can indicate memory loss, depression, mobility challenges, or irritability—are among the early signs of cognitive decline, which can lead to dementia. But they’re often invisible to health care providers, who base diagnoses on self-reported questionnaires, brief assessments during rare visits, and reports from caregivers who may not notice subtle shifts. 

With the experiment in Yuma, Ehsan Adeli, an assistant professor of psychiatry and behavioral sciences at Stanford, is hoping to change that. His pioneering research uses computer vision intelligence to analyze patients’ movements in videos, from daily activities to minute gestures, in order to flag worrisome symptoms for clinicians. Catching those early can allow for interventions and support that otherwise would have to wait until the disease is further along. 

“Our hope is that this will potentially revolutionize the early diagnosis of cognitive decline, Alzheimer’s disease, and related dementias,” Adeli says.

Adeli’s work is part of a growing field called ambient intelligence, which embeds sensors in everyday environments and uses artificial intelligence to interpret the data. His first related project, an ongoing collaboration with the Clinical Excellence Research Center (CERC), uses computer vision, a type of AI, to analyze videos of patient interactions in order to improve care at Stanford Hospital.


Among the model’s abilities: recognizing actions such as adjusting the bed, or putting slipper socks and compression sleeves on the patient. Courtesy of Ehsan Adeli and the Partnership in AI-Assisted Care team

Ambient intelligence is also a leading priority for the Stanford Institute for Human-Centered Artificial Intelligence (HAI), which partially supports Adeli’s research. As an HAI-affiliated faculty member, Adeli shares the institute’s vision of a future in which technology augments human potential and enriches lives, ultimately contributing to a more sustainable and compassionate society.

Adeli’s research with seniors brings his innovative approach into homes for the first time. What are known as neuropsychiatric symptoms—such as mood changes, confusion, and wandering—are proven early predictors of cognitive problems. Sensors commonly used in health care, such as sleep and temperature monitors, can’t track these effectively. Wearable sensors are too onerous and can result in missing data if patients remove them.

“That’s why the type of technology we are developing—specifically, computer vision—is key,” Adeli says. “There are few instances of using passive camera data to understand behavior, let alone relating them to clinical outcomes.”

“Our hope is that this will potentially revolutionize the early diagnosis of cognitive decline, Alzheimer’s disease, and related dementias.”
Ehsan Adeli

Ehsan Adeli. Photo: Jess Alvarenga

Adeli and his collaborators began developing the technology two years ago with support from, among others, the Jaswa Innovator Award for Early Career Innovators from the Department of Psychiatry and Behavioral Sciences.

With input from practitioners, Adeli is building personalized dashboards that track patients’ key behaviors over time, allowing doctors to notice gradual changes. 

“This would be a kind of contactless vital sign monitoring for human behavior,” he says. “If we can detect these signs early on, medications and behavioral therapies could be used to delay adverse effects and have prolonged, higher quality of life.”
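At its simplest, a dashboard of this kind could flag when a tracked behavior drifts from a resident’s own baseline. The sketch below is a minimal illustration of that idea only; the metric, thresholds, and z-score rule are assumptions for illustration, not Adeli’s actual method:

```python
from statistics import mean, stdev

def flag_drift(history, today, z_threshold=2.0):
    """Flag a daily behavioral metric (e.g., hours of TV watched)
    that drifts far from a resident's own rolling baseline.

    history: list of past daily values for one resident
    today:   today's observed value
    Returns True if today's value sits more than z_threshold
    standard deviations from the personal baseline.
    """
    if len(history) < 7:   # need at least a week of data to form a baseline
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:         # perfectly stable history: flag any change at all
        return today != mu
    return abs(today - mu) / sigma > z_threshold

# Example: a resident who usually watches about 2 hours of TV per day
tv_hours = [2.0, 1.5, 2.5, 2.0, 1.8, 2.2, 2.1]
print(flag_drift(tv_hours, 10.0))  # a sudden 10-hour day -> True
print(flag_drift(tv_hours, 2.3))   # an ordinary day -> False
```

Because the baseline is computed per resident, the same rule adapts to each person’s habits, which matches the article’s emphasis on personalized, gradual-change tracking.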

Once Adeli and his team establish the feasibility of the technology, they hope to launch a clinical trial that compares its efficacy with current methods of diagnosing cognitive decline. With support from Stanford’s psychiatry department, Adeli has built a “Living Lab,” which resembles a typical living room and will be equipped with more than 20 contactless sensors. His goal is to find sensors that can supplement cameras in people’s homes while maintaining privacy. For example, sensors in a mattress could track sleep patterns, or those embedded in a bathroom floor could capture movement.

Using computer vision algorithms, Adeli’s movements can be mapped in real time. Video by Ehsan Adeli, Tiange Xiang, MS '24, PhD '28, and Jess Alvarenga

Ambient intelligence tools will soon be commonplace in communal areas of senior homes, predicts Bryan Ziebart, president of Insight Living, which manages the Yuma facility. He believes that quality of life and health outcomes will improve as a result.

“If you have 100 residents, you don’t have 100 caregivers,” he says. “Leveraging computer vision around affect, gait, and emotional state will be a core part of how communities operate in the future.” 

Along with HAI collaborators James Landay and Fei-Fei Li, and CERC collaborators Arnold Milstein and Vankee Lin, Adeli is in the process of launching a complementary pilot project with the National University of Singapore focused on detecting neuropsychiatric symptoms as a precursor for dementia. Adeli’s team, which also includes Sarah Billington, is also designing and testing ambient intelligence tools that go beyond tracking cognitive decline for use in general senior care.

For Adeli, the issue is personal. Years ago, he witnessed his wife’s grandmother, who suffered from dementia, gradually lose her ability to perform daily activities. 

“It was heartbreaking, and that’s partly why I am passionate about the technology we are developing,” he says. “I truly hope it can help millions of families like mine—offering early detection, timely intervention, and ultimately, a chance to preserve the health and independence of their loved ones.”


Ehsan Adeli is an assistant professor of psychiatry and behavioral sciences and, by courtesy, of computer science at Stanford. He directs the Stanford Translational AI Lab.

An emphasis on privacy

Privacy is of central importance to Adeli’s work. In 2021, he published a paper with colleagues in Lancet Digital Health outlining recommendations for the ethical use of ambient intelligence in health care settings, and he applies these in his own work. The pilot limits cameras to living rooms and kitchens, and participants can turn them off when desired (or ask staff to do so) and can decline to hand over footage at any point. Blurring videos isn’t compatible with analyzing them, but the team stores data in an encrypted form, and researchers have no access to a live video feed. 

Privacy concerns were the reason that residents at another senior living community in Prescott Valley, Arizona, declined to participate in the pilot. 

“They somehow got the idea that Stanford was going to report who has dementia and ship you off to memory care forever, though we don’t receive any of that information and would never do that,” Ziebart says.

Indeed, a hallmark of all funded HAI projects is that they pass the Ethics and Society Review, which ensures that project investigators proactively address ethical and societal risks before receiving funding. The process, pioneered by HAI, requires researchers to identify potential harms, propose mitigation strategies, and engage with ethical considerations throughout the project life cycle.

In Yuma, Adeli’s “minimally invasive” approach appealed to some residents and their loved ones, Ziebart says. “Families are very excited, because they have better insights and data into how loved ones are doing, and seniors feel empowered to help the next generation in a meaningful way by living their life and allowing a little camera to sit in the corner,” he says.

