The shrinking caregiver-to-older-adult ratio raises interest in technological innovation for the future of aging and caregiving. Such interest is stimulating investment in digital health, including technologies for older adults that are aimed at sustaining and prolonging independent living, mitigating health issues via early detection, improving quality of life by helping with online shopping and grocery delivery, connecting elders to their friends and families or offering museum tours in virtual reality.
But as the use of technologies like smart speakers, wearables, smart home devices and fall sensors spreads, older adults’ needs, abilities, constraints and preferences are rarely considered in product design. The result is usability and accessibility issues, difficulty understanding data practices and an often steep learning curve for using these technologies safely.
Data Collection and Security Risks
While offering considerable benefits, emerging technologies rely on the collection and analysis of vast amounts of personal information, from blood pressure readings and sleep habits to a person’s precise physical location and adherence to a prescription medication regimen. Collecting this information poses serious privacy and security threats, to which older users are particularly vulnerable because of lower technological literacy and experience, financial constraints and age-related declines in physical and cognitive abilities (e.g., memory, attention span, visual and hearing acuity).
Research has shown that compared to younger populations, on average, older adults are more often targeted for security attacks, are less aware of online privacy and security threats and are less likely to protect against them.
In our interview study, my colleagues and I found that older adults are particularly concerned that the collection, processing and dissemination of their information are opaque, unsolicited, ubiquitous and unavoidable. For instance, older adults suspect that smart speakers, such as Alexa or Google Home, and smart TVs may be collecting data passively, even while users are not interacting with these devices.
Older adults also find it hard to anticipate what privacy-invasive inferences artificial intelligence and machine-learning algorithms may draw about their health, their emotional and mental condition or their financial status, and they may be unaware of how seemingly non-sensitive information gathered from personal search queries (e.g., searches about food, movies and music) might be used. This lack of awareness is particularly concerning given that many study respondents thought they had “nothing to hide” and considered their lives “an open book.”
Such naïve beliefs may lead to incorrect assessments of the value and sensitivity of personal information and to excessive data disclosure, increasing exposure to individual profiling, targeted advertising, spam, fraud, scams, phishing, identity theft and price and service discrimination. Unauthorized disclosures of private information, or mistakes in electronic personal records, may endanger benefits to which older adults might otherwise be entitled, such as Social Security, disability allowance, insurance coverage or eligibility for senior housing or assisted living facilities.
Choosing the Right Technologies
Many older adults did not take part in choosing or purchasing their smart technologies. They often received them as gifts from younger family members and were sometimes too gracious to decline using the products, or they accepted surveillance technologies so their families would worry less about their safety. Many other elders used public devices and public Internet access, as well as secondhand devices, but often were unaware of the associated privacy and security risks.
Even when older adults make an active choice to avoid various technologies, other people’s devices (like security cameras or WiFi sensors) may still collect information about them. Doctors and bankers may share elders’ health and financial records in online systems, or their grandchildren may post their pictures on social media, jeopardizing their privacy as bystanders. The perception of ubiquitous and unavoidable data collection often leads to privacy resignation, fatalism and an unwillingness to take action to protect personal information.
In addition, residents of eldercare facilities and older adults with chronic health conditions were more often resigned to losing privacy in exchange for care and safety, and were subject to care facilities’ surveillance and data collection policies. Older adults living independently but facing declining health had to weigh privacy against independence, often feeling pressured by family members to accept domestic surveillance cameras or fall detectors if they wanted to continue “aging in place.”
However, it would be incorrect to assume that older adults are comfortable with the 24/7 sharing of all their personal information with caregivers. As one of our studies shows, elders’ preferences for sharing personal information with family members, medical professionals and care facility staff are granular and context-dependent: the interplay of benevolent purpose, functional relevance, urgency, caregivers’ anticipated emotional reactions and individual attitudes defines the appropriate data granularity, communication frequency and channel for sharing information with a given recipient.
Finally, our interview study shows that older adults who can choose whether or not to adopt a technology often decide to avoid buying it, or to limit its use, because of privacy, security, usability and other concerns. This creates a direct economic incentive for technology designers, developers and manufacturers to engage older adults in the design and testing process and to address their concerns.
Alisa Frik, Ph.D., is a research associate at the International Computer Science Institute and a postdoctoral researcher at the University of California, Berkeley.
Author’s note: We would appreciate any help from Generations Today readers as we try to connect with older adults potentially interested in participating in our academic research. If you're able to help disseminate information about our studies, for example by posting a flyer inviting older adults to take part in remote surveys or interviews, or if you have other suggestions about how to safely conduct empirical research with older participants during a pandemic, please contact us at researchlab@icsi.berkeley.edu.
How Can We Empower Older Adults to Use Technology Safely?
Information privacy and security training and awareness programs could empower older adults to use technology more safely and comfortably. In our interview study, older adults considered computer and Internet classes valuable, but some found them too difficult or ineffective at teaching them how to address relevant privacy and security concerns.
We suggest involving older adults in the co-design and evaluation of security and privacy educational materials tailored to their levels of experience, knowledge and specific technological concerns.
These educational materials can be disseminated via publications and websites directed at older adults; in senior centers and public libraries; through IT professionals, repair shops and customer services; at technology stores; and on the websites of service providers, manufacturers and vendors.
Staff delivering materials or technological help to older adults should be trained to use age-relevant, jargon-free language; to speak louder and with clear, unhurried pronunciation; to draw analogies to processes in the offline world or other contexts more familiar to older adults; and to make materials accessible to people with declining hearing and vision (for example, by adjusting image and text size, increasing line spacing, avoiding justified text and increasing contrast).
To reduce costs, companies often hire customer support workers from foreign markets with cheaper labor, or implement automated troubleshooting systems, online help pages and online and phone chat bots. Older study participants specifically noted difficulties with understanding non-native speakers, with communicating with chat bots and with choosing appropriate search terms on help pages. Companies should give older users the alternative of speaking with a representative who is a native speaker.
Helping to equip older adults’ devices with appropriate protective software and to configure their privacy and security settings can be part of an educational practicum, or of a privacy and security checkup at the point of sale or in repair shops. Such practicums would help older adults put new knowledge into practice, learn new skills and connect the educational materials to relevant practical consequences.
—Alisa Frik