How can we build technology that older adults trust?

This week, I’ve been thinking about trust. Connection is the thread that weaves trust, but is that thread strong enough to carry trust into our interactions with a machine?

Trust and technology

Knowledge workers have a reputation for leading consumer tech adoption. Not me. I’m among the very last to adopt almost every cool new consumer tech product, if I adopt at all. In my career, I can debate authentication protocols1 with software engineers. In my personal life, I’m not even on TikTok.

My borderline Luddite aversion to new technology is simply a personal quirk, but many older adults avoid technology over very real concerns about trust and privacy. Thanks to my career, I understand the high-level systems that power technology - but most older adults haven’t had reason or opportunity to learn how software gets built. This knowledge gap can lead older adults to distrust, and therefore avoid, new tech. Absent an understanding of the system that powers the visible user interface2, a minimalist user interface may seem alarming rather than delightful. Many icons are unintuitive, there’s no human-readable text to describe actions, and few sites surface an explanation of their underlying security protocols.

I’m not surprised when older adults struggle to trust technology - software can feel like a foreign language governed by hidden rules and outcomes.

Trust but verify

AARP's 2024 survey of older adults’ tech trends found that two-thirds of consumers age 50-plus have serious concerns when considering a tech purchase, and 21% were explicitly concerned about trust and privacy, asking questions about data collection and security. With fraud and scams rampant, these concerns are more valid than ever - and they are a symptom of insufficient education.3

As technology becomes further abstracted from the end user amid the explosion of AI, older adults’ trust in technology becomes more fraught than ever. “Trust but verify” is generally a good rule of thumb, but trust becomes very difficult when verification requires specialized technical skills, like validating the parameters of a large language model (LLM).

It’s no wonder Pew Research Center finds that 70% of people aware of AI express little to no trust in companies to use AI responsibly.4 Aging and Health Technology Watch analyst Laurie M. Orlov posits that regulatory efforts are crucial in enhancing consumer trust in AI, especially in sensitive sectors like care work.5

I’d argue, though, that companies themselves can go a long way toward building trust with older adults simply by expanding their “accessible design” efforts to include optional voice- or text-based descriptions of what the software is doing behind the interface. A software company building for older adults might, for example, A/B test whether replacing a menu icon (⋮) with the word “Menu” improves the user experience - a rough sketch of such a test follows below.
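To make that concrete, here is a minimal sketch in Python of what such an experiment could look like: deterministically assign each user to the kebab-icon control or the “Menu” text-label variant, then compare task-completion rates. The variant names, user ID, and completion numbers are all hypothetical, not drawn from any real product.

```python
import hashlib
from math import sqrt


def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user so they see the same variant on every visit."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return "menu_text_label" if bucket < 50 else "kebab_icon"


def two_proportion_z(completed_a: int, shown_a: int,
                     completed_b: int, shown_b: int) -> float:
    """Two-proportion z-test: did the 'Menu' label change task completion?"""
    p_a, p_b = completed_a / shown_a, completed_b / shown_b
    p_pool = (completed_a + completed_b) / (shown_a + shown_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / shown_a + 1 / shown_b))
    return (p_a - p_b) / se


if __name__ == "__main__":
    print(assign_variant("user-123"))  # same user, same variant, every time

    # Illustrative (made-up) results: 412 of 600 users completed the task with
    # the "Menu" label vs. 355 of 600 with the kebab icon.
    z = two_proportion_z(412, 600, 355, 600)
    print(f"z = {z:.2f}")  # |z| > 1.96 roughly corresponds to p < 0.05
```

The deterministic hash matters for this audience: each older adult keeps seeing the same interface across sessions, rather than a menu that changes from visit to visit.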

Approaches to building trust

Companies are taking vastly different approaches to fostering trust. All tech companies targeting older adults have to reckon with trust as a key distribution channel, but several companies are selling trust as a service:

  • EverSafe: guards against fraud, identity theft, and age-related issues

    • Founder: Howard Tischler (founded 2003)

    • Consumer pricing: $6.36-$21.24/mo

  • Genie: blocks scams with AI, showcasing how the technology can be harnessed to protect and empower users

    • Founder: Jason Wolf (founded 2021)

    • Consumer pricing: $7.49-$19.99/mo

  • teleCalm: offers a stress-free phone service for older adults, addressing the urgent need for trustworthy communication tools in daily life

    • Founder: Tavis Schriefer (founded 2015)

    • Consumer pricing: $45.99-$68.98/mo

Voices of the Upper West Side

This week’s tech coaching sessions with older adults on the Upper West Side (NYC) opened my eyes to how implicit logic and instructions can erode older adults’ trust in technology6:

  • Michael’s experience writing financial modeling programs in Fortran in the 1970s led him to equate trust in technology with logic he can see. But today’s “clean” consumer user interfaces hide all logic from the end user.

    • With Michael, I see a need to surface more explanations to older adults about underlying operations.

  • As I mentioned last week, Linda relies on an unwieldy, locally saved Word doc for password management. She tracks passwords digitally just as she would on paper, without backup and without a legend to decipher her system.

    • With Linda, I see a need for more trustworthy password management solutions for older adults.

  • Margaret struggles with the icon-based instructions native to modern consumer tech. For Margaret, icons like the classic kebab menu (⋮) don’t map to meaning.

    • With Margaret, I see a need for explicit (verbal, text-based) rather than implicit (icon-based) instructions for older adults.

Speaking multiple trust languages

Sleek user interfaces paradoxically erode trust among older adults even as they foster trust in younger digital natives. In my experience, older adults often find reassurance in tangible proofs of security (like handwritten passwords), while younger generations embrace abstract proofs of security (like Google single sign-on).

As I learn more about older adults’ preferences for the tangible, I’m hopeful that we can transition from “accessible” design to truly universal design that accommodates various transparency preferences. Though generations may differ in their paths to trust, trust itself is an age-agnostic conduit to technology adoption.

1 Authentication protocols are ways to manage identity and verify access - in other words, ways to log in. A common example is Google single sign-on (SSO), where you log into Google and Google verifies your identity with other platforms so you don’t need to log in to each platform separately. Consumer “Sign in with Google” is built on the lighter-weight OpenID Connect (OIDC) protocol, while enterprise single sign-on often relies on the older, enterprise-grade SAML protocol. (A rough sketch of the OIDC flow appears after these notes.)

2 User interface is the point of interaction between human and computer - think, the app on your phone screen.

6 Name and identifying details have been changed to protect privacy.
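And the promised sketch of the OIDC flow behind “Sign in with Google,” in Python. This is an illustration under assumptions, not a working integration: the client_id and redirect_uri are placeholders, and a real login also exchanges the returned code for tokens after the redirect.

```python
from urllib.parse import urlencode

# Sketch of the redirect a site builds for OIDC "Sign in with Google".
# The client_id and redirect_uri below are placeholders for illustration only.
GOOGLE_AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

params = {
    "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",  # placeholder
    "redirect_uri": "https://example.com/auth/callback",       # placeholder
    "response_type": "code",            # authorization-code flow
    "scope": "openid email profile",    # the "openid" scope is what makes this OIDC
    "state": "random-anti-csrf-value",  # in practice, a fresh random value per login
}

login_url = f"{GOOGLE_AUTH_ENDPOINT}?{urlencode(params)}"
print(login_url)  # the user is sent here; Google checks the password, not the site
```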