I am sorry, Grandma. I am sorry to every grandmother and grandfather navigating a world that was not built with them in mind. Life is already hard enough. Financial hardship arrives layered on top of everything else that comes with age: the health battles, the grief, the loneliness of outliving the people you loved. And then we tell someone in her 70s that her resume was filtered out by an algorithm before a single human being laid eyes on it? That is not progress. That is cruelty with a user interface.
Ageism is not new. The word itself has been with us since 1968, when pioneering gerontologist Dr. Robert N. Butler, who watched a community resist housing for older residents in a Washington, DC, suburb, turned to a young Washington Post reporter and said, “It’s like racism. It’s ageism.” A front-page story followed. A term entered the dictionary. And a truth that generations of older Americans had been living without language for finally had a name.
Butler later wrote in his Pulitzer Prize–winning book, Why Survive? Being Old in America, that in America, “childhood is romanticized, youth is idolized, middle age does the work, wields the power and pays the bills, and old age, its days empty of purpose, gets little or nothing for what it has already done. The old are in the way.”
That was 1975. We are now more than 50 years removed from those words, and the wound has not healed. It has simply evolved. What was once a slow, human prejudice, a hiring manager glancing at a graduation year and quietly moving on, has now been automated, accelerated, and embedded into systems that process thousands of resumes before any human being ever gets involved.
My Grandmother Still Works. She Has To.
My grandfather died in his early 50s, and my grandmother has been navigating this world alone ever since: without the financial safety net that a long partnership was supposed to provide, and without the Social Security that disappears the moment you earn just enough to keep a roof over your head but not quite enough to breathe comfortably. She is in her mid-70s. She still works. Not because she wants to. Because she has to.
Last week she told me she was looking for a new job. I had to stop her and remind her to remove anything from her resume that could reveal her age: graduation years, dates stretching back decades, anything that might signal to a screener, human or algorithmic, that she belongs to a generation they have already decided is not worth their time. Because if she leaves any of that in, there is a good chance the only thing she receives back is a polite, automated “thank you for your interest” email. And that email will not come from a person. It will come from a system.
That conversation stayed with me. Because we were not talking about discrimination in the abstract. We were talking about my grandmother’s grocery bill. Her electricity. Her dignity.
The Algorithm That Never Blinks
What scares me most is that nobody cares about my grandma, or yours. They are the forgotten. Not forgotten by accident, but forgotten by design, by systems built without them in mind, by algorithms trained on data that never included them, by a world moving so fast it stopped looking back to see who it left behind.
Eighty-eight percent of companies now use AI for initial candidate screening.
Adoption surged from 30% of employers in early 2024 to nearly 99% among Fortune 500 companies, with 57% to 78% of them using AI specifically for resume screening and candidate sourcing.
Think about what that number means for someone like my grandmother. Before she has spoken to a single person. Before anyone has heard her story, understood her work ethic, or recognized that four decades of lived professional experience is not a liability but an asset. A machine has already made a decision about her.
And that machine was trained on historical data. Which means it was trained on a world that already excluded people like her.
Traditional hiring methods have long struggled with bias. Amazon once abandoned an AI hiring tool after discovering it penalized resumes that included the word “women’s,” highlighting how these systems can inadvertently perpetuate historical discrimination.
If AI systems can encode gender bias so deeply that a single word triggers automatic penalization, imagine what they do with age signals. Graduation years. Employment dates stretching back to the 1980s. Job titles from industries that no longer exist under those names. Every marker of a long and full career becoming a quiet disqualifier in a system that was built for efficiency, not equity.
She Left When Everything Moved to Computers
I think often about a conversation I had with my late mother-in-law, Josephine. She had been a head nurse, and not just any nurse, but the kind who rose to the top of her field through decades of hands-on care, clinical intuition, and a deep knowledge of her patients that no system could replicate. I asked her once why she left when she still had so much to give. She answered simply:
“When everything moved to computers, I was done. I was not about to take everything I spent 40 years learning to do on paper and with my hands, and relearn it on a computer system with time constraints.” It was her choice not to make that transition. But it never felt entirely like her choice. It felt like the world made the decision for her and left her to accept it.
She was not pushed out loudly. There was no confrontation, no discriminatory firing, no obvious moment of injustice. She was pushed out quietly, by the accumulation of a thousand small signals that said the new system was not designed with her in mind; that her knowledge, her instincts, her decades of presence at bedsides and in crisis moments had never been translated into the new language; and that nobody was going to translate them for her.
That quiet exclusion is the cruelest kind. Because it does not give you anything to fight.
I Love Technology. That Is Exactly Why This Matters.
I want to be clear about something. I am not writing this as someone who fears technology or wants to slow its progress. I have loved technology my entire life. I remember the thrill of sliding a floppy disk into a drive and the magic of watching something appear on screen. I remember falling in love with a turquoise iBook in school and wishing I could bring one home. I was born a techy girl and I remain one. I have spent the last several years earning certifications in AI, evaluating AI models, and building companies that use AI as a tool for community empowerment.
But it is precisely because I love technology that I refuse to look away from what it does when we deploy it without asking hard enough questions about who it was built for and who it leaves behind.
The Questions Nobody Is Asking Loudly Enough
What does it say about how we value our elders that we have automated the first gate they must pass through to earn a living?
Who is thinking about how AI in hiring discriminates against and isolates vulnerable applicants, and calling it what it is: unjust?
Where is the silver lining for the worker in her 70s who cannot take her age off her lived experience, only off her resume?
How do we govern AI in hiring responsibly so that discrimination is not quietly baked into every screening process?
How do we protect the people who came before us from being filtered out by systems trained on a past that excluded them?
I am in my 30s and I already feel the edges of this. Job postings that ask for recent graduates or soon-to-be graduates are not subtle. They are signals. They are a younger candidate pool by design, disguised as a qualification. If I feel that at 30-something, I cannot imagine what my grandmother feels at 70-something, building a resume in a world that has moved on without asking if she was ready.
The Line Has to Come From Us
The technology does not draw its own lines. It expands into whatever space we allow it to occupy. And right now we are allowing it to occupy the space between an older worker and a livelihood, without nearly enough conversation about what that costs us as a society.
Our elders built the institutions we work in. They raised the generations who wrote the code. They paid into systems that were supposed to catch them. The least we can do is make sure the systems we are building now do not become one more way of telling them they are in the way.
They were here first. They deserve better than an automated rejection.
We must ask ourselves, as Dr. Butler asked 50 years ago, whether we are willing to settle for a technology that merely works efficiently, when so much more is possible.
AnnaMarie Callanta is a former Wells Fargo and Goldman Sachs finance professional, certified AI Model Evaluation Specialist, and MBA graduate with more than 14 AI certifications. She is the founder of Lit Youth AI, a community education initiative dedicated to teaching AI literacy and career readiness to students and families in underserved communities. She writes about the intersection of technology, family, equity, and the human cost of systems built without the most vulnerable.
Photo credit: Shutterstock/3rdtimeluckystudio