So, You Want to Build a Program?


This article will help dementia care service organizations develop and evaluate intervention programs in the absence of evidence-based solutions. This guidance is needed given the limited access family caregivers have to evidence-based intervention programs, and the need for organizations to use limited resources to develop and test new programs to serve families living with dementia. The article draws upon two case studies of interventions developed at an academic-service center, KINDER and Ayudando a Quien Ayuda; applies the Exploration, Preparation, Implementation, Sustainment (EPIS) framework to evaluate lessons learned in assessing and refining the two programs; and recommends ways organizations can refine interventions prior to efficacy testing.

Key Words:

family caregiving, intervention, community services, implementation, translation, evidence-based programs


Despite its devastating impact on persons living with dementia and their family caregivers, the COVID-19 pandemic also drove innovation in dementia care services and supports. Prompted by safety concerns and social distancing requirements, organizations in the aging network quickly shifted to deliver dementia care services remotely. Community organizations and researchers transitioned in-person programs to virtual and telephone-based delivery models and founded new services to address the evolving needs of families living with dementia (Cuffaro et al., 2020; Gitlin et al., 2022; Masoud et al., 2021).

These innovations will likely benefit families living with dementia long past the pandemic. For example, since the start of the pandemic, caregivers reported remote service delivery made it easier to participate in educational and therapeutic programs, by eliminating commutes and reducing the need for respite care (Lightfoot et al., 2021; White et al., 2022). In so doing, dementia care support services became more accessible for many caregivers.

The rapid conversion to remote service delivery during the COVID-19 pandemic provides insight into a persistent question facing dementia care services: What happens when new needs emerge more quickly than evidence-based solutions? Moreover, how does rapid implementation of innovative programming affect how we build toward evidence-based programs?

There is a lengthy history of practice-based solutions for aging and dementia care services developing outside the research context (Onken et al., 2014; Pillemer et al., 2003). Faced with the slow pace of knowledge translation for dementia care interventions (Gitlin et al., 2015; National Academies of Sciences, 2021), community organizations often are left with few options but to build their own solutions to meet client and community needs. The COVID-19 pandemic put into sharp relief a known disparity between translatable evidence-based dementia care programs—i.e., efficacious programs that are affordable and feasible to implement, as well as acceptable to users—and the need for organizations to serve the immediate needs of clients.

To untangle the rapid service response and parallel evaluation of programs generated during the pandemic, we describe our experiences pilot testing two remotely delivered dementia care interventions in a community service organization. Our purpose is twofold. First, to identify what worked in piloting new remote interventions in a service setting, including the multiple challenges encountered while doing so. Second, to shed light on how testing evidence-informed programs in community settings prior to establishing intervention efficacy affects how we build evidence for dementia care interventions. In so doing, we hope to help community-based dementia care services leverage their program innovations from the pandemic to better support families living with dementia.

Models of Implementation and Intervention Translation

Before describing each intervention, we introduce two models. The first is the National Institutes of Health (NIH) Stage Model of Behavioral Intervention, which is used to characterize and guide development of evidence for behavioral interventions (Onken et al., 2014). The second is the EPIS model of program implementation, which we found useful in examining barriers and facilitators to testing novel interventions in a service setting (Aarons et al., 2011).

National Institutes of Health (NIH) Stage Model of Behavioral Intervention

The NIH Stage Model of Behavioral Intervention was created to reflect the reality that developing interventions often begins in clinical and research settings (Onken & Kaskie, 2022; Onken et al., 2014). Evidence development, according to this model, occurs in six stages. Stage 0 is characterized by basic behavioral research that often informs interventions. Stage 1 focuses on developing and refining the intervention, including pilot studies to determine program acceptability and feasibility. Stage 2 pertains to efficacy testing in a controlled research setting, followed by Stage 3, efficacy testing in community settings. In Stage 4, interventions are tested for effectiveness. The last stage, Stage 5, is dedicated to implementation and dissemination. Intervention development may begin at any point, though the model’s authors note that movement between some stages is more likely than others. The interventions we describe are in Stage 1: they are research-informed programs still undergoing refinement, prior to formal efficacy testing.

EPIS Model of Implementation

We also employ the EPIS implementation model. Implementation models are typically used to integrate evidence-based programs into a practice setting. Yet, in our experience, implementation models like EPIS are informative even at the pilot stage to understand how an evidence-informed intervention fits within an organization, as well as to identify barriers and facilitators to future implementation. We chose the EPIS model because of its applicability to community organizations as diverse as those serving families living with dementia (Aarons et al., 2011; Moullin et al., 2019). However, we also recommend readers see Hodgson and Gitlin’s (2021) excellent systematic review, in which they apply an alternative implementation framework, ERIC, to dementia care intervention research.


The EPIS model divides implementation into four phases: exploration, preparation/adoption, implementation, and sustainment (for a more in-depth introduction to the EPIS model, see Moullin et al., 2019). At the exploration phase, there is growing awareness of a problem or need to change. In the preparation phase, an organization adapts to make changes, such as by identifying and addressing barriers and facilitators to implementation. During implementation, program changes take place. Sustainment refers to efforts to maintain changes. In our discussion, we focus mostly on the first three phases, which are most pertinent to pilot testing. Each phase is influenced by outer-context and inner-context variables that affect the likelihood of success. For example, at the exploration phase, awareness of a community need can emerge from outside the organization, such as from a new research report. Awareness also can grow from within an organization, such as from service provider observations.

Piloting KINDER and Ayudando a Quien Ayuda Interventions During COVID

We draw upon two new dementia care intervention programs tested during the early stage of the pandemic to describe facilitators and barriers we found while testing novel programs in service settings. Both interventions responded to community needs seen during the pandemic and were based upon prior research (i.e., evidence informed). We selected these programs for discussion because they represent different models of delivery (asynchronous online and telephone) and levels of implementation (individual- and organization-level).

Both programs were implemented at the University of Southern California’s Family Caregiver Support Center (USC FCSC). This organization serves approximately 846 family caregivers per year in Los Angeles County. Most caregivers served at the USC FCSC assist a family member living with dementia, and 60% of caregivers served by this program are Latino. Regular services include caregiver assessment, assessment-informed plans and follow up calls, resource referral, support groups, and education programs.

Case 1: The KINDER Intervention

Knowledge and Interpersonal Skills to Develop Exemplary Relationships (KINDER) is an asynchronous online psychoeducational intervention. KINDER was developed to support healthy caregiving relationships and prevent low-quality care, including verbal elder mistreatment. Prior to developing KINDER, the study team conducted nine focus groups with racially and ethnically diverse caregivers from across Los Angeles County (Avent et al., 2019). Modeled after programs to prevent intimate partner violence, KINDER focuses on building healthy relationships to better appeal to caregivers.

KINDER participants were sent eight weekly lessons by email. Each lesson contained a short video vignette, reading, quiz, and reflection. Topics included understanding dementia, seeking resources, communication, seeking help, and mental health. Video vignettes were story-based and followed members of a support group managing different relationship challenges, where group members would help identify solutions discussed in the reading.

KINDER was tested from March 2020 until September 2021 using a single-arm pre- and post-test design with support from Archstone Foundation. While the rollout of KINDER was planned prior to the pandemic, increased risk of elder mistreatment prompted an urgent need to develop education programs that could prevent mistreatment (Han & Mosqueda, 2020). Participants in this pilot study were recruited from among clients who attended the USC FCSC, such as through client newsletters and information shared by assigned family care navigators. KINDER was administered to 23 caregivers, although 50 caregivers expressed interest by completing online registration. Unfortunately, only seven of 23 participants completed all eight lessons of KINDER. Still, qualitative interviews with those caregivers who completed KINDER were overwhelmingly positive.

Application of the EPIS model helped uncover barriers to testing the KINDER program in a service setting, as well as facilitators that could be leveraged in the future to improve KINDER’s feasibility, acceptability, and eventual implementation in a service setting.

Exploration: Relevant to the exploration phase, the study team observed a strong perceived need for elder abuse prevention interventions among outside organizations, which facilitated the organization’s undertaking of KINDER. Archstone Foundation, for example, was a major funder of elder abuse research in Southern California and facilitated this project (Archstone Foundation & USC Keck School of Medicine, 2019). However, KINDER lacked buy-in from within the organization. Asynchronous, automated online intervention delivery was unaligned with normal services at the USC FCSC, which are high-touch and delivered by telephone. Limited buy-in from family care navigators meant that few recommended the KINDER study to clients.

Preparation: At the preparation phase, we again encountered internal barriers that undermined the program’s success. Due to delays in developing KINDER, the team cut short beta-testing of the automated components of the integrated program delivery and data collection web platform. As a result, the study team encountered multiple technology errors that undermined participation and evaluation data collection. For example, automated follow-up surveys were sent to caregivers before they had finished the lessons. This occurred when caregivers fell behind in completing weekly lessons, such as when caregiving demands left no time to review all content assigned each week. Despite these technology challenges, qualitative interviews with participants also revealed strengths at this phase. Program completers commented that the content, based on focus groups with caregivers completed prior to program development, felt authentic.

Implementation: Although the team encountered multiple technology errors, a facilitator to evaluating this program was strong communication between the investigators, the research coordinator, and the technology team. When technology issues occurred, the coordinator communicated with the programming team, which could address technology issues and limit adverse effects on participants’ experiences. For example, a high volume of registrants at one time signaled incoming participants were likely “bots” and not real caregivers. The study coordinator quickly identified this issue while visually assessing incoming study registration forms so that these cases were removed. Prompt response ensured timely follow up with actual participants and prevented enrollment of fraudulent cases. This experience also triggered the team to create a plan to prevent future enrollment of bots.


While the initial pilot of KINDER did not go as expected, this experience demonstrated important lessons. First, the study team learned the need to gain internal stakeholder interest at the organizational level, in addition to external support to aid recruitment and, in the future, service use. Importantly, we also identified a need for more planning, including beta-testing of automated intervention components. Still, where we encountered technology difficulties, communication between team members helped to overcome them.

Case 2: Ayudando a Quien Ayuda

The second program we describe, Ayudando a Quien Ayuda (Helping the Helper), was designed to connect Latino family caregivers to local community resources. Ayudando a Quien Ayuda is an ongoing program created through a partnership between AARP California, 211 LA, the USC FCSC, and Vision Y Compromiso. It emerged from an 18-month community needs assessment (2017–19) of the needs of Latino older adults and their caregivers. This assessment was conducted in partnership with the Latino Caregiver Coalition and St. Barnabas Senior Services and sponsored by AARP California.

The community assessment revealed that Latino caregivers often did not use local community resources that could help support high levels of direct care provided within families. To increase connection to local services to support Latino caregivers, eligible caregivers who call the 211 LA information line are asked if they would like to receive an outreach call from the USC FCSC. A navigator from USC FCSC then connects with these individuals, completes an assessment, creates a care plan to support the caregivers’ needs, and identifies tailored resources.

From the July 2019 program launch until May 2022, USC FCSC made 2,424 calls and completed 891 intakes and 423 caregiver assessments. AARP California’s program evaluation team determined six in ten caregivers (59%) were very satisfied with their experience, and more than half (53%) were extremely or very satisfied with how easy it was to get help. Application of the first stages of the EPIS model can reveal reasons for overall success of the Ayudando a Quien Ayuda program at this early stage.

Exploration. During the community needs assessment that led to the creation of Ayudando a Quien Ayuda, comprising the Exploration phase in the EPIS model, participating organization leaders identified a gap in the 211 LA information line services. There was no direct way to connect family caregivers to information most relevant to their needs, such as how to access respite care. Shared awareness of this community service gap drove interest among multiple organizations and funders to address this unmet need. But, despite shared recognition of a problem, several partner organizations showed early reluctance to address this need due to anticipated costs. For example, a local government organization withdrew due to concerns that solutions to address this gap would be costly and unsustainable.

Preparation. In the preparation phase, remaining stakeholders sought to overcome concerns about costs and sustainability. Modifying the 211 LA system itself was found to be infeasible. Instead, AARP California partnered with the USC FCSC, which already provided information and referral services to caregivers. By referring 211 LA caregiver calls to the USC FCSC, the community could leverage the program’s existing infrastructure and expertise in caregiving. Only later, during implementation, did the program team realize a need for further staff training and modification of workflow processes in response to this new referral source. Specifically, direct service staff spent more time on calls with caregivers referred through 211 LA, and these callers’ needs were greater than those of clients from other sources. Still, pre-implementation preparation may not have fully addressed this: COVID-19 seemed to exacerbate caller needs, as more callers required assistance with basic needs such as accessing food and housing.


Implementation. During implementation, the program team managed multiple unexpected challenges, such as more intensive client needs, by adopting an iterative approach. This included holding weekly meetings between staff and leadership with program partners to resolve challenges. These meetings were information-driven, as leadership took time to pull recorded 211 LA phone calls to understand caller interactions and then provided retraining to call staff when needed. These recordings also revealed opportunities to improve call scripts to clarify which types of services were appropriate for USC FCSC referral. Later, the project team realized they needed to clarify the distinction between service staff and the AARP California evaluation team, such as assuring callers that their personal information was not shared with the evaluation team. Another concern that arose early in the project was mistrust between staff who provided referrals and staff who received them. Mistrust was alleviated by conducting site visits, prior to COVID-19 social distancing requirements, to help each organization’s staff understand the other.

Summary of Lessons Learned

Our experiences with KINDER and Ayudando a Quien Ayuda demonstrate how testing program innovations within service settings provides a unique opportunity to refine content, delivery methods, and processes prior to efficacy testing. We hope that our experiences can guide dementia care service organizations that test their own program innovations, such as those responsive to the pandemic, so they may develop into evidence-based and translatable interventions.

Early testing of an intervention within a service setting presents multiple benefits. Pilot testing an intervention in a practice setting prior to efficacy testing provides a chance to modify the intervention so it will be more feasible to implement downstream (Hodgson & Gitlin, 2021). By implementing evidence-informed programs prior to efficacy-testing, we pre-emptively address the problem described as “fitting square pegs into round holes,” or trying to make evidence-based interventions fit into service settings. By starting with an initial implementation, we can begin the process of building an evidence base with more “rounded” interventions that may be more feasible to implement in the future (Onken et al., 2014). While this goal could also be accomplished by integrating stakeholder input during intervention design (Gitlin & Hodgson, 2015; Onken et al., 2014), some problems may not be foreseeable without an initial test-run in a service setting. We found this with KINDER, for example, where stakeholder focus groups initially supported an entirely asynchronous intervention, yet program participants later expressed their preference to interact with other caregivers.

Another benefit of initial testing of service innovations in service settings is the ability to identify likely barriers and facilitators to future implementation. While testing the Ayudando a Quien Ayuda program, the program team realized that clients who contacted the USC FCSC from 211 LA referrals had more intensive needs than previous clients, thus requiring greater staff time and training. A similar example is found in the Care to Plan (CtP) intervention. Care to Plan is an online intervention that provides tailored resource recommendations to caregivers of persons living with dementia. CtP was developed with extensive stakeholder input (Gaugler et al., 2016). More recently, the investigators conducted a small pilot in a healthcare setting to identify barriers and facilitators to implementing the program (Cha et al., 2022). Qualitative interviews revealed barriers and facilitators related to caregivers’ and providers’ experiences, such as the belief that using the recommended resources could be too time consuming.

Based upon our experiences, we recommend service organizations evaluate program modifications and responsive interventions to understand fit within their organization as well as opportunities for refinement. To prevent an over-commitment of resources at early stages in the intervention’s development, organizations might consider initially testing innovations on a small scale (Taylor et al., 2014). Once intervention components are refined following pilot testing, organizations that test new interventions in dementia care are encouraged to partner with academic institutions to engage in efficacy testing (Aarons et al., 2011; Hodgson & Gitlin, 2021). The ability to test both interventions in an academic service center for caregivers like the USC FCSC was advantageous for both programs, which leveraged existing infrastructure such as data collection tools and institutional review board review. As the pandemic enters a new phase, dementia care organizations may have the opportunity to return to in-person service delivery. Yet many organizations recognize the value of maintaining at least some of the remotely delivered programs generated or modified during the pandemic. Following this rapid period of innovation, we recommend dementia care organizations reflect on their service responses and look ahead to build the evidence for the programs they developed and modified that may continue to serve families living with dementia into the future.

Kylie Meyer, PhD, is assistant professor at the Frances Payne Bolton School of Nursing at Case Western Reserve University in Cleveland, Ohio. Nancy McPherson is state director for AARP California. Donna Benton, PhD, directs the USC Family Caregiver Support Center/LACRC, is assistant dean of Diversity, Equity & Inclusion, and associate research professor of Gerontology at the USC Leonard Davis School of Gerontology in Los Angeles.




Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health, 38(1), 4–23.

Archstone Foundation, & USC Keck School of Medicine. (2019). Elder Abuse and Neglect Initiative Legacy Report.

Avent, E. S., Rath, L., Meyer, K., Benton, D., & Nash, P. (2019). Supporting Family Caregivers: How Does Relationship Strain Occur in Caregiving Dyads? A Qualitative Study. Innovation in Aging, 3(Supplement_1), S289. doi: 10.1093/geroni/igz038.1066

Cha, J., Peterson, C. M., Millenbah, A. N., Louwagie, K., Baker, Z. G., Shah, A., Jensen, C. J., & Gaugler, J. E. (2022). Delivering Personalized Recommendations to Support Caregivers of People Living With Dementia: Mixed Methods Study. JMIR Aging, 5(2), e35847.

Cuffaro, L., Di Lorenzo, F., Bonavita, S., Tedeschi, G., Leocani, L., & Lavorgna, L. (2020). Dementia care and COVID-19 pandemic: a necessary digital revolution. Neurological Sciences, 41(8), 1977–9.

Gaugler, J. E., Reese, M., & Tanler, R. (2016). Care to Plan: An Online Tool That Offers Tailored Support to Dementia Caregivers. The Gerontologist, 56(6), 1161–74.

Gitlin, L., & Hodgson, N. (2015). Caregivers as therapeutic agents in dementia care: The context of caregiving and the evidence base for interventions. In R. Kane & J. Gaugler (Eds.), Family Caregiving in the New Normal (pp. 305–53). Academic Press.

Gitlin, L. N., Bouranis, N., Kern, V., Koeuth, S., Marx, K. A., McClure, L. A., Lyketsos, C. G., & Kales, H. C. (2022). WeCareAdvisor, an Online Platform to Help Family Caregivers Manage Dementia-Related Behavioral Symptoms: an Efficacy Trial in the Time of COVID-19. Journal of Technology in Behavioral Science, 7(1), 33–44.

Gitlin, L. N., Marx, K., Stanley, I. H., & Hodgson, N. (2015). Translating Evidence-Based Dementia Caregiving Interventions into Practice: State-of-the-Science and Next Steps. The Gerontologist, 55(2), 210–26.

Han, S. D., & Mosqueda, L. (2020). Elder Abuse in the COVID-19 Era. Journal of the American Geriatrics Society, 68(7), 1386–7.

Hodgson, N., & Gitlin, L. N. (2021). Implementing and sustaining family care programs in real-world settings: Barriers and facilitators. In Bridging the Family Care Gap (pp. 179–219). Academic Press.

Lightfoot, E., Moone, R., Suleiman, K., Otis, J., Yun, H., Kutzler, C., & Turck, K. (2021). Concerns of Family Caregivers during COVID-19: The Concerns of Caregivers and the Surprising Silver Linings. Journal of Gerontological Social Work, 64(6), 656–75.

Masoud, S. S., Meyer, K. N., Martin Sweet, L., Prado, P. J., & White, C. L. (2021). "We Don't Feel so Alone": A Qualitative Study of Virtual Memory Cafes to Support Social Connectedness Among Individuals Living With Dementia and Care Partners During COVID-19. Frontiers in Public Health, 9, 660144.

Moullin, J. C., Dickson, K. S., Stadnick, N. A., Rabin, B., & Aarons, G. A. (2019). Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implementation Science, 14(1), 1.

National Academies of Sciences, Engineering, and Medicine. (2021). Meeting the Challenge of Caring for Persons Living with Dementia and Their Care Partners and Caregivers: A Way Forward. The National Academies Press.

Onken, L., & Kaskie, B. (2022). Implementation Science at the National Institute on Aging: The Principles of It. Public Policy & Aging Report, 32(1), 39–41.

Onken, L. S., Carroll, K. M., Shoham, V., Cuthbert, B. N., & Riddle, M. (2014). Reenvisioning Clinical Science: Unifying the Discipline to Improve the Public Health. Clinical Psychological Science, 2(1), 22–34.

Pillemer, K., Suitor, J. J., & Wethington, E. (2003). Integrating theory, basic research, and intervention: two case studies from caregiving research. The Gerontologist, 43 Spec No 1, 19–28.

Taylor, M. J., McNicholas, C., Nicolay, C., Darzi, A., Bell, D., & Reed, J. E. (2014). Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Quality & Safety, 23(4), 290–8.

White, C. L., Barrera, A., Turner, S., Glassner, A., Brackett, J., Rivette, S., & Meyer, K. (2022). Family caregivers' perceptions and experiences of participating in the learning skills together intervention to build self-efficacy for providing complex care. Geriatric Nursing, 45, 198–204.