LeadingAge Magazine · January/February 2014 • Volume 04 • Number 01

Evaluating Your Innovations

January 09, 2014 | by Debra Wood, R.N.

Evaluation, whether done in-house or in a rigorously formal manner by professionals, must be part of any innovative program or policy adopted by aging-services providers. Here is a look at how some organizations evaluate their innovations, along with advice from professional researchers.

The desire to improve care or offer a better product or service permeates the American culture of innovation, but knowing whether new equipment, policies or staffing have actually enhanced quality or safety requires evaluation.

“Evaluation enhances the quality of our care, the experience of persons receiving the care, and ensures the care delivery is efficient and cost effective,” says Kimberly Van Haitsma, Ph.D., director of the Polisher Research Institute of the Madlyn and Leonard Abramson Center for Jewish Life in North Wales, PA.

“It has a tremendous amount of payback,” Van Haitsma adds. “Our research processes are designed to help us learn what we are doing well and also contribute to the scientific knowledge base of caregiving and health delivery.”

Lee Ratta, senior vice president of the Organizational Advancement Group at Front Porch in Burbank, CA, agrees about the importance of evaluation and finds it beneficial in achieving the organization’s mission.

“You have to evaluate your work against how well you are meeting residents’ needs,” Ratta says.

Evaluation is the only way to know if what you are doing is making a difference in people’s lives and is good practice, adds Robyn Stone, executive director of the LeadingAge Center for Applied Research and senior vice president of research at LeadingAge.

“It’s good business sense to be evaluating,” Stone says. “It’s about how you are using your investments, whether you are using them wisely and using the findings to do some quality improvement. It’s about seeing if something works, for whom and at what cost.”

Evaluation can address short-term issues as well as assess long-term impacts through both prospective and retrospective analyses.

Alice Bonner, Ph.D., R.N., associate professor at the School of Nursing and faculty associate at the Center for Health Policy at Bouve College of Health Sciences at Northeastern University in Boston, reports that one of the worst mistakes one can make is to start thinking about evaluation after a program has been launched. Evaluation must, she says, be the starting point of planning and should involve frontline staff members, residents and their families. Bonner is a past member of the Advancing Excellence in America’s Nursing Homes steering committee, and now serves the organization as a consultant.

“When we approach projects, we ask how it meets the needs of people we are serving,” says Davis Park, director of the Front Porch Center for Innovation and Wellbeing. “We ask what do we want to know and how will it be impactful.”

Framing the right questions is critically important, Stone adds. Thinking through everything that must be in place for a proper evaluation creates a better structure when implementing the innovation, she says.

“Every time you do something, whether a new training or a new piece of equipment, ask who it is intended to help and what we are going to get out of it,” Stone reports. “Then the organization must track it, learn where it made a difference or didn’t, and use this information to change course if necessary.”

That includes establishing, before any intervention takes place, a clear plan for measuring the activity both before and after the change, using methods such as focus groups, surveys and analysis of administrative data.

Front Porch pilots a program before rolling it out system-wide. It works with vendors and team members to outline objectives, develop a plan with survey tools and focus groups, and then evaluate and study the results, so it will know how well the new initiative has worked.

“We deploy, test and roll it out,” Park explains. “Communities [within Front Porch] can take the results and determine whether it is something worth doing.”

“We use data to determine how well we are doing and if there are opportunities for improvement,” says Scott Crespy, Ph.D., vice president for quality improvement at the Abramson Center. “We use the data to make changes and monitor the impact of our innovations and improvement over time.”

Staff members collect data to aid in the investigation. Some metrics are required by the Centers for Medicare & Medicaid Services and are a regular part of daily activities. The Abramson Center also uses tools developed by Advancing Excellence in America’s Nursing Homes: standardized Excel-based metrics that the center collects and then enters at the Advancing Excellence Web portal. Advancing Excellence compiles the information and returns it to providers, allowing the Abramson Center to compare its results to those of other providers.

The Abramson Center employed one of the Advancing Excellence tools to measure and, ultimately, reduce staff turnover to less than 10 percent. The lower turnover has helped with other quality improvement initiatives as well, Crespy adds, since less retraining is needed.

“Advancing Excellence tools are a terrific first stop for organizations interested in evaluating their innovations,” declares Van Haitsma, adding that the CMS Quality Assurance & Performance Improvement website also offers a variety of evidence-based tools.

“Doing a good evaluation is going to cost something,” Stone reports. “It involves people’s time and financial resources. It is probably the best money spent in terms of helping you with quality improvement and quick-course recalibrations when things aren’t working.”

Front Porch employs both self-evaluation and formal academic studies. If it has grant funding, the organization will seek out a professional partner to conduct the research. Volunteers assist with self-evaluation.

“Focus groups can be done with relatively little effort,” Stone reports, but they provide only one qualitative perspective. “Quantitative approaches such as pre- and post-surveys and quasi-experimental research, comparing your natural laboratory’s outcomes with those of a control group, are more rigorous ways to assess impact. Mixed methods, which combine qualitative and quantitative approaches, are the best way to evaluate your activities. Just thinking you’ll ‘know it by seeing it’ misses the boat.”

Internal evaluations require ground rules, oversight, informed consent, and independent review, Stone adds.

Many of Polisher’s formal research projects have received National Institutes of Health funding, including the development and testing of the Preferences for Everyday Living Inventory. That tool is now offered through Advancing Excellence, which follows it with a satisfaction questionnaire. That data allows caregivers to monitor how they are doing in terms of patient-centered care and adjust care plans accordingly.

“You do not need to have an advanced degree to do quality assurance and performance improvement,” Crespy says.

Organizations need not attempt to evaluate their innovations alone. Collaborating with an outside partner, such as an academic researcher, provides additional insight from an unbiased source as to whether goals are being met.

Bonner reports that many university faculty members will arrange for their postdoctoral or graduate students to conduct program evaluations, and that this may be easier to arrange than people think. Graduate students typically have a solid grounding in statistics and data analysis, and can serve as advisors in setting up the evaluation process as well as carrying it out.

“Don’t discount the options from local universities,” Bonner says.

Additionally, local departments of public health and area agencies on aging may be interested in helping with evaluations.

The Abramson Center has found that partnering with academia can spur internal improvement.

“Partnering has been a key in developing tools for more formal projects, and those tools became the foundation for measuring other pilots,” Park says.

Front Porch has partnered with the University of California, San Francisco (UCSF) and the University of Southern California (USC) to measure the effect of its projects. For instance, UCSF provided study resources on a pro bono basis for Front Porch’s Model eHealth Community for Aging program, which served to enhance digital literacy, increase access to resources and help older adults proactively manage their own health and wellbeing.

UCSF asked participants before and after the intervention about lifestyle, quality of life, health conditions and medical utilization. The researchers developed metrics and tools to measure success and are preparing a report.

Currently, Front Porch is collaborating with the USC Davis School of Gerontology, piloting a “telemental health” intervention for its Model eHealth Community for Aging, linking residents in affordable housing with therapists using iPads and videoconferencing technology. It received a $25,000 grant from the LeadingAge Innovations Fund.

(For more on the latest round of Innovations Fund grants, see the article, “Innovations Fund Fosters New Programs” in this issue.)

Objectives included determining whether videoconferencing and telemental health could serve as a strategy for addressing the mental health needs of underserved seniors, and establishing a replicable, sustainable model in an affordable housing setting. The activity has become a learning tool for students.

“Initial findings from the UCSF study show that the Model eHealth Community for Aging has been impactful, with high levels of satisfaction and engagement,” Park says. Some residents have said the intervention was better than seeing a doctor in person, because they did not have to leave their apartments.

Park finds residents enjoy participating in research, because it gives them an opportunity to share their feedback.

“The residents feel empowered, and it was meaningful to them,” Park says.

Keeping employees and residents engaged in the process requires keeping everyone updated about the results and sharing in the successes.

“When doing performance improvement, celebration is an essential component,” Crespy reports.

The Abramson Center holds monthly and annual events to celebrate successful innovations and encourage people to get involved and pitch in.

“Quality improvement can be interesting, fun and exciting,” Crespy concludes.