Introduction
As part of our effort to better understand approaches to community-engaged research and how they may help ensure research is equitable and actionable, we embarked on a journey to learn from those doing it well (learn more in our introductory blog post, our interview with Andrea Jones and Kenneth Wells on Community-Partnered Participatory Research, and our interview with Harriet Yepa-Waquie and Nina Wallerstein on Community-Based Participatory Research). Dr. Melody Goodman graciously agreed to sit down with us to discuss the Research Engagement Survey Tool (REST), a research instrument that allows users to measure the quality (i.e., how well) and quantity (i.e., how often) of community-engaged research principles in practice.
During the interview, Dr. Goodman shared a great deal about the history of developing the REST instrument, the process of validating its content, insights into its use in practice, and directions for future research. In this post, we highlight some key learnings and insights from our conversation with Melody. We encourage you to review the entire transcript, as it contains much more detail, for example, about the different versions of the REST instrument (e.g., short vs. long form) and how partners or teams might approach using it throughout the lifecycle of a community-engaged research project.
On the Development of the REST Instrument
In our interview with Melody, we learned more about what motivated the development of the REST instrument. Melody’s work began with the Program to Eliminate Cancer Disparities (PECaD) evaluation team. In thinking about how to evaluate the work PECaD was doing, Melody and colleagues wondered, “How does community engagement impact the work that this center is doing? Can we really think about the science of community engagement, and can we quantify it in some way?” Melody knew that community engagement was valuable but wanted to know more about the actual science of engagement.
“Anecdotally, I know that I'm a better researcher because of my engagement work. The questions I ask are better; the way of implementing things is more realistic and sustainable. But this is all anecdotal from my years of doing this. So part of the development of this measure was, can we really think about the science of engagement, and can we quantify it in some way? And I know it's not going to be perfect, but really showing people that engagement in science is so imperative that it makes the science better.”
One challenge in answering these questions was that PECaD had several different partnerships around cancer and several types of community engagement projects, and the practice and focus of community engagement were not always the same across them. In her effort to identify a measure that might help answer these questions, Melody realized that no measure of the quality and quantity of community engagement principles existed.
"PECaD had been doing lots of community engagement; we just hadn't been evaluating it… We did a systematic literature review of the community engagement literature, the community-based participatory research literature, literature on community-academic partnerships, and literature on patient-centered outcomes research. And we tried to figure out what were these engagement principles?"
In this initial review, Melody and colleagues identified 11 engagement principles (see Table 1 below and this paper).
A Delphi Study to Validate Community Engagement Principles and the REST Instrument
To better understand the 11 community engagement principles they identified for the REST instrument, Melody and colleagues engaged academic experts and other stakeholders who do community-engaged research through a Delphi study (see the sidebar for more information on the Delphi process).
"We had researchers who do community-engaged research. We also had community members who have participated as partners in community-engaged research. We actually had more community, more people who would identify in the community stakeholder group than in the academic researcher group on the Delphi panel, because that was the voice that we were really trying to get."
Including the relevant voices in the development of an instrument was a point that Melody emphasized (and one we echoed in a previous blog post). Melody noted,
"We often don't include the people that the measure is designed for in the development process, right? Like if you were developing a measure for asthma you probably should include asthmatics when you're developing the measure, but we often don't do that. And I think that needs to change because one of the things that became really clear in the Delphi process was how much definitions matter, like how people thought about things and how people define things. And it was really important for us to make sure that when people took our measure, they were answering the questions that we were posing and not interpreting them in some different sort of way."
Related to this point, Melody made another intriguing comment about including a professional editor in the in-person meeting of their Delphi process. She found that the editor helped Delphi participants reach consensus, serving essentially as an unbiased arbiter of the meeting. As Melody described,
"I invited an editor to the in-person meeting and that was really smart because I think we reached consensus even though we weren't …trying to force it. I think we reached consensus because she was so good with words and where there was disagreement, she was able to find a word that we could get common ground on."
"The editor…had no stake in the game, so to speak, just to try to help us…. So it was actually really nice because she was like an unbiased person just trying to find language that potentially could help."
The result of the Delphi process was that the original 11 engagement principles were reduced to eight. This was not a simple elimination of three principles, however, but rather a reduction to seven principles plus a stakeholder-driven addition of one. (Specifically, three of the original principles stayed the same, four were modified, four were removed, and one was added.) The principle added through the Delphi process was trust: the stakeholders believed that trust needed to be measured as a principle in its own right.
In Table 1, we summarize the changes from the original 11 principles to the eight principles defined in the Delphi process. (For more details on the modification of the original principles and the Delphi process, see this paper. Note that a Delphi process is iterative and takes time: in Melody's case, the process involved five stages, four conducted using web surveys and one involving an in-person meeting.)
Table 1: Initial and Revised Community Engagement Principles
| Initial | Revised |
| --- | --- |
| Focus on the local relevance and determinants of health (Δ1) | Focus on community perspectives and determinants of health (Δ1) |
| Seek and use the input of all community partners (Δ2) | Partner input is vital (Δ2) |
| Involve a cyclical and iterative process in pursuit of objectives (Δ3) | Partnership sustainability to meet goals and objectives (Δ3) |
| Build on strengths and resources within the community/target population (Δ4) | Build on strengths and resources within the community or patient population (Δ4) |
| Foster co-learning, capacity building, and co-benefit for all partners | Foster co-learning, capacity building, and co-benefit for all partners |
| Facilitate collaborative, equitable partnerships | Facilitate collaborative, equitable partnerships |
| Involve all partners in the dissemination process | Involve all partners in the dissemination process |
| Acknowledge the community (✖) | Build and maintain trust in the partnership (✔) |
| Disseminate findings and knowledge gained to all partners (✖) | |
| Integrate and achieve a balance of all partners (✖) | |
| Plan for a long-term process and commitment (✖) | |
Note. ✖ indicates a principle that was dropped, ✔ indicates the principle that was added, and Δ# indicates a principle that was modified as part of the Delphi process. The three principles with no symbol next to them were retained unchanged.
Using the REST Instrument in Practice
We wanted to get Melody's insight into how partners or teams might use the REST instrument in practice. For instance, we asked Melody at what stage of the research process the REST measure should be used (e.g., at the end of a community-engaged research project or longitudinally throughout the life of the project).
We learned that there are two versions of the instrument (a comprehensive 32-item version and a condensed 9-item version) that might be used in various ways.
"We've been suggesting to people, if you're starting a new partnership or at the beginning of a study, you may want to start with the comprehensive, right. See how that works out. And then you may want to do the condensed version a few times, depending on how long your study is, like let's say it's a five-year study - you may want to do the condensed version annually and then the comprehensive version at the end of the project, right? So, you give it pre/post."
"But, what I know one team is doing and I think is a really great way to use it. They did the comprehensive version at their baseline and found that there were two or three engagement principles where they felt like they weren't performing well. And so they're only tracking those, right? So they're like, we're doing these other ones while we're only going to track these three engagements. So then they're just using the items related to those principles. They're tracking those over time, which I think is a really nice way to think about it. If we're doing great on all this stuff, let's focus on the thing that we're not doing so well at."
We also learned that those using the REST instrument found it helpful for starting conversations with community partners about the partnership itself. As Melody describes,
"I think to me, what's most exciting about it is, even the people that used it in our pilot study came back and said it was a great way to have a conversation with our community partners about partnership. So for me, it was exciting to see people use it in a real way, like let's now really talk about our partnership. Is this where we are? Is this where we want to be? Where do we want to be?"
"People said, it was a really good conversation starter and maybe it provided a forum for people to talk about the project that they often don't talk about, which is how is the partnership working? Whereas when you meet with your project team, you're usually talking about, are we on time? Have we accomplished the task? But you often don't talk about how are we functioning as a partnership? Let's reflect on that, let’s think about what we can do better."
We also learned that the REST instrument might be used to gauge how a partnership is evolving over time. As Melody elaborates when describing the use of the instrument in practice,
"And we've told people, no one likes to take a test and do bad, right? So it's not a test at the baseline. It was a new partnership. You shouldn't have these high levels of engagement - that wouldn't make sense, right? … It takes some time to build a good partnership. And so the goal should be growth, right? The goal shouldn't be to be excellent at baseline. The goal should be that your partnership is developing over time and most people who do engagement work are committed to doing it, across multiple projects."
Tools and Resources
To learn more about Melody’s work and measuring community engagement, we encourage you to look at the resources and papers below. As Melody stated in the interview, "We encourage people to use [the REST instrument] and let us know how [the REST instrument] works."
- Research Engagement Survey Tool
- Evaluating community engagement in research: Quantitative measure development
- Reaching consensus on principles of stakeholder engagement in research
- Development and validation of a brief version of the research engagement survey tool
- A study examining the usefulness of a new measure of research engagement