The COVID-19 pandemic, lockdowns, and restrictions have forced researchers and evaluators to shift away from traditional in-person data collection, as face-to-face interviews and observations are, in many instances, not permitted, too risky, or unethical.
Dakota Cintron, our E4A Postdoctoral Scholar, had a high-level discussion with Dr. Nadia Diamond-Smith about her work recruiting participants to take online surveys through Facebook and Instagram advertisements during the pandemic. While this blog post serves as an introduction to using Facebook to reach participants and collect primary data, it does not go into detail on some critical topics, such as the ethical considerations of using Facebook and Instagram in research and distrust of social media platforms.
Research on the use of social media for recruitment and data collection is complex and ongoing. Dr. Diamond-Smith's insights are especially relevant now, as researchers and evaluators confront the challenge of collecting data safely and ethically during the COVID-19 pandemic.
Dakota Cintron (DC): Dr. Diamond-Smith, would you mind sharing a bit about yourself and the work that you do?
Nadia Diamond-Smith (NDS): I am an assistant professor in the Epidemiology & Biostatistics Department at UCSF and also part of the Institute for Global Health Sciences. My work is mostly global, primarily in South Asia, though I recently started to do some work in the U.S. My work sits mainly at the intersection of women's status and empowerment and access to reproductive and maternal health care services. I look at things like nutrition and family planning, preconception and pregnancy care, and a little bit of postpartum care. I am trained as a public health demographer, so I think a lot about methods, samples, and populations. I have recently started doing more mHealth work on both the intervention and the data collection/research sides; that is, using different apps or mobile technologies for actual interventions, and also evaluating these interventions or understanding trends by recruiting populations over social media.
DC: How are you using social media sources in your recruitment?
NDS: I first started using Facebook ads to recruit women for an anemia study in India. Then, with the start of the COVID-19 pandemic, I started collecting data via Facebook and Instagram to try to understand how the pandemic and associated lockdowns, anxiety, and other fears might influence decisions to use and access reproductive and maternal health care services.
Colleagues heard about the work in India and wanted to try something similar in the U.S. In July, we sent out surveys using Facebook and Instagram ads. We recruited about 5,000 women to ask about their access to and use of the full spectrum of reproductive and maternal health care services, as well as fertility intentions and impacts of COVID-19 on violence, economics, mental health, etc.
DC: What motivated your strategy to use Facebook and Instagram ads?
NDS: At the beginning of the pandemic, everybody was thinking about how to collect data to understand what’s happening. In South Asia, we would typically go out into the field and interview women, which wasn’t going to happen. From my previous experience using Facebook ads, I knew it was remarkably easy to recruit a large number of participants very quickly. There is bias in many ways. However, it seemed like a good approach, and really one of the only options, to get a simple snapshot of what was happening with at least a subset of the population.
DC: How effective were your strategies to get a representative sample?
NDS: I knew that Facebook was disproportionately white and skewed a little bit older, so I wanted to make sure to run ads that had pictures of women of color and younger-looking women. Also, since I was recruiting with questions about pregnancy, family planning, and postpartum care, I wanted to make sure that the ads captured images of women at different parts of their life course. The images weren't just a bunch of pregnant women, because I also wanted women who weren't pregnant or who already had babies. So, my design partners tried to design ads with a lot of different images that different types of women might identify with.
One of the biggest challenges with this type of research is whether it is representative. There are people who have a Facebook account; among those, there are people who use Facebook a lot; then there are people who are willing to click on a survey ad; and there are people who are willing to answer questions once they click through. There are many open questions about the selection of who's answering, but I think that's true for any survey. People forget that we rely a lot on telephone surveys, for example, and only something like 3% of people are willing to answer a phone survey. There is still inherent bias, and people are skeptical about online approaches, and they're not without fault, but I think they still hold a lot of promise.
DC: What percent of people who see an ad click on it, and what percent ultimately enroll in the study?
NDS: For the India study (I haven't analyzed this yet for the U.S.), about 3.6 million people were shown the ads. Of those, about 1.7%, or 63,000 or so, clicked on an ad. Of those people who clicked, about 1.5% did the survey. But these ads are really cheap to run, so you can feed an ad to 3 million people for very little money and get 5,000 responses pretty quickly at a time when other forms of recruitment are more challenging.
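To make those funnel percentages concrete, here is a minimal sketch in Python using the approximate figures Dr. Diamond-Smith quotes for the India study. These are rough per-campaign rates, so the arithmetic will not exactly reproduce the final sample sizes she reports, which came from running campaigns until recruitment targets were reached:

```python
# Approximate ad-funnel figures from the interview (India study).
impressions = 3_600_000      # people shown the ads
click_rate = 0.017           # ~1.7% clicked
completion_rate = 0.015      # ~1.5% of clickers finished the survey

clicks = impressions * click_rate
completions = clicks * completion_rate

print(f"Clicks: {clicks:,.0f}")            # ~61,200, consistent with "63,000 or so"
print(f"Completions: {completions:,.0f}")  # ~900 per campaign at these rates
```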
DC: Do you think there is any way to address or quantify selection bias?
NDS: For the U.S. data, we'll compare our population's demographics to the National Longitudinal Survey of Youth and apply weights to get a better sense of who's actually answering. Other people design their studies to address this on the front end: you can target pretty narrow geographic codes and say, for example, that you want at least one man and one woman to answer from a specific zip code, and Facebook will flood that zip code with ads until that target is reached.
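For readers unfamiliar with this kind of post-stratification weighting, here is a minimal sketch in Python/pandas. The weighting cells and population shares below are hypothetical placeholders, not figures from Dr. Diamond-Smith's study or from the NLSY; the idea is simply to weight each respondent by the ratio of a cell's population share to its sample share:

```python
import pandas as pd

# Hypothetical respondents, each assigned one weighting cell (e.g., race x age group).
sample = pd.DataFrame({"cell": ["white_18_29", "white_30_44", "black_18_29",
                                "black_18_29", "white_30_44", "white_18_29"]})

# Hypothetical population shares for the same cells, e.g., from a reference survey.
population_share = {"white_18_29": 0.30, "white_30_44": 0.40, "black_18_29": 0.30}

# Each respondent's weight = population share / sample share for their cell.
sample_share = sample["cell"].value_counts(normalize=True)
sample["weight"] = sample["cell"].map(lambda c: population_share[c] / sample_share[c])
print(sample)  # overrepresented cells get weights below 1, underrepresented above 1
```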
DC: Are there ways to weed out bad actors who are not truly answering the survey?
NDS: We deleted responses where people answered too quickly or took too long. We also added what's called a honey pot question, which asks the respondent to answer "c," and we deleted all the responses from people who didn't select c. We also deleted anything that came from the same IP address as another response or from outside the country. We didn't have a huge number of fraudulent responses in either the U.S. or the India study. Other people who have used similar approaches have had a really tough time and received many fraudulent responses. I think the way incentives were structured in these studies made a difference: since we wanted more than one round of data, we only gave respondents an incentive when they answered two rounds of the survey in the U.S., and in India, respondents who answered four rounds of the survey were entered into a raffle for an iPad mini. I believe the delayed nature of the incentives led to less fraudulent data; however, I would love to design a study to actually test this!
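As a concrete illustration, here is a minimal sketch of those data-quality screens in Python/pandas. The column names, thresholds, and example rows are hypothetical; the interview does not specify exact cutoffs for "too fast" or "too slow":

```python
import pandas as pd

# Hypothetical responses; in practice these fields come from the survey platform.
responses = pd.DataFrame({
    "duration_sec": [45, 600, 650, 480, 500, 700],
    "honeypot":     ["c", "a", "c", "c", "c", "c"],  # respondents were told to answer "c"
    "ip":           ["ip1", "ip2", "ip3", "ip4", "ip5", "ip5"],
    "country":      ["US", "US", "US", "IN", "US", "US"],
})

clean = responses[
    responses["duration_sec"].between(120, 3600)  # drop too fast or too slow (hypothetical cutoffs)
    & (responses["honeypot"] == "c")              # drop failed honey pot checks
    & ~responses["ip"].duplicated(keep=False)     # drop every response sharing an IP address
    & (responses["country"] == "US")              # drop out-of-country responses
]
print(clean)  # only the third row passes all four screens
```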
DC: Will you be able to retain study participants for longitudinal follow-up?
NDS: This is the big question. With the U.S. study, we recruited people in July, and we're planning to follow up in the next couple of weeks, four months after the initial survey, so I don't know yet whether we will succeed. In our India study, only 53 people answered all four surveys from that initial sample of 6,000, though about 500 people answered at least twice. I spoke with some colleagues who tried to recruit a cohort or panel in the U.S. and had pretty low follow-up, even though they used a combination of text messages, emails, and phone calls. With COVID, I don't know if people will be more or less likely to answer a follow-up survey now.
DC: Is there anything in particular you’d recommend people consider if they are thinking about using social media platforms for recruitment?
NDS: If you want to get a representative sample, decide ahead of time exactly which groups you want to target, what locations, age groups, or other demographics, and stratify your recruitment so that you have at least one or two people in each of those groups; a rough sketch of this idea follows below.
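Here is a minimal sketch of that up-front stratification in Python. The locations, age groups, and quota size are hypothetical illustrations, not values from Dr. Diamond-Smith's studies:

```python
from itertools import product

# Hypothetical strata: every combination of location and age group gets a quota.
locations = ["CA", "TX", "NY"]
age_groups = ["18-29", "30-44", "45-64"]
min_per_stratum = 2  # "at least one or two people in each group"

quotas = {(loc, age): min_per_stratum for loc, age in product(locations, age_groups)}

def still_open(counts: dict) -> list:
    """Return strata that have not yet hit their quota, so ads can keep
    targeting them (as Facebook does when you set per-geography targets)."""
    return [s for s, q in quotas.items() if counts.get(s, 0) < q]

# Example: CA 18-29 is filled; everything else still needs respondents.
print(still_open({("CA", "18-29"): 2, ("TX", "30-44"): 1}))
```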
People also need to think carefully about designing the survey itself. I often design surveys with many different response options because I want to collect very nuanced data, but people are most likely taking the survey on their phone, responding very quickly, and not scrolling down to the bottom. So fewer options might be better. It also helps to avoid anything that people have to type in (like an age) and, in general, to have as few questions as possible, which we, as researchers, never want to do.
Another consideration is the sensitivity of the questions. We were asking about things like reproductive health and violence. Some people have argued that respondents won't want to give honest answers about these topics on an online survey from Facebook, or one advertised on Facebook, due to distrust of the platform. But I think you could argue the reverse: it may be easier to respond to a question about a sensitive issue by clicking a button in the privacy of your own home than to answer it over the phone or in person to someone who came to your house. Again, more research is needed to really understand how people answer questions differently across forms of data collection and recruitment, but I see no reason to universally dismiss data collected in this manner.
DC: Thank you for your time and thoughts, Dr. Diamond-Smith. Using social media platforms for survey recruitment is an important topic that needs further research and discussion. There are clear tradeoffs: we can quickly obtain thousands of responses with very little money, but, as you point out, there may be concerns about distrust of social media and privacy. We look forward to hearing more about your work as it evolves!
Additional Information & Resources
Below are some informational resources on using social media platforms for survey recruitment. For more information about Dr. Diamond-Smith’s research, you may reach her at nadia.diamond-smith@ucsf.edu.
- Gelinas, L., Pierce, R., Winkler, S., Cohen, I. G., Lynch, H. F., & Bierer, B. E. (2017). Using social media as a research recruitment tool: ethical issues and recommendations. The American Journal of Bioethics, 17(3), 3-14.
- Melanthiou, Y., Pavlou, F., & Constantinou, E. (2015). The use of social network sites as an e-recruitment tool. Journal of Transnational Management, 20(1), 31-49.
- Samuel, G., & Buchanan, E. (2020). Guest editorial: Ethical issues in social media research. Journal of Empirical Research on Human Research Ethics, 15(1-2), 3.
- Welch, T. D. (2020). Is Facebook a viable recruitment tool? Nurse Researcher, 28(1).
- Kosinski, M., Matz, S. C., Gosling, S. D., Popov, V., & Stillwell, D. (2015). Facebook as a research tool for the social sciences: Opportunities, challenges, ethical considerations, and practical guidelines. American Psychologist, 70(6), 543.
- Pedersen, E. R., & Kurz, J. (2016). Using Facebook for health-related research study recruitment and program delivery. Current Opinion in Psychology, 9, 38-43.
- Wozney, L., Turner, K., Rose-Davis, B., & McGrath, P. J. (2019). Facebook ads to the rescue? Recruiting a hard to reach population into an Internet-based behavioral health intervention trial. Internet Interventions, 17, 100246.