Original Article  |  Open Access  |  2 Jul 2023

Impact of different financial incentive structures on a web-based health survey: do timing and amount matter?

Conn Health Telemed 2023;2:200006.
10.20517/chatmed.2023.002 |  © The Author(s) 2023.

Abstract

Aim: Financial incentives improve response to electronic health surveys, yet little is known about how unconditional incentives (guaranteed regardless of survey completion), conditional incentives, and various combinations of incentives influence response rates. We compared electronic health survey completion with two different financial incentive structures.

Methods: We invited women aged 30-64 years enrolled in a U.S. healthcare system and overdue for Pap screening to complete a web-based survey after receiving a mailed human papillomavirus (HPV) self-sampling kit in a pragmatic trial. HPV kit returners (n = 272) and non-returners (n = 1,083) were allocated to one of two incentive structures: (1) unconditional: $5 pre-incentive only (n = 653); or (2) combined: $2 pre-incentive plus $10 post-incentive conditional on completion (n = 702). Chi-square tests evaluated whether survey completion differed by incentive structure within kit return groups or was modified by kit return status. For each incentive-by-kit status group, the cost-per-survey response was calculated as: ([number invited × pre-incentive amount] + [number of responses × post-incentive amount]) / number of responses.

Results: Overall, survey response was higher in kit returners vs. kit non-returners (42.6% vs. 11.0%, P < 0.01), and survey response was higher in the combined (20.1%) vs. unconditional (14.4%) incentive group (P = 0.01). Kit return status did not modify the association between incentive type and survey response (P = 0.52). Among respondents, time to survey completion did not differ by incentive type among either kit returners or non-returners. Among returners, the cost-per-survey response was similar between groups ($13.57 unconditional; $14.15 combined); among non-returners, the cost was greater in the unconditional ($57.78) versus the combined ($25.22) group.

Conclusion: A combined incentive can be cost-effective for increasing survey response in health services research, particularly in hard-to-reach populations.

Keywords

Conditional, costing, incentive, survey, unconditional

INTRODUCTION

Surveys of subgroups within larger trials can provide critical information to help interpret findings and support translation of interventions into practice[1]. Surveys are also the most efficient quantitative research method for collecting information from large numbers of individuals in population-based settings[2]. However, low response rates can bias results and decrease generalizability[3]. Response rates are correlated with demographic factors, including participant age, sex, and socioeconomic status[4]. They are also influenced by survey design features, including length and format, recruitment and invitation methods, content and style of questions, and the type and timing of incentives offered for participation[5,6]. While financial incentives increase survey response[7,8], less is known about the relative influence of unconditional incentives (guaranteed regardless of survey completion), conditional incentives (guaranteed after survey completion), and various combinations of the two. Unconditional incentives are generally more effective than conditional incentives for increasing survey response[5,9-11]. To our knowledge, however, only one health-related study surveying healthcare consumers has evaluated the influence of a combined incentive structure in which individuals are offered both an unconditional pre-incentive and a conditional post-incentive[12], and none has specifically compared a combined pre/post-incentive with an unconditional pre-incentive alone. Combined incentives incur lower upfront implementation costs when small-denomination pre-incentives are used; it is therefore important to determine whether these incentive structures yield similar or higher response rates compared with larger-denomination unconditional incentives.

Edwards et al.’s 2009 meta-analysis found no response differences between unconditional and conditional incentives for electronic questionnaires, or between larger and smaller financial incentives[5]. Other studies have evaluated conditional and unconditional incentives for postal and telephone-based surveys. A 2018 systematic review of studies involving health-related questionnaires concluded that unconditional monetary incentives were more effective than conditional incentives in increasing response rates to postal surveys, among both patients (response ratio: 1.15; 95%CI: 1.09, 1.21) and nonpatients (response ratio: 1.24; 95%CI: 1.12, 1.38)[9]. In a telephone-based survey of postpartum women, Beydoun et al. found that a combined $5 telephone card pre-incentive and $25 post-incentive was more effective than a $30 conditional incentive for increasing telephone tracing rates, with no difference in survey completion[12].

Studies have demonstrated mixed results regarding an incentive’s effect on response time. Parkes et al. showed that unconditional incentives significantly reduced survey response times compared with no incentive[13]. Blomberg et al. compared conditional and unconditional lottery tickets and found that participants who received a lottery ticket unconditionally took more time to respond and were the least likely to respond[14].

We nested a comparison of two incentive structures in a randomized trial evaluating a home-based human papillomavirus (HPV) testing strategy to increase cervical cancer screening in underscreened women (a hard-to-reach population)[15,16]. We invited two subgroups of women (those who did and did not return HPV kits) to complete a survey on their perspectives on this screening modality and allocated them to either an unconditional or combined incentive. We compared the effect of a $5 unconditional incentive (pre-incentive) with a combined $2 unconditional incentive mailed with the invitation letter plus a $10 conditional incentive sent after completing the survey (hereafter called combined incentive) on survey response, as well as the cost implications of these incentive strategies.

METHODS

Study design

From January to July 2015, we conducted a web-based survey with a subset of 30-64-year-old women who had been mailed an HPV self-sampling kit six months earlier as part of the Home-based Options to Make cervical cancer screening Easy (HOME) pragmatic trial at Kaiser Permanente Washington (KPWA) (ClinicalTrials.gov identifier: NCT02005510)[15]. Invitations were mailed weekly to women who did (n = 272) and did not (n = 1,083) return a kit until we reached the target sample size of 100 completed surveys per group. All eligible participants were allocated to receive either an unconditional $5 pre-incentive only (n = 653) or a combined incentive [unconditional $2 bill plus $10 conditional incentive upon survey completion] (n = 702). These amounts were chosen based on feasibility and to ensure the pre-incentive was not coercive.

Invitations included a research information sheet, instructions with a URL to access the survey and a QR code to scan, and a $2 or $5 cash pre-incentive; invitations in the combined group also explained the $10 conditional incentive. The research information sheet described a 5-10-minute web survey about experiences with a “health screening kit” mailed 6 months prior and provided a toll-free telephone number to call with questions, to request a paper version of the survey, or to “opt out” of having individual-level medical record data used for research. Initially, women who did not respond within 1-2 weeks received up to three telephone reminder calls and one voicemail over a 10-day period, asking whether the invitation had been received and offering to mail a paper version or e-mail the survey link. Invitees who did not complete a survey were automatically mailed a paper version after six weeks. After we observed low survey response rates following telephone reminders and automatic paper survey mailings, these strategies were discontinued after two and three months, respectively. The protocol was approved by the KPWA Institutional Review Board.

Data analysis

We used chi-square tests to compare the distribution of select covariates from electronic medical record (EMR) data (age, race, ethnicity, census block household income, and Charlson comorbidity score[17]) by incentive structure randomization group, separately for women who returned and did not return a kit. A chi-square test was also used to assess whether survey completion (yes/no) varied by incentive type (unconditional versus combined), and a chi-square test of homogeneity was used to evaluate whether the association between survey completion and incentive type was modified by kit return status. Among women who completed the survey, we used a Wilcoxon rank-sum test to examine whether the number of days between invitation mailing and survey completion varied by incentive type. All statistical tests were two-sided with an alpha of 0.05. Analyses were conducted using SAS version 9.4.
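As an illustrative sketch only (not the authors’ SAS code, which used individual-level data), the within-stratum and overall comparisons of survey completion by incentive type can be approximated from the aggregate counts later reported in Table 2 using Pearson chi-square tests, for example in Python with scipy:

```python
# Illustrative re-analysis of the aggregate counts in Table 2 (hypothetical
# Python sketch; the published analysis used individual-level data in SAS 9.4).
from scipy.stats import chi2_contingency

# Rows: combined vs. unconditional incentive; columns: completed vs. not completed.
tables = {
    "all women":         [[141, 702 - 141], [94, 653 - 94]],
    "kit returners":     [[67, 139 - 67],   [49, 133 - 49]],
    "kit non-returners": [[74, 563 - 74],   [45, 520 - 45]],
}

for label, table in tables.items():
    # correction=False gives the uncorrected Pearson chi-square statistic,
    # matching the chi-square tests described above; the test of homogeneity
    # (interaction by kit return status) is not reproduced here.
    chi2, p, dof, _ = chi2_contingency(table, correction=False)
    print(f"{label}: chi2 = {chi2:.2f}, df = {dof}, P = {p:.3f}")
```

Under these assumptions, the sketch recovers P-values consistent with those reported in the Results (approximately 0.006, 0.06, and 0.02, respectively).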

For each incentive-by-kit status group, we calculated the cost-per-survey response as: ([number invited × pre-incentive amount] + [number of responses × post-incentive amount]) / number of responses.
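As a worked check, the sketch below (illustrative Python, not part of the original analysis) applies this formula to the invitation and response counts reported in Table 2 and reproduces the per-group costs shown in Table 3:

```python
# Worked check of the cost-per-survey-response formula (illustrative only),
# using the invitation and response counts reported in Table 2.
def cost_per_response(n_invited, n_responses, pre_amount, post_amount=0.0):
    """([number invited * pre-incentive] + [number responses * post-incentive]) / responses."""
    total_cost = n_invited * pre_amount + n_responses * post_amount
    return total_cost / n_responses, total_cost

groups = {
    # label: (invited, responses, pre-incentive $, post-incentive $)
    "Unconditional, kit returners":     (133, 49, 5, 0),
    "Unconditional, kit non-returners": (520, 45, 5, 0),
    "Combined, kit returners":          (139, 67, 2, 10),
    "Combined, kit non-returners":      (563, 74, 2, 10),
}

for label, (invited, responses, pre, post) in groups.items():
    per_response, total = cost_per_response(invited, responses, pre, post)
    print(f"{label}: ${per_response:.2f} per response (total ${total:,.0f})")
# Output matches Table 3: $13.57 and $57.78 (unconditional); $14.15 and $25.22 (combined).
```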

RESULTS

Survey participation

Most women invited to the survey were aged 50-64 years, non-Hispanic, white, and had a Charlson comorbidity score of 0. Within kit returners and non-returners, distributions of EMR-derived covariates were similar between women randomized to the unconditional versus combined incentive [Table 1]. Of the 235 completed surveys, 192 were web-based and 43 were on paper. Overall, survey response was higher in kit returners vs. non-returners [(116/272) 42.6% vs. (119/1,083) 11.0%, P < 0.001] and higher in the combined [(141/702) 20.1%] vs. unconditional [(94/653) 14.4%] group (P = 0.01) [Table 2]. Kit return status did not significantly modify the association between incentive type and survey response (P = 0.52). Survey response was higher in the combined versus unconditional group among kit returners, although not statistically significantly so [(67/139) 48.2% vs. (49/133) 36.8%, P = 0.06], and was significantly higher among non-returners [(74/563) 13.1% vs. (45/520) 8.7%, P = 0.02]. Among respondents, time to survey completion did not differ by incentive type among either kit returners [combined: median (interquartile range, IQR) = 8 (4-22) days vs. unconditional: median (IQR) = 9 (5-22) days; P = 0.64] or non-returners [combined: median (IQR) = 14.5 (6-44) days vs. unconditional: median (IQR) = 14 (5-26) days; P = 0.35].

Table 1

Baseline characteristics by kit return status and incentive structure


| Covariates^c | Kit returners (n = 271^a): $5 (n = 132), n (%) | Kit returners: $2/$10 (n = 139), n (%) | P-value^d | Kit non-returners (n = 1,079^b): $5 (n = 519), n (%) | Kit non-returners: $2/$10 (n = 560), n (%) | P-value^d |
|---|---|---|---|---|---|---|
| Age Group, years | | | 0.30^e | | | 0.97^e |
| 30-39 | 14 (10.6) | 20 (14.4) | | 88 (17.0) | 97 (17.3) | |
| 40-49 | 33 (25.0) | 25 (18.0) | | 137 (26.4) | 150 (26.8) | |
| 50-64 | 85 (64.4) | 94 (67.6) | | 294 (56.6) | 313 (55.9) | |
| Race | | | 0.19^f | | | 0.41^e |
| White | 102 (77.3) | 111 (79.9) | | 380 (73.2) | 395 (70.5) | |
| Asian | 7 (5.3) | 13 (9.4) | | 40 (7.7) | 59 (10.5) | |
| Black/African-American | 8 (6.1) | 3 (2.2) | | 20 (3.9) | 15 (2.7) | |
| Other^g | 9 (6.8) | 10 (7.2) | | 42 (8.1) | 50 (8.9) | |
| Unknown | 6 (4.5) | 2 (1.4) | | 37 (7.1) | 41 (7.3) | |
| Ethnicity | | | 0.50^f | | | 0.26^e |
| Non-Hispanic | 121 (91.7) | 132 (95.0) | | 465 (89.6) | 489 (87.3) | |
| Hispanic | 5 (3.8) | 4 (2.9) | | 18 (3.5) | 31 (5.5) | |
| Unknown | 6 (4.5) | 3 (2.2) | | 36 (6.9) | 40 (7.1) | |
| Census Block Household Annual Income | | | 0.66^e | | | 0.70^e |
| $0-49,999 | 31 (23.5) | 28 (20.2) | | 120 (23.2) | 117 (20.9) | |
| $50,000-74,999 | 47 (35.6) | 43 (30.9) | | 188 (36.2) | 187 (33.4) | |
| $75,000-99,999 | 33 (25.0) | 41 (29.5) | | 127 (24.5) | 143 (25.5) | |
| $100,000+ | 14 (10.6) | 18 (12.9) | | 48 (9.2) | 58 (10.4) | |
| Unknown | 7 (5.3) | 9 (6.5) | | 36 (6.9) | 55 (9.8) | |
| Charlson Comorbidity Index^h | | | 0.13^f | | | 0.92^f |
| 0 | 101 (76.5) | 110 (79.1) | | 422 (81.3) | 449 (80.2) | |
| 1 | 21 (15.9) | 14 (10.1) | | 56 (10.8) | 62 (11.1) | |
| 2 | 3 (2.3) | 10 (7.2) | | 27 (5.2) | 30 (5.4) | |
| 3+ | 7 (5.3) | 5 (3.6) | | 14 (2.7) | 19 (3.4) | |

Table 2

Survey response by incentive structure and kit return status

| Group | Overall: invited, n | Overall: completed survey, n (%) | Unconditional ($5): invited, n | Unconditional ($5): completed survey, n (%) | Combined ($2 unconditional plus $10 conditional): invited, n | Combined: completed survey, n (%) | P-value^a (combined vs. unconditional) | P-value^b for interaction by kit return status |
|---|---|---|---|---|---|---|---|---|
| All women | 1,355 | 235 (17.3) | 653 | 94 (14.4) | 702 | 141 (20.1) | 0.006 | |
| Kit returners | 272 | 116 (42.6) | 133 | 49 (36.8) | 139 | 67 (48.2) | 0.058 | 0.515 |
| Kit non-returners | 1,083 | 119 (11.0) | 520 | 45 (8.7) | 563 | 74 (13.1) | 0.018 | |

Survey costs

Overall, the incentive cost-per-survey response was lower in the combined $2/$10 ($19.96) versus the unconditional $5 ($34.73) incentive group [Table 3]. Among kit returners, however, cost-per-survey response was similar between groups: $14.15 combined versus $13.57 unconditional. Among kit non-returners, the cost-per-survey response was less than half in the combined ($25.22) versus the unconditional ($57.78) group.

Table 3

Cost-per-survey response by incentive structure and kit return status

| Group | Unconditional ($5): completed survey, % | Unconditional ($5): cost-per-survey response | Unconditional ($5): total cost | Combined ($2 unconditional plus $10 conditional): completed survey, % | Combined: cost-per-survey response | Combined: total cost |
|---|---|---|---|---|---|---|
| All women | 14.4% | $34.73 | $3,265 | 20.1% | $19.96 | $2,814 |
| Kit returners | 36.8% | $13.57 | $665 | 48.2% | $14.15 | $948 |
| Kit non-returners | 8.7% | $57.78 | $2,600 | 13.1% | $25.22 | $1,866 |

DISCUSSION

Survey response rates were higher in women allocated to a combined incentive versus an unconditional pre-incentive only. Our survey was embedded in a randomized pragmatic trial designed to evaluate an intervention to increase cervical cancer screening uptake among members of an integrated healthcare delivery system who were underscreened and, therefore, relatively less engaged in the healthcare system than screening-adherent women[15]. Though response rates were much lower in kit non-returners, the combined incentive was associated with higher response in both groups and was less costly than the unconditional-only incentive among kit non-returners, the hardest-to-reach subgroup.

We found that a combined $2/$10 incentive (i.e., a small pre-incentive amount) resulted in higher response rates and lower total costs than a $5 unconditional incentive when response rates were relatively low. While we did not find any web-based surveys comparing unconditional and combined incentives, several studies have evaluated the relative influence of unconditional versus conditional incentives on survey response[5,9]. Though some studies have suggested a possible disparity between paper and web-based survey response rates, our study did not compare the two formats[9].

Receiving an unconditional financial incentive in advance of participation may foster trust and goodwill, thereby motivating a subject to return the favor by completing the survey[18]. In our study, all subjects received an unconditional incentive. The $2 bill, a less common denomination that grabs potential participants’ attention[11,19], and the added post-incentive may have further encouraged members of the combined incentive group to complete our survey. Edwards et al.’s 2009 meta-analysis found no response differences between unconditional and conditional incentives for electronic questionnaires, or when larger versus smaller financial incentives were used[5]. We are unaware of any other studies that have specifically compared the influence of unconditional pre-incentives with combinations of pre- and post-incentives in an electronic survey of healthcare consumers, and how these structures influence survey costs.

A short time to survey return may be important, especially if evaluation of an intervention is time sensitive. In our study, incentive type did not influence survey response time. Other studies suggest that response time may be dependent on incentive structures, types of incentives, or what the research participation entails[13,14].

There is value in including methodological studies to improve the design and yield of sub-studies conducted within pragmatic trials. While our sub-study was not originally designed to examine different incentive structures, we leveraged an opportunity to embed a methodological study within the survey sampling design. Other studies have evaluated multiple incentive strategies[5,9] and randomized more than 1,000 participants per group; in comparison, our sample was relatively small and precluded evaluating more than two incentive structures. We encourage others to consider evaluating other pre/post-incentive combinations.

Reflective of the underlying parent trial population, survey invitees were mostly white and non-Hispanic, and all were insured members of an integrated healthcare system. Thus, our findings may not be generalizable to other populations (e.g., underrepresented racial/ethnic groups, uninsured people). The incentive structures used in this study may also perform differently among populations with other health conditions or in research with non-survey-based data collection.

CONCLUSION

Our results suggest that combined incentives are preferable to unconditional-only incentives for increasing response to health-related surveys. Furthermore, when low response rates are expected, offering a small pre-incentive combined with a larger post-incentive could be more cost-effective than offering only a larger pre-incentive. Future health services research seeking to survey hard-to-reach populations may want to consider using combined incentives. It would also be worthwhile to explore the use of combined incentives to increase engagement in other types of health studies and interventions, such as digital health.

DECLARATIONS

Authors’ contributions

Writing - Original Draft, Visualization: Escudero JN

Conceptualization, Methodology, Writing - Review & Editing: Tiro JA

Conceptualization, Supervision, Writing - Review & Editing: Buist DSM

Formal Analysis, Data Curation, Writing - Review & Editing: Gao H

Project Administration, Writing - Review & Editing: Beatty T

Validation, Data Curation, Writing - Review & Editing: Lin J

Methodology, Validation, Writing - Review & Editing: Miglioretti DL

Conceptualization, Supervision, Funding Acquisition, Writing - Review & Editing: Winer RL

Availability of data and materials

Data will be made available without investigator support to researchers with adequate resources to cover the regulatory and data sharing costs. Data will be made available after approval of a concept proposal aligned with current data approvals, and with a signed data access agreement.

Financial support and sponsorship

This work was supported by the National Cancer Institute of the National Institutes of Health (grant R01CA168598).

Conflicts of interest

All authors declared that there are no conflicts of interest.

Ethical approval and consent to participate

The protocol was approved by the KPWA Institutional Review Board. ClinicalTrials.gov identifier: NCT02005510.

Consent for publication

Patients agree that their data will be used for research and publication.

Copyright

© The Author(s) 2023.

REFERENCES

1. O'Cathain A, Thomas KJ, Drabble SJ, Rudolph A, Hewison J. What can qualitative research do for randomised controlled trials? A systematic mapping review. BMJ Open 2013;3:e002889.

2. Ponto J. Understanding and evaluating survey research. J Adv Pract Oncol 2015;6:168-71.

3. White E, Armstrong BK, Saracci R. Principles of Exposure Measurement in Epidemiology: Collecting, Evaluating, and Improving Measures of Disease Risk Factors (2nd edn). Available from: https://academic.oup.com/book/6327 [Last accessed on 20 Jun 2023].

4. Roberts LM, Wilson S, Roalfe A, Bridge P. A randomised controlled trial to determine the effect on response of including a lottery incentive in health surveys [ISRCTN32203485]. BMC Health Serv Res 2004;4:30.

5. Edwards PJ, Roberts I, Clarke MJ, et al. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev 2009;2009:MR000008.

6. Dillman DA, Smyth JD, Christian LM. Internet, phone, mail, and mixed-mode surveys: the tailored design method. 4th ed. New York: John Wiley & Sons, Incorporated; 2014. Available from: https://www.wiley.com/en-fr/Internet,+Phone,+Mail,+and+Mixed+Mode+Surveys:+The+Tailored+Design+Method,+4th+Edition-p-9781118456149 [Last accessed on 29 Jun 2023]

7. David MC, Ware RS. Meta-analysis of randomized controlled trials supports the use of incentives for inducing response to electronic health surveys. J Clin Epidemiol 2014;67:1210-21.

8. Gibson PJ, Koepsell TD, Diehr P, Hale C. Increasing response rates for mailed surveys of Medicaid clients and other low-income populations. Am J Epidemiol 1999;149:1057-62.

9. van Gelder MMHJ, Vlenterie R, IntHout J, Engelen LJLPG, Vrieling A, van de Belt TH. Most response-inducing strategies do not increase participation in observational studies: a systematic review and meta-analysis. J Clin Epidemiol 2018;99:1-13.

10. Arora K, Cheyney M, Gerr F, Bhagianadh D, Gibbs J, Anthony TR. Assessing health and safety concerns and psychological stressors among agricultural workers in the U.S. Midwest. J Agric Saf Health 2020;26:45-58.

11. Falk D, Tooze JA, Winkfield KM, et al. A comparison of survey incentive methods to recruit rural cancer survivors into cancer care delivery research studies. Cancer Causes Control 2022;33:1381-6.

12. Beydoun H, Saftlas AF, Harland K, Triche E. Combining conditional and unconditional recruitment incentives could facilitate telephone tracing in surveys of postpartum women. J Clin Epidemiol 2006;59:732-8.

13. Parkes R, Kreiger N, James B, Johnson KC. Effects on subject response of information brochures and small cash incentives in a mail-based case-control study. Ann Epidemiol 2000;10:117-24.

14. Blomberg J, Sandell R. Does a material incentive affect response on a psychotherapy follow-up questionnaire? Psychother Res 1996;6:155-63.

15. Winer RL, Tiro JA, Miglioretti DL, et al. Rationale and design of the HOME trial: a pragmatic randomized controlled trial of home-based human papillomavirus (HPV) self-sampling for increasing cervical cancer screening uptake and effectiveness in a U.S. healthcare system. Contemp Clin Trials 2018;64:77-87.

16. Tiro JA, Betts AC, Kimbel K, et al. Understanding patients’ perspectives and information needs following a positive home human papillomavirus self-sampling kit result. J Womens Health 2019;28:384-92.

17. Deyo RA, Cherkin DC, Ciol MA. Adapting a clinical comorbidity index for use with ICD-9-CM administrative databases. J Clin Epidemiol 1992;45:613-9.

18. Schweitzer M, Asch DA. The role of employee flexible spending accounts in health care financing. Am J Public Health 1996;86:1079-81.

19. Doody MM, Sigurdson AS, Kampa D, et al. Randomized trial of financial incentives and delivery methods for improving response to a mailed questionnaire. Am J Epidemiol 2003;157:643-51.

Cite This Article

Escudero JN, Tiro JA, Buist DSM, Gao H, Beatty T, Lin J, Miglioretti DL, Winer RL. Impact of different financial incentive structures on a web-based health survey: do timing and amount matter? Conn Health Telemed 2023;2:200006. http://dx.doi.org/10.20517/chatmed.2023.002

About This Article

© The Author(s) 2023. Open Access This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, sharing, adaptation, distribution and reproduction in any medium or format, for any purpose, even commercially, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
