Friday, 2 March 2012

Increasing response rates to lifestyle surveys: A pragmatic evidence review

Abstract

Aims: Lifestyle surveys are often a key component of a local Joint Strategic Needs Assessment (JSNA), undertaken to inform public health planning. They are usually administered to a large number of people in order to provide a comprehensive profile of population health. However, declining response rates coupled with the under-representation of certain population groups in lifestyle survey data has led to doubts concerning the reliability of findings. In order to inform the design of their own lifestyle survey, NHS Calderdale commissioned an evidence-based review of the methodological literature relating to the administration of lifestyle surveys, with the specific aim of identifying practical and resource-efficient strategies shown to be effective for maximizing whole-population response rates.

Methods: A pragmatic review of the published literature was undertaken, specifically to explore the most practical and resource-efficient ways to maximize lifestyle survey response rates to the most commonly used methods (postal surveys, face-to-face interviews, telephone interviews and electronic surveys). Electronic databases including MEDLINE, CINAHL, DARE, EMBASE and PsycINFO were searched. Empirical evidence published in the last 10 years was identified and citation tracking performed on all retrieved articles. An internet search for 'grey literature' was also conducted.

Results: The postal questionnaire remains an important lifestyle survey tool, but reported response rates have decreased rapidly in recent years. Interviews and telephone surveys are recommended in order to supplement data from postal questionnaires and increase response rates in some population groups, but costs may be prohibitive. Electronic surveys are a cheaper alternative, but the empirical evidence on effectiveness is inconclusive. Careful planning and tailoring of survey design to the characteristics of target populations can increase response rates and representativeness of lifestyle survey data.

Conclusions: The results of this pragmatic review could provide a valuable resource for those involved in the design and administration of lifestyle surveys.

Key words

lifestyle survey; mixed methods; literature review; response rates

INTRODUCTION

Lifestyle surveys are traditionally used to collect detailed population information about individual behaviours that impact on health. They are often a key component of a local Joint Strategic Needs Assessment (JSNA), undertaken to inform public health planning. Lifestyle surveys tend to be lengthy and commonly include some of the following topic areas:

* smoking

* alcohol consumption

* body mass index

* dental health status

* diet and nutrition

* drug consumption

* mental health

* neighbourhood perceptions

* physical activity levels

* prevalence of disease

* self-reported general health

* sexual health history and behaviour

(Source: North West Public Health Observatory)

Postal questionnaires are frequently used to collect lifestyle survey data and are often the only financially viable option when collecting information from large populations.1 However, it has been reported that 'one-size-fits-all' survey modes do not achieve high or representative response rates, and that 'survey fatigue' toward the multi-questionnaire method may be developing.2 Indeed, a decline in response rates to postal questionnaires has been observed in recent years,3 alongside under-representation of certain groups in population-wide lifestyle survey data, which together undermine the reliability of any results. In order to boost response rates and increase the representativeness of data, it is now recommended that lifestyle surveys are conducted using other modes chosen to suit the characteristics of the target respondents.4,5

Alternative approaches to the postal lifestyle survey commonly include face-to-face interviews, telephone interviews and electronic surveys. Usefully, an overview of these methods has been collated and published as the Lifestyle Survey Toolkit by the South East Public Health Observatory. At the time of writing, an updated version of this toolkit is about to be published by the Association of Public Health Observatories (APHO). However, the literature relating to the design and conduct of lifestyle surveys specifically associated with increased response rates and representative data has not previously been collated. In order to inform the design of its own lifestyle survey, NHS Calderdale commissioned an evidence-based review of the methodological literature relating to the administration of lifestyle surveys, with the specific aim of identifying practical and resource-efficient strategies shown to be effective for maximizing whole-population response rates. This work was carried out as part of the National JSNA Dataset Project, and provides a pragmatic review of the relevant evidence.

METHOD

Electronic databases including MEDLINE, CINAHL, DARE, EMBASE and PsycINFO were searched for literature published in the last 10 years on the most effective administration of postal surveys, face-to-face and telephone interviews, and electronic lifestyle surveys, specifically in relation to maximizing response rates. A search for 'grey literature' was also conducted using the internet and citation tracking was performed on all retrieved articles. Unfortunately, there was a dearth of published literature specifically related to lifestyle survey work and therefore studies reporting on general survey design associated with increased response rates were also included in this review.

RESULTS

This work was conducted to provide a practical resource for those involved in the design and administration of lifestyle surveys. Therefore, only those studies reporting on survey methodologies effective at increasing response rates and/or representativeness were included in this pragmatic review. With this objective in mind, it was neither useful nor appropriate to include an assessment of study/evidence quality. Rather, the presentation (and subsequent discussion) of results focuses on the factors directly relevant to the conduct of lifestyle surveys.

Postal surveys

The procedure for a postal survey is relatively simple: a sample of names and addresses is drawn from a database, mailing labels are produced and the questionnaire is sent out with a covering letter. Reminder postcards and letters (sometimes with another copy of the questionnaire) are sent to encourage response. A systematic review1 of randomized controlled trials (RCTs) of strategies to increase response rates to postal questionnaires revealed the following:

* The odds of response were more than doubled when a monetary incentive was used and almost doubled when incentives were not conditional on response.

* Response was more likely when short questionnaires were used.

* Personalized questionnaires and letters increased response, as did the use of coloured ink.

* The odds of response were more than doubled when the questionnaires were sent by recorded delivery. They increased when stamped-addressed envelopes were used and when questionnaires were sent by first class post.

* Contacting participants before sending questionnaires increased response, as did follow-up contact and providing non-respondents with a second copy of the questionnaire.

* Questionnaires designed to be of more interest to participants were more likely to be returned, but questionnaires containing questions of a sensitive nature were less likely to be returned.

* Questionnaires originating from universities were more likely to be returned than were questionnaires from other sources, such as commercial organizations.

The results of this review have direct implications for the conduct of lifestyle surveys, as the majority of trials included were medical, epidemiological and health related. However, it should be noted that the review focused on increasing response rates and not whether the addition of any of these strategies increased representativeness. In addition, the specific differential effects of individual strategies on response were not explored.
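Because the review reports its findings as odds ratios, it is worth remembering that 'doubled odds' is not the same as a doubled response rate. The sketch below converts a baseline response rate to odds, applies an odds ratio, and converts back; the 50% baseline rate is a hypothetical figure chosen for illustration, not a value taken from the review.

```python
def apply_odds_ratio(baseline_rate, odds_ratio):
    """Convert a baseline response rate to odds, apply an odds ratio,
    and convert the result back to a response rate."""
    odds = baseline_rate / (1 - baseline_rate)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# Hypothetical baseline response rate of 50%; the review reports that a
# monetary incentive more than doubled the odds of response (OR ~ 2).
print(round(apply_odds_ratio(0.50, 2.0), 2))  # 0.67
```

So under this assumed baseline, doubling the odds lifts the response rate from 50% to roughly 67%, not to 100%; the absolute gain shrinks further as the baseline rate moves away from 50%.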

Interview surveys

Established ways of collecting data (such as the postal survey) are often inappropriate for some minority groups, particularly where high levels of illiteracy and lack of fluency in English exist - two key issues among some ethnic minority communities.6 It has been estimated that approximately 23% of immigrants in Britain have no functional skill in English and 70% cannot function fully in an English-speaking environment.7 It is now suggested that interviews should be carried out to supplement data collected from postal questionnaires in surveys that include ethnic populations. It is proposed that this would assist in overcoming language and literacy obstacles, thereby improving response rates.8

'Face-to-face' interview surveys are often conducted by a trained interviewer either 'in street', 'on the doorstep' or 'in house'. The interviewee can be a named respondent selected often at random from a list or database, a person selected from within a randomly selected household or an individual chosen to meet certain pre-defined criteria. A study conducted by Allison et al.9 found that combining postal questionnaires and interviews as part of a population survey with South Asians was an effective method to maximize response rates. The authors reported that if interviews had not been used the response rate would have been 35%, but with both methods it was 75%. It was also reported that agreement between the two methods of administration was good, with a trend toward closer agreement when the time between questionnaire and interview was shorter. Other successful strategies include offering translated versions of a questionnaire, or offering interviews with bilingual researchers.

Telephone surveys

Telephone surveys offer good coverage of a population and are widely used in market research, but less frequently in health surveys. One study found that recruitment rates were improved in minority ethnic populations by using telephone compared to face-to-face methods, and the authors concluded that audio-recorded methods are an acceptable alternative to written methods in study populations where literacy skills are variable.10 Another study demonstrated that first contact by telephone with mothers of young children living in areas of high deprivation yielded a recruitment rate of 77% to a home interview study.11

Electronic surveys

Electronic surveys are becoming more common as the proportion of people who have access to the internet increases. These can either be web-based or email surveys. In web-based surveys, individuals are invited to visit a website to complete the survey or are sent a link. Email surveys often have the survey embedded within the email, which can then be returned to the mailer. A systematic review of strategies to increase response rates to postal questionnaires,1 subsequently updated in 2009 to include electronic surveys,12 reported as follows:

* The odds of response were increased by more than a half using non-monetary incentives, shorter e-questionnaires, including a statement that others had responded, and a more interesting topic.

* The odds of response increased by a third using entry into a lottery with immediate notification of results, an offer of survey results, and use of a white background to typescript.

* The odds of response were also increased with personalized e-questionnaires, using a simple header, using textual representation of response categories and giving a deadline.

* The odds of response tripled when a picture was included in an email.

* The odds of response were reduced when 'survey' was mentioned in the email subject line, and when the email included a male signature.

Findings from a meta-analysis of 45 comparisons between internet and other survey modes showed that, on average, internet surveys yield an 11% lower response rate than other modes.13 It was reported that response rates to the internet mode are systematically reduced by the following factors:

* The sample recruitment base - response rates are lower from panel members (those who are used repeatedly in internet surveys), compared with one-time respondents.

* The solicitation mode chosen for internet surveys - there is a greater response rate to postal mail solicitation compared to email.

* The number of contacts - response rate is lower with more contacts (i.e. letter or phone-call reminders). Interestingly, this often works in the opposite direction for other survey designs.

Another meta-analysis of response rates to internet surveys offers further clarification to this last point.14 The decrease in response rate was greater among those receiving the largest number of reminders to internet surveys; the authors proposed this may represent email volume saturation, resulting in resistance to survey reminders.

An experimental comparison of internet and telephone surveys found that fewer people completed an online version of a questionnaire, compared with a telephone survey.15 An analysis of the demographic differences between the two samples selected for the survey showed that internet users were more educated, younger, more likely to be white and were more supportive of research and its impact on everyday life. Yet, overall these population groups were not well represented among the respondents. The authors concluded that internet surveys create somewhat different cognitive demands (or burden) from telephone surveys, and this seems to affect how certain types of people respond to particular formats.

Another study found that an internet survey achieved a comparable response rate to a postal survey when both were preceded by an advance mail notification.16 However, those who responded to the postal survey were significantly older than those who responded to the internet survey. A similar study comparing response rates to postal and internet surveys found that the internet survey had a significantly shorter reply time, but it had a lower response rate both overall and for each of three mailings attempted.17 The authors suggest one advantage of the postal survey over the internet survey is that response rates appear to increase as repeated mailings are administered. However, the internet survey was found to have lower item non-response and longer open-ended responses. Younger age groups, males, avid internet users and those with greater technological sophistication tended to be overrepresented in this type of survey. It should be acknowledged that internet surveys, by their nature, will exclude large parts of the population who do not have access to the internet or who are not computer literate.

DISCUSSION

The evidence presented seems to suggest that the use of a single survey mode does not achieve high or representative response rates. Hence it is becoming increasingly common to employ more than one method to overcome this problem.18 While mixed-methods surveys are recommended in order to increase response rates, a general understanding of all survey modes is needed to construct the most effective design for the target population(s). Understanding the population mosaic is an essential element, and involves effective communication and coordination with local experts and stakeholders, often drawn from different organizations, to inform planning. Irrespective of the choice of survey method, other factors that will impact on response rate must also be considered at the design stage.

Despite the rapid growth in electronic communication and the potential for cost savings this offers, the postal questionnaire remains an important lifestyle survey tool.19 However, the response rate to postal surveys has steadily declined in recent years.3 Face-to-face and telephone interview survey modes are recommended to supplement data from postal surveys and increase response rates from some population groups, but these methods are often associated with increased costs. Electronic surveys may offer a cheaper alternative, but the evidence suggests that the use of the internet does not increase overall response rates.19,20 One explanation could be the internet survey's shorter history, meaning less time and attention has been devoted to developing and testing motivational tools to increase internet survey response in comparison with, say, postal surveys (e.g. the use of personalization, precontact letters, follow-up postcards and incentives).16 However, the approaches that are beneficial for improving response rates to postal surveys may not readily translate as response rate benefits for internet-based designs. Research has revealed concerns for potential survey participants that are particularly salient for internet users, including security and the receipt of electronic 'junk mail' or 'spam'.21

A crucial aspect of conducting a lifestyle survey is the sampling procedure, where the overriding imperative is to gain survey responses from as representative a sample as possible. There are a number of sampling frame options, such as the electoral register, council tax registers, telephone directory and general practitioner lists, but it should be noted that all of these have their inaccuracies (and it is outside the remit of this review to include a discussion of them here). Measures should be taken to address these in-built errors, especially when targeting hard-to-reach populations. Ethical and data protection issues associated with access to individuals should also be considered.
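Whichever frame is chosen, the basic selection step is usually a simple random sample drawn without replacement. A minimal sketch follows; the frame of identifiers is invented for the example, standing in for a real source such as a GP list or register (which, as noted above, carries its own inaccuracies).

```python
import random

# Hypothetical sampling frame, standing in for e.g. a GP list or
# electoral register of 10,000 individuals.
frame = [f"person_{i:04d}" for i in range(10_000)]

random.seed(42)  # fixed seed so the illustrative draw is reproducible
sample = random.sample(frame, k=500)  # simple random sample, no replacement

print(len(sample), len(set(sample)))  # 500 500 (no duplicates)
```

Sampling without replacement guarantees no individual is mailed twice; stratified or quota designs would partition the frame first and draw within each stratum, which is one way to address the under-representation discussed above.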

In trying to maximize the response rate from under-represented groups, it is important at the design stage to be cognisant of the customs, values and beliefs of the target group(s). It is concerning that the majority of commonly used health outcome measures or questionnaires were developed in English-speaking countries and more often than not designed for use among ethnically homogenous ('white') populations.22 Research analysing the translation of local and national health surveys has uncovered numerous potential problems, not least insensitivity.23 For example, asking Muslim respondents whether they drink more at Christmas may be offensive to some sections of the population.

Recent examination of a number of translated questionnaires has helped to explain the reduced validity of certain instruments in South Asian populations.24 A more sensitive approach would be to ascertain cross-cultural relevance by, first, defining issues as salient and meaningful within a culture and, second, determining salience between and across cultures. This strategy requires a participatory approach whereby monolingual and bilingual representatives of the target population(s) need to be involved to generate items for inclusion in a mode of enquiry relevant to that group.6 It is suggested that open discussion between potential respondents and members of the investigative team should be attempted before carrying out research.10,25

It is also recommended that survey mode effects are taken into account when evaluating data quality, and complications that impact on the quality of data can arise around translation26 - for example, where different forms of the same language are used, or where the written and spoken forms of a language are dissimilar. This may mean that questions asked at interview are not expressed in the same way as questions written on the questionnaire or interview schedule. Another consideration when examining data quality is that internet survey respondents produce higher rates of 'don't know' answers and more item non-response compared with face-to-face survey respondents.27

Research on internet surveys shows that item non-response can be controlled by providing alternative response categories, and intricate graphics and multimedia can encourage response.28 As a final word of caution, zealous attempts to increase response rates may impair data validity; one study found a clear positive relationship between questionnaire retrieval rates and the mean proportions of missing answers.29 It is therefore advisable that future studies assess the validity of data from surveys gathered with intensive recruitment efforts and assess item non-response as a measure of data quality.

Limitations

A major limitation of this review was the lack of published literature specifically related to lifestyle survey work. The majority of the empirical evidence cited in this review was published in market research or educational research journals and any findings or recommendations therefore may not readily transfer to public health arenas.30 Of those studies that were conducted in a healthcare setting, many were trials conducted with patient populations who may be motivated to participate in surveys for different reasons. Additionally, the methods used in the conduct of clinical trials may not be directly transferable to lifestyle survey methodology and therefore the generalizability of such findings may be limited.

Conclusion

The limitations discussed make it difficult to propose prescriptive recommendations for the future conduct of lifestyle surveys. Nevertheless, there is sufficient relevant information from the existing literature that can be transferred to more general, 'best practice' recommendations. These can be summarized as follows:

* Offer mixed-mode surveys to small, representative target groups. Costs can be high, so careful planning should go into sample selection and recruitment of participants. The ethical issues associated with different strategies used to access samples should also be factored into designs. Caution should be applied when analysing and interpreting data collected from different survey modes for quality effects and comparability.

* Consider the survey length carefully - only collect data that is really needed. Shorter survey length was found to be an important factor in increasing response rates in two systematic reviews.1,12 Consider if other routinely collected data can be used to supplement lifestyle survey data. Again, ethical and data protection issues associated with access to individual data should be addressed.

* Cultural appropriateness and sensitivity of selected survey instruments needs consideration and, ideally, consultation with the target population(s).

* Tailor the design of the survey and maximization strategies to respondent characteristics. Evidence from this review has given an indication of what works with whom. Additional work reported elsewhere has also incorporated the evidence on the use of incentives in survey work.31

* Incorporate an assessment of non-response. This is often omitted from lifestyle survey work, possibly because the additional costs can be considerable. However, findings from lifestyle surveys that do not incorporate an assessment of non-response (i.e. a comparison of respondent and non-respondent characteristics/demographics and discussion of how non-response may affect findings) must be treated with caution as results may not provide a true representation of the population under study.
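The recommendations above on non-response can be made concrete with a small sketch: comparing the demographic profile of respondents against the target population flags under-represented groups. All figures below are invented for illustration only.

```python
# Hypothetical age-band proportions: target population vs respondents.
population = {"16-34": 0.35, "35-54": 0.35, "55+": 0.30}
respondents = {"16-34": 0.20, "35-54": 0.35, "55+": 0.45}

# A representation ratio well below 1 flags a group whose absence may
# bias lifestyle estimates (e.g. smoking or drinking prevalence).
for band in population:
    ratio = respondents[band] / population[band]
    flag = "under-represented" if ratio < 0.8 else "adequate"
    print(f"{band}: ratio {ratio:.2f} ({flag})")
```

In this invented example the youngest band responds at barely half its population share, so any whole-population estimate would need weighting or targeted follow-up for that group.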

ACKNOWLEDGEMENTS

This work was funded by NHS Calderdale and The NHS Information Centre as part of the National JSNA Dataset Project.

The requirement for Joint Strategic Needs Assessment (JSNA) was created in the Local Government and Public Involvement in Health Act 2007. It is proposed that JSNA will lead to stronger partnerships between communities, local government and the NHS, providing a firm foundation for commissioning that improves health and social care provision and reduces inequalities. JSNA identifies areas for priority action through local area agreements (LAAs). It helps commissioners specify outcomes that encourage local innovation and helps providers shape services to address needs. Lifestyle surveys are often a key component of JSNA.

References

1 Edwards P, Roberts I, Clarke M, DiGuiseppi C, Pratap S, Wentz R, Kwan I. Increasing response rates to postal questionnaires: Systematic review. British Medical Journal 2002; 324(7347): 1183

2 Hill A, Roberts J, Ewings P, Gunnell D. Nonresponse bias in a lifestyle survey. Journal of Public Health Medicine 1997; 19(2): 203-207

3 Curtin R, Presser S, Singer E. Changes in telephone survey non-response over the past quarter century. Public Opinion Quarterly 2005; 69: 87-98

4 Biemer PP, Lyberg LE. Introduction to Survey Quality. Hoboken, NJ: Wiley 2003

5 Groves RM, Presser S, Dipko S. The role of topic interest in survey participation decisions. Public Opinion Quarterly 2004; 68(1): 2-31

6 Hunt S, Bhopal R. Self-reports in research with non-English speakers. British Medical Journal 2003; 327(7411): 352-353

7 Free C. Breaking down language barriers. Some ethnic groups may have problems in getting as far as a consultation. British Medical Journal 1998; 317(7161): 816-817

8 Chaturvedi N, McKeigue PM. Methods for epidemiological surveys of ethnic minority groups. Journal of Epidemiology and Community Health 1994; 48: 107-111

9 Allison T, Ahmad T, Brammah T, Symmons D, Urwin M. Can findings from postal questionnaires be combined with interview results to improve the response rate among ethnic minority populations? Ethnicity & Health 2003; 8(1): 63-69

10 Lloyd CE, Johnson MR, Mughal S, Sturt JA, Collins GS, Roy T, Bibi R, Barnett AH. Securing recruitment and obtaining informed consent in minority ethnic groups in the UK. BMC Health Services Research 2008; 8: 68. doi: 10.1186/1472-6963-8-68

11 Kiezebrink K, Crombie IK, Irvine L, Swanson V, Power K, Wrieden WL, Slane PW. Strategies for achieving a high response rate in a home interview survey. BMC Medical Research Methodology 2009; 9: 46

12 Edwards PJ, Roberts I, Clarke MJ, Diguiseppi C, Wentz R, Kwan I, Cooper R, Felix LM, Pratap S. Methods to increase response to postal and electronic questionnaires. Cochrane Database of Systematic Reviews 2009(3): MR000008

13 Lozar Manfreda K, Bosnjak M, Berzelak J, Haas I, Vehovar V. Web surveys versus other survey modes: A meta-analysis comparing response rates. International Journal of Market Research 2008; 50(1): 79-104

14 Cook C, Heath F, Thompson RL. A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement 2000; 60: 821-836

15 Fricker S, Galesic M, Tourangeau R, Yan T. An experimental comparison of web and telephone surveys. Public Opinion Quarterly 2005; 69: 370-392

16 Kaplowitz MD, Hadlock TD, Levine R. A comparison of web and mail survey response rates. Public Opinion Quarterly 2004; 68(1): 94-101

17 Kwak N, Radler B. A comparison between mail and web surveys: Response pattern, respondent profile and data quality. Journal of Official Statistics 2002; 18: 257-273

18 Dillman DA, Smyth JD, Christian LM. Internet, mail and mixed-mode surveys: The tailored design method. Hoboken, NJ: Wiley, 2009

19 Kroth PJ, McPherson L, Leverence R, Pace W, Daniels E, Rhyne RL, Williams RL, Prime Net Consortium. Combining web-based and mail surveys improves response rates: A PBRN study from PRIME Net. Annals of Family Medicine 2009; 7: 245-248

20 Couper MP, Miller PV. Web survey methods: Introduction. Public Opinion Quarterly 2008; 72: 831-835

21 Sills SJ, Song C. Innovations in survey research: An application of web surveys. Social Science Computer Review 2002; 20: 22-30

22 Collins GS, Johnson MR. Addressing Ethnic Diversity in Health Outcome Measurement: A Systematic and Critical Review of the Literature 2004. Warwick: Warwick University, 2004

23 Vettini A, Bhopal R, Hunt S, Wiebe S, Hanna L, Amos A. Measurement of Risk Factors for Cancer in Ethnicity and Health Research: A Case Study of Tobacco and Alcohol. Report to the Scottish Cancer Group of the Scottish Executive. Edinburgh: Section of Public Health Sciences, Edinburgh University, 2002

24 Fischbacher CM, Bhopal R, Unwin N, White M, Alberti KG. The performance of the Rose angina questionnaire in South Asian and European origin populations: A comparative study in Newcastle, UK. International Journal of Epidemiology 2001; 30: 1009-1016.

25 Greenhalgh T, Collard A, Begum N. Sharing stories: Complex intervention for diabetes education in minority ethnic groups who do not speak English. British Medical Journal 2005; 330(7492): 628

26 Roberts PJ, Roberts C, Sibbald B, Torgerson DJ. The effect of a direct payment or a lottery on questionnaire response rates: A randomized controlled trial. Journal of Epidemiology and Community Health 2000; 54(1): 71-72

27 Heerwegh D, Loosveldt G. Face-to-face versus web surveying in a high-internetcoverage population: Differences in response quality. Public Opinion Quarterly 2008; 72: 836-846

28 Malhotra N. Completion time and response order effects in web surveys. Public Opinion Quarterly 2008; 72: 914-934

29 Eaker S, Bergstrom R, Bergstrom A, Adami HO, Nyren O. Response rate to mailed epidemiologic questionnaires: A population-based randomized trial of variations in design and mailing routines. American Journal of Epidemiology 1998; 147(1): 74-82

30 Scott P, Edwards P. Personally addressed handsigned letters increase questionnaire response: A meta-analysis of randomized controlled trials. BMC Health Services Research 2006; 6: 111

31 McCluskey S, Topping AE. Increasing Response Rates to Lifestyle Surveys: A Review of Methodology and 'Good Practice'. The National Joint Strategic Needs Assessment Dataset Project: University of Huddersfield, 2009

Authors

S McCluskey, BSc (Hons), PhD, Senior Research Fellow, Centre for Health and Social Care Research, HHR3/01, University of Huddersfield, Queensgate, Huddersfield, HD1 3DH, UK
Tel: 01484 471448
Email: s.mccluskey@hud.ac.uk

AE Topping, RGN, BSc (Hons), PGCE, PhD, Professor of Nursing, Centre for Health & Social Care Research, University of Huddersfield, UK

Corresponding author: S McCluskey, as above
