Handbook of Web Surveys. Jelke Bethlehem

refreshment is planned. At present, the Kauffman Firm Survey (KFS) is a panel study of 4,928 businesses founded in 2004 and tracked over their early years of operation. Until September 2020, the NORC Data Enclave at the University of Chicago managed secure remote access to the KFS confidential microdata files for researchers, free of charge. After September 2020, access to the KFS shifted to a fee‐based model.

      The objective of the panel survey is to provide information about the creation and development of new businesses (especially high‐technology and women‐owned businesses). Firm characteristics, revenue and expenses, profit and loss, and owner characteristics are collected, as well as, since 2007, information about predominant markets and Internet sales.

      2.2.4 TRENDS IN WEB SURVEYS

      Dillman, Smyth, and Christian (2014) describe changes in the survey environment from the 1970s to the 2000s, focusing on factors such as human interaction, trust that the survey is legitimate, the time required of each respondent, the attention given to each respondent, respondent control over access, and respondent control over whether to respond. These observations indicate that since the 1990s the relevance of human interaction and individual contact has been decreasing, due to the use of IT (i.e., computer‐assisted and web surveys) and the massive use of e‐mail. Trust in the relevance and legitimacy of surveys is very low, and the likelihood of refusal and of filtering out surveys (anti‐spam measures, disclosure rules) is very high. These observations are in line with developments in web surveys.

       All kinds of events and behavior occurring during the survey process are considered.

       The overall response rate is only a very simple measure of survey quality, although it is frequently used as an indicator. To some extent this measure can be useful in identifying weak points in the process (for instance, a large number of refusals might be due to a poor contact process), but it fails to consider that people who have web access, or who respond to a web survey, could differ significantly from other units.

       Overall response rates also give no information about the response propensity of different respondent subgroups (late versus early respondents, or sociodemographically different subgroups) or about respondent behavior.
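Subgroup response rates of the kind described above can be computed directly from the sample file. The following sketch is purely illustrative: the field names (`age_group`, `responded`) and the records are invented, not taken from any survey discussed here.

```python
# Illustrative sketch: response rate per subgroup from a sample file.
# Field names and records are hypothetical.
from collections import defaultdict

sample = [
    {"age_group": "18-34", "responded": True},
    {"age_group": "18-34", "responded": False},
    {"age_group": "35-54", "responded": True},
    {"age_group": "55+",   "responded": False},
    {"age_group": "55+",   "responded": True},
    {"age_group": "55+",   "responded": True},
]

def subgroup_response_rates(sample):
    """Responses divided by eligible units, per subgroup."""
    counts = defaultdict(lambda: [0, 0])  # group -> [responses, total]
    for unit in sample:
        counts[unit["age_group"]][1] += 1
        if unit["responded"]:
            counts[unit["age_group"]][0] += 1
    return {group: r / n for group, (r, n) in counts.items()}

print(subgroup_response_rates(sample))  # rate per group, e.g. 0.5 for "18-34"
```

Comparing such rates across subgroups (or between early and late respondents) is a first step toward the propensity analyses mentioned above.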

       In any case, response rates are becoming low, and there is a need to investigate the reasons. Incentives are seen as a possible solution. Göritz (2006, 2010, 2015) and Brown et al. (2016) suggest a generally positive effect of incentives in web surveys. Singer and Ye (2013) conclude that in all survey modes a prepaid cash incentive is the most effective; in this case, a mode other than the web is required to contact respondents. If the e‐mail contact option is adopted, Dillman, Smyth, and Christian (2014) comment that an electronic incentive sent to all sample members is likely the best option. With respect to sampling error, traditional probability samples are in many cases not easy to implement in web surveys, due to imperfect frames. Therefore, it is not possible to compute the sampling error, as the theory of statistical inference does not apply.
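To illustrate why a probability design is a prerequisite for computing the sampling error, consider the textbook case of simple random sampling without replacement (this formula is standard sampling theory, not specific to the surveys discussed here):

```latex
% Sampling variance of the sample mean under simple random sampling
% without replacement, with n units drawn from a population of size N:
V(\bar{y}) = \left(1 - \frac{n}{N}\right)\frac{S^2}{n},
\qquad
S^2 = \frac{1}{N-1}\sum_{k=1}^{N}\left(y_k - \bar{Y}\right)^2 .
```

The formula presupposes a known inclusion probability (here n/N) for every unit. In a self‐selection web survey these probabilities are unknown, so no design‐based variance of this kind can be computed.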

      As a consequence of this new paradigm, attention is turning to:

       How to face decreasing response rates. Possible solutions may be:
       - Keeping respondents focused on the relevant parts of the computer screen and keeping distraction to a minimum can help to obtain completed questionnaires. To accomplish this, studies based on eye‐tracking analysis are to be carried out.
       - An interesting strategy for improving response rates is to use mixed‐mode surveys (see Chapters 3 and 9). However, new problems arise with the mixed approach, since mode effects have to be considered when analyzing survey results. The occurrence and treatment of mixed‐mode effects need further investigation; Chapter 9 is devoted to this topic.

       How to use paradata (i.e., data collected about the interviewing process). Increasing attention will be devoted to the analysis of this type of data. In particular, paradata help to identify typologies of response behavior, explaining potential variations in participation in web‐based surveys and providing valuable insight into nonresponse and various aspects of response behavior. From a methodological point of view, behavioral analyses rely on the Cognitive Aspects of Survey Methodology (CASM) movement, and, in many empirical studies, the theory of planned behavior (TPB) model is applied (Ajzen, 1991). The main objective is to obtain a more comprehensive picture of how intentions form. For example, based on the TPB, two alternative models were empirically tested, in which the roles of trust and innovativeness were theorized differently: either as moderators of the effects that perceived behavioral control and attitude have on participation intention (moderator model) or as direct determinants of attitude, perceived behavioral control, and intention (direct‐effects model).
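A minimal sketch of paradata capture, assuming a hypothetical web survey instrument: the logger below records when each question is shown and answered, from which per‐question answer times (a common paradata measure) can be derived. Class and field names are invented for illustration.

```python
# Hypothetical paradata logger: records per-question timing events
# during a web interview session.
import time

class ParadataLogger:
    """Records when each question is shown and when it is answered."""

    def __init__(self):
        self.events = []  # list of (question_id, event, timestamp)

    def log(self, question_id, event):
        self.events.append((question_id, event, time.monotonic()))

    def answer_times(self):
        """Seconds elapsed between 'shown' and 'answered' per question."""
        shown, times = {}, {}
        for qid, event, t in self.events:
            if event == "shown":
                shown[qid] = t
            elif event == "answered" and qid in shown:
                times[qid] = t - shown[qid]
        return times

log = ParadataLogger()
log.log("q1", "shown")
log.log("q1", "answered")
print(log.answer_times())  # {'q1': <elapsed seconds>}
```

Timing data of this kind is one input for the behavioral analyses mentioned above, for example to distinguish careful respondents from speeders.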

       How to obtain representative web surveys and/or panels? Many access panels consist of volunteers, and it is impossible to evaluate how well these volunteers represent the general population. In any case, they constitute a non‐probability sample. Recent literature attempts to tackle the question of how to apply probabilistic recruitment to panels and how to draw inferences from them. One approach to correct for a lack of representativity is, for example, to apply propensity score methodology (Steinmetz et al., 2014). Propensity scores serve to reweight web survey results.
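The reweighting step can be sketched as follows. This is a toy illustration, not the method of Steinmetz et al. (2014): it assumes each respondent's propensity to be in the volunteer panel has already been estimated (e.g., by a logistic regression against a reference survey), and simply weights each respondent by the inverse of that propensity. The data values are invented.

```python
# Illustrative propensity-score reweighting: respondents from
# over-represented groups (high propensity) get low weights,
# under-represented ones get high weights. Data are hypothetical.

respondents = [
    # (survey answer, estimated propensity of panel participation)
    (1, 0.8),  # over-represented respondent type: weight 1/0.8
    (1, 0.8),
    (0, 0.2),  # under-represented respondent type: weight 1/0.2
]

def propensity_weighted_mean(data):
    """Inverse-propensity weighted mean of the survey answers."""
    weights = [1.0 / p for _, p in data]
    total = sum(w * y for (y, _), w in zip(data, weights))
    return total / sum(weights)

unweighted = sum(y for y, _ in respondents) / len(respondents)
print(round(unweighted, 3))                             # 0.667
print(round(propensity_weighted_mean(respondents), 3))  # 0.333
```

The weighted estimate shifts toward the under‐represented respondent, which is exactly the correction the propensity scores are meant to provide; the quality of the correction stands or falls with the model used to estimate the propensities.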

      Generally speaking, the methodology and quality of data collected in the area of socioeconomics could greatly benefit from:

      1 The development of suitable estimation methods aimed at capturing the bias and specific variance connected with the frame characteristics and participation process of this type of survey;

      2 Research, principally based on experimental designs, allowing the effects of various factors to be tested (for example, the effects of different types of question structure, various contact modes, etc.);

      3 Research, based on behavioral models, that allows response and participation processes to be analyzed and modeled in the context of the individual behavior of survey respondents.

      The Italcementi Group is a large Italian company. The following data refer to the situation when the study took place. With an annual production capacity of approximately 75 million tons of cement, it was the world's fifth largest cement producer, with companies in 22 countries around the world. Italcementi regularly monitors the working conditions and working climate in the company by means of a mixed‐mode survey: data are collected partly by web and partly by paper questionnaire forms.
