Tag Archives: Public Opinion Quarterly

Vignette Study

Institution: see Organisers & Acknowledgements

Program of study: International Research Workshop

Lecturer: Prof. Dr. Katja Rost (University of Zurich)


03.10.2013, 09:30 – 17:30

Room: n.s.

Max. number of participants: 20

Semester periods per week: n.s.

Credit Points: 5 CP for participating in the whole IRWS

Language of instruction: English


Vignette experiments provide “… short descriptions of a person or a social situation which contain precise references to what are thought to be (…) important factors in decision-making or judgment-making processes of the respondents…” (Alexander & Becker, 1978, 94). Within the description, the experimenter systematically varies the independent variables (Beck & Opp, 2001); the target variable, for instance a behavioral intention, is then elicited. Provided the vignettes are realistic, the number of factors chosen should mirror the complexity of the decision environment that decision makers normally face (Rossi & Anderson, 1982). A vignette experiment thus mimics the outcomes of “typical” decisions: participants weigh the significance of individual characteristics to arrive at an overall preference for one alternative and, as in reality, are involved in a trade-off. This capacity to handle the complexity of real decision making gives the design external validity while retaining the internal validity provided by the experimental features of the factorial survey (Taylor, 2006).

In short, vignette analyses are based on the following three concepts (Teichert, 2001): (1) Every situation consists of a bundle of characteristics. (2) Every participant makes an individual evaluation of the benefits of various combinations of characteristics. (3) The combination of the benefits of various characteristics provides the relative overall benefit to an individual.
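The three concepts above amount to crossing factor levels into a vignette universe and presenting each respondent with a subset to rate. The following is a minimal Python sketch of that construction; the factor names, levels, and wording are invented for illustration and are not taken from the workshop materials.

```python
import itertools
import random

# Hypothetical vignette dimensions for a fairness-of-pay study;
# factor names and levels are illustrative only.
factors = {
    "pay_level": ["1x average wage", "50x average wage", "500x average wage"],
    "firm_performance": ["declining", "stable", "growing"],
    "pay_transparency": ["disclosed", "undisclosed"],
}

# Concept (1): every situation is a bundle of characteristics. The full
# vignette universe is the Cartesian product of all factor levels (3*3*2 = 18).
universe = [dict(zip(factors, combo))
            for combo in itertools.product(*factors.values())]

def render(vignette):
    """Turn one factor combination into the short text shown to a respondent."""
    return (f"A CEO earns {vignette['pay_level']} while the firm is "
            f"{vignette['firm_performance']}; the pay is "
            f"{vignette['pay_transparency']}.")

# Concepts (2) and (3): each respondent rates a random subset (a "deck")
# of vignettes, e.g. on a fairness scale; regressing ratings on the factor
# levels then recovers the relative weight of each characteristic.
random.seed(42)
deck = random.sample(universe, k=5)
for vignette in deck:
    print(render(vignette))
```

In practice the decks are balanced across respondents (e.g. via fractional factorial or D-efficient designs) rather than drawn purely at random, so that all factor levels remain estimable.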

The workshop aims at establishing a theoretical and practical understanding of vignette experiments. We will discuss the method using concrete examples from my former research (Rost & Weibel, 2012; Weibel, Rost, & Osterloh, 2010).

Participants' current research ideas, projects, or materials can also be considered and discussed on request.


Alexander, C. S. & Becker, H. J. 1978. The Use of Vignettes in Survey Research. Public Opinion Quarterly, 42(1): 93-104.

Beck, M. & Opp, K.-D. 2001. Der faktorielle Survey und die Messung von Normen. Kölner Zeitschrift für Soziologie und Sozialpsychologie, 53: 283-306.

Rossi, P. H. & Anderson, A. B. 1982. The Factorial Survey Approach: An Introduction. In P. H. Rossi & S. L. Nock (Eds.), Measuring Social Judgments: The Factorial Survey Approach: 15-67. Beverly Hills, CA: Sage.

Rost, K. & Weibel, A. 2012. CEO Pay from a Social Norm Perspective: The Infringement and Re-Establishment of the Fairness Norm. Corporate Governance: An International Review, forthcoming.

Taylor, B. J. 2006. Factorial Surveys: Using Vignettes to Study Professional Judgement. British Journal of Social Work, 36: 1187–1207.

Teichert, T. 2001. Nutzenschätzung in Conjoint-Analysen. Wiesbaden: Gabler.

Weibel, A., Rost, K., & Osterloh, M. 2010. Pay for Performance for the Public Sector – Benefits and (Hidden) Costs. Journal of Public Administration Research and Theory, 20(2): 387-412.

You have to register for the 7th International Research Workshop to participate in this course.

Questionnaire Design

Institution: see Organisers & Acknowledgements

Program of study: International Research Workshop

Lecturer: Prof. Dr. Juergen H. P. Hoffmeyer-Zlotnik (University of Gießen)
Prof. Dr. Dagmar Krebs (University of Gießen)
Dr. Natalja Menold (GESIS)


01.10.2012, 09:00 – 12:30
02.10.2012, 09:00 – 12:30
04.10.2012, 09:00 – 12:30
05.10.2012, 09:00 – 12:30

Room: n.s.

Max. number of participants: n.s.

Semester periods per week: n.s.

Credit Points: 5 CP for participating in the whole IRWS

Language of instruction: English/German (depending on participants)


The lectures deal with the basic principles established in the best practice of questionnaire design. The theoretical background and the current state of research will be demonstrated with examples and practical exercises.

1. Cognitive process and cognitive pretests: Monday

To begin, the cognitive process of survey responding, including comprehension, retrieval, judgement, and response formatting, will be presented. For each of these phases, the demands on questionnaire design for questions about attitudes, opinions, and behavior will be explicated. It will be shown how cognitive pretest techniques (think-aloud, probing, confidence rating, paraphrasing) can help detect problems in questionnaires related to the cognitive burden placed on respondents.

2. Context effects and question wording: Tuesday

This section deals with the impact of the situational context given in questionnaires on judgements and answers. Regarding the principles of question wording, topics such as how to phrase questions, the use of terms, and problems with hypothetical, suggestive, negative, and double-barreled questions will be addressed. For each principle, examples of problems and their solutions will be given.

3. Constructing optimal answer formats: Thursday

Constructing answer formats that are optimal for the reliability and validity of questions covers topics such as the number of scale points, the midpoint, the use of unipolar and bipolar scales, labels of scale points, and ascending versus descending sequences. Related topics are the handling of open and closed questions and the use of non-opinion filters. Problems and their solutions are demonstrated with the help of examples and exercises.

4. Collection of sociodemographic data: Friday

The fourth part of this lesson demonstrates how to harmonise demographic and socio-economic variables in cross-national comparative survey research. Demographic and socio-economic variables describe the context in which a person is acting. In cross-national comparative research, standardised instruments or indices exist for only a very small group of variables. Aside from these instruments, there are rules for developing further measurement instruments for socio-demographic variables in cross-national research.


Bortz, J., & Döring, N. (2002). Forschungsmethoden und Evaluation für Human- und Sozialwissenschaftler. Berlin et al.: Springer.

Christian, L. M., Parsons, N. L., & Dillman, D. A. (2009). Designing Scalar Questions for Web Surveys. Sociological Methods & Research, 37, 393-423.

Christian, L. M., & Dillman, D. A. (2004). The Influence of Graphical and Symbolic Language Manipulations on Responses to Self-Administered Questions. Public Opinion Quarterly, 68(1), 57-80.

Christian, L. M., Dillman, D. A., & Smyth, J. D. (2007). Helping Respondents Get it Right the First Time: The Influence of Words, Symbols, and Graphics in Web Surveys. Public Opinion Quarterly, 71(1), 113-125.

Couper, M. P., Conrad, F. G., & Tourangeau, R. (2007). Visual Context Effects in Web Surveys. Public Opinion Quarterly, 71(4), 623-634.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. New Jersey: Wiley.

Dillman, D. A. (2007). Mail and Internet Surveys: The Tailored Design Method. New Jersey: Wiley.

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2004). Survey Methodology. New Jersey: Wiley.

Hippler, H.-J. (1988). Methodische Aspekte schriftlicher Befragungen: Probleme und Forschungsperspektiven. Planung und Analyse, 6, 244-248.

Holbrook, A. L., & Krosnick, J. A. (2010a). Social desirability bias in voter turnout reports: Tests using the item count technique. Public Opinion Quarterly, 74, 37-67.

Holbrook, A. L., & Krosnick, J. A. (2010b). Measuring voter turnout by using the randomized response technique: Evidence calling into question the method’s validity. Public Opinion Quarterly, 74, 328-343.

Krosnick, J. A., & Fabrigar, L. R. (1997). Designing rating scales for effective measurement in surveys. In L. Lyberg, P. Biemer, M. Collins, E. de Leeuw, C. Dippo, N. Schwarz, & D. Trewin, (Eds.), Survey measurement and process quality (pp. 141-164). New York: Wiley.

Krosnick, J. A., & Presser, S. (2010). Questionnaire design. In J. D. Wright & P. V. Marsden (Eds.), Handbook of Survey Research (Second Edition). West Yorkshire, England: Emerald Group.

Krosnick, J. A., Holbrook, A. L., Berent, M. K., Carson, R. T., Hanemann, W. M., Kopp, R. J., Mitchell, R. C., Presser, S., Ruud, P. A., Smith, V. K., Moody, W. R., Green, M. C., & Conaway, M. (2002). The impact of “no opinion” response options on data quality: Non-attitude reduction or an invitation to satisfice? Public Opinion Quarterly, 66, 371-403.

Krosnick, J. A., Judd, C. M., & Wittenbrink, B. (2005). The measurement of attitudes. In D. Albarracin, B. T. Johnson, & M. P. Zanna (Eds.), The Handbook of Attitudes (pp. 21-75). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Maitland, A. (2009a). Should I label all scale points or just the end points for attitudinal questions? Survey Practice, 04. AAPOR e-journal.

Maitland, A. (2009b). How many scale points should I include for attitudinal questions? Survey Practice, 06. AAPOR e-journal.

Porst, R. (2000). Question Wording – Zur Formulierung von Fragebogen-Fragen. Gesis How-to Reihe, Nr. 22; http://www.gesis.org/fileadmin/upload/forschung/publikationen/gesis_reihen/howto/how-to2rp.pdf.

Porst, R. (2008). Fragebogen. Ein Arbeitsbuch. Wiesbaden: VS Verlag für Sozialwissenschaften.

Saris, W. E., & Gallhofer, I. N. (2007). Design, evaluation, and analysis of questionnaires for survey research. Hoboken, New Jersey: John Wiley & Sons, Inc.

Saris, W., Revilla, M., Krosnick, J. A., & Shaeffer, E. (2010). Comparing questions with agree/disagree response options to questions with item-specific response options. Survey Research Methods, 4, 61-79.

Schuman, H., & Presser, S. (1981). Questions and answers in attitude surveys: Experiments in question form, wording, and context. New York: Academic Press.

Schwarz, N., Strack, F., & Mai, H. P. (1991). Assimilation and contrast effects in part-whole question sequences: A conversational logic analysis. Public Opinion Quarterly, 55, 3-23.

Stadtmüller, S., & Porst, R. Wie man die Rücklaufquote bei postalischen Befragungen erhöht. http://www.gesis.org/fileadmin/upload/forschung/publikationen/gesis_reihen/howto/how-to14rp.pdf

Sudman, S., Bradburn, N. M., & Schwarz, N. (1996). Thinking about answers: The application of cognitive processes to survey methodology. San Francisco: Jossey-Bass.

Tourangeau, R., Rips, L. J., & Rasinski, K. (2000, reprinted 2006). The psychology of survey response. Cambridge: Cambridge University Press.

You have to register for the 6th International Research Workshop to participate in this course.