WSU Surveys

Best Practices

The goal of survey research is to systematically collect data in order to describe trends, beliefs, or experiences within a given population. At Washington State University, surveys are used to:

  • Measure attitudes, experiences, and institutional effectiveness
  • Review, assess, and evaluate existing programs
  • Collect data for student and faculty research studies

This page includes a collection of best practice tools that can be integrated with a survey strategy to maximize data quality and ensure survey integrity.

WSU Survey Research Best Practices

Develop your research question(s)

  1. Have specific goals for the survey
    • A good survey starts with specific goals and clear research questions. Surveys should be used to develop statistical information about a subject, not to produce predetermined results. Your research goals will determine what kinds of questions you write and who you include in your sample.
  2. Check if data already exists
    • Once you know what your research question is, check to see if there is already data available on the subject. The ideal time to do a survey is when an individual or institution has a question or challenge that cannot be answered sufficiently using current data. Also consider whether a survey is the best way to collect the required data. Some research questions might be better answered using qualitative interviews, focus groups, or secondary data analysis. Sometimes these methods also serve as useful precursors to survey research and can be used to further develop research questions. Two places to check for pre-existing data about people at WSU include institutional research and human resources (see links below).
  3. If you decide to conduct a survey, decide which mode would work best. Potential survey modes (how people take your survey) include paper-and-pencil, in-person interview, phone interview, and web survey.
    • Factors to consider:
      • Target population
      • Available sample frames
      • Survey length
      • Cost
      • Complexity of the survey
      • Speed of data collection
      • Contact (or correspondence) mode(s)
    • For more information see Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (2014), 4th Edition, Don A. Dillman, Jolene D. Smyth, and Leah Melani Christian
      • Guidelines for Designing Telephone Questionnaires (p. 262)
      • Guidelines for Designing Web and Mobile Questionnaires, (p. 303)
      • Guidelines for Designing Paper Questionnaires (p. 352)
    • Or, drop by the Survey Design Clinic to discuss which survey mode would be best for your project.

Draft survey instruments

  1. Construct questions that you think will answer your research question
    • Important things to consider when drafting your questionnaire include definition of topics, concepts and content; question wording and order; and questionnaire length and format. It is important that the questions included in your survey accurately operationalize the concepts needed to answer your research question. All relevant concepts must be measured and, when possible, multiple measures should be included for key concepts.
  2. Write questions that are easy to answer and are in a logical order
    • Questions should be written using words that are easily understandable and unambiguous no matter who is answering them; 6th-grade reading level language is generally recommended. Questions should be in an order that makes sense to respondents: try to ask all questions about one topic together, and place demographics at the end of the survey. Avoid asking more than one question at a time (double-barreled questions). For example, it would be inappropriate to ask “Do you drive to work? If yes, how far?” in a single question. Scales for multiple-choice questions should be carefully considered, and response categories should line up with question wording (e.g., “To what extent do you agree or disagree that…” goes with responses from strongly agree to strongly disagree). Examples of popular scalar response categories include:
      • 5 point – Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree
      • 4 point – Strongly Agree/Agree/Disagree/Strongly Disagree
      • 4 point, 1 direction – Always/Often/Sometimes/Never
  3. Draft the correspondence you plan to send to your survey participants
    • In addition to drafting your questionnaire, you will need to draft a message to potential participants asking them to take your survey. Depending on your sample and project specifications, contacts can be made through mail, phone, or email. Correspondence should look professional to show the legitimacy of the project and make people feel safe responding. When you ask people to take your survey, state whether or not results will be confidential. Your correspondence should highlight why your project matters and why the person should take your survey. Multiple contacts (following up with non-respondents) are highly recommended, and failure to make them can bias results. Make sure to include your contact information in correspondence to respondents, and be sure to honor any requests to opt out of future communications.
  4. Include how results will be used in correspondence to participants
    • Your correspondence must include a description of whether or not responses are confidential and/or anonymous. In addition, you should describe how results will be used (e.g., for a class presentation, a published paper, or to inform policy).
  5. Make sure the visual design of the survey instrument is clear and neat
    • Question numbers, page numbers, and section subtitles help respondents navigate through your survey. Fonts should be easy to read and consistent throughout the questionnaire. If you are completing a mixed-mode survey with paper and web response options, the visual design of the web survey should be as close as possible to that of the paper survey to minimize measurement error due to mode difference. For more help on visual design bring or email your questionnaire to the survey design clinic.
  6. For additional information on questionnaire design, see:
    • Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (2014), 4th Edition, Don A. Dillman, Jolene D. Smyth, and Leah Melani Christian
      • Chapter 4: The Fundamentals of Writing Questions
      • Chapter 5: How to Write Open- and Close-Ended Questions
      • Chapter 6: Aural vs. Visual Design of Questions and Questionnaires
      • Chapter 7: Ordering Questions and Testing for Question Order Effects
    • Margins of Error: A Study of Reliability in Survey Measurement (2007), Duane Alwin.
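As a minimal illustration of the scale guidance above, the sketch below maps responses from the 5-point agree/disagree scale to numeric codes for analysis. The data, function, and variable names are hypothetical and not part of any WSU tool:

```python
# Hypothetical sketch: coding Likert-type responses for analysis.
# The category labels mirror the 5-point scale listed above.

# Numeric codes for a 5-point agree/disagree scale
LIKERT_5 = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

def code_responses(responses):
    """Map text responses to numeric codes; None marks item nonresponse."""
    return [LIKERT_5.get(r) for r in responses]

raw = ["Agree", "Strongly Agree", "Neutral", "Disagree", ""]
coded = code_responses(raw)
print(coded)  # [4, 5, 3, 2, None]
```

Coding unanswered or unrecognized responses as None (rather than, say, 0) keeps item nonresponse distinct from a real scale value during analysis.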

Sampling

  1. What is a sample?
    • A sample is a subgroup of respondents, typically selected at random from a list containing all members of the target population, that the researcher studies in order to generalize findings to populations at Washington State University and beyond. See Coverage Error.
  2. Sample versus Census
    • In most cases, it is inappropriate to survey an entire population of respondents (e.g., the entire university population). Doing so leads to low response rates and survey fatigue, which negatively impact data quality and response rates across the University, and a census is often costly. Instead, opt for a sample rather than a census unless there is a legitimate reason to survey everyone (e.g., the population is very small or a census is required by law).
  3. Representative Samples
    • A representative sample is a sample that accurately represents the population you are studying. While any type of sampling introduces some margin of error (see Sampling Error), you can minimize this error by using a representative sample. Such samples fall under the category of probabilistic sampling techniques and are the most desirable for statistical analysis and inference. To determine whether your sample is representative of the WSU population, compare it to population-level data, which can be found on the Office of Institutional Research’s website.
  4. Non-Probability Sampling
    • Sometimes it is difficult to obtain a random, or probabilistic, sample. If you do not have the resources to obtain a random sample, or if you are working with a very specific population, you can opt for a non-probability sample.  For this sampling technique, you select respondents by their accessibility rather than random selection.  While these samples are much easier to obtain and are often used for exploratory purposes, the results cannot be used to generalize to the whole population if you use this sampling technique.  Your sample will most likely not be representative of the population since the most accessible respondents were the ones who completed the survey.  You are also not able to calculate sampling error since you do not know the probability that a respondent could be chosen from the population.
  5. Obtaining a random sample at WSU
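As a minimal, generic sketch of the probability sampling described above (not a WSU-specific procedure), the snippet below draws a simple random sample from a hypothetical sampling frame using Python’s standard library:

```python
import random

# Hypothetical sampling frame: a list of unique member IDs for the
# target population (e.g., obtained from an institutional list).
frame = [f"ID{i:05d}" for i in range(1, 10001)]  # 10,000 members

random.seed(42)  # fix the seed so the draw is reproducible and documentable
sample = random.sample(frame, k=500)  # simple random sample, no replacement

# Every member had an equal, known probability of selection: k / N
selection_prob = len(sample) / len(frame)
print(len(sample), selection_prob)  # 500 0.05
```

Because every member of the frame has a known, equal chance of selection, sampling error can be quantified, which is what distinguishes this design from the non-probability approach described above.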

Design, program, and test survey instruments

  1. Use appropriate survey software
    • Only use web survey software that meets WSU criteria (e.g., Qualtrics, REDCap, or the SESRC) to program web surveys. If you are using a third-party vendor such as Amazon’s Mechanical Turk (M-Turk) to recruit participants, do not program your survey in M-Turk. Instead, provide a link to one of the approved survey platforms (e.g., Qualtrics) and use M-Turk to compensate respondents. Do not link respondents’ worker IDs to their responses.
  2. Visual Design
    • Make sure that the visual design of your survey is clear for the respondent to read, and that the questions are logically arranged and grouped. This allows respondents to answer questions accurately. See Measurement Error.
  3. Survey Modes
    • If you are doing a survey that utilizes multiple modes (e.g., mail and web), make sure that the formats are consistent across modes.
  4. Testing your Survey
    • Thoroughly test your survey by having a variety of people take it before you start collecting data. Consider conducting cognitive interviews to determine if your survey makes sense or contains sensitive topics.
  5. Incentives
    • If you have a budget, consider giving respondents an incentive to complete your survey. If you do not have funding, consider getting an authority figure to endorse your survey, or providing a small report of the results to your respondents. These methods also establish trust in your survey, which helps maximize your response rate. See Nonresponse Error.
  6. Additional Resources:

WSU IRB

  1. Getting an IRB approval or exemption
    • Obtain approval or an exemption from WSU IRB before you start collecting data. You will need to have your full questionnaire, sample design, incentives, data storage, and correspondences ready when you submit your application to IRB.

Field your survey (collect your data / administer the survey)

  1. Maintaining open communication
    • When conducting any survey, it is important that respondents have a direct, reliable means to contact you if they have any questions or experience any difficulty completing the survey. It is important to respond to every message you receive throughout the administration of your survey. To aid with this, it is recommended that you provide your preferred contact information in your initial letter requesting respondents’ survey participation.
    • Always honor every respondent request to opt-out of taking your survey.
    • You will need to establish systems for dealing with bounced emails (for web surveys), undeliverable mail (for paper mail surveys), or issues with incorrect phone numbers or do not call lists (for telephone surveys).
    • Additional Resources
  2. Monitoring progress and sample management
    • It is important to monitor the progress of your survey (especially at the beginning) and to review the data being collected each day. Developing a monitoring protocol for keeping track of your survey’s progress will help you identify and attend to any potential issues. It is not uncommon for adjustments to be made to your survey protocol, such as fixing programming errors in web surveys or adjusting the information included in your communications to survey respondents. Catching and addressing issues early can improve data quality for the remainder of the survey.
      • A monitoring protocol is a systematic set of steps for keeping track of your survey in progress. It should produce regular summaries of response dispositions, such as completes, partial completes, refusals, returned surveys, and the nonresponse rate.
      • Keeping track of response dispositions will help you identify and address two primary sources of survey error: nonresponse and measurement error.
  3. Data storage, data security, and maintaining confidentiality
    • Surveyors need to ensure procedures are in place to maximize data security and respondent confidentiality. Depending on the sensitivity of information collected in your survey, you may consider consulting with data security experts.
    • Some common steps you can take to maximize data security include (1) storing any identifying information separately from the survey responses; (2) securing all digital information being collected and transferred using encryption (e.g., SSL, Secure Sockets Layer), rigorous firewalls, and other technologies at each stage in the process; and (3) maintaining control over who has access to the data.
    • Web surveys present their own data security limitations, the primary one being issues associated with using online survey vendors like Qualtrics. These vendors store data on their own servers, which makes it difficult to directly ensure the security of the data.
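The disposition tracking described under “Monitoring progress and sample management” can be sketched as follows. The disposition categories and counts are hypothetical, and the simple rate shown is only one convention (AAPOR publishes several standard response rate formulas):

```python
from collections import Counter

# Hypothetical disposition log: one code per sampled case, updated daily.
dispositions = Counter({
    "complete": 412,
    "partial": 37,
    "refusal": 58,
    "returned_undeliverable": 25,
    "no_contact": 468,
})

total_sampled = sum(dispositions.values())  # 1,000 sampled cases

# A simple, unweighted response rate: completes over all sampled cases,
# treating every case as eligible. Real studies should pick a standard
# (e.g., AAPOR) formula and report which one they used.
response_rate = dispositions["complete"] / total_sampled
nonresponse_rate = 1 - response_rate
print(f"Response rate: {response_rate:.1%}")  # Response rate: 41.2%
```

Producing this summary daily makes it easy to spot problems early, such as a spike in undeliverable contacts that signals a bad frame.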

Analyze the results

  1. The goal of good data analysis is to represent findings comprehensively, clearly, and fairly. Findings and their interpretations should be presented objectively. This requires full reporting of all findings, even those that seem contradictory or unfavorable.
  2. Washington State University provides access to and information on statistical software. If you require software, please follow the link below to the Center for Interdisciplinary Statistical Education and Research for more information.
  3. The four primary sources of survey error, as well as any errors introduced in data processing (e.g., imputation), should be acknowledged. Where possible, such errors should be taken into account in the analysis and interpretation of the results. For a more detailed description of the four primary sources of error and how they might impact your survey, see Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (2014), 4th Edition, Don A. Dillman, Jolene D. Smyth, and Leah Melani Christian
    • Chapter 1: Sample Surveys in Our Electronic World
    • Chapter 8: Telephone Questionnaires and Implementation
    • Chapter 9: Web Questionnaires and Implementation
    • Chapter 10: Mail Questionnaires and Implementation
    • Chapter 11: Mixed-Mode Questionnaires and Implementation
  4. Statistical tables should be labeled clearly. This should include identifying the questionnaire source and the unweighted number of cases forming the basis for analysis. If modeling is used, include specifications necessary for replication of modeling included in analysis or reports.
  5. All sources of error and the limitations of the study need to be addressed when you are making generalized conclusions about the results of the study.
  6. Additional Resources
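One source of error mentioned above, sampling error, can be quantified directly. Below is a sketch of a 95% margin of error for an estimated proportion from a simple random sample; the function, the optional finite population correction, and the example values are illustrative assumptions, not a prescribed WSU formula:

```python
import math

def margin_of_error(p, n, N=None, z=1.96):
    """Margin of error for an estimated proportion p from a simple
    random sample of size n, at the confidence level implied by z
    (z = 1.96 gives ~95%). If the population size N is supplied,
    the finite population correction is applied."""
    se = math.sqrt(p * (1 - p) / n)          # standard error of p
    if N is not None:
        se *= math.sqrt((N - n) / (N - 1))   # finite population correction
    return z * se

# Hypothetical example: 55% agreement among 400 respondents
moe = margin_of_error(p=0.55, n=400)
print(f"+/- {moe:.1%}")  # +/- 4.9%
```

Reporting the estimate together with this interval (e.g., 55% +/- 4.9%) is one concrete way to address sampling error in your analysis.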

Report findings

  1. Reporting must be clear, honest and thorough.
  2. Document the entire process so that future researchers can replicate your work.
  3. A survey report is usually composed of the following sections, although different organizations may have their own specific requirements.
    • Title page
    • Table of contents
      • Table of contents allows the reader to have a quick view of each section of the report, and easily locate the sections they are interested in reading.
    • List of tables and charts
    • Highlights or executive summary
      • An executive summary provides the main findings of the survey succinctly. Readers can obtain the essentials of the survey findings without reading the details of the report.
    • Introduction/Contextual Background
      • This part gives background relevant to the survey, such as the objectives and how the research results could be used. The amount of information provided in this section depends on the complexity of the survey.
    • Methodology
      • This section mainly discusses who has been included in the survey and why, how many people were surveyed, how they were contacted, and the method of data collection. Key concepts and variables should also be clearly defined. Very detailed information regarding survey method may be included in the appendix.
    • Results and analyses
      • This section provides survey results and tabulations. Main results and findings should be presented first and then more detailed information should be provided. Often this section includes tables, charts, along with explanations of what the results mean and their significance.
    • Conclusions and Recommendations
      • This part provides to the readers the implications of the findings. Recommendations about necessary action may be presented in this section.
    • Bibliography/List of references
    • Contacts
      • A phone number, email address and mailing address should be provided so that users can contact the person who has been in charge of the survey project.
    • Appendices
      • Appendices provide details that are not included in the previous sections due to space limitations.

Archive

  1. Archive and/or destroy research materials as outlined in your IRB protocols.

Additional Resources

  1. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (2014), 4th Edition, Don A. Dillman, Jolene D. Smyth, and Leah Melani Christian
  2. How to Conduct Your Own Survey by Priscilla Salant and Don A. Dillman
  3. Social and Economic Sciences Research Center (SESRC)
  4. SESRC Survey Design Clinic
  5. NORC at the University of Chicago
  6. American Association for Public Opinion Research (AAPOR) best practices
  7. AAPOR condemned practices
  8. Survey Methods and Practices – Statistics Canada
  9. Probability Sampling – Statistics Canada
  10. University of Wisconsin-Madison Survey Fundamentals
  11. University of Minnesota Survey Best Practices
  12. Washington State Local Government Records Retention Schedules