HOW TO CONDUCT A SURVEY

Want to make sure your surveys give you the useful information you need? Follow our simple six-step process to help you design and administer effective surveys.


  • Step 1. Define your purpose

    • What is the goal of your survey? What do you want to learn? Knowing how you will use the survey information before you start is the key to success. If possible, make a list of specific questions you are trying to answer and decisions the survey will inform.
    • Consider whether the information you are trying to collect is available elsewhere (e.g., reports.umd.edu, data from information systems).
    • Are there alternatives to a survey that would be better suited to your purpose or provide more appropriate information (e.g., interviews, focus groups, observational studies)?
    • Self-reported survey data are useful for gathering perceptions or beliefs but are less accurate for measuring actual behaviors. Interpret these data with caution and/or consider them in combination with other data sources.
    • To decide if a survey is right for you, consider:
      • Is your goal to learn about opinions and perceptions or behaviors?
      • How will you use the information you collect?
      • How would a survey provide useful information for academic planning purposes and/or for those providing services to students?
      • How would a survey provide useful information on the experience of students, staff, and/or faculty that we cannot get elsewhere?
      • Would survey administration divert important University resources away from other projects (e.g., staff time)?
    • Design an analysis plan in advance. Planning your analysis up-front will help you design a survey that captures the information you need to collect in order to take action.
      • What visualizations or statistical tests do you want to do with the results? What questions do you need to ask to meet those plans?
      • What demographic variables (e.g., college of major, class standing, gender) would be meaningful and actionable to analyze?
    • Consider triangulating survey data with other data sources you have access to.
  • Step 2. Decide who you will survey and how you will do it

    • You don't need to survey everyone in your target population (e.g., all current students at UMD) to get valid results. You can use sampling techniques, such as random sampling, and then generalize to the population.
    • A low response rate can be problematic because certain groups could be over- or under-represented; or you might have nonresponse bias, depending on the survey topic. Fear of low response rates is not a good justification for conducting a population survey. See step 6 for more information on how to address nonresponse bias in your analysis.
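The random-sampling idea above can be sketched in a few lines of Python. This is a minimal illustration, not an official procedure; the roster contents, sample size, and seed below are all hypothetical:

```python
import random

def draw_simple_random_sample(population, sample_size, seed=42):
    """Draw a simple random sample (without replacement) from a population list."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible and auditable
    return rng.sample(population, sample_size)

# Hypothetical roster of student IDs standing in for the full target population
roster = [f"UID{i:05d}" for i in range(1, 20001)]
invitees = draw_simple_random_sample(roster, sample_size=2000)
print(len(invitees))  # 2000 invitations to send
```

Fixing the seed lets you document exactly which sample was drawn, which helps when you later compare respondents to the population in step 6.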
  • Step 3. Create a plan for confidentiality/anonymity, data security, and IRB approval (if necessary)

    • Student survey data must be kept confidential. Data shared must not include student identifiers such as social security number, UID, or name; reports (paper, electronic, or verbal) using the data must not identify individual students. UMD's Institutional Review Board (IRB) has additional information on maintaining confidentiality.
    • An anonymous survey has no way to link a respondent with their responses. Anonymity may be more appropriate if the survey topic is sensitive.
    • Summary data with small cell sizes should not be reported if it could potentially reveal information about an individual student.
    • Adhere to IT policies, standards, and guidelines on how to share and store data.
    • Decide whether you need IRB approval for your survey. How do you plan to use the results? Who do you plan to share them with? Is your survey part of a research project?
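One way to honor the small-cell guidance in this step is to suppress low counts before sharing any summary table. The sketch below is illustrative only; the threshold of 5 is an assumption, so use whatever minimum your data stewards or IRB require:

```python
from collections import Counter

def suppress_small_cells(counts, threshold=5):
    """Replace counts below the threshold with None so they can be reported as '<threshold'."""
    return {group: (n if n >= threshold else None) for group, n in counts.items()}

# Hypothetical respondent counts by college
responses_by_college = Counter(["ENGR"] * 40 + ["ARHU"] * 3 + ["BSOS"] * 12)
safe = suppress_small_cells(responses_by_college)
# The ARHU cell has only 3 respondents, so it is suppressed before sharing
```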
  • Step 4. Design your survey

    • Unless mandated, surveys should be brief, taking no longer than 10 minutes to complete. Have others test your survey in advance to see how long it takes.
    • Is your survey asking about a sensitive topic (e.g., mental health, substance abuse, violence)? Be thoughtful about the language you use when asking potentially triggering questions, and consider providing resources for respondents who might experience distress. Read more about how to write sensitive questions here.
    • Always review, pilot, and, if necessary, edit a survey before it goes out, even if another entity (e.g., a consultant) created it.
      • Think about the language we use to describe things at UMD that a consultant might not be familiar with.
      • Any survey with "UMD" branding, including one created by a consultant, can be perceived as a formal, official survey from the University and can influence people's perceptions of the institution.
    • To write inclusive, accessible questions, pay attention to issues such as reading level, question order, double-barreled questions, and loaded questions. Follow these resources on how to write strong survey questions:
    • If you can, collect demographic information after the survey (e.g., from existing institutional data) instead of through survey questions. If you choose to ask questions about race/ethnicity and gender, consider whether you need to be able to compare your sample with the whole population. If so, use language that's consistent with how IRPA reports on race/ethnicity and gender. More information can be found on the Data Definitions page of the IRPA website.
    • Ensure your survey meets the University's web accessibility standards.
    • UMD supports a number of survey platforms. See the "Campus tools" section for more information.
  • Step 5. Administer your survey and collect data

    • Timing: Avoid administering a survey during finals, midterms, or the first and last weeks of the semester. This timing can impact your response rate.
    • Communicating about your survey: Make survey invitations and reminders short and personal (e.g., include the recipient's name if the survey isn't anonymous; have the survey sent from someone whose name recipients will recognize).
    • Incentives: Consider when and what incentives might be appropriate. Depending on how the incentive is structured, it might change someone's response or how they complete the survey (e.g., an incentive that rewards the first 100 people who complete a survey might motivate survey-takers to take the survey very quickly, rather than thinking through each question).
    • Additional resources on survey administration:
  • Step 6. Analyze, interpret, and share your results

    • Your analysis should relate to your goals (see step 1). For each goal, state what you found using data.
    • Review data before you start analyzing it to ensure that values are coded properly, there aren't any unexpected responses, etc.
    • Assess whether you had any nonresponse bias in your results.
      • Compare your respondents to your population. Are any groups over- or under-represented?
      • When talking about survey results with a low response rate, make it clear that you are talking about respondents, not the entire population (e.g., say "survey respondents" instead of "students").
    • Start with simple analyses. Oftentimes, the simplest analysis and statistics will answer your question. It doesn't need to be complex!
    • When sharing your results, consider the audience. You may want to provide different reports, presentations, etc., for different audiences. Which audiences care about your methodology? Which audiences want to know about your process? How much background knowledge does your audience have?
    • Additional resources:
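The nonresponse-bias check described in this step (comparing respondents to the population) can be sketched as a simple percentage-point comparison. This is an illustrative sketch with hypothetical numbers, not a substitute for a formal bias analysis:

```python
def representation_gaps(population_counts, respondent_counts):
    """Compare each group's share of respondents with its share of the population.

    Returns gaps in percentage points: positive means the group is
    over-represented among respondents, negative means under-represented."""
    pop_total = sum(population_counts.values())
    resp_total = sum(respondent_counts.values())
    gaps = {}
    for group, pop_n in population_counts.items():
        pop_share = pop_n / pop_total
        resp_share = respondent_counts.get(group, 0) / resp_total
        gaps[group] = round((resp_share - pop_share) * 100, 1)
    return gaps

# Hypothetical class-standing breakdowns for the population and the respondents
population = {"Freshman": 8000, "Sophomore": 7500, "Junior": 7000, "Senior": 7500}
respondents = {"Freshman": 150, "Sophomore": 120, "Junior": 100, "Senior": 230}
print(representation_gaps(population, respondents))
```

Large gaps for a group suggest you should caveat your findings (e.g., say "survey respondents" rather than "students") or weight the results before generalizing.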