The Office of Institutional Research and Assessment (OIRA) has compiled these resources on survey planning and design. Following these best practices will help you achieve higher response rates and obtain more representative data.
OIRA is available to consult on survey design and distribution as time permits. Please contact the office at research@mwcc.mass.edu to schedule an appointment to discuss your survey project. In general, we recommend the following guidelines as a starting point for crafting a strong survey.
Quick Tips
- Clearly define your question.
- Ask only what you need to know. Include only questions you intend to report on or use in a specific analysis, and know what you will be able to do with that information.
- Keep it short! Consider how much time you would be willing to devote to an unsolicited survey that appeared in your inbox. The IR office typically recommends that a survey take no longer than 10 minutes to complete; 3-5 minutes is ideal. Long, complex surveys can annoy respondents and cause them to abandon the survey or pick random answers just to finish quickly.
- Keep your questions simple. Use as few words as possible when constructing a survey item. The shorter a survey item, the clearer it is to readers, and clearer items tend to yield higher-quality responses.
- Many surveys, particularly surveys of students, take place at the end of the academic semester. For better response rates, avoid busy times of year such as midterms, finals, and the end of the fiscal or academic year, and consider launching your survey at a less crowded time instead.
Focus on the Purpose
Start by asking yourself "What exactly do I want to know, and why?" Don't worry about question design yet; just think about the information you're after. The "why" part is very important: a good survey starts with specific goals and clear research questions.
For every piece of information, ask yourself “What decision will be made or what action will be taken when I have this information?” Answering a survey is a burden, so if a question is included make sure it is necessary and will impact a decision or action. Curiosity is not a good enough reason to include an item on a survey.
Only when you have fully thought through your purpose and information needed should you begin to work on questions.
Once you know what your research question is, check to see if there is already data available on the subject. The ideal time to do a survey is when there is a question that cannot be answered sufficiently using current data. Also consider whether a survey is the best way to collect the required data. Some research questions might be better answered using other methods such as qualitative interviews, focus groups, or secondary data analysis.
Writing Questions – Best Practices & Scale Examples
Take a look at similar surveys for examples of how questions are asked. As you start writing questions, keep several things in mind:
- Whenever possible, look for existing questionnaires, scales, and items intended to collect the same data you are interested in.
- Ask only questions that will help you meet your goal. Each question should be able to be traced back to an outcome or analysis you are hoping to do with the results.
- Ask one question at a time. Each question should be direct and specific to the topic; if it is not, toss it out.
- Don't ask leading or biased questions.
- Stay away from double-barreled questions, which ask about two things at once (for example, asking people to rate the quality of your product and your support). A question that asks "The professor arrived for class well-prepared and on-time" is double-barreled because a professor could have been well-prepared but late, or on-time but unprepared.
- Define things specifically.
- Make sure you cover all possible answer choices, and include a "don't know/doesn't apply" option if it is appropriate. This yields more accurate results.
- Focus on using closed-ended questions (multiple-choice or checkbox questions).
- For online surveys, radio buttons are better than drop-down menus.
- Avoid yes/no questions if possible.
- For rating scales, between five and seven points is usually best.
- Questions that use agree/disagree scales can be biased toward the "agree" side, so it's usually best to avoid this wording.
- Points on a scale should be labeled with clear, unambiguous words.
- Use open-ended questions sparingly. They can be useful for context but are hard to summarize.
- Be consistent with formatting throughout the survey.
- Don’t use acronyms without an explanation of what the acronym means. Avoid jargon.
- Present questions on a similar topic together. Do not switch back and forth between topics.
- Keep question order in mind. Survey responses can be impacted by previous questions.
- Think about the context in which respondents will read or hear your questions.
- Ask the important questions first and demographic questions last. That way, if someone drops out partway through, you still have answers to the important questions.
- Start a questionnaire with an introduction. If a respondent reads the survey, provide a title for each section. If an interviewer reads a survey, write smooth verbal transitions.
Always preview your survey before you send it. Test it on a mobile device in addition to a desktop browser. Take the survey yourself to see how it flows and to check for mistakes, and test all components of the survey, not just the questions.
Online Resources for Creating Survey Content
5 Common Mistakes that will ruin Your Data by Survey Monkey
Creating Good Survey & Interview Questions by the OWL at Purdue University
Survey & Questionnaire Design — a free online tutorial. This is a cute, low-budget offering from StatPac, a survey software company. It’s easy to navigate, and not bad at all!
Types of Survey Questions by Survey Monkey
Sample Size Calculator by Qualtrics
Sample Size Calculator by Raosoft, Inc.
Likert Scale Response Options_MWCC
Administration
Include a statement that participation in the survey is voluntary. If respondents feel coerced to participate, they may not provide accurate or useful survey data. Allow respondents to decline to respond or to opt out of the survey entirely if they wish.
Give instructions: if respondents will read the survey themselves, provide clear instructions on the page.
How Many to Survey – Population vs. Sample
Do you need to ask everyone, or will a sample be sufficient?
The number of individuals you should survey depends on several factors, particularly the expected response rate, the level of accuracy you require (margin of error), any sub-groups you will need to examine (e.g., by gender, student class year, or faculty tenure status), and the number of response options in your questions.
When planning a survey, it is important to decide whether you will administer it to an entire population (all new students) or to a sample of that population (e.g., 250 randomly chosen new students). If you are interested in broad measures, such as overall satisfaction, a sample may suffice. If, however, you want to answer questions about students in Math versus students in Chemistry, you may need a larger sample, or the entire population, in order to net enough respondents for your analysis.
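To make these trade-offs concrete, here is a rough sketch, not an OIRA tool, that estimates a required sample size using the common formula for a proportion with a finite population correction and then inflates the target by an expected response rate. The function names, population size, and response rate below are assumptions for the example only.

```python
import math

def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Completed responses needed to estimate a proportion, with finite population correction.

    p = 0.5 is the most conservative assumption; z = 1.96 corresponds to 95% confidence.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def invitations_needed(completes, expected_response_rate):
    """Inflate the target number of completed responses by the expected response rate."""
    return math.ceil(completes / expected_response_rate)

# Example: 1,200 new students, +/-5% margin of error, 25% expected response rate.
completes = required_sample_size(population=1200, margin_of_error=0.05)
print(completes)                            # about 292 completed responses
print(invitations_needed(completes, 0.25))  # about 1,168 invitations
```

In this example nearly the entire population would need to be invited, which is one reason small populations are often surveyed in full rather than sampled.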
Addressing Privacy, Anonymity and Confidentiality
Depending upon the type of survey and the subject matter, it is important to determine whether the survey should be anonymous, confidential, or neither.
Anonymous: It is difficult to guarantee anonymity. Many survey platforms log IP addresses, and many individuals can be uniquely identified by a combination of demographic data points that are commonly asked on surveys. A downside to an anonymous survey is that it is difficult to restrict the responses to be one per individual. However, for sensitive subject matters, anonymity may yield more honest responses.
Confidential: Confidential surveys do collect identifying information, but the identifying information is protected so that only the survey administrators collecting the survey data will be able to link specific respondents to their responses. When the survey results are shared, it will be done at an aggregate level or without sharing identifying information.
If the administrator of a survey promises or implies that responses will be treated as confidential, that means the results are confidential to the research team; i.e., the results should never be reported at a level of granularity that would allow someone outside the research team to infer the identity of an individual. In no case should individual survey responses be shared with a supervisor, an advisor, or anyone who is not part of the research team. Whenever possible, identifiers should be stored separately from the response data.
In your survey introduction or email, it is common to tell respondents whether the survey is anonymous or confidential, who will have access to the results, and how the results will be presented (typically stating that data will be reported only in aggregate form).
Launching the Survey – Contact Message
When your survey is ready, it's time to think about how you will launch it. If at all possible, follow the Dillman method described below to improve response rates. This will take some planning, as there are several points of contact and letters to be written. In the message that goes with the survey, you should briefly describe:
- the purpose of the survey;
- why they have been selected;
- who they may contact if they have questions about the survey;
- how the information will be handled (including whether identifiers will be kept);
- who will have access to the data; and,
- to whom summaries of the survey will be provided.
Strive to make your survey and all correspondence associated with it look as polished and professional as possible.
- Capture attention with a better subject line. We're inundated with email, so make your message stand out:
  - Be specific. Think of the subject line like a headline.
  - Use keywords to catch attention.
  - Keep it to 50 characters or fewer.
  - Don't use "Help" or "Reminder."
Increasing Response Rates
The recognized expert on enhancing response rates to surveys is Don Dillman, the namesake of the “Dillman Method.” You should make it very easy for your subjects to respond to the survey. Details matter! Here is a summary of Dillman’s basic steps in administering a survey:
- About a week before the survey is launched, send a personalized advance-notice message to everyone in the sample. Tell them why they have been selected and what the survey is for.
- When the survey is launched, have a personalized greeting line. The survey should come with instructions on how to return it. An electronic survey will have a “Submit” button at the end. A mailed paper survey should come with a stamped return envelope.
- Four to eight days after the survey is sent, send a brief follow-up message. It can be a letter, postcard, or email, but it should thank those who have responded and ask those who have not to complete the survey.
- About 10-18 days after the first survey message was sent, a second reminder message should go out with a new personalized cover letter. If you are tracking respondents, this should only be sent to those who have not yet responded.
Depending on your circumstances, you may not be able to follow this completely, but it is an ideal to strive for. There are additional steps you can take to increase response rates, such as putting up posters, placing announcements in appropriate print or online media, or having announcements made at relevant gatherings. Just try not to overwhelm or annoy your audience.
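To make the timing concrete, here is a minimal sketch of the contact schedule the steps above imply, assuming a hypothetical launch date; the specific offsets chosen within Dillman's recommended windows are our own illustrative choices.

```python
from datetime import date, timedelta

launch = date(2024, 10, 1)  # assumed survey launch date

contact_schedule = [
    ("Advance-notice message", launch - timedelta(days=7)),                 # about a week before launch
    ("Survey invitation with personalized greeting", launch),
    ("First follow-up (thanks / reminder)", launch + timedelta(days=6)),    # 4-8 days after launch
    ("Second reminder with new cover letter", launch + timedelta(days=14)), # 10-18 days after launch
]

for step, when in contact_schedule:
    print(f"{when:%b %d}: {step}")
```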
Analyzing Data
Simple analyses can be done using Excel with Pivot Tables, histograms, and graphs, but more advanced statistical analyses will require the use of a statistical software package such as SPSS, SAS, Stata, or R.
One of the first things you should look at is your final response rate: what proportion of the individuals surveyed actually responded? Across the country and for many different types of surveys, response rates have been declining in recent years. A survey with multiple follow-ups will generally have a higher response rate than a one-shot survey. MWCC student surveys typically achieve a response rate of around 25%.
Next, you should compare respondents to the population using whatever demographic information you may have (e.g. gender, race, major). This will help you to know how representative the respondents are of the population or sample from which they came. If the respondents are very different, you may not be able to assume that their responses reflect those of the target group.
With those findings in mind, you can begin summarizing your data and conducting appropriate statistical analyses.
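As one illustrative way to do these first checks (the file names and column names below are assumptions, not an OIRA template), a short pandas sketch can compute the response rate and compare respondent demographics to the sampling frame:

```python
import pandas as pd

# Hypothetical files: one row per person invited, one row per completed response.
frame = pd.read_csv("sample_frame.csv")   # everyone invited, with demographics and a person_id
responses = pd.read_csv("responses.csv")  # completed responses, with the same person_id

# Response rate: completed responses divided by the number of people invited.
response_rate = len(responses) / len(frame)
print(f"Response rate: {response_rate:.1%}")

# Compare respondent demographics to the full sampling frame, e.g., by gender.
respondents = frame[frame["person_id"].isin(responses["person_id"])]
comparison = pd.DataFrame({
    "sample_frame": frame["gender"].value_counts(normalize=True),
    "respondents": respondents["gender"].value_counts(normalize=True),
}).round(3)
print(comparison)  # large gaps suggest respondents may not represent the target group
```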
Small Cell Size
It is important to make sure that you do not report results for very small groups of people. For example, if your intended report would break out responses by department but only 4 people responded from one department, reporting those responses could jeopardize those respondents' privacy. OIRA reports only summary results and follows the standard practice of reporting only cells of 5 or more respondents.
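As a sketch of how that suppression rule can be applied before a breakdown is shared (the department counts below are made up for the example):

```python
import pandas as pd

def suppress_small_cells(counts: pd.Series, minimum: int = 5) -> pd.Series:
    """Mask any cell with fewer than `minimum` respondents before reporting."""
    return counts.astype(object).where(counts >= minimum, "suppressed (<5)")

# Hypothetical breakdown of responses by department.
by_department = pd.Series({"Math": 42, "Chemistry": 17, "Art History": 4})
print(suppress_small_cells(by_department))
# Art History would appear as "suppressed (<5)" in the shared report.
```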
Report Findings
Once your results have been analyzed, we urge you to share them in a way that is accessible to the community you surveyed. We have found that when we are transparent with results, the populations we survey can see the value in answering our surveys.
- Document the administration process and response rates.
- A survey report usually has these components at a minimum:
  - Executive Summary. Presents the main findings of the survey succinctly, so readers can get the essentials without reading the details of the report.
  - Contextual Background. Provides relevant details, such as the objectives of the survey and how the results could be used.
  - Survey Method. Briefly discuss who was included in the survey and why, how many people were surveyed, how they were contacted, and how the data were collected. Key concepts and variables should also be clearly defined. Very detailed methodological information may be included in an appendix.
  - Results. Presents survey results and tabulations, with the main findings first and more detailed information after. This section often includes tables and charts along with explanations of what the results mean and why they matter.
Archiving Data
Once you have finished gathering your data, please provide the survey instrument, administration details, and record-level responses to OIRA. This is especially helpful, for example, when there are changes in departmental leadership.