Soliciting Quantitative Feedback

“A successful person is one who can lay a firm foundation
 with the bricks that others throw at him or her.”  – David Brinkley

User surveys and feedback forms are arguably the easiest and least expensive methods of gathering information about your users and their preferences. Like competitive analysis, surveys are also used in a marketing context. However, UXD surveys contain more targeted questioning about how usable a website or application is and the relative ease with which people can access content. There are a few different types of user surveys: some use different questioning methods; others approach the user at a different point in time.

Feedback forms – Feedback forms are the most common way to elicit input from users on websites, but they can also be implemented in desktop applications. A feedback form is simply a request for input on an existing application or set of functionality. It can be a single, passively introduced questionnaire that is globally available on a website for users to fill out, or a short one- or two-question mini-form placed at relevant points in an application, such as a help/support topic or an unexpected termination message.
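If you have development support, a passive mini-form like this can be just a few lines of front-end code. The following is a rough sketch in TypeScript; the element ID, the question text, and the /feedback endpoint are placeholders to replace with your own, not part of any particular product.

```typescript
// Minimal sketch of a one-question inline feedback form.
// Assumes a container such as <div id="help-feedback"></div> on a help/support page.
function renderFeedbackPrompt(containerId: string): void {
  const container = document.getElementById(containerId);
  if (!container) return;

  container.innerHTML = `
    <p>Was this help topic useful?</p>
    <button data-answer="yes">Yes</button>
    <button data-answer="no">No</button>
    <textarea placeholder="Optional comments"></textarea>
  `;

  container.querySelectorAll("button").forEach((button) => {
    button.addEventListener("click", () => {
      const comment = container.querySelector("textarea")?.value ?? "";
      // "/feedback" is a placeholder endpoint; substitute your own collection service.
      fetch("/feedback", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          page: window.location.pathname,
          answer: button.dataset.answer,
          comment,
        }),
      });
      container.innerHTML = "<p>Thanks for your feedback.</p>";
    });
  });
}

renderFeedbackPrompt("help-feedback");
```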

Tips for Feedback:

  • Keep it simple. A shorter, simpler survey will be completed more frequently.
  • Always offer a comment field or other means for users to type free-form feedback. Comments should be monitored and categorized regularly.
  • If your marketing team has already established a feedback form, try to piggyback on their efforts. If not, consider collaboration. Much of the information collected in a feedback mechanism will be useful to both groups.

Surveys – A survey is not all that different from a feedback form, except that it is usually offered for a limited amount of time, involves some sort of targeted recruitment effort, and the content of the questions is not necessarily related to an existing site or application. Surveys can be implemented in a few ways, depending on the goals of your research.

  • Intercept Surveys – Intercept surveys are commonly used on websites. An intercept survey attempts to engage users at a particular point in a workflow, such as while viewing a certain type of content, and it usually takes the form of a pop-up window or overlay message. Because these are offered at a specific point in a process (e.g., the jewelry section of an e-commerce department store), the results are highly targeted (a minimal triggering sketch follows this list).
  • Online Surveys – Online surveys can be conducted to learn about a website, application, kiosk, or any other type of system; the survey itself, however, is conducted online. There are many advantages over traditional phone or in-person surveys, not the least of which is ease of implementation. A number of online survey tools are available that will let you implement your survey in minutes, and most offer real-time data aggregation and analysis.
  • Traditional Surveys – There is definitely still a place for surveys conducted in person or over the phone. While a bit more costly and time-consuming, these methods can reach users who are not reachable online or who normally will not take the time to fill out a survey on their own.
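In practice, an intercept is typically just a small script that watches for the workflow point you care about and invites a fraction of the visitors who reach it. The sketch below is illustrative only; the path pattern, the sampling rate, and the openSurveyOverlay helper are assumptions, not the API of any particular intercept product.

```typescript
// Rough sketch of an intercept trigger (not any vendor's API).
// Invites a small, random share of visitors who reach a particular section.

const INTERCEPT_RATE = 0.05;            // invite ~5% of eligible visitors (assumed value)
const TARGET_PATH = /^\/jewelry\//;     // workflow point of interest (assumed path)

function maybeIntercept(): void {
  const alreadyInvited = sessionStorage.getItem("survey-invited") === "yes";
  if (alreadyInvited) return;
  if (!TARGET_PATH.test(window.location.pathname)) return;
  if (Math.random() > INTERCEPT_RATE) return;

  sessionStorage.setItem("survey-invited", "yes");
  openSurveyOverlay();                  // hypothetical helper that shows the overlay or pop-up
}

function openSurveyOverlay(): void {
  // In practice this would render an overlay; a new window stands in for that here.
  window.open("https://example.com/intercept-survey", "_blank");
}

maybeIntercept();
```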

Why Conduct User Surveys?

  • User surveys are an effective, economical method of gathering quantitative input.
  • You can poll a much larger number of users than will typically participate in user testing or other research methods.
  • Since surveys (online versions in particular) are usually anonymous, you are likely to get more honest and forthright responses than from other forms of research.
  • Surveys can be implemented quickly, allowing you to test ideas in an iterative process.

When is a Survey Most Useful?

  • When you have specific, targeted questions that you would like answered by a statistically relevant number of people, such as when polling attitudes.
  • When there is a specific problem that you want to investigate.
  • When you want to show the difference between internal attitudes and perceptions of an issue and those of your target market.
  • When you want direct feedback on a live application or web site.
  • When you want to use ongoing feedback to monitor trends over time, and gauge changes and unexpected consequences.

Development Life-cycle

Surveys are useful in the requirements-gathering stage, as well as for follow-up feedback during the maintenance stage, i.e., post-release.

Limitations of User Surveys

  • User surveys are entirely scripted, so while skip logic[1] and other mechanisms can offer a degree of sophistication, there is little room for interactive or exploratory questioning (a generic skip-logic sketch follows this list).
  • Users usually provide only a limited amount of information in open-ended questions.
  • Because there is little room to explain or clarify your questions, surveys should be limited to gathering information about tangible, somewhat simplified concepts.
  • In the case of passive feedback forms, most people won’t think to offer positive feedback. In addition, not everyone with a complaint will take the time to let you know.
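Skip logic itself is usually configured inside your survey tool rather than coded by hand, but it helps to understand what it amounts to: a branching rule attached to an answer. The sketch below is a generic illustration of that idea, not the configuration format of any specific product.

```typescript
// Generic sketch of skip logic: which question appears next depends on the last answer.
interface Question {
  id: string;
  text: string;
  // Maps an answer to the next question id; "default" is used when no rule matches.
  next: Record<string, string>;
}

const questions: Question[] = [
  { id: "q1", text: "Have you used the search feature?", next: { No: "q3", default: "q2" } },
  { id: "q2", text: "How satisfied are you with the search results?", next: { default: "q3" } },
  { id: "q3", text: "Any other comments?", next: { default: "end" } },
];

function nextQuestionId(current: Question, answer: string): string {
  return current.next[answer] ?? current.next["default"];
}

// A respondent who answers "No" to q1 skips q2 entirely.
console.log(nextQuestionId(questions[0], "No")); // "q3"
```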

How to Conduct a User Survey

  1. Define your research plan. As in any research effort, the best starting point will be to define and document a plan of action. A documented plan will be useful even if it’s just a bullet list covering a basic outline of what you will be doing. Some considerations for your user survey research plan are:
    • What are the goals of your survey? What are you hoping to learn from participants? Are you looking for feedback on current functionality, to explore concepts and preferences, or to troubleshoot an issue?
    • Who will you survey? Are you polling your existing users? Do you need to hear from businesses, teens, cell phone users, or some other targeted population?
    • How many responses will you target? While even a small number of responses is arguably more valuable than no input at all, you will want to aim for a statistically relevant sample size. There are formulas that will tell you how many survey participants are needed for the responses to be representative of the population you are polling (a rough calculation follows this step). For most common UXD uses, however, 400-600 participants should be sufficient.[2]
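If you would rather run the numbers yourself than rely on the rule of thumb above, the standard sample-size formula for a proportion is straightforward. The sketch below is a back-of-the-envelope TypeScript version using the common z-score formula with an optional finite-population correction; for anything rigorous, use a calculator such as the one listed under Additional Resources.

```typescript
// Back-of-the-envelope sample size for estimating a proportion.
// n0 = z^2 * p * (1 - p) / e^2, optionally corrected for a finite population.
function sampleSize(
  marginOfError: number,      // e.g. 0.05 for +/-5%
  zScore = 1.96,              // ~95% confidence level
  proportion = 0.5,           // 0.5 is the most conservative assumption
  populationSize?: number     // omit for a very large or unknown population
): number {
  const n0 = (zScore ** 2 * proportion * (1 - proportion)) / marginOfError ** 2;
  if (!populationSize) return Math.ceil(n0);
  // Finite population correction
  return Math.ceil(n0 / (1 + (n0 - 1) / populationSize));
}

console.log(sampleSize(0.05)); // ~385 responses at 95% confidence, +/-5% margin of error
console.log(sampleSize(0.04)); // ~601 responses at 95% confidence, +/-4% margin of error
```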
  2. Select a survey tool. There are a number of online survey tools to choose from, so you'll need to select an application that fits your goals and budget. Even if you will be conducting the surveys in person or on the phone, consider using survey software, as it offers substantial time savings in processing and analyzing the results. Once you have selected the survey software, take some time to use the system and get to know what options are available. A basic understanding of your survey tool can help reduce rework as you define your questions.
  3. Develop your survey. This is usually the most time-consuming part of the process. While it seems straightforward, it is quite easy to phrase questions in a way that biases responses or is not entirely clear to participants, resulting in bad or inaccurate data.
    1. Define your Questions. Consider the type of feedback you would like to get and start outlining your topics. Do you want to learn how difficult a certain process is? Do you need to find out if people have used or are even aware of certain functionality? Once you have a basic outline, decide how best to ask each of your questions to get useful results. Some possible question types include:
      • Scale – Users are asked to rate something by degree, e.g., 1 to 10 or level of agreement. Be sure to include a range of options and a “not applicable” choice where relevant.
      • Multiple Choice – Users are asked a question and given a number of pre-set responses to choose from. Carefully review multiple choice answers for any potential bias.
      • Priority – Users are asked to prioritize the factors that are important to them, sometimes relative to other factors. A great exercise to help users indicate priority is “Divide the Dollar,” where users are asked to split a limited amount of money among a set of features, attributes, etc.
      • Open Ended – Users are asked a question that requires a descriptive response and are allowed to answer freely, usually via text entry. Open-ended questions should be used sparingly in this type of quantitative study; it’s difficult to get consistent responses, and they are hard to track and analyze.
    2. Edit & Simplify.  Your goal is to write questions that the majority of people will interpret in the same way. Consider both phrasing and vocabulary. Try to write in clear, concise statements, using plain English and common terms. Your survey should be relatively short and the questions should not be too difficult to answer. In addition, be sure to check and double check grammar and spelling. I once had an entire survey question get thrown off due to one missing letter. We had to isolate the data and re-publish the survey.
    3. Review for Bias. Review the entire survey, including instructions or introductory text. Consider the sequence of your questions; are there any questions that might influence future responses? Take care not to provide too much information about the purpose of the survey; it might sway users’ responses. Keep instructions clear and to the point. As with all research, you need to be ever vigilant in identifying areas of potential bias.
    For example, consider how phrasing and added context can sway responses to essentially the same question:

      Would you like a piece of candy?
      Yes (95%)            No (5%)

      Would you like a piece of caramel?
      Yes (70%)            No (30%)

      The Surgeon General recently came out with a study that shows candy consumption to be the most common cause of early tooth loss. Would you like a piece of candy?
      Yes (15%)            No (85%)

  Thinking Ahead

  Recruit for future studies. At the very end of your survey, ask users if they’d like to participate in future studies. If they agree, collect their contact details and some additional demographic information. This will help you build up your own database of participants for more targeted future testing. Many users will want to be assured that their information will not be connected to their survey responses and that you will not resell their data.

  4. Implement & Internally Test. Implement the survey using the survey tool you selected. Next, test the survey internally; this can involve just you and a few colleagues, or the whole company. Testing the survey internally will help ensure that your questions are clear and free of errors, and that you are likely to get the type of feedback you want.
  Our Users, Ourselves

  Data from an internal test can be additionally helpful. Getting feedback from people in your company or department will likely produce biased results. While it’s important to isolate that data before conducting the actual survey, you can use feedback from your internal group to illustrate any differences (or notable similarities) between their responses and those of your users. For example: 90% of our employees use Twitter, while only 35% of our users do.
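If you want to check whether a gap like the Twitter example above is larger than sampling noise would explain, a simple two-proportion z-test will do. The sketch below is a rough illustration only; the counts shown are made-up numbers, not data from the text.

```typescript
// Rough two-proportion z-test: is the gap between internal and user responses
// larger than chance would explain? (Illustrative only.)
function twoProportionZ(yes1: number, n1: number, yes2: number, n2: number): number {
  const p1 = yes1 / n1;
  const p2 = yes2 / n2;
  const pooled = (yes1 + yes2) / (n1 + n2);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2));
  return (p1 - p2) / standardError;
}

// Hypothetical counts: 45 of 50 employees vs. 175 of 500 users say they use Twitter.
const z = twoProportionZ(45, 50, 175, 500);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "difference is unlikely to be chance" : "inconclusive");
```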

  5. Recruit Respondents. Your research plan should have outlined any appropriate demographics for your survey, but now that you are ready to make your survey live, how will you reach those people? If you have a high-traffic website or large email list, you can use these to recruit participants. If not, you may need to get creative. Consider recruiting participants through social networking sites, community boards, or other venues frequented by your target population.
  6. Prepare the Analysis. One of the nice things about online survey tools is that the analysis is usually a breeze. You may, however, want to put together an overview of some of the insights and interpretations you gathered from the data, e.g., “Since a relatively small number of our customers tend to use social networking sites for business referrals, we may want to lower the priority of integrating Twitter into our website in favor of other planned functionality.” As with all UXD research, present your findings and publish any results.
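Even without a survey tool’s built-in reporting, pulling a quick figure like the social-referral percentage above takes only a few lines once you have an export. The sketch below assumes a hypothetical CSV export with one row per respondent and a column named uses_social_referrals; the parsing is deliberately naive (it ignores quoted commas).

```typescript
// Quick tally from an exported CSV (hypothetical file name and column).
// Assumes one row per respondent and yes/no answers in a "uses_social_referrals" column.
import { readFileSync } from "fs";

const rows = readFileSync("survey-export.csv", "utf8").trim().split("\n");
const header = rows[0].split(",");
const columnIndex = header.indexOf("uses_social_referrals");

const answers = rows.slice(1).map((row) => row.split(",")[columnIndex]);
const yesCount = answers.filter((a) => a.trim().toLowerCase() === "yes").length;
const percent = (100 * yesCount) / answers.length;

console.log(`${percent.toFixed(1)}% of respondents use social networking sites for referrals`);
```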

Additional Resources

  • Sample Size Calculator (www.surveysystem.com/sscalc.htm) – a calculator that determines the sample size required for a desired confidence interval.
  • SurveyMonkey (www.surveymonkey.com) – offers a popular online hosted survey tool that works well for basic surveys.
  • SurveyGizmo (www.surveygizmo.com) – is comparable to SurveyMonkey, but offers somewhat less robust reporting at a slightly lower fee.
  • Ethnio (www.ethnio.com) – is primarily a recruitment tool, but offers some basic survey functionality. Ethnio can be used to drive users from your website to another third party survey.

 

 


[1] Skip logic refers to the ability to skip a question or set of questions in a survey based on the user’s response to a preceding question.

[2] For a survey of the general population where the data has a 95% confidence level and a 4-5% margin of error.