Designing a Survey Part 1 - Asking Good Questions
- Jennica Nichols & Maya Lefkowich
- Oct 6
- 6 min read
Designing Survey Questions, Evaluation Methods, Tips for Survey Design, Reflections on Practice

So, you decided to make a survey (and if you’re still questioning whether it’s the right choice, you can check out our blog post on deciding if a survey is the right tool for you).
Crafting good questions isn’t as easy as writing down what you want to know. You are likely managing competing interests from your board, funders, team, and community. Perhaps you feel pressured to measure things that will result in glowing reviews. Or, you might find yourself handcuffed to survey software that is less than ideal. And, maybe you are muddling through this survey off the side of your desk - because it isn’t actually what you do!
Limited resources, time crunches, and pressure often result in questions that are clunky, confusing, and burdensome for participants. The result? Participants skip through questions or abandon the survey. The survey falls apart when questions aren’t answerable.
Don’t waste time and energy on questions that don’t work. Below are the 5 most common mistakes we see folks make when crafting questions - and how to fix them before your survey goes live.
Red Flag #1: Marathon Surveys
Most people won’t give you more than 10 minutes of their attention. Use that time well by prioritizing the minimum viable set of questions you need to satisfy the purpose of your survey. You can save space by:

Removing questions that are repetitive (e.g., asking what country and what continent people live in)
Using branching or display logic instead of multi-step or dependent questions (e.g., “if you answered ‘Yes’ above…”) - see the sketch after this list
Adding page breaks after similar questions to show progress
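If your survey software supports it, display logic is usually a one-line setting rather than extra instructions for respondents to parse. Here is a minimal sketch using SurveyJS’s JSON schema as one example (the question names and wording are invented for illustration, and other platforms will have their own syntax for the same feature):

```typescript
import { Model } from "survey-core";

// Minimal sketch: the follow-up question only appears when the first
// answer is "Yes", so nobody has to read "if you answered Yes above..."
// instructions. Question names and wording are hypothetical.
const survey = new Model({
  elements: [
    {
      type: "radiogroup",
      name: "attendedProgram",
      title: "Did you attend one of our programs this year?",
      choices: ["Yes", "No"],
    },
    {
      type: "comment",
      name: "programFeedback",
      title: "What was most useful about the program?",
      visibleIf: "{attendedProgram} = 'Yes'", // display logic: shown only after a "Yes"
    },
  ],
});
```

Respondents who answer “No” never see the follow-up, which keeps the survey shorter for exactly the people with the least to say.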
Stay tuned for our next post, Designing a Survey Part 2, where we explore strategies for structure and flow to further reduce survey chaos.
Red Flag #2: Alienating Questions
Stereotypical, invasive, or harmful questions - especially when it comes to demographics - can harm participants, break their trust, and produce unreliable data. Data isn’t valid if it’s harmful.
Consider how it might feel for someone to misrepresent themselves, reduce their identity to a tick-box, or disclose information that isn’t safe or appropriate for the context. Instead, prioritize respect in your survey:

Remove anything you don’t absolutely need to know
If you ask for personal information, include a brief blurb about why you are asking and how you plan to use it (e.g., “We have a few questions about your identity that will help us understand if we are meeting the needs of our community and ensure our programs have equitable benefits”)
Be careful with “other” categories when asking people to identify with tick-boxes. It is a good idea to include many options that avoid binary thinking about identity, as well as an open-ended box like “I prefer to identify as...” so people can choose how they want to describe their identities. It is not a good idea to use an “other” box when the options are limited or binary to begin with (e.g., “How do you identify?” with only “Male”, “Female”, and “Other” as choices). This can be disrespectful and damaging to groups that are not meaningfully represented in the limited categories - see the sketch after this list
Avoid asking too many questions about identity. Often, people will only participate in a survey when it is anonymous. When people worry that their responses could reveal their identity, they will abandon the survey or give dishonest answers.
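If you build surveys in a code- or JSON-based tool, the open-ended option is often a built-in setting. A minimal sketch, again modelled on SurveyJS (the property names are SurveyJS’s; the option list is illustrative, not exhaustive or prescriptive):

```typescript
import { Model } from "survey-core";

// Minimal sketch: an identity question with multiple options plus an
// open-ended field so people can describe themselves in their own
// words. The choices shown are illustrative only, not a recommended list.
const survey = new Model({
  elements: [
    {
      type: "checkbox",
      name: "genderIdentity",
      title: "How do you describe your gender identity? (optional)",
      choices: ["Woman", "Man", "Non-binary", "Two-Spirit"],
      showOtherItem: true,                     // adds an open text field
      otherText: "I prefer to identify as...", // replaces the default "Other" label
      isRequired: false,                       // never force disclosure
    },
  ],
});
```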
Pro Tip:
We often see organizations ask for exhaustive or outdated demographic categories because the funder requires it. Push back gently. Most funders care more about ethics, program effectiveness, and sustained trust within the community than exhaustive demographics. If you have made your case and they won’t budge, consider making the mandatory question(s) optional and ensure you leave an open-field option for people to self-identify instead.
Red Flag #3: Forced Choices
Often, people don’t know, don’t care, or don’t remember. While it might be important for you to know if they would definitely recommend your program to a friend, think your mandate is relevant, or feel represented by your leadership, they might not have enough information to give an honest answer. And, to be honest, they probably don't care as much about the answer as you do. If you force a choice, the answer may not be true.
For example, I (Maya) recently did an online survey asking if there is an airline I would never travel with. In truth, I don't have a strong preference. But the survey forced a choice between a list of a few airlines (most of which I had never flown with). So, I picked one at random to move on. The resulting data is not valid.
Help people answer honestly by giving them the option to feel "Meh" about your question. Here are three ideas to do this:
| Type of Question | Option to Add |
| --- | --- |
| Yes or No | Maybe |
| Matrix or Scale | Unsure or Not Applicable |
| Preference Drop-Downs | Don’t know or Don’t care |
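In code-based tools, the escape hatch is often just another choice in the list. A minimal sketch in TypeScript (the question wording is invented for illustration):

```typescript
// Minimal sketch: a scale question with explicit escape options, so
// nobody is forced to invent an opinion they don't have. The question
// name and wording are hypothetical.
const mandateRelevance = {
  type: "radiogroup",
  name: "mandateRelevance",
  title: "How relevant is our mandate to your work? (1 = not at all, 5 = very)",
  choices: ["1", "2", "3", "4", "5", "Unsure", "Not applicable"],
};
```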
Red Flag #4: Chaotic Lists
If you are asking people to choose from many available options, make it easy for them to find their choice in a list or drop-down. Example: if you are asking people which province or state they are from, you could order alphabetically, or geographically (e.g., east to west or north to south). This shows that you have thought about the question carefully and aren’t just throwing information at respondents to see what sticks.
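If you are populating those options programmatically, the ordering is a one-liner. A minimal sketch in TypeScript (the province list is just an example):

```typescript
// Hypothetical option list, sorted alphabetically so respondents can
// scan straight to their answer.
const provinces = ["Ontario", "British Columbia", "Québec", "Alberta", "Manitoba"];

// localeCompare keeps accented names like "Québec" in sensible
// alphabetical order, which a plain sort() does not guarantee.
const orderedChoices = [...provinces].sort((a, b) => a.localeCompare(b));
```

Many platforms also have a built-in setting for this (SurveyJS, for instance, exposes a choicesOrder property), so check your tool before sorting by hand.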
Red Flag #5: Making People Work Too Hard

People are volunteering their time and insights - so it should be easy and intuitive for them to complete your survey. If they have to reach for a calculator, check their tax returns, flip through their calendar to find specific dates, write you an essay, or answer skill-testing and trick questions to participate (all things we have seen requested in real surveys), they won’t. That’s why data from labour-intensive questions is notoriously unreliable.
People will make some s*** up and move on.
Example: we recently came across a survey asking people to:
Assess their familiarity with an organization’s mandate on a scale of 1-5
Describe the organization’s mandate in their own words in an open-text box
Then, on the next page:
Review the organization’s actual mandate (this included ⅓ of a page of text to read)
Rate the relevance of the organization’s mandate on a scale of 1-5
Summarize the mandate (now having seen it) in their own words
Why? These repetitive questions require a lot of work for the participant just to measure relevance and comprehension. If you want people to review and comment on information, make it easy. Give them a short blurb (like your mission or vision statement) and ask one or two direct questions about it. Simple, direct, and no trick questions!
Or, if you want to see what people gained from your program, ask a simple question about it like, “What was one thing you learned from X program?” or “Since completing X program, which of the following techniques have you tried?”
Repetitive questioning is more common than you'd think. We see it when organizations try to validate assumptions rather than genuinely gather participant feedback.
Reality Check
Good survey questions feel effortless to answer, but take real skill to craft. Even with these guidelines, expect to spend 4-5 hours crafting questions and 2-3 hours testing and refining. If that feels overwhelming alongside your other responsibilities, you're not alone.
Before your survey goes live, ask yourself (and your colleagues) - would you honestly complete this survey if it showed up in your inbox? If the answer is no, your participants won't either.

Consider reviewing each question in your next survey against these red flags. And, we always recommend the buddy system. Ask someone who matches your target participant profile to beta-test the survey. For example, if your program serves working parents, don't test with colleagues who don't have kiddos - their experience won't match your participants' reality.
Pro Tip:
If you can’t find anyone who matches your participant profile to beta test the survey, ask someone you trust who is a tough critic and won’t shy away from telling you, “This question doesn’t make sense” or “I don't know how to answer this.” We often ask our parents to review our surveys because they answer questions differently than we intended, highlighting the flaws in our logic.
When to Call for Backup:
If your team or partners are debating what to measure, you're getting pushback on question wording without clear solutions, or if the survey is taking longer than expected to finalize, those are signs that an outside perspective could save you time, money, and headaches.

Whether you're crafting your first survey or your fiftieth, we're here to help you avoid common pitfalls and create surveys that actually work for your programs and participants. At AND, we offer free, no-obligation 30-minute consultations to chat through your survey questions and woes, spot potential issues, and see if we can help. You'll speak to a senior consultant who has been there and is deeply nerdy and passionate about effective survey design. Reach out by sending us an email at admin@andimplementation.ca
Liked what you read? Subscribe to our blog to have new content delivered right to you!