
Evidence-Based Survey Design for Public Health Organizations: What to Avoid
Designing a survey may seem straightforward, until response rates decline, the data doesn’t align with your indicators, or the questions fail to reflect community realities. In community health work, where surveys inform programs, funding, and policy, design missteps can lead to missed opportunities—or worse, erode community trust.
Here are five common pitfalls—and evidence-informed strategies to avoid them.
1. Asking Without Purpose
Mistake: Launching a survey without a clear goal or a defined use for the results.
Before drafting any questions, define a specific, attainable goal and ensure it aligns with your broader evaluation strategy. What insights are you trying to gain? What decisions will this data help stakeholders make?
2. Overloading the Survey
Mistake: Asking too many questions or combining multiple topics in one tool.
Poor structure and survey fatigue are common culprits behind low-quality data. Overly long or disorganized surveys discourage completion, leading to biased or incomplete results.
The Institute of Education Sciences (IES, 2023) recommends:
- Keeping surveys brief and specific
- Grouping related items into clear sections
- Including progress indicators to reduce fatigue
- Placing open-ended or demographic questions at the end
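These recommendations are easier to act on when you can see where respondents actually drop off. The snippet below is a rough illustration in Python with pandas, not part of the IES guidance; the file name and the question-column naming convention are assumptions for the example. It computes the share of respondents who answered each item, in survey order, so you can spot where fatigue sets in.

```python
import pandas as pd

# Hypothetical export: one row per respondent, one column per question
# ("q01", "q02", ...), with blank cells where someone skipped or stopped.
df = pd.read_csv("survey_export.csv")  # assumed file name

question_cols = [c for c in df.columns if c.startswith("q")]  # assumed naming

# Share of respondents who answered each question, in survey order.
completion_rates = df[question_cols].notna().mean().round(2)
print(completion_rates)

# A steady decline or a sharp drop partway through suggests the survey
# is too long or that a confusing item is causing abandonment.
```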
Avoid poorly constructed questions, including:
- Loaded questions (contain assumptions)
- Leading questions (bias in the response)
- Double-barreled questions (ask two things at once)
- Vague questions (unclear or broad)
Inappropriate response formats—such as inconsistent scales—can also hinder data cleaning and interpretation (Dillman et al., 2014).
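When inconsistent scales do slip through, they usually have to be reconciled before analysis. The sketch below is a hypothetical Python/pandas example (the column names, labels, and mapping are assumptions, not drawn from Dillman et al.); it maps mixed Likert labels onto a single 1-to-5 numeric scale and flags anything it cannot map for review.

```python
import pandas as pd

# Hypothetical data: the same 5-point question was fielded twice with
# different response labels (a satisfaction scale and an agreement scale).
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "answer": ["Very satisfied", "Somewhat dissatisfied",
               "Strongly agree", "Neither agree nor disagree"],
})

# Map every label variant onto one consistent 1-5 numeric scale.
scale_map = {
    "Very dissatisfied": 1, "Strongly disagree": 1,
    "Somewhat dissatisfied": 2, "Disagree": 2,
    "Neutral": 3, "Neither agree nor disagree": 3,
    "Somewhat satisfied": 4, "Agree": 4,
    "Very satisfied": 5, "Strongly agree": 5,
}
responses["score"] = responses["answer"].map(scale_map)

# Flag any labels the map does not recognize so they are reviewed
# rather than silently turned into missing values.
print(responses[responses["score"].isna()])
```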
3. Ignoring Health Literacy
Mistake: Writing in academic or clinical language that confuses or alienates respondents.
Use plain, accessible language. Avoid jargon, and make sure the survey is culturally and linguistically relevant. IES (2023) recommends pilot testing with members of your intended audience to identify confusing terms or unclear instructions.
Inclusive surveys also consider a range of lived experiences. Ask: Is this question necessary? Is the language respectful? Are the identity and demographic options expansive enough?
4. Disseminating Too Many Surveys
Mistake: Repeatedly surveying the same groups without coordination.
Communities, especially those historically marginalized, are often over-surveyed and under-informed. This leads to survey fatigue, disengagement, and declining response rates. When communities are continuously asked for input without seeing change, it can feel extractive and transactional.
IES recommends reviewing prior surveys, leveraging publicly available data, and coordinating with partners to prevent duplication. If a new survey is necessary, clearly communicate its purpose, intended use, and how results will be shared. Respecting participants’ time and voice is essential for maintaining trust and ensuring long-term engagement.
5. Skipping the Feedback Loop
Mistake: Collecting responses without reporting results, or the actions they informed, back to participants.
Sharing findings with respondents and the wider community closes the loop. It shows that their input was heard and used, counters the sense that data collection is extractive, and makes future engagement easier.
Final Thoughts
An effective survey is not just a data tool—it’s a trust-building tool. When designed with intention, clarity, and inclusivity, surveys produce not only better data but stronger relationships and more equitable outcomes.
Citations
- Institute of Education Sciences. (2023). Creating Effective Surveys: Best Practices in Survey Design. https://ies.ed.gov/rel-west/2025/01/handout-creating-effective-surveys-best-practices-survey-design
- Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Wiley.
Stay Informed. Stay Connected.
Join our listserv for exclusive updates, practical tools, and curated resources on survey methodologies, evaluation frameworks, and best practices to strengthen your public health data collection and community engagement strategies.