
How to Write Great Interview Questions

Learn to craft open-ended, neutral interview questions that surface genuine user insights instead of confirmation bias.

The questions you ask in a user interview determine the quality of insights you get out. Ask the wrong questions and you'll collect an hour of data that tells you nothing useful — or worse, data that misleads you.

This guide covers the principles, techniques, and specific examples you need to write interview questions that surface genuine, actionable insights.

The Golden Rule: Ask About Behavior, Not Opinion

The single most important principle in interview question design is this: ask people what they do, not what they think they would do.

Human beings are remarkably bad at predicting their own future behavior. Research from the Journal of Consumer Research shows that stated purchase intentions correlate with actual purchasing behavior only about 30% of the time. When you ask "Would you use a feature that does X?" you're asking someone to simulate a future scenario — and their answer is essentially worthless.

Instead, ground your questions in past behavior and concrete experiences:

  • Bad: "Would you find it useful to have a dashboard that shows team progress?"
  • Good: "Tell me about the last time you needed to understand what your team had accomplished. What did you do?"

The good version reveals actual behavior, real frustrations, and genuine workarounds — none of which the opinion-based version would surface.

Open vs. Closed Questions

Open-ended questions invite elaboration. Closed questions invite a single word or phrase.

| Question Type | Example | What You Get |
|---|---|---|
| Closed | "Do you use project management software?" | "Yes" or "No" — a dead end |
| Open | "Walk me through how you keep track of your projects." | A rich description of actual workflow |
| Closed | "Is collaboration important to you?" | "Yes" (obviously) — no insight |
| Open | "Tell me about a recent project where you had to coordinate with others. How did that go?" | Specific challenges, tools, and pain points |

As a rule, 80% or more of your interview questions should be open-ended. You can use a few closed questions for factual setup ("How many people are on your team?"), but the substance of your interview should invite stories and explanations.

Avoiding Leading Questions

A leading question pushes the participant toward a particular answer. Leading questions are insidious because they often feel natural to the interviewer, yet they contaminate the data.

Research by Elizabeth Loftus demonstrated that even subtle wording changes in questions can significantly alter participant responses and memory recall. This applies directly to user interviews.

Examples of Leading Questions and Rewrites

| Leading Question | What's Wrong | Neutral Rewrite |
|---|---|---|
| "How frustrated do you get when the page loads slowly?" | Assumes frustration exists | "How do you feel about the page load times?" or better: "Walk me through what happens when you open the app." |
| "Don't you think it would be better if we added search?" | Leads toward agreement | "If you can't find what you're looking for, what do you do?" |
| "Most people love this feature — what do you think?" | Social pressure to agree | "What has your experience been with this feature?" |
| "How much time does this clunky process waste?" | "Clunky" and "waste" embed a judgment | "Walk me through this process. How does it fit into your day?" |
| "Would you pay for a premium version with advanced analytics?" | Suggests they should want it | "How do you currently measure results? What works and what doesn't?" |

Probing Techniques

Your initial questions open a topic. Probing questions dig deeper into it. Great interviewers spend more time probing than asking new questions.

The Five Probing Strategies

1. Elaboration probes — Ask for more detail.

  • "Can you tell me more about that?"
  • "What do you mean when you say it was 'complicated'?"

2. Clarification probes — Ensure you understand correctly.

  • "When you say 'the team,' who specifically are you referring to?"
  • "You mentioned switching tools — which tools were those?"

3. Example probes — Ground abstract statements in concrete instances.

  • "Can you give me a specific example of when that happened?"
  • "Think about the last time you did that — walk me through it step by step."

4. Contrast probes — Explore differences and preferences.

  • "You mentioned you used to do it differently. How does the current way compare?"
  • "What would the ideal version of that look like?"

5. Emotional probes — Understand feelings and motivations.

  • "How did that make you feel?"
  • "What was going through your mind at that point?"

Structuring Your Interview Guide

A well-structured guide follows a natural conversational arc:

1. Warm-Up (2–3 questions)

Purpose: Build rapport, gather context, ease into the conversation.

  • "Tell me a bit about your role. What does a typical day look like?"
  • "How long have you been working in [domain]?"

2. Broad Context (2–3 questions)

Purpose: Understand the participant's world before zooming into your topic.

  • "Walk me through how your team currently handles [process]."
  • "What tools or methods do you use for [activity]?"

3. Core Topic (5–8 questions)

Purpose: Deep dive into the specific area you're researching.

This is where you spend most of your time. Frame questions around key research themes, and be prepared to abandon your planned order to follow interesting threads.

4. Reflection (1–2 questions)

Purpose: Let participants step back and reflect on what they've shared.

  • "If you could change one thing about how you [do X], what would it be?"
  • "What would need to be true for [process] to work really well for you?"

5. Wrap-Up (1–2 questions)

Purpose: Capture anything you missed.

  • "Is there anything about [topic] that I should have asked about but didn't?"
  • "Anything else you'd like to share before we wrap up?"

The Mom Test Connection

Rob Fitzpatrick's The Mom Test offers a complementary lens on question design. His core insight is that even your mom will lie to you if you ask bad questions — not out of malice, but because humans naturally want to be encouraging and supportive.

The antidote is to never ask about the future or opinions, and instead focus on specifics from the past. For a deeper dive, see our Mom Test methodology guide.

Question Design for Different Research Goals

| Research Goal | Question Strategy | Example |
|---|---|---|
| Explore a problem | Ask about current behavior and pain points | "Walk me through the last time you had to [task]. What was that like?" |
| Understand motivation | Ask about triggers and decision-making | "What prompted you to start looking for a new solution?" |
| Evaluate a concept | Show, don't tell; ask for reactions | "Here's a rough sketch of an idea. What's your first reaction?" |
| Map a journey | Ask for step-by-step walkthroughs | "Take me through the entire process from when you first realized you needed X to today." |
| Assess value | Ask about current costs and workarounds | "How do you handle this today? What does that cost you in time or effort?" |

Advanced Technique: The Critical Incident Method

Developed by John Flanagan in the 1950s, the critical incident technique asks participants to recall a specific, vivid experience — positive or negative — related to your research topic.

Research in the Journal of Service Research found that critical incident narratives yield richer behavioral and emotional data than general questions about the same topic.

Template: "Think about a time when [situation] went particularly [well/badly]. Can you walk me through exactly what happened, from the beginning?"

This technique produces vivid, story-rich data that's highly quotable in reports and highly useful for identifying specific design opportunities.

Practice Exercise

Take any one of these research questions and write 5 interview questions for it:

  1. "How do product managers decide which features to build next?"
  2. "What motivates people to switch from one email client to another?"
  3. "How do remote teams build trust?"

For each question, check: Is it open-ended? Is it neutral? Does it ask about behavior rather than opinion? Could it be more specific?
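Parts of that checklist can even be roughly mechanized. As an illustration only (not a substitute for judgment), here is a toy Python heuristic that flags yes/no question openers, judgment-loaded words, and hypothetical phrasing; the word lists are assumptions chosen for this sketch, not a validated instrument:

```python
# Toy screener for draft interview questions.
# CLOSED_OPENERS and LOADED_WORDS are illustrative assumptions,
# not an exhaustive or validated list.

CLOSED_OPENERS = ("do ", "does ", "did ", "is ", "are ",
                  "would ", "will ", "have ", "can ")
LOADED_WORDS = ("clunky", "frustrating", "waste", "love", "hate", "obviously")

def flag_question(question: str) -> list[str]:
    """Return a list of warnings for a draft interview question."""
    q = question.strip().lower()
    warnings = []
    if q.startswith(CLOSED_OPENERS):
        warnings.append("closed opener: invites a yes/no answer")
    for word in LOADED_WORDS:
        if word in q:
            warnings.append(f"loaded word: '{word}' embeds a judgment")
    if "would you" in q:
        warnings.append("hypothetical: asks about predicted, not past, behavior")
    return warnings

print(flag_question("Would you pay for a premium version?"))
print(flag_question("Walk me through the last time you tracked a project."))
```

A question that passes a check like this can still be a bad question (and vice versa); the point is simply that closed openers and loaded words are patterns you can learn to spot systematically.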

Tools like Koji can help you brainstorm and refine interview questions before your study, ensuring your guide is balanced and free of common biases.

Further Reading

Choosing a Methodology

An overview of every research methodology Koji supports and when to use each one.

Active Listening Techniques for Research Interviews

Learn how to practice active listening during qualitative interviews to uncover deeper participant insights through reflection, paraphrasing, and strategic silence.

Probing and Follow-Up Questions: Going Deeper in Research Interviews

Learn the different types of probing questions — clarification, elaboration, and contrast — and when to use each to get richer qualitative data from your participants.

Avoiding Bias in Research Interviews

Understand the most common biases in qualitative research — confirmation bias, leading questions, and social desirability — and learn proven techniques to minimize their impact on your data.

User Interview Guide Template: How to Plan, Run, and Analyze Interviews

A practical template for creating user interview guides that produce consistent, actionable insights — whether you run 5 interviews or 500.

The Definitive Guide to User Interviews

Everything you need to plan, conduct, and analyze user interviews that produce actionable research insights.

The Complete Guide to Thematic Analysis

Learn how to systematically analyze qualitative data using Braun and Clarke's six-phase thematic analysis framework.

Jobs-to-Be-Done Interview Guide

Learn the JTBD interview methodology to uncover why customers switch products and what progress they're trying to make.

The Mom Test: How to Talk to Customers Without Being Misled

Learn Rob Fitzpatrick's Mom Test methodology to ask questions that even your mother can't lie to you about.

Qualitative vs. Quantitative Research: When to Use Each Method

A clear breakdown of qualitative and quantitative research — what each method reveals, when to use each, and how to combine them for the most complete picture of your users.

UX Research Process: A Complete Framework for 2026

A practical end-to-end guide to the UX research process — from defining your research question to activating insights that actually change product decisions.

Focus Group Research: The Complete Guide

Learn when to use focus groups, how to design and moderate them, and when AI-powered individual interviews are a better fit.

Customer Discovery Interviews: The Complete Guide

Learn how to conduct customer discovery interviews to validate your product ideas before building. Covers Steve Blank methodology, question frameworks, sample sizes, and common mistakes.