The Definitive Guide to User Interviews
Everything you need to plan, conduct, and analyze user interviews that produce actionable research insights.
User interviews are the backbone of qualitative research. They give you direct access to the thoughts, motivations, and frustrations of the people you're building for — something no survey or analytics dashboard can replicate.
But there's a catch: poorly conducted interviews produce misleading data that can send your team in the wrong direction. This guide will teach you how to run interviews that surface genuine insights, not just confirmation of what you already believe.
What Is a User Interview?
A user interview is a one-on-one conversation between a researcher and a participant, designed to explore the participant's experiences, behaviors, needs, and motivations related to a specific topic. Unlike surveys, interviews allow you to follow up on unexpected responses, probe deeper into reasoning, and observe the emotional weight behind answers.
According to the Nielsen Norman Group, user interviews are one of the most frequently used UX research methods, employed by over 80% of UX professionals. Their power lies in their flexibility — you can adapt your questions in real time based on what you're hearing.
When Should You Use Interviews?
Interviews are ideal when you need to:
- Explore a problem space you don't yet understand well
- Understand motivations behind behaviors you've observed in analytics
- Validate or challenge assumptions before investing in a solution
- Gather rich context around user workflows and pain points
- Uncover needs users can't articulate directly, by approaching them through indirect questioning
They're less useful when you need statistically significant data, when you're testing usability of a specific interface (use usability testing instead), or when you need to measure preferences across a large population (use surveys).
Types of User Interviews
Not all interviews serve the same purpose. Here's how the main types compare:
| Interview Type | Best For | Structure Level | Typical Duration | When to Use |
|---|---|---|---|---|
| Structured | Consistent data collection across many participants | High — fixed questions in fixed order | 20–30 min | Comparing responses across a defined set of questions |
| Semi-structured | Balancing consistency with exploration | Medium — prepared guide with flexibility | 30–60 min | Most research projects; allows follow-up on surprises |
| Unstructured | Deep exploration of a new domain | Low — topics but no fixed questions | 45–90 min | Early discovery when you don't know what you don't know |
| Contextual inquiry | Understanding real-world behavior | Medium — observation plus questions | 60–120 min | When environment and workflow matter |
| JTBD interview | Understanding purchase/adoption decisions | Medium — timeline-based structure | 45–60 min | Product strategy and positioning |
For most product research, semi-structured interviews hit the sweet spot. You get the consistency of a prepared guide with the freedom to explore unexpected threads. Learn more about different approaches in our choosing a methodology guide.
Planning Your Interviews
Step 1: Define Your Research Questions
Before writing a single interview question, clarify what you're trying to learn at the project level. Research questions are not the same as interview questions — they're the strategic questions your study aims to answer.
Good research questions are specific enough to be answerable but broad enough to leave room for surprise:
- "How do product managers currently prioritize feature requests, and what frustrations do they experience in that process?"
- "What triggers someone to switch from a free tool to a paid solution for project management?"
Step 2: Identify Your Participants
The quality of your insights depends entirely on talking to the right people. You need participants who have relevant, recent experience with the topic you're studying.
According to research published by Nielsen Norman Group, 5 participants will uncover approximately 85% of usability problems in usability testing. For interview-based research exploring attitudes and behaviors, the numbers are different — most researchers find that 8 to 12 interviews are needed to reach thematic saturation for a reasonably homogeneous group.
For more on recruitment, see our guide on finding research participants, and for guidance on sample size, read how many interviews are enough.
Step 3: Write Your Interview Guide
Your interview guide is a structured document that outlines the topics and questions you want to cover. It's a guide, not a script — you should feel comfortable departing from it when a participant says something interesting.
A strong guide includes:
- Warm-up questions (2–3 minutes): Build rapport, confirm background
- Core questions (20–40 minutes): Your main research topics, roughly 8–12 open-ended questions
- Wrap-up (5 minutes): Catch-all questions, thank the participant
For detailed guidance on crafting questions, read how to write great interview questions.
Step 4: Set Up Logistics
Decide on:
- Format: Remote (video call) vs. in-person. Research from the Journal of Usability Studies suggests that remote interviews produce comparable data quality for most topics, with higher participant recruitment rates.
- Recording: Always get consent. Record audio and video if possible — transcripts are essential for analysis.
- Duration: 30–60 minutes is the sweet spot. Shorter and you can't go deep; longer and fatigue sets in.
- Incentives: Appropriate compensation shows respect for participants' time. See finding research participants for guidance on incentive amounts.
Conducting the Interview
Opening (First 5 Minutes)
The first few minutes set the tone for the entire conversation. Your goals are to:
- Make the participant comfortable. Small talk matters. Ask how their day is going.
- Explain the purpose. "We're trying to understand how people approach [topic]. There are no right or wrong answers — we're learning from your experience."
- Get consent. "Is it okay if I record this conversation? The recording is only for our research team."
- Set expectations. "This will take about 45 minutes. I'll be asking questions and mostly listening. Feel free to say if any question doesn't make sense."
The Core Conversation
This is where your skill as an interviewer matters most. Here are the techniques that separate great interviews from mediocre ones:
Follow the energy. When a participant's voice changes — they speed up, slow down, show frustration or excitement — that's a signal. Follow it. "You seemed really frustrated when you mentioned that. Can you tell me more?"
Use the 5-second rule. After a participant finishes an answer, wait 5 seconds before asking your next question. People often fill silence with their most honest, unfiltered thoughts.
Probe with purpose. Generic follow-ups like "tell me more" are fine occasionally, but targeted probes are more effective:
- "You mentioned X — what did you mean by that?"
- "Can you walk me through a specific time that happened?"
- "What happened next?"
- "How did that make you feel?"
Ask for stories, not opinions. "Tell me about the last time you had to onboard a new team member" produces richer data than "What do you think about onboarding?"
Stay neutral. Don't react to answers with "That's great!" or "Oh no." Use neutral acknowledgments: "That's helpful," "I see," or a simple nod.
Closing (Final 5 Minutes)
End with catch-all questions that give participants space to share things you didn't think to ask:
- "Is there anything about [topic] that I should have asked about but didn't?"
- "Is there anything else you'd like to share?"
Thank them genuinely. Tell them what happens next.
Analyzing Your Interviews
Raw interview recordings are useless until you transform them into structured insights. This is where many research projects stall — analysis is time-consuming and cognitively demanding.
Research from the University of Auckland found that manual qualitative coding typically takes 5 to 8 hours per hour of interview recording. For a study of 10 interviews averaging 45 minutes each, that's 37 to 60 hours of analysis work.
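The arithmetic behind that estimate is worth making explicit, since it's how you'd budget analysis time for your own study. A minimal sketch (the 5-8x multiplier range is the figure cited above; the function name and defaults are illustrative, not from any standard library):

```python
def estimate_analysis_hours(num_interviews, avg_minutes, low_rate=5, high_rate=8):
    """Estimate manual coding time for a qualitative study.

    low_rate / high_rate are hours of analysis per hour of recording,
    defaulting to the 5-8x range cited above. Adjust for your team's pace.
    """
    recording_hours = num_interviews * avg_minutes / 60
    return recording_hours * low_rate, recording_hours * high_rate

# The example from the text: 10 interviews averaging 45 minutes each.
low, high = estimate_analysis_hours(num_interviews=10, avg_minutes=45)
print(f"{low:g}-{high:g} hours")  # prints "37.5-60 hours"
```

In other words, 7.5 hours of recording translates into roughly a week and a half of full-time coding work, which is why teams often batch analysis or reach for tooling support.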
The analysis process typically follows these steps:
1. Transcribe all recordings. Use automated transcription and then review for accuracy.
2. Read through all transcripts to get an overall sense of the data.
3. Code the data by tagging meaningful segments with labels.
4. Identify themes by grouping related codes together.
5. Synthesize themes into findings that answer your research questions.
For a complete breakdown of the analysis process, see our thematic analysis guide and affinity mapping guide.
Platforms like Koji can significantly accelerate this process by automatically transcribing interviews and using AI to surface initial themes — turning weeks of manual coding into hours of guided analysis. This lets you spend more time interpreting what the patterns mean rather than hunting for them.
Common Mistakes to Avoid
| Mistake | Why It Hurts | What to Do Instead |
|---|---|---|
| Asking leading questions | Participants tell you what you want to hear | Use neutral framing; see writing interview questions |
| Talking too much | You learn nothing while talking | Aim for a 20/80 split (you/them) |
| Not recording | You'll forget critical nuances within hours | Always record with consent |
| Skipping the pilot | Your guide may have confusing or redundant questions | Test with 1–2 participants first |
| Recruiting the wrong people | Insights don't apply to your actual users | Define screening criteria carefully |
| Analyzing too late | Context fades quickly from memory | Debrief after each session |
Building an Interview Practice
The best research teams don't treat interviews as one-off events — they build a continuous practice of talking to users. Here's how to get there:
- Establish a regular cadence. Even 2–3 interviews per month keeps you connected to your users.
- Create a participant pool. Maintain a list of people who've consented to be contacted for future research.
- Build organizational buy-in. Invite stakeholders to observe interviews (with participant consent). Nothing converts skeptics faster than hearing a user struggle with something the team thought was obvious.
- Document and share. Make findings accessible to the whole team, not locked in a researcher's notebook.
Next Steps
Ready to start interviewing? Here's your learning path:
- How to Write Great Interview Questions — craft questions that surface honest, useful answers
- Finding Research Participants — recruit the right people efficiently
- How Many Interviews Are Enough? — know when you've reached saturation
- Thematic Analysis Guide — turn raw transcripts into structured insights