{"site":{"name":"Koji","description":"AI-native customer research platform that helps teams conduct, analyze, and synthesize customer interviews at scale.","url":"https://www.koji.so","contentTypes":["blog","documentation"],"lastUpdated":"2026-05-09T13:02:14.296Z"},"content":[{"type":"documentation","id":"2c9328bb-e839-4cf2-a09e-706c3c2b60d7","slug":"solution-interviews-guide","title":"Solution Interviews: How to Validate Product Ideas Before You Build","url":"https://www.koji.so/docs/solution-interviews-guide","summary":"A solution interview is the conversation you run after problem interviews confirm pain is real and before you build — you put a low-fidelity prototype in front of qualified prospects to test for comprehension, excitement, pull, and behavioral commitment. The 5-section script (re-confirm problem, present solution, capture unguided reaction, test pricing and commitment, surface alternatives) takes ~30 minutes per interview. AI interview platforms like Koji let founders run 20+ solution interviews per week without scheduling calls.","content":"**Solution interviews are the conversation you have *after* problem interviews and *before* you write a line of code: you put a low-fidelity version of your proposed solution in front of qualified prospects to test whether it solves a real problem at a price they will actually pay.** Steve Blank's customer development model splits these into two distinct phases for a reason — problem interviews tell you whether the pain is real, solution interviews tell you whether your answer is the right one. 
Skipping either step is the most common reason validated problems still produce dead products.\n\nThis guide covers when to run solution interviews, the structure of a high-signal solution interview, the four signals you are looking for, and how to run 20+ solution interviews in a week using Koji's AI moderator instead of stacking your calendar with 30-minute calls.\n\n## What is a solution interview?\n\nA solution interview is a structured conversation in which you present a hypothetical or low-fidelity solution to a customer who has already confirmed they have the underlying problem. The goal is *not* to sell. The goal is to learn whether the solution sparks the kind of intense, behavioral interest that predicts purchase — and to surface the objections that will kill conversion if you ignore them.\n\nSolution interviews live inside Steve Blank's customer development methodology, the four-step process — Customer Discovery → Customer Validation → Customer Creation → Company Building — that underpins the lean startup. Inside the discovery step, the sequence is:\n\n1. **State the hypothesis** — who has the problem, what pain it causes.\n2. **Test the problem** — problem interviews to confirm the pain is real, frequent, and urgent.\n3. **Test the solution** — solution interviews to confirm your proposed answer fits.\n\nThe artifacts you bring to a solution interview are deliberately rough: a clickable prototype, a static landing page, a hand-drawn user flow, or even a written description. Fidelity is the enemy here. The lower the fidelity, the easier it is for the customer to push back on the underlying idea instead of getting distracted by visual polish.\n\n## When to run solution interviews\n\nRun solution interviews after at least 10 problem interviews have produced strong signal that the pain is real, but before you have invested any meaningful engineering time. 
The right moment is when:\n\n- You can describe the problem in the customer's own words.\n- You have at least one specific solution hypothesis (not \"an app for X\" — a concrete proposal with at least three named features).\n- You can list two or three plausible alternatives the customer could pick instead.\n- You have not yet committed to building.\n\nIf you have already shipped, you are past solution interviewing — what you need now is [usability testing](/docs/usability-testing-guide), [concept testing](/docs/concept-testing-methodology), or [continuous discovery](/docs/continuous-discovery-user-research).\n\n## Solution interview structure (the 5-section script)\n\nA solid solution interview runs about 30 minutes and follows this arc:\n\n### 1. Re-confirm the problem (5 min)\n\nDon't assume the participant remembers the problem context. Open by asking them to walk through the last time they faced the underlying pain. This re-anchors the conversation in their reality before you introduce your solution. If they struggle to recall a specific instance, you're talking to the wrong person — politely end the interview.\n\n### 2. Present the solution at low fidelity (5 min)\n\nShow the prototype, sketch, or description. Keep it brief, neutral, and free of marketing language. Resist the urge to \"sell\" or list benefits — describe what the solution does and let the participant react. Two sentences is plenty.\n\n### 3. Capture the unguided reaction (10 min)\n\nThis is the most valuable section. Open with \"Tell me what you're thinking\" and then shut up. Long silences are fine. The first 60 seconds of unguided reaction is where genuine signal lives — once you start prompting, you're leading. Probe only on specifics (\"you said it's 'interesting' — interesting how?\"), not on judgments.\n\n### 4. Test pricing and commitment (5 min)\n\nThis is what separates solution interviews from concept tests. 
You are not asking \"would you buy this?\" — you are testing for behavioral commitment. Three questions in increasing order of power:\n\n- **Price anchor:** \"What would you expect something like this to cost?\"\n- **Commitment ask:** \"If I had this ready in 30 days, would you give me an email address to be notified at launch?\"\n- **Pre-order ask (advanced):** \"Would you put a $1 deposit on it today?\"\n\nA verbal \"yes\" is worth almost nothing. An email address is worth a little. A credit card is worth a lot. The whole point of the pricing section is to escalate toward an action that costs the participant something, however small.\n\n### 5. Surface alternatives and objections (5 min)\n\nClose by asking what they would do if your solution didn't exist, what would stop them from using it, and who else they'd need to convince. The answers tell you about competitive alternatives (often surprising) and adoption blockers you haven't thought of.\n\n## The four signals you are looking for\n\nA solution interview produces four signals, in roughly increasing order of value:\n\n1. **Comprehension.** Did the participant understand what the solution does without you explaining twice? If not, your positioning is broken.\n2. **Excitement.** Did they ask follow-up questions or volunteer use cases? Polite interest is not excitement.\n3. **Pull, not push.** Did *they* try to extract more information from you, or did you have to keep the conversation alive? Pull is the strongest qualitative signal of fit.\n4. **Behavioral commitment.** Did they hand over an email, a deposit, a calendar invite, or an introduction to a buyer? This is the only signal that survives optimism bias.\n\nIf you walk away from 10 solution interviews with comprehension and politeness but no pull and no commitment, your solution does not fit — and no amount of polishing the wireframes will change that.\n\n## Solution interview vs. problem interview vs. 
concept test\n\n| | Problem Interview | Solution Interview | Concept Test |\n|---|---|---|---|\n| **When** | Before any solution exists | After problem is validated, before build | After build, before launch |\n| **You bring** | Open questions about the problem | Low-fidelity prototype + hypothesis | Working concept or feature |\n| **Looking for** | Real, frequent, painful problem | Pull and commitment signals | Comprehension and usability |\n| **Sample** | 10–20 | 15–30 | 50–200 |\n| **Risk if skipped** | Build for a fake problem | Build the wrong solution | Launch a confusing product |\n\nFor a deeper read on the adjacent stages, see [customer discovery interviews](/docs/customer-discovery-interviews) (problem interviews) and [concept testing methodology](/docs/concept-testing-methodology) (post-build).\n\n## How to run 20+ solution interviews in a week with Koji\n\nThe traditional knock against solution interviews is calendar gravity. Booking 20 separate 30-minute calls eats two weeks of founder time and self-selects toward people willing to take a stranger's call — usually not your target customer.\n\nKoji removes the scheduling layer entirely:\n\n- **Embed your prototype or solution mock** as context inside the study so the AI moderator can describe it to each participant in their language.\n- **Run text or voice mode.** Voice produces richer reactions; text scales to participants who don't want a call. See [Voice vs Text Interview](/docs/voice-vs-text-interviews) for when to use each.\n- **Three-layer AI probing** ensures every \"interesting\" gets followed by \"interesting how, specifically?\" — the question that converts polite interest into real signal. 
The mechanics are documented in the [AI Probing Guide](/docs/ai-probing-guide).\n- **Six [structured question types](/docs/structured-questions-guide)** let you embed price anchoring (scale), commitment asks (yes/no), and alternative ranking (ranking) without breaking the conversational flow.\n- **Personalized links** from your CRM mean every interview opens with the participant's name, company, and prior context already loaded.\n- **Real-time results** mean you don't wait for the field period to close — the report updates as each interview completes, so you can iterate the prototype mid-study and run a second batch by Friday.\n\nA founding team using Koji can ship a tight problem-then-solution sequence — 12 problem interviews, 20 solution interviews, opportunity scoring — in two weeks of elapsed time. Two weeks of conversation is less time than most teams spend deciding what to build.\n\n## Common solution interview mistakes\n\nSolution interviews fail for predictable reasons. The big four:\n\n- **Showing high-fidelity mocks.** Beautiful designs invite design feedback instead of solution feedback. Stay rough.\n- **Pitching instead of presenting.** \"And the best part is…\" is a tell. Describe and then shut up.\n- **Skipping the commitment ask.** Verbal interest without behavioral commitment is the most-cited source of false positives in early-stage validation.\n- **Talking to the wrong people.** Solution interviews are only as honest as your screener. 
If you can't qualify against the underlying problem, your participants will be polite about a solution they will never need.\n\nFor a broader catalog of failure modes, see [User Research Mistakes](/docs/user-research-mistakes).\n\n## Related Resources\n\n- [Structured Questions in AI Interviews](/docs/structured-questions-guide) — the six question types Koji combines to capture pricing, commitment, and ranking signals\n- [Customer Discovery Interviews](/docs/customer-discovery-interviews) — problem-side interviews that precede solution interviews\n- [Customer Development Methodology](/docs/customer-development-methodology) — Steve Blank's four-step parent framework\n- [The Mom Test Methodology](/docs/mom-test-methodology) — how to ask honest customer questions without leading\n- [Concept Testing Methodology](/docs/concept-testing-methodology) — what comes after solution interviews when you have a working concept\n- [Prototype Testing and Concept Validation](/docs/prototype-testing-concept-validation) — turning a validated solution into a tested prototype","category":"Research Methods","lastModified":"2026-05-09T03:21:32.788639+00:00","metaTitle":"Solution Interviews: Validate Product Ideas Before You Build (2026)","metaDescription":"How to run solution interviews — Steve Blank's second customer-development phase — to test if your prototype solves a real problem before writing code. Structure, scripts, and AI-moderated scaling.","keywords":["solution interviews","validate product idea","steve blank customer development","idea validation","solution validation","lean startup","prototype validation","concept validation interviews","startup validation","customer validation"],"aiSummary":"A solution interview is the conversation you run after problem interviews confirm pain is real and before you build — you put a low-fidelity prototype in front of qualified prospects to test for comprehension, excitement, pull, and behavioral commitment. 
The 5-section script (re-confirm problem, present solution, capture unguided reaction, test pricing and commitment, surface alternatives) takes ~30 minutes per interview. AI interview platforms like Koji let founders run 20+ solution interviews per week without scheduling calls.","aiPrerequisites":["customer-discovery-interviews","customer-development-methodology"],"aiLearningOutcomes":["Distinguish solution interviews from problem interviews and concept tests","Structure a 5-section solution interview script","Test for behavioral commitment instead of verbal interest","Read the four signals: comprehension, excitement, pull, commitment","Run 20+ solution interviews in a week using AI moderation"],"aiDifficulty":"intermediate","aiEstimatedTime":"11 min read"}],"pagination":{"total":1,"returned":1,"offset":0}}