Introduction
The Mom Test (by Rob Fitzpatrick) is the gold standard for customer interviews. It teaches you how to ask questions that uncover real problems without leading the witness, introducing bias, or collecting polite lies. This guide gives you 20 real interview examples, showing what works, what fails, and how to get actionable insights for your indie hacker journey.
Why does this matter? According to the Lean Startup methodology, most startups fail because they build products nobody wants. The Mom Test helps you avoid this by teaching you to validate assumptions before investing time and money.
What is the Mom Test?
The Mom Test is a framework for conducting customer interviews that reveals the truth about whether people actually care about your idea. The core principle: you should be able to ask your mom about your idea and get honest feedback (hence the name).
The book “The Mom Test” by Rob Fitzpatrick has become essential reading for indie hackers, founders, and product managers worldwide. It’s about cutting through the noise of polite feedback to find real, actionable insights.
Key Problem It Solves
When you pitch your idea to friends and family, they’re likely to be nice. They don’t want to hurt your feelings. This is called a false positive: someone says they like your idea when they don’t actually care enough to use it or pay for it. The Mom Test eliminates these false positives by teaching you to ask about past behavior instead of hypothetical futures.
The Three Core Principles of the Mom Test
1. Talk About Their Life Instead of Your Idea
Instead of pitching your solution, ask questions about the person’s current situation, workflow, and challenges.
Why it works: People are naturally defensive when pitched to. They become less honest and more polite. When you ask about their life, they relax and share real stories.
Example shift:
- ❌ “I’m building a time-tracking app. Do you think you’d use it?”
- ✅ “How do you currently track your time spent on projects? Walk me through your process.”
2. Ask About Specifics in the Past Instead of Generics or Opinions About the Future
Past behavior is the best predictor of future behavior. Generic questions and hypotheticals invite lies; specific past questions reveal truth.
Why it works: People are notoriously bad at predicting their own behavior. Asking “Would you pay for X?” gets false positives. Asking “Have you paid for similar tools?” reveals their actual spending patterns.
Example shift:
- ❌ “Do you think scheduling software is useful?”
- ✅ “Tell me about the last time you struggled with scheduling. What was the problem, and how did you solve it?”
3. Talk Less and Listen More
Your job is not to convince or pitch. Your job is to listen and learn.
Why it works: When you talk, you’re reinforcing your assumptions. When you listen, you learn new things. Great interviewers spend 80% of the time listening and 20% talking.
Pro tip: If you find yourself talking more than 30% of the interview, you’re doing it wrong.
The Mom Test Principles Deep Dive
Avoid Opinions and Hypotheticals
Definition: A hypothetical is a “what if” question about a future scenario that hasn’t happened yet.
- ❌ “Would you use this?” (hypothetical, opinion-based)
- ✅ “How are you currently solving this?” (past behavior)
Why opinions don’t work: People form opinions quickly and without much thought. They’re shaped by how you framed the question. If you ask nicely, they’ll say yes. Opinions are unreliable data.
Ask About Past Behavior, Not Future Intent
Definition: Future intent is what someone says they will do. Past behavior is what they actually did.
Research shows there’s often a massive gap between stated intent and actual behavior. This gap is called the intention-action gap. Relying on future intent leads to building products nobody uses.
Examples of the gap:
- Someone says they’d pay $20/month for your tool, but when you ask “Have you ever paid for a similar tool?”, they say “No, I usually find free alternatives.”
- Someone says they’d use your app daily, but when you ask “How often do you face this problem now?”, they say “Once a month.”
Don’t Pitch Your Idea: Listen for Pain
Your goal is to understand the person’s world, not to sell them on your idea.
Why it matters: If you pitch, you’re testing whether they like your solution. That’s not useful. You need to know if they have a real problem that’s worth solving.
The right mindset: “I’m here to learn about your life, not to get your approval of my idea.”
Dig for Three Key Metrics
When evaluating whether a problem is worth solving, dig into:
- Frequency: How often does the problem occur?
  - Daily? Weekly? Quarterly? Once a year?
  - The more frequent, the more urgent.
- Urgency: How painful is the problem when it occurs?
  - Do they drop everything to solve it?
  - Or is it a mild inconvenience?
- Budget: How much are they willing to spend?
  - Have they paid for similar solutions?
  - What was the highest price they paid?
Let Them Talk; You Take Notes
Actionable tip: Use a voice recorder (with permission) or detailed notes. This serves two purposes:
- You capture exact quotes and stories (which are gold for marketing later)
- You’re forced to listen instead of planning your next question
Recording tip: Always ask permission first: “Would you mind if I recorded this? It helps me remember the details accurately.”
20 Interview Examples: Good vs Bad Questions
Example 1: The Invoice Tracking Problem
❌ BAD: “Would you pay for an app that tracks your freelance invoices?”
Why it fails:
- Hypothetical (“would you…”)
- Leads the witness (you’ve pre-loaded the solution)
- Invites polite lies (they don’t want to say “no”)
- No insight into frequency, pain, or current workflow
What you’ll hear: “Sure, that sounds useful!” What they’ll do: Ignore your app even if you build it.
✅ GOOD: “How do you currently track your freelance invoices? Walk me through your process from start to finish.”
Why it works:
- Open-ended (they decide what to share)
- Anchored in their actual workflow
- Reveals pain points naturally
- Shows frequency (“I do this weekly…” or “…once a year”)
- Uncovers current solutions (spreadsheet? pen and paper? accounting software?)
Follow-ups:
- “What’s the most annoying part of that process?”
- “How much time do you spend on this each week?”
- “Have you ever tried a different approach? What happened?”
- “Have you ever considered paying for a tool to handle this?”
Example 2: Feature Validation
❌ BAD: “Do you think this feature is useful?”
Why it fails:
- Opinion-based (not actionable)
- Too vague (“useful” means different things to different people)
- No context about their real needs
✅ GOOD: “Tell me about the last time you needed this feature. What did you do?”
Why it works:
- Anchors in a real scenario
- Reveals current solutions
- Shows urgency and frequency
- Actionable insight
Follow-ups:
- “How often does this situation come up?”
- “What was frustrating about how you solved it?”
- “If you could wave a magic wand, what would happen instead?”
Example 3: Social Media Management
❌ BAD: “Would you use a tool that automates your social media posts?”
✅ GOOD: “How do you currently manage your social media posting? What’s the hardest part?”
Real conversation example:
You: “How do you currently manage your social media posting?”
Them: “I use Buffer. I usually batch my content on Sunday and schedule it for the week.”
You: “Walk me through that process. How long does it take?”
Them: “About 2 hours. The annoying part is coming up with captions for each platform. Buffer doesn’t help much with that.”
You: “Have you tried any other tools for writing captions?”
Them: “No, I just do it manually. I’ve thought about hiring someone, but I can’t justify the cost.”
Insight: The real pain is writing platform-specific captions, not scheduling. A caption-generation tool might be more valuable than another scheduler.
Example 4: Pricing Research
❌ BAD: “If this was free, would you sign up?”
Why it fails:
- Everyone says yes to free
- Tells you nothing about willingness to pay
- Hypothetical
✅ GOOD: “Have you ever paid for a tool like this? What made you choose it?”
Why it works:
- Reveals actual spending patterns
- Shows how they evaluate tools
- Indicates the price ceiling
Follow-ups:
- “How much did it cost?”
- “Was it worth the price?”
- “What would have made you switch to a cheaper alternative?”
- “What features justify the price you pay?”
Real example conversation:
You: “Have you ever paid for a scheduling tool?”
Them: “Yeah, I used Later for about 6 months. Paid $25/month.”
You: “What made you choose it over the free options?”
Them: “Better analytics and mobile app. But honestly, I switched back to the free version because the analytics weren’t detailed enough to justify the cost.”
You: “What would have justified staying on the paid plan?”
Them: “If it showed me which posts drive the most traffic to my blog. That’s what I really care about.”
Insight: Your pricing power is limited unless you build deep analytics. Free alternatives are strong competition at $25/month.
Example 5: Workflow Discovery
❌ BAD: “Do you like the idea of a productivity dashboard?”
✅ GOOD: “Walk me through your daily workflow. Where do you lose the most time?”
Why it works:
- Open-ended discovery
- Reveals actual pain points
- Shows frequency and impact
- Uncovers competing solutions
Example 6: Word-of-Mouth & Referrals
❌ BAD: “Would you recommend this to your friends?”
✅ GOOD: “Have you ever recommended a tool for this problem? What was it? Why did you recommend it?”
Why it works:
- Real behavior, not hypothetical
- Shows what makes a tool recommendation-worthy
- Reveals network effects potential
Example 7: Usage Frequency
❌ BAD: “Is this something you’d use every day?”
✅ GOOD: “How often do you encounter this problem?”
Why it works:
- More specific
- Reveals actual frequency
- Less leading
Follow-ups:
- “When was the last time you faced this?”
- “How much time does it take to solve?”
Example 8: Problem Severity
❌ BAD: “Do you think this is a big problem?”
✅ GOOD: “Tell me about the last time this problem caused you trouble. What happened?”
Why it works:
- Anchors in real scenario
- Reveals consequences and impact
- Shows emotional weight
Follow-ups:
- “How much did that cost you?”
- “What was the worst-case scenario?”
- “How long did it take to resolve?”
Example 9: Willingness to Pay
❌ BAD: “Would you pay $10/month for this?”
Why it fails:
- Hypothetical
- You’ve anchored them on a price
- Most people say no to any specific price
✅ GOOD: “Have you ever paid for a solution to this? What was the deciding factor?”
Why it works:
- Shows real purchasing behavior
- Reveals what features justify spending
- Indicates price sensitivity
Follow-ups:
- “What’s the most you’ve paid for a similar tool?”
- “At what price would you consider it too expensive?”
- “What would make it worth double the price?”
Example 10: Tool Integration
❌ BAD: “Do you want more integrations?”
✅ GOOD: “Which tools do you use together? How do you connect them now?”
Why it works:
- Reveals the actual tech stack
- Shows manual workarounds
- Identifies valuable integration opportunities
Real example:
You: “Which tools do you use in your daily workflow?”
Them: “Notion, Slack, Google Calendar, and Stripe.”
You: “How do you move information between them?”
Them: “I manually copy data from Stripe into Notion. Takes about 15 minutes each week.”
Insight: A Stripe-to-Notion integration could save 15 minutes weekly. That’s a valuable feature.
Example 11: Platform Preference
❌ BAD: “Would you use a mobile app for this?”
✅ GOOD: “Where do you usually solve this problem: on desktop, mobile, or somewhere else?”
Why it works:
- Anchored in real behavior
- Shows platform usage patterns
- More specific
Example 12: Missing Features
❌ BAD: “Is this feature missing in your current tools?”
✅ GOOD: “What do you wish your current tools could do that they don’t?”
Why it works:
- Open-ended (they think about their real frustrations)
- Not leading
- Reveals feature priorities
Example 13: Switching Behavior
❌ BAD: “Would you switch to a new tool if it was better?”
✅ GOOD: “Have you switched tools for this before? What made you change?”
Why it works:
- Real behavior
- Shows switching costs and barriers
- Reveals what’s “better” to them
Real example:
You: “Have you ever switched to a different invoicing tool?”
Them: “Yeah, I used Wave for years, then switched to FreshBooks.”
You: “What made you switch?”
Them: “Wave couldn’t handle multi-currency invoices properly. That was a real problem for my international clients.”
You: “Did the migration take long?”
Them: “Yeah, about a day to export and re-import everything. That’s why I waited so long to switch.”
Insight: Switching costs matter. Your tool needs to be significantly better to justify the migration effort.
Example 14: Trust & Privacy Concerns
❌ BAD: “Do you care about privacy?”
Why it fails:
- Opinion-based
- Everyone says yes
- Not actionable
✅ GOOD: “Have you ever stopped using a product because of privacy concerns? What happened?”
Why it works:
- Real behavior
- Shows actual priorities
- Reveals trust barriers
Example 15: Performance Impact
❌ BAD: “Would you use this if it was faster?”
✅ GOOD: “How does speed impact your workflow? Have you ever quit a tool because it was slow?”
Why it works:
- Real behavior and impact
- Shows speed sensitivity
- Context about why speed matters
Example 16: Metrics & Measurement
❌ BAD: “Do you want more analytics?”
✅ GOOD: “How do you measure success in this area? What metrics do you track?”
Why it works:
- Reveals what they care about
- Shows data maturity
- Uncovers analytics priorities
Follow-ups:
- “What information would help you make better decisions?”
- “What reports do you run regularly?”
Example 17: Team Dynamics
❌ BAD: “Would you use this for your team?”
✅ GOOD: “How does your team currently solve this problem? What’s the biggest challenge?”
Why it works:
- Team context is critical
- Reveals collaboration barriers
- Shows team size and structure impact
Follow-ups:
- “How many people are involved?”
- “How do you share information between team members?”
- “Who makes the decision to buy tools?”
Example 18: Problem Impact
❌ BAD: “Is this a must-have or nice-to-have?”
✅ GOOD: “What happens if you don’t solve this problem? What’s the impact?”
Why it works:
- Shows real consequences
- Reveals problem criticality
- Emotional weight of the problem
Follow-ups:
- “Has this ever caused you to lose money or time?”
- “What’s the worst-case scenario if you ignore it?”
Example 19: Support Value
❌ BAD: “Would you pay more for premium support?”
✅ GOOD: “Have you ever paid extra for support? What made it worth it?”
Why it works:
- Real spending behavior
- Shows support value threshold
- Reveals support needs
Follow-ups:
- “What’s the most you’ve spent on support?”
- “When do you actually need support help?”
Example 20: Customization Needs
❌ BAD: “Do you want more customization?”
✅ GOOD: “How do you customize your workflow now? What’s missing?”
Why it works:
- Shows actual customization usage
- Reveals workflow uniqueness
- Indicates workflow complexity
Tips for Running Great Interviews
1. Listen More Than You Talk
Target: Aim for 20% you talking, 80% them talking.
How to stay quiet:
- Ask open-ended questions (“Tell me about…” vs “Did you…?”)
- Use silence. If they stop talking, don’t fill the gap. They’ll often keep going.
- Take detailed notes instead of planning your next question
- Avoid saying “Yeah, that’s what I thought…” (this biases them)
2. Take Notes & Record (With Permission)
Benefits of recording:
- You can focus on listening, not transcribing
- You capture exact quotes (invaluable for marketing and validation)
- You can review later and catch details you missed
How to ask: “Would you mind if I recorded this conversation? It helps me remember the details accurately and makes sure I don’t misquote you.”
Tools:
- Otter.ai (automatic transcription)
- Voice Memos (iPhone)
- Google Recorder (Android)
- Zoom (records with permission)
3. Ask for Stories, Not Opinions
Story questions:
- “Tell me about a time when…”
- “Walk me through how you…”
- “What happened next?”
Opinion questions (avoid):
- “Do you think…?”
- “Would you…?”
- “Do you like…?”
4. Follow Up Relentlessly
The power of follow-up questions:
- First answer: surface-level, often polite
- Third answer: the real story
Follow-up templates:
- “Can you tell me more about that?”
- “What happened next?”
- “Why did you decide to…?”
- “How did that make you feel?”
- “What was the alternative?”
- “Did you consider anything else?”
Real example:
You: “How do you currently manage your email?”
Them: “I use Gmail.”
You: “Tell me about the last time Gmail frustrated you.”
Them: “Um, I don’t know. It works fine.”
You: “What about inbox management? How many emails are in your inbox right now?”
Them: “Hundreds. Thousands, probably. I haven’t cleaned it in years.”
You: “What’s the problem with that?”
Them: “I can never find old emails. The search works okay, but it takes time. I probably waste an hour a week searching for old emails.”
Insight: Only after three follow-ups did you find the real pain point.
5. Don’t Pitch: Your Goal is Learning, Not Selling
The wrong mindset: “I’ll explain my idea and see what they think.”
The right mindset: “I’m here to understand their world. Any feedback about my idea is a bonus.”
Why it matters: If you pitch, you frame their answers. They become defensive or polite. You get biased feedback.
What to do instead:
- Ask about their current situation
- Ask about past behavior
- Listen to their stories
- Only mention your idea if they specifically ask “So what are you building?”
6. Interview 10-20 People Before Building Anything
Why this number?
- 1-2 interviews: Biased (selection bias, too early to see patterns)
- 3-5 interviews: Starting to see patterns, but still risky
- 10+ interviews: Confident in problem validation
- 20+ interviews: Deep customer understanding
Diversity matters: Talk to different types of people:
- Different industries
- Different company sizes
- Different experience levels
- Different attitudes toward the problem
Where to find interviewees:
- Twitter: “Looking for indie hackers who use X. Happy to grab a 20-min call.”
- LinkedIn: Direct message relevant people
- Reddit: Engage in relevant communities, ask for interviews
- Product Hunt: Early users often eager to talk
- Slack communities: Join industry-specific Slack groups
- Referrals: Ask each person you interview to refer 1-2 others
7. Pre-Interview Preparation
Before the call:
- Research the person (LinkedIn, their website, Twitter)
- Write out 5-7 question themes (not word-for-word questions)
- Plan for 20-30 minutes (respect their time)
- Test your recording equipment
- Send a calendar invite with Zoom link
Sample opening: “Thanks for taking the time to chat. I’m researching how people in [industry] handle [problem]. I’m not here to pitch anything; I just want to understand your experience. I’ll take notes, and I’d like to record if that’s okay; this helps me be accurate. Any questions before we start?”
8. After the Interview
Within 24 hours:
- Transcribe or review your notes
- Highlight key quotes and insights
- Note their frequency/urgency/budget scores
- Write a 1-page summary of the conversation
- Identify follow-up questions
Create a simple tracker:
- Name / Contact
- Company / Role
- Key Problem Identified
- Frequency (daily/weekly/monthly/yearly)
- Urgency (pain level 1-10)
- Current Solution
- Willingness to Pay
- Key Quotes
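If you prefer code over a spreadsheet, the tracker above maps naturally onto a small data structure. The sketch below is illustrative: all names, fields, and sample entries are made up for the example, so adapt them to your own notes.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Interview:
    """One row of the interview tracker; field names are illustrative."""
    name: str                  # Name / Contact
    role: str                  # Company / Role
    problem: str               # Key problem identified
    frequency: str             # "daily", "weekly", "monthly", or "yearly"
    urgency: int               # Pain level, 1-10
    current_solution: str      # What they use today
    willing_to_pay: bool       # Have they actually paid for a fix before?
    quotes: list[str] = field(default_factory=list)  # Exact quotes for later

# Hypothetical interviewees, just to show the shape of the data
interviews = [
    Interview("Alice", "Freelance designer", "invoicing", "weekly", 7,
              "spreadsheet", True, ["I waste two hours a month on invoices."]),
    Interview("Bob", "Agency owner", "invoicing", "monthly", 4,
              "FreshBooks", True),
]

# Quick pattern check: how many people mentioned each problem?
counts = Counter(i.problem for i in interviews)
print(counts)
```

Keeping the data structured like this makes the synthesis step below almost mechanical: counting mentions, averaging urgency, and filtering by willingness to pay are one-liners.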
How to Synthesize Your Learnings
After 10+ interviews, you’ll see patterns. Here’s how to extract insights:
1. Group by Problem
- List the 3-5 most common problems mentioned
- How many people mentioned each? (Frequency)
- How painful was it? (Urgency)
- What are they currently doing? (Market opportunity)
2. Build a Feature Roadmap
Don’t build everything. Prioritize by:
- Frequency: How many people need it?
- Urgency: How painful is it?
- Willingness to Pay: Can you monetize it?
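One simple way to combine the three criteria above is to score each candidate feature and sort by the product of the scores. This is a rough sketch, not a method from the book; the features, weights, and numbers are invented for illustration.

```python
# Rough feature prioritization: score = frequency weight * urgency * paid share.
# All features and numbers below are hypothetical examples.

FREQUENCY_WEIGHT = {"daily": 4, "weekly": 3, "monthly": 2, "yearly": 1}

# (feature, how often the problem occurs, pain 1-10,
#  share of interviewees who have paid for a solution before)
features = [
    ("caption generation", "weekly", 8, 0.6),
    ("post scheduling", "weekly", 3, 0.9),
    ("deep analytics", "monthly", 6, 0.3),
]

def score(freq: str, urgency: int, paid_share: float) -> float:
    """Higher score = more frequent, more painful, more monetizable."""
    return FREQUENCY_WEIGHT[freq] * urgency * paid_share

ranked = sorted(features, key=lambda f: score(f[1], f[2], f[3]), reverse=True)
for name, freq, urgency, paid in ranked:
    print(f"{name}: {score(freq, urgency, paid):.1f}")
```

The exact weights matter less than the discipline: a feature that is frequent, painful, and already paid for beats one that merely sounds exciting.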
3. Identify Your Ideal Customer Profile (ICP)
Who cares most about this problem?
- Industry
- Company size
- Role
- Pain level
Focus on this ICP for launch.
Common Mistakes to Avoid
❌ Mistake 1: Pitching Too Early
You: “So I’m building this tool that…”
Them: (stops listening, thinks about how to be nice)
Instead: Save your pitch for after you validate the problem. Or don’t pitch at all; let your product speak for itself.
❌ Mistake 2: Asking Yes/No Questions
You: “Do you think this would be useful?”
Them: “Yeah, sure.” (They’re being nice)
Instead: Ask open-ended questions. “Tell me about the last time you needed this…”
❌ Mistake 3: Interviewing Friends & Family
Friends are biased toward being nice. Interview strangers.
Exception: Interview friends later (for a product review), not during validation.
❌ Mistake 4: Interviewing Only Enthusiasts
If you find someone on Reddit ranting about your target problem, they’re biased. Interview:
- Enthusiasts (10% of interviews)
- Normal users (80% of interviews)
- Non-users (10% of interviews)
❌ Mistake 5: Not Digging Deep Enough
Surface-level answers:
- “Do you have this problem?” → “Yeah, sometimes.”
Deep answers:
- “Tell me about the last time this happened.” → [5-minute story with specific details]
Always dig deeper with follow-ups.
❌ Mistake 6: Ignoring Contradictions
You hear: “I’d totally pay for this!” from 5 people.
But also: “I’ve never paid for a similar tool.”
The second answer is more honest. People are bad at predicting behavior.
Trust past behavior over stated intent.
Interpreting Your Findings: Red Flags vs Green Lights
🟢 Green Lights (Build This)
- ✅ Multiple people mention the same problem unprompted
- ✅ They’re currently paying money to solve it (with competitors)
- ✅ They’ve tried multiple solutions and given up
- ✅ It causes them emotional frustration (not just mild annoyance)
- ✅ They ask YOU when you’re building it
🔴 Red Flags (Reconsider)
- ❌ They say they’d use it, but have never paid for similar tools
- ❌ They mention the problem only when prompted
- ❌ They found a free workaround and seem satisfied
- ❌ When you ask “How often?”, the answer is “rarely” or “maybe once a year”
- ❌ They seem polite but not excited
- ❌ Only friends/family mention the problem; strangers don’t
Real-World Case Studies
Case Study 1: The Failed Analytics Tool
The idea: A better analytics dashboard for solopreneurs.
What they did wrong:
- Asked: “Do you want better analytics?” (everyone said yes)
- Didn’t dig into frequency: Users checked analytics monthly, not daily
- Didn’t ask about willingness to pay: Free Google Analytics was “good enough”
- Built for 6 months, launched to 3 active users
The lesson: High opinion agreement ≠ market opportunity.
Case Study 2: The Successful Invoice Tool
The idea: Invoicing software for freelancers.
What they did right:
- Asked: “How do you currently track invoices?” (dug into workflow)
- Found that freelancers spent 2-3 hours/month on invoicing
- Asked: “Have you paid for invoicing tools?” (found willingness to pay)
- Interviewed 15 freelancers; 12 mentioned the same pain
- Built an MVP in 6 weeks
- Launched with 40 customers on day 1
The lesson: Frequency + willingness to pay + emotional investment = market opportunity.
Resources & Tools
Books
- The Mom Test by Rob Fitzpatrick (the original, essential reading)
- Lean Analytics by Alistair Croll and Benjamin Yoskovitz
- The Customer Discovery Handbook by Courtney Paul
Websites & Communities
- Mom Test official site - Free PDF chapters
- Y Combinator Startup School - Free courses on customer discovery
- Indie Hackers forum - Interview requests are common
- Product Hunt - Community of early adopters eager to chat
Tools for Recording & Transcription
- Otter.ai - Automatic transcription, free tier available
- Rev - Manual transcription (more accurate but pricier)
- Zoom - Built-in recording and transcription
- Loom - Screen recording + voice
Tools for Scheduling Interviews
- Calendly - Simple scheduling links
- Typeform - Collect quick info before the call
- Slack - Build a community to recruit interviewees
Templates
- Interviewee Database - Notion/Airtable template to track interviews and synthesis
- Question List - Pre-designed question templates by industry
Conclusion
Great customer interviews are the foundation of successful indie products. Use The Mom Test to avoid false positives, uncover real pain, and build what people actually need. The right questions lead to the right product.
Action steps:
- Create a list of 10-15 people you can interview (target audience)
- Write down your 5-7 key questions (open-ended, past behavior focused)
- Schedule calls this week (even 3-5 interviews is better than zero)
- Take detailed notes and record (with permission)
- Dig deeper with follow-up questions
- After 10+ interviews, synthesize what you’ve learned
- Validate your findings before building
Remember: The best product idea is worthless if nobody wants it. Spend time interviewing before you spend time building.
Good luck. Now go talk to your customers.