
Case Study: UX Research Team Cuts Analysis Time by 60%

How a product team used FableSense AI to analyze 100+ user interviews in days instead of weeks

FableSense Team | 7 min read
[Image: UX research team analyzing data with FableSense AI dashboard]

Tags: case study, UX research, time savings, user interviews, efficiency

The Challenge: Too Much Data, Too Little Time

A mid-sized SaaS company faced a critical business question: Why was customer churn increasing?

Their UX research team had the data to answer it. The problem? They didn't have the time.

The Scope

| Scope Item | Volume |
| --- | --- |
| Recorded user interviews | 100+ |
| Survey responses with open-ended feedback | 2,000+ |
| Deadline before strategy offsite | 3 weeks |
| Researchers available | 2 |

"We were drowning in data," recalls Sarah, the Research Lead. "We had rich insights locked in those interviews, but no way to extract them in time."


The Pain Points

Before FableSense AI, the team's workflow looked like this:

1. Manual Transcription Delays

Even with transcription services, cleaning and formatting took days before analysis could begin.

2. Inconsistent Coding

With two researchers working in parallel, codes drifted. "User frustration" vs. "frustrated users" vs. "frustration with product" — all meant the same thing but fragmented the analysis.

3. No Way to Connect Data Types

Interview themes lived in one tool. Survey metrics lived in another. Connecting them required manual cross-referencing in spreadsheets.

4. The Numbers vs. Stories Divide

Stakeholders wanted hard data. Researchers had compelling quotes. Neither side felt fully satisfied.

"Our executives would ask 'what percentage?' and we'd have to say 'it came up a lot.' That wasn't good enough."

Sound familiar?


The Solution: FableSense AI

The team decided to try a different approach. Here's how their three weeks unfolded.

Week 1: Setup and Upload

Days 1–2: Data Import

  • Imported all 100+ transcripts into a single FableSense AI project
  • Uploaded survey responses with quantitative metrics (NPS, satisfaction scores, usage data)
  • AI immediately began processing and analyzing content

Days 3–5: Initial Theme Discovery

  • AI suggested 28 initial themes based on language patterns across all documents
  • Team reviewed suggestions, merged similar themes, and refined the codebook
  • Started systematic coding with AI-assisted suggestions

"The AI's initial themes were about 70% aligned with what we would have created manually. It gave us a huge head start."


Week 2: AI-Assisted Analysis

Theme Detection in Action

The AI continued surfacing patterns as the team coded:

  • Identified that "onboarding" appeared in 73% of churned user interviews
  • Flagged sentiment shifts — users who started positive but ended negative
  • Grouped related concepts (e.g., "confusing," "unclear," "didn't understand" → "Clarity Issues")

Sentiment Analysis

  • Ran sentiment analysis across all 2,000+ open-ended survey responses
  • Identified emotional hotspots and outliers
  • Prioritized the most negative feedback for deeper investigation
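FableSense AI's sentiment model is proprietary, but the basic idea of scoring open-ended responses and surfacing the most negative ones can be sketched with a tiny hand-rolled lexicon. The word lists and responses below are illustrative, not from the case study:

```python
# Minimal lexicon-based sentiment sketch. A production tool would use a
# trained model; this only illustrates scoring and triage of responses.

POSITIVE = {"great", "love", "helpful", "easy"}
NEGATIVE = {"confusing", "frustrated", "slow", "unclear"}

def sentiment_score(text):
    """Positive word hits minus negative word hits; below zero = negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

responses = [
    "Setup was confusing and slow",
    "Great product, easy to use",
    "Frustrated by the unclear onboarding steps",
]

# Surface the most negative response for deeper investigation.
most_negative = min(responses, key=sentiment_score)
```

Ranking every response this way is what lets a team read the worst feedback first instead of sampling at random.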

Correlation Engine

This was the breakthrough moment. FableSense AI's correlation engine connected interview themes to survey metrics:

  • Users who mentioned "confused" had NPS scores 12 points lower than average
  • The theme "too many steps" correlated with 40% lower satisfaction ratings
  • "Great support" mentions predicted 2.3x higher renewal likelihood
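The engine's internals aren't public, but the core comparison is straightforward: for each coded theme, compare a survey metric for users who mentioned the theme against those who didn't. A minimal sketch with made-up records:

```python
# Sketch of linking a qualitative theme to a quantitative metric:
# compare mean NPS of users coded with a theme vs. everyone else.
# The records and numbers below are hypothetical.

def theme_metric_gap(records, theme, metric="nps"):
    """Return (mean with theme, mean without theme, gap)."""
    with_theme = [r[metric] for r in records if theme in r["themes"]]
    without = [r[metric] for r in records if theme not in r["themes"]]
    mean = lambda xs: sum(xs) / len(xs)
    m_with, m_without = mean(with_theme), mean(without)
    return m_with, m_without, m_with - m_without

records = [
    {"themes": {"confused", "onboarding"}, "nps": 20},
    {"themes": {"confused"},               "nps": 25},
    {"themes": {"great support"},          "nps": 60},
    {"themes": set(),                      "nps": 50},
]

m_with, m_without, gap = theme_metric_gap(records, "confused")
```

Findings like "users who mentioned 'confused' had NPS scores 12 points lower" are this gap computed at scale, which is exactly what is tedious to do by hand across spreadsheets.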

"We would have NEVER found these connections manually. Not in three weeks. Maybe not ever."


Week 3: Synthesis and Reporting

Joint Displays

The team created visual joint displays showing qualitative themes alongside quantitative data:

  • Integration matrix mapping themes to user segments
  • Network diagram showing theme relationships and strength
  • Side-by-side comparisons of churned vs. retained user feedback

Shareable Dashboards

The team built executive-ready dashboards that stakeholders could explore:

  • Key metrics with supporting quotes
  • Drill-down capability for deeper investigation
  • Export options for the strategy presentation

Final Report

The team compiled comprehensive findings with evidence from both data types — numbers AND stories, working together.


The Results

| Metric | Before FableSense AI | After FableSense AI |
| --- | --- | --- |
| Total Analysis Time | 6+ weeks (estimated) | 2.5 weeks |
| Time Savings | n/a | ~60% |
| Themes Identified | ~15 (typical for manual) | 28 |
| Correlation Insights | 0 (no integration) | 12 key findings |
| Stakeholder Confidence | Low | High |

Key Discoveries

The team uncovered insights they would have missed with traditional methods:

Finding 1: Onboarding Was the Culprit

73% of churned users mentioned onboarding friction in their interviews. But it wasn't just "onboarding" generically — it was specifically:

  • Account verification complexity (mentioned 47 times)
  • Initial setup taking too long (mentioned 38 times)
  • Unclear next steps after signup (mentioned 29 times)

Finding 2: Sentiment Predicted Churn

Users who mentioned "frustrated" or "confused" had satisfaction scores 40% lower than average — and 3x higher churn risk.

Finding 3: Feature Requests Clustered

Open-ended feedback revealed three specific workflow gaps that appeared across both interviews AND surveys:

  • Bulk actions (mentioned 89 times)
  • Better search (mentioned 67 times)
  • Mobile access (mentioned 54 times)

These weren't random feature requests — they were patterns validated across data types.


What Made the Difference

1. Unified Data Environment

No more switching between NVivo, Excel, Tableau, and Google Docs. Everything lived in one place, making connections visible.

2. AI as Research Assistant

The AI handled repetitive tasks — initial coding, pattern detection, sentiment analysis — while researchers focused on interpretation and strategy.

3. Mixed-Methods Integration

Joint displays made it easy to show stakeholders HOW qualitative themes connected to business metrics. No more "trust me" — the evidence was visual and compelling.

"FableSense AI didn't just save time — it helped us see connections we never would have found manually." — Sarah, Research Lead


The Business Impact

Three weeks after the strategy offsite:

  • Onboarding redesign project kicked off, prioritizing the top friction points
  • Churn prediction model updated to include sentiment indicators
  • Product roadmap adjusted to address the three clustered feature requests

Six months later:

  • Churn reduced by 18% (directly attributed to onboarding improvements)
  • Time to value for new users decreased by 34%
  • NPS increased by 11 points

Your Team Can Do This Too

Whether you're analyzing user interviews, academic research data, or market research, FableSense AI helps you:

  • Cut analysis time without cutting corners
  • Find hidden patterns across data types
  • Build compelling evidence for stakeholders
  • Move faster from data to decisions

Ready to Transform Your Research Workflow?

Create your free account — No credit card required. Upload your data and see AI-powered insights in minutes.

Planning for a team?


Learn the Fundamentals

New to mixed-methods research? Start with the basics:

Read: Getting Started with Mixed-Methods Research — Master the fundamentals of combining qualitative and quantitative data.

Explore: 5 Ways AI Can Accelerate Your Research — Deep dive into automated theme detection, sentiment analysis, and pattern recognition.


FableSense AI is the only platform built for mixed-methods research. Combine qualitative coding, quantitative visualization, and AI-powered insights in one unified workspace. Start free today.

