Role: Design Manager (UX | UI) · Duration: 8 weeks · Platform: Lifesight.io · Team: 2 members
Marketing teams using MMM (Marketing Mix Modeling) tools couldn't quickly identify which channel combinations were helping or hurting each other. The existing workflow required analysts to export model outputs to spreadsheets, manually scan 50+ channel pairs, and interpret statistical coefficients — a process that took ~40 minutes and often led to misinterpretation.
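To put the manual scan in perspective, here is a minimal sketch of the pair-count arithmetic. The channel names are illustrative assumptions, not the platform's actual mix; the point is that with around 11 channels there are already 55 pairs to locate in a spreadsheet and interpret by eye.

```python
from itertools import combinations

# Illustrative channel list, assumed for this example; the real mix differed.
channels = [
    "Meta", "Google Search", "YouTube", "TikTok", "Display",
    "Email", "Affiliate", "TV", "Radio", "OOH", "Podcast",
]

# Every unordered pair of channels is a potential interaction the analyst had to check.
pairs = list(combinations(channels, 2))
print(len(pairs))  # 55 pairs for 11 channels, each read off a spreadsheet by hand
```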
Core issues identified:

- The existing MMM dashboard showed individual channel metrics across multiple tabs (Input, Model, Contribution) but buried cross-channel interactions.
- Users had no quick way to identify synergies or cannibalization between channels.
| Role | Key Pain Point |
|---|---|
| Marketing Director | "I wait 2 days for analysts to tell me which channels work together." |
| Performance Marketer | "I don't know if -0.034 is bad or really bad. Just tell me what to do." |
| Data Analyst | "I spend 60% of my time formatting outputs, not analyzing them." |
| CMO | "We've made budget mistakes because interaction effects weren't visible." |
I audited 5 MMM platforms in the same space:
| Platform | Strengths | Gaps |
|---|---|---|
| Sellforte | Clean ROI visualizations | No interaction-level detail |
| Rockerbox | Good MTA + MMM hybrid | Interactions require custom SQL |
| Triple Whale | Fast setup for DTC brands | Limited to digital channels |
| Measured | Strong incrementality testing | Complex UI for non-analysts |
| Recast | Rigorous model validation | No actionable recommendations |
Key insight: None of the competitors surfaced channel interactions with plain-language recommendations. This was an opportunity.
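To make that opportunity concrete, the sketch below shows the kind of translation layer we were aiming for: a raw interaction coefficient mapped to a plain-language recommendation. The function name and thresholds are illustrative assumptions, not the shipped logic.

```python
def describe_interaction(channel_a: str, channel_b: str, coefficient: float) -> str:
    """Translate a pairwise interaction coefficient into a plain-language
    recommendation. The 0.02 thresholds are illustrative, not production values."""
    if coefficient >= 0.02:
        return f"{channel_a} and {channel_b} reinforce each other. Consider scaling them together."
    if coefficient <= -0.02:
        return f"{channel_a} appears to cannibalize {channel_b}. Review overlapping audiences or budgets."
    return f"No meaningful interaction between {channel_a} and {channel_b}. Budget them independently."

# The "-0.034" from the performance marketer's quote becomes an instruction, not a statistic.
print(describe_interaction("Meta", "Google Search", -0.034))
```

A rules-based translation like this is what the competitors lacked: the model still produces coefficients, but the interface speaks in actions.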