Building My Own Design Intelligence Tool
Accrue (Internal Tool) • 2025–2026
Role
Solo designer + builder
Timeline
2 weeks to MVP, ongoing
Tools
Next.js, Mixpanel, Clarity, Claude
Impact
Informed complete onboarding redesign
THE CLIENT
Accrue (Internal Tool)
Accrue is a white-label fintech platform that powers wallet, loyalty, and payment experiences for enterprise retail brands. I was the lead product designer responsible for the SNIPES wallet experience.
I was redesigning the SNIPES wallet onboarding flow, but every design decision was based on assumptions. We had 67,000 wallets created but only 108 fully funded — a 99.8% drop-off — and nobody could tell me exactly where or why users were leaving.
THE CHALLENGE
The design team had no direct access to analytics. Getting data meant filing a request, waiting days, and receiving spreadsheets that didn't answer the questions I was actually asking. I needed to see the funnel through a designer's lens, not a PM's.
PROJECT GOALS
1. Surface the exact drop-off points in the onboarding funnel
2. Connect behavioral data (Mixpanel) with session recordings (Clarity)
3. Make data accessible to the design team without engineering support
4. Use real evidence to prioritize what to redesign next
THE PROCESS
Phase 1: The Problem
Week 1: Identified the data gap. Design reviews were driven by opinion, not evidence. Stakeholders asked 'what does the data say?' and nobody had answers.
Phase 2: Build the Tool
Week 2: Connected Mixpanel and Clarity APIs via MCP servers. Built a dashboard with funnel visualization, drop-off analysis, and direct links to session recordings for each friction point.
Phase 3: The Insight
Ongoing: The dashboard revealed that card entry had a 99.99% success rate — the problem wasn't the form. 84% of users who completed their profile never even attempted to add a card. Motivation was the bottleneck, not mechanics.
THE PROBLEM
Designing with no signal.
Every design review, stakeholders asked what the data showed. Every time, the answer was 'we don't have it yet.' I was making high-stakes decisions about a flow used by 67,000 people based on gut instinct and stakeholder opinions.
Funnel visualization showing 67K → 108 drop-off
67,000 wallets. 108 funded.
The most important metric was hiding in plain sight, but nobody on the design team could see it. We knew activation was low — we didn't know it was 0.16%.
Timeline showing data request lag vs. design sprint pace
Data requests took days
Getting a simple funnel breakdown meant filing a ticket, waiting for an analyst, and receiving a CSV that didn't answer the question I was actually asking. By the time the data arrived, the design review had passed.
Disconnected data sources diagram
Clarity recordings existed but were disconnected
Microsoft Clarity was tracking sessions, but there was no way to connect a specific funnel drop-off to a specific recording. I could see that users were leaving, but not what they were doing before they left.
Card entry success was 99.99% once users tried it. The real problem? 84% never tried. The entire team was optimizing the wrong thing.
THE APPROACH
I built it myself.
Instead of waiting for engineering to prioritize an analytics dashboard, I used Claude and MCP server connections to build exactly the tool I needed — a designer-first view of the entire onboarding funnel with live data from Mixpanel and session recordings from Clarity.
Mixpanel funnel visualization with conversion rates
Connected Mixpanel via MCP
Pulled live event data — wallet creation, profile completion, card linking, fund actions — and structured it into a visual funnel. Each step shows the absolute count, conversion rate, and week-over-week trend.
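The funnel math behind those rows is straightforward. A minimal sketch, assuming step counts have already been pulled via the Mixpanel MCP server; the step names and the helper `buildFunnel` are illustrative, not the production event taxonomy:

```typescript
// Turn raw per-step counts into the funnel rows the dashboard renders.
type FunnelStep = { name: string; count: number };
type FunnelRow = FunnelStep & {
  stepConversion: number;    // conversion vs. the previous step
  overallConversion: number; // conversion vs. the first step
};

function buildFunnel(steps: FunnelStep[]): FunnelRow[] {
  return steps.map((step, i) => ({
    ...step,
    stepConversion: i === 0 ? 1 : step.count / steps[i - 1].count,
    overallConversion: step.count / steps[0].count,
  }));
}

// Example using the rounded numbers from this case study:
const funnel = buildFunnel([
  { name: "Wallet created", count: 67000 },
  { name: "Card entry attempted", count: 10840 },
  { name: "Card linked", count: 10839 },
  { name: "Wallet funded", count: 108 },
]);
// funnel[3].overallConversion ≈ 0.0016, i.e. the 0.16% activation rate
```

Computing both per-step and overall conversion is what surfaces the gap: the card-linking step converts at near 100% while the end-to-end rate sits at 0.16%.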
Dashboard showing Clarity recordings linked to funnel steps
Linked Clarity session recordings to each drop-off
For each friction point in the funnel, the dashboard surfaces the most relevant Clarity recordings. Instead of watching random sessions, I could watch exactly the users who dropped off at the card-linking step.
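The linking logic can be sketched as a simple grouping step. This assumes each Clarity session has already been tagged with the last funnel step the user completed; the field names and the `recordingsForDropOff` helper are assumptions for illustration:

```typescript
// A user who dropped off at `step` completed that step but never fired
// the next step's event, so their lastCompletedStep is exactly `step`.
type Session = { recordingUrl: string; lastCompletedStep: string };

function recordingsForDropOff(
  sessions: Session[],
  step: string,
  limit = 5
): string[] {
  return sessions
    .filter((s) => s.lastCompletedStep === step)
    .slice(0, limit)
    .map((s) => s.recordingUrl);
}
```

The payoff is selection, not volume: instead of sampling random sessions, the dashboard queues up only the recordings of users who stalled at the step under review.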
Three dashboard views side by side
Three views for three audiences
Designer mode shows friction hotspots and UX issues. PM mode shows conversion metrics and feature adoption. Executive mode shows business KPIs and revenue impact. Same data, different framing.
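Under the hood this can be as simple as a mode-to-metrics mapping; the metric names below are illustrative stand-ins, not the dashboard's real identifiers:

```typescript
// One dataset, three framings: each mode selects which metrics to render.
const viewConfig = {
  designer: ["frictionHotspots", "rageClicks", "dropOffRecordings"],
  pm: ["stepConversion", "featureAdoption", "weekOverWeek"],
  executive: ["activationRate", "fundedWallets", "revenueImpact"],
} as const;

type ViewMode = keyof typeof viewConfig; // "designer" | "pm" | "executive"
```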
Cron pipeline diagram: data refresh → deploy → brief
Automated weekly refresh
Set up a Vercel cron job that refreshes all data every Monday at 8 AM. The dashboard is always current without anyone manually pulling numbers. Designed a weekly brief format that highlights the top 5 changes since last week.
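In Vercel, that schedule lives in `vercel.json` as a cron entry pointing at an API route; the route path here is an assumption, and note that Vercel cron schedules run in UTC, so "8 AM" may need offsetting:

```json
{
  "crons": [
    {
      "path": "/api/refresh-dashboard",
      "schedule": "0 8 * * 1"
    }
  ]
}
```

`0 8 * * 1` fires at 08:00 every Monday; the route handler would re-query Mixpanel and Clarity and persist the fresh snapshot for the dashboard.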
THE DISCOVERY
We were fixing the wrong thing.
The dashboard's first output changed the entire redesign strategy. The team had been focused on making the card entry form easier. The data showed that the form wasn't the problem — motivation to reach it was.
Metric card showing 99.99% success rate
Card entry success: 99.99%
10,840 users attempted to add a card. 10,839 succeeded. The form was working perfectly. All the energy spent on OCR scanning and form optimization was solving a problem that didn't exist.
Funnel showing the 84% gap between profile and card entry
84% never even tried
43,210 users completed their profile and then vanished. They never reached the card entry screen. The drop-off wasn't at the form — it was between profile completion and the motivation to add a payment method.
Clarity recording screenshot showing barcode interaction
The barcode screen was the pivot point
Clarity recordings showed that users who saw their barcode first were more likely to add a card afterward. The barcode created a tangible 'I have something' moment. This insight led to the barcode-first redesign in Phase 2.
THE OUTCOME
Data changed the roadmap.
The dashboard findings were presented at the March 6 design review. The team shifted from optimizing card entry to redesigning the motivation layer — a fundamentally different design direction.
Before/after: old flow vs barcode-first flow
Barcode-first onboarding
Instead of pushing card entry immediately after profile setup, the new flow shows users their barcode first — giving them something tangible before asking for payment. Directly informed by the Clarity recordings surfaced in the dashboard.
Barcode screen with card-linking nudge
"Spend more than $10?" nudge
Added a contextual card-linking prompt on the barcode screen. Users who see their $10 balance and want to spend more have a natural reason to add a card — not because the flow forced them, but because they want to.
Weekly brief format showing top 5 issues
Weekly design briefs
The dashboard now generates a weekly summary of the top UX friction points, complete with Clarity recording links. Monday morning, the design team knows exactly what changed and what to prioritize.
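Ranking the "top 5" comes down to comparing this week's step conversions against last week's. A minimal sketch, assuming weekly funnel snapshots are already stored; the shape and the `topChanges` helper are assumptions:

```typescript
// Rank funnel steps by the magnitude of their week-over-week
// conversion change and keep the top n for the Monday brief.
type WeeklyStep = { name: string; conversion: number };

function topChanges(
  lastWeek: WeeklyStep[],
  thisWeek: WeeklyStep[],
  n = 5
): { name: string; delta: number }[] {
  const prev = new Map(
    lastWeek.map((s) => [s.name, s.conversion] as [string, number])
  );
  return thisWeek
    .map((s) => ({ name: s.name, delta: s.conversion - (prev.get(s.name) ?? 0) }))
    .sort((a, b) => Math.abs(b.delta) - Math.abs(a.delta))
    .slice(0, n);
}
```

Sorting by absolute delta keeps regressions and improvements in the same list, so a step that suddenly got worse surfaces just as loudly as one that got better.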
THE IMPACT
67K → 108
Funnel gap discovered
67,000 wallets created, 108 fully funded. The dashboard made this visible for the first time.
84%
Users who never attempted card entry
The real drop-off wasn't at the card form — it was before users even got there.
99.99%
Card entry success rate
Once users tried adding a card, it almost always worked. The team had been optimizing the wrong step.
3 views
Designer / PM / Executive modes
Each stakeholder sees the data framed for their decisions — friction points, conversion metrics, or business KPIs.
The dashboard directly informed the Phase 2 redesign of SNIPES onboarding. Instead of fixing the card entry form, we redesigned the barcode screen to motivate card linking — a fundamentally different design direction that would have been invisible without the data.
LOOKING BACK
Designers should own their data
Waiting for someone else to pull numbers means designing on a delay. Building this tool gave me real-time answers to design questions as they came up — not days later.
AI is a design material
I used Claude to build the entire dashboard — MCP server connections, data pipeline, cron jobs, deploy hooks. The AI wasn't the product; it was the tool that let me build the product I needed.
Data without framing is noise
The raw Mixpanel numbers existed before I built this tool. What made it useful was framing the data around design decisions — not 'what's the conversion rate' but 'where should I redesign next?'
WHAT I'D DO DIFFERENTLY
I would have built this before starting the onboarding redesign, not after. Phase 1 was designed blind — every assumption turned out to be wrong.
LOOKING AHEAD
Adding weekly automated briefs that surface the top 5 UX friction points, connected to Clarity recordings. The goal is a Monday morning email that tells the design team exactly what to fix this week.