Email Performance Analysis
A year of email data across three audiences, and nobody could say what was actually working. I pulled the numbers, benchmarked them against industry averages, and built a dashboard that made the priorities obvious.
Challenge: A year of email data across three audiences and no way to tell what was working.
Solution: An interactive dashboard with benchmarks, segment comparison, and prioritized actions.
The organization was emailing three distinct audiences: sales-qualified leads (SQLs), existing customers, and marketing-qualified leads (MQLs). Each group behaved differently, but nobody had looked at them side by side or compared them against outside benchmarks. Most decisions were still being made on instinct.
Open rates were rising, but read rates were falling, and glance rates were up across every segment: people were opening emails without actually reading them. (Read and glance rates come from open-duration tracking; a read is an open that stays on screen long enough to be read, a glance only a moment.) Without clear analysis, the team kept defaulting to longer email copy that readers were not finishing.
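To make that pattern concrete, here is a minimal sketch of how opens can be bucketed by on-screen duration. The thresholds follow a common convention (glanced under 2 seconds, skimmed 2 to 8 seconds, read 8 seconds or more) and the data shape is illustrative; the project's email tool may define these cutoffs differently.

```js
// A sketch of bucketing opens by on-screen duration (assumed convention:
// glanced < 2s, skimmed 2-8s, read >= 8s; the project's tool may differ).
function engagementRates(opens) {
  const counts = { glanced: 0, skimmed: 0, read: 0 };
  for (const { seconds } of opens) {
    if (seconds < 2) counts.glanced += 1;
    else if (seconds < 8) counts.skimmed += 1;
    else counts.read += 1;
  }
  const total = opens.length || 1; // guard against an empty segment
  return {
    glanceRate: counts.glanced / total,
    skimRate: counts.skimmed / total,
    readRate: counts.read / total,
  };
}

// Example: opens are happening, but half of them are glances.
console.log(engagementRates([
  { seconds: 1 }, { seconds: 1.5 }, { seconds: 3 }, { seconds: 12 },
]));
// -> { glanceRate: 0.5, skimRate: 0.25, readRate: 0.25 }
```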
They also had no real frame of reference. A number might look fine in isolation, but was it actually competitive? Was the click rate strong, average, or weak for a nonprofit of that size? There was no clear answer.
The first step was pulling a full year of data, broken out by segment and compared year over year: opens, clicks, click-to-open rate (CTOR), bounces, unsubscribes, read rates, and glance rates. A number means nothing without knowing where it should be, so each metric was scored against national nonprofit averages from Neon One, MailerLite, M+R, and Nonprofit Tech for Good.
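As a sketch of that scoring step, in the same JavaScript the dashboard runs on: the rate formulas below are the standard ones (CTOR is clicks divided by opens), but the benchmark values, field names, and function names are placeholders, not the published figures from those sources or the project's actual code.

```js
// Placeholder benchmark averages; NOT the published figures from Neon One,
// MailerLite, M+R, or Nonprofit Tech for Good.
const BENCHMARKS = {
  openRate:   { avg: 0.28,  higherIsBetter: true },
  clickRate:  { avg: 0.026, higherIsBetter: true },
  ctor:       { avg: 0.09,  higherIsBetter: true },
  readRate:   { avg: 0.35,  higherIsBetter: true },
  bounceRate: { avg: 0.01,  higherIsBetter: false },
  unsubRate:  { avg: 0.002, higherIsBetter: false },
};

// Standard rate formulas from raw per-segment totals (field names assumed).
function computeMetrics(totals) {
  return {
    openRate: totals.opens / totals.delivered,
    clickRate: totals.clicks / totals.delivered,
    ctor: totals.clicks / totals.opens, // click-to-open rate
    bounceRate: totals.bounces / totals.sent,
    unsubRate: totals.unsubs / totals.delivered,
  };
}

// Score each metric against its benchmark; a positive gap means ahead,
// regardless of whether the metric is one to maximize or minimize.
function scoreAgainstBenchmarks(metrics) {
  return Object.fromEntries(
    Object.entries(metrics).map(([name, value]) => {
      const { avg, higherIsBetter } = BENCHMARKS[name];
      const gap = higherIsBetter ? value - avg : avg - value;
      return [name, { value, benchmark: avg, gap }];
    })
  );
}
```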
The analysis surfaced six clear priorities: list hygiene, shorter copy, subject line fixes, CTA optimization, cadence review, and content pattern analysis. Each one tied directly to a benchmark gap.
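A minimal version of that gap-to-priority ranking might look like the following, building on the scoreAgainstBenchmarks output sketched above. The mapping from metric to action is an illustrative guess at how those ties line up, not the analysis itself.

```js
// Illustrative mapping from each benchmark metric to the action that
// addresses it; an assumption about how the six priorities line up.
const ACTIONS = {
  bounceRate: 'List hygiene',
  readRate:   'Shorter copy',
  openRate:   'Subject line fixes',
  ctor:       'CTA optimization',
  unsubRate:  'Cadence review',
  clickRate:  'Content pattern analysis',
};

// Rank actions by relative shortfall: the further a metric trails its
// benchmark (as a share of that benchmark), the higher it sorts.
function prioritize(scored) {
  return Object.entries(scored)
    .filter(([name]) => name in ACTIONS)
    .map(([name, { gap, benchmark }]) => ({
      action: ACTIONS[name],
      metric: name,
      relativeGap: gap / benchmark, // negative = behind benchmark
    }))
    .sort((a, b) => a.relativeGap - b.relativeGap); // worst gaps first
}
```

Ranking by relative rather than absolute gap keeps small-denominator metrics like unsubscribe rate comparable to open rate.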
The dashboard itself was a single lightweight HTML file built on Chart.js, with segment filters, live KPI cards, and benchmark bands: something the team could open in a browser and use without training.
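A trimmed sketch of that single-file approach is below: one canvas, Chart.js pulled from a CDN, and a benchmark drawn as a dashed line over the segment bars. The segment names, numbers, and benchmark value are illustrative, not the project's data.

```html
<!-- Trimmed sketch of the single-file dashboard: one KPI chart with a
     benchmark overlay. All values here are illustrative placeholders. -->
<canvas id="openRate"></canvas>
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script>
  const chart = new Chart(document.getElementById('openRate'), {
    type: 'bar',
    data: {
      labels: ['SQLs', 'Customers', 'MQLs'],
      datasets: [
        { label: 'Open rate (%)', data: [31, 26, 22] },
        {
          type: 'line',            // overlay: flat dashed benchmark line
          label: 'Nonprofit benchmark (placeholder)',
          data: [28, 28, 28],
          pointRadius: 0,
          borderDash: [6, 4],
        },
      ],
    },
    options: {
      scales: { y: { beginAtZero: true, title: { display: true, text: '%' } } },
    },
  });
</script>
```

In this setup a segment filter can be a plain select element that swaps the data arrays and calls chart.update(), which is what keeps the whole thing a single file with no build step.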