March 7, 2017 | Article | by Eric Gilbertsen | Advertising, Analytics
Amazing Segmentation Insights from an Advocacy Campaign
If you're hoping for a silver bullet that super-charges your next advocacy campaign, this is not the blog post for you. Rather, this is a case of common-sense assumptions proven true thanks to segmentation, analytics, and a little technology. We had a lot of fun thinking it through, and we hope it sparks a few ideas for you in managing your advocate database or your next campaign.
The goal was to generate petition signatures urging the chairmen of Tennessee's Government Operations Committee to oppose a proposed regulation that would impose tax collection and remittance burdens on out-of-state retailers. For small internet retailers within the state, the rule sets a dangerous precedent: if other states follow suit, similar burdens could hamper their ability to sell online and reach new markets outside of Tennessee.
We used email to reach advocates in our database who live in Tennessee. We know that our members are very likely to be small business owners who sell online, export to other countries, and employ people in their local community. However, their businesses are far too small to afford armies of accountants and lawyers, like big retailers can.
We segmented the list two ways:
- Tier 1, 2 and 3 members based on past engagement history. Tier 1 members have opened at least one email within the past year, Tier 3 members have never opened anything we've sent, and Tier 2 falls in between.
- Note: to improve deliverability, it is always best to send to reliable openers first, as it signals to mailbox providers that your mailing is legitimate and wanted.
- Validated residents using TargetSmart vs. those non-validated residents who provided a Tennessee address that couldn't be matched.
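The tiering logic above is simple enough to sketch in a few lines. The field names, cutoff date, and sample records below are illustrative assumptions, not our actual system:

```python
from datetime import datetime, timedelta

# Illustrative member records; field names are hypothetical.
members = [
    {"email": "a@example.com", "last_open": datetime(2017, 1, 15), "validated": True},
    {"email": "b@example.com", "last_open": datetime(2015, 6, 1), "validated": False},
    {"email": "c@example.com", "last_open": None, "validated": True},
]

def tier(member, now=datetime(2017, 3, 1)):
    """Tier 1: opened within the past year; Tier 3: never opened; Tier 2: the rest."""
    if member["last_open"] is None:
        return 3
    if now - member["last_open"] <= timedelta(days=365):
        return 1
    return 2

for m in members:
    m["tier"] = tier(m)

# Per the deliverability note above: mail reliable openers first.
send_order = sorted(members, key=lambda m: m["tier"])
```

Sorting the send order by tier is what lets the warm, reliable openers go out ahead of the dormant segments.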
Our first mailing to Tier 1 members included an A/B test of a personalized link that pre-populated the petition form fields vs. a generic link that required the user to manually complete the form.
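A personalized link of this kind typically works by encoding the advocate's details as URL query parameters, which the petition page reads to pre-fill the form on load. A minimal sketch, with a hypothetical domain and parameter names:

```python
from urllib.parse import urlencode

def personalized_link(base_url, advocate):
    # Parameter names here are hypothetical; they must match
    # whatever fields your petition page actually reads on load.
    params = {
        "first_name": advocate["first_name"],
        "last_name": advocate["last_name"],
        "email": advocate["email"],
        "zip": advocate["zip"],
    }
    return f"{base_url}?{urlencode(params)}"

link = personalized_link(
    "https://example.org/petition",
    {"first_name": "Jane", "last_name": "Doe",
     "email": "jane@example.com", "zip": "37201"},
)
# e.g. https://example.org/petition?first_name=Jane&last_name=Doe&...
```

Using `urlencode` (rather than string concatenation) keeps characters like `@` safely escaped in the query string.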
The Shocking Results
- Tier 1 members were far more responsive, with an open rate of 43% and a click rate of 23%.
- Tier 2 resulted in open and click rates of 14% and 7.5%, while Tier 3 garnered 7% and 4%. Though the numbers were low, we were excited to re-engage these dormant members after a year or more of radio silence!
- The personalized link performed better, with a completion rate of 87% compared to 80% for the generic link, confirming our hunch that it pays to make it easier for advocates to take action.
- Validated residents of Tennessee were more likely to open, click and convert. It stands to reason that our validated members were more likely to actually live in Tennessee.
- The breakdown of traffic by device category was 48% desktop, 42% mobile and 10% tablet. Larger screens performed best, with conversion rates of 90% for desktop, 83% for tablets and 75% for mobile phones. Since mobile advocates are more likely to be on the go, and had to scroll past the introduction and petition copy to reach the form, this result didn't surprise us.
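Before crediting a lift like the 87% vs. 80% completion rates above, it is worth checking that the gap is statistically meaningful. A quick two-proportion z-test does the job; the arm sizes of 300 below are hypothetical stand-ins, since we are not reporting exact counts here:

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical arms of 300 each at the observed 87% and 80% rates.
z, p = two_proportion_z(261, 300, 240, 300)
```

At those assumed sample sizes the p-value lands around 0.02, but with much smaller arms the same percentages could easily be noise, which is exactly why the test is worth running.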
By all benchmarks we have for this advocacy community, this campaign was a smashing success. But that doesn't mean it couldn't be better. Below are a few ideas we're considering for the next one:
- For those advocates who open, click, and don't convert, wouldn't it be nice to know why? Do they disagree with our position or do they not trust us with their data? This narrow segment of people is ripe for a follow-up survey.
- For those advocates who open but don't click, why? Could we offer a "No thanks" link in addition to our "Sign the Petition" button to gain insight?
- For a time-sensitive campaign like this, growing our mobile opt-in list and reaching people via SMS could unlock hundreds of incremental signatures. The open rate on a text message is 99% while even our strongest segment opened at only a 43% clip.
Sometimes logic wins in Washington, but isn't it nice to prove your assumptions with data?