I read an interesting blog post this morning that advances an argument I thought, like the anti-vax movement, had been debunked by actual data a long time ago. And like that movement, the argument keeps coming up over and over again.
The “argument” is about recipient engagement, and about whether it actually has an impact on inbox placement. On the one side, ISPs and very high volume senders seem to be in accord: engagement informs automated deliverability processes that make decisions about inbox placement. On the other are those who think (or wish) it doesn’t.
The author’s chief argument? He’s never seen the impact. I believe him, but that could be because he’s not sending mail that compels a measurable amount of engagement. In any case, the logic is fallacious; I’ve never seen water evaporate into the atmosphere and magically turn into clouds, but I am convinced that it happens all the time.
The author’s second argument is that a large free inbox provider recently mentioned that they don’t count clicks (which is not actually new news, but it’s apparently news to enough people to make it seem like news). This, according to the author, shakes the very foundation upon which are laid the arguments of the “engagement matters” crowd.
ESPs and recipient domains must necessarily measure engagement in different ways. Free inbox providers control the entire interface, and collect data about recipient interactions to which ESPs have no access. ESPs rely on opens and clicks, because that’s pretty much all they can measure.
Take one of those two away, and it gets very hard for the sender to measure interactions in a meaningful way. When an ISP says they don’t count clicks, it’s not because engagement doesn’t matter. It’s because they have a ton of other ways to measure it that senders don’t.
In what other ways do ISPs measure interaction and engagement? Well, at the very same conference that is the backdrop for the author’s piece, the ISPs also detailed seven metrics they measure, of which zero are clicks, and of which only one is also measurable by the sender. None of the others were mentioned by the author in his piece:
- Open (GOOD)
- Reply (GOOD)
- Move to junk (BAD)
- Not junk (GOOD)
- Delete without open (BAD)
- Move to folder (GOOD)
- Add to address book (GOOD)
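To make the idea concrete, here is a toy sketch of how signals like these could roll up into a per-sender engagement score. This is purely illustrative and hypothetical: the signal names, weights, and scoring logic are my own assumptions, not anything an ISP has published.

```python
# Hypothetical, illustrative only: ISPs do not publish their actual
# signals or weights. This simply nets positive interactions against
# negative ones, using the seven metrics listed above.

GOOD = {"open", "reply", "not_junk", "move_to_folder", "add_to_address_book"}
BAD = {"move_to_junk", "delete_without_open"}

def engagement_score(events):
    """Return positive interactions minus negative ones for one sender."""
    score = 0
    for event in events:
        if event in GOOD:
            score += 1
        elif event in BAD:
            score -= 1
    return score

# Two opens and a reply outweigh one move-to-junk:
print(engagement_score(["open", "open", "reply", "move_to_junk"]))  # 2
```

Notice that clicks don’t appear anywhere in this picture; an ISP could weight and combine signals like these however it likes without ever looking at a click.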
If you want even more detail, you can also have a gander at a research paper Google published in 2011 that reveals the exact algorithm employed by Gmail to measure recipient interaction and engagement, and to award preferred positioning within the Gmail inbox to mail that matters most to the individual recipient.
It’s easy to understand how senders can get hung up on engagement. Large recipient domains have far more data at their disposal to inform automated deliverability decisions than senders have to measure engagement. But it is wrong to assume that the ISPs don’t measure interaction and engagement just because they measure it in fundamentally different ways.
My friend and colleague Evan Burke recently proposed a tongue-in-cheek alternate title for this piece. As usual, he is very much on the nose:
“My Deliverability Expert Told Me I Should Segment By Activity, But Then Someone Much Louder Than My Deliverability Expert Told Me I Shouldn’t”