Maybe you’re planning email marketing goals for the year, or maybe you just launched your first newsletter. You know Open Rate is important: it tells you how many people are opening your emails. So is Click-to-Open Rate, which shows how many of those readers are engaging with your content. These metrics help you measure the quantitative success of your email marketing efforts.
But what if they’re not accurate?
You may or may not remember Apple Mail Privacy Protection (MPP), and your newsletter may not even have existed when it was announced, but it rolled out around September 2021 alongside the iOS 15 update.
Currently, ~64% of email opens come from Apple, and of those, ~59.53% have Apple MPP enabled, according to Litmus. This may not match your subscriber base exactly, but take comfort in knowing the figure is calculated from over 1.6 billion opens in Litmus Email Analytics in December 2022, well after Apple MPP’s launch, meaning it is a fairly accurate depiction of the general market.
In a nutshell, there’s a high probability that roughly two-thirds of your email opens are coming from Apple users, inflating your metrics. Meaning…
Your Open Rate is most likely inflated.
Most ESPs embed a tracking pixel into email sends, so when you send your newsletter or an email marketing campaign, a tiny invisible image automatically goes along with it. This pixel sends information back to the ESP and populates your dashboards with metrics such as Open Rate.
Traditionally, data from an email was loaded only when the recipient opened it and its images were downloaded. That download includes the tracking pixel, which allows the ESP to detect that the email was opened, what device was used, and where the subscriber was located when they opened it.
If a subscriber has Apple MPP enabled, Apple preloads the images and content of their emails, including the pixel, regardless of whether they actually opened the email. This means a subscriber could show a 100% Open Rate simply because Apple preloaded every email and triggered the tracking pixel each time.
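To make the mechanics concrete, here’s a minimal sketch of how pixel loads become reported “opens,” and how MPP preloads inflate the number. All of the data below is invented for illustration; your ESP’s logs will look different.

```python
# Hypothetical sketch: every pixel load counts as an "open", and Apple MPP
# preloads fire the pixel even if the subscriber never read the email.
# All records below are made up for illustration.
pixel_loads = [
    {"subscriber": "a@example.com", "source": "reader"},  # genuine open
    {"subscriber": "b@example.com", "source": "mpp"},     # Apple preload
    {"subscriber": "c@example.com", "source": "mpp"},     # Apple preload
    {"subscriber": "d@example.com", "source": "reader"},  # genuine open
]
delivered = 10  # emails delivered in this (tiny) campaign

# The ESP can't tell the difference: every unique pixel load is an "open".
reported_opens = len({p["subscriber"] for p in pixel_loads})
reported_open_rate = reported_opens / delivered * 100

# What actually happened (unknowable to the ESP once MPP is in play):
real_opens = len({p["subscriber"] for p in pixel_loads if p["source"] == "reader"})
real_open_rate = real_opens / delivered * 100

print(f"Reported Open Rate: {reported_open_rate:.0f}%")  # 40%
print(f"Actual Open Rate:   {real_open_rate:.0f}%")      # 20%
```

The gap between the two numbers grows with the share of MPP-enabled subscribers on your list.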
The more subscribers with Apple MPP enabled, the more inflated your open rates are likely to be. And given Litmus’ finding that roughly two-thirds of email opens in December 2022 came from Apple, this is probably affecting your metrics, along with any workflows, subject line tests, or emails triggered by Opens or Open Rate.
We encourage you to take a look at your Email Service Provider’s dashboard. Do they mention in their documentation how Open Rate is calculated, or a way for you to sort by device?
Wait, what is Click-to-Open Rate? Isn’t that Click Rate? Not quite:
Click Rate = # of Clicks / # of Delivered Emails
Click-to-Open Rate = # of Clicks / # of Unique Opens
Side note: multiply by 100 for a percentage.
For # of Clicks, it’s important to decide whether it will be calculated as Total Clicks or Unique Clicks.
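As a quick reference, the standard rate formulas can be sketched in code. The campaign numbers below are invented, purely to show how inflated opens distort one metric but not the other:

```python
# Sketch of the standard email rate formulas (numbers are illustrative).

def click_rate(unique_clicks: int, delivered: int) -> float:
    """Click Rate = clicks / delivered emails, x100 for a percentage."""
    return unique_clicks / delivered * 100

def click_to_open_rate(unique_clicks: int, unique_opens: int) -> float:
    """Click-to-Open Rate = clicks / unique opens, x100 for a percentage."""
    return unique_clicks / unique_opens * 100

# Same campaign, measured two ways:
delivered, unique_clicks = 1000, 50
print(click_rate(unique_clicks, delivered))    # 5.0 -- unaffected by opens
print(click_to_open_rate(unique_clicks, 200))  # 25.0 with 200 real opens
print(click_to_open_rate(unique_clicks, 500))  # 10.0 once MPP inflates opens to 500
```

Notice that Click Rate never changes, while Click-to-Open Rate collapses as the open count inflates.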
With Open Rates becoming less reliable, you may be tempted to lean on Click-to-Open Rate instead, but it is important to note that Opens are part of that equation too.
Apple users with MPP-enabled might appear to open your emails, but they may or may not click links.
As your Open Rate artificially inflates, your Click-to-Open Rate will artificially drop.
Click-to-Open Rate has been a great way to measure how effectively content motivated readers to click, but since it depends on Email Opens, it’s now an unreliable metric.
However, Total Clicks are not impacted by Apple MPP.
You can still analyze Total Clicks across campaigns with a similar number of recipients or audience cohort as a way to assess which campaigns are getting the most engagement.
As you shift focus towards clicks as more of a primary metric for your emails and newsletters, it may be helpful to start building benchmarks for Total and Unique Clicks and Click Rate. Don’t be alarmed if Click Rates are below 1% or just lower in general than Open Rates. It’s a different metric!
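One way to start building those benchmarks is to aggregate click metrics across past campaigns. A minimal sketch, with made-up numbers (pull real figures from your ESP’s export instead):

```python
# Illustrative sketch: building simple click benchmarks across past campaigns.
# Campaign data below is invented for the example.
from statistics import mean

campaigns = [
    {"name": "Jan newsletter", "delivered": 5000, "unique_clicks": 110, "total_clicks": 160},
    {"name": "Feb newsletter", "delivered": 5200, "unique_clicks": 95,  "total_clicks": 140},
    {"name": "Mar newsletter", "delivered": 5100, "unique_clicks": 130, "total_clicks": 190},
]

click_rates = [c["unique_clicks"] / c["delivered"] * 100 for c in campaigns]

benchmark = {
    "avg_unique_clicks": mean(c["unique_clicks"] for c in campaigns),
    "avg_total_clicks": mean(c["total_clicks"] for c in campaigns),
    "avg_click_rate": round(mean(click_rates), 2),
}
print(benchmark)  # click rates land near 2%, well below Open Rates, and that's fine
```

Once you have a few months of these averages, each new campaign can be judged against your own baseline rather than against your (inflated) Open Rates.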
Also, if you’re reporting Open Rate to dashboards, clients or sponsors, think about trying to use different success metrics like Click Rate, Conversion Rate or engagement through polls, surveys, etc.
Now, we’ve mentioned using Clicks instead of Opens a lot.
Your newsletter’s purpose may be to deliver all the content within the email itself, so readers never have to leave the inbox. You’re not linking out to a blog to continue reading, or to a store to check out merchandise or a digital product.
That’s okay! There are other ways. Try to create opportunities for readers to click: a quick poll, a one-question survey, or a link to related content or your archive.
OoOOOoh Subject Line testing. Yes, split testing subject lines has been heralded as a way to see what resonates with readers and how to get more readers to open your emails.
The subject line is the front door to your email, along with the preheader.
It’s one of the first things a reader notices and may be the determining factor in whether they open an email from you. But the way some subject line testing is done today isn’t an accurate depiction of what’s going on with your emails.
For example, it’s common to take a percentage of your subscribers and run a subject line test with that cohort. You determine that whichever subject line has the highest Open Rate after X amount of time is the “winner”, and the rest of the list will receive emails with that winning subject line.
If the winner of the subject line test is based on Open Rate, and we know Open Rate is potentially being artificially inflated by Apple MPP-enabled subscribers, the result is not accurate.
You may also need to check that your sample size is large enough to yield statistically significant results.
If you want to test a subject line:
Think about why you are testing these two subject lines. Is one asking a question? Is one in a different voice or tone? Is the word order different? What about character length? Does one include an emoji? Where is the emoji placement?
Try to make them two different approaches, and be prepared to test more than once with these theories. If you want to test emojis in your subject line, don’t let one email determine the success of using an emoji in your subject line. You may need to test it for a bit.
Also, again, think about doing a 50/50 split with the subject lines. This looks like sending 50% of readers Subject Line A and 50% of readers Subject Line B.
This means you don’t pick a winner based on the first X% that open, because there’s a high probability that many of the opens being counted are not real opens.
If you take the 50/50 approach, you can analyze the data after the fact: look at the test’s impact on Click Rate, Unsubscribe Rate, and Conversion Rate.
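A sketch of that after-the-fact analysis might use a plain two-proportion z-test on clicks. This is a generic statistical technique, not any particular ESP’s feature, and all the numbers below are invented:

```python
# Hypothetical sketch: judging a 50/50 subject-line split on clicks, not opens,
# with a standard two-proportion z-test (normal approximation).
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Return (z, two-sided p-value) for the difference in click rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Subject Line A vs B, each sent to half the list:
z, p = two_proportion_z(clicks_a=120, n_a=4000, clicks_b=80, n_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Click rates differ significantly -- a winner based on real engagement.")
```

Clicks are rarer than opens, so you generally need more recipients per arm to reach significance; that’s the price of using a metric MPP can’t fake.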
If your ESP has this functionality, can you remove subscribers with Apple devices from the sends and see what those results are? How do they vary?
If any of your workflows are based on Opens or Open Rates, try Clicks or time-based triggers instead.
A common workflow that utilizes Opens is a Sunset Policy.
We’ll get more detailed about Sunset Policies in a future post, but in a nutshell, it is a policy for removing unengaged users from your email lists. The practice is good for engagement metrics, it ensures you’re sending pertinent emails to engaged users, and it helps deliverability and overall sender reputation.
Engagement is typically defined as opening or clicking emails within a certain amount of time.
An example Sunset Policy might be: remove subscribers who haven’t opened or clicked an email within, say, the last 90 days.
Since Opens are factored into that workflow, and we know Apple MPP-enabled subscribers generate false opens, we recommend dropping Opens and using only Clicks.
A new example Sunset Policy might be: remove subscribers who haven’t clicked an email within, say, the last 180 days.
Clicks remain a reliable metric regardless of MPP. If readers are clicking, they are engaging with the content.
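A clicks-only sunset check might look like the sketch below. The threshold, dates, and addresses are all illustrative; tune the window to your own sending cadence:

```python
# Illustrative sketch of a clicks-based sunset policy. Opens are ignored
# entirely, so MPP preloads can't keep an unengaged subscriber on the list.
from datetime import date, timedelta

SUNSET_AFTER = timedelta(days=180)  # example threshold, not a recommendation
today = date(2023, 6, 1)

subscribers = [
    {"email": "active@example.com", "last_click": date(2023, 5, 20)},
    {"email": "lapsed@example.com", "last_click": date(2022, 9, 1)},
    {"email": "never@example.com",  "last_click": None},  # never clicked
]

def should_sunset(sub):
    """Sunset anyone with no click inside the window."""
    if sub["last_click"] is None:
        return True
    return today - sub["last_click"] > SUNSET_AFTER

to_sunset = [s["email"] for s in subscribers if should_sunset(s)]
print(to_sunset)  # ['lapsed@example.com', 'never@example.com']
```

In practice you’d run this against your ESP’s export, and likely send a re-engagement email before actually suppressing anyone.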
Knowing that a large portion of subscribers use Apple Mail and most likely have Apple MPP enabled, basing anything on Open Rate, like a Welcome flow, Sunset Policy, or Cart Abandonment sequence, means building a workflow on inaccurate information, and that can make for some funky email experiences for users.
For example: think through an email flow knowing that some of your users have Apple MPP enabled, so every email counts as opened even if they never actually read it. What would that experience be like for the user?
Try to rethink it from a Clicks or Conversion Rate perspective. Something other than Opens.
Due to Apple MPP preloading and “opening” emails, open timestamps are also unreliable. For users with Apple MPP enabled, the recorded open time reflects when Apple preloaded the email, not when (or whether) the subscriber actually read it, so you won’t be able to identify the best time of day to send emails from open data alone.
It’s key to consider why you’re sending an email and what metrics to focus on.
Here are some other metrics to measure success: Click Rate, Conversion Rate, Unsubscribe Rate, and engagement through polls, surveys, or replies.
What do you think? Feel free to send us an email and we may feature some comments from readers in the next newsletter! If you're not signed up for our newsletter, sign up below!
Sign up for our newsletter to receive more analysis and insights on all things newsletter.