TABLE OF CONTENTS
Why is my open rate so unreliable?
What does that mean if you're using Substack?
How can I find my "invisible" readers on Substack?
What should I track instead of open rates?
What are "trust signals" and why should I care?
How do I actually get more reader replies?
What about zero-party data?
So... should you clean your list or not?

Your most loyal readers might be invisible. Your numbers are definitely lying... here's what to do about it

When I started my online business in 2008, there was one mantra preached by every marketer, guru, and blogger I followed. It also proved to be true, as smarmy and cheesy as the phrase was (and is):
"The money is in the list."
As in, your email list.
It took me a while to really get into email marketing, but once I did, I never looked back... whether it was my "almost daily emails," launch emails, or the abandoned cart sequences at my e-commerce company (where we recovered thousands of dollars in revenue per month).
Or the time I sent out just a few emails about a new cohort (Email Insiders) that cost $1k per person, and I sold 10 spots in less than a week.
So yes, I'm a huge proponent of email marketing and the power of understanding that not only does it work, but if you're in business, it's also an asset when managed correctly.
I've written a few articles recently about Substack's analytics and subscriber data. I also shared that I deleted 428 cold subscribers (from an imported email list in 2024).
So when a friend of mine, Dinah, left a comment on one of my recent Substack posts that gave me pause, I knew I needed to go deeper.
I'd mentioned cleaning up my subscriber list, and she offered a word of caution. Her husband reads every single one of her articles and a bunch of other Substack publications. But when she looked him up in her subscriber stats? Zero activity. Nothing. Zilch.
He reads in Proton Mail, and Proton blocks all tracking. So, according to Substack, he doesn't exist.

And you know what? It was a really valid point. But her comment also sent me down a rabbit hole that I think every one of us who runs a newsletter as a business (or as a key component of our business) needs to understand. Because the real issue isn't whether or not you should clean your list (I still think you should, and I'll explain why).
The real issue is that a core metric we've all been using to judge our email health has been fundamentally broken since 2021.
Let that sink in.
Here's the short answer: Apple broke it, and nobody told most of us.
In September 2021, Apple launched something called Mail Privacy Protection (MPP) as part of iOS 15. And this is important to understand... it only affects the Apple Mail app. The native mail client on your iPhone, iPad, or Mac. If you use the Gmail app on your iPhone, MPP doesn't touch you. But if you've added your Gmail account to Apple's Mail app (which a lot of people do), it applies.
It's about the app, not the email address.
What MPP does is preload your tracking pixel on Apple's servers before the email ever reaches a human eye. So the system registers an "open" even if the person never looked at the email. Never scrolled. Never even knew it was there. 🤯
This inflates open rates by 15 to 30 percent. And Apple Mail accounts for nearly half of all email opens globally.


And it's not just Apple.
Gmail has been making its own moves. Back in 2013, they introduced inbox tabs that sort your email into Primary, Promotions, Social, and so on. If you use Google Workspace for business, you might not see tabs (I use it and don't have the tabs). But most personal Gmail users do. And in September 2025, Gmail changed the Promotions tab to sort by "Most Relevant" rather than chronologically, so your emails are now ranked by engagement. If your readers aren't clicking, you sink to the bottom. (If you've noticed your open rates dropping and can't figure out why, this might be it.)
But here's where it gets interesting... the problem runs in both directions.
On one side, you have Apple inflating your numbers and making everything look better than it is. On the other, you have privacy-conscious readers like Dinah's husband who use Proton, VPNs, or aggressive tracker blockers that mask their activity entirely. On Substack, these people show up as "0-star" or "invisible" subscribers, even if they're reading every word you write.
Now, let's be honest about the numbers here.
Proton crossed 100 million accounts in 2023, which sounds massive. But according to 2025/2026 Litmus data, Proton Mail accounts for less than 2% of email client usage. Compare that to Apple Mail at 45.5%, Gmail at 23.5%, and Outlook at 5.7%. It's a small player. A growing one, but still small.
So is Dinah's concern overblown? No... because Proton isn't the only one. Tuta Mail, StartMail, Mailfence, Mailbox.org, Canary Mail... they're all in the same anti-tracking lane, and they're all growing. Add in VPNs, browser-based tracker blockers, and Apple's own Hide My Email feature, and the percentage of readers who are invisible to your tracking is higher than any single provider's market share suggests. It's still a minority of your list. But these tend to be the intentional, privacy-aware readers who actually read what you send... they're just invisible to your dashboard.
So your open rate is simultaneously too high AND missing real readers. That's not a metric.
That's a guessing game.

Let's be real... I love Substack. I publish there, I grow my audience there, and it's been incredible for connecting with readers. But Substack's analytics are relatively basic. They rely heavily on traditional open tracking and those "Activity stars," which are exactly the kind of data that privacy blockers distort. ⭐⭐⭐
To Dinah's point, deleting people based on open rates alone is risky. You could be removing loyal readers who just happen to use privacy-focused email clients. And she had a rough experience when she culled her list... her traffic dropped, and it felt like Substack was penalizing her.
But here's the thing to understand about that. When you remove subscribers, you're sending to fewer people. Fewer sends means fewer opens, which means less traffic. That's not Substack punishing you for cleaning your list. That's just math. It doesn't mean the decision was wrong... it means the short-term numbers will dip before they stabilize around a more accurate picture of your actual engaged audience.
And this is where I think you have to be honest with yourself about what Substack is for in your business.
If it's a creative outlet, a place to share ideas, a community you enjoy, that's wonderful. Keep doing that. But if Substack is a revenue channel... if it needs to generate income for you... list health and management aren't optional.
A big subscriber count that isn't converting is a vanity metric, and the wrong people on your list will eventually start unsubscribing, which hurts your deliverability and sends negative signals to mailbox providers. That's worse than having a smaller, engaged list.
And it's not just the activity tracking.
I wrote about this previously, but after digging into my own subscriber data, I found that 40% of my dark subscribers came from the recommendation engine. These are people who subscribed because someone else recommended me in their post-subscribe flow, not because they found my content, read a Note, or connected with what I'm actually building. Substack doesn't differentiate between those subscribers and someone who actively sought you out. They all look the same in your dashboard. We saw this play out with Kit's Creator Network, too... a bunch of cold subscribers who were never going to engage because they didn't choose you.
A big subscriber count isn't the same as an engaged audience, and that's true whether you're paying for your list or not.
If you want the full breakdown on what I found, I wrote about it here.
I use both Substack and Bento.
Substack is my publishing platform and where organic subscribers find me. Bento is my email service provider where I actually own my list, track custom fields, run personalized sequences, and get deeper analytics than Substack can offer.
Now, here's an honest caveat: every ESP (email service provider) receives the same inflated open data from Apple. The tracking pixel fires on Apple's proxy server before any ESP gets involved. The question is whether your ESP has built filtering to identify and exclude those false opens. Major ESPs like beehiiv, ActiveCampaign, and Klaviyo have built MPP detection at various levels. I'm checking with Bento on exactly how they handle this (if you're using a different ESP, you should ask yours the same question). But regardless of which ESP you use, this is exactly why the new metrics I'm about to walk you through matter so much. Clicks, replies, and trust signals aren't affected by MPP at all.
The real reason to use an ESP alongside Substack isn't just analytics, though.
It's segmentation.
When everyone on your Substack list gets the same email, the same offer, the same content, regardless of where they are or what they need... that's not a strategy, that's a broadcast (no judgment...again, it truly depends on why you have an email list and what your personal goals are).
The right offer to the right person at the right time is what drives conversions. And you can't do that on Substack alone. You need to be able to segment by lead source, engagement level, customer history, quiz results (if you have a quiz), what they've clicked on, and what they've bought. That's what an ESP gives you.
Before you make any decisions about your list, though, you need to get clear on what your email is actually for.
Where does Substack fit in your ecosystem?
Is it your primary revenue channel or a discovery tool?
Would it be beneficial to run an ESP alongside it?
Those answers are going to be different for everyone, and more than anybody else's opinion, your business model should drive those decisions.
You don't have to leave Substack. But you do need to be clear about what your subscriber list is for, how you want to use it, and whether you want to go deeper with data and segmentation.
If you're building on Substack right now, you can work around its data limitations. The key is looking for actions that privacy blockers can't fake, because they require a real human to do something intentional.
If you want to go deeper on Substack subscriber analytics, check out StackContacts by Finn Tropy. It gives you filtering and segmentation tools that Substack's native dashboard doesn't offer, so you can actually see what's happening with your list beyond activity stars.

Finn contributed his list cleanup playbook to this article, and I asked him to because I think it's the gold standard for doing this responsibly.
His framing is honest about something most people gloss over: there are legitimate reasons to hesitate before cleaning your list. "Quiet" isn't always "gone." Data isn't perfect. Lags, privacy settings, and platform quirks mean any rule is a best guess, not a guarantee. And if you do it by hand, scrolling through the Substack publisher UI one by one, it's genuinely tedious work.
His playbook exists because those tradeoffs are real. It doesn't pretend that cleaning is risk-free.
Here's how it works:
Define who you're looking at. Instead of eyeballing your subscriber page or exporting a CSV and losing track of who's who, you set up the same repeatable filter every time. Same rules, same signals, same definition of "inactive." So the next cleanup matches the last one, and you're not reinventing the process every quarter.
Enrich before you decide. This is the step most people skip. Before anyone gets flagged, the playbook checks whether they've bought anything, generated subscriber events, or appeared elsewhere in your data. "Inactive on email" doesn't automatically mean "inactive everywhere." Someone who hasn't opened a newsletter in four months but purchased something last week? That's a keeper, and you'd miss that if you were only looking at open rates.
Sort into three buckets. Keep (any engagement within the last 90 days), review (91 to 180 days since last activity), and candidate for removal (zero engagement across all signals for 180+ days). That's a completely different conversation from deleting someone because they have zero stars. If someone has done absolutely nothing... no clicks, no comments, no shares, no purchases, no activity of any kind... for over six months? That's when it makes sense to have the conversation about whether they belong on your list.
The last-chance email. This is the part I really want you to pay attention to. Before anyone gets removed, Finn's playbook recommends sending one short, honest email to the removal candidates only... not your whole list. Something like "I'm cleaning up my subscriber list. If you're still reading, click here, and you'll stay." Then you wait a week or two so opens and clicks can flow back into your data, refresh the list, and only remove the people who still show zero activity. It's fair notice. It catches Proton Mail users, privacy-blocking folks, and people who read in ways your tracking can't see. And it means you're not guessing from a stale snapshot.
Then, and only then, you execute. A controlled batch with logging, so you can audit what happened and have a record for next time.
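To make the bucketing rule concrete, here's a minimal sketch in Python. The field names (is_paid, has_purchased, last_activity) are hypothetical placeholders, not Substack's or StackContacts' actual export columns, and the 90/180-day thresholds are the ones from the playbook above:

```python
from datetime import date, timedelta

def bucket(subscriber, today=None):
    """Classify one subscriber as 'keep', 'review', or 'removal_candidate'.

    Implements the three-bucket rule: keep anyone with engagement in the
    last 90 days, review 91-180 days of silence, and flag 180+ days of
    zero activity across all signals as a removal candidate.
    """
    today = today or date.today()

    # Enrich before you decide: a paid subscription or a purchase is
    # engagement, regardless of what email tracking says.
    if subscriber.get("is_paid") or subscriber.get("has_purchased"):
        return "keep"

    # last_activity = most recent click, comment, share, or other signal
    last = subscriber.get("last_activity")
    days_quiet = (today - last).days if last else float("inf")

    if days_quiet <= 90:
        return "keep"
    if days_quiet <= 180:
        return "review"
    return "removal_candidate"
```

A privacy-blocked reader who bought something last week comes back as "keep" even with zero recorded opens, which is exactly the case the enrichment step exists to catch.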
As Finn puts it, the automation is not "the computer unsubscribes everyone blindly." It's automation for consistency... same rules, same exports, same enrichment, same pattern... so you're not improvising every time. The system removes friction and forgetfulness, not accountability. You still make the calls on the review bucket. You still decide your own comfort level with the policy.
And one caveat he's very clear about, which ties back to everything we've discussed here: "No opens" doesn't prove that someone never cared. Privacy tools and inbox behavior skew opens. Treat "candidate for removal" as a default label from rules, not a moral verdict.
This is exactly the kind of data-driven decision-making I'm talking about throughout this post.
You don't guess, you look at the actual signals.

Filter for Active Commenters and Paid Subscribers. Go to your Subscriber Dashboard and filter for "Active commenters" and "Subscription type is paid." Financial commitment and community participation are proof of a real, engaged reader. Full stop.
Track Link Clicks on Specific Posts. Substack lets you view post-specific stats to see exactly who clicked links in your recent posts. Link clicks require deliberate human action. Apple MPP can't fake a click.
Cross-Reference Signup Dates. Customize your dashboard columns to view "Email opens" alongside "Signup date." A "0-star" subscriber who has been on your list for years without unsubscribing or bouncing? That's very likely a dedicated reader behind a tracking blocker, not a dead lead.
Send a Re-Engagement Email Before You Remove Anyone. This is the last-chance email from Finn's playbook I described above. Target only your removal candidates, not your whole list. One short, honest message. Wait a week or two for the activity to update. Then make your decision with fresh data.
This is where it gets good. With open rates no longer reliable, the industry is shifting to deeper engagement signals. And honestly? These new metrics tell you way more about whether your content is actually working.

The Disaffection Index. Most of us have always asked, "How many people are engaging?" The Disaffection Index flips that question: How fast are we burning through our audience?
It combines your unsubscribes, spam complaints, and bounces into one number. And here's the thing... Gmail and Apple now treat negative signals as much stronger indicators of email quality than positive ones. If your complaints and unsubscribes stack up, it doesn't matter how good your click rate is. Your deliverability will suffer.
Think of it as an early warning system. Track it per campaign, and you'll spot problems with your content or sending frequency before you permanently damage your sender reputation.
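There's no single canonical formula for the Disaffection Index, but the simplest version pools the three negative signals into one per-campaign rate. Here's a sketch under that assumption:

```python
def disaffection_index(unsubscribes, spam_complaints, bounces, delivered):
    """Negative signals as a share of emails delivered in one campaign.

    One reasonable formulation (not an industry-standard formula):
    pool unsubscribes, complaints, and bounces into a single rate.
    """
    if delivered == 0:
        return 0.0
    return (unsubscribes + spam_complaints + bounces) / delivered

# Example: 4,800 delivered, 12 unsubscribes, 2 complaints, 10 bounces
rate = disaffection_index(12, 2, 10, 4800)  # 0.005, i.e. 0.5%
```

Tracked per campaign, a rising number tells you something about your content or frequency is off before your sender reputation takes the hit.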
Conversion Lag Time. Email influences decisions it doesn't directly get credit for. Conversion Lag Time measures how many newsletters someone reads before they take action, whether that's upgrading to a paid subscription, buying a product, or signing up for a workshop.
This is huge for those of us who aren't doing hard sells in every email. It proves your content is warming people up and building trust over time.
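Measuring this is straightforward if you have send dates and a conversion date per subscriber. A minimal sketch (the function and its inputs are my own illustration, not a standard API):

```python
from datetime import date

def conversion_lag(send_dates, subscribed_on, converted_on):
    """Count newsletters a subscriber received before taking action.

    send_dates: dates of every newsletter sent (list of date objects).
    Counts sends between the subscribe date and the conversion date,
    inclusive.
    """
    return sum(1 for d in send_dates if subscribed_on <= d <= converted_on)
```

A reader who joined in March and upgraded in mid-April might show a lag of six or seven issues... evidence that those "non-selling" emails were doing the warming up.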
Reply Rate. Replies are the closest thing email has to a pure engagement signal. Unlike opens (unreliable) or even clicks (can be accidental), replying to an email requires genuine human intent. You have to care enough to type something back.
Even a 1% reply rate is highly meaningful. And mailbox algorithms treat replies as a strong trust signal, which improves your inbox placement over time. So, when I ask you a question at the end of my posts? That's not just community building. It's a strategy.
Trust can feel like a squishy, unmeasurable thing. But you can actually quantify it by tracking specific behaviors that prove your audience values what you're creating. And mailbox providers are increasingly evaluating these exact signals to decide whether your emails land in the primary inbox or get buried.

Replies are emerging as the strongest trust signal. Unlike an accidental click or a bot-triggered open, replying requires real human intent. Mailbox algorithms reward emails that prompt replies because it signals an actual conversation.
Saves and Forwards prove that a reader trusts your content enough to either revisit it later or share it with their own network. Both are powerful.
Long-term engagement consistency matters too. A reader who interacts with your content regularly over months or years is a different kind of asset than someone who opened one email and disappeared.
Preference updates are underrated. When readers proactively update their interests or communication preferences, they're telling you they want a long-term relationship with your content. Pay attention to that.
Active feedback, such as participating in polls or surveys, or responding when you explicitly ask for input, is another strong signal.
And on the flip side, complaint rates are a foundational negative trust signal. Monitoring how rarely people mark your emails as spam protects your sender reputation and keeps you in the primary inbox.
When you track these signals together, you're essentially measuring your brand's credibility, reliability, and intimacy. Newsletters that build this kind of trust convert with fewer aggressive calls to action and lose fewer subscribers across the board.
Knowing that replies matter is one thing. Getting them is another. Here's how to shift from broadcasting to actually having a conversation.
Use low-friction prompts. Make it as easy as possible for someone to reply. Don't ask complex, multi-part questions. Ask something simple and specific that doesn't require a lot of thought to answer. "What's the one thing you're stuck on right now?" lands better than "Please share your thoughts on the current state of AI in your business."
Embed questions naturally into your content. Build prompts for feedback directly into the body of your writing, not just tacked onto the end. When replying feels like a natural next step instead of a homework assignment, people actually do it.
Ask questions that touch on their real situation. Questions like "Does this sound like what you're dealing with right now?" or "What are you struggling with this week?" can actually outperform traditional call-to-action buttons. They're specific, relatable, and they invite a real conversation.
I've been doing this more in my own posts, and the replies I get are some of the most valuable insights into what my readers actually need. It's better than any survey.
The welcome sequence trick. You've probably seen this... someone signs up for your newsletter, and the first email says, "Hit reply and tell me what you're working on," or "Reply with the word YES and I'll send you a bonus resource." There's a reason smart email marketers do this, and it's not just friendliness. When a subscriber replies to one of your emails, it tells their email client (Gmail, Apple Mail, Outlook, whatever) that this is a real conversation with a real person they want to hear from. It's a trust signal that helps your future emails land in the primary inbox instead of getting buried.
This is also where using an ESP really pays off. When someone replies, clicks a link, completes your quiz, or takes any specific action, an email service provider like Bento lets you segment and tag them based on their actions. You can send different follow-up content to someone who replied versus someone who didn't. You can trigger automations based on real engagement, not phantom opens. That kind of behavioral segmentation is something Substack simply can't do on its own. (And honestly, it's the main reason I run both.)
This concept is worth understanding because it answers the question: "If tracking is broken, how do I know what my audience wants?"
First-party data is gathered by observing behavior, like tracking link clicks. Zero-party data is information readers intentionally and proactively share with you. Their interests, their struggles, their communication preferences, and what they want to learn next.
If you've taken my AI Advantage Profile quiz, that's exactly what zero-party data looks like. You told me your AI focus area, your biggest obstacle, and where you are in your building journey. That data is more valuable than a thousand open rates because you gave it to me voluntarily, and I can use it to send you content that actually matches where you are.
Quizzes, surveys, polls, preference centers... these create a transparent value exchange. You're asking readers what they want instead of trying to track what they do in the shadows. In a privacy-first world, that's not just smarter. It's more respectful.
Both things can be true. Dinah's right that open rates are a terrible metric for deciding who to delete. And I'm right that list health matters, especially if you're running a business.
The answer isn't to blindly purge based on star ratings. It's to get better data. Track the signals that actually tell you who's engaged: clicks, replies, paid subscriptions, comments, quiz completions, and link activity. And if someone has genuinely zero engagement across all of those signals over a sustained period of time? That's a different conversation than deleting someone because Apple's tracking pixel didn't fire.
Stop watching the open rate and start watching the relationship.
I'd love to hear from you on this. Have you noticed your open rates being unreliable? Have you found invisible subscribers on your list? Hit reply and tell me... I'm reading every single one. 😉 (see what I did there? lol)
8 questions. Your personalized path. No fluff.
Get My AI Advantage Profile →
Kim Doyal is a digital marketing strategist and AI builder with 18 years of online business experience. She is the founder of AI Spark Studios and SPARK Lab, and the creator of The Hub — a custom 33-agent AI operating system that runs her entire business. She has also built kimdoyal.com, StackRewards, and multiple AI tools and agents using vibe coding, a natural language approach to building software without a traditional development background.