On some Ethereum layer-2 networks, more than half of all gas is now burned by bots scanning for MEV (maximal extractable value) — and they pay only a small fraction of the fees relative to that load. What used to be framed as a philosophical debate about “anonymous money” has turned into an immediate market-structure and scaling problem.
For traders, protocol teams, and infrastructure providers, that points to an uncomfortable conclusion: Ethereum may need privacy technology not just to protect users, but to keep the system efficient and fair enough to scale.
How MEV Bots Turn Transparency Into a Scaling Tax
Ethereum’s open, fully readable state has always been a core feature: anyone can see balances, positions, and pending transactions. That same transparency is also what powers MEV. Bots watch every on-chain action and mempool transaction, competing to extract value from others’ activity via strategies like sandwich attacks, liquidations, and arbitrage.
Flashbots has documented how this “MEV search spam” has ballooned into a major cost center. On some leading L2s, MEV-related search traffic consumes over 50% of gas while accounting for less than 10% of the fees actually paid. The result: honest users and apps effectively subsidize an arms race between MEV bots that doesn’t contribute proportional value back to the network.
Over the 30-day period from Dec. 8, 2025, to Jan. 6, 2026, Alchemy, citing EigenPhi data, highlighted nearly $24 million in MEV profit extracted on Ethereum alone. For large traders, that extraction can overshadow gas entirely. When a hedge fund’s $10 million DEX swap sits visible in the mempool, the slippage from a well-executed sandwich attack can dwarf the transaction fee.
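The scale gap between slippage and gas can be sketched with back-of-the-envelope numbers. All figures below (the 30 bps sandwich loss, the gas price, the ETH price) are illustrative assumptions, not measured data:

```python
# Illustrative comparison of sandwich-attack slippage vs. gas cost on a
# large DEX swap. All numbers here are assumptions, not measured data.

def swap_costs(trade_usd, slippage_bps, gas_used, gas_price_gwei, eth_usd):
    """Return (slippage_usd, gas_fee_usd) for a swap visible in the mempool."""
    slippage_usd = trade_usd * slippage_bps / 10_000
    gas_fee_usd = gas_used * gas_price_gwei * 1e-9 * eth_usd
    return slippage_usd, gas_fee_usd

# Hypothetical $10M swap sandwiched for 30 bps, vs. a ~$15 gas fee.
slip, fee = swap_costs(10_000_000, 30, 200_000, 25, 3_000)
print(f"slippage: ${slip:,.0f}, gas: ${fee:,.2f}")
```

Even a modest 30 basis points of sandwich slippage on a $10 million swap is a $30,000 loss — roughly three orders of magnitude above the transaction fee.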
The conclusion emerging across the ecosystem is stark: privacy is no longer just optional user protection; it’s central to market fairness and, by extension, to Ethereum’s ability to scale without letting extractive MEV dominate blockspace.
Three Pillars of Ethereum Privacy: Reads, Writes, and Proving

The Ethereum Foundation’s Privacy and Scaling Explorations team frames the problem in three parts: private writes, private reads, and private proving. Each targets a different surface where data leakage fuels MEV or surveillance.
Private writes focus on hiding transaction intent before execution — for example, keeping the details of a swap or a liquidation repayment concealed until it’s too late for a bot to front-run or sandwich it. This is the area that has so far received the most attention, because it maps directly to obvious user harm in DeFi.
Private reads are about hiding what wallets and applications are looking at: balances, positions, liquidation thresholds, and query patterns. Those read-side signals are less visible to retail users, but they are a core data feed for MEV strategies and on-chain surveillance.
Private proving covers the zero-knowledge tooling that makes privacy and selective disclosure practical. It’s about generating and verifying zk-proofs cheaply enough that privacy can be embedded into everyday products and protocols instead of being reserved for specialist systems.
Cais Manai, co-founder and CPO of TEN Protocol, argues that the industry has historically fixated on the “who sent what to whom” problem — the write side — while the largest economic leak now sits on the read side. In his view, the fact that “every balance, every position, every liquidation threshold, every strategy is sitting there in plaintext” is what powers today’s MEV ecosystem and keeps many institutional players wary of DeFi.
TEN estimates that more than 112,000 ETH — roughly $400 million at recent prices — has already been extracted from users by sequencers and MEV bots exploiting this readable state. To address that, Manai advocates for encrypting the execution environment itself using Trusted Execution Environments (TEEs), so contract state and logic remain encrypted even while in use and nothing sensitive is left in the clear for bots to mine.
Others see the hierarchy differently. Tanisha Katara, founder of Katara Consulting Group, points out that read-side surveillance is a slow burn, while visible writes are destroying value immediately. For her, pre-execution exposure of trade intent — the classic front-running and sandwiching surface — is the biggest active drain on users, accounting for hundreds of millions of dollars per year in lost value.
Andy Guzman, who leads the Ethereum Foundation’s Privacy and Scaling Explorations team, adds that private reads, despite their importance, remain poorly understood in the broader ecosystem. Private writes draw most of the attention as the “first base,” while private proving has advanced significantly but still requires integration work before it can fully support both read and write privacy in production systems.
From Encrypted Mempools to Private Orderflow: What’s Shipping Now

On the write side, private orderflow is quickly turning from research into product. Flashbots’ MEV-Share turns MEV into an auction: users and wallets can selectively share portions of their transaction data, and by default, 90% of the extracted value is routed back to users rather than being captured entirely by bots and block builders.
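The refund mechanics can be modeled in a few lines. This is a toy model of an MEV-Share-style orderflow auction, not Flashbots code; the 90% default refund rate comes from the text above, while the bid values are made up:

```python
# Toy model of an MEV-Share-style orderflow auction: searchers bid for the
# right to backrun a transaction, and a fixed share of the winning bid is
# refunded to the transaction's originator.

REFUND_RATE = 0.90  # default share of extracted value returned to the user

def settle_auction(bids_eth):
    """Pick the highest bid and split it between user refund and builder."""
    winning = max(bids_eth)
    user_refund = winning * REFUND_RATE
    builder_cut = winning - user_refund
    return winning, user_refund, builder_cut

winning, refund, cut = settle_auction([0.05, 0.12, 0.08])
print(f"winning bid {winning} ETH -> user {refund:.3f}, builder {cut:.3f}")
```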
A parallel effort is targeting the mempool itself. Shutter’s research into encrypted mempools describes a path for using threshold encryption and timed key release integrated with Ethereum’s proposer-builder separation. In such a design, transactions enter the mempool encrypted and are only decrypted once block order is fixed, removing the public mempool as a surface for sandwich and front-run strategies.
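The key invariant — ordering is fixed before anyone can read the payloads — can be illustrated with a minimal sketch. The threshold key committee is collapsed into a single keyholder here, and the cipher is a toy XOR rather than real threshold encryption; this shows only the flow, not a secure design:

```python
# Conceptual sketch of an encrypted-mempool flow: transactions enter the
# pool encrypted, block order is fixed over the ciphertexts, and only then
# is the decryption key released for execution.

import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher standing in for threshold encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)  # held by the key committee until ordering

# 1. Users submit encrypted transactions; contents are opaque to bots.
mempool = [xor_cipher(tx, key) for tx in [b"swap 100 ETH", b"repay loan"]]

# 2. Order is committed while payloads are still ciphertext, so a
#    searcher cannot insert itself around a trade it cannot read.
ordered = list(mempool)

# 3. Key is released only after ordering; payloads decrypt for execution.
block = [xor_cipher(ct, key) for ct in ordered]
print(block)
```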
These designs acknowledge real-world frictions that builders must weigh: added latency, reorg edge cases, and the complexity of coordinating across diverse validator sets. Yet the economics are pushing infrastructure providers to move anyway. Alchemy’s own MEV overview frames MEV extraction as a systemic issue, with around $1 billion of documented MEV profit per year across major chains.
A simple way to think about the current landscape is in layers:
- Writes: Trade intent is exposed pre-execution, enabling sandwiching and slippage. Tools like MEV-Share, private orderflow integrations, and encrypted mempool research target this layer, with the main bottleneck being coordination and default behavior in wallets.
- Reads: Balances, positions, and queries reveal strategies and fuel MEV. Efforts include private RPC, stealth addresses (ERC‑5564), and confidential execution via TEEs. Here, UX — for both users and developers — is the main barrier.
- Proving: Portability and cost of privacy proofs still create friction, although zk tooling has improved markedly. Ethproofs data show roughly a fivefold latency reduction and fifteenfold cost reduction over 2025. The remaining work is mostly integration and product decision-making.
The common theme: the technology is increasingly available, but wallets, apps, and protocols must decide to make these paths the default rather than an advanced setting.
The Silent “Read Side” Leak: RPC, Stealth Addresses, and Developer UX
The Ethereum privacy roadmap now elevates private reads to a first-class concern. One of the most immediate surfaces is RPC privacy — hiding which addresses are querying which contracts and what they are checking. Repeated queries to the same contract function, such as a liquidation threshold, can signal stress to on-chain observers and MEV bots long before any transaction is broadcast.
At the wallet level, stealth addresses are starting to formalize read-side protections. ERC‑5564 defines a standard for generating unique, unlinkable recipient addresses for each payment, giving users recipient privacy even on a fully public chain. That spec is in place, but mainstream wallet adoption is slow. Challenges include how to scan for incoming payments, how to reconcile balances across many ephemeral addresses, and how to manage keys without overwhelming users.
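The core idea behind ERC-5564 — an ephemeral Diffie-Hellman exchange deriving a one-time address that only the recipient can link to their key — can be sketched with modular arithmetic. This toy uses a prime field in place of the secp256k1 curve the standard actually specifies, so the structure parallels the spec but nothing here is production cryptography:

```python
# Toy sketch of the stealth-address idea (cf. ERC-5564), using modular
# exponentiation over a prime field instead of an elliptic curve.
# Do not use this construction for real funds.

import hashlib
import secrets

P = 2**127 - 1   # Mersenne prime standing in for the curve group
G = 3            # group element standing in for the curve generator

def h(x: int) -> int:
    """Hash a shared secret down to an exponent tweak."""
    return int.from_bytes(hashlib.sha256(str(x).encode()).digest(), "big") % (P - 1)

# Recipient publishes a long-lived "meta" public key once.
spend_priv = secrets.randbelow(P - 1)
spend_pub = pow(G, spend_priv, P)

# Sender: fresh ephemeral key per payment, then a tweaked one-time address.
eph_priv = secrets.randbelow(P - 1)
eph_pub = pow(G, eph_priv, P)                      # published with the payment
tweak = h(pow(spend_pub, eph_priv, P))             # hash of the shared secret
stealth_pub = (spend_pub * pow(G, tweak, P)) % P   # unlinkable address

# Recipient scans: recompute the tweak from the ephemeral key and derive
# the private key controlling the one-time address.
tweak_r = h(pow(eph_pub, spend_priv, P))
stealth_priv = (spend_priv + tweak_r) % (P - 1)
assert pow(G, stealth_priv, P) == stealth_pub
```

The scanning burden mentioned above falls out of this structure: the recipient must try the shared-secret computation against every announced ephemeral key to find payments addressed to them.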
Manai argues that the tightest bottleneck for read privacy and confidential execution is not cryptography but developer experience. Builders who want to create private applications still often face entirely new programming models, domain-specific languages, or bespoke proving systems. His vision is for full EVM and SVM environments running inside TEEs so developers can ship encrypted dApps using familiar tools without writing circuits or targeting custom virtual machines.
Until that gap is narrowed, the leak of read-side data will likely remain one of Ethereum’s largest under-acknowledged MEV fuels.
Zero-Knowledge Costs Are Falling — Coordination Is the New Bottleneck

On the proving side, cost used to be the dominant constraint. That is changing quickly. Ethproofs’ 2025 review notes that after onboarding multiple zkVMs and prover systems and verifying roughly 200,000 blocks, proof latency fell about fivefold and costs dropped around fifteenfold over the year.
This means that, for many use cases, proof generation is no longer the hard part. Instead, Ethereum’s bottlenecks are coordination, integration, and user experience. Guzman highlights that for retail users, gas costs and UX remain the primary friction; for institutions, regulation and compliance are the bigger limiting factors.
There is still a nontrivial cost premium for privacy on Ethereum. A standard public transfer costs around 21,000 gas — roughly $0.02 in low-activity periods. A private transfer, by contrast, can be 420,000 gas or more. At quiet times, that might translate to around $0.40, but during busy periods, the same private operation can quickly become expensive for many use cases.
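Those figures can be checked with simple arithmetic. The gas numbers come from the text; the gas prices and ETH price are assumptions that swing the dollar figures widely in practice:

```python
# Back-of-the-envelope cost of a private vs. public transfer.
# Assumed prices: 0.3 gwei (quiet) / 30 gwei (busy), ETH at $3,000.

PUBLIC_GAS = 21_000    # standard ETH transfer
PRIVATE_GAS = 420_000  # representative shielded transfer, per the text

def fee_usd(gas, gas_price_gwei, eth_usd):
    """Fee in USD: gas units * price per gas unit, converted via ETH price."""
    return gas * gas_price_gwei * 1e-9 * eth_usd

for label, gwei in [("quiet", 0.3), ("busy", 30)]:
    pub = fee_usd(PUBLIC_GAS, gwei, 3_000)
    priv = fee_usd(PRIVATE_GAS, gwei, 3_000)
    print(f"{label}: public ${pub:.2f}, private ${priv:.2f}")
```

The gas multiple is fixed at 20x, so the absolute premium scales directly with network congestion: a few cents at quiet times, tens of dollars when blockspace is contested.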
Katara frames the next phase as a coordination challenge rather than a cryptographic one. Questions such as who decides that shielded sends are on by default in a wallet, or who governs the threshold key servers needed for an encrypted mempool, will determine whether privacy becomes normal infrastructure or remains a niche feature. These are mechanism-design and governance problems, not purely technical ones.
Regulation, “Minimum Viable Privacy,” and Where Ethereum Goes Next
All of this is happening under a moving regulatory ceiling. The US Treasury’s 2025 decision to lift sanctions on Tornado Cash removed one flashpoint but did not eliminate uncertainty. Developer Roman Storm ultimately faced a mixed verdict, including a conviction on operating an unlicensed money-transmitting business, demonstrating that legal risk around privacy tooling remains real.
In Europe, the implementation of the travel rule regime under Regulation (EU) 2023/1113 at the end of 2024 requires identity collection and transmission for crypto-asset transfers. That has nudged privacy builders toward designs compatible with selective disclosure, auditability, and policy controls rather than permanent black-box opacity.
Katara notes that permissioned and enterprise chains may end up delivering default privacy to institutions before public chains do the same for retail users, simply because enterprise environments can align compliance, product decisions, and governance more easily.
So what does “minimum viable privacy” look like for an everyday Ethereum user in 2026? Katara expects to see more wallets using one address per application, optional shielded sends, and early-stage RPC privacy features. Guzman points to stealth addresses and shielded pools as already practical, with user interfaces improving quickly and some L2s likely to specialize in payments and private transfers.
Manai is more pessimistic about near-term defaults on major chains, arguing that most users will still broadcast every swap, balance check, and approval in plaintext for now. In his view, a reasonable baseline would be: balances not publicly readable by default, trade intent hidden until execution, and protection against automatic value loss to front-runners.
Looking ahead, three broad paths are visible:
- MEV forces privacy adoption: Private RPC, MEV-protecting routes like MEV-Share, encrypted mempools, and per-app addressing gradually become standard as MEV extraction and institutional flows force wallets and protocols to prioritize protection.
- Confidential execution goes enterprise-first: TEEs and policy-based encryption gain traction in regulated and institutional contexts, prioritizing business confidentiality and compliance over consumer anonymity, with public chains adopting similar tech later.
- Regulatory chill limits defaults: If enforcement broadly targets privacy tooling, privacy may remain opt-in and niche. In that case, builders emphasize selective disclosure and “policy privacy” approaches, such as Privacy Pools, rather than generalized shielding for all users.
Across all three trajectories, one theme is consistent: Ethereum’s MEV problem and data leakage are now measurable economic costs, not abstract ideals. The tooling to address them — encrypted mempools, stealth addresses, confidential execution, and cheaper zero-knowledge proving — largely exists. The real challenge is whether wallets, protocols, and infrastructure providers treat leaking everything by default as a bug to be fixed.
If they do, privacy could fade into the background as boring but essential infrastructure. If they don’t, it may remain a specialist feature for the paranoid and the institutional — while MEV bots continue burning half the gas to mine what the network exposes for free.

Hi, I’m Cary Huang — a tech enthusiast based in Canada. I’ve spent years working with complex production systems and open-source software. Through TechBuddies.io, my team and I share practical engineering insights, curate relevant tech news, and recommend useful tools and products to help developers learn and work more effectively.
