Ethereum co-founder Vitalik Buterin published a post on X outlining his latest vision for scaling the blockchain, arguing the network can boost capacity in the near term while laying the groundwork for a longer-term shift to advanced cryptography and data-heavy blobs that would change how Ethereum is validated. The plan comes on the heels of the Ethereum Foundation publishing a strawmap aimed at making the network more efficient in the long term. In the short term, Buterin says, Ethereum can safely increase throughput by making blocks easier and faster to verify. Upcoming upgrades will let the computers that run Ethereum check different parts of a block simultaneously, rather than processing everything step by step, and the Glamsterdam upgrade will change how blocks are built so that more of each 12-second processing window is used.
The result: Ethereum should be able to fit more transactions into each block without increasing the risk of errors or instability. Another major piece of the plan involves rethinking how transaction fees, known as gas, are calculated. Buterin argues that not all activity on Ethereum puts the same strain on the network: there is a big difference between temporarily using computing power and permanently adding data that every Ethereum computer, or node, must store.
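The compute-versus-storage distinction can be made concrete with a toy fee model. This is a minimal sketch, not Ethereum's actual gas schedule; the price constants are invented for illustration:

```python
# Illustrative sketch (not Ethereum's real fee logic): price transient
# compute separately from permanent state growth, so operations that add
# data every node must store pay far more than temporary execution.

COMPUTE_PRICE = 1      # hypothetical cost units per compute step
STORAGE_PRICE = 50     # hypothetical cost units per byte stored by all nodes

def gas_cost(compute_steps: int, bytes_stored: int) -> int:
    """Charge execution and permanent storage at different rates."""
    return compute_steps * COMPUTE_PRICE + bytes_stored * STORAGE_PRICE

# A compute-heavy call that leaves no lasting state...
transient = gas_cost(compute_steps=10_000, bytes_stored=0)
# ...versus a light call that writes 1,000 bytes of permanent state.
persistent = gas_cost(compute_steps=100, bytes_stored=1_000)

assert persistent > transient  # storage dominates despite far less compute
```

Under a scheme like this, the same nominal "amount of work" costs very different fees depending on whether it burdens every future node or only the current block's validators.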
OKX rolled out an AI-focused upgrade to OnchainOS, its developer platform, pitching it as infrastructure for autonomous crypto trading agents. The AI layer builds on familiar components such as wallet infrastructure, liquidity routing and onchain data feeds, combining them into a unified execution framework aimed at AI agents operating across chains. Rather than wiring price feeds, token approvals, gas estimation and swap routing manually, developers can connect an agent and issue a high-level instruction, such as swapping ETH for USDC below a certain price. OnchainOS handles the workflow behind the scenes, from monitoring markets to sourcing liquidity and confirming settlement.
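The high-level-instruction workflow can be sketched as follows. All names here (the `SwapInstruction` type, the execution function) are hypothetical stand-ins for illustration, not OnchainOS's actual API:

```python
# Hypothetical sketch of an agent acting on a high-level instruction:
# "swap ETH for USDC below a certain price." Names are illustrative only.

from dataclasses import dataclass

@dataclass
class SwapInstruction:
    sell: str
    buy: str
    max_price: float  # execute only when the market price is at or below this

def maybe_execute(instruction: SwapInstruction, current_price: float) -> str:
    """Check the market condition and settle the swap once it is met."""
    if current_price <= instruction.max_price:
        # In a real system this step would source liquidity, estimate gas,
        # route the swap, and confirm settlement onchain.
        return f"swapped {instruction.sell} -> {instruction.buy} at {current_price}"
    return "waiting"

order = SwapInstruction(sell="ETH", buy="USDC", max_price=3_000.0)
print(maybe_execute(order, 3_100.0))  # waiting
print(maybe_execute(order, 2_950.0))
```

The point of the platform, as described, is that the monitoring, approvals, routing, and settlement hidden behind `maybe_execute` are handled by the infrastructure rather than wired up by the developer.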
The intersection between crypto and AI has grown exponentially in the past 12 months, and traders are using the technology to their advantage. In one recent example, a group of retail traders used AI to find pricing glitches on platforms like Polymarket, then instructed AI agents to trade on their behalf. Illia Polosukhin, a co-founder of NEAR, believes the divide is about to collapse, but not in the way many expect: AI is going to be the front end, and blockchain is going to be the back end.
The goal is for AI to hide the blockchain entirely. In this view, blockchain doesn't disappear; it recedes, with AI agents interacting with protocols directly on the user's behalf.
Bitcoin’s latest governance clash escalated as mining pool Ocean produced the first block signaling support for a temporary soft fork designed to restrict arbitrary, non-monetary data in the blockchain’s transactions. The proposal, formally assigned BIP-110 after evolving from earlier drafts, aims to reinstate strict limits on transaction output sizes and arbitrary data fields for about a year. The idea is to curb what proponents see as “spam” uses of block space for non-financial data. They argue that unchecked data, including large inscriptions and so-called OP_RETURN payloads, threatens the blockchain’s original role as sound monetary infrastructure and burdens node operators.
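A policy in the spirit of the proposal can be sketched as a simple size check on data-carrier outputs. This is an illustrative simplification, not BIP-110's actual rules; the 80-byte figure is the historically cited OP_RETURN relay default, used here as an assumption:

```python
# Illustrative check in the spirit of the proposal: allow ordinary spendable
# outputs, but cap the payload size of data-carrier (OP_RETURN) outputs.
# Simplified: real script parsing and the proposal's actual limits differ.

OP_RETURN = 0x6A        # Bitcoin script opcode marking a data-carrier output
MAX_DATA_BYTES = 80     # assumed cap (the long-standing relay default)

def output_allowed(script: bytes) -> bool:
    """Accept non-data outputs; reject oversized arbitrary-data payloads."""
    if not script or script[0] != OP_RETURN:
        return True                  # ordinary spendable output, no data cap
    payload = script[1:]             # simplified: treat the rest as raw data
    return len(payload) <= MAX_DATA_BYTES

assert output_allowed(bytes([OP_RETURN]) + b"x" * 80)       # at the cap: ok
assert not output_allowed(bytes([OP_RETURN]) + b"x" * 81)   # over the cap
```

The dispute is precisely over whether a cap like this should be enforced at the consensus level for a period, or left as a node-local relay policy.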
The community remains deeply divided. Prominent critics, including Blockstream CEO Adam Back, have warned that consensus-level intervention could harm Bitcoin’s credibility, and questioned whether the proposal has enough support to proceed without risking a chain split.