We are living in the era of AI, and AI has changed everything about computing.

The End of “Faster Clocks, Better Computers”

For decades, the computing industry operated under a simple assumption: make the processor faster, and everything improves. Moore’s Law delivered exponential gains in transistor density, and Dennard scaling let clock speeds rise without blowing the power budget, so software engineers could rely on next year’s hardware to run their code faster without changing a single line. That era is over. Clock frequencies plateaued in the mid-2000s, and power density hit physical limits. Yet the demand for computation has never been greater, driven almost entirely by the explosive growth of AI workloads.

Rethinking What “Computing” Means

The rise of AI forces us to fundamentally rethink what we mean by computing. Traditional computing was about executing a precise sequence of instructions as quickly as possible. AI computing, by contrast, is about processing massive volumes of data through layers of mathematical transformations, where throughput and energy efficiency matter far more than single-thread speed. This shift means that the metrics we once used to evaluate performance (clock speed, IPC, the FLOPS of a single core) are no longer sufficient. What matters now is how efficiently a system can move data, how well it can exploit parallelism, and how much useful work it can extract per watt of power consumed.
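To make the metric shift concrete, here is a minimal sketch of comparing systems by work-per-watt rather than raw speed. All the numbers are illustrative assumptions, not measurements of any real chip:

```python
# Hypothetical figures to show why "useful work per watt" can rank
# systems very differently than single-thread speed does.

def perf_per_watt(ops_per_second: float, watts: float) -> float:
    """Useful work extracted per joule of energy (ops/s divided by W)."""
    return ops_per_second / watts

# A fast, power-hungry single core vs. a slower-per-thread but
# massively parallel accelerator (both numbers are assumptions).
cpu_core = perf_per_watt(ops_per_second=5e9, watts=25)        # 2e8 ops/J
accelerator = perf_per_watt(ops_per_second=100e12, watts=400)  # 2.5e11 ops/J

# The accelerator wins on efficiency by orders of magnitude,
# even though no individual "thread" on it is fast.
print(accelerator / cpu_core)  # → 1250.0
```

Under this metric, the "slower" machine is the better one for throughput-bound AI workloads, which is exactly the inversion the paragraph above describes.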

From Instruction-Centric to Data-Centric

The traditional von Neumann model treats data as something fetched to serve instructions. In the AI world, this relationship is inverted: data is the central resource, and computation exists to serve the data. Neural networks do not follow complex branching logic; they perform relatively simple operations (matrix multiplications, activations, normalizations) repeated billions of times across enormous datasets. This regularity and data-centric nature of AI workloads opens the door to architectures that were once considered impractical, such as dataflow processors, systolic arrays, and near-memory computing, each designed to minimize data movement and maximize parallel throughput.
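A back-of-the-envelope calculation shows why matrix multiplication rewards architectures built to minimize data movement: the compute grows as N³ while the data that must move grows only as N². This is a rough sketch with illustrative constants (4-byte elements, each operand touched once), not a model of any particular machine:

```python
# Arithmetic intensity of an N x N matrix multiply: FLOPs per byte moved.
# Assumes fp32 elements and an ideal case where A, B, and C each cross
# the memory interface exactly once.

def matmul_stats(n: int, bytes_per_element: int = 4):
    flops = 2 * n**3                              # one multiply + one add per term
    bytes_moved = 3 * n * n * bytes_per_element   # read A, read B, write C
    return flops, bytes_moved, flops / bytes_moved  # last value: FLOPs/byte

for n in (64, 1024, 16384):
    flops, data, intensity = matmul_stats(n)
    # Intensity grows linearly with n: bigger matrices offer ever more
    # reuse per byte, which is what systolic arrays and dataflow
    # designs are built to exploit.
    print(f"n={n:6d}  FLOPs/byte={intensity:,.1f}")
```

Doubling N doubles the reuse available per byte, so keeping operands close to the compute units (in a systolic array's local registers, or near memory itself) pays off more the larger the workload gets.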

A New Mindset for a New Era

For computer architects, software engineers, and system designers, the AI era demands a change in mindset. Optimizing for sequential performance is no longer the primary goal. Instead, we must think about computation as a system-level problem: how data flows through the entire stack, from storage to memory to compute units and back. The most impactful innovations in the coming years will not come from shrinking transistors further, but from rethinking the fundamental relationship between data, computation, and energy. Those who cling to the old paradigm will find themselves designing systems that are technically impressive but practically irrelevant. The future belongs to those who recognize that AI has not just changed what we compute, but how we must think about computing itself.