3 Apr 2026
PC hardware development has stagnated for consumer applications. We have not had a major foundation-level change in the twenty years since the widespread adoption of 64-bit computing. As a result, we have arrived at the inevitable asymptote of diminishing returns on faster, more efficient hardware. I do not deny that hardware has gotten faster, more efficient, and more affordable; rather, there is only so much that can be done with 64-bit computing, and we have reached the point where widely-adopted hardware goes far longer without needing an upgrade.
I base this conclusion on the axiom that once held true: that hardware power doubles roughly every two years. The axiom held for decades, but the last five or so generations of Intel processors remain relevant and entirely usable in the modern computing environment. The same holds true for Apple's M-series processors: an M1 from early 2021 is entirely cromulent for most web applications.
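For a sense of scale, here is the arithmetic the old axiom implies, as a back-of-the-envelope Go snippet (the doubling rule is the only input; no benchmark data is assumed):

```go
package main

import (
	"fmt"
	"math"
)

// expectedSpeedup returns the performance multiple the doubling
// axiom predicts after the given number of years: 2^(years/2).
func expectedSpeedup(years float64) float64 {
	return math.Pow(2, years/2)
}

func main() {
	// If the axiom still held, a ten-year-old machine would be
	// ~32x behind today's -- a gap nobody actually feels in
	// everyday web-and-office use.
	fmt.Printf("after 10 years: %.0fx\n", expectedSpeedup(10)) // 32x
	fmt.Printf("after 4 years: %.0fx\n", expectedSpeedup(4))   // 4x
}
```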
Consumer hardware stagnation can be traced to two primary causes: the architectural limits of 64-bit hardware, and a modern consumer computing environment whose commonly-run programs have shifted the focus of development from power to efficiency.
As a corollary, I believe the shift to virtualization and Docker containers is stemming demand for professional-grade hardware. Virtual machines allow for greater security while lowering bare-metal overhead at scale (though they carry an initial setup cost and are, one-to-one, less efficient than bare metal).
Graphically speaking, the video games of 1990 looked vastly different from the games of 1995, and the games of 1995 looked vastly different from the games of 2000. But the games of 2020 are more or less on par with the games of 2025 or 2026. Around 2015, the Unreal and Unity engines reached a point of affordable photorealism, and publishers shipped games for the 8th-generation consoles that had achieved the 80% of the 80/20 rule in graphical fidelity. As such, the games from that period still hold up visually in a way that games from earlier generational shifts do not. Bleeding-edge developments still happen, but their marginal difference is much smaller and harder to spot outright; they also take longer to reach the mainstream, because the hardware they require is not cost-effective for most consumers.
This asymptotic phenomenon has also appeared in word processing suites, web browsers, spreadsheet systems, and so on. The things we still run locally have likewise reached the inefficient, high-effort portion of the 80/20 rule. There is only so much that software developers are willing to invest in their programs; that is a simple fact of the limits of space and time. Thus, in the name of time-efficiency, most choose not to exert disproportionate effort chasing the final 20%.
For hardware manufacturers, investing in the next power-level generation requires a business case, and so far there is simply none. Even if there were a ground-breaking new generation -- say, quantum computing -- the current landscape of consumer products is a race to the bottom rather than a race to top-of-the-line power. Your retired uncle, no matter how hard he is advertised to, most certainly does not need quantum computing to browse Facebook and type up the occasional letter.
The transition to ARM-based computing is still under way, and I predict it will be another five years before ARM-based PCs see widespread adoption in the workforce and in consumer tech beyond what Apple has already done. The transition perfectly reflects the preference for efficiency over power: ceteris paribus, ARM and x86-64 seem to be at power parity, and the move is driven not by a desire for power but by a desire for efficiency.
Further, I believe that x86-64 (AMD64) and ARM will coexist. Thankfully, it is so far decently possible to cross-compile, and to translate ARM system calls on x86-64 machines and vice versa. Even in the realm of graphics it is possible: I had a decent time coercing a smorgasbord of retro and modern games that were never written for ARM into running on the M2 MacBook from my college years. So long as x86-64 remains the de facto standard for home computing, though, little development effort goes into running home-computing ARM applications on x86-64 (mobile programs are a different story).
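Not how any of those games were built, obviously, but as a sketch of how low the cross-compilation barrier has gotten, consider Go's toolchain, where targeting either architecture from either architecture is a one-line build (Go here is purely my choice of illustration):

```go
// hello.go -- one source file, two architectures.
//
// Cross-compiling is a one-liner per target:
//   GOOS=darwin GOARCH=arm64 go build hello.go   (M-series Mac)
//   GOOS=linux  GOARCH=amd64 go build hello.go   (x86-64 box)
package main

import (
	"fmt"
	"runtime"
)

func main() {
	// runtime.GOARCH reports the architecture this binary was
	// compiled for: "arm64" on ARM, "amd64" on x86-64.
	fmt.Println("hello from", runtime.GOOS+"/"+runtime.GOARCH)
}
```

Running binaries that never got such a rebuild is the harder half, which is where translation layers like Apple's Rosetta 2 come in.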
Webapps run off of the lowest common denominator of on-machine processing: the web browser. Because of this, and because of the efficiency work done on both Chromium and Firefox over the past decade, running a web browser smoothly has become the entry ticket to modern computing. Much of what a computer's resources were once dedicated to has been offloaded to the browser and off-machine: word processors have been replaced by Google Docs, Adobe Illustrator has been replaced by Canva, and hard disks have been offloaded to the cloud. Because the work has left the machine, there is no longer an incentive to have whole programs running and allocating resources to themselves; instead, as long as the web browser runs well, it can claim as much as it likes. This has led to the rise of Chromebooks and Walmart-tier Windows machines. Even much proprietary software runs in the browser now: during my time on the IT helpdesk, user access to the proprietary software that more or less ran the business was simply a customized web browser (or a Telnet window, but I would be surprised if a machine built within the past thirty or even forty years couldn't handle a single Telnet window to an in-house server).
Of course, there are exceptions to the trend of webappification: serious writers still buy Word, graphic design studios still sell their souls to Adobe, and so on. But those tip into the business-grade sphere, which has always straddled the line between consumer-tier and professional-tier hardware.
As a result, consumer hardware no longer needs to be powerful; powerful enough does the trick. I've had success renewing decade-old PCs with Linux for consumer use cases. Smartphone computing is the ultimate example: it is (for the most part) not complex, but it is so efficient that the one device can last all day. That is Apple's selling point with its ARM devices, and Chromebooks' as well: doing the essentials while lasting forever.
There are some moral considerations to the transition.
Much to my chagrin, much processing is now off-machine. As a result, consumers are more susceptible to information poisoning and manipulation: if processing happens off-machine, it is inherently more likely to be intercepted or manipulated, whether by foreign actors or by the source itself.
Off-box computing also increases reliance on black-box systems, which dampens the freedom of information and invites blind trust and algorithmic manipulation.
On the other hand, since consumer-grade computers last longer, there is potential for less e-waste. Unfortunately, Microsoft's arbitrary exclusion of 7th-generation and older Intel processors from Windows 11 has artificially obsoleted millions of still-good consumer PCs. It gives me a hobby, reviving people's old PCs with Linux, but giving one tech nerd a hobby hardly atones for the millions of machines now destined for the scrap heap.
Consumer hardware has hardly shifted in recent years. This stems from a transition away from power-focused development toward efficiency-focused development, driven by off-box computing and webappification. Furthermore, we are still working with x86-64 hardware, and foundation-level upgrades no longer offer massive marginal benefit while still carrying enormous overhead; even the transition from x86-64 to ARM is about efficiency, not power. Consumer hardware is boring now; it has been foundationally the same for a long time.