Lost Opportunity? Windows 10 Has The Same Minimum PC Requirements That Have Existed Since 2006
As the years roll by, I can't help wondering what we're actually giving up in exchange for holding the minimum system spec at a single-core 1GHz, 32-bit chip with just 1GB of RAM. Back when Windows Vista was in development, those specifications made some sense -- single-core chips were still quite common at the low end of the PC market, particularly in mobile systems. When Microsoft built Windows 7, it wanted to make the new OS available to as many buyers as possible, so it held system specs steady. Between improved OS performance and the proliferation of dual-core CPUs, Windows 7 was much snappier on the low-end systems of 2009 than Windows Vista had been in January 2007.
Does this desktop make you twitch uncontrollably? You may have Post-Vista Stress Disorder.
Then along comes Windows 8, and again it makes sense for Microsoft to stick with the same system specs. This time around, the focus is on pushing the OS into mobile platforms, and all the press discussions center on lower power, higher efficiency, and a GPU-accelerated desktop to further improve battery life. If you're pushing into new form factors and lower-end hardware, freezing the x86 specs while you wait for ARM chips to catch up in performance is the only smart thing to do.
Now, with Windows 10, Microsoft is again pandering to the bottom of the market, and the minimum specifications haven't budged an iota. To be clear, I'm not saying Microsoft has utterly neglected the top end -- Windows 8 built on Windows 7 with Direct2D and significantly improved desktop compositing, and Windows 10 will introduce features like the much-discussed DirectX 12. The question isn't whether Microsoft has continued adding new features and capabilities, because it has -- it's where we might be if the company were more willing to push the envelope.
The Minimum Spec Anchor
It might be hard to remember, but there was a time when new versions of Windows genuinely pushed the envelope. Windows didn't always do so gracefully, and even its best features, like Plug and Play, sometimes took years to work reliably across the hardware market. Still, Microsoft's Windows releases used to be demonstrations of what PCs could do -- up until Windows Vista.
With Vista, Microsoft caved to Intel's demands that it break its own rules regarding GPU support, and its attempts to raise the bar on PC performance often resulted in slower, buggier experiences (though a substantial cause of this, at least initially, was poor Nvidia drivers). Since then, the company has held the line -- despite the fact that even the bargain-basement PCs of 2014 are far more capable than the equivalent hardware of 2006.
True, Intel and AMD have continued to push innovative features and standards -- OpenCL support, Haswell's Transactional Synchronization Extensions (TSX), and HSA itself -- but how much more quickly might those capabilities spread if they were baked into Windows and supported natively by the operating system? As things stand, every application that wants to use them has to probe for them on its own.
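As a minimal sketch of what that per-application probing looks like -- my own illustration, not anything Microsoft ships, and assuming MSVC on an x86 CPU -- here's a check for the two halves of Haswell's TSX via the CPUID instruction:

```cpp
// Hypothetical sketch of the feature detection apps do today when the OS
// doesn't guarantee a CPU extension. Assumes MSVC on x86; bit positions
// follow Intel's documentation.
#include <intrin.h>
#include <cstdio>

int main()
{
    int regs[4] = {};  // EAX, EBX, ECX, EDX after CPUID

    // First make sure the CPU implements leaf 7 at all.
    __cpuidex(regs, 0, 0);
    if (regs[0] < 7) {
        std::puts("CPUID leaf 7 not supported -- no TSX here.");
        return 1;
    }

    // Leaf 7, subleaf 0: structured extended feature flags.
    __cpuidex(regs, 7, 0);

    // EBX bit 4 = HLE, EBX bit 11 = RTM (the two halves of TSX).
    std::printf("HLE: %s\n", (regs[1] & (1 << 4))  ? "yes" : "no");
    std::printf("RTM: %s\n", (regs[1] & (1 << 11)) ? "yes" : "no");
    return 0;
}
```

If Windows itself mandated and exposed such features, code like this would shrink to nothing -- software could simply assume they exist.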
For decades, the standard argument has been that Microsoft had to keep supporting ancient operating systems and old hardware configurations -- an argument that ignores the fact that the company did its most cutting-edge work when it was willing to kill off its previous products in fairly short order. I'm not arguing that we should return to the bad old days, when a computer was old at 12 months and creaking at 24, but what would Windows look like if Microsoft jettisoned support for 32-bit processors (while keeping the ability to run 32-bit software) and mandated at least a dual-core CPU? What if DX10 -- a feature set that virtually every video card in use today supports, according to the Steam Hardware Survey -- became the minimum standard, at least on the x86 side of the equation?
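For the curious, here's a hypothetical sketch (again mine, not from any Microsoft requirement) of how a Windows program can test that proposed DX10 baseline: ask Direct3D 11 for a device at feature level 10_0 and see whether the hardware obliges.

```cpp
// Hypothetical DX10-class hardware check using the documented trick of
// passing null device/context pointers to D3D11CreateDevice, which queries
// feature-level support without keeping a device around. Link with d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main()
{
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_10_0 };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,  // real GPU, not a software rasterizer
        nullptr, 0,
        wanted, 1,                 // succeed only at feature level 10_0+
        D3D11_SDK_VERSION,
        nullptr, &got, nullptr);   // null device/context: query only

    std::printf("DX10-class GPU: %s\n", SUCCEEDED(hr) ? "yes" : "no");
    return SUCCEEDED(hr) ? 0 : 1;
}
```

On a machine meeting the proposed baseline, that check would always come back "yes" -- which is precisely the point of raising the floor.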
How much better might the final product be if Microsoft put less effort into validating ancient hardware and kicked those specs upwards, just a notch or two?
Guess we'll have to wait for Windows 11 to find out.