I'm imagining some utopian gaming future where my only computer is a retina MacBook Air, and to game I plug in an HD Oculus Rift and an external graphics card via Thunderbolt X, where "X" is whichever generation of Thunderbolt finally provides full PCIe x16 bandwidth. How many years away?
True enough that we soon get left behind, but that's usually the case. When new consoles launch, they offer some of the most powerful hardware available to gamers unwilling to spend a fortune on a high-end gaming PC. But over the lifetime of any console, that advantage soon ebbs away. Then there's the games: for RTS gamers, et al, PC/Mac will always be the gaming platform of choice, on top of all the other reasons we buy Macs.
On a brighter note, if each new generation of Macs continues to see an incremental increase in GPU power (regrettably, there are always a few exceptions with Apple; e.g. both the 2011 high-end Mini and entry-level iMac saw a slight GPU downgrade in 2012), that still considerably expands the pool of Mac games playable at optimal settings. Even most new games will already run on lower-end Macs at acceptable medium settings.
Also, with PC sales well down, I suspect it's in very few developers' interests these days to make the sort of games that only a minority of computers can run at playable levels. The great thing for porting houses like Feral is that there's still a good number of older games to be brought over to Intel Macs. By the time more demanding titles like Total War: Rome II come over to Mac (PC release scheduled for circa December 2013), I think it's likely that even a current Mini will run it at close to medium settings, if not comfortably at medium.
None of this changes my view on why I'd prefer to be able to buy an upgradable "xMac".