We are rapidly coming up on the first anniversary of Vista, and it has been a very rocky year for Microsoft. New releases of an OS are always difficult, but many users have really turned up their noses at Vista. My experience has been much the same as everyone else's: applications run slower in Vista (I know because I had a system set up to dual boot and A/B tested a number of applications). Networking, particularly wireless networking, is much less stable than in XP. Good drivers STILL don't exist for many legacy hardware devices, including many graphics cards. And I ran into any number of quirks. The most irritating for me: a laptop communicating with a printer over the wireless network would lose its connection to the printer every time the laptop was shut down, a problem that (as confirmed by MS customer support) could only be fixed by reinstalling the print driver every time I wanted to print.
Most computer NOOBs probably never noticed, not having anything to compare Vista with and only using their computers for a narrow range of functionality (i.e., email and internet browsing). However, many of us who are more comfortable with computers and who rely on our computers as an important tool have avoided buying Vista computers (Dell, for example, still sells a lot of XP computers) and/or have taken the time to roll back their Vista machines to a dual-boot setup or even to XP only (which I explain here). That may explain why standalone XP packages are better sellers on Amazon than Vista.
For gamers, most of whom tend to be power users, Vista has been nothing but a negative, slowing games down and requiring the use of buggy graphics card drivers. (Microsoft crows that it gets fewer customer service calls on Vista than on XP, which may be true, but I can guarantee, from browsing gaming boards, that game companies get swamped with Vista calls from gamers who can't get their games to run on Vista.)
Looming over all of this, though, has been one word: Crysis. Gamers have been lusting after this game for over a year, with its promise of knockout graphics and gameplay. To this end, Microsoft did something clever. It updated its DirectX graphics API in Vista to revision 10 and included all kinds of new capabilities that would really make a game look fantastic. MS decided, either for technical or marketing reasons, never to release these features on XP. If you wanted DirectX 10 games, you had to upgrade to Vista. Over the last year, graphics card makers have been releasing hardware to support DirectX 10. Crysis was set to be the first game that would really take advantage of DirectX 10, and many hardcore gamers upgraded to Vista solely on the promise of running Crysis maxed out with the new DirectX 10 features.
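For the technically curious, that Vista-or-XP fork shows up right in a game's startup code. Here is a minimal sketch (not from Crysis or any actual engine, just an illustration in plain Win32/C++ with helper names I made up) of how an engine of this era might probe for a Direct3D 10 device, which only succeeds on Vista, and fall back to Direct3D 9, which runs on XP as well:

// Hypothetical sketch only: probing for DirectX 10 support at startup.
#include <d3d10.h>   // Direct3D 10 -- Vista only
#include <d3d9.h>    // Direct3D 9  -- works on XP and Vista

// Try to create a hardware Direct3D 10 device. This call fails on XP
// (no D3D10 runtime) and on machines without a DX10-class GPU/driver.
bool TryCreateD3D10Device(ID3D10Device** outDevice)
{
    HRESULT hr = D3D10CreateDevice(
        NULL,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // hardware acceleration
        NULL,                        // no software rasterizer module
        0,                           // no creation flags
        D3D10_SDK_VERSION,
        outDevice);
    return SUCCEEDED(hr);
}

// Fallback path: Direct3D 9 is available everywhere, but none of the
// new DirectX 10-only effects can be reached through it.
IDirect3D9* CreateD3D9Fallback()
{
    return Direct3DCreate9(D3D_SDK_VERSION);
}

In other words, the fancy new effects are literally unreachable from XP, no matter how fast your hardware is, which is exactly the lever Microsoft was pulling.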
Well, Crysis was released a few weeks ago. You may think I am building up to say it sucked, but just the opposite is true. It is absolutely fantastic, easily the most visually stunning thing I have ever seen running on my PC. First-person shooters are not really my favorite genre, but I have thoroughly enjoyed the game. (Here is a trailer; unlike most trailers, the game really looks like this in actual gameplay, maybe better given the limited resolution on YouTube.) Click below for larger screenshots:
But here is the interesting part. I keep my system state of the art: close to the fastest Intel multi-core processor currently made, paired with two of the newest Nvidia graphics cards (8800 GTs) ganged together in SLI mode (don't worry if you don't know what all that means, just take my word for it that it is about as fast as you can get with stock components and air cooling). Crysis, like most graphics-heavy games, lets you change its settings from "low," meaning less graphics detail but faster performance, through "medium" to "high" and "very high." Only in the latter modes do the new features of DirectX 10 really come into play. So I ran the calibration procedure the game provides, and it told me that I needed to set the game to "medium"! That's not an error: apparently everyone else in my position, running a large monitor at high resolution, has had about the same experience. I can set the game to higher modes, but things really slow down. By the way, it still looks unbelievably awesome on medium.
The designers of Crysis actually did something kind of cool. They designed with Moore's Law in mind, building the highest game modes for computers that don't exist today but likely will in a few years. So the game (and more importantly the engine, since they will likely sell the engine as a platform for other game makers to build their games atop) has some built-in obsolescence-proofing.
But let's return to Vista and Crysis being billed as a killer app. As it turns out, none of the DirectX 10 features are really usable, because no one can turn the graphics engine up high enough with current hardware. Worse, in a game where users are trying to eke out any tweak they can to improve frame rates, Crysis runs demonstrably slower on Vista than on XP. Finally, those who have run the game in its higher modes with the DirectX 10 features enabled (presumably at the cost of low frame rates) have found the actual visual differences in the DirectX 10 graphics to be subtle. The game boards are a total hoot, as folks who upgraded to Vista solely for Crysis are wailing that their experience on Vista is actually worse than on XP.