If you are not a gaming fan, you might consider skipping this article. (It's boring anyway!)
First, let me state what many people already know. Even after two successive Windows releases, Windows XP still dominates the market as the primary operating system for users around the world. With the advancement of computer hardware and software, you would expect people to move forward. In fact, that's what actually happened in the past. If we consider only the Windows community, users steadily migrated to new operating system versions from the Windows 3.0 era through the XP era. As hardware advanced over time, operating system designers were able to deliver new features and concepts, and users willingly embraced them.
But since then, the Windows ecosystem seems to have been stuck on Windows XP for a long time. (Arguably, there may be several reasons for this, including Microsoft screwing up with Vista, but that's not what this article tries to discuss.) The point is, despite the rapid growth of hardware and software capabilities, most end-users have decided to stay with what they are already familiar with. It seems that the constant "change" that used to characterize end-users has "stabilized" over the past few years.
From my point of view, the gaming industry has reached a similar stabilization point in terms of exploiting hardware capabilities. In the case of Windows XP, it was the end-users who seem to have reached a stabilization point. But in the gaming industry, it's the game developers who have stabilized. In the past, we saw games being released with ever-higher hardware requirements. When you bought a graphics card, it would be obsolete within a month or two. But now, all the new gaming titles seem to target a level of hardware requirements that in no way demands a high-end graphics card.
Why is that? There's a lot to it. Talking about Windows, users were quite satisfied with the features offered by Windows XP. Most couldn't justify moving to Vista given the overhead it placed on both the system and the user. The jump meant a considerable change in user interaction. Users simply chose to stick with what they already had, since it was familiar and sufficient for their requirements.
The same goes for the gaming industry, only it may be a little more complex. There is a level of hardware specification that can be regarded as familiar and sufficient for most gamers, and that's the level game developers are going to build their games on. This is mostly decided by the current generation of gaming consoles. Unlike PCs, gaming consoles (Xbox, PlayStation, etc.) are the major market for games. We see new PC graphics cards released every month, but gaming consoles may take years to iterate. So a game that requires the highest hardware capabilities ends up being available to only a limited set of consumers. Since developers want to expand their market, they compromise on graphics to make their games available to a larger gaming community.
On Windows, the usual pattern holds here as well. Developers will write software for XP because a large user base is available, and users will continue to use XP because there is plenty of software available. The downside is that users of old operating systems and software never get to benefit from new hardware features that only newer software can exploit.
Talking about games, of course, graphics alone do not make a game. There's a lot more to it. Developers can make superb gaming titles through compelling story lines, music, and gameplay experience. But for "graphics freaks" like me, gaming graphics will always matter! In fact, I'm using a somewhat old graphics card (see here). Yet it has more horsepower than any of the recent games I've played needs (except Crysis). I really wish those games exploited that extra power and pushed the graphics further. Although we expect games to become more and more realistic, progress will take longer while the industry addresses the requirements of the global market.
If you've read this far, I know there are plenty of points you may not agree with. It's open for discussion...