Entry 100: DX10, part 3 - The BioShock Dilemma (not a review)
First, I feel sad for all my fellow PC nuts having technical glitches with BioShock. I told you, you had to be prepared; this is a geeky game. Otherwise, choose the X360. For my part, I'm almost embarrassed to confirm that my copy runs remarkably well on my rig, even under Vista/DX10, which is supposedly the less stable option - not in my case. Like everyone else, I'm impressed by the overall gameplay AND the artistic achievement here, but nothing's perfect - just near perfect, as I will explain in my upcoming review sometime in September.
Evidently, people are talking aplenty about this game nowadays, so I just want to add my two cents on the very elitist subject of the game's technical achievements under Vista/DX10 - if there are any easily noticeable ones at all compared to the DX9 option. Well, there are some water, smoke and bloom touches, but BioShock wasn't built on DX10; most of the development took place under DX9, of course. It isn't the kind of bandwagon showcase that the likes of Crysis and Unreal Tournament III should bring later on. So I'm a bit disappointed by the very slight (yet noticeable) differences between the two modes offered, though the better water effects sure look amazing under Vista. The developers at Irrational clearly favoured a stronger artistic look over a, let's say, less polished technical side that will surely need several patches.
''I can't stand jaggies''
Furthermore, there's a price to pay for playing BioShock under DX10: NO anti-aliasing support, either in-game or from the Nvidia/ATI control panels as of this writing. The ONLY way to force AA, and only CSAA at that, is to play the DX9 mode under XP, which brings additional bugs (corrupted textures)... Oh my! I know this is an insanely small detail to care about, but some HD enthusiasts like me have a hard time standing jaggies now - even at 1920 x 1200. Yep, we have this one little irritating geeky problem. To its credit, though, BioShock irritates much less without AA than most other 3D extravaganzas (indoor settings camouflage the missing feature better than outdoor shooters do). Luckily, the engine seems built with full HD resolutions in mind too, BUT in this day and age, I must insist: it is unacceptable that BioShock lacks AA support when 2004 titles like Half-Life 2 handle it so well.
Some have found a scapegoat: the Unreal Engine 3.0, which Irrational did not fully optimise to use the new shader model and AA simultaneously. How ironic, then, that the engine's proprietor (Epic) clearly states its flagship title UT III WILL support full AA under Vista/DX10, whilst BioShock doesn't. Adding to the irony, Epic will offer it in-game, of course, and only under Vista. Quite the contrary for BioShock, i.e. only under DX9 and forced through the Nvidia control panel. ATI owners can't enable it at all, whatever the DX mode. It is unknown whether Nvidia will be able to handle it via a new driver. As a side note, there's no AA on the X360 version either. End of rant.
This is what the dilemma is about: two medals, each with two faces. 1) Play the PC version, which looks better and is more fluid with a mouse; or 2) play the X360 version. If you choose the PC version, then: 1) you'll most likely experience frustrating glitches if you're unprepared; 2) even the PC nuts running it well shall tolerate the little jaggies for the time being.
If I remember correctly, even the good old Nintendo 64 had some kind of anti-aliasing feature. In the end, and read carefully, BioShock does indeed innovate as a cinematic experience, but not technically. I think Microsoft and all the major developers should work more closely together to quell this little technical quagmire, which once again drives apart different classes of gamers.