
Serioussamik Blog

Your Graphics Card is Outdated!! Seriously?

Pardon me if I ask a much-clichéd question: which part of the PC do we gamers replace all too frequently? Yeah, no prizes for guessing! The graphics card. But is it imperative to own the latest series of GPUs as soon as they hit the market? Does the old GPU really get 'outdated' as frequently as it's made to look? Does the newest line of GPUs always deliver the graphical bliss it promises? Pardon me for asking too many questions. With pixel-pushers skyrocketing in price and anti-aliasing levels reaching new highs, it's a worthwhile query, isn't it?

I was using a very able ATI Radeon 9250 Pro back in Jan 2004 (I was on a budget then). Then came the GeForce FX 5200; some of my friends got it and proposed that I too switch to a 5200, because it supports DX9. I bit the bullet, sold the 9250, threw in some extra cash and got the 5200. Improvements? None. Zilch.

OK, then in Dec 2004 my pocket was looking up a bit and I got myself an ATI Radeon 9800 Pro. And only a few weeks later (to my horror) I found out that Nvidia was offering cards with full DX9.0c compatibility (the 6 series). Oh, nightmare! Really, I thought, whatever will I do without all that DX9.0c goodness? So, back to the chore: sold the 9800 Pro, threw in some extra cash and got the only reasonable 'DX9.0c' card I could afford, the 6600GT. To my utter dismay I quickly found out (from the person who bought my 'old' 9800 Pro) that in some games the Radeon card was outdoing the 6600GT. Oh joy!

I was still with that 6600GT in 2007 when the shiny new 8600GT, 8600GTS and 8800GTS (then the nice 8800GT) hit the market. Got an 8600GT and thought it was a worthwhile upgrade! WRONG!!! I quickly read the reviews and benchmarks of the card and found that, DX10 compatibility aside, the 8600GT was hardly the leap over my existing 6600GT that it was made out to be. This time (tired of feeling cheated) I returned the card to the vendor, giving him flimsy reasons! I didn't get any takers for my 6600GT (thankfully), so I resumed using it and then gifted it to my cousin (who's not much of a gamer, by the way) in fall 2008. He could (believe it or not!) play Crysis at low resolution and settings, and even managed to play Call of Duty: MW2 at 1024x768 with reasonable settings. He's happily using it still!

So I got the HD4850 after thoroughly reading reviews on the net. I'm still using it and don't intend to upgrade for at least another six months.

Figure this: some midrange/entry-level GPU box covers claim exotic features like "16x anti-aliasing", "24x anisotropic filtering", "full HDMI support", "life-like physics support", "X number of pixel shader units", shared-memory support up to colossal amounts, and other shiny thingies! But in reality very few of them translate into a real performance increase. What will an entry-level card do with 1 GB or 2 GB of memory? NOTHING but waste it, if it hasn't got the speed or the shaders to make use of all that memory.
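To put a rough number on that last point, here's a back-of-the-envelope sketch (my own assumptions, not anything off a spec sheet) of how much memory the render targets at a given resolution and anti-aliasing level actually need, assuming 32-bit colour, a 32-bit depth/stencil buffer, and ignoring textures and shadow maps. Even 1920x1200 with 8x MSAA works out to roughly 150 MB, nowhere near 1-2 GB:

# Rough estimate of render-target memory per frame (hypothetical numbers,
# ignores textures, shadow maps and driver overhead).
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4):
    color = width * height * bytes_per_pixel * msaa   # multisampled colour buffer
    depth = width * height * 4 * msaa                 # 24-bit depth + 8-bit stencil
    resolve = width * height * bytes_per_pixel        # resolved back buffer for display
    return (color + depth + resolve) / (1024 ** 2)

for (w, h) in [(1024, 768), (1440, 900), (1920, 1200)]:
    for msaa in (1, 4, 8):
        print(f"{w}x{h} @ {msaa}x MSAA: ~{framebuffer_mb(w, h, msaa):.0f} MB")

Textures and other assets eat more on top of that, of course, but an entry-level chip runs out of fill rate and shader power long before it runs out of that kind of memory.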

Physics, you say? A few extra ragdoll effects and some extra boulder debris? Is that what makes your exorbitantly priced GPU so effective and reasonable? I don't think so. Yeah, physics engines do add to the visual quality here and there, but it's nothing that justifies a 'generation gap'. For the record, DirectX API improvements can mean a lot. But other than that monumental DX9.0c-to-DX10 jump, DX10.1 and DX11 offer very little that's new.

Oh, multi-monitor compatibility! As if it's the next best thing to have happened since the equally dubious multi-GPU setup. Let's get this straight: spreading the same image across multiple monitors is lovely indeed! But that feature is quickly offset by the fact that you have to shell out a lot of dough for three monitors! And do features like these help you squeeze any extra framerate out of your GPU? No, sir!!!

So, the moral of the story: graphics cards don't "evolve" as fast as their makers claim unless you are buying a flagship card. Even today an 8800GT can run like a charm as long as you are gaming at 1440x900 or below. Don't fall prey to the tall claims of GPU manufacturers; they'll keep ringing that bell. But at the end of the day, it's your call.