This is my main complaint with consoles: the near-total lack of anisotropic filtering and anti-aliasing. It's not that the hardware can't do it; for some reason developers just don't use it.
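To back up the "it's not that they can't do it" point: on the PC side, turning on anisotropic filtering is close to free in terms of developer effort. Here's a minimal hypothetical sketch using OpenGL's widely supported GL_EXT_texture_filter_anisotropic extension (a real app would first check the extension string before using these tokens):

#include <GL/gl.h>
#include <GL/glext.h>

void enable_max_anisotropy(void)
{
    GLfloat max_aniso = 1.0f;
    /* Query the hardware's maximum supported anisotropy (often 8x or 16x). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    /* Apply it to the currently bound texture. That's the whole cost to
       the developer; the hardware does the per-sample work from here. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}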
That's interesting. Now you could post pics of games with temporal and adaptive (multisampling/supersampling) AA, or whatever Nvidia calls its equivalents, so we can see the difference between the various types of anti-aliasing. That would be cool!
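For context on how those AA modes get turned on in the first place: multisampling (MSAA) is typically requested when the GL context is created, while supersampling effectively renders to a larger buffer and downscales. A minimal hypothetical sketch, assuming SDL is the windowing library:

#include <SDL.h>

void request_msaa_context(void)
{
    /* Ask for a multisampled framebuffer with 4 samples per pixel. */
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);
    /* ...then create the window/context as usual and call
       glEnable(GL_MULTISAMPLE) once the context is live. */
}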
This thread and its examples just prove what I've said all along: unless there's NO performance difference from turning on AA and AF (or even just one of them), they're not worth enabling. The performance hit you take on most cards (in newer games) just isn't worth the modest difference AA and AF make. Maybe it's just me, but unless I'm playing some SUPER slow-paced game where you actually have time to inspect and dissect the details of the scenery (like golf or something), the difference AA and AF make isn't even noticeable. It's more important to me to have the resolution I want, the color depth (even though everything now is 32-bit), and smooth gameplay than to notice the odd minuscule difference here and there.

Now if you have a $600 card that can handle full AA and AF, by all means crank 'er up; I'm not saying that's a bad thing. I'd just rather save my money and get a card that can do max settings "without" AA and AF than double the price of my computer for one that can do max settings "with" them.
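The performance-hit claim is easy to put numbers on, at least for supersampling. A back-of-envelope sketch (assumed 1280x1024 base resolution for illustration): 4x SSAA doubles the internal resolution on each axis, so the card fills and shades four times as many pixels, while 4x MSAA only multiplies coverage samples and is correspondingly cheaper.

#include <stdio.h>

int main(void)
{
    const int w = 1280, h = 1024, factor = 2;   /* 2x per axis = "4x" SSAA */
    long base = (long)w * h;
    long ssaa = (long)(w * factor) * (h * factor);
    /* Prints: base 1310720 pixels vs 5242880 pixels, i.e. 4x the work. */
    printf("base: %ld pixels, 4x SSAA: %ld pixels (%.0fx the work)\n",
           base, ssaa, (double)ssaa / base);
    return 0;
}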
I think it's great of the TC to make and post these graphical displays (pardon the pun) of the differences, so people can make an informed choice based on actual visuals. It really helps take the guesswork out of deciding whether to dump a LOT of cash you maybe can't afford on a card, or to go with one you CAN afford that will do the job nicely, if not perfectly, with AA and AF.