PC gaming is crushing consoles


#551 Edited by Zelda187 (715 posts) -

@mr_huggles_dog said:

@Bebi_vegeta said:

@mr_huggles_dog said:

Oh....so SW is about stupidity and idiocy? You mean there is no reason to converse or argue or discuss anything here b/c it's all just a bunch of trolls like that guy with the knight avatar who keeps replying to me as if I care what he thinks....as if I can't tell he NEEDS attention?

That's the most idiotic thing I've ever heard. People who argue with facts are the ones that don't need to care about what anyone says b/c facts say a lot. I don't need to act like an idiot or "omg, you totally need new insults LOLWTFBBQSAUCE!!!"....b/c first...that wasn't my intent...I stated something that is fact.

YOU and your PC gamer need for everyone to think PC gaming is dominating have to spin every word I say.

IT'S PC vs CONSOLE. That INCLUDES the 4 current consoles most ppl are enjoying.

It's not PC vs whoever I decide to be my opponent at the time so I can win an argument. Personally I think the PC is 100000x better than the 360. BUT......the PS3 has too many great AAA exclusives to say PC is dominating the "PC gaming is better" argument. And soon....the PS4 will be in the same boat.

I named a FEW??? I named like 8 exclusives AND THOSE ARE THE ONLY ONES I COULD THINK OF OFF THE TOP OF MY HEAD.

Again....I'm not mad...I don't have to be...console gaming is clearly more profitable, has better exclusives, and whatever else....but you and the lackeys felt the need to insult my integrity first which more than likely means you know I'm right.....and then you spin everything I say.....THAT'S why I type in caps.

B/c you seem to need to be able to read IN BIG LETTERS.

Ok, so you're telling me System Wars is really a serious debate forum? As in you want to debate my preference over your preference?

Your intent or not, it is what it is. You clearly tried to insult me, there's no turning back on that.

Yes, and there's a shitload of exclusives for PC that could take on the whole console list.

More profitable? Why, as a gamer, should I care about that?

Better exclusives? According to who, you?

Whatever else? Oh nice point, you can't argue something when you don't even know what you're talking about.

You keep on repeating yourself about the spinning thing, the only thing spinning is your broken record.

Wrong...I stated my viewpoint and then you and the lackeys started assaulting my integrity saying in nerd talk "Oh...it's just a self preference b.s. opinion". That's when I started acting like a prick...b/c fire with fire and all that crap.

The exclusives on PC are like a shitload of indie games, with 90% of them being crap.

As a gamer you SHOULD care. B/c if companies know that a platform is more profitable...then they are more likely to make their game for that platform. As it is...MOBAs, F2P games and other non-standard games are the only things that sell well on PC....so practically EVERYTHING ELSE worth a damn comes to consoles.

Better exclusives goes back to what I said earlier....there's shit for exclusives aside from indie games on PC...so yeah...like I said AAA games like the ones I mentioned....not "Harry's Weird Adventure: Episode 1" which is some sort of platformer with a minor twist that PC gamers herald as the revolutionary second coming.

Again....anyone with half a brain knows the bullshit argument you put up with "WAAAAAH....you can't include consoles in general....b/c PC will get beat up in that argument!!!" is nothing but spin.

LOL

I enjoy playing videogames, but trying to act like it's a way of life or something is ridiculous.

Why the hell should I care about how much money these companies like Microsoft, Sony and Nintendo make? None of it is going into my pocket, so why do I give a shit?

And a shitload of the indie games that you're mentioning are free. So if I download one and it sucks...oh well. I didn't spend so much as a penny on it. No sweat.

If you're totally fine with playing the umpteenth installment of Halo, Gears of War, Mario, Zelda, etc....then suit yourself. But there's no question that PC has the most variety in genres of games. Meanwhile consoles mainly appeal to the mindless FPS and platformer demographic.

#552 Edited by 04dcarraher (19264 posts) -

@Cranler said:

@RyviusARC said:

@Cranler said:

@04dcarraher said:

@Cranler said:

@RyviusARC said:

@Cranler said:
@Dasein808 said:
@RyviusARC said:

Ummmm why is the directx 10 version null?

Because Cranler.

Hmmm... Is there something in this setup that could prevent one from using the dx10 mode?

  • Processor: Intel E6700 Core 2 Duo 266x10
  • Motherboard: ECS P35T-A
  • Memory: 2 x 1GB Mushkin XP9200 400FSB 5-4-4-12
  • Video Card(s): Gigabyte GeForce 8600 GT( Silent pipe II Technology Cooled)
  • Power Supply: Ultra X3 Modular Power supply
  • Hard Drive: 2 x WD 250GB 16MB cache SATA
  • Opticals: BenQ DW-1655 Lightscribe DVD-R, Sony DVD-ROM
  • O/S: Windows XP Service Pack 2
  • Comparison Card 1: XFX 8600GTS XXX
  • Comparison Card 2: Sapphire X1950 PRO Ultimate
  • Comparison Card 3: XFX 8600GT Fatal1ty
@clyde46 said:

Once again, why are we arguing over specs from 2005?

Because Dasein is reality reluctant.

Well I guess he didn't use dx10 because of Windows XP.

But it was still max dx9 settings, which I believe was still a tad above the 360 settings, as the dx10 switch was more for performance than visuals in Bioshock.

Again: The PC graphics at high quality are comparable to the Xbox 360's, but the PC still offers higher resolutions and extreme anti-aliasing support.

High is max. So pc has higher res and aa to take it above the 360 but the 8600 can't handle aa or higher res.

lol you missed "comparable", which tells you it's using a slew of settings from low to high; it's not running full high nor medium, etc.

If the 360 wasn't full high then they would surely have said that the pc has a slight advantage. You're reading way too much into their use of the term comparable.

Either way the weakest gpu that could beat the 360 in Bioshock and most other multiplats is the 8800.

the 8800 was around 3x the power of the 360 so it was quite a bit stronger.

The GPU that is comparable to the 360 is in between an 8600gt and 8600gts.

Skyrim runs better on 360 than on a pc with an 8600 gts. It gets 30 fps at lower settings than the 360 version.

https://www.youtube.com/watch?v=r58nEXV5EV0

lol look at the cpu used in that video.... an E2200 at 2.2 ghz barely breaks past the min requirements. Another video shows an E6750 at 2.6ghz getting an average of 30 fps at 1400x900 with no AA, high textures and medium shadows with an 8600GTS. Then another video even shows an 8600GT 512mb with a quad core being able to do 35-40 fps average with mostly high settings and low shadows at 720.

#553 Edited by Cranler (8730 posts) -

@04dcarraher said:

@Cranler said:

@RyviusARC said:

@Cranler said:

@04dcarraher said:

@Cranler said:

@RyviusARC said:

@Cranler said:
@Dasein808 said:
@RyviusARC said:

Ummmm why is the directx 10 version null?

Because Cranler.

Hmmm... Is there something in this setup that could prevent one from using the dx10 mode?

  • Processor: Intel E6700 Core 2 Duo 266x10
  • Motherboard: ECS P35T-A
  • Memory: 2 x 1GB Mushkin XP9200 400FSB 5-4-4-12
  • Video Card(s): Gigabyte GeForce 8600 GT( Silent pipe II Technology Cooled)
  • Power Supply: Ultra X3 Modular Power supply
  • Hard Drive: 2 x WD 250GB 16MB cache SATA
  • Opticals: BenQ DW-1655 Lightscribe DVD-R, Sony DVD-ROM
  • O/S: Windows XP Service Pack 2
  • Comparison Card 1: XFX 8600GTS XXX
  • Comparison Card 2: Sapphire X1950 PRO Ultimate
  • Comparison Card 3: XFX 8600GT Fatal1ty
@clyde46 said:

Once again, why are we arguing over specs from 2005?

Because Dasein is reality reluctant.

Well I guess he didn't use dx10 because of Windows XP.

But it was still max dx9 settings, which I believe was still a tad above the 360 settings, as the dx10 switch was more for performance than visuals in Bioshock.

Again: The PC graphics at high quality are comparable to the Xbox 360's, but the PC still offers higher resolutions and extreme anti-aliasing support.

High is max. So pc has higher res and aa to take it above the 360 but the 8600 can't handle aa or higher res.

lol you missed "comparable", which tells you it's using a slew of settings from low to high; it's not running full high nor medium, etc.

If the 360 wasn't full high then they would surely have said that the pc has a slight advantage. You're reading way too much into their use of the term comparable.

Either way the weakest gpu that could beat the 360 in Bioshock and most other multiplats is the 8800.

the 8800 was around 3x the power of the 360 so it was quite a bit stronger.

The GPU that is comparable to the 360 is in between an 8600gt and 8600gts.

Skyrim runs better on 360 than on a pc with an 8600 gts. It gets 30 fps at lower settings than the 360 version.

https://www.youtube.com/watch?v=r58nEXV5EV0

lol look at the cpu used in that video.... an E2200 at 2.2 ghz barely breaks past the min requirements. Another video shows an E6750 at 2.6ghz getting an average of 30 fps at 1400x900 with no AA, high textures and medium shadows with an 8600GTS. Then another video even shows an 8600GT 512mb with a quad core being able to do 35-40 fps average with mostly high settings and low shadows at 720.

Yet that cpu is probably better than the cpu's that cost $900 when the 360 launched.

That 1400x900 is using a Core 2 Duo, which is much better than any cpu at the 360's launch, and the whole video is shot in one of the least demanding areas of the game.

He also says he gets freezes that last for a few seconds and fps drops into the teens.

#554 Edited by 04dcarraher (19264 posts) -

@Cranler said:

lol look at the cpu used in that video.... an E2200 at 2.2 ghz barely breaks past the min requirements. Another video shows an E6750 at 2.6ghz getting an average of 30 fps at 1400x900 with no AA, high textures and medium shadows with an 8600GTS. Then another video even shows an 8600GT 512mb with a quad core being able to do 35-40 fps average with mostly high settings and low shadows at 720.

Yet that cpu is probably better than the cpu's that cost $900 when the 360 launched.

Moving the goal posts again? Skyrim does not have the best coding for pc and originally only used two cores until a patch a few months after release. Then memory allocation was poor, like only allowing two 256mb buffers, a primary and a 2nd, for cells, aka the area you're in and the one you're going to. Modders had to patch and fix many issues with the game. Put it this way.... an i7 2600k at 3.4 ghz, with Skyrim only using two threads, only allows a 70 fps average with a GTX 580 at max settings at 1680x1050. Speaks volumes about how piss poor Skyrim's coding is.

But back to the 8600GTS: it was a gpu comparable to the 360, and if you had the 512mb version, higher quality settings are possible that aren't on the 360.

Clock for clock, yes, that E2200 is better than the cpus of 2005; however, that cpu is a stripped-down C2D that released in Q4 2007, meaning an AMD Athlon X2 at 3ghz from 2005/2006 can out-process it.

A pc needed to match or surpass the 360 in 2005 was $1.5k+ because of the future-standard architecture designed into it. Then in 2006, when new hardware standards came out along with cheaper prices, the 360 was left in the dust.

#555 Edited by Mr_Huggles_dog (674 posts) -

@Bebi_vegeta said:

@mr_huggles_dog said:

Dude....stop making up lies.

I don't know where you're getting that I'm encouraging exclusives....probably just trying to get good with ppl who are light hearted about gaming and who aren't into the system wars thing and will be like "yeah...he's a good guy....he doesn't like exclusives....I agree with him".

Again...stop making up lies....I never said F2P games were bad.....again....you're making things up and putting words in my mouth.

I'll buy you the next AAA game that comes out that you want if you can point out where I said F2P games are bad.

Lastly...don't start acting like I'm crazy or something....you're the one that started whining about being insulted and getting your feelings hurt. Like I said....all I did was give my viewpoint and then....well, I'm not repeating myself....anyone following this conversation can either read the last couple of pages or reread my last couple of posts.

You started at me....not the other way around.

Don't pick a fight if you can't deal with ppl treating you like you treat them....like a 10 year old does.

You're obviously encouraging exclusives since they matter so much to you.

And seriously, what are you going on about with: probably just trying to get good with ppl who are light hearted about gaming and who aren't into the system wars thing and will be like "yeah...he's a good guy....he doesn't like exclusives....I agree with him"? It almost sounds like you're making up a story in your head about me saying those things. HAHAH!

Sorry, I was under the impression you were mocking F2P/MOBA games.

Oh the "you started it" quote with the "10 year old" quote... yeah goes really well together. LOL!!

Lol....you're such a tool.

YOU are the one that whined about me insulting you first. I can't help that you're just spinning and lying left and right and then throwing the "NUH UH....YOU DID IT!!!" crap at me and then calling me a child....it's the truth....what do you want me to say to your crying?

You THOUGHT I was mocking F2P games and MOBAs.....you were wrong. Yet, you're still intent that I'm all for exclusives. So far...you haven't been right about anything....why would this be any different.

@Zelda187: OMG...why don't you exaggerate a little more. If you don't understand why, as a gamer and a fan of any system, you would want said system to do well....then maybe you shouldn't open your mouth and just let the grown ups talk.

It has nothing to do with gaming being a way of life.

You say you just enjoy playing games and that this isn't a "way of life" for you....yet you put down playing games that are generally fun.

I'm sorry you're jaded and the only fun experience you can get is out of some shitty half-assed indie game where a bug that does something weird is the highlight of your day.....but for me....I don't mind playing another Uncharted or whatever. B/c I enjoy games.

I would rather play another Halo game than play Goat Simulator. I bought that for my step son and he played it for 2 days.....and OMG did he have fun with it....for 2 days....but those crap indy games only go so far.

Yeah it's cool and hip to play them and be a part of the cool PC clique for a day....but then you realize you wasted that money after it bores you to death with nothing to do on the 2nd day.

#556 Posted by RoboCopISJesus (1408 posts) -

The exclusives on PC are like a shitload of indie games, with 90% of them being crap.

Why would shit games score 8/10 and 9/10's? We aren't talking about low scoring games....

#557 Edited by Mr_Huggles_dog (674 posts) -

@RoboCopISJesus said:

@mr_huggles_dog said:

The exclusives on PC are like a shitload of indie games, with 90% of them being crap.

Why would shit games score 8/10 and 9/10's? We aren't talking about low scoring games....

Why would boring games for the mindless masses, which you guys say are the only thing on consoles, score 8/10....and 9/10?

So your point is moot.

Plus...I just realized Zelda is your handle...Zelda187.....yet you're all on Zelda and how it's a mindless game and just another in the series for the umpteenth time....maybe you like those games, but you just want to argue for the sake of PC gaming.

I smell something foul....and it's not my bref.

#558 Edited by RoboCopISJesus (1408 posts) -

@mr_huggles_dog said:

@RoboCopISJesus said:

@mr_huggles_dog said:

The exclusives on PC are like a shitload of indie games, with 90% of them being crap.

Why would shit games score 8/10 and 9/10's? We aren't talking about low scoring games....

Why would boring games for the mindless masses, which you guys say are the only thing on consoles, score 8/10....and 9/10?

So your point is moot.

Plus...I just realized Zelda is your handle...Zelda187.....

I never said that consoles only had mindless-masses games....I just said indie games are high-scoring and great, not just "crap" as you stated. They can easily be better than "big budget" games. Even better gfx sometimes (Star Citizen).

...and I'm not named Zelda.

Lay off whatever you are smoking...

#559 Edited by Puckhog04 (22606 posts) -

Yet, I still don't care for PC gaming, and never will. This topic is further proving that indie companies start on PC (hence all the indie jokes regarding PC)...but then they end up on Handhelds or Consoles.

#560 Edited by Mr_Huggles_dog (674 posts) -

@mr_huggles_dog said:

Why would boring games for the mindless masses, which you guys say are the only thing on consoles, score 8/10....and 9/10?

So your point is moot.

Plus...I just realized Zelda is your handle...Zelda187.....

I never said that consoles only had mindless-masses games....I just said indie games are high-scoring and great, not just "crap" as you stated. They can easily be better than "big budget" games. Even better gfx sometimes (Star Citizen).

...and I'm not named Zelda.

Lay off whatever you are smoking...

Well...way to avoid the point of my reply and focus on something not directed at you.

I only have so many replies being new here so I'm trying to consolidate them....Zelda is another person....you're not the only one in my conversation universe there slick.

#561 Edited by RoboCopISJesus (1408 posts) -

@Puckhog04 said:

Yet, I still don't care for PC gaming, and never will. This topic is further proving that indie companies start on PC (hence all the indie jokes regarding PC)...but then they end up on Handhelds or Consoles.

If you take out indies, I think pc still has more high-scoring games than each console. If you take out indies from the ps4's library, you end up with the smallest library this gen.

Think about that and then look at what you wrote^, hint: consoles get plenty of indies too. I'd say the big budget/indie ratio is the same, PC gets more of both.

PC gets most multiplats, so I'm not worried about the devs "ending up" on consoles as long as they continue to make the better version on PC.

#562 Posted by Cranler (8730 posts) -

@Cranler said:

lol look at the cpu used in that video.... an E2200 at 2.2 ghz barely breaks past the min requirements. Another video shows an E6750 at 2.6ghz getting an average of 30 fps at 1400x900 with no AA, high textures and medium shadows with an 8600GTS. Then another video even shows an 8600GT 512mb with a quad core being able to do 35-40 fps average with mostly high settings and low shadows at 720.

Yet that cpu is probably better than the cpu's that cost $900 when the 360 launched.

Moving the goal posts again? Skyrim does not have the best coding for pc and originally only used two cores until a patch a few months after release. Then memory allocation was poor, like only allowing two 256mb buffers, a primary and a 2nd, for cells, aka the area you're in and the one you're going to. Modders had to patch and fix many issues with the game. Put it this way.... an i7 2600k at 3.4 ghz, with Skyrim only using two threads, only allows a 70 fps average with a GTX 580 at max settings at 1680x1050. Speaks volumes about how piss poor Skyrim's coding is.

But back to the 8600GTS: it was a gpu comparable to the 360, and if you had the 512mb version, higher quality settings are possible that aren't on the 360.

Clock for clock, yes, that E2200 is better than the cpus of 2005; however, that cpu is a stripped-down C2D that released in Q4 2007, meaning an AMD Athlon X2 at 3ghz from 2005/2006 can out-process it.

A pc needed to match or surpass the 360 in 2005 was $1.5k+ because of the future-standard architecture designed into it. Then in 2006, when new hardware standards came out along with cheaper prices, the 360 was left in the dust.

How am I moving the goalposts? The whole debate was about what kind of pc it would take to match the 360 at launch. I allowed the naysayers to move the goalpost by using hardware that came after the 360 launch because there really wasn't a pc available in 2005 that could match the 360.

The first Athlon X2 3ghz didn't even come out til 2008.

So this piss-poor coding doesn't extend to the 360? Are you claiming that Bethesda fully optimized the game for 360?

#563 Edited by 04dcarraher (19264 posts) -

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the "Windsor" Athlon X2 chips, up to 3.2 ghz, came out in late 2006 to early 2007 depending on location. You must be thinking of the Kuma or Regor versions of the Athlon X2 that were released in 2008/2009.

This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs. Until the patch to support four threads came out, most cpus older and/or slower than 2nd gen Intel i-cores couldn't reach 60 fps even with medium to high end gpus.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

#564 Edited by Dasein808 (335 posts) -

@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the Windsor series of Athlon X2 came out on May 26, 2006 and went up to 3.2 ghz. This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

I would not bother with him.

Cranler is a troll who will continue to move the goal posts because he knows that he's wrong.

Any benchmarks that he provides will always be using GPUs/CPUs that barely exceed the developer's minimum recommendations instead of supplying benchmarks that actually use the developers' recommended GPU/CPU for the most current gaming experience of a given time because he likes to pretend that PCs are bound by the technological restrictions of console generations.

He will also never shut up or admit that his evidence is distorted garbage.

#565 Edited by Cranler (8730 posts) -
@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the "Windsor" Athlon X2 chips, up to 3.2 ghz, came out in late 2006 to early 2007 depending on location. You must be thinking of the Kuma or Regor versions of the Athlon X2 that were released in 2008/2009.

This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs. Until the patch to support four threads came out, most cpus older and/or slower than 2nd gen Intel i-cores couldn't reach 60 fps even with medium to high end gpus.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

You mean this video where he runs the game at lowest settings at 800x600? See 2:01

Hope he's using a crt because 800x600 will look absolutely awful on an lcd.

#566 Posted by Cranler (8730 posts) -

@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the Windsor series of Athlon X2 came out on May 26, 2006 and went up to 3.2 ghz. This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

I would not bother with him.

Cranler is a troll who will continue to move the goal posts because he knows that he's wrong.

Any benchmarks that he provides will always be using GPUs/CPUs that barely exceed the developer's minimum recommendations instead of supplying benchmarks that actually use the developers' recommended GPU/CPU for the most current gaming experience of a given time because he likes to pretend that PCs are bound by the technological restrictions of console generations.

He will also never shut up or admit that his evidence is distorted garbage.

I move the goal posts to at least give the pc a little bit of a chance at matching the 360.

It's not my fault that the 360 can run games that $2000 pc's from 2005 can't.

#567 Posted by RyviusARC (4309 posts) -

@Cranler said:

@RyviusARC said:

@Cranler said:

@04dcarraher said:

@Cranler said:

@RyviusARC said:

@Cranler said:
@Dasein808 said:
@RyviusARC said:

Ummmm why is the directx 10 version null?

Because Cranler.

Hmmm... Is there something in this setup that could prevent one from using the dx10 mode?

  • Processor: Intel E6700 Core 2 Duo 266x10
  • Motherboard: ECS P35T-A
  • Memory: 2 x 1GB Mushkin XP9200 400FSB 5-4-4-12
  • Video Card(s): Gigabyte GeForce 8600 GT( Silent pipe II Technology Cooled)
  • Power Supply: Ultra X3 Modular Power supply
  • Hard Drive: 2 x WD 250GB 16MB cache SATA
  • Opticals: BenQ DW-1655 Lightscribe DVD-R, Sony DVD-ROM
  • O/S: Windows XP Service Pack 2
  • Comparison Card 1: XFX 8600GTS XXX
  • Comparison Card 2: Sapphire X1950 PRO Ultimate
  • Comparison Card 3: XFX 8600GT Fatal1ty
@clyde46 said:

Once again, why are we arguing over specs from 2005?

Because Dasein is reality reluctant.

Well I guess he didn't use dx10 because of Windows XP.

But it was still max dx9 settings, which I believe was still a tad above the 360 settings, as the dx10 switch was more for performance than visuals in Bioshock.

Again: The PC graphics at high quality are comparable to the Xbox 360's, but the PC still offers higher resolutions and extreme anti-aliasing support.

High is max. So pc has higher res and aa to take it above the 360 but the 8600 can't handle aa or higher res.

lol you missed "comparable", which tells you it's using a slew of settings from low to high; it's not running full high nor medium, etc.

If the 360 wasn't full high then they would surely have said that the pc has a slight advantage. You're reading way too much into their use of the term comparable.

Either way the weakest gpu that could beat the 360 in Bioshock and most other multiplats is the 8800.

the 8800 was around 3x the power of the 360 so it was quite a bit stronger.

The GPU that is comparable to the 360 is in between an 8600gt and 8600gts.

Skyrim runs better on 360 than on a pc with an 8600 gts. It gets 30 fps at lower settings than the 360 version.

https://www.youtube.com/watch?v=r58nEXV5EV0

Like 04dcarraher said that CPU is weak and Skyrim was very badly optimized for cpus.

I could point out many other games where the 8600gts matches or surpasses the Xbox 360.

#568 Edited by Cranler (8730 posts) -

@Cranler said:

@RyviusARC said:

@Cranler said:

@04dcarraher said:

@Cranler said:

@RyviusARC said:

@Cranler said:
@Dasein808 said:
@RyviusARC said:

Ummmm why is the directx 10 version null?

Because Cranler.

Hmmm... Is there something in this setup that could prevent one from using the dx10 mode?

  • Processor: Intel E6700 Core 2 Duo 266x10
  • Motherboard: ECS P35T-A
  • Memory: 2 x 1GB Mushkin XP9200 400FSB 5-4-4-12
  • Video Card(s): Gigabyte GeForce 8600 GT( Silent pipe II Technology Cooled)
  • Power Supply: Ultra X3 Modular Power supply
  • Hard Drive: 2 x WD 250GB 16MB cache SATA
  • Opticals: BenQ DW-1655 Lightscribe DVD-R, Sony DVD-ROM
  • O/S: Windows XP Service Pack 2
  • Comparison Card 1: XFX 8600GTS XXX
  • Comparison Card 2: Sapphire X1950 PRO Ultimate
  • Comparison Card 3: XFX 8600GT Fatal1ty
@clyde46 said:

Once again, why are we arguing over specs from 2005?

Because Dasein is reality reluctant.

Well I guess he didn't use dx10 because of Windows XP.

But it was still max dx9 settings, which I believe was still a tad above the 360 settings, as the dx10 switch was more for performance than visuals in Bioshock.

Again: The PC graphics at high quality are comparable to the Xbox 360's, but the PC still offers higher resolutions and extreme anti-aliasing support.

High is max. So pc has higher res and aa to take it above the 360 but the 8600 can't handle aa or higher res.

lol you missed "comparable", which tells you it's using a slew of settings from low to high; it's not running full high nor medium, etc.

If the 360 wasn't full high then they would surely have said that the pc has a slight advantage. You're reading way too much into their use of the term comparable.

Either way the weakest gpu that could beat the 360 in Bioshock and most other multiplats is the 8800.

the 8800 was around 3x the power of the 360 so it was quite a bit stronger.

The GPU that is comparable to the 360 is in between an 8600gt and 8600gts.

Skyrim runs better on 360 than on a pc with an 8600 gts. It gets 30 fps at lower settings than the 360 version.

https://www.youtube.com/watch?v=r58nEXV5EV0

Like 04dcarraher said that CPU is weak and Skyrim was very badly optimized for cpus.

I could point out many other games where the 8600gts matches or surpasses the Xbox 360.

What games? Now that the goal posts have been moved in favor of my opponents.

#569 Edited by 04dcarraher (19264 posts) -
@Cranler said:
@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the "Windsor" Athlon X2 chips, up to 3.2 ghz, came out in late 2006 to early 2007 depending on location. You must be thinking of the Kuma or Regor versions of the Athlon X2 that were released in 2008/2009.

This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs. Until the patch to support four threads came out, most cpus older and/or slower than 2nd gen Intel i-cores couldn't reach 60 fps even with medium to high end gpus.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

You mean this video where he runs the game at lowest settings at 800x600? See 2:01

Hope he's using a crt because 800x600 will look absolutely awful on an lcd.

There is no reason why that guy should be running at 800x600.

You're just being a troll though..... The 8800GT is able to get an 80 fps average maxed out at 720, so at 720 an 8600GTS should be able to do 35-40 average. Now there's a video of someone playing the game at 1440x900 with ultra settings with a 2.2 ghz intel cpu on an 8600GT, the DDR2 version, not even the GDDR3 version, and in many cases it gets close to 30 fps.
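For what it's worth, here's a minimal sketch of the back-of-the-envelope scaling behind that 35-40 estimate, assuming fps scales roughly linearly with GPU throughput and that an 8600GTS delivers somewhere around 45% of an 8800GT's throughput (an illustrative assumption, not a benchmarked figure):

```python
# Rough sketch of the scaling estimate above. The relative-throughput
# ratio is an assumed illustrative figure, not a measured benchmark.

def scale_fps(reference_fps: float, relative_throughput: float) -> float:
    """Estimate fps on a slower card from a faster card's result,
    assuming performance scales roughly linearly with GPU throughput."""
    return reference_fps * relative_throughput

fps_8800gt_720p = 80.0   # claimed 8800GT average, maxed out at 720
assumed_ratio = 0.45     # assumption: 8600GTS at ~45% of an 8800GT

print(round(scale_fps(fps_8800gt_720p, assumed_ratio)))  # -> 36, i.e. mid-30s
```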

#570 Edited by Bebi_vegeta (13558 posts) -

@mr_huggles_dog said:

@Bebi_vegeta said:

@mr_huggles_dog said:

Dude....stop making up lies.

I don't know where you're getting that I'm encouraging exclusives....probably just trying to get good with ppl who are light hearted about gaming and who aren't into the system wars thing and will be like "yeah...he's a good guy....he doesn't like exclusives....I agree with him".

Again...stop making up lies....I never said F2P games were bad.....again....you're making things up and putting words in my mouth.

I'll buy you the next AAA game that comes out that you want if you can point out where I said F2P games are bad.

Lastly...don't start acting like I'm crazy or something....you're the one that started whining about being insulted and getting your feelings hurt. Like I said....all I did was give my viewpoint and then....well, I'm not repeating myself....anyone following this conversation can either read the last couple of pages or reread my last couple of posts.

You started at me....not the other way around.

Don't pick a fight if you can't deal with ppl treating you like you treat them....like a 10 year old does.

You're obviously encouraging exclusives since they matter so much to you.

And seriously, what are you going on about with: probably just trying to get good with ppl who are light hearted about gaming and who aren't into the system wars thing and will be like "yeah...he's a good guy....he doesn't like exclusives....I agree with him"? It almost sounds like you're making up a story in your head about me saying those things. HAHAH!

Sorry, I was under the impression you were mocking F2P/MOBA games.

Oh the "you started it" quote with the "10 year old" quote... yeah goes really well together. LOL!!

Lol....you're such a tool.

YOU are the one that whined about me insulting you first. I can't help that you're just spinning and lying left and right and then throwing the "NUH UH....YOU DID IT!!!" crap at me and then calling me a child....it's the truth....what do you want me to say to your crying?

You THOUGHT I was mocking F2P games and MOBAs.....you were wrong. Yet, you're still intent that I'm all for exclusives. So far...you haven't been right about anything....why would this be any different.

Now I'm whining and crying? Is this another one of your sick stories that you're running in your head?

At the beginning you told me I was 12, and then you're telling me I'm 10... I guess next post I'll be 8.

You're obviously all about exclusives, that's the only thing you've been talking about.

You know, I'm really starting to think that you're actually crazy.

#571 Posted by Cranler (8730 posts) -

@Cranler said:
@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the "Windsor" Athlon X2 chips, up to 3.2 ghz, came out in late 2006 to early 2007 depending on location. You must be thinking of the Kuma or Regor versions of the Athlon X2 that were released in 2008/2009.

This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs. Until the patch to support four threads came out, most cpus older and/or slower than 2nd gen Intel i-cores couldn't reach 60 fps even with medium to high end gpus.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

You mean this video where he runs the game at lowest settings at 800x600? See 2:01

Hope he's using a crt because 800x600 will look absolutely awful on an lcd.

There is no reason why that guy should be running at 800x600.

You're just being a troll though..... The 8800GT is able to get an 80 fps average maxed out at 720, so at 720 an 8600GTS should be able to do 35-40 average. Now there's a video of someone playing the game at 1440x900 with ultra settings with a 2.2 ghz intel cpu on an 8600GT, the DDR2 version, not even the GDDR3 version, and in many cases it gets close to 30 fps.

How am I trolling? You're the one who brought up that 800x600 Bioshock video, not me.

Now you bring up another video where the only footage is just walking around, no fighting, and it's still sub-30 fps most of the time.

#572 Posted by WallofTruth (1537 posts) -

I have found myself playing console gaming exclusively and before 2011 when I got an Xbox 360, I was an exclusive PC gamer. Used to do LAN parties and all that and hauling a burly desktop with monitor and all related stuff was a pain in the butt...then had to deal with other folk who had computer issues as not everyone at the LAN party is a PC specialist....so you know you become the PC technician for their system so you can get all up and running. Then not everyone in the LAN party would have a decent rig to game on...had guys show up with Walmart entry level Celeron's trying to play BF2142 and you can tell they were not having a good time as they had to play it at low settings and even then....not the best.

So with that said, the downside to PC gaming is the same as its upside. If you want to play the latest games at a decent speed and resolution, you have to invest $$ in a good system with good components, and every 2-3 years you are in constant upgrading mode as the technology in hardware and games improves. In the 10 years I was PC gaming, I went through over 8 computers between upgrading this and that...I had enough spare parts laying around to put together complete systems.

My brother and cousin got Xbox 360s a year or so before I bought mine and it took a lot of them harping at me to get me to switch over. I eventually got one and I enjoyed the experience...a lot easier to pack a 360 to LAN parties than it was a complete computer system...and the best part is everyone had the same hardware and you didn't have to play PC tech anymore - just plug it up, connect to network, and game away. Took a while to get used to the controller - was real frustrating at first but eventually got used to it and now I can fully enjoy the experience.

Although Microsoft isn't my favorite company - they are notorious for "fixing" things that don't need fixed and not fixing what does need fixed. They constantly think they have to jack around with GUI's and remove features with new consoles that were awesome. For example, the Xbox 360 could be configured as a Windows Media Extender so I could watch movies I had on my PC on the console...on the Xbox One - no deuce... they removed it after they claimed the Xbox One would be the replacement for all entertainment devices....it couldn't even replace the Xbox 360 - because I have to use that to enjoy the Xbox 360 games and watch movies that I have on my PC.

So consoles have demons of their own. But in general, I do enjoy console gaming.

Yeah no, you don't need to upgrade every 2-3 years.

#573 Posted by WallofTruth (1537 posts) -

@Bebi_vegeta said:

@mr_huggles_dog said:

Wrong...I stated my viewpoint and then you and the lackeys started assaulting my integrity saying in nerd talk "Oh...it's just a self preference b.s. opinion". That's when I started acting like a prick...b/c fire with fire and all that crap.

The exclusives on PC are like a shitload of indie games, with 90% of them being crap.

As a gamer you SHOULD care. B/c if companies know that a platform is more profitable...then they are more likely to make their game for that platform. As it is...MOBAs, F2P games and other non-standard games are the only things that sell well on PC....so practically EVERYTHING ELSE worth a damn comes to consoles.

Better exclusives goes back to what I said earlier....there's shit for exclusives aside from indie games on PC...so yeah...like I said AAA games like the ones I mentioned....not "Harry's Weird Adventure: Episode 1" which is some sort of platformer with a minor twist that PC gamers herald as the revolutionary second coming.

Again....anyone with half a brain knows the bullshit argument you put up with "WAAAAAH....you can't include consoles in general....b/c PC will get beat up in that argument!!!" is nothing but spin.

Man, what nonsense are you talking about? Because fire with fire and that crap? Hahaha, what?

You just made more stats up, keep em coming.

As a gamer, the only thing I care about is games that I can enjoy. If that said game is good, it will sell itself.

And seriously, I don't see the issue with F2P games unless you hate gaming.

There's plenty of variety in games on PC. Like I said, there's a shitload of games on PC; there's plenty of AAA and AA exclusives and multiplats on PC.

If you were smart enough, you wouldn't encourage games being exclusive. You wouldn't need to buy 2-3 consoles every gen.

Better exclusive is subjective, you can't argue that.

It's funny how you said I was 12... you can't be that far off.


I'll buy you the next AAA game that comes out that you want if you can point out where I said F2P games are bad.

"MOBAs, F2P games and other non standard games are the only things that sell well on PC....so practically EVERYTHING ELSE worth a dam comes to consoles."

If MOBA's and F2P games are not "worth a damn" then you're obviously saying they're bad.

So, I'd like to get Grand Theft Auto V Steam pre-order from you.

#574 Posted by Evo_nine (1647 posts) -

@lawson_raider said:

I have found myself playing console gaming exclusively and before 2011 when I got an Xbox 360, I was an exclusive PC gamer. Used to do LAN parties and all that and hauling a burly desktop with monitor and all related stuff was a pain in the butt...then had to deal with other folk who had computer issues as not everyone at the LAN party is a PC specialist....so you know you become the PC technician for their system so you can get all up and running. Then not everyone in the LAN party would have a decent rig to game on...had guys show up with Walmart entry level Celeron's trying to play BF2142 and you can tell they were not having a good time as they had to play it at low settings and even then....not the best.

So with that said, the downside to PC gaming is the same as its upside. If you want to play the latest games at a decent speed and resolution, you have to invest $$ in a good system with good components, and every 2-3 years you are in constant upgrading mode as the technology in hardware and games improves. In the 10 years I was PC gaming, I went through over 8 computers between upgrading this and that...I had enough spare parts laying around to put together complete systems.

My brother and cousin got Xbox 360s a year or so before I bought mine and it took a lot of them harping at me to get me to switch over. I eventually got one and I enjoyed the experience...a lot easier to pack a 360 to LAN parties than it was a complete computer system...and the best part is everyone had the same hardware and you didn't have to play PC tech anymore - just plug it up, connect to network, and game away. Took a while to get used to the controller - was real frustrating at first but eventually got used to it and now I can fully enjoy the experience.

Although Microsoft isn't my favorite company - they are notorious for "fixing" things that don't need fixed and not fixing what does need fixed. They constantly think they have to jack around with GUI's and remove features with new consoles that were awesome. For example, the Xbox 360 could be configured as a Windows Media Extender so I could watch movies I had on my PC on the console...on the Xbox One - no deuce... they removed it after they claimed the Xbox One would be the replacement for all entertainment devices....it couldn't even replace the Xbox 360 - because I have to use that to enjoy the Xbox 360 games and watch movies that I have on my PC.

So consoles have demons of their own. But in general, I do enjoy console gaming.

Yeah no, you don't need to upgrade every 2-3 years.

Especially this generation with the consoles about as powerful as a nice tablet.

#576 Edited by Mr_Huggles_dog (674 posts) -

@mr_huggles_dog said:

Lol....you're such a tool.

YOU are the one that whined about me insulting you first. I can't help that you're just spinning and lying left and right and then throwing the "NUH UH....YOU DID IT!!!" crap at me and then calling me a child....it's the truth....what do you want me to say to your crying?

You THOUGHT I was mocking F2P games and MOBAs.....you were wrong. Yet, you're still intent that I'm all for exclusives. So far...you haven't been right about anything....why would this be any different.

Now I'm whining and crying? Is this another one of your sick stories that you're running in your head?

At the beginning you told me I was 12, and then you're telling me I'm 10... I guess next post I'll be 8.

You're obviously all about exclusives, that's the only thing you've been talking about.

You know, I'm really starting to think that you're actually crazy.

Ok dude....go on wit yo bad self.

@mr_huggles_dog said:


I'll buy you the next AAA game that comes out that you want if you can point out where I said F2P games are bad.

"MOBAs, F2P games and other non standard games are the only things that sell well on PC....so practically EVERYTHING ELSE worth a dam comes to consoles."

If MOBA's and F2P games are not "worth a damn" then you're obviously saying they're bad.

So, I'd like to get Grand Theft Auto V Steam pre-order from you.

I swear....it's like some of you ppl didn't pass 3rd grade.

If I said "anything else worth a dam"....something has to be worth a dam first for there to be anything else...worth a dam.

So, I dunno.....I don't know if I'm being trolled but you never know....there are ppl actually this stupid out there....and I'd like to think I can fix stupid.

#577 Edited by 04dcarraher (19264 posts) -

@Cranler said:

@04dcarraher said:
@Cranler said:
@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the "Windsor" Athlon X2 chips, up to 3.2 ghz, came out in late 2006 to early 2007 depending on location. You must be thinking of the Kuma or Regor versions of the Athlon X2 that were released in 2008/2009.

This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs. Until the patch to support four threads came out, most cpus older and/or slower than 2nd gen Intel i-cores couldn't reach 60 fps even with medium to high end gpus.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

You mean this video where he runs the game at lowest settings at 800x600? See 2:01

Hope he's using a crt because 800x600 will look absolutely awful on an lcd.

There is no reason why that guy should be running at 800x600.

You're just being a troll though..... The 8800GT is able to get an 80 fps average maxed out at 720, so at 720 an 8600GTS should be able to do 35-40 average. Now there's a video of someone playing the game at 1440x900 with ultra settings with a 2.2 ghz intel cpu on an 8600GT, the DDR2 version, not even the GDDR3 version, and in many cases it gets close to 30 fps.

How am I trolling? You're the one who brought up that 800x600 Bioshock video, not me.

Now you bring up another video where the only footage is just walking around, no fighting, and it's still sub-30 fps most of the time.

lol, no, you're not trolling? And yes there was fighting, and what's funny is that a 2.2 ghz cpu and an 8600GT with DDR2 were almost playable with ultra at 1440x900. Now imagine a C2D at 3 ghz or any modern cpu and a normal 8600GTS with GDDR3; it would handle the game just as well as the 360, if not better. Problem is that we will never see true updated benchmarks with gpus that are nearly 7 years old and were mid-range. Fact is that during their brief usage they did in many cases provide better results in multiplats like COD4, UT3 and the 1st Bioshock.

The 8600GT gets 36fps maxed out at 1280x1024, which is more than double the resolution of 1024x600, so it's safe to assume that the 8600GT would get around 80fps maxed out in COD4 at the same resolution. Then in UT3 it gets 39fps maxed out at 1600x1200. It is safe to say that the 8600GT would be able to do 60 fps at 720p. And then the 8600GT was able to run Bioshock at a 40 fps average at 720p. So I would say as long as you have the cpu performance, the 8600GTS would be able to perform as well as, if not better than, the 360's settings.
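A minimal sketch of the pixel-count scaling used in that COD4 estimate, assuming the game is purely GPU/fill-rate bound so fps scales inversely with resolution (a simplification, not how every game behaves):

```python
# Inverse scaling of fps with pixel count, as assumed in the estimate above.

def pixels(res: str) -> int:
    w, h = res.split("x")
    return int(w) * int(h)

def scaled_fps(measured_fps: float, measured_res: str, target_res: str) -> float:
    # Assumes fps is inversely proportional to the number of pixels drawn.
    return measured_fps * pixels(measured_res) / pixels(target_res)

# 36 fps maxed out at 1280x1024, rescaled to the 360's 1024x600:
print(round(scaled_fps(36, "1280x1024", "1024x600")))  # -> 77, roughly "around 80"
```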

#578 Edited by Bebi_vegeta (13558 posts) -

@mr_huggles_dog said:

@Bebi_vegeta said:

@mr_huggles_dog said:

Lol....you're such a tool.

YOU are the one that whined about me insulting you first. I can't help that you're just spinning and lying left and right and then throwing the "NUH UH....YOU DID IT!!!" crap at me and then calling me a child....it's the truth....what do you want me to say to your crying?

You THOUGHT I was mocking F2P games and MOBAs.....you were wrong. Yet, you're still intent that I'm all for exclusives. So far...you haven't been right about anything....why would this be any different.

Now I'm whining and crying? Is this another one of your sick stories that you're running in your head?

At the beginning you told me I was 12, and then you're telling me I'm 10... I guess next post I'll be 8.

You're obviously all about exclusives, that's the only thing you've been talking about.

You know, I'm really starting to think that you're actually crazy.

Ok dude....go on wit yo bad self.

Yeah, that's what I thought: you bring poor arguments and get mad, and in the end you had nothing to say.

#579 Posted by WallofTruth (1537 posts) -

@walloftruth said:
@mr_huggles_dog said:


I'll buy you the next AAA game that comes out that you want if you can point out where I said F2P games are bad.

"MOBAs, F2P games and other non standard games are the only things that sell well on PC....so practically EVERYTHING ELSE worth a dam comes to consoles."

If MOBA's and F2P games are not "worth a damn" then you're obviously saying they're bad.

So, I'd like to get Grand Theft Auto V Steam pre-order from you.

I swear....it's like some of you ppl didn't pass 3rd grade.

If I said "anything else worth a dam"....something has to be worth a dam first for there to be anything else...worth a dam.

So, I dunno.....I don't know if I'm being trolled but you never know....there are ppl actually this stupid out there....and I'd like to think I can fix stupid.

So let me guess, you're not keeping your word even though I pointed out where you said F2P games are bad?

#580 Edited by Cranler (8730 posts) -

@Cranler said:

@04dcarraher said:
@Cranler said:
@04dcarraher said:

@Cranler:

The first Athlon X2 at 3 ghz was not 2008; the "Windsor" Athlon X2 chips, up to 3.2 ghz, came out in late 2006 to early 2007 depending on location. You must be thinking of the Kuma or Regor versions of the Athlon X2 that were released in 2008/2009.

This is not even including the overclocking ability to get most X2's from 2.6ghz+ to 3 ghz.

Bethesda created the game for the consoles first, which bypasses a lot of the overheads and problems from porting. The consoles did have coding glitches too and had awful loading times because of their hdds and slow cpus. But the pc version got nailed with a port job, with some improvements later on, most of them supplied by the users, not the devs. Until the patch to support four threads came out, most cpus older and/or slower than 2nd gen Intel i-cores couldn't reach 60 fps even with medium to high end gpus.

Try using a better-coded game, say Bioshock Infinite, where an E6850 and an 8600 GTS 256mb were able to deliver 1280x720 at high settings, 35+ fps at all times, peaking up to 70 fps.

You mean this video where he runs the game at lowest settings at 800x600? See 2:01

Hope he's using a crt because 800x600 will look absolutely awful on an lcd.

There is no reason why that guy should be running at 800x600.

You're just being a troll though..... The 8800GT is able to get an 80 fps average maxed out at 720, so at 720 an 8600GTS should be able to do 35-40 average. Now there's a video of someone playing the game at 1440x900 with ultra settings with a 2.2 ghz intel cpu on an 8600GT, the DDR2 version, not even the GDDR3 version, and in many cases it gets close to 30 fps.

How am I trolling? You're the one who brought up that 800x600 Bioshock video, not me.

Now you bring up another video where the only footage is just walking around, no fighting, and it's still sub-30 fps most of the time.

lol, no, you're not trolling? And yes there was fighting, and what's funny is that a 2.2 ghz cpu and an 8600GT with DDR2 were almost playable with ultra at 1440x900. Now imagine a C2D at 3 ghz or any modern cpu and a normal 8600GTS with GDDR3; it would handle the game just as well as the 360, if not better. Problem is that we will never see true updated benchmarks with gpus that are nearly 7 years old and were mid-range. Fact is that during their brief usage they did in many cases provide better results in multiplats like COD4, UT3 and the 1st Bioshock.

The 8600GT gets 36fps maxed out at 1280x1024, which is more than double the resolution of 1024x600, so it's safe to assume that the 8600GT would get around 80fps maxed out in COD4 at the same resolution. Then in UT3 it gets 39fps maxed out at 1600x1200. It is safe to say that the 8600GT would be able to do 60 fps at 720p. And then the 8600GT was able to run Bioshock at a 40 fps average at 720p. So I would say as long as you have the cpu performance, the 8600GTS would be able to perform as well as, if not better than, the 360's settings.

Ok, I found the video and it's easy to tell the average framerate is in the low 20s at best. Compare the opening rowboat section to this: https://www.youtube.com/watch?v=d_7z_kLdYl0

You know the framerate is bad when you can find a 30 fps video that looks much smoother.

So now you want to move the goalposts even further from 2005 pc tech to c2d's at 3 ghz? LOL!

Not sure what game you speak of in the first sentence of your last paragraph. Sounds like Bioshock, which is 1280x720 on 360. 36 fps means the fps will drop into the 20s often.

CoD 4: 8600 gts / E8400 @ 3.6 ghz, 1024x768, 0x aa, 63 fps. 360 is 1024x600, 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

If the pc in that bench had been available in 2005, it would have cost $4,000.

http://www.techpowerup.com/reviews/Zotac/GeForce_9800_GTX/6.html

#581 Edited by jun_aka_pekto (15917 posts) -

@Cranler said:

CoD 4: 8600 gts / E8400 @ 3.6 ghz, 1024x768, 0x aa, 63 fps. 360 is 1024x600, 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels? Also, are you guys referring to Bioshock, Bioshock Infinite, or both?
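For reference, the arithmetic behind that figure (same 1024-pixel width, 168 extra rows), which also lines up with the ~22% figure quoted later in the thread:

```python
# Where the 172,032 figure comes from.
lo = 1024 * 600   # 614,400 pixels (the 360's reported CoD4 resolution)
hi = 1024 * 768   # 786,432 pixels
extra = hi - lo   # 1024 * 168 extra pixel rows = 172,032 extra pixels

print(extra)                    # 172032
print(round(extra / lo * 100))  # 28 -> ~28% more pixels than 1024x600
print(round(extra / hi * 100))  # 22 -> ~22% measured against 1024x768
```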

#582 Posted by Cranler (8730 posts) -

@Cranler said:

CoD 4: 8600 gts / E8400 @ 3.6 ghz, 1024x768, 0x aa, 63 fps. 360 is 1024x600, 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels?

I meant pixel rows and again, small res differences don't affect performance as much as aa.

#583 Edited by 04dcarraher (19264 posts) -

@Cranler said:

@jun_aka_pekto said:

@Cranler said:

CoD 4: 8600 gts / E8400 @ 3.6 ghz, 1024x768, 0x aa, 63 fps. 360 is 1024x600, 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels?

I meant pixel rows and again, small res differences don't affect performance as much as aa.

Not moving the goal posts, lol, you are...... All I'm talking about is that an 8600GTS is able to match or surpass the 360 even with modern games; we are not talking about 2005 anymore. We already know there was nothing besides the top tier cpus that could compete with the 360 back then.

Also, lol, 168 pixels..... 1024x768 vs 1024x600 is a 22% difference in pixels, and let's not forget that the 360's Xenos also had the edram daughter die that was used for AA. If you had a good cpu your framerates didn't vary much with Bioshock. Look at CoH performance with an 8600GTS, or Quake Wars at 1280x1024 doing 43 fps with 2x AA vs the 360's 1280x720 (no AA), or UT3 at 1280x1024 with no AA being able to do nearly 60 fps.

Fact is you are not willing to come to terms with the fact that the 8600GTS's performance was in the same league as the 360's gpu.

#584 Posted by Cranler (8730 posts) -

@Cranler said:

@jun_aka_pekto said:

@Cranler said:

CoD 4 8600 gts/e 8400 3.6 ghz 1024x768 0x aa 63 fps. 360 is 1024x 600 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels?

I meant pixel rows and again, small res differences don't affect performance as much as aa.

Not moving the goal posts lol, you are...... all im talking about is a 8600GTS is able to match or surpass the 360 even with modern games we are not talking about 2005 anymore we already know there was nothing besides the top tier cpu's that could compete with 360 back then.

Also lol 168 pixels..... 1024x768 vs 1024x600 is 22% difference in pixels and lets not forget that the 360 Xenos also had the edram daughter die that was used for AA. If you had a good cpu your framrates didnt vary much with bioshock. Look at CoH performance with 8600GTS or Quake wars at 1280x1024 43 fps with 2x AA vs 360's 1280x720 (no AA), or with UT3 at 1280x1024 with no AA being able to do nearly 60 fps.

Fact is you are not willing to come to terms that a 8600GTS performance was in the same league as the 360's gpu.

You speak as if there's going to be a huge performance difference between those two resolutions. I just tested FC3 since it allows you to change res in-game. Between 1280x960 and 1280x800 I had a 1 fps difference.

I wasn't aware of a 360 version of CoH. The Doom 3 engine is known to run like crap on the 360.

And it seems you're using the bench I listed for your UT3 bench. UT3 is known to be a very CPU-intensive game. That bench is with an E8400 @ 3.6 GHz.

#585 Posted by Mr_Huggles_dog (674 posts) -

@mr_huggles_dog said:

I swear....it's like some of you ppl didn't pass 3rd grade.

If I said "anything else worth a dam"....something has to be worth a dam first for there to be anything else...worth a dam.

So, I dunno.....I don't know if I'm being trolled but you never know....there are ppl actually this stupid out there....and I'd like to think I can fix stupid.

So let me guess, you're not keeping your word even though I pointed out where you said F2P games are bad?

You lack the basic reading skills most ppl have by the end of grade school.

YOU GET NOTHING.....GOOD DAY, SIR!!

#586 Posted by 04dcarraher (19264 posts) -

@Cranler said:

@04dcarraher said:

@Cranler said:

@jun_aka_pekto said:

@Cranler said:

CoD 4 8600 gts/e 8400 3.6 ghz 1024x768 0x aa 63 fps. 360 is 1024x 600 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels?

I meant pixel rows and again, small res differences don't affect performance as much as aa.

Not moving the goal posts lol, you are...... all im talking about is a 8600GTS is able to match or surpass the 360 even with modern games we are not talking about 2005 anymore we already know there was nothing besides the top tier cpu's that could compete with 360 back then.

Also lol 168 pixels..... 1024x768 vs 1024x600 is 22% difference in pixels and lets not forget that the 360 Xenos also had the edram daughter die that was used for AA. If you had a good cpu your framrates didnt vary much with bioshock. Look at CoH performance with 8600GTS or Quake wars at 1280x1024 43 fps with 2x AA vs 360's 1280x720 (no AA), or with UT3 at 1280x1024 with no AA being able to do nearly 60 fps.

Fact is you are not willing to come to terms that a 8600GTS performance was in the same league as the 360's gpu.

You speak as if there's going to be a huge performance difference between those two resolutions. I just tested FC 3 since it allows you to change res in game. Between 1280x960 and 1280x800 I had a 1 fps difference.

I wasn't aware of a 360 version of CoH. Doom 3 engine is known to run like crap on 360.

And it seems you're using the bench I listed for your UT 3 bench. UT 3 is known to be a very cpu intensive game. That bench is with a E8400 @ 3.6 GHz

No, I was not speaking as if there was going to be a huge difference, but you can see one if you are running the 512MB version of the card, where it allows higher quality assets. And you were testing Far Cry 3 at those resolutions on what GPU, btw? Memory bandwidth and bus width play a big part in feeding the GPU, and a 22% increase can change the game. Just like with an 8800GT, going from 1680x1050 to 1920x1080 can make a game unplayable even though it's only a 15% difference. 1280x1024 to 1680x1050 is also only a 16% difference but can change the fps by up to 25%. With UT3, AMD dual cores at or above 2.6 GHz run that game just fine.
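For anyone keeping score, here are the straight pixel-count ratios for the resolution pairs being thrown around in this exchange (simple arithmetic only; the forum percentages don't all line up with raw pixel ratios, and whether a given gap is playable still depends on the card and the game):

```python
# Pixel-count comparison for the resolution pairs mentioned in this exchange.
def px(w, h):
    return w * h

pairs = [
    ((1024, 600), (1024, 768)),
    ((1680, 1050), (1920, 1080)),
    ((1280, 1024), (1680, 1050)),
    ((1680, 1050), (1920, 1200)),
]

for low, high in pairs:
    lo, hi = px(*low), px(*high)
    print(f"{low} -> {high}: {hi - lo:,} more pixels "
          f"(+{(hi - lo) / lo:.0%} over the lower res)")
```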

#587 Posted by scottpsfan14 (3792 posts) -

@Cranler said:

@04dcarraher said:

@Cranler said:

@jun_aka_pekto said:

@Cranler said:

CoD 4 8600 gts/e 8400 3.6 ghz 1024x768 0x aa 63 fps. 360 is 1024x 600 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels?

I meant pixel rows and again, small res differences don't affect performance as much as aa.

Not moving the goal posts lol, you are...... all im talking about is a 8600GTS is able to match or surpass the 360 even with modern games we are not talking about 2005 anymore we already know there was nothing besides the top tier cpu's that could compete with 360 back then.

Also lol 168 pixels..... 1024x768 vs 1024x600 is 22% difference in pixels and lets not forget that the 360 Xenos also had the edram daughter die that was used for AA. If you had a good cpu your framrates didnt vary much with bioshock. Look at CoH performance with 8600GTS or Quake wars at 1280x1024 43 fps with 2x AA vs 360's 1280x720 (no AA), or with UT3 at 1280x1024 with no AA being able to do nearly 60 fps.

Fact is you are not willing to come to terms that a 8600GTS performance was in the same league as the 360's gpu.

You speak as if there's going to be a huge performance difference between those two resolutions. I just tested FC 3 since it allows you to change res in game. Between 1280x960 and 1280x800 I had a 1 fps difference.

I wasn't aware of a 360 version of CoH. Doom 3 engine is known to run like crap on 360.

And it seems you're using the bench I listed for your UT 3 bench. UT 3 is known to be a very cpu intensive game. That bench is with a E8400 @ 3.6 GHz

No i was not speaking as if there was going to be huge differences but you can if you are running the 512mb version of the card where it allows higher quality assets. With testing Farcry 3 at those resolutions on what gpu btw? memory bandwidth and bus play a big part in feeding the gpu. and 22% increase can change the game. just like with a 8800GT going from 1680x1050 to 1920x1080 can make a game unplayable even though its only a 15% difference. 1280x1024 to 1680x1050 also is only a 16% difference but can change the FPS by upto 25%. With UT3 AMD dual cores at or above 2.6 ghz runs that game just fine.

But that 8600 GTS couldn't run Halo 4 like the 360 could if it was on PC. DirectX is just too slow. 343i were taking advantage of things that DirectX simply does not allow. DX is just a pain, right dcarraher?

#588 Posted by Cranler (8730 posts) -

@Cranler said:

@04dcarraher said:

@Cranler said:

@jun_aka_pekto said:

@Cranler said:

CoD 4 8600 gts/e 8400 3.6 ghz 1024x768 0x aa 63 fps. 360 is 1024x 600 2x aa. Don't try to tell me that an extra 168 pixels is more demanding than 2xaa.

Isn't it 172,032 extra pixels?

I meant pixel rows and again, small res differences don't affect performance as much as aa.

Not moving the goal posts lol, you are...... all im talking about is a 8600GTS is able to match or surpass the 360 even with modern games we are not talking about 2005 anymore we already know there was nothing besides the top tier cpu's that could compete with 360 back then.

Also lol 168 pixels..... 1024x768 vs 1024x600 is 22% difference in pixels and lets not forget that the 360 Xenos also had the edram daughter die that was used for AA. If you had a good cpu your framrates didnt vary much with bioshock. Look at CoH performance with 8600GTS or Quake wars at 1280x1024 43 fps with 2x AA vs 360's 1280x720 (no AA), or with UT3 at 1280x1024 with no AA being able to do nearly 60 fps.

Fact is you are not willing to come to terms that a 8600GTS performance was in the same league as the 360's gpu.

You speak as if there's going to be a huge performance difference between those two resolutions. I just tested FC 3 since it allows you to change res in game. Between 1280x960 and 1280x800 I had a 1 fps difference.

I wasn't aware of a 360 version of CoH. Doom 3 engine is known to run like crap on 360.

And it seems you're using the bench I listed for your UT 3 bench. UT 3 is known to be a very cpu intensive game. That bench is with a E8400 @ 3.6 GHz

No i was not speaking as if there was going to be huge differences but you can if you are running the 512mb version of the card where it allows higher quality assets. With testing Farcry 3 at those resolutions on what gpu btw? memory bandwidth and bus play a big part in feeding the gpu. and 22% increase can change the game. just like with a 8800GT going from 1680x1050 to 1920x1080 can make a game unplayable even though its only a 15% difference. 1280x1024 to 1680x1050 also is only a 16% difference but can change the FPS by upto 25%. With UT3 AMD dual cores at or above 2.6 ghz runs that game just fine.

It's really not a good idea to use percentages with the resolutions. 1920x1080 and 1680x1050 have much bigger pixel-count differences than 1024x768 and 1024x600. Find some benches that compare close resolutions to prove there would be more than a few fps difference.

The FC3 test was on a GTX 680.

UT3 is all about the CPU. A Core 2 Duo at 2.2 GHz beats an Athlon X2 at 3.2 GHz in UT3. A 7900 GTX gets 55 fps at 1280x1024, yet in Bioshock and other UE 2.5-3 games you'd be looking at 30 fps or less with that card. http://www.anandtech.com/show/2352/8

#589 Posted by sam890 (1093 posts) -

PLEASE LET THIS THREAD DIE

#590 Edited by 04dcarraher (19264 posts) -

@Cranler said:

It's really not a good idea to use percentages with the resolutions. 1920x1080 and 1680x1050 have much bigger pixel count differences than 1024x768 and 1024x600. Find some benches that compare close resolutions to prove there would be more than a few fps difference.

The FC 3 test was on a gtx 680.

UT 3 is all about the cpu. A core 2 duo 2.2 ghz beats a x2 athlon 3.2 ghz in UT 3. 7900 gtx gets 55 fps at 1280x1024 yet on Bioshock and other UE 2.5-3 games you'd be looking at 30 fps or less with that card. http://www.anandtech.com/show/2352/8

Pixel percentages give, in general, the difference in rendering the image. And when we are talking about resolutions close to each other on a GPU only able to do 1280x**** or lower, those percentages can make a difference because of the memory limits, let alone GPU processing.

Doing FC3 on a GTX 680 at those low resolutions is wrong, since that GPU has the extra processing power and memory width and bus to handle those resolutions with ease, hence the 1 fps. Try 1680x1050 vs 1920x1200 and you'll see a 10 fps difference easily, or even going from 1280x1024 to 1680x1050 can show more than a 20 fps difference.

False, a C2D at 2.2 GHz is slightly slower than a 3 GHz Athlon X2. In Cinebench R10 the Athlon X2 6000 gets 5595 vs the C2D E4500's 4806, or even in PassMark, 1600 vs 1262.
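The relative gaps implied by those scores, taken at face value (just ratios of the numbers quoted above; they say nothing about any particular game):

```python
# Relative difference between the quoted benchmark scores (higher is better).
scores = {
    "Cinebench R10": (5595, 4806),   # Athlon X2 6000 vs C2D E4500 (2.2 GHz)
    "PassMark":      (1600, 1262),
}

for bench, (athlon, c2d) in scores.items():
    print(f"{bench}: Athlon X2 ahead by {(athlon - c2d) / c2d:.0%}")
```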

#591 Edited by 04dcarraher (19264 posts) -
@scottpsfan14 said:

But that 8600 GTS couldn't run Halo 4 like the 360 could if it was on PC. DirectX is just too slow. The 343i were taking advantages of things that DirectX simply does not allow. DX is just a pain right dcarraher?

... it's like saying an 8800GT can't do UC3 because it does not have the Cell+RSX. And again, DirectX has no bearing on GPU performance or ability as long as you have a CPU that can take up the overhead, using the same rendering standards. But since we are talking about DirectX 9, yes, you could not get the same tools and features, since the 360's API and hardware are somewhat more advanced than the standard DirectX 9 API and hardware. However, the 360 GPU is still limited to Shader Model 3, which is a standard in DX9. The 8600GTS is a DirectX 10 GPU able to do Shader Model 4, something the 360 can not do.

DirectX is not a scapegoat....

#592 Posted by Cranler (8730 posts) -

@Cranler said:

It's really not a good idea to use percentages with the resolutions. 1920x1080 and 1680x1050 have much bigger pixel count differences than 1024x768 and 1024x600. Find some benches that compare close resolutions to prove there would be more than a few fps difference.

The FC 3 test was on a gtx 680.

UT 3 is all about the cpu. A core 2 duo 2.2 ghz beats a x2 athlon 3.2 ghz in UT 3. 7900 gtx gets 55 fps at 1280x1024 yet on Bioshock and other UE 2.5-3 games you'd be looking at 30 fps or less with that card. http://www.anandtech.com/show/2352/8

Pixel percentages gives in general the difference in rendering the image. and when we are talking about resolutions close to each other gpu only being able to do 1280x**** or lower those percentages can make a difference because of the memory limits let alone gpu processing.

Doing FC3 on a GTX 680 doing those low resolutions is wrong since that gpu has the extra processing power and memory width and bus to handle those resolution with ease hence the 1 fps. Try doing 1680x1050 vs 1920x1200 you see a 10 fps difference easily. Or even doing 1280x1024 to 1680x1050 can see more then 20fps difference.

False a C2D at 2.2 ghz is slightly slower then 3 ghz Athlon X2. Cinebench R10 Athlon X2 6000 vs E4500 X2 gets 5595 vs c2d's 4806. or even pasmark 1600 vs 1262.

You haven't proven anything, and I still say 1024x600 with 2x AA is more demanding than 1024x768 with no AA, and of course the AA difference would be easily noticeable while the res difference would be just about impossible to notice. So the 360 version of CoD 4 beats the 8600 GTS.

Actually it was the 2.3 GHz E6550 that beats even the X2 6400 in UT3. Even if you clocked the E6550 down to 2.2 it would still beat the X2 6400. Point is that UT3 isn't a good game to use for GPU benches.

#593 Edited by 04dcarraher (19264 posts) -

@Cranler said:

You haven't proven anything and I still say 1024x600 2xaa is more demanding than 1024x768 no aa and of course the aa difference would be easily noticable while the res difference would be just about impossible to notice. So CoD 4 360 version beats the 8600 gts.

Actually it was the 2.3 ghz e6550 that beats even the x2 6400 in UT 3. Even if you clocked the e6550 down to 2.2 it would still beat the x2 6400. Point is that UT 3 isn't a good game to use for gpu benches.

lol 8600GTS creams 360 in CoD 4

Now lol at thinking the E6550 outright beats the 3.2 GHz Athlon X2. Against the X2 6000 at 3 GHz, PassMark has the E6550 at 1508 vs 1599 for the X2 6000, and in Cinebench it's 5167 vs 5595. Now, in some things the E6550 does edge out the 6000, like the SYSmark 2007 overall score, 149 vs 133, a whole 11% difference. Against an Athlon X2 6400 it edges that CPU out even more in the same areas it did against the 6000, but the overall SYSmark gap is smaller, and in the 3D portion of that test the 6400 beats it; overall the difference is only 7%, which means the performance would be nearly the same.

#594 Posted by scottpsfan14 (3792 posts) -
@scottpsfan14 said:

But that 8600 GTS couldn't run Halo 4 like the 360 could if it was on PC. DirectX is just too slow. The 343i were taking advantages of things that DirectX simply does not allow. DX is just a pain right dcarraher?

... its like saying an 8800GT cant do UC3 because it does not have the Cell+RSX. And again direct x has no bearing on gpu performance or ability as long you have a cpu that take up the overhead, using the same rendering standards . But since we are talking about direct x 9 yes could not get the same tools and features since the 360's API is more advanced then direct x 9 however the gpu is still limited to shader model 3 which is standard in dx 9. The 8600GTS is a direct x 10 gpu able to do shader model 4 something the 360 can not do. Direct x is not a scapegoat....

Let's break what you said down into little pieces, shall we?

"its like saying an 8800GT cant do UC3 because it does not have the Cell+RSX"

The fact is, the Cell isn't a fairytale. It does in fact have real computational power that can offload polygons, particles and other effects. So it is safe to say that a 7800GT could not run UC3 like the PS3 could. 8800GT? Who knows? It's a 500 GFLOP GPU, more than what the PS3 has altogether including Cell, and it's Shader Model 4 like you say. So in theory, why not, right?..

"And again direct x has no bearing on gpu performance or ability as long you have a cpu that take up the overhead, using the same rendering standards"

That's a bold statement there. Let's see... So your theory is that if you have an Intel CPU that is powerful enough, the draw-call limitation on the CPU side can be brute-forced, and it can actually render more chunks of geometry at higher speeds than the console with a weaker CPU and 100x the draw calls?

So why is it that developers from all over are 'frustrated' with the PC and DirectX if this problem can easily be overcome by buying a high-end CPU?

http://www.computerandvideogames.com/314566/carmack-extremely-frustrated-that-pc-is-10-times-as-powerful-as-ps3-360/

John Carmack

"It is extremely frustrating knowing that the hardware we've got on the PC is often ten times as powerful as the consoles but it has honestly been a struggle in many cases to get the game running at 60 frames per second on the PC like it does on a 360,"

"A lot of it's driver overhead issues, where there's so much that we do in the game, all of this dynamic texture updating where on the console we say 'alright, we've got a new page of data', we put that page in and update the page table that points to that.

The truth is those 'driver overheads' are not CPU, they're GPU. DirectX is essentially a PC developer's primary tool set, and it tethers and massively limits how the GPU can actually perform. But he was only talking about one instance of graphics rendering here.

In no way, shape, or form does he limit his statement to 'CPU overhead'. Put it this way: I can bet my life that Uncharted 4 and the like could not be done at the same graphics quality on a 7850. I could link so many more devs saying the same thing. DirectX is way too abstract to compete with a console for performance. Fact. Tell me your insight on this. Utter BS?

#595 Posted by clyde46 (45004 posts) -

Hopefully DX12 will actually be a leap forward compared to DX10/11.

#596 Posted by Cranler (8730 posts) -

@Cranler said:

You haven't proven anything and I still say 1024x600 2xaa is more demanding than 1024x768 no aa and of course the aa difference would be easily noticable while the res difference would be just about impossible to notice. So CoD 4 360 version beats the 8600 gts.

Actually it was the 2.3 ghz e6550 that beats even the x2 6400 in UT 3. Even if you clocked the e6550 down to 2.2 it would still beat the x2 6400. Point is that UT 3 isn't a good game to use for gpu benches.

lol 8600GTS creams 360 in CoD 4

Now lol with thinking the e6550 out right beats the Athlon X2 3.2ghz with the X2 6000 at 3 ghz passmark e6550 :1508 vs 1599 with x2 6000 then with Cinebench 5167 vs 5595. now with some things the e6550 does edge out the 6000 like with sysmark 2007 overall score 149 vs 133 a whole 11% difference. with an Athlon X2 6400 it edges out the cpu even more in the areas the 6000 did and overall sysmark score is less and with 3d portition of that test the 6400 beats it overall the difference is only 7%. Which means the performance would be nearly the same.

I already linked the 8600 GTS bench where the only way it can get 60 fps in CoD 4 is at 1024x768 with no AA. The 360 gets 60 fps at 1024x600, which is indistinguishable from 1024x768, and it also has 2x AA, so the 360 wins.

I said the E6550 beats it in UT3. It even beats the X2 6400. http://www.anandtech.com/show/2352/7

#598 Edited by 04dcarraher (19264 posts) -

@scottpsfan14 said:
@04dcarraher said:
@scottpsfan14 said:

But that 8600 GTS couldn't run Halo 4 like the 360 could if it was on PC. DirectX is just too slow. The 343i were taking advantages of things that DirectX simply does not allow. DX is just a pain right dcarraher?

... its like saying an 8800GT cant do UC3 because it does not have the Cell+RSX. And again direct x has no bearing on gpu performance or ability as long you have a cpu that take up the overhead, using the same rendering standards . But since we are talking about direct x 9 yes could not get the same tools and features since the 360's API is more advanced then direct x 9 however the gpu is still limited to shader model 3 which is standard in dx 9. The 8600GTS is a direct x 10 gpu able to do shader model 4 something the 360 can not do. Direct x is not a scapegoat....


"And again direct x has no bearing on gpu performance or ability as long you have a cpu that take up the overhead, using the same rendering standards"

Thats bold statements there. Lets see.. So your theory is that if you have an Intel CPU that is powerful enough, the lack of drawcalls from the CPU can be brute forced and actually render more chunks of geometry at higher speeds than the console with a weaker CPU and 100x the draw calls?

So why is it developers from all over are 'frustrated' with the PC and Direct X if this problem can easily be overcome by buying a high end CPU?

http://www.computerandvideogames.com/314566/carmack-extremely-frustrated-that-pc-is-10-times-as-powerful-as-ps3-360/


In no way, shape, or form does he limit his statement to 'CPU overhead'. Put it this way, I can bet my life that Uncharted 4 and the likes could not be done at the same graphics quality on a 7850. I could link so many more devs saying the same thing. Direct X is way too abstract to compete with a console for performance. Fact. Tell me your insight on this. Utter BS?

Developers are not 'frustrated' over DirectX in general anymore; it was having to abide by the older DirectX 9 API limits and older code for multiplats. 2011 was the year we started seeing developers take advantage of modern DirectX 11 features and abilities, and afterward all these DirectX-limiting claims disappeared. Carmack's past statements are not as relevant because of the shift to modern games being DX10+ or just plain DX11 only.

On consoles you can directly write commands to the GPU buffer, and you will write the commands directly in a format that the GPU hardware understands. It's just a few lines of code to add a single draw call. The PC has both user-space and kernel-space drivers that process the draw calls. More than one program can be adding GPU commands simultaneously, and the driver must synchronize and store/restore the GPU state accordingly. The GPU commands must be translated by the driver to a format understood by the GPU (many different manufacturers and GPU families). The commands and modified data must then be sent over to the GPU.

Draw calls on PC are handled by the CPU before going to the GPU, and like I stated earlier, that's fine as long as you have the CPU to bulldoze through the overhead. The overhead in DirectX 11 is nowhere near what it was in past versions, and a low-level API like Mantle bypassing DirectX has not given any real gains with modern Intel CPUs; the differences are only slight. Your idea of DirectX limiting GPUs is exaggerated.
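A crude way to see the "bulldoze through the overhead with a fast CPU" argument in numbers (a toy model only; the per-call costs below are made-up illustration values, not measurements from any real driver or API):

```python
# Toy model: if each draw call costs the CPU some fixed time, the CPU caps the
# frame rate at 1 / (draws_per_frame * cost_per_draw), regardless of the GPU.
# All per-call costs here are hypothetical, chosen just to show the shape of it.

def cpu_fps_ceiling(draws_per_frame, seconds_per_draw):
    return 1.0 / (draws_per_frame * seconds_per_draw)

draws = 3000  # hypothetical number of draw calls in one frame

# "Console-style" near-direct command writes vs a heavier PC API path,
# and the same PC path on a faster CPU that chews through it quicker.
print(cpu_fps_ceiling(draws, 2e-6))    # ~167 fps ceiling
print(cpu_fps_ceiling(draws, 10e-6))   # ~33 fps ceiling
print(cpu_fps_ceiling(draws, 5e-6))    # ~67 fps ceiling (faster CPU, same API)
```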

#599 Edited by 04dcarraher (19264 posts) -
@Cranler said:

I already linked the 8600 gts bench where the only way it can get 60 fps in CoD 4 is at 1024x768 no aa. 360 gets 60 fps with 1024x600 which is indistinguishable from 1024x768 and also has 2x aa so 360 wins.

I said the e6550 beats it in UT 3. it even beats the x2 6400. http://www.anandtech.com/show/2352/7

You haven't proved anything. Don't ignore the fact that the 8600GTS is able to get a 36 fps average maxed out at over double the resolution of the 360.

The only reason the E6550 beats the Athlons slightly (by 10%) is the bottleneck created by the use of such a low resolution. Come back when you have a proper excuse, when we know the 8600GT is able to get nearly a 40 fps average maxed out at 1600x1200, and even then the Athlon X2 at 3 GHz+ provided more than a 120 fps average, which means that Athlon X2 can feed the GPU enough data to keep 60 fps.
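That "can the CPU feed the GPU" point boils down to a min() of the two limits (a simplistic model; the numbers below just reuse the figures quoted in this post and the earlier scaling estimate):

```python
# Simplistic bottleneck model: the delivered frame rate is roughly capped by
# whichever of the CPU or GPU is the slower limit for the scene in question.

def effective_fps(cpu_limit_fps, gpu_limit_fps):
    return min(cpu_limit_fps, gpu_limit_fps)

# Figures quoted above: Athlon X2 3 GHz+ pushing 120+ fps in UT3 CPU tests,
# 8600GT managing ~40 fps maxed out at 1600x1200 (~81 fps by the 720p estimate).
print(effective_fps(120, 40))   # 40 -> GPU-bound at 1600x1200
print(effective_fps(120, 81))   # 81 -> still GPU-bound at 720p
```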

#600 Posted by Cranler (8730 posts) -

@Cranler said:

I already linked the 8600 gts bench where the only way it can get 60 fps in CoD 4 is at 1024x768 no aa. 360 gets 60 fps with 1024x600 which is indistinguishable from 1024x768 and also has 2x aa so 360 wins.

I said the e6550 beats it in UT 3. it even beats the x2 6400. http://www.anandtech.com/show/2352/7

You haven't proved anything dont ignore the fact that 8600GTS is able to get 36 fps average maxed at over double the resolution of the 360.

The only reason why the e6550 beats the Athlon 's slightly :by 10%" is because the bottleneck created by the use of such a low resolution. Come back when you have a proper excuse when we know that 8600GT is able to get nearly 40 fps average maxed out in 1600x1200 even still Athlon x2 3 ghz+ provided more then120fps average which means that athlon x2 can feed the gpu enough data to keep 60 fps.

36 fps in what game?

So you know better than Anandtech about CPU benchmarking?

What game does the 8600 GT get 40 fps in at 1600x1200? You need to be more specific when citing benchmarks.