The truth about console vs PC.

This topic is locked from further discussion.

#151 Posted by Heirren (16657 posts) -
Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.
#152 Posted by clyde46 (45439 posts) -
Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.Heirren
Upgrade season is best season.
#153 Posted by napo_sp (226 posts) -
Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.Heirren
pc gamer fanboys = the fanboys of someone who is a pc gamer. Why would they be pissed about upgrade season? Unless that pc gamer is charging them some $$$ so that he can upgrade, that is; in that case, he is the one who should be envied, because a bunch of his fanboys are the ones paying for his hardware.
#154 Posted by menes777 (2643 posts) -

[QUOTE="Heirren"]Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.clyde46
Upgrade season is best season.

Because at least we can upgrade. If you want to play PS4 and CasualBox720 games, you have to buy them at their most expensive prices or pray that in a few years they will drop in price.

#155 Posted by zekere (2509 posts) -

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for even a week, and the moment you connect to the internet, your PC halves its speed, and last but not least, it is damn ugly!!!

#156 Posted by wis3boi (31186 posts) -

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for even a week, and the moment you connect to the internet, your PC halves its speed, and last but not least, it is damn ugly!!!

zekere

0/10

#157 Posted by Heirren (16657 posts) -
[QUOTE="Heirren"]Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.clyde46
Upgrade season is best season.

For some, I don't doubt it. But for others who just upgraded fairly recently? The odd thing about this coming gen is that the control interface on consoles is changing. If the additions to the control mechanisms are adopted, where does that leave the mouse/keyboard?
#158 Posted by MK-Professor (3755 posts) -

I would love to see what the game looks like on an 8800 gts.

My guess is not that great. Performance will probably be crap too.

Just because it can run it doesn't mean it's worth it. If you're trying to squeeze the latest and greatest games onto a GPU that is literally 6 years old, then you're best off getting the latest round of consoles and saving yourself a whole heap of trouble.

Wasdie

Actually, an 8800 will play Bioshock Infinite (and any game) with better graphics and performance than the consoles.

#159 Posted by tenaka2 (17029 posts) -

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for even a week, and the moment you connect to the internet, your PC halves its speed, and last but not least, it is damn ugly!!!

zekere

You are drowning in a sea of ignorance.

#160 Posted by clyde46 (45439 posts) -
[QUOTE="clyde46"][QUOTE="Heirren"]Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.Heirren
Upgrade season is best season.

For some, I don't doubt it. But for others who just upgraded fairly recently? The odd thing about this coming gen is that the control interface on consoles is changing. If the additions to the control mechanisms are adopted, where does that leave the mouse/keyboard?

That's the nature of PC gaming: you just accept that the stuff you buy will be superseded when the next line of parts comes out. The KB/M has been around since the very beginning of PCs; you can't improve on perfection.
#161 Posted by Heirren (16657 posts) -
[QUOTE="clyde46"][QUOTE="Heirren"][QUOTE="clyde46"] Upgrade season is best season.

For some, I don't doubt it. But for others who just upgraded fairly recently? The odd thing about this coming gen is that the control interface on consoles is changing. If the additions to the control mechanisms are adopted, where does that leave the mouse/keyboard?

That's the nature of PC gaming: you just accept that the stuff you buy will be superseded when the next line of parts comes out. The KB/M has been around since the very beginning of PCs; you can't improve on perfection.

Kb/mouse is alright. I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.
#162 Posted by p4s2p0 (4167 posts) -
The KB/M has been around since the very beginning of PCs; you can't improve on perfection. clyde46
Not true; mice going from ball to laser is an example.
#163 Posted by clyde46 (45439 posts) -
[QUOTE="Heirren"][QUOTE="clyde46"][QUOTE="Heirren"] For some, I don't doubt it. But for others who just upgraded fairly recently? The odd thing about this coming gen is that the control interface on consoles is changing. If the additions to the control mechanisms are adopted, where does that leave the mouse/keyboard?

That's the nature of PC gaming: you just accept that the stuff you buy will be superseded when the next line of parts comes out. The KB/M has been around since the very beginning of PCs; you can't improve on perfection.

Kb/mouse is alright. I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.

I don't agree. If you look at this gen, the big budget releases offer support for both. I play FC3 with either my 360 pad or my KB/M.
#164 Posted by clyde46 (45439 posts) -
[QUOTE="clyde46"] The KB/M has been around since the very beginning of PCs; you can't improve on perfection. p4s2p0
Not true; mice going from ball to laser is an example.

Ok, you have me there :P
#165 Posted by jhonMalcovich (4612 posts) -

[QUOTE="p4s2p0"][QUOTE="clyde46"] The KB/M has been around since the very beginning of PCs; you can't improve on perfection. clyde46
Not true; mice going from ball to laser is an example.

Ok, you have me there :P

It would be nice to have vibration in mice. This is the only thing lacking in them. 

#166 Posted by clyde46 (45439 posts) -

[QUOTE="clyde46"][QUOTE="p4s2p0"] Not true; mice going from ball to laser is an example. jhonMalcovich

Ok, you have me there :P

It would be nice to have vibration in mice. This is the only thing lacking in them. 

No!
#167 Posted by jhonMalcovich (4612 posts) -

[QUOTE="jhonMalcovich"]

[QUOTE="clyde46"] Ok, you have me there :Pclyde46

It would be nice to have vibration in mice. This is the only thing lacking in them. 

No!

Hey! It may have some cool implementations when playing single player, like the mouse vibrating when shooting a minigun.

#168 Posted by Heirren (16657 posts) -
[QUOTE="clyde46"][QUOTE="Heirren"][QUOTE="clyde46"] That's the nature of PC gaming: you just accept that the stuff you buy will be superseded when the next line of parts comes out. The KB/M has been around since the very beginning of PCs; you can't improve on perfection.

Kb/mouse is alright. I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.

I don't agree. If you look at this gen, the big budget releases offer support for both. I play FC3 with either my 360 pad or my KB/M.

But that's because current design can cater to both. The stock ps4 controller has new features.
#169 Posted by lowe0 (13692 posts) -
[QUOTE="Heirren"][QUOTE="clyde46"][QUOTE="Heirren"] Kb/mouse is alright. I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.

I don't agree. If you look at this gen, the big budget releases offer support for both. I play FC3 with either my 360 pad or my KB/M.

But that's because current design can cater to both. The stock ps4 controller has new features.

Games are designed to be compatible with as many wallets as possible. Why specialize when it only costs sales?
#170 Posted by Heirren (16657 posts) -
[QUOTE="lowe0"][QUOTE="Heirren"][QUOTE="clyde46"] I don't agree. If you look at this gen, the big budget releases offer support for both. I play FC3 with either my 360 pad or my KB/M.

But that's because current design can cater to both. The stock ps4 controller has new features.

Games are designed to be compatible with as many wallets as possible. Why specialize when it only costs sales?

The pc gets left out of that equation. Even big releases like GTA are delayed because of piracy. I'm not even knocking the format, but new consoles will push the new features.
#171 Posted by rjdofu (9170 posts) -

[QUOTE="zekere"]

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for even a week, and the moment you connect to the internet, your PC halves its speed, and last but not least, it is damn ugly!!!

tenaka2

You are drowning in a sea of ignorance.

I think of it as little more than a drop of trolling.
#172 Posted by menes777 (2643 posts) -

[QUOTE="p4s2p0"][QUOTE="clyde46"] The KB/M has been around since the very beginning of PCs; you can't improve on perfection. clyde46
Not true; mice going from ball to laser is an example.

Ok, you have me there :P

Keyboards have changed as well. Take a keyboard from even the mid-to-late 90s and compare it to a current keyboard and you will notice a big difference. There are added ergonomics, and they're not as loud. Also you have those mini-keyboards made specifically for gaming, roll-out keyboards, etc... The PC is constantly evolving; only those with their heads stuck in the sand, or trolls, are missing it.

#173 Posted by menes777 (2643 posts) -

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for even a week, and the moment you connect to the internet, your PC halves its speed, and last but not least, it is damn ugly!!!

zekere

It's funny when stupid people get on the internet and do idiotic things and end up getting infected with malware and spyware, but somehow it's the hardware's fault they were morons! :lol:

#174 Posted by clyde46 (45439 posts) -

[QUOTE="clyde46"][QUOTE="p4s2p0"] Not true; mice going from ball to laser is an example. menes777

Ok, you have me there :P

Keyboards have changed as well. Take a keyboard from even the mid-to-late 90s and compare it to a current keyboard and you will notice a big difference. There are added ergonomics, and they're not as loud. Also you have those mini-keyboards made specifically for gaming, roll-out keyboards, etc... The PC is constantly evolving; only those with their heads stuck in the sand, or trolls, are missing it.

Yes, but the actual design has stayed the same. It's like the motor car: it's gotten faster, safer, etc., but at its core it's still the same design as the first motor cars way back.
#175 Posted by br0kenrabbit (12860 posts) -

I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.Heirren

I disagree about immersion. With a 1:1 mouse (no acceleration), movement is fluid and natural. Thumbsticks have limited mobility (less than an inch) and almost always acceleration (the closer to the edge the stick is, the faster the movement). In essence, your input is 'filtered'.

Also, M/KB isn't the only option for a PC control scheme. I have a whole cockpit setup for my flight sims. I've had a head-tracking device (TrackIR) since the mid-2000s. I have a force-feedback joystick (REAL force feedback, NOT rumble), and I have a steering wheel somewhere... I think I still have it.

Point being, PC isn't limited to M/KB. It's just the preferred method for MOST games among PC gamers, and that's because it works well. No amount of analog control can compete with the precision of 1:1 movement.
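The "filtered input" point above can be sketched in a few lines: a 1:1 mouse maps counts directly to view movement, while a typical thumbstick applies a deadzone and a non-linear response curve, so half deflection gives far less than half the maximum turn rate. This is a hypothetical Python sketch with invented deadzone and exponent values, not any particular game's real curve.

```python
def mouse_turn(counts, degrees_per_count=0.022):
    # 1:1 mapping: every mouse count moves the view a fixed amount,
    # so physical input translates directly to aim (no filtering).
    return counts * degrees_per_count

def stick_turn(deflection, deadzone=0.15, max_rate=180.0, exponent=2.0):
    # Typical thumbstick handling (illustrative values): ignore small
    # deflections (deadzone), then ramp the turn *rate* up non-linearly
    # toward the edge of the stick's travel.
    d = abs(deflection)
    if d < deadzone:
        return 0.0
    normalized = (d - deadzone) / (1.0 - deadzone)
    rate = max_rate * normalized ** exponent   # degrees per second
    return rate if deflection >= 0 else -rate

# Half deflection yields well under half the maximum turn rate:
print(stick_turn(0.5))   # ~30 deg/s with these made-up values
print(stick_turn(1.0))   # 180 deg/s at full deflection
```

With this kind of curve the stick trades precision for range: fine aiming happens in the flat part near the deadzone, while fast turns require riding the edge.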

#176 Posted by locopatho (20193 posts) -

[QUOTE="Heirren"]I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.br0kenrabbit

I disagree about immersion. With a 1:1 mouse (no acceleration), movement is fluid and natural. Thumbsticks have limited mobility (less than an inch) and almost always acceleration (the closer to the edge the stick is, the faster the movement). In essence, your input is 'filtered'.

Also, M/KB isn't the only option for a PC control scheme. I have a whole cockpit setup for my flight sims. I've had a head-tracking device (TrackIR) since the mid-2000s. I have a force-feedback joystick (REAL force feedback, NOT rumble), and I have a steering wheel somewhere... I think I still have it.

Point being, PC isn't limited to M/KB. It's just the preferred method for MOST games among PC gamers, and that's because it works well. No amount of analog control can compete with the precision of 1:1 movement.

It's just whatever you prefer. I personally can't stand WASD for movement and love the 360-degree movement of a thumbstick, complete with lighter pressure for sneaking, medium pressure for walking, and full pressure for running. That's a million times better than keyboard movement for me.
#177 Posted by DragonfireXZ95 (19818 posts) -

[QUOTE="br0kenrabbit"]

[QUOTE="Heirren"]I feel a controller is a bit more immersive, but that's beside the point. My point is that game design centers around consoles. If big-budget releases adopt the new stock control inputs, a controller would be necessary for some games--nullifying the mouse/keyboard setup.locopatho

I disagree about immersion. With a 1:1 mouse (no acceleration), movement is fluid and natural. Thumbsticks have limited mobility (less than an inch) and almost always acceleration (the closer to the edge the stick is, the faster the movement). In essence, your input is 'filtered'.

Also, M/KB isn't the only option for a PC control scheme. I have a whole cockpit setup for my flight sims. I've had a head-tracking device (TrackIR) since the mid-2000s. I have a force-feedback joystick (REAL force feedback, NOT rumble), and I have a steering wheel somewhere... I think I still have it.

Point being, PC isn't limited to M/KB. It's just the preferred method for MOST games among PC gamers, and that's because it works well. No amount of analog control can compete with the precision of 1:1 movement.

It's just whatever you prefer. I personally can't stand WASD for movement and love the 360-degree movement of a thumbstick, complete with lighter pressure for sneaking, medium pressure for walking, and full pressure for running. That's a million times better than keyboard movement for me.

Unless a game doesn't support it and you simply press crouch to sneak.

Most first-person shooters, for example.

#178 Posted by br0kenrabbit (12860 posts) -

It's just whatever you prefer. I personally can't stand WASD for movement and love the 360-degree movement of a thumbstick, complete with lighter pressure for sneaking, medium pressure for walking, and full pressure for running. That's a million times better than keyboard movement for me.locopatho

I can't think of a game where that much variance in movement speed would be beneficial enough to offset the loss of accuracy, but CTRL (crawl), SHIFT (walk), no modifier (run) works perfectly for me.

In fact, I've had trouble with jumping/tightrope puzzles on consoles because it's so hard to get 'true north' input... it's always off to the left or right just a little bit when pressing up.

#179 Posted by mastershake575 (8354 posts) -
[QUOTE="clyde46"][QUOTE="Heirren"]Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.Heirren
Upgrade season is best season.

For some, I don't doubt it. But for others who just upgraded fairly recently?

Those who upgraded recently should be packing at least an x6/x8 if they went AMD, or an i5/i7 if they went Intel. RAM's dirt cheap, so most people already have 8-10GB, and the next generation is supposedly running a mid-range card, which those who upgraded recently should already have. (I don't see any problems; at the very worst, those who upgraded recently might need a better card in another year and a half, assuming they went lower than $200.)
#180 Posted by ArchoNils2 (6160 posts) -

Actually, the real truth about console vs pc is that consoles give you a brand to root for. I see it with sports teams and I see it with consoles: there are huge numbers of fans who just want to make their team/system look good while insulting the others.

Oh, but that aside, consoles become obsolete as well; or when did the last NES game release?

#181 Posted by 04dcarraher (19328 posts) -

[QUOTE="04dcarraher"]

[QUOTE="evildead6789"]

Crysis 2 runs at 1152 x 720 on consoles instead of 1280 x 720p. So?

you didn't get around 40-50 fps with your 8800 gt and that athlon at that resolution; the 5670, which is better with double the memory, gets 32 fps. They clearly mention in the article that high settings are only slightly better than the console versions, and this is with an i7 2600

link

Go do some research before you spout all this nonsense

evildead6789

:lol: you need to do some real hard looking and researching there....

The fact that the 9800GT gets 39 fps average at 1680x1050 is funny, since you clearly don't know that the 9800GT is an 8800GT :lol:

again, that's 2x the resolution, which means the 8800GT is at least 2x faster than the 360 gpu. The 8800GT or 9800GT is between 2 and 3 times faster than the 360 gpu, both in gflops performance and in games: running the same resolutions with higher settings it gets 2-3x the fps, or it runs much higher resolutions and settings with equal or better fps.

maybe so, but that's twice the vram, which makes higher framerates at higher resolutions possible. I said from the beginning I was comparing the 8800 gt 256mb, since that is the same amount of vram the hd twins have. The 8800 gt 512mb (and the 9800 gt, which only comes in 512mb) are way faster at those resolutions.

The 5670 (even with 512mb) comes a lot closer in performance to the 8800 gt 256mb (but it's still faster). Yeah, you'd better do some hard looking and researching, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an xbox 360 cpu, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less cpu dependent it is, so combining an 8800 gt (it doesn't even matter what amount of memory) with an x2 athlon will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

:lol: so much fail

That link of yours is with a GTX 590,

so again, do more research
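The resolution figures argued over in this exchange are easy to sanity-check with raw pixel counts: 1680x1050 works out to roughly 2.1x the pixels of the 1152x720 that Crysis 2 reportedly renders at on consoles. A quick illustrative check:

```python
# Pixel counts for the resolutions being argued about in this thread.
resolutions = {
    "console Crysis 2 (1152x720)": 1152 * 720,
    "full 720p (1280x720)":        1280 * 720,
    "PC benchmark (1680x1050)":    1680 * 1050,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# How much more work the PC benchmark resolution asks of the GPU:
ratio = (1680 * 1050) / (1152 * 720)
print(f"1680x1050 is {ratio:.2f}x the pixels of 1152x720")  # ~2.13x
```

So "2x the resolution" in the posts above is about right in terms of raw pixels pushed, which is the basis for the "at least 2x faster" claim.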

#182 Posted by Xtasy26 (4251 posts) -

What's funny is that you totally ignore the fact that bioshock infinite requires a DirectX 10 gpu as a minimum. Also, Bill Gardener has spoken about the various PC features and design philosophies that went into the pc version of the game. Among the features: DirectX 11 support, contact-hardening shadows, a Field of View slider, optimized mouse controls, and more. The devs have to turn to the powerful PC to really allow their game to stand out from the crowd. The PC version is getting special attention. 04dcarraher

Thank you.

Look at the IGN PC version of the review:

http://www.ign.com/articles/2013/03/22/bioshock-infinite-pc-review

The reviewer mentions how the console version has crappier textures and frame rate drops, because it can't render the entire environment at a consistent frame rate.

I won't even go into the DX11 effects they put into the PC version. Also, you had to pause the game in order to switch between the powers you want and to choose among the multitude of different weapons when shooting. On the PC it can easily be done instantly with a simple button press and mouse scroll.

In other words, the PC version is indeed the special version. Look at some of the 1080p videos and screenshots on the PC (especially the outdoor environments). The game looks gorgeous on the PC.

#183 Posted by MK-Professor (3755 posts) -

[QUOTE="evildead6789"]

[QUOTE="04dcarraher"]

:lol: you need to do some real hard looking and researching there....

The fact that the 9800GT gets 39 fps average at 1680x1050 is funny, since you clearly don't know that the 9800GT is an 8800GT :lol:

again, that's 2x the resolution, which means the 8800GT is at least 2x faster than the 360 gpu. The 8800GT or 9800GT is between 2 and 3 times faster than the 360 gpu, both in gflops performance and in games: running the same resolutions with higher settings it gets 2-3x the fps, or it runs much higher resolutions and settings with equal or better fps.

04dcarraher

maybe so, but that's twice the vram, which makes higher framerates at higher resolutions possible. I said from the beginning I was comparing the 8800 gt 256mb, since that is the same amount of vram the hd twins have. The 8800 gt 512mb (and the 9800 gt, which only comes in 512mb) are way faster at those resolutions.

The 5670 (even with 512mb) comes a lot closer in performance to the 8800 gt 256mb (but it's still faster). Yeah, you'd better do some hard looking and researching, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an xbox 360 cpu, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less cpu dependent it is, so combining an 8800 gt (it doesn't even matter what amount of memory) with an x2 athlon will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

:lol: so much fail

That link of yours is with a GTX 590,

so again, do more research

Did you expect anything better from a console peasant? :P

This benchmark is at extreme quality; lowering to just high quality will tax the CPU less, and the smaller fov that consoles use taxes the CPU less too.

Also, if the GPU in the benchmark were the 8800GT and not the GTX590, then all CPUs would give the same fps.
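The settings-vs-CPU argument running through these posts boils down to a simple bottleneck model: a frame can't finish faster than its slowest stage, so a faster CPU only helps once the GPU stops being the limit (and vice versa). A toy sketch with invented frame times, not Crysis 2 measurements:

```python
def fps(cpu_ms, gpu_ms):
    # A frame can't finish faster than its slowest stage,
    # so frame time is roughly the max of the two.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Invented numbers: a slow CPU vs a fast CPU, paired with an old GPU
# at extreme settings vs console-level settings.
print(fps(cpu_ms=25, gpu_ms=40))  # GPU-bound at extreme: 25 fps
print(fps(cpu_ms=8,  gpu_ms=40))  # faster CPU, still 25 fps (wasted)
print(fps(cpu_ms=25, gpu_ms=12))  # settings lowered: now CPU-bound, 40 fps
print(fps(cpu_ms=8,  gpu_ms=12))  # fast CPU finally pays off
```

This is why both sides can be partly right: at extreme settings the GPU hides CPU differences, and at lowered (console-level) settings the CPU difference re-emerges.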

#184 Posted by faizan_faizan (7855 posts) -
Even big releases like GTA are delayed because of piracy. I'm not even knocking the format, but new consoles will push the new features.Heirren
Source? GTA has ALWAYS been late on PC. ALWAYS. (Except 1 and its expansions, and 2 I think)
#185 Posted by mitu123 (153911 posts) -
Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.Heirren

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for even a week, and the moment you connect to the internet, your PC halves its speed, and last but not least, it is damn ugly!!!

zekere
And you wonder why people don't take you guys seriously...
#186 Posted by RyviusARC (4421 posts) -

[QUOTE="04dcarraher"]

[QUOTE="evildead6789"]

Crysis 2 runs at 1152 x 720 on consoles instead of 1280 x 720p. So?
[image: benchmark chart, 1680x1050 High settings]

you didn't get around 40-50 fps with your 8800 gt and that athlon at that resolution; the 5670, which is better with double the memory, gets 32 fps. They clearly mention in the article that high settings are only slightly better than the console versions, and this is with an i7 2600

link

Go do some research before you spout all this nonsense

evildead6789

:lol: you need to do some real hard looking and researching there....

The fact that the 9800GT gets 39 fps average at 1680x1050 is funny, since you clearly don't know that the 9800GT is an 8800GT :lol:

again, that's 2x the resolution, which means the 8800GT is at least 2x faster than the 360 gpu. The 8800GT or 9800GT is between 2 and 3 times faster than the 360 gpu, both in gflops performance and in games: running the same resolutions with higher settings it gets 2-3x the fps, or it runs much higher resolutions and settings with equal or better fps.

maybe so, but that's twice the vram, which makes higher framerates at higher resolutions possible. I said from the beginning I was comparing the 8800 gt 256mb, since that is the same amount of vram the hd twins have. The 8800 gt 512mb (and the 9800 gt, which only comes in 512mb) are way faster at those resolutions.

The 5670 (even with 512mb) comes a lot closer in performance to the 8800 gt 256mb (but it's still faster). Yeah, you'd better do some hard looking and researching, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an xbox 360 cpu, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less cpu dependent it is, so combining an 8800 gt (it doesn't even matter what amount of memory) with an x2 athlon will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

Dude, the link you posted was at extreme settings... of course the CPU would get bogged down at those settings.

As I've said before, I was able to get between 40-50 fps on an 8800gt at 1680x1050 at CONSOLE SETTINGS.

Also, it doesn't matter that I have the 512mb version. The extra RAM doesn't give my card more processing power; it just enables it to store higher-res assets.

And the benchmarks you used are outdated.

There have been many driver updates and game patches to improve performance even more.

So once again, I was correct when I said the 8800gt is around 3x the power of the consoles, if not more if you OC.

#187 Posted by evildead6789 (7543 posts) -

[QUOTE="evildead6789"]

[QUOTE="04dcarraher"]

:lol: you need to do some real hard looking and researching there....

The fact that the 9800GT gets 39 fps average at 1680x1050 is funny, since you clearly don't know that the 9800GT is an 8800GT :lol:

again, that's 2x the resolution, which means the 8800GT is at least 2x faster than the 360 gpu. The 8800GT or 9800GT is between 2 and 3 times faster than the 360 gpu, both in gflops performance and in games: running the same resolutions with higher settings it gets 2-3x the fps, or it runs much higher resolutions and settings with equal or better fps.

04dcarraher

maybe so, but that's twice the vram, which makes higher framerates at higher resolutions possible. I said from the beginning I was comparing the 8800 gt 256mb, since that is the same amount of vram the hd twins have. The 8800 gt 512mb (and the 9800 gt, which only comes in 512mb) are way faster at those resolutions.

The 5670 (even with 512mb) comes a lot closer in performance to the 8800 gt 256mb (but it's still faster). Yeah, you'd better do some hard looking and researching, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an xbox 360 cpu, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less cpu dependent it is, so combining an 8800 gt (it doesn't even matter what amount of memory) with an x2 athlon will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

:lol: so much fail

That link of yours is with a GTX 590,

so again, do more research

So what? Christ, your stupidity has no limits.

They benchmarked it at extreme settings; the difference between higher-end cpus is also big, and there's no bottleneck there.

So what do you think, the athlon x2 with a 9800 gt will give the same framerates as an i7-2600 with a 9800 gt at console settings? There's already a difference between i7-2600s at different clock settings (which they also tested in that article). They concluded from the benchmarks (as would anyone who knows something about pc hardware) that Crysis 2 is very cpu dependent; it also makes use of four cores.

I suggest you stop making a fool of yourself :lol:

#188 Posted by evildead6789 (7543 posts) -

[QUOTE="evildead6789"]

[QUOTE="04dcarraher"] maybe so, but that's twice the vram, which makes higher framerates at higher resolutions possible. I said from the beginning I was comparing the 8800 gt 256mb, since that is the same amount of vram the hd twins have. The 8800 gt 512mb (and the 9800 gt, which only comes in 512mb) are way faster at those resolutions.

The 5670 (even with 512mb) comes a lot closer in performance to the 8800 gt 256mb (but it's still faster). Yeah, you'd better do some hard looking and researching, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an xbox 360 cpu, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less cpu dependent it is, so combining an 8800 gt (it doesn't even matter what amount of memory) with an x2 athlon will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

RyviusARC

:lol: so much fail

That link of yours is with a GTX 590,

so again, do more research

Did you expect anything better from a console peasant? :P

This benchmark is at extreme quality; lowering to just high quality will tax the CPU less, and the smaller fov that consoles use taxes the CPU less too.

Also, if the GPU in the benchmark were the 8800GT and not the GTX590, then all CPUs would give the same fps.

oh please, another noob in pc hardware has his say... The benchmark is at extreme quality, but they also use a gtx 590. If you lower the quality settings, the gpu will be taxed less, the cpu not so much; what do you think, that all the shading is done on the cpu? They clearly said crysis 2 uses four cores and is highly cpu dependent, hence the framerate differences on the i7-2600k at different speed settings.

What, do you think an i7-2600 bottlenecks a gtx 590 at 1080p? You would need something stronger and a much higher resolution. If there's already such a big difference in frame rates between high-end cpus and there's no bottleneck, it means only one thing: the game is very cpu dependent. So yeah, an i7-2600 instead of an x2 athlon will make a huge difference in this game when you use an 8800 gt at 1050p on console settings.

Besides, before you call me a console peasant, maybe you should have a look at my sig.
#189 Posted by evildead6789 (7543 posts) -

[QUOTE="evildead6789"]

[QUOTE="04dcarraher"]

:lol: you need to do some real hard looking and researching there....

The fact that the 9800GT gets 39 fps average at 1680x1050 is funny, since you clearly don't know that the 9800GT is an 8800GT :lol:

again, that's 2x the resolution, which means the 8800GT is at least 2x faster than the 360 gpu. The 8800GT or 9800GT is between 2 and 3 times faster than the 360 gpu, both in gflops performance and in games: running the same resolutions with higher settings it gets 2-3x the fps, or it runs much higher resolutions and settings with equal or better fps.

RyviusARC

Maybe so, but that's twice the VRAM, which makes higher framerates at higher resolutions possible. I said from the beginning that I was comparing the 8800 GT 256 MB, since that is the same amount of VRAM the HD twins have. The 8800 GT 512 MB (and the 9800 GT, which only comes in 512 MB) are way faster at those resolutions.

The 5670 (even with 512 MB) comes a lot closer in performance to the 8800 GT 256 MB (but it's still faster). Yeah, you'd better do some hard looking and researching yourself, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an Xbox 360 CPU, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less CPU-dependent the game is, so combining an 8800 GT (it doesn't even matter what amount of memory) with an Athlon X2 will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

Dude, the link you posted was on extreme settings... of course the CPU would get bogged down at those settings.

As I've said before, I was able to get between 40-50 fps on an 8800GT at 1680x1050 at CONSOLE SETTINGS.

Also it doesn't matter if I have the 512mb version. The extra RAM doesn't make my card have more processing power it just enables it to store higher res assets.

And the benchmarks you used are outdated.

There have been many driver updates and game patches to improve performance even more.

So once again I was correct when I said the 8800gt is around 3x the power of consoles if not more if you OC.

It's the GPU that gets bogged down, smartass; the shading is not done on the CPU, nor the resolution. I don't care what you said, I posted a benchmark that says otherwise, with an i7-2600 instead of your Athlon X2. 512 MB instead of 256 MB makes a huge difference at 1050p on those cards, and patches and updates won't give the game that much extra performance; these are cards from 2007. Stop saying the 8800 GT is three times the power of a console; everyone knows you're a dumbass who can't admit he's wrong.
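[Editor's note] The raw-throughput side of the "2x vs 3x" dispute can be checked against commonly quoted peak specs: the 8800 GT is usually rated around 504 GFLOPS and the Xbox 360's Xenos GPU around 240 GFLOPS. Treat these as back-of-the-envelope figures; peak FLOPS ignore memory bandwidth, VRAM, and the CPU entirely:

```python
# Back-of-the-envelope peak shader throughput from commonly quoted specs.
# 8800 GT: 112 stream processors at 1.5 GHz, 3 ops/clock (MADD + MUL).
gflops_8800gt = 112 * 1.5 * 3          # = 504 GFLOPS
# Xbox 360 Xenos: 48 ALUs at 0.5 GHz, vec4 + scalar MADD = 10 ops/clock.
gflops_xenos = 48 * 0.5 * 10           # = 240 GFLOPS

print(gflops_8800gt / gflops_xenos)    # ~2.1x on paper
```

On paper that ratio is closer to 2x than 3x, which is why the "2 to 3 times" range keeps coming up in the thread; the real-game gap depends heavily on settings, drivers, VRAM, and the rest of the system.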
#190 Posted by ronvalencia (15109 posts) -

[QUOTE="04dcarraher"]

[QUOTE="evildead6789"]evildead6789

Huh??? The 360 CPU isn't strong or fast, get over it; the 360 CPU can only do 19,000 MIPS, while an Athlon X2 does anywhere from 20,000 to 30,000 MIPS. Are you serious.... the console CPUs are weak. Examples are all over, but I guess you're too blind to see them. The X360 is not even close to the PS3's Cell; the 360 CPU has three symmetrical cores, while the PS3 has the Cell, which has a PPE and 7 SPE cores. lol, blaming bad coding on consoles, that's fresh.... And you think your example is proof that the 360 handles games better :lol:. An 8800GT can do 30-55 fps at 1680x1050 after the patches; also, even the 6670 on version 1.0 of the game is able to do 21 fps at nearly 3x the resolution of the 360. Let's just ignore all the lower settings and resolutions used on the 360 to prove your flawed points.

Well, very smart of you to leave all your moronic statements behind by just quoting my name. And there you go, throwing another stat like MIPS in my face, which you don't even know the meaning of.

Same with your comment about the X360 and PS3. I never said the X360 was as strong as the PS3 in the CPU department; I was talking about the cores. Everyone knows the PS3 has 8 cores and the X360 only 3. Don't act like you just invented hot water.

You're talking about PPE and SPE, and again you don't even know what they mean.

quote from wiki ' Xenon is a CPU used in the Xbox 360 game console. The processor is based on IBM PowerPC instruction set architecture, consisting of three independent processor cores on a single die. These cores are slightly modified versions of the PPE in the Cell processor used on the PlayStation 3'

And an SPE is simply a part of the PPE:

quote from wiki ' Each Power Processing Element (ppe) contains 8 APUs, which are now referred to as SPEs on the current Broadband Engine chip'

I never said the X360 handled games better than an 8800 GT; you said it was three times stronger, and I proved it was nowhere near that.

An SPU is not an APU, i.e. it's missing a GPU's fixed-function units. In IBM's own words, the SPU is a DSP-type solution.
#191 Posted by ronvalencia (15109 posts) -
Here's the real truth, at least at the moment; pc gamer fanboys are pissed that upgrade season is coming up.Heirren
Not with PCs with AMD Tahiti class GCNs.
#192 Posted by ronvalencia (15109 posts) -

The truth is that a PC needs a graphics card of at least a few hundred dollars to play even the simplest of games. It is a slow machine, fails to work properly for a week long, and the moment you connect to internet, your PC halves its speed, and last but not least, it is damn ugly!!!

zekere
Not with AMD based APUs. AMD Kaveri APU has a GCN with 8 CUs and GDDR5 memory (refer JEDEC GDDR5M SODIMM specs).
#193 Posted by RyviusARC (4421 posts) -

[QUOTE="RyviusARC"]

[QUOTE="evildead6789"] Maybe so, but that's twice the VRAM, which makes higher framerates at higher resolutions possible. I said from the beginning that I was comparing the 8800 GT 256 MB, since that is the same amount of VRAM the HD twins have. The 8800 GT 512 MB (and the 9800 GT, which only comes in 512 MB) are way faster at those resolutions.

The 5670 (even with 512 MB) comes a lot closer in performance to the 8800 GT 256 MB (but it's still faster). Yeah, you'd better do some hard looking and researching yourself, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an Xbox 360 CPU, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less CPU-dependent the game is, so combining an 8800 GT (it doesn't even matter what amount of memory) with an Athlon X2 will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

evildead6789

 

Dude, the link you posted was on extreme settings... of course the CPU would get bogged down at those settings.

As I've said before, I was able to get between 40-50 fps on an 8800GT at 1680x1050 at CONSOLE SETTINGS.

Also it doesn't matter if I have the 512mb version. The extra RAM doesn't make my card have more processing power it just enables it to store higher res assets.

And the benchmarks you used are outdated.

There have been many driver updates and game patches to improve performance even more.

So once again I was correct when I said the 8800gt is around 3x the power of consoles if not more if you OC.

It's the GPU that gets bogged down, smartass; the shading is not done on the CPU, nor the resolution. I don't care what you said, I posted a benchmark that says otherwise, with an i7-2600 instead of your Athlon X2. 512 MB instead of 256 MB makes a huge difference at 1050p on those cards, and patches and updates won't give the game that much extra performance; these are cards from 2007. Stop saying the 8800 GT is three times the power of a console; everyone knows you're a dumbass who can't admit he's wrong.

 

Want some cheese with that whine?

Even Nvidia disagrees with you and agrees with me that the 8800GT is around 3x the performance of consoles.

I know that I am right because I have an 8800gt and have played the game on it.

And yes extreme settings do hurt the CPU compared to low settings.

But now you are just trying to derail the argument with bringing up the CPU.

The whole argument was about the 8800gt's performance.

If you look on YouTube you will see many people agreeing with me on the performance.

This guy was averaging 35-40 fps at 1600x1200, which is a higher resolution than the 1680x1050 I played at.

http://www.youtube.com/watch?v=5ksL2pwecHg

If he lowered it to 1680x1050 then it would be getting over 40fps.

This guy played at 1440x900 and was getting 55-90fps.

http://www.youtube.com/watch?v=gt1TR9unhnU

If he bumped it up to my resolution it would also play around 40-50fps.

This guy was using a weaker Athlon x2 2.4ghz CPU at 1080P and was getting 30-40fps on his 8800gt.

http://www.youtube.com/watch?v=cRsKJigMjfU

 

 

I could keep using examples.

#194 Posted by jun_aka_pekto (15942 posts) -

You didn't get around 40-50 fps with your 8800 GT and that Athlon at that resolution; the 5670, which is better and has double the memory, gets 32 fps. They clearly mention in the article that high settings are only slightly better than the console versions, and this is with an i7-2600.

evildead6789

That's because High in Crysis 2 is actually the lowest setting.

The detail settings for Crysis 2 from highest to lowest detail:

Ultra

Extreme

Very High

High

#195 Posted by 04dcarraher (19328 posts) -

[QUOTE="04dcarraher"]

[QUOTE="evildead6789"] Maybe so, but that's twice the VRAM, which makes higher framerates at higher resolutions possible. I said from the beginning that I was comparing the 8800 GT 256 MB, since that is the same amount of VRAM the HD twins have. The 8800 GT 512 MB (and the 9800 GT, which only comes in 512 MB) are way faster at those resolutions.

The 5670 (even with 512 MB) comes a lot closer in performance to the 8800 GT 256 MB (but it's still faster). Yeah, you'd better do some hard looking and researching yourself, smartass.

but it doesn't even matter; this was tested on an i7-2600. If you used something similar to an Xbox 360 CPU, your framerates would drop by 50 percent. Crysis 2 uses up to four cores, and the higher you go in resolution the less CPU-dependent the game is, so combining an 8800 GT (it doesn't even matter what amount of memory) with an Athlon X2 will render it unplayable at quality console settings.

Since you're going to whine about it again, here's the link

So yeah, do some more hard looking and researching, fool.

evildead6789

:lol: so much fail

That link of yours is with a GTX 590

so again do more research

So what, Christ, your stupidity has no limits.

They benchmarked it at extreme settings; the difference between the higher-end CPUs is also big, and there's no bottleneck there.

So what do you think, that an Athlon X2 with a 9800 GT will give the same framerates as an i7-2600 with a 9800 GT on console settings? There's already a difference between i7-2600s at different clock speeds (which they also tested in that article). They concluded from the benchmarks (as anyone who knows something about PC hardware would) that Crysis 2 is very CPU-dependent, and it also makes use of four cores.

I suggest you stop making a fool of yourself :lol:

You're in denial....
#196 Posted by evildead6789 (7543 posts) -

[QUOTE="evildead6789"][QUOTE="RyviusARC"]

 

Dude, the link you posted was on extreme settings... of course the CPU would get bogged down at those settings.

As I've said before, I was able to get between 40-50 fps on an 8800GT at 1680x1050 at CONSOLE SETTINGS.

Also it doesn't matter if I have the 512mb version. The extra RAM doesn't make my card have more processing power it just enables it to store higher res assets.

And the benchmarks you used are outdated.

There have been many driver updates and game patches to improve performance even more.

So once again I was correct when I said the 8800gt is around 3x the power of consoles if not more if you OC.

RyviusARC

It's the GPU that gets bogged down, smartass; the shading is not done on the CPU, nor the resolution. I don't care what you said, I posted a benchmark that says otherwise, with an i7-2600 instead of your Athlon X2. 512 MB instead of 256 MB makes a huge difference at 1050p on those cards, and patches and updates won't give the game that much extra performance; these are cards from 2007. Stop saying the 8800 GT is three times the power of a console; everyone knows you're a dumbass who can't admit he's wrong.

 

Want some cheese with that whine?

Even Nvidia disagrees with you and agrees with me that the 8800GT is around 3x the performance of consoles.

I know that I am right because I have an 8800gt and have played the game on it.

And yes extreme settings do hurt the CPU compared to low settings.

But now you are just trying to derail the argument with bringing up the CPU.

The whole argument was about the 8800gt's performance.

If you look on YouTube you will see many people agreeing with me on the performance.

This guy was averaging 35-40 fps at 1600x1200, which is a higher resolution than the 1680x1050 I played at.

http://www.youtube.com/watch?v=5ksL2pwecHg

If he lowered it to 1680x1050 then it would be getting over 40fps.

This guy played at 1440x900 and was getting 55-90fps.

http://www.youtube.com/watch?v=gt1TR9unhnU

If he bumped it up to my resolution it would also play around 40-50fps.

This guy was using a weaker Athlon x2 2.4ghz CPU at 1080P and was getting 30-40fps on his 8800gt.

http://www.youtube.com/watch?v=cRsKJigMjfU

 

 

I could keep using examples.

Sure, and YouTube is THE source when it comes to benchmarks.

Besides, the first video I opened was a quad core, the second a hexa core, and the third was with a 512 MB version of the 8800 GT.

I haven't seen any fps meters in any of the videos.

I didn't derail any argument; I told you the second I posted that benchmark that it was with a much stronger CPU, the i7-2600. Which means the 8800 GT's performance (in this case the 9800 GT's) would go down in Crysis 2 if you used a CPU like the X360's. There's a separate page in that article that talks about CPU performance, and they clearly say the game is very CPU-dependent. You can clearly see it in the benchmarks with different CPUs.

Higher resolutions and higher quality settings are always more taxing on the GPU than on the CPU. So the lower you go in resolution and detail, the more the load shifts from the GPU to the CPU; every hardware expert will tell you that. So the CPU benchmarks do show that the game is CPU-dependent, which is also why they say so in their conclusion.

Besides, you seem to forget that 512 MB makes a lot of difference when playing at higher resolutions. That doesn't make the 8800 GT any faster; the 512 MB just means the memory doesn't bottleneck the card at higher resolutions. If you gave the X360 double the VRAM, it would be able to handle higher resolutions too.

You can try all day; no one with at least half a brain believes your "8800 GT is three times the power of a console" statement.

A GTX 460 or HD 5770, that's three times the power of a console, but those are a whole different animal from an 8800 GT.
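[Editor's note] The VRAM point is easy to put rough numbers on: render targets grow with resolution, so a 256 MB card runs out of headroom sooner at 1680x1050 than a 512 MB one even though the GPU core is identical. A sketch, where the buffer count and 32-bit-per-pixel assumption are illustrative simplifications (real engines vary):

```python
# Rough VRAM cost of full-screen buffers at a given resolution.
# Real engines also hold textures, geometry, and extra render targets,
# so these figures are a floor, not a full accounting.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Cost of `buffers` full-screen 32-bit surfaces (e.g. color x2 + depth)."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(framebuffer_mb(1280, 720))    # console-style 720p: ~10.5 MB
print(framebuffer_mb(1680, 1050))   # ~20 MB, roughly double the 720p cost
```

The framebuffer itself is small either way; the real pressure at higher resolutions comes from higher-resolution textures and intermediate buffers, which is where 256 MB vs 512 MB starts to decide framerates.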

#197 Posted by evildead6789 (7543 posts) -
[QUOTE="evildead6789"]

[QUOTE="04dcarraher"]

:lol: so much fail

That link of your is with a GTX 590

so again do more research

04dcarraher

So what, Christ, your stupidity has no limits.

They benchmarked it at extreme settings; the difference between the higher-end CPUs is also big, and there's no bottleneck there.

So what do you think, that an Athlon X2 with a 9800 GT will give the same framerates as an i7-2600 with a 9800 GT on console settings? There's already a difference between i7-2600s at different clock speeds (which they also tested in that article). They concluded from the benchmarks (as anyone who knows something about PC hardware would) that Crysis 2 is very CPU-dependent, and it also makes use of four cores.

I suggest you stop making a fool of yourself :lol:

You're in denial....

Sure, very nice arguments.
#199 Posted by Alienware_fan (1515 posts) -

[QUOTE="evildead6789"][QUOTE="RyviusARC"]

Dude, the link you posted was on extreme settings... of course the CPU would get bogged down at those settings.

As I've said before, I was able to get between 40-50 fps on an 8800GT at 1680x1050 at CONSOLE SETTINGS.

Also it doesn't matter if I have the 512mb version. The extra RAM doesn't make my card have more processing power it just enables it to store higher res assets.

And the benchmarks you used are outdated.

There have been many driver updates and game patches to improve performance even more.

So once again I was correct when I said the 8800gt is around 3x the power of consoles if not more if you OC.

RyviusARC

It's the GPU that gets bogged down, smartass; the shading is not done on the CPU, nor the resolution. I don't care what you said, I posted a benchmark that says otherwise, with an i7-2600 instead of your Athlon X2. 512 MB instead of 256 MB makes a huge difference at 1050p on those cards, and patches and updates won't give the game that much extra performance; these are cards from 2007. Stop saying the 8800 GT is three times the power of a console; everyone knows you're a dumbass who can't admit he's wrong.

Want some cheese with that whine?

Even Nvidia disagrees with you and agrees with me that the 8800GT is around 3x the performance of consoles.

I know that I am right because I have an 8800gt and have played the game on it.

And yes extreme settings do hurt the CPU compared to low settings.

But now you are just trying to derail the argument with bringing up the CPU.

The whole argument was about the 8800gt's performance.

If you look on YouTube you will see many people agreeing with me on the performance.

This guy was averaging 35-40 fps at 1600x1200, which is a higher resolution than the 1680x1050 I played at.

http://www.youtube.com/watch?v=5ksL2pwecHg

If he lowered it to 1680x1050 then it would be getting over 40fps.

This guy played at 1440x900 and was getting 55-90fps.

http://www.youtube.com/watch?v=gt1TR9unhnU

If he bumped it up to my resolution it would also play around 40-50fps.

This guy was using a weaker Athlon x2 2.4ghz CPU at 1080P and was getting 30-40fps on his 8800gt.

http://www.youtube.com/watch?v=cRsKJigMjfU

I could keep using examples.

So 20 fps while recording and otherwise 40 fps? That's bogus; you don't lose that much fps while recording videos, lol. Even on low-end PCs, recording video taxes the RAM and CPU, not the GPU.

#200 Posted by 04dcarraher (19328 posts) -

[QUOTE="RyviusARC"]

[QUOTE="evildead6789"]

So what, Christ, your stupidity has no limits.

They benchmarked it at extreme settings; the difference between the higher-end CPUs is also big, and there's no bottleneck there.

So what do you think, that an Athlon X2 with a 9800 GT will give the same framerates as an i7-2600 with a 9800 GT on console settings? There's already a difference between i7-2600s at different clock speeds (which they also tested in that article). They concluded from the benchmarks (as anyone who knows something about PC hardware would) that Crysis 2 is very CPU-dependent, and it also makes use of four cores.

I suggest you stop making a fool of yourself :lol:

Alienware_fan

You're in denial....

Sure, very nice arguments.

It is, since you have no clue....

The test was done on a GTX 590, a dual-GPU card, which means GPU performance depends heavily on the CPU. If the CPU cannot supply data to the GPUs fast enough, it bogs down the whole system's performance. With a single-GPU card, a dual core can provide much more than 30 fps.