Is console 'next-gen' again holding back PC gaming?


This topic is locked from further discussion.

#101 Posted by cainetao11 (17315 posts) -

I couldn't give a shit one way or the other.

I agree. Maybe gaming isn't for a person if, while actually PLAYING, they aren't invested in the game but are instead wondering how something is holding back gaming.

#102 Posted by Wasdie (49745 posts) -

I respectfully disagree. RAM alone is not the answer. The CPUs are just too damn weak in the next-gen consoles. Pretty damn sure the CPU in the PS4 didn't meet the Planetside 2 team's demands and they had to scale back.

And I highly disagree that DirectX actually has any bottleneck on PCs. You yourself have said that the biggest impact these APIs have is on the CPU and not on the GPU itself. I'm sure PC gamers have CPUs orders of magnitude more powerful than the ones in the 900pStation and 720pBox.

It is just the start, and developers have already begun cutting resolutions to the 720p-900p range. I can easily see either more resolution cuts if they go higher on graphics effects, or no pushing of the boundaries at all, which ultimately HOLDS BACK THE PC.

I would really be impressed if I saw any game on consoles matching Crysis 2 technically.

There have been no reports about the PS4 version of Planetside 2. None. SoE says they can run the PS4 version on the highest PC settings. That's the only thing we've heard. Anything else you've heard is just complete speculation.

DirectX has a CPU bottleneck that really throttles weaker CPUs when trying to push graphics. This is common knowledge. DirectX is all software driven, unlike something like Mantle, which is much lower level. The advantage is that DirectX can support a much larger variety of hardware, at the cost of CPU performance. That's how software APIs work.
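As a toy illustration of that per-draw-call cost (the microsecond figures and function name below are invented for the sketch, not measured, and this is a model rather than real API code):

```python
# Toy model of API submission overhead: a thick software layer pays a
# fixed CPU cost per draw call; a thinner, lower-level API pays less.

def cpu_submit_ms(draw_calls, per_call_us):
    """CPU time spent submitting one frame's draw calls, in milliseconds."""
    return draw_calls * per_call_us / 1000.0

# Hypothetical per-call costs: 40 microseconds through a thick layer,
# 10 microseconds through a Mantle-style thin one.
thick = cpu_submit_ms(3000, 40)  # 120 ms of pure submission overhead
thin = cpu_submit_ms(3000, 10)   # 30 ms for the same scene

print(thick, thin)
```

The point of the sketch is only that the overhead scales with draw calls and lands entirely on the CPU, which is why weaker CPUs feel it first.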

The PS4 and Xbox One CPUs are more powerful than the CPU recommended for Crysis 2, so you're also wrong there.

Cutting back some resolution because the GPU has some bottlenecks early in a platform's life in no way holds back the PC. Every PC game on the market today has a required spec lower than the PS4/Xbox One. There is no logic in saying that the consoles are holding back PC games when PC games are already being built for computers with less power than the consoles.

The whole "the consoles are holding back the PC" line is complete hyperbole and garbage at this point. Last generation there was a significant problem with the consoles: they had so little RAM that even core mechanics had to be scaled back to fit within the tiny memory footprint of the console. They also ran IBM-based CPUs that weren't x86, which made porting more difficult, especially on the PS3. Now that's no longer the case.

PC games do not need massive CPUs to run well. Even a CPU like the one found in the PS4 would still be plenty to run all games on the PC today and for the foreseeable future. CPU requirements really aren't increasing like they used to. Most games today are still programmed to use only one core of a processor, some make good use of dual cores, and practically none are programmed with quad cores in mind. The CPU is not a bottleneck anymore.
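One hedged way to put numbers on that: Amdahl's law says that if only part of a frame's work can run in parallel, extra cores quickly stop mattering, which is why a modest CPU keeps up when most engines use one or two threads. The 50% parallel fraction below is an invented example, not a measurement of any engine:

```python
# Amdahl's-law sketch: speedup from N cores when only a fraction of the
# per-frame work is parallelizable.

def speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical engine where half the frame's work can be parallelized:
print(round(speedup(0.5, 2), 2))  # 1.33x with 2 cores
print(round(speedup(0.5, 8), 2))  # 1.78x with 8 cores -- not even double
```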

#103 Edited by kinectthedots (1674 posts) -

No,

it's just one of the objective drawbacks of PC gaming that people have to deal with if they choose that platform.

There is no dedicated central fanbase or parity of hardware on which all games will perform the same!!

The bold is the reason for your gripe. Don't get mad at consoles because they will have games tailored to that specific platform and the PC doesn't.

#104 Posted by Heil68 (43939 posts) -

Nope. SONY is pushing the industry forward as usual.

#105 Edited by Peredith (2310 posts) -

PC gamers with Dual Cores and low end GPUs, and developers with low budgets are what's holding back PC gaming. If it wasn't for consoles, the PC wouldn't get half the high budget games it gets.

#106 Edited by TheFadeForever (1732 posts) -

@Peredith:

Most games are still programmed around a single core, and PC games have always been scalable; that's why you can run a low-end card with most graphically intensive games.

You're going to talk about the PC needing consoles when publishers like Sega make a lot of their profit on solely PC-oriented titles? Also consider that "the console" isn't one single platform.

#107 Edited by bezza2011 (2408 posts) -

Nothing is holding back the PC except money. No matter how good you think your system is, if they pushed on with graphics and whatnot, your system would be obsolete within a week.

I mean, powering the next gen of PC gaming would mean two Titan GPUs just to get 4K gaming.

No one has that sort of money to push PC gaming to its limits. No other reason.

Just to be clear, PCs are actually their own worst enemy. There is such a vast difference from low end to high end that developers need to meet in the middle, and that's the PC's problem: there is too much variation in what everyone has, so it's hard to program for.

#108 Posted by o0squishy0o (2754 posts) -

The question should be: "Why is a platform with superior technology not progressing further than the consoles?" The fact that Nvidia has released a card for something like $150 that runs Titanfall better than the Xbox One shows that the cost of a given level of performance is going to drop rapidly. With technologies such as Mantle, which one person told me could be unlocked on the PS4 to further its potential power, the PC platform could become something truly special this gen. Perhaps not exactly crazy powerful components, but better relative performance for less cost, which is a winner for all.

Perhaps the steam box could be a dark horse!

#109 Posted by edwardecl (2239 posts) -

Short answer... Yes

Long answer... if there were no consoles, everyone who is a gamer would be buying games on PC, with a constant upgrade cycle.

#110 Edited by SambaLele (5333 posts) -

Actually, I think the bigger issue here is that cost is what's holding back devs.

Of course the X1 holds back the PS4, which in turn holds back the PC, because they have less potential... but I don't think devs have really managed to get 100% out of even 2010's PC hardware. They just won't optimize games to really harness that potential, because costs would soar...

Just look at how much hardware has evolved since 2005-06, and notice that, although the evolution in current games' visuals is quite obvious, it's still not as dramatic as the increase in computing power, considering what they were able to do with a lot, lot less.

#111 Posted by W1NGMAN- (9959 posts) -

I don't understand why people spend thousands of dollars on these super ultra powerful PCs when there are literally only a handful of devs who even bother trying to hit insane benchmarks for their games.

Crysis was the most demanding game on PC for nearly an entire gen, while developers working on consoles continually tried to one-up one another with each release.

That's what's so exciting about the arrival of next gen: we're entering Crysis-level visuals, but now tailored to totally new settings and worlds.

#112 Posted by Wasdie (49745 posts) -

@W1NGMAN- said:

I don't understand why people spend thousands of dollars on these super ultra powerful PCs when there are literally only a handful of devs who even bother trying to hit insane benchmarks for their games.

Crysis was the most demanding game on PC for nearly an entire gen, while developers working on consoles continually tried to one-up one another with each release.

That's what's so exciting about the arrival of next gen: we're entering Crysis-level visuals, but now tailored to totally new settings and worlds.

It's not just about running the highest-end games at their fullest; it's about being able to run all games perfectly and to push older games to render at even higher resolutions while maintaining high framerates.

There are always ways to push the hardware you buy. Some games, like Battlefield 4, are begging for you to run at 4K and supersample. The picture quality increases dramatically. Crysis 3 is another.
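The supersampling arithmetic is simple: a 4K render downsampled to a 1080p display averages a 2x2 block of rendered pixels into each displayed pixel, i.e. 2x2 supersampling:

```python
# 4K rendered for a 1080p screen = 4 rendered pixels per displayed pixel.
render = 3840 * 2160   # 4K (UHD) pixel count
display = 1920 * 1080  # 1080p pixel count

print(render // display)  # 4
```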

#113 Posted by Zaraxius (205 posts) -

As if most developers aim to cater to the PC gamers with the highest-performance PCs. Most aim for PC players whose hardware performs far below the new consoles.

#114 Posted by Gue1 (9758 posts) -

You need money to make a game, and 90% of the time games are made for money. The money is not with the people who have high-end rigs, because they are the minority. Game devs take a census of PC set-ups and then aim for a middle point that allows them a certain margin of profit based on a prediction. So even if consoles weren't a factor, PC gaming would still be held back by PC gaming itself, maybe even more so.

#115 Posted by zeeshanhaider (2454 posts) -

@zeeshanhaider said:

@Wasdie said:

No they aren't. Graphics can easily scale, and the Xbox One and PS4 have enough RAM and CPU power to play even the most demanding PC games on the market today and for the foreseeable future. The biggest limitation of the Xbox 360 and PS3 was the 512 MB of RAM they had and their aged CPUs.

CPUs today have far more power than games need. Unless a game is poorly coded, it's rare that the CPU is the limitation. The biggest bottleneck on the PC is actually DirectX and OpenGL, which have to do a lot more processing on the CPU than the consoles do to render graphics. This is slowly changing with Mantle and newer revisions of DirectX/OpenGL, but it's something the consoles have never had to deal with, so their weaker CPUs are not a major issue.

With 8 GB of total RAM a developer can do quite a bit. Actual game logic, level design, AI, pathfinding, and all of those things do not take that much RAM. Even the most graphically intense and CPU-demanding games on the PC rarely need more than 3 GB of system RAM. You can do quite a bit with that.

The thing with graphics is that it can be scaled much more easily. You can far more easily scale back the amount of pretty graphical effects and focus on just the necessities. The PS4 and Xbox One, even the Wii U, are more than capable of running games with the basic graphical features that actually impact gameplay (draw distances, lighting, effects). If you have the power, you can start making more rendering passes for more detail, increase the resolution so that more detail is rendered each frame, increase the AA to smooth out jaggies when not running uber-high resolutions, and so on. It's really easy to just tone that stuff down to make a game run on a weaker GPU. A lot of PC games today can run on a large variety of GPUs, with the low end often falling well below what is in the PS4/Xbox One.
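A rough way to see why resolution is the easiest dial to turn (a toy calculation with standard resolutions; real per-frame cost varies per effect, but per-pixel work scales roughly linearly with pixel count):

```python
# Relative per-frame pixel work at the resolutions this generation is
# actually shipping at.

def pixels(w, h):
    return w * h

p720 = pixels(1280, 720)
p900 = pixels(1600, 900)
p1080 = pixels(1920, 1080)

print(round(p1080 / p720, 2))  # 2.25 -- 1080p shades 2.25x the pixels of 720p
print(round(p1080 / p900, 2))  # 1.44
```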

So don't worry about it. Now that the devs have 8 GB of RAM and a familiar x86 processor, developers will not be bottlenecked by the consoles' hardware. They'll still have to design their games around a typical living-room setup and a controller, though.

I respectfully disagree. RAM alone is not the answer. The CPUs are just too damn weak in the next-gen consoles. Pretty damn sure the CPU in the PS4 didn't meet the Planetside 2 team's demands and they had to scale back.

And I highly disagree that DirectX actually has any bottleneck on PCs. You yourself have said that the biggest impact these APIs have is on the CPU and not on the GPU itself. I'm sure PC gamers have CPUs orders of magnitude more powerful than the ones in the 900pStation and 720pBox.

It is just the start, and developers have already begun cutting resolutions to the 720p-900p range. I can easily see either more resolution cuts if they go higher on graphics effects, or no pushing of the boundaries at all, which ultimately HOLDS BACK THE PC.

I would really be impressed if I saw any game on consoles matching Crysis 2 technically.

You can disagree with him/her all you like. It doesn't change the facts. RAM constraints on the last consoles were part of what held back PC gaming.

The overhead that the PC APIs (Direct3D/OpenGL) add is also holding back PC game development. Now, with Mantle, we might see some changes.

Lastly, as I mentioned in an earlier post, ignored by all, game devs have to take into account that the end user might still be on a 32-bit OS, limiting their use of RAM to around 2-3 GB of system RAM and VRAM combined. This is the reason why Skyrim, though it looks good, looks like shit compared to what it could have looked like if it hadn't shipped locked to 2 GB.
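For what it's worth, the 32-bit limit described above is simple address-space arithmetic (the 2 GiB default and 3 GiB large-address-aware figures are the standard 32-bit Windows process limits, and mapped VRAM comes out of the same budget):

```python
# A 32-bit process can address at most 2**32 bytes; on 32-bit Windows
# only half of that is user space by default.
GiB = 2**30
address_space = 2**32 // GiB        # 4 GiB of total address space
default_user = address_space // 2   # 2 GiB usable by default

print(address_space, default_user)  # 4 2
```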

Well, if you call yours facts, at least bring something to back them up; otherwise I simply have to accept what you say, while I have already pointed out the bottlenecks in the 900pStation and 720pBox.

#116 Posted by zeeshanhaider (2454 posts) -

@Wasdie said:

@zeeshanhaider said:

I respectfully disagree. RAM alone is not the answer. The CPUs are just too damn weak in the next-gen consoles. Pretty damn sure the CPU in the PS4 didn't meet the Planetside 2 team's demands and they had to scale back.

And I highly disagree that DirectX actually has any bottleneck on PCs. You yourself have said that the biggest impact these APIs have is on the CPU and not on the GPU itself. I'm sure PC gamers have CPUs orders of magnitude more powerful than the ones in the 900pStation and 720pBox.

It is just the start, and developers have already begun cutting resolutions to the 720p-900p range. I can easily see either more resolution cuts if they go higher on graphics effects, or no pushing of the boundaries at all, which ultimately HOLDS BACK THE PC.

I would really be impressed if I saw any game on consoles matching Crysis 2 technically.

There have been no reports about the PS4 version of Planetside 2. None. SoE says they can run the PS4 version on the highest PC settings. That's the only thing we've heard. Anything else you've heard is just complete speculation.

Pretty damn sure about the Planetside 2 debacle. Read the post right here on GameSpot, straight from the mouths of the Planetside 2 devs.

DirectX has a CPU bottleneck that really throttles weaker CPUs when trying to push graphics. This is common knowledge. DirectX is all software driven, unlike something like Mantle, which is much lower level. The advantage is that DirectX can support a much larger variety of hardware, at the cost of CPU performance. That's how software APIs work.

Again, 'WEAK CPUs'. Not what you would find in many gaming PCs.

The PS4 and Xbox One CPUs are more powerful than the CPU recommended for Crysis 2, so you're also wrong there.

Don't tell me that a tablet CPU is better than a Core 2 Duo E6750 at 2.66 GHz, and as far as I know CryEngine 3 even back then was optimized for 8 cores.

Cutting back some resolution because the GPU has some bottlenecks early in a platform's life in no way holds back the PC. Every PC game on the market today has a required spec lower than the PS4/Xbox One. There is no logic in saying that the consoles are holding back PC games when PC games are already being built for computers with less power than the consoles.

The lower-spec PCs exist because of cross-generation development. If they ditch last gen, I can guarantee you the lower-spec PCs will disappear from the requirements. By the way, which game has lower specs than the PS4/X1? Serious question. I don't check system requirements when getting games.

PC games do not need massive CPUs to run well. Even a CPU like the one found in the PS4 would still be plenty to run all games on the PC today and for the foreseeable future. CPU requirements really aren't increasing like they used to. Most games today are still programmed to use only one core of a processor, some make good use of dual cores, and practically none are programmed with quad cores in mind. The CPU is not a bottleneck anymore.

Didn't you just contradict yourself over the high-level graphics API issue?

#117 Edited by Wasdie (49745 posts) -

@zeeshanhaider said:

@Wasdie said:

There have been no reports about the PS4 version of Planetside 2. None. SoE says they can run the PS4 version on the highest PC settings. That's the only thing we've heard. Anything else you've heard is just complete speculation.

Pretty damn sure about the Planetside 2 debacle. Read the post right here on GameSpot, straight from the mouths of the Planetside 2 devs.

DirectX has a CPU bottleneck that really throttles weaker CPUs when trying to push graphics. This is common knowledge. DirectX is all software driven, unlike something like Mantle, which is much lower level. The advantage is that DirectX can support a much larger variety of hardware, at the cost of CPU performance. That's how software APIs work.

Again, 'WEAK CPUs'. Not what you would find in many gaming PCs.

The PS4 and Xbox One CPUs are more powerful than the CPU recommended for Crysis 2, so you're also wrong there.

Don't tell me that a tablet CPU is better than a Core 2 Duo E6750 at 2.66 GHz, and as far as I know CryEngine 3 even back then was optimized for 8 cores.

Cutting back some resolution because the GPU has some bottlenecks early in a platform's life in no way holds back the PC. Every PC game on the market today has a required spec lower than the PS4/Xbox One. There is no logic in saying that the consoles are holding back PC games when PC games are already being built for computers with less power than the consoles.

The lower-spec PCs exist because of cross-generation development. If they ditch last gen, I can guarantee you the lower-spec PCs will disappear from the requirements. By the way, which game has lower specs than the PS4/X1? Serious question. I don't check system requirements when getting games.

PC games do not need massive CPUs to run well. Even a CPU like the one found in the PS4 would still be plenty to run all games on the PC today and for the foreseeable future. CPU requirements really aren't increasing like they used to. Most games today are still programmed to use only one core of a processor, some make good use of dual cores, and practically none are programmed with quad cores in mind. The CPU is not a bottleneck anymore.

Didn't you just contradict yourself over the high-level graphics API issue?

Nowhere on GameSpot that I can find has a Planetside 2 dev said anything like that. It's true that the individual cores of the PS4 aren't as powerful as the cores of even mid-range PCs, but that's why there are 8 of them. It's designed for low power usage and low heat. The PS4's processor is more than powerful enough to handle Planetside 2 once they split their processes up.

According to these benchmarks, the Core 2 Duo model you mentioned is weaker than the A4-5000, the CPU that the PS4 and Xbox One CPUs are based on. The A4-5000 is actually a 4-core 1.5 GHz CPU, while the PS4 and Xbox One have 1.7-1.8 GHz 8-core models. So technically the PS4 and Xbox One's CPUs are going to be more powerful.

Intel Core 2 Duo E6750

AMD A4-5000 "Jaguar"
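A very rough sanity check of that comparison, counting only cores x clock (this ignores per-core IPC, which favors the Core 2, so treat it as an upper bound on the Jaguar-class chip's advantage; the 1.6 GHz console clock is an approximation):

```python
# Aggregate core-GHz: crude, IPC-blind comparison of the two chips.
e6750 = 2 * 2.66   # Core 2 Duo E6750: 2 cores at 2.66 GHz
jaguar = 8 * 1.6   # console-class 8-core Jaguar at roughly 1.6 GHz

print(round(jaguar / e6750, 2))  # ~2.41x in raw core-GHz
```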

The specs for most games today require only a Core 2 Duo processor and 2 GB of RAM, with a few modern games requiring older i5s. Games are pretty scalable. ArmA 3 only requires a Core 2 Duo, and Rome 2 Total War only requires a Core 2 Duo. Those are the most CPU-intensive games I can think of outside of Planetside 2, which also only requires a Core 2 Duo.

CPUs haven't been a huge factor in game development since the Core 2 Duo series came out, really. Since then, CPUs have been getting more and more powerful while the processing requirements of games haven't been increasing that much. If anything, the processing requirements have actually dropped a bit as work becomes more parallel.

You have to remember that the PS4 and Xbox One are pushing high parallelism with their 8-core APUs instead of single-core performance. Hopefully this pushes more devs toward utilizing more cores of a CPU. My i7 3770k rarely has more than 2 cores used by video games, which is a sad waste of its power. Parallel programming can alleviate a lot of bottlenecks. Also, the PS4/Xbox One have APUs, which give them better performance for things like physics and lighting calculations. As more processes are done in parallel, the need for super-powerful single cores decreases even further.
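The kind of decomposition described above can be sketched in a few lines (illustrative only: the function and the numbers are invented, and in CPython threads won't actually parallelize pure-Python math, but the splitting pattern is the same one an engine would use in native code):

```python
# Decomposition sketch: farm per-entity updates out to a worker pool.
from concurrent.futures import ThreadPoolExecutor

def update_entity(pos_vel):
    pos, vel = pos_vel
    return pos + vel * (1.0 / 60.0)  # one 60 Hz physics step

entities = [(float(i), 1.0) for i in range(1000)]
with ThreadPoolExecutor(max_workers=8) as pool:
    new_positions = list(pool.map(update_entity, entities))

print(len(new_positions))  # 1000 updated positions, order preserved
```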

So yes, the PS4/Xbox One processors are more than capable of running modern games and games in the foreseeable future. They aren't close to high-end CPUs, but high-end CPUs go woefully underutilized during games because the CPU just isn't the limiting factor.

When I was talking about the CPU requirements of DirectX, it's not really that the API slows the CPU down so much as it holds the GPU up, waiting for the CPU to communicate with the RAM, process the data, and send it to the GPU for processing. It's an unneeded step. If you have a weaker CPU, the wait on the GPU is even longer, which results in even lower FPS. It's really a bottleneck of the GPU, not the CPU; I didn't write that properly earlier. DirectX is almost always placed on its own thread, so it doesn't actually slow down the game's processing.

This is why DICE is so eager to get Mantle out. On high-end rigs Mantle only gives you about a 3-4% FPS boost, but on weaker machines it gives an FPS boost in the 40% range. This is because the GPU spends far less time waiting for data to process. The consoles both already have these low-level graphics APIs, so software bottlenecks don't really exist there. That's always been an advantage of consoles, allowing devs to get a lot more out of weaker hardware.
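That asymmetry falls out of a simple frame-time model where frame rate is gated by the slower of CPU submission and GPU rendering (the millisecond figures below are invented to mirror the roughly 40% vs 3-4% numbers above, not taken from any benchmark):

```python
# Frame rate is limited by whichever side finishes last each frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Weak, CPU-bound machine: shaving 8 ms of API overhead is a 40% FPS win.
weak_gain = fps(20, 16) / fps(28, 16) - 1    # 0.40

# Fast machine, barely CPU-bound: the same saving barely registers.
fast_gain = fps(5.5, 13) / fps(13.5, 13) - 1  # ~0.038

print(round(weak_gain, 2), round(fast_gain, 3))
```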

PC developers will always try to keep their games running on the largest subset of hardware. Look at the Steam hardware survey: the majority of gamers are rocking PCs that are 2-3 years old, if not older, and many of those PCs weren't high end when they were new. Generally devs aren't going to abandon those gamers. They will in the case of Star Citizen, because gamers with higher-end PCs funded the project, but that's the exception, not the rule.

The PS4 and Xbox One will hit their GPU limitations much sooner. In a lot of ways they already have. However, there are still some other bottlenecks with the consoles that haven't been overcome. It's clear that Titanfall on the Xbox One is really suffering from that eSRAM; less powerful GPUs can actually perform better than the Xbox One does.

#118 Edited by KungfuKitten (20963 posts) -

The gap is becoming so big that I expect more devs to develop on PC first this generation.

#119 Posted by slateman_basic (3960 posts) -

The gap is becoming so big that I expect more devs to develop on PC first this generation.

The only question that matters is: Will it be profitable?

#120 Posted by Wasdie (49745 posts) -

@KungfuKitten said:

The gap is becoming so big that I expect more devs to develop on PC first this generation.

The only question that matters is: Will it be profitable?

Not if they don't make console ports. PC gaming cannot carry those massive AAA-budget titles by itself; the audience is too small, and there is too much piracy of the more expensive games to make it viable.

It doesn't matter. Most games this gen will be built on the PC first and ported to the consoles. The consoles have enough ram and CPU power to fit any sort of gameplay design that the developer wants. They also have very robust and sturdy networks capable of any level of network connectivity that is needed by the game. Devs are no longer limited by RAM so they can greatly increase the amount of unique art assets in the game worlds.

The only thing that will really need to be scaled down is the final graphics rendering. While the Xbox One and PS4 have capable GPUs, they aren't anything special. Not only are they not the most powerful things out there now, they are static. PCs will continue to widen the gap over time.

This is a huge difference from last gen, where the consoles' limited amount of RAM was a massive bottleneck for game developers. This is no longer the case. You can fit an extraordinary amount of data within 2-3 GB of RAM. Stuff like tiled resources and other advancements at the API level allow games to utilize their RAM even more efficiently than before, which reduces a game's overall memory footprint and increases performance.
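As a small, hedged example of the arithmetic (uncompressed RGBA and invented dimensions, chosen only to show the scale): a single 4096x4096 texture with a full mip chain is already around 85 MB, which is why partial residency tricks like tiled resources stretch a fixed RAM budget so far:

```python
# Rough texture-memory arithmetic for an uncompressed RGBA texture.
def texture_mb(w, h, bytes_per_px=4, mips=True):
    base = w * h * bytes_per_px
    # A full mip chain adds about 1/3 on top of the base level.
    return (base * 4 // 3 if mips else base) / 2**20

print(round(texture_mb(4096, 4096), 1))  # ~85.3 MB
```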

#121 Edited by slateman_basic (3960 posts) -

@Wasdie said:

@slateman_basic said:

@KungfuKitten said:

The gap is becoming so big that I expect more devs to develop on PC first this generation.

The only question that matters is: Will it be profitable?

Not if they don't make console ports. PC gaming cannot carry those massive AAA-budget titles by itself; the audience is too small, and there is too much piracy of the more expensive games to make it viable.

It doesn't matter. Most games this gen will be built on the PC first and ported to the consoles. The consoles have enough ram and CPU power to fit any sort of gameplay design that the developer wants. They also have very robust and sturdy networks capable of any level of network connectivity that is needed by the game. Devs are no longer limited by RAM so they can greatly increase the amount of unique art assets in the game worlds.

The only thing that will really need to be scaled down is the final graphics rendering. While the Xbox One and PS4 have capable GPUs, they aren't anything special. Not only are they not the most powerful things out there now, they are static. PCs will continue to widen the gap over time.

This is a huge difference from last gen, where the consoles' limited amount of RAM was a massive bottleneck for game developers. This is no longer the case. You can fit an extraordinary amount of data within 2-3 GB of RAM. Stuff like tiled resources and other advancements at the API level allow games to utilize their RAM even more efficiently than before, which reduces a game's overall memory footprint and increases performance.

It seems to me that developers are doing the opposite. They're producing games for consoles and then bringing a PC version later. I'm guessing it's because it's easier to develop for one standard hardware configuration and then worry about optimizing for various configurations later.

And there are some pretty noticeable differences between console gaming and PC gaming. Trying to cater to both would end up with a product that isn't ideal on either one.

#122 Posted by remiks00 (1799 posts) -

@Wasdie said:


The PS4 and Xbox One will hit their GPU limitations much sooner. In a lot of ways they already are. However there are still some other bottlenecks with the consoles that haven't been overcome. It's clear that Titanfall on the Xbox One is really suffering from that ESRAM. Less powerful GPUs can actually perform better than the Xbox One can.

Excellent posts as usual, Wasdie. You've cleared up the CPU side of things for me. But what about the GPUs? You're saying that the current consoles can run games like Planetside 2 on the highest PC settings? So how does that compare to my OC GTX 770? It doesn't make sense for the consoles to be able to match the same settings as the card at a cheaper price. Just trying to wrap my head around this info.

#123 Posted by Wasdie (49745 posts) -

@remiks00 said:

Excellent posts as usual, Wasdie. You've cleared up the CPU side of things for me. But what about the GPUs? You're saying that the current consoles can run games like Planetside 2 on the highest PC settings? So how does that compare to my OC GTX 770? It doesn't make sense for the consoles to be able to match the same settings as the card at a cheaper price. Just trying to wrap my head around this info.

They are significantly weaker than a GTX 770. A single GTX 570 can max out Planetside 2; it's not GPU bottlenecked at all. Planetside 2 is mostly CPU limited because it's very poorly multithreaded. They've drastically improved that lately, but it's still far from where it needs to be. It's why the PS4 version is taking so much time: the engine needs to be gutted yet again and rebuilt to take advantage of 8 cores. They also have to go from DirectX 9 to Sony's own low-level API, not really something done overnight.

I'm not trying to make the GPUs in the PS4/Xbox One out to be more powerful than they are. They are about as good as mid-high end PC GPUs that came out in 2011/2012. They aren't super powerful. However, there are really no games today that truly push even those old GPUs to their limit, and the ones that do usually are very sloppy in their code. When pushing DirectX or OpenGL really hard, you end up waiting on the CPU a lot because they are still software bound.

Also, DirectX/OpenGL are tuned for many different configurations; it's very rare that a game even comes close to properly utilizing the GPU. Even if a GPU is running at 100%, it's rarely using 100% of its power efficiently. The consoles have one significant advantage the PC will never have: standard hardware. You can get away with building a game with tricks that will only work on one piece of hardware. You don't ever have to worry about whether your shader will work well on other GPUs; you just have to worry about the one. That's a massive advantage and allows devs to get far more performance out of hardware than they would with a big "one size fits all" API like OpenGL or DirectX.

However those GPUs do have their limitations. Even with this increase in efficiency they aren't extremely powerful cards by today's standards and their limits will be hit pretty quickly. That's fine because even within their limits they can produce some very beautiful visuals. Look at Ryse and The Order 1886, beautiful games and we've only begun to see what developers can do with the tech. The APIs are still young, engines are still being optimized around that 8 core APU and the new architectures, and the devs probably haven't really started seriously optimizing for the specific hardware yet.

I don't think we'll see the jump of graphics we saw last gen (going from Gears of War 1 to Gears of War 3 as an example), but graphics will look like games like Ryse and The Order pretty much consistently throughout the gen with some more detail and whatnot added. What will really change is how developers optimize the rendering so that they distribute the detail to the important parts.

#124 Edited by remiks00 (1799 posts) -

@Wasdie said:

@remiks00 said:

Excellent posts as usual, Wasdie. You've cleared up the CPU side of things for me. But what about the GPUs? You're saying that the current consoles can run games like Planetside 2 on the highest PC settings? So how does that compare to my OC GTX 770? It doesn't make sense for the consoles to be able to match the same settings as the card at a cheaper price. Just trying to wrap my head around this info.

They are significantly weaker than a GTX 770. A single GTX 570 can max out Planetside 2; it's not GPU bottlenecked at all. Planetside 2 is mostly CPU limited because it's very poorly multithreaded. They've drastically improved that lately, but it's still far from where it needs to be. It's why the PS4 version is taking so much time: the engine needs to be gutted yet again and rebuilt to take advantage of 8 cores. They also have to go from DirectX 9 to Sony's own low-level API, not really something done overnight.

I'm not trying to make the GPUs in the PS4/Xbox One out to be more powerful than they are. They are about as good as mid-high end PC GPUs that came out in 2011/2012. They aren't super powerful. However, there are really no games today that truly push even those old GPUs to their limit, and the ones that do usually are very sloppy in their code. When pushing DirectX or OpenGL really hard, you end up waiting on the CPU a lot because they are still software bound.

Also, DirectX/OpenGL are tuned for many different configurations; it's very rare that a game even comes close to properly utilizing the GPU. Even if a GPU is running at 100%, it's rarely using 100% of its power efficiently. The consoles have one significant advantage the PC will never have: standard hardware. You can get away with building a game with tricks that will only work on one piece of hardware. You don't ever have to worry about whether your shader will work well on other GPUs; you just have to worry about the one. That's a massive advantage and allows devs to get far more performance out of hardware than they would with a big "one size fits all" API like OpenGL or DirectX.

However those GPUs do have their limitations. Even with this increase in efficiency they aren't extremely powerful cards by today's standards and their limits will be hit pretty quickly. That's fine because even within their limits they can produce some very beautiful visuals. Look at Ryse and The Order 1886, beautiful games and we've only begun to see what developers can do with the tech. The APIs are still young, engines are still being optimized around that 8 core APU and the new architectures, and the devs probably haven't really started seriously optimizing for the specific hardware yet.

I don't think we'll see the jump of graphics we saw last gen (going from Gears of War 1 to Gears of War 3 as an example), but graphics will look like games like Ryse and The Order pretty much consistently throughout the gen with some more detail and whatnot added. What will really change is how developers optimize the rendering so that they distribute the detail to the important parts.

Okay, that really puts things into perspective. Thanks, Wasdie.

#125 Posted by trugs26 (5388 posts) -

PC gaming is holding back supercomputer gaming.

Human minds are holding back true graphical fidelity.

X is holding back Y.

In the end, it's whatever is popular is being supported. And it's popular for a reason. People want to use it. Regardless, there are limitations to everything in the world. You might dislike it, but in the end you have to just vote with your wallet, and hope that everyone agrees with you. But then someone else could like something less popular than what you want and is better (e.g supercomputer gaming), and the cycle repeats.

#126 Posted by 04dcarraher (19476 posts) -

@Wasdie said:

@endlessinfinity said:

@Wasdie:

how many rams do the console have for reserve

That question makes no sense. Do you mean how much do the consoles hold back for the OS? I'm not sure of the exact number, but people speculate anywhere from 1-3 GB. At the beginning of the generation the OS's memory footprint is always the largest. As time goes on they'll reduce the footprint. They can't increase it, because that would have a negative impact on all games built prior to the OS update.

It's not a huge deal. Even a game like Planetside 2 only requires roughly 3 GB of system RAM and about 1 GB of video RAM. ArmA 3 only takes a maximum of 3.5 GB of RAM. Even if the consoles only allowed 6 GB of RAM for the game, that would still give a game 1.5 GB of video RAM, and with modern APIs and their tools like tiled resources, that's more than enough for games to look great at 1080p.

Just to let you know, the X1 allocates 3 GB for the OS and features, and the PS4 allocates 3.5 GB. Both consoles also have two cores allocated to the OS and features, so the games only get six cores.
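Using those reservation figures, the arithmetic on what a game actually gets is simple. A sketch of the unified-memory budget (the reserve numbers are as claimed in this thread, not official specs):

```python
def game_budget_gb(total=8.0, os_reserve=3.0):
    # Unified memory: whatever the OS doesn't reserve is the game's to split
    # between simulation data and video assets however the developer likes.
    return total - os_reserve

xbox_one = game_budget_gb(os_reserve=3.0)  # 5.0 GB left for the game
ps4 = game_budget_gb(os_reserve=3.5)       # 4.5 GB left for the game

# A demanding PC game's footprint (~3 GB system + ~1 GB video) still fits.
fits = (3.0 + 1.0) <= min(xbox_one, ps4)
```

The point of the unified pool is that the system/video split above isn't fixed in hardware; a developer can slide it per game.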

#127 Posted by foxhound_fox (88053 posts) -

PC gaming marches to the beat of its own drum.

It's why it's only been growing in popularity and total market value, while console gaming has only dwindled in the same time frame.

#128 Posted by foxhound_fox (88053 posts) -

@Kjranu said:

I think it is. My PC is powered by an eight-core 4.2GHz FX 8350 and a 4.9TF R9 290 video card. The console CPUs are much less capable than mine, and their video cards are immeasurably weaker (XBO 1.2 and PS4 1.8 teraflops). Most developers out there are building games with the consoles' capability in mind, not what PCs are capable of. It makes me feel shafted, because these lame-arse "next-gen" games COULD look a lot better if they were developed with top-spec PCs in mind and then downscaled for the consoles.

Most of those games you're referring to wouldn't even have been made if not for consoles driving sales up. Consoles are like the bee: you need us, but we don't need you.

In 2008 there were over 263 million online PC gamers. At the time, that was over 100 million more people than all three major consoles had sold combined. The PC market has only grown since.

#129 Posted by Magescrew (522 posts) -

Maybe. I mean, next-gen consoles are weaker than my gaming laptop. But it would be better to say that people in general are holding back gaming. If everyone dropped $1-2K on a great gaming PC, you would see leaps and bounds in game development. But that will never happen. Many are still happy with their 360s and PS3s. Until there's huge demand for true next-gen stuff, you'll continue to see consoles catering to the middle of the market.

#130 Edited by 04dcarraher (19476 posts) -

@remiks00 said:

Excellent posts as usual, Wasdie. You've cleared up the CPU side of things for me. But what about the GPUs? You're saying that the current consoles can run games like Planetside 2 on the highest PC settings? So how does that compare to my OC GTX 770? It doesn't make sense for the consoles to be able to match the same settings as the card at a cheaper price. Just trying to wrap my head around this info.

On the CPU front, these consoles do have weak CPUs and have to rely on multithreading to get by, and in many respects that will limit what they can do, since a Jaguar core by itself is on par with 7-8 year old PC CPUs for processing power. Developers get six of the cores solely for the game, which is more than enough for gaming right now, and they can use GPU compute for other tasks. Now, about the GPUs: these console GPUs are weak by today's performance standards. The X1 is rocking a GPU that sits between a 7770 and a 7790, and the PS4's GPU is more or less slightly faster than a 7850, aka AMD's R7 265.

Your 770 flat out creams these consoles. It's no different from when the GeForce 8800s showed up in 2006 and outclassed the 360 and PS3, even to this day. These consoles' abilities are not going to develop the way the 360's and PS3's did from beginning to end, where devs had to learn exotic hardware and design the software around it over years. These new consoles pretty much took a page out of the PC hardware book and used the available parts that fit their budgets and limits. As time goes on, devs will be able to squeeze more out of them and come up with new techniques to save resources, but in the end they are still limited by the physical processing power of hardware that was already multiple times slower than PC GPUs from 2011/2012.

#131 Posted by Ripsaw1994 (104 posts) -
#132 Edited by incuensuocha (1509 posts) -

No. The argument only comes up in relation to ports of console games. They without question look better on PC, but they were designed originally as console games. How could consoles be holding back PCs when the games in question weren't originally intended to be PC games. PC gaming is ultimately defined by PC exclusives. In that regard we are able to see how the PC is capable of doing things that a console can't even come close to doing.

#133 Posted by remiks00 (1799 posts) -

@remiks00 said:

Excellent posts as usual, Wasdie. You've cleared up the CPU side of things for me. But what about the GPUs? You're saying that the current consoles can run games like Planetside 2 on the highest PC settings? So how does that compare to my OC GTX 770? It doesn't make sense for the consoles to be able to match the same settings as the card at a cheaper price. Just trying to wrap my head around this info.

On the CPU front, these consoles do have weak CPUs and have to rely on multithreading to get by, and in many respects that will limit what they can do, since a Jaguar core by itself is on par with 7-8 year old PC CPUs for processing power. Developers get six of the cores solely for the game, which is more than enough for gaming right now, and they can use GPU compute for other tasks. Now, about the GPUs: these console GPUs are weak by today's performance standards. The X1 is rocking a GPU that sits between a 7770 and a 7790, and the PS4's GPU is more or less slightly faster than a 7850, aka AMD's R7 265.

Your 770 flat out creams these consoles. It's no different from when the GeForce 8800s showed up in 2006 and outclassed the 360 and PS3, even to this day. These consoles' abilities are not going to develop the way the 360's and PS3's did from beginning to end, where devs had to learn exotic hardware and design the software around it over years. These new consoles pretty much took a page out of the PC hardware book and used the available parts that fit their budgets and limits. As time goes on, devs will be able to squeeze more out of them and come up with new techniques to save resources, but in the end they are still limited by the physical processing power of hardware that was already multiple times slower than PC GPUs from 2011/2012.

Well said, thanks for sharing that knowledge. I definitely learned some new things.

#134 Posted by thereal25 (401 posts) -

Just look at it this way. The current gen of consoles is a vast improvement over the last gen.

#135 Posted by Ballroompirate (22755 posts) -

To put it simple

No

#136 Edited by Butcer2 (63 posts) -

No, consoles are what keep PC gaming advancing. Without them, companies could not afford to make expensive next-gen games, because there's not nearly enough dough to be made on PC alone. Without them, all you would have is low-budget indies.

#137 Posted by Butcer2 (63 posts) -

Short answer... Yes

Long Answer... if there were no consoles everyone who is a gamer would be buying games on PC with a constant upgrade cycle.

Wrong. You would have to upgrade far less often if it were not for consoles, because games would advance far slower; the PC gaming market is chump change compared to the consoles.

#138 Posted by Crunchy_Nuts (2749 posts) -

Yes. Console manufacturers are deliberately forcing PC developers to make inferior games. It has absolutely nothing to do with the fact that developers want to cut corners to get the most acceptable product out on the market as quickly as they can.

#139 Posted by SEANMCAD (5464 posts) -

@KungfuKitten said:

The gap is becoming so big that I expect more devs to develop on PC first this generation.

The only question that matters is: Will it be profitable?

hugely so.

HOWEVER, the games need to cross over to tablets.

#140 Posted by gregbmil (2607 posts) -

*shakes head* Such a stupid concept and argument.

The only people holding back PC games are PC gamers who refuse to buy unlocked $1k i7s and set up Nvidia Titans in 4-way SLI, etc. Basically, PC gamers are holding PC gaming back by not buying $2k systems.

Also of note: the best-selling PC game is Minecraft, and other low-requirement games are more popular on the PC than games that need high-end systems.

I never thought about it that way. You are probably right.

#141 Posted by V3rciS (2213 posts) -

You "smart and wise" hermits do realize that even if consoles didn't exist and devs built games for PCs only, guess what specs they'd target? The average, right? Which is far behind the specs you posted, OP. Oops, that means your "super duper imba eight-core 4.2GHz FX 8350 and 4.9TF R9 290 PC" would still be irrelevant.

So, my dear smart-ass basement crawlers, consoles are not holding back your PC gaming.

#142 Posted by lostrib (35762 posts) -

@V3rciS: now, try the same thing but without being an assclown

#143 Posted by True_Gamer_ (6081 posts) -

@Wasdie:

It's funny how in November 2005 the Xenos outclassed all PC GPUs, and that ancient tech lasted 8 years. I doubt that the Bone's GPU will survive that long. I expect the next Xbox announced at E3 2018...

#144 Edited by V3rciS (2213 posts) -

@lostrib said:

@V3rciS: now, try the same thing but without being an assclown

Truth hurts my friend, anyway I'll consider my point taken ^^

#145 Edited by lostrib (35762 posts) -

@V3rciS said:

@lostrib said:

@V3rciS: now, try the same thing but without being an assclown

Truth hurts my friend, anyway I'll consider my point taken ^^

What truth? It's just speculation and insults.

#146 Edited by trasherhead (3058 posts) -

@trasherhead said:

This is the reason why Skyrim, though looking good, looks like shit compared to what it could have looked like if it hadn't shipped locked at 2 GB.

Skyrim released in 2011, when consoles had 512 MB total RAM/VRAM. Most PCs were already coming with at least 4 GB by 2009 (6 GB by the following year).

Of the 4 GB, 32-bit XP and 32-bit Vista/Win 7 can address 3.25 GB, plus whatever VRAM is on the video card. That's still a huge gap between 512 MB and 3.25 GB + VRAM. I doubt the 32-bit OS is a limiting factor.

The average Steam gamer today doesn't have more than 4 GB of RAM. And Skyrim was locked to use no more than 2 GB; it was only patched to access more after modders made a fan patch doing the same. Yes, there is a huge gap there, but after you remove what the OS and applications use, there is considerably less of a gap and you're suddenly down to close to 2 GB of RAM left in total. The only way the consoles were holding it back was that they were both on DX9c feature sets (yes, the 360 had a few DX10 features and the PS3 was OpenGL), meaning Skyrim had to support that too.

This has been discussed at length before: the 32-bit OS has lingered too long, and it has been and still is a limiting factor.
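For reference, the address-space ceilings being argued about here can be written down directly. A simplified sketch of the per-process limits on Windows (the Skyrim fan patch mentioned above worked by flipping the Large Address Aware flag; exact figures vary by OS configuration):

```python
def user_address_space_gb(bits, large_address_aware=False):
    # Simplified per-process user address space on Windows:
    #  - 32-bit process: 2 GB by default; up to 4 GB with the LAA flag on a
    #    64-bit OS (roughly 3 GB on a 32-bit OS booted with /3GB).
    #  - 64-bit process: far beyond anything a 2014-era game can use.
    if bits == 64:
        return 8192.0  # 8 TB, effectively unlimited here
    return 4.0 if large_address_aware else 2.0

skyrim_at_launch = user_address_space_gb(32)                          # 2.0 GB
skyrim_patched = user_address_space_gb(32, large_address_aware=True)  # 4.0 GB
```

Note this is the process's *address space*, shared by system RAM allocations and mapped VRAM, which is why the practical headroom is even tighter than the headline number.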

#147 Posted by trasherhead (3058 posts) -

@trasherhead said:

@zeeshanhaider said:

@Wasdie said:

No they aren't. Graphics can easily scale, and the Xbox One and PS4 have enough RAM and CPU power to play even the most demanding PC games on the market today and for the foreseeable future. The biggest limitation of the Xbox 360 and PS3 was the 512 MB of RAM they had and their aged CPUs.

CPUs today have far more power than games need. Unless a game is poorly coded, it's rare that the CPU is the limitation. The biggest bottleneck on the PC is actually DirectX and OpenGL which have to do a lot more processing on the CPU than the consoles do to render graphics. This is slowly changing with Mantle and newer revisions of DirectX/OpenGL, but it's something the consoles have never had to deal with so their weaker CPUs are not a major issue.

With 8 gbs of total ram a developer can do quite a bit. Actual game logic, level design, AI, pathfinding, and all of those things do not take that much RAM. Even the most graphically intense and CPU demanding games on the PC rarely need more than 3 gbs of system ram. You can do quite a bit with that.

The thing with graphics is they can be scaled a lot more easily. You can much more easily scale back the amount of pretty graphical effects and focus on just the necessities. The PS4 and Xbox One, even the Wii U, are more than capable of running games with the basic graphical features that actually impact gameplay (draw distances, lighting, effects). If you have the power you can start making more rendering passes for more detail, increase the resolution so that more detail is rendered each frame, increase the AA to smooth out jaggies when not running uber high resolutions, and do all of that stuff. It's really easy to just tone that stuff down to make it run on a weaker GPU. A lot of PCs games today can run on a large variety of GPUs with the low end often falling well below what is in the PS4/Xbox One.

So don't worry about it. Now that the devs have 8 GB of RAM and a familiar x86 processor, developers will not be bottlenecked by the consoles' hardware. They'll still have to design their games around a typical living room setup and a controller, though.

I respectfully disagree. RAM alone is not the answer. The CPU are damn too weak in next-gen consoles. Pretty damn sure the CPU in PS4 didn't meet the Planet Side 2 teams demand and they had to scale back.

And I highly disagree that the DirectX is actually has any bottleneck on PCs. You your self has said that the biggest impant these APIs have is on the CPU and not on the GPU it self. I'm sure PC gamers have CPUs magnitudes of times more powerful than the ones in 900pStation and 720pBox.

It is just the start and developers already started cutting resolutions ranging from 720p - 900p. I can easily see either more cuts in resolution if they go higher on graphics effects or just not push the boundaries at all which ultimately HOLDS BACK THE PC.

I would really be impressed if I can see any game on consoles matching Crysis 2 technically.

You can disagree with him all you like; it doesn't change facts. RAM constraints on the last consoles were part of what held back PC gaming.

The overhead that the PC APIs (Direct3D/OpenGL) add is also holding back PC game development. Now, with Mantle, we might see some changes.

Lastly, as I mentioned in an earlier post, ignored by all, game devs have to take into account that the end user might still be on a 32-bit OS, limiting their use of RAM to roughly 3 GB of system RAM and VRAM combined. This is the reason why Skyrim, though looking good, looks like shit compared to what it could have looked like if it hadn't shipped locked at 2 GB.

Well, if you call that a fact, at least bring something to back it up, or am I simply supposed to take your word for it? Meanwhile, I already mentioned the bottlenecks in the 900pStation and 720pBox.

Ok, let me put it more simply then. Until PC games have to sacrifice feature sets, like using DX11.2 instead of DX18.89, consoles are not holding games back. Textures and models can easily be scaled down to fit within a system's RAM and chipset limitations. The fact that all the games coming out now started development before the specs and feature sets were known MIGHT have something to do with a lot of the titles not performing as well as the devs would hope, forcing them to cut corners where it is quickest: resolution.
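"Scaled down to fit" usually means dropping mip levels: shipping or streaming a smaller top-level texture until the asset fits its memory budget. A crude sketch, using uncompressed RGBA sizes for simplicity (real games use compressed formats, but the halving logic is the same):

```python
def texture_bytes(side, bytes_per_texel=4):
    # Uncompressed square RGBA texture plus its full mip chain (~4/3 overhead).
    return side * side * bytes_per_texel * 4 // 3

def pick_mip(side, budget_bytes):
    # Halve the resolution until the texture (with mips) fits the budget;
    # this is "scale assets down to fit the hardware" in its crudest form.
    while side > 1 and texture_bytes(side) > budget_bytes:
        side //= 2
    return side

# A 4096x4096 texture squeezed into an 8 MB slice drops to 1024x1024.
chosen = pick_mip(4096, 8 * 2**20)
```

Each halving cuts memory by roughly 4x, which is why the same art can span a weak console and a high-end PC without touching the game's logic.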

#148 Edited by AdrianWerner (28005 posts) -

Technically? Yes

But actually? No. Because without consoles nobody would be able to make those games with those kinds of budgets as PC exclusives anyway.

#149 Posted by HalcyonScarlet (4223 posts) -

I have to agree. This time round, the current-gen consoles are weak right off the bat and are getting (or will get) surpassed by low-end PC gaming gear pretty quickly.

In the long run, I see it having an effect on PC games.

This is going to be one long-ass gen. With graphics failing to impress me in any way so far, I think the sweet spot, when they finally do, will be short.

I'd be more impressed with the current-gen consoles if they stopped wasting the RAM on high-res textures and just made huge worlds, pushing resources into gameplay instead. But it's easier to sell people on what they can see.

Still, physics are better now... that's nice.

#150 Edited by sukraj (22566 posts) -

Just look at it this way. The current gen of consoles is a vast improvement over the last gen.

I love everything about the current gen of consoles.