PS4 and Xbox One Already Hitting a Performance Wall


#1  Edited By Coseniath
Member since 2004 • 3183 Posts

I wouldn't normally post this, but it heavily affects PC games, so...

This is an article from PCPerspective.

Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

A couple of weeks back a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the PlayStation 4 variants of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?

For those of us who focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast by an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, while also addressing other issues such as the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already hit the peak graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts..

So, if we take this anonymous developer's information as true (and this whole story is based on that assumption), then we have learned some interesting things:

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
  3. The Ubisoft team see Unity as being "crazily optimized" for the architecture and consoles even as we just now approach the one-year anniversary of their release.
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the remaining 50% of CPU power for AI, etc.

It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots what game developers need to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture - we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit peak performance.

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then, unless the Ubisoft team are completely off their rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                 PlayStation 4         Xbox One
Processor        8-core Jaguar APU     8-core Jaguar APU
Motherboard      Custom                Custom
Memory           8GB GDDR5             8GB DDR3
Graphics Card    1152 Stream Unit APU  768 Stream Unit APU
Peak Compute     1,840 GFLOPS          1,310 GFLOPS
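The "Peak Compute" row follows directly from the shader counts. A minimal sanity-check sketch, assuming the widely reported GPU clocks of 800MHz (PS4) and 853MHz (Xbox One), which are not stated in the table itself:

```python
# Rough check of the "Peak Compute" row: for GCN-class GPUs, peak
# single-precision FLOPS is stream processors x clock x 2 (one fused
# multiply-add per cycle). Clocks below are assumptions from common reports.

def peak_gflops(stream_processors, clock_mhz):
    """Peak single-precision GFLOPS: 2 FLOPs (FMA) per SP per cycle."""
    return stream_processors * clock_mhz * 2 / 1000.0

ps4 = peak_gflops(1152, 800)   # PS4 GPU commonly reported at 800 MHz
xb1 = peak_gflops(768, 853)    # Xbox One GPU at 853 MHz after its clock bump

print(round(ps4), round(xb1))  # 1843 and 1310, matching the table
```

The same formula applied to a desktop 7790 (896 stream processors at 1GHz) gives the ~1,790 GFLOPS figure cited later in this thread.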

The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar is lower than Intel's Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super-low-cost tablets today, and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And while the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we have likely hit performance walls on the x86 cores as well.
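To put that 50% figure in perspective, here is a rough per-frame CPU budget. This is only a sketch: the 50% rendering share is the anonymous developer's claim, and the six-cores-for-games split is the commonly reported figure, not something stated in this article:

```python
# Per-frame CPU budget at a 30 FPS target, assuming (per the quoted dev)
# ~50% of CPU time goes to pre-packaged rendering work, and (per common
# reports) 6 of the 8 Jaguar cores are available to games.

FRAME_MS = 1000.0 / 30           # ~33.3 ms per frame at 30 FPS
GAME_CORES = 6                   # 2 cores reserved for the OS
RENDER_SHARE = 0.5               # developer's stated rendering share

total_core_ms = FRAME_MS * GAME_CORES            # core-time available per frame
ai_core_ms = total_core_ms * (1 - RENDER_SHARE)  # what's left for AI, physics, etc.

print(round(total_core_ms), round(ai_core_ms))   # 200 and 100
```

So across all game-available cores, roughly 100ms of (slow) Jaguar core-time per frame is left for everything that isn't rendering support.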

Even if this developer quote is 100% correct that doesn't mean that the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on performance efficiency of current generation hardware, will be coming to the Xbox One and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next that is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same architecture as the PC.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

-----------

This is really bad news for PC gaming. If they have already hit a performance wall, I think in a couple of years the owners of a Titan III 20GB will be playing games at 30FPS, 720p, with only DX11 image quality...


#2  Edited By 04dcarraher
Member since 2004 • 23829 Posts

Preaching to the choir. The Jaguar architecture has IPC performance only slightly better than 2006-era CPUs, and if that weren't bad enough, the cores are only clocked around 1.6GHz. There is also a two-core allocation for the OS and its features, which means only 6 cores are available for games. So needless to say, they are rocking a CPU with the total processing power of roughly a 2.4GHz Athlon II X4. As for memory, both consoles have to use 3-3.5GB for the OS and its features, leaving 4.5-5GB for the game cache and VRAM. And for the GPUs, you're looking at 7850 (PS4) and 7790 (X1) performance ranges.

Coming from the 360 and PS3, these consoles are a great upgrade. Once engines become standardized and are using native DX11 features, we will start seeing larger, more dynamic worlds vs last gen. PC gaming right now, and in the near future, is better off than it was before the release of the X1/PS4, when we were stuck with consoles using hardware from 2005/2006...

Now, about the requirements for Unity: I think they are inflated/fake because of the current trend in gaming of saying you need 3-4GB of VRAM and i7s, FX-8s, etc., which isn't true. I've read somewhere that the requirements are based on what they had on site. We will see whether it's a poorly coded game like AC3 and early versions of AC:BF, or whether the requirements are just bogus.
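The Athlon II X4 comparison above is essentially an aggregate cores-times-clock estimate. A back-of-the-envelope sketch, where the assumption that Jaguar's per-clock performance roughly matches the Athlon II's is mine, for illustration only:

```python
# Aggregate "core-GHz" (cores x clock x relative IPC) for the two CPUs
# being compared. The ipc_ratio of 1.0 is an illustrative assumption,
# not a measured figure.

def core_ghz(cores, clock_ghz, ipc_ratio=1.0):
    return cores * clock_ghz * ipc_ratio

jaguar = core_ghz(6, 1.6)   # 6 game-available Jaguar cores at 1.6 GHz
athlon = core_ghz(4, 2.4)   # Athlon II X4 at 2.4 GHz

print(round(jaguar, 1), round(athlon, 1))  # 9.6 9.6
```

Under that (rough) IPC assumption the aggregate throughput comes out about equal, which is the point of the comparison; single-threaded performance, of course, still heavily favors the higher-clocked chip.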


#3 Old_Gooseberry
Member since 2002 • 3958 Posts

They won't be releasing new consoles for a while... I guess all they can do is scale back their games and make them run better. It does suck to be a PC gamer, though, because many multiplatform releases will be junky.

I'm still using a GTX 680 I got back in early 2012. How come it can run every game maxed out at 1080p over 60fps and these new consoles can't? I even ran Wolfenstein: The New Order earlier today at 4K resolution with DSR and was getting 50-60fps. On these garbage consoles I'm guessing they'd run games at 5fps at 4K.


#4  Edited By commander
Member since 2010 • 16217 Posts

@Old_Gooseberry said:

They won't be releasing new consoles for a while... I guess all they can do is scale back their games and make them run better. It does suck to be a PC gamer, though, because many multiplatform releases will be junky.

I'm still using a GTX 680 I got back in early 2012. How come it can run every game maxed out at 1080p over 60fps and these new consoles can't? I even ran Wolfenstein: The New Order earlier today at 4K resolution with DSR and was getting 50-60fps. On these garbage consoles I'm guessing they'd run games at 5fps at 4K.

Oh, don't worry, next year we get Star Citizen. If devs see the success of that, they will know what to do.

Server-authenticated supergames.

As for your GTX 680, well, it's way stronger than the GPU in the PS4 and Xbox One, and it's paired with an even stronger CPU.


#5 linkyshinks
Member since 2006 • 1332 Posts

I highly doubt this is happening; the PS4 & XO have, after all, been developed with significant developer input. If there is a wall, we won't be arriving at it anytime soon. I think radically new software design, working alongside current conventional GPUs & CPUs, will prolong the life of the current crop of consoles significantly, and down the line we'll see radically new CPUs & GPUs built entirely around these new programming techniques... I think the use of resource-heavy polygons is certainly coming to an end when I see the likes of "unlimited detail" point-cloud graphics. Far improved, less resource-intensive graphics will allow for a greater number of AI streams and improved physics.


#6 neatfeatguy
Member since 2005 • 4400 Posts

Hahaha...ha...? If this is true, that's so sad, yet so expected.

I'm sure they'll find ways to better use the available hardware in the consoles - hopefully it isn't "design a game that runs at 1600x900, cap it at 30fps" and throw their shit on the PC.

Guess time will tell.


#7 Coseniath
Member since 2004 • 3183 Posts
@04dcarraher said:

Preaching to the choir. The Jaguar architecture has IPC performance only slightly better than 2006-era CPUs, and if that weren't bad enough, the cores are only clocked around 1.6GHz. There is also a two-core allocation for the OS and its features, which means only 6 cores are available for games. So needless to say, they are rocking a CPU with the total processing power of roughly a 2.4GHz Athlon II X4. As for memory, both consoles have to use 3-3.5GB for the OS and its features, leaving 4.5-5GB for the game cache and VRAM. And for the GPUs, you're looking at 7850 (PS4) and 7790 (X1) performance ranges.

Coming from the 360 and PS3, these consoles are a great upgrade. Once engines become standardized and are using native DX11 features, we will start seeing larger, more dynamic worlds vs last gen. PC gaming right now, and in the near future, is better off than it was before the release of the X1/PS4, when we were stuck with consoles using hardware from 2005/2006...

Now, about the requirements for Unity: I think they are inflated/fake because of the current trend in gaming of saying you need 3-4GB of VRAM and i7s, FX-8s, etc., which isn't true. I've read somewhere that the requirements are based on what they had on site. We will see whether it's a poorly coded game like AC3 and early versions of AC:BF, or whether the requirements are just bogus.

Can't agree more...

Those 1.6GHz Jaguar cores might not even be equal to a 2.4GHz Q6600 in games...

Oh, and I think he used the 7790 for both because the console versions might have more cores, but they run at far lower clocks.

That's why he just compared compute performance: the 7790 comes in at 1,790 GFLOPS, versus 1,840 GFLOPS for the PS4 and 1,310 GFLOPS for the X1.

Well, about Unity: we have reached a point where devs are no longer capable of analyzing their own product's system requirements. So they can't be trusted, and we have to wait for hardware sites' benchmarks to do the dirty job for them...


#8  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Coseniath said:
@04dcarraher said:

Preaching to the choir. The Jaguar architecture has IPC performance only slightly better than 2006-era CPUs, and if that weren't bad enough, the cores are only clocked around 1.6GHz. There is also a two-core allocation for the OS and its features, which means only 6 cores are available for games. So needless to say, they are rocking a CPU with the total processing power of roughly a 2.4GHz Athlon II X4. As for memory, both consoles have to use 3-3.5GB for the OS and its features, leaving 4.5-5GB for the game cache and VRAM. And for the GPUs, you're looking at 7850 (PS4) and 7790 (X1) performance ranges.

Coming from the 360 and PS3, these consoles are a great upgrade. Once engines become standardized and are using native DX11 features, we will start seeing larger, more dynamic worlds vs last gen. PC gaming right now, and in the near future, is better off than it was before the release of the X1/PS4, when we were stuck with consoles using hardware from 2005/2006...

Now, about the requirements for Unity: I think they are inflated/fake because of the current trend in gaming of saying you need 3-4GB of VRAM and i7s, FX-8s, etc., which isn't true. I've read somewhere that the requirements are based on what they had on site. We will see whether it's a poorly coded game like AC3 and early versions of AC:BF, or whether the requirements are just bogus.

Can't agree more...

Those 1.6GHz Jaguar cores might not even be equal to a 2.4GHz Q6600 in games...

Oh, and I think he used the 7790 for both because the console versions might have more cores, but they run at far lower clocks.

That's why he just compared compute performance: the 7790 comes in at 1,790 GFLOPS, versus 1,840 GFLOPS for the PS4 and 1,310 GFLOPS for the X1.

Well, about Unity: we have reached a point where devs are no longer capable of analyzing their own product's system requirements. So they can't be trusted, and we have to wait for hardware sites' benchmarks to do the dirty job for them...

The 7790's compute performance is never seen because of its limited memory bus and bandwidth. The X1's GPU performance is roughly around the 7790's, and the PS4's GPU is on par with a 7850. But those weak CPUs are limiting many aspects of the consoles' gaming capabilities.


#9  Edited By Coseniath
Member since 2004 • 3183 Posts
@04dcarraher said:

The 7790's compute performance is never seen because of its limited memory bus and bandwidth. The X1's GPU performance is roughly around the 7790's, and the PS4's GPU is on par with a 7850. But those weak CPUs are limiting many aspects of the consoles' gaming capabilities.


#10  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Coseniath said:
@04dcarraher said:

The 7790's compute performance is never seen because of its limited memory bus and bandwidth. The X1's GPU performance is roughly around the 7790's, and the PS4's GPU is on par with a 7850. But those weak CPUs are limiting many aspects of the consoles' gaming capabilities.

This will give you an idea: you can't go just by FLOPS, because a 7790 based on FLOP performance should beat a 7850, but it doesn't. And a 7770 isn't too far behind a 7790; the 128-bit bus hurts it.
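The bus-width point is easy to quantify: peak GDDR5 bandwidth is just bus width (in bytes) times effective memory clock. A sketch using the reference-card specs of the desktop 7790 and 7850, which are assumptions here rather than figures from this thread:

```python
# Why the 128-bit bus hurts: peak memory bandwidth in GB/s is
# (bus width in bytes) x (effective memory clock in GHz).

def bandwidth_gbs(bus_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s for a GDDR5 bus."""
    return (bus_bits / 8) * effective_clock_ghz

hd7790 = bandwidth_gbs(128, 6.0)   # 128-bit bus, 6.0 GHz effective GDDR5
hd7850 = bandwidth_gbs(256, 4.8)   # 256-bit bus, 4.8 GHz effective GDDR5

print(hd7790, hd7850)  # 96.0 and 153.6
```

Despite the 7790's higher FLOP count, the 7850 has roughly 60% more memory bandwidth to feed its shaders, which is why it wins in games.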


#11  Edited By deactivated-579f651eab962
Member since 2003 • 5404 Posts

This has to be the worst console generation ever because it's the first one I'm not a part of.


#12  Edited By KHAndAnime
Member since 2009 • 17565 Posts

They'll need to optimize games more. Running at 1080p isn't "optimized": compared to 720p it's a huge decrease in performance for a marginal improvement in sharpness and clarity. If they were willing to sacrifice that bit of sharpness (which seems unlikely, considering Sony has spent the whole time building up their 1080p gimmick), then console games could potentially start looking a hell of a lot better, and the PC multiplats would look better too. 1080p isn't a realistic target for games on consoles this generation. I've said it a million times before, and I'll say it until this generation is over. Developers are going to pick up on this, start cutting down the resolution, and start improving the games' graphics. I mean, it's already in progress: console games are using black bars to pretend they're 1080p so they can squeeze extra performance out of the hardware. Give it time and console games will start running at 900p standard, and hopefully by the end of the gen we will have come full circle and have 720p console games again (like last generation).

1080p is nice for games that aren't destined to use all the power in the consoles (like arcade titles), but it's hardly necessary if they're sacrificing graphical bells and whistles just to render the game at that resolution. I'd rather simply sit an extra foot away from my TV and have better textures, shaders, etc.
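The pixel-count arithmetic behind that argument is straightforward:

```python
# Pixels pushed per frame at the resolutions discussed in this thread.
resolutions = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1080p renders 2.25x the pixels of 720p...
print(pixels["1080p"] / pixels["720p"])            # 2.25
# ...while 900p is only ~69% of 1080p's pixel load.
print(round(pixels["900p"] / pixels["1080p"], 3))  # 0.694
```

So dropping from 1080p to 900p cuts the per-frame shading workload by roughly 30%, which is the headroom the argument says should go to better textures and shaders instead.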


#13  Edited By Coseniath
Member since 2004 • 3183 Posts
@04dcarraher said:

This will give you an idea: you can't go just by FLOPS, because a 7790 based on FLOP performance should beat a 7850, but it doesn't. And a 7770 isn't too far behind a 7790; the 128-bit bus hurts it.

Lol, I know the 7850 is far better for gaming. I was even the one criticizing the 7790 on its first day of release for not giving enough performance for the money, arguing we should have had the 7850 at the 7790's price (everyone back then disagreed with me, until the 650 Ti Boost arrived a few days later at the same price)...

You probably misread my sentence.

@Coseniath said:
Oh, and I think he used the 7790 for both because the console versions might have more cores, but they run at far lower clocks.

That's why he just compared compute performance: the 7790 comes in at 1,790 GFLOPS, versus 1,840 GFLOPS for the PS4 and 1,310 GFLOPS for the X1.

I was just explaining why he probably used the 7790 to compare with the console GPUs, not saying the 7850 has less performance...

@klunt_bumskrint said:

This has to be the worst console generation ever.

+1...


#14 04dcarraher
Member since 2004 • 23829 Posts

I did misread, my bad.


#15 Coseniath
Member since 2004 • 3183 Posts

@04dcarraher: No problem. I have misread countless times too xD.


#16 wis3boi
Member since 2005 • 32507 Posts

@klunt_bumskrint said:

This has to be the worst console generation ever because it's the first one I'm not a part of.

I think they milked the last one too far and then needed to rush something out, and we got this one. It'll probably be a shorter generation, something to fill the gap... hopefully.


#19 Alucrd2009
Member since 2007 • 787 Posts

What happened to the Steam Box?

Now is the right time to get it in... Valve will destroy both M$ and Sony!


#20 wis3boi
Member since 2005 • 32507 Posts

@Jawad2007 said:

What happened to the Steam Box?

Now is the right time to get it in... Valve will destroy both M$ and Sony!

They won't be destroying anybody, because a Steam Box is just another custom PC anyone can make.


#21 Phaythos
Member since 2014 • 25 Posts

I'm hoping for DX12 and OpenGL Next. I love consoles, but 7 years like this will be really bad :(


#22 Pedro
Member since 2002 • 69428 Posts

The vast majority of gamers generally do not care whether devs are hitting a wall. The most played PC game is still not a graphical wonder, so I don't see how this realization about console performance is going to affect anything.


#23 ShadowDeathX
Member since 2006 • 11698 Posts

Coming from Ubisoft, one of the worst optimizers on the market, I'll say this is meh... Ubisoft has always been crap at optimizing engines for hardware.

Assassin's Creed: Black Flag wouldn't run at even close to 60fps on my 6-core 3930K @ 4.5GHz and Tri-Fire 7970s.

**** Splinter Cell: Chaos Theory won't even run at a solid 60fps on my current PC.

So Ubisoft, I don't care.


#24 GTR12
Member since 2006 • 13490 Posts

I think I'll wait for a new engine and then judge the consoles, something like the new unreal engine should give a good indication.


#25 digitm64
Member since 2013 • 470 Posts

I'm sure the consoles would have sold just as well at double the price, with much faster components to reflect it. Or they could have released both an entry-level model and a much more expensive one called Elite. Sony and MS were too focused on price points and not on specifications.


#26 GTR12
Member since 2006 • 13490 Posts

@digitm64 said:

I'm sure the consoles would have sold just as well at double the price, with much faster components to reflect it. Or they could have released both an entry-level model and a much more expensive one called Elite. Sony and MS were too focused on price points and not on specifications.

The PS3 launch says hello


#27 BSC14
Member since 2002 • 4187 Posts

This is the worst gen of new consoles I have ever seen, and I'm 42 years old. It's just sad how underpowered they were from day one. I mean, the Xbox One can hardly run 1080p... it's just sad, and it's going to hurt PC as far as pushing hardware goes.


#28  Edited By JangoWuzHere
Member since 2007 • 19032 Posts

Whatever

I remember seeing the bad-looking and poor-performing games in 2005-2006. It took a couple of years before last-gen consoles actually started to wow people. I think gamers and developers need to stop getting hung up on this resolution bullshit and judge games by actually looking at them.


#29 JangoWuzHere
Member since 2007 • 19032 Posts

@JangoWuzHere said:

Whatever

I remember seeing the bad-looking and poor-performing games in 2005-2006 for the 360. It took a couple of years before last-gen consoles actually started to wow people. I think gamers and developers need to stop getting hung up on this resolution bullshit and judge games by actually looking at them.


#30 04dcarraher
Member since 2004 • 23829 Posts

@JangoWuzHere said:

Whatever

I remember seeing the bad-looking and poor-performing games in 2005-2006. It took a couple of years before last-gen consoles actually started to wow people. I think gamers and developers need to stop getting hung up on this resolution bullshit and judge games by actually looking at them.

Do you know the reason why early games on the PS3 and 360 didn't look or run as well as they did a few years after release? In 2005 almost all software was single-threaded, and the 360 introduced its triple-core CPU, so developers had to learn to build engines and code that used all of its processing power. It took time before game engines became multithreaded. The 360 also introduced a new GPU standard to the market in 2005: the unified shader architecture, which allowed the GPU's processors to handle vertex and shader workloads in any combination, giving more flexibility, where the standard at the time was a fixed number of dedicated processors that limited one area or the other.

Then there's the PS3's case with the Cell. The Cell was originally designed to be basically an APU doing both CPU and GPU workloads, but Sony soon found out the PPE and SPEs weren't up to the job, so they added the RSX chip (G70, aka GeForce 7). The PS3 had a lot more hurdles to overcome with its older GPU tech, and the PPE (CPU) was really slow, so they assigned SPEs to handle both CPU tasks and GPU tasks to augment it. Both companies introduced complex, high-end hardware back in the day; this time they did not.

After devs learned these customized, proprietary consoles (360, PS3), game engines matured to fit the consoles' limits and abilities, but after a handful of years graphics in general plateaued. Now fast-forward to today: these new consoles are only semi-customized PCs using the same software and hardware standards, with low- to mid-range hardware. They have not introduced new standards, there is no high-end hardware to rival PC counterparts, and they share the same base in coding and hardware usage. Developers are right at home with these consoles because of how standardized they are. The X1 and its eSRAM may take some time to tweak the bugs out of, but the PS4 is a straightforward design; there is no real learning curve with these new consoles. We will see improvements, but they will be minor steps, nothing like what the 360 or PS3 did years ago.


#31 JangoWuzHere
Member since 2007 • 19032 Posts

@04dcarraher said:

@JangoWuzHere said:

Whatever

I remember seeing the bad-looking and poor-performing games in 2005-2006. It took a couple of years before last-gen consoles actually started to wow people. I think gamers and developers need to stop getting hung up on this resolution bullshit and judge games by actually looking at them.

After devs learned these customized, proprietary consoles (360, PS3), game engines matured to fit the consoles' limits and abilities, but after a handful of years graphics in general plateaued. Now fast-forward to today: these new consoles are only semi-customized PCs using the same software and hardware standards, with low- to mid-range hardware. They have not introduced new standards, there is no high-end hardware to rival PC counterparts, and they share the same base in coding and hardware usage. Developers are right at home with these consoles because of how standardized they are. The X1 and its eSRAM may take some time to tweak the bugs out of, but the PS4 is a straightforward design; there is no real learning curve with these new consoles. We will see improvements, but they will be minor steps, nothing like what the 360 or PS3 did years ago.

That is all assumption as far as I'm concerned. As new engines are created and tweaked, developers will be able to squeeze new life from these consoles in the future.


#32  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@JangoWuzHere said:

That is all assumption as far as I'm concerned. As new engines are created and tweaked, developers will be able to squeeze new life from these consoles in the future.

Nope, it's not all assumption. They are already tapping out the CPU, which in turn limits the GPU. Needless to say, we will only see minor improvements, not massive ones like on the 360/PS3. What do you expect them to do with a six-core 1.6GHz CPU that is as slow as AMD's 5-year-old CPUs, and GPUs that have been outclassed since 2012? These consoles use the same hardware standards as the PC, with at best mid-range performance from 2012.