"PS4 can render 1080p/60fps with room to spare" -Kojima

This topic is locked from further discussion.

#301 Posted by 04dcarraher (18933 posts) -

@560ti said:

PS4 is using a CPU that is inferior to a midrange Phenom-series CPU from 2008 and a GPU that was midrange 2 years ago.......... (1080p 60FPS with that hardware is going to require quite a few settings turned down/off).

AMD's Mantle and BF4 have shown us just how little importance CPUs are overall for gaming. With just Mantle, PC games with decent GPU's and weak CPU's are seeing 30%-40% framerate improvements. I think the CPU's in the One and PS4 are quite sufficient for gaming given the to-the-metal coding. Consoles don't need i5's and i7's, and suggesting so is something of a misunderstanding of modern GPU capabilities and demands of most games.

The whole point of Mantle is to lower the overhead of the CPU communicating with the GPU, freeing up CPU resources and letting the CPU feed data to the GPU better. That is why Intel-based systems with AMD GPUs have seen little to no difference: those CPUs already have the processing power to feed the GPUs. CPUs matter depending on what you are doing, and when games require a lot of non-GPU workloads like physics and many AI or NPCs, we will see these console CPUs choke, as they already do in BF4. Also, the consoles only have 6 cores usable for games, and their architecture is weaker than CPUs from 6 years ago, let alone their low clock rates.
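The CPU-bound vs GPU-bound argument above can be sketched with a toy frame-time model. All numbers here are hypothetical, purely to illustrate why a lower-overhead API helps a weak CPU far more than a strong one:

```python
# Toy model: a frame takes as long as its slowest stage, CPU or GPU.
def frame_ms(draw_calls, cpu_us_per_draw, gpu_ms):
    """CPU cost scales with draw-call count; GPU cost is held fixed."""
    cpu_ms = draw_calls * cpu_us_per_draw / 1000.0
    return max(cpu_ms, gpu_ms)

# Hypothetical: 5000 draws/frame, GPU needs 16 ms regardless of API.
weak_cpu_dx     = frame_ms(5000, 8.0, 16.0)  # 40 ms: CPU-bound
weak_cpu_mantle = frame_ms(5000, 3.0, 16.0)  # 16 ms: now GPU-bound
fast_cpu_dx     = frame_ms(5000, 2.5, 16.0)  # 16 ms: already GPU-bound
```

On these made-up numbers, the weak-CPU system jumps from 25 fps to its GPU limit just from the overhead cut, while the fast-CPU system gains nothing, which matches the observation that Intel systems saw little difference from Mantle.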

#302 Posted by Cyberdot (3510 posts) -

Yeah, yeah, just move on. I'm typing this with my motherfookin' kickass rig next to me that is laughing at the OP.

What matters the most is the exclusive games that take advantage of dat POWER!

#303 Edited by psx_warrior (1501 posts) -

@Strutten said:

@R3FURBISHED: Dude let it go its rabid cows.. even the article is way out of context.. just read the article or my comment above. cows only read topic and jump the gun instant.. they were worried six months ago but not anymore since updates and new sdk.. anyway ps might still be strong overall but this hatred/bashing is way out of hand..

Saw how great Quantum Break looks. Not worried about trolling cows. Course, I'm a manticore myself, I just don't have the money for all the systems. Will get a PS4 at some point in the future, but for right now, happy with my Xbone and 360 purchase.

#304 Edited by aroxx_ab (9121 posts) -

The Ps4 8GB GDDR5 will own Xboner whole gen :D

#305 Posted by 04dcarraher (18933 posts) -

@aroxx_ab said:

The Ps4 8GB GDDR5 will own Xboner whole gen :D

It's not the memory that is the determining factor.

#306 Edited by stuff238 (610 posts) -

But Forza 5 runs at 1080P@60FPS.

So 1 game on xbone vs 30 PS4 games that are 1080p. Yeah, xbone fails. Xbone fanboys can enjoy your 1% of 1080p games, while PS4 will have 99-100% 1080p games.

#307 Posted by xhawk27 (7074 posts) -

The PS4 is a ghetto mid PC at best. Kojima is an idiot. hahaha

#308 Posted by SolidTy (41464 posts) -

@Strutten said:

@R3FURBISHED: Dude let it go its rabid cows.. even the article is way out of context.. just read the article or my comment above. cows only read topic and jump the gun instant.. they were worried six months ago but not anymore since updates and new sdk.. anyway ps might still be strong overall but this hatred/bashing is way out of hand..

Saw how great Quantum Break looks. Not worried about trolling cows. Course, I'm a manticore myself, I just don't have the money for all the systems. Will get a PS4 at some point in the future, but for right now, happy with my Xbone and 360 purchase.

When you went to the store with your new Tax money, were they sold out of PS4s?

I noticed my brother encountered a super salesman that almost got him to buy a Wii U a few months ago when the PS4/Xbone were sold out.

#309 Posted by pelvist (4466 posts) -

So why doesn't it, then?

#310 Posted by The_Stand_In (362 posts) -

@stuff238 said:

@freedomfreak said:

But Forza 5 runs at 1080P@60FPS.

So 1 game on xbone vs 30 PS4 games that are 1080p. Yeah, xbone fails. Xbone fanboys can enjoy your 1% of 1080p games, while PS4 will have 99-100% 1080p games.

HAHAHAHA! Off to a pretty bad start, huh? 900p Battlefield 4 laughs at you, and it's only going to get worse.

Get a PC if you want that kind of 1080p+ percentage.

#311 Edited by ronvalencia (15109 posts) -

@tormentos:

@tormentos said:

@ronvalencia said:

AMD PRT makes the existing TMUs efficient by pulling the required textures (from a slower data storage source) to be ready for consumption by the TMUs at maximum transfer rate (this is memory bandwidth factor).

Also 7770 has one Rasterizer unit while X1 has two Rasterizer units.

Radeon HD 7770

From http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"In addition to asynchronous compute queues, the Xbox One hardware supports two concurrent render pipes," Goossen pointed out. "The two render pipes can allow the hardware to render title content at high priority while concurrently rendering system content at low priority."

The new driver unlocks X1's second Rasterizer unit. With the old driver, X1 acts like a 7770, i.e. bottlenecking the GPU with a time-slice sharing method. LOL.

Radeon HD 7790 has two Rasterizer units. The 78x0 also has two Rasterizer units. Only R9-290/R9-290X has four Rasterizer units, i.e. a near copy-and-paste 4X-scaled 7770.

The prototype 7850 with 48 TMUs(from 12 CUs) and 860Mhz are backed by 153.6 GB/s memory bandwidth. Prototype 7850 with 48 TMUs wouldn't beat retail 7850 with higher CU/TMU count and same memory bandwidth.

7790/R7-260X (with 14 CU at ~1Ghz) hardware with 153.6 GB/s video memory bandwidth doesn't exist e.g. wait for GDDR6 memory types..

I'm not hiding behind the PC since I don't have plans to buy any game consoles.

Where are the textures stored, on the HDD? In system memory? Because it doesn't say in any document AMD has about PRT. PRT is just a way to avoid loading all the textures into memory at once; wherever the textures are stored before they are loaded, they are not in VRAM, which is the important thing. So once again, where are they stored? You say "loaded from a slower data storage source", which doesn't say anything and could very well mean the HDD.

Yes, just like the xbox one has a 256-bit bus which the 7770 doesn't have either, yet the xbox one performs badly. There is no magic; nothing you have claimed as secret sauce has worked. Hell, MGSV is said to look a lot better on PS4, which is funny because Kojima actually defended the xbox one.

The prototype 7850 has 32 ROPs and 153GB/s; the xbox one has 16 ROPs and 140GB/s to 150GB/s in its very best case scenario, so it would still have less bandwidth than the prototype. The 7790 has 2 Rasterizers and barely needs 100GB/s at 1.79TF, so the xbox one at 1.28TF after the 8% increase needs it even less; that is 7770 power, and you don't need 100GB/s for that. I have been telling you this for months: all that sh** didn't matter, in the end it's still a 7790 with 7770-like performance. The 7790 doesn't beat the 7770 because it has 2 Rasterizers, dude; it beats the 7770 because it has 4 more CUs at the same speed for a total of 1.79TF. If you kill 2 CUs and lower the speed so that flop performance falls to 7770 level, it would perform basically the same, or even worse, since performance is now 1.18TF because 10% of the GPU is still tied up.

A hard drive has lower streaming performance than physical memory, and it's worse on a 5400 RPM laptop hard drive like the PS4's.

PCs can have very fast SSDs in a RAID config, and Intel's SATA controllers are usually superior to AMD's, i.e. AMD just got their native PCI-E version 3.0 chipset with Kaveri. LOL.

Microsoft and I have done the math: 16 ROPs are sufficient for 150 GB/s-level memory bandwidth, i.e. the bottlenecks are elsewhere. You are ignoring the fact that the 7950 BE's 32 ROPs at 850 Mhz base clockspeed are superior to the 7850's 32 ROPs at 860 Mhz, and it's mostly due to the 7950 BE's superior memory bandwidth.
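For what it's worth, the "16 ROPs are sufficient" claim can be sanity-checked with a back-of-envelope fill-rate calculation. The per-pixel cost here is my assumption (32-bit color with alpha blending, so roughly 4 bytes read + 4 bytes written per pixel, ignoring depth and MSAA):

```python
rops = 16
clock_ghz = 0.853                       # X1 GPU clock after the upclock
peak_fill_gpix = rops * clock_ghz       # ~13.6 Gpixels/s peak fill rate
bytes_per_pixel = 4 + 4                 # assumed blend read + color write
bw_needed_gbs = peak_fill_gpix * bytes_per_pixel  # ~109 GB/s
```

On these assumptions ~109 GB/s sits under the ~150 GB/s figure quoted for ESRAM, so the ROPs saturate before the bandwidth does; heavier pixel formats or MSAA would change the picture.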

7770 doesn't have the option to improve with 32 MB ESRAM memory bandwidth booster and dual rasterizer units.

Rebellion's POV mirrors my POV. The prototype 7850 with 12 CUs and boosted 153.6 GB/s memory bandwidth shows what 1.3TFLOPs GCN can do. Remember, 153.6 GB/s limits 32 ROPs.

The main difference between the prototype 7850 and X1 is the handling of non-tiling 3D engines: the prototype 7850 handles legacy 3D engines better than X1.

Part of the X1's unlocking package is to enable X1's second rasterizer unit and better middleware support for eSRAM. Your 7790 is not equipped with 153.6 GB/s VRAM.

If you notice, the 7790's dual rasterizer units feed into the CU blocks, i.e. the 7790 has less of a bottleneck at this point than the 7770. The rasterizer unit is one of the important hardware blocks that turns a DSP-like** solution into a GPU.

**Also, AMD departed from the DSP's in-order processing model with an out-of-order (for wavefront/MIMD instruction) processing model. AMD's GCN Wavefront is just MIMD instruction issue, which is an evolution over the SIMD instruction issue.

In CU terms, R9-290's 40CU vs 7970's 32 CU is minor but AMD boosted rasterizer (2 vs 4) and 64 ROPs (with memory bandwidth increase) i.e. dual rasterizer units would have bottlenecked the 40 CUs.

To increase throughput, you have to minimise the bottlenecks at the front-end and back-ends (memory writes) i.e. this is what the prototype 7850 shows.

#312 Edited by the_bi99man (11024 posts) -

So much stupid... It hurts.

For the last time people, resolution and framerate aren't the only things that matter. You know what else can do 1080p and 60 fps with "room to spare"? My 5 year old laptop. What else? The Xbox 360 and Ps3. What else? The Wii U. What else? My old PC, from like 6 years ago. It depends on the game. Some games are easier to run than others. Blanket statements about consoles like "PS4 can do 1080p and 60 fps", or "X1 can't do 1080p" are fucking stupid, because there's far more to it than that. No matter how many games run 1080p/60fps on PS4, there will be games that it can't handle that well. No matter how many games don't run 1080p/60fps on X1, there will be games that can, and do. It's going to vary from game to game, as it always has and, in fact, must.

What is so hard to understand about this concept?
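The point above is easy to quantify: "1080p/60fps" only fixes the pixel budget, not the cost per pixel, which varies enormously from game to game. A quick comparison:

```python
def pixels_per_second(width, height, fps):
    """Raw pixel throughput a resolution/framerate target demands."""
    return width * height * fps

budget_1080p60 = pixels_per_second(1920, 1080, 60)  # 124,416,000 px/s
budget_720p30  = pixels_per_second(1280, 720, 30)   #  27,648,000 px/s
ratio = budget_1080p60 / budget_720p30              # 4.5x the throughput
```

Whether a given machine hits 1080p/60 depends on whether a game's shading cost per pixel fits in that 4.5x-larger budget, which is exactly why the answer varies title by title.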

#313 Posted by WilliamRLBaker (28309 posts) -

Then why aren't all games 1080p/60fps on the ps4?

#314 Posted by gpuking (2653 posts) -

I think he meant running GZ, PS4 being 1080p/60fps with room to spare compared to Xbone which could well be 720p/45fps. We all know any console can run any res as long as it's willing to sacrifice other things.

But evidently it shows how underpowered the Bone is when compared to PS4.

#315 Edited by Evo_nine (1573 posts) -

Hmmm....stupid comment considering the PS4 has no games that run 1080p 60fps lol

Buy a PC mr kojima

#316 Posted by StormyJoe (4561 posts) -

@StormyJoe:

I always feel silly when I read your posts. A bit embarrassed too. Oh well, I will learn to live with it. Got to search a lot of crap to find the nuggets.

If my posts go above your head, that is not my problem.

#317 Posted by kinectthedots (1477 posts) -

yet no graphically intensive games are at 1080p and 60fps on the PS4. Just more marketing BS that console makers like to say.

Kojima is a console maker? Damn lemmings are really stupid right now.

#318 Edited by KillzoneSnake (1623 posts) -

@Evo_nine said:

Hmmm....stupid comment considering the PS4 has no games that run 1080p 60fps lol

Buy a PC mr kojima

Funny, I'm currently playing Outlast at 1080/60 and next month MGSV at 1080/60. So great to own a PS4.

#319 Edited by HiraiKazuo (289 posts) -

@Evo_nine said:

Hmmm....stupid comment considering the PS4 has no games that run 1080p 60fps lol

Buy a PC mr kojima

Then why aren't all games 1080p/60fps on the ps4?

@xhawk27 said:

The PS4 is a ghetto mid PC at best. Kojima is an idiot. hahaha

Butthurt lem defense force. And only 1 of them owns an Xbox One, how sad.

#320 Edited by ronvalencia (15109 posts) -

@tormentos said:


@freedomfreak said:

But Forza 5 runs at 1080P@60FPS.

“It was clearly a bit more complicated to extract the maximum power from the Xbox One when you’re trying to do that. I think eSRAM is easy to use. The only problem is…Part of the problem is that it’s just a little bit too small to output 1080p within that size. It’s such a small size within there that we can’t do everything in 1080p with that little buffer of super-fast RAM.

ESRAM is easy to use, but it is not enough, period. Doing tiling is a problem, and it's the reason almost no xbox 360 games had 4xAA at 720p like MS advertised before the xbox 360 launched: 10MB was too small for 4xAA at 720p, so they had to tile. The end result we already know: almost no games on xbox 360 had 4xAA, even though MS advertised it that way before launch.

So yeah, all generation long the xbox one will struggle with 1080p. Not only is it weaker than the PS4, it also has the ESRAM size limitation. This is not new; it has been said for almost a year now, and MS and its people refuse to admit it. Hell, Ronvalencia even argued 32MB was enough for 1080p when it clearly isn't.

Most developers will just not care and will not go through all that trouble, because in the end, no matter what, the xbox one will still be weaker, period; it has a considerable gap in power.

The PS4's bandwidth is faster than ESRAM's. ESRAM has 140GB/s to 150GB/s; that is what MS itself stated can realistically be achieved, and I am sure that number is the very best scenario, which will not always be the case.

There is no catching up, and only a truly blind fanboy will believe otherwise. If the xbox one had the same bandwidth the PS4 has, it would do 1080p, but always more slowly, because that is a limitation imposed by having a weaker GPU, and there is no workaround for that. There is no catching up.

Here, the xbox one GPU, when the 8% is unlocked, will have the same performance as the 7770, and the PS4 is a little stronger than a 7850. Look at the difference between both cards: it's 13 FPS with the same effects. The difference between the PS4 and xbox one in that game is now 21 FPS, and at some points 30 FPS, while also having better textures and better effects.

The gap is not going away.
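The ESRAM-size argument being fought over here is simple arithmetic. A rough sketch, where the render-target layouts are my assumptions rather than any shipped game's: the 360's 10 MiB EDRAM vs a 720p/4xMSAA target, and X1's 32 MiB ESRAM vs a typical deferred G-buffer at 1080p:

```python
def target_mib(width, height, bytes_per_pixel, samples=1):
    """Render-target size in MiB for a given format and MSAA level."""
    return width * height * bytes_per_pixel * samples / 2**20

# Xbox 360: 720p, 4xMSAA, 32-bit color + 32-bit depth per sample
x360 = target_mib(1280, 720, 4 + 4, samples=4)  # ~28.1 MiB > 10 MiB EDRAM
# Xbox One: 1080p deferred G-buffer, assumed 4 RTs @ 4 B/px + 4 B depth
x1 = target_mib(1920, 1080, 4 * 4 + 4)          # ~39.6 MiB > 32 MiB ESRAM
```

Hence tiling on the 360, and why fitting a full 1080p deferred pipeline in ESRAM takes careful buffer layout or partial residency rather than just dropping everything in.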

You're using Titan-era drivers and a Feb 2013 build of Tomb Raider (2013). My graph uses R7-265-era drivers and a Feb 2014 build of Tomb Raider (2013).

http://www.guru3d.com/articles_pages/amd_radeon_r7_265_review,11.html

Radeon HD R7-265 = 1.894 TFLOPS with 179 GB/s (5600Mhz GDDR5 256bit) VRAM.

Radeon HD R7-260X = 1.971 TFLOPS with 104 GB/s (6500Mhz GDDR5 128bit) VRAM.

With roughly similar TFLOPS, the difference between the Radeon HD R7-265 and Radeon HD R7-260X is 10 fps, i.e. the Radeon HD R7-265 has faster memory reads and writes.
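The two bandwidth figures quoted above follow directly from the memory specs, i.e. effective GDDR5 data rate times bus width in bytes:

```python
def vram_gbs(effective_mt_s, bus_bits):
    """Peak VRAM bandwidth in GB/s: data rate (MT/s) x bus width (bytes)."""
    return effective_mt_s * (bus_bits // 8) / 1000.0

r7_265  = vram_gbs(5600, 256)   # 179.2 GB/s
r7_260x = vram_gbs(6500, 128)   # 104.0 GB/s
```

Same arithmetic gives the other numbers in the thread, e.g. the PS4's 5500 MT/s on a 256-bit bus works out to 176 GB/s.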

There's no PC SKU that replicates X1's 12 CU at 853 Mhz with 68GB/s + 32MB ESRAM, except for the prototype 7850 with 12 CU at 860Mhz and 153.6 GB/s memory bandwidth.

PS; Mantle gives a small uplift for R7-265 into near R9-270 level. With high-end Intel CPU (e.g. Core i7 3960 Extreme "Sandy Bridge-E" @ 4.6 GHz on all six cores), R9-290X doesn't require Mantle.

--------------

PS; I have applied R9-290X BIOS on my R9-290 i.e. enables 1Ghz turbo mode.

#321 Edited by Alcapello (507 posts) -

Then why aren't all games 1080p/60fps on the ps4?

Multiplats, older game engines, open-world developer choices, etc...nothing really to do with hardware, as the PS4 is more than capable. It's basically a beast.

#322 Edited by ronvalencia (15109 posts) -
@silversix_ said:

@Strutten said:

@silversix_: Read the article. If you read the whole thing, he is actually saying he is certain that x1 will catch up with ps4. None of that bs he spews out is remotely true (beyond me how come most trolls are still here). I sincerely hope you guys don't react like this outside sw.

It won't catch up. Do you know what an HD7770 is? There's no progress. The only thing that will happen is PS4 surpassing the Bone even more than it does now. The Order. Watch for The Order, lol; in the first year ps4 will already have a title that isn't possible on the bone... imagine in 3 years.

Radeon HD 7770 with one Rasterizer unit.

Radeon HD 7790 with two Rasterizer units.

http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"In addition to asynchronous compute queues, the Xbox One hardware supports two concurrent render pipes," Goossen pointed out. "The two render pipes can allow the hardware to render title content at high priority while concurrently rendering system content at low priority. The GPU hardware scheduler is designed to maximise throughput and automatically fills 'holes' in the high-priority processing. This can allow the system rendering to make use of the ROPs for fill, for example, while the title is simultaneously doing synchronous compute operations on the compute units."

Microsoft is unlocking the second render pipe. A single Rasterizer unit and 68 GB/s bandwidth with less-than-optimal ESRAM usage is similar to the 7770's single Rasterizer unit and 72 GB/s bandwidth setup.

--------------

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines. The Xbox One is a bit more multimedia, a bit more hub-centric so its a bit more complex. There’s stuff you can and can’t do because it’s a sort of multimedia hub. PS4 doesn’t have that. PS4 is just a games machine.”

#323 Edited by tormentos (16405 posts) -

You're using Titan-era drivers and a Feb 2013 build of Tomb Raider (2013). My graph uses R7-265-era drivers and a Feb 2014 build of Tomb Raider (2013).

http://www.guru3d.com/articles_pages/amd_radeon_r7_265_review,11.html

Radeon HD R7-265 = 1.894 TFLOPS with 179 GB/s (5600Mhz GDDR5 256bit) VRAM.

Radeon HD R7-260X = 1.971 TFLOPS with 104 GB/s (6500Mhz GDDR5 128bit) VRAM.

With roughly similar TFLOPS, the difference between the Radeon HD R7-265 and Radeon HD R7-260X is 10 fps, i.e. the Radeon HD R7-265 has faster memory reads and writes.

There's no PC SKU that replicates X1's 12 CU at 853 Mhz with 68GB/s + 32MB ESRAM, except for the prototype 7850 with 12 CU at 860Mhz and 153.6 GB/s memory bandwidth.

--------------

PS; I have applied R9-290X BIOS on my R9-290 i.e. enables 1Ghz turbo mode.

What is your point? Once again you're spinning.

The R7-265 has nothing to do here, and neither does the R7-260X.

Oh please dude, stop with the excuses. Consoles will not get better drivers to improve games already released; those are rare.

The xbox one doesn't have 1.9TF; it has 1.18TF, and after the release of the 8% it will have 1.28, which is exactly what the 7770 has. Regardless of the bandwidth, it will be limited by its GPU performance, which is low.

The xbox one doesn't have 12 CU; it has 14 CU. It is a 7790 with 2 CUs disabled, just like the PS4 is a 7870 with 2 CUs disabled. This was already proven. I remember how you used to defend the xbox one version of Ghosts not being 720p before it was confirmed; you basically ignore any evidence posted to you, and in the end Ghosts was confirmed to be 720p like many claimed.

MS already confirmed 14 CUs with 2 disabled for redundancy.

The prototype doesn't have 16 ROPs, nor does it have ESRAM, and the xbox one is not based on Pitcairn.

Like always, all you say is irrelevant.
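The TFLOPS figures being argued over come straight from GCN's structure (64 lanes per CU, 2 FLOPs per lane per clock via fused multiply-add); the numbers below are the standard peak-rate arithmetic:

```python
def gcn_tflops(cus, clock_ghz, lanes_per_cu=64, flops_per_lane=2):
    """Peak single-precision TFLOPS for a GCN GPU."""
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000.0

x1_before = gcn_tflops(12, 0.800)  # ~1.23 TFLOPS
x1_after  = gcn_tflops(12, 0.853)  # ~1.31 TFLOPS after the upclock
ps4       = gcn_tflops(18, 0.800)  # ~1.84 TFLOPS
hd7770    = gcn_tflops(10, 1.000)  # 1.28 TFLOPS
```

The 1.18/1.28 figures tossed around in the thread are in this ballpark; the exact peaks work out to ~1.23 and ~1.31 TFLOPS for X1, and 800 MHz to 853 MHz is a ~6.6% bump rather than a strict 8%.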

#324 Edited by ronvalencia (15109 posts) -

@tormentos said:

@ronvalencia said:

You're using Titan-era drivers and a Feb 2013 build of Tomb Raider (2013). My graph uses R7-265-era drivers and a Feb 2014 build of Tomb Raider (2013).

http://www.guru3d.com/articles_pages/amd_radeon_r7_265_review,11.html

Radeon HD R7-265 = 1.894 TFLOPS with 179 GB/s (5600Mhz GDDR5 256bit) VRAM.

Radeon HD R7-260X = 1.971 TFLOPS with 104 GB/s (6500Mhz GDDR5 128bit) VRAM.

With roughly similar TFLOPS, the difference between the Radeon HD R7-265 and Radeon HD R7-260X is 10 fps, i.e. the Radeon HD R7-265 has faster memory reads and writes.

There's no PC SKU that replicates X1's 12 CU at 853 Mhz with 68GB/s + 32MB ESRAM, except for the prototype 7850 with 12 CU at 860Mhz and 153.6 GB/s memory bandwidth.

--------------

PS; I have applied R9-290X BIOS on my R9-290 i.e. enables 1Ghz turbo mode.

What is your point? Once again you're spinning.

The R7-265 has nothing to do here, and neither does the R7-260X.

Oh please dude, stop with the excuses. Consoles will not get better drivers to improve games already released; those are rare.

The xbox one doesn't have 1.9TF; it has 1.18TF, and after the release of the 8% it will have 1.28, which is exactly what the 7770 has. Regardless of the bandwidth, it will be limited by its GPU performance, which is low.

The xbox one doesn't have 12 CU; it has 14 CU. It is a 7790 with 2 CUs disabled, just like the PS4 is a 7870 with 2 CUs disabled. This was already proven. I remember how you used to defend the xbox one version of Ghosts not being 720p before it was confirmed; you basically ignore any evidence posted to you, and in the end Ghosts was confirmed to be 720p like many claimed.

MS already confirmed 14 CUs with 2 disabled for redundancy.

The prototype doesn't have 16 ROPs, nor does it have ESRAM, and the xbox one is not based on Pitcairn.

Like always, all you say is irrelevant.

My point is that R7-265's 32 ROPs vs R7-260X's 16 ROPs doesn't equal 2X frame rates, and the impact is minimal at 1920x1080.

Notice the R9-280X's fps result with its 32 ROPs, i.e. its 32 ROPs are still able to drive high fps.

PS4's 32 ROPs don't operate at the same level as R9-280X's 32 ROPs; PS4's 32 ROPs operate nearly like the 7790's 16 ROPs, i.e. R7-265 ~= PS4. Again, R7-265's 32 ROPs' impact is minimal at 1920x1080.

The only time 32 ROPs might get useful is at higher MSAA settings with matching memory bandwidth, i.e. AMD's ROPs contain MSAA processors.

Your code-name arguments don't address the common building blocks that form any GCN GPU.

As for COD Ghost... If you nuke X1's second rasterizer unit and run mostly on 68GB/s VRAM, you get 7770 like result. Without tiling/AMD PRT, the GPU's TMUs will not be effectively fetching textures from 32 MB ESRAM i.e. it's too small for non-tiling 3D engines.

AMD PRT/Tiling+32MB ESRAM combo is a workaround to replicate the prototype 7850's simple 153.6 GB/s memory bandwidth. X1 has a very specific/narrow path for a prototype 7850 (12 CU/1.3 TFLOPS/153.6 GB/s VRAM) like replication and 7770 doesn't have that option.

Unlike you, I can resolve both COD Ghost's like results and Rebellion like statements.
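A toy sketch of the partially-resident-texture idea invoked above (the 64 KiB tile size matches PRT/tiled-resources pages; the texture dimensions and tile counts are made up for illustration): only the tiles actually sampled in a frame need to live in fast memory.

```python
TILE_BYTES = 64 * 1024           # PRT/tiled-resources page size

def resident_mib(sampled_tile_ids):
    """Fast-memory footprint: one 64 KiB page per distinct tile touched."""
    return len(set(sampled_tile_ids)) * TILE_BYTES / 2**20

# An 8192x8192 texture at 4 B/texel would be 256 MiB fully resident...
full_texture_mib = 8192 * 8192 * 4 / 2**20
# ...but if a frame only samples, say, 150 distinct tiles:
frame_working_set = resident_mib(range(150))   # ~9.4 MiB
```

That is how a small fast pool like 32 MiB ESRAM can service textures far larger than itself, provided the engine is written to stream tiles on demand.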

#325 Posted by 560ti (153 posts) -

@WilliamRLBaker said:

Then why aren't all games 1080p/60fps on the ps4?

nothing really to do with hardware, as the PS4 is more than capable. It's basically a beast.

I wouldn't go that far. It's using a very low-end CPU (even by the standards of 2-3 years ago) and a GPU that was midrange 2 years ago...... (it's not awful hardware, but hardly a beast).

This is the first gen in a long-ass time where the consoles are actually using low-end hardware at launch..... (wouldn't classify that as beast). We waited 8 freaking years for a new console in a very drawn-out cycle and they give us low-end hardware (once again, it's not awful but far from beastly, ESPECIALLY if they draw this generation out again).

#326 Posted by ronvalencia (15109 posts) -

@560ti:

Radeon HD R7-265 is the nearest PS4 GPU solution.

#327 Edited by 560ti (153 posts) -

@ronvalencia said:

@560ti:

Radeon HD R7-265 is the nearest PS4 GPU solution.

I wouldn't doubt that (10% slower than a midrange 7870 of two years ago).

#328 Posted by tormentos (16405 posts) -

My point is that R7-265's 32 ROPs vs R7-260X's 16 ROPs doesn't equal 2X frame rates, and the impact is minimal at 1920x1080.

Notice the R9-280X's fps result with its 32 ROPs, i.e. its 32 ROPs are still able to drive high fps.

PS4's 32 ROPs don't operate at the same level as R9-280X's 32 ROPs; PS4's 32 ROPs operate nearly like the 7790's 16 ROPs, i.e. R7-265 ~= PS4. Again, R7-265's 32 ROPs' impact is minimal at 1920x1080.

The only time 32 ROPs might get useful is at higher MSAA settings with matching memory bandwidth, i.e. AMD's ROPs contain MSAA processors.

Your code-name arguments don't address the common building blocks that form any GCN GPU.

As for COD Ghost... If you nuke X1's second rasterizer unit and run mostly on 68GB/s VRAM, you get 7770 like result. Without tiling/AMD PRT, the GPU's TMUs will not be effectively fetching textures from 32 MB ESRAM i.e. it's too small for non-tiling 3D engines.

AMD PRT/Tiling+32MB ESRAM combo is a workaround to replicate the prototype 7850's simple 153.6 GB/s memory bandwidth. X1 has a very specific/narrow path for a prototype 7850 (12 CU/1.3 TFLOPS/153.6 GB/s VRAM) like replication and 7770 doesn't have that option.

Unlike you, I can resolve both COD Ghost's like results and Rebellion like statements.

Oh please dude, stop with the excuses...

Here are some facts:

BF4: 900p on PS4, 10FPS faster across the board.

Ghosts: 1080p native on PS4, 720p on xbox one.

AC4: 1080p on PS4, 900p on xbox one, with worse AA and worse image quality on xbox one.

Tomb Raider: 1080p, up to 60 FPS unlocked, a 20 to 30 FPS advantage over the xbox one; lower quality textures on xbox one, lower quality effects on xbox one, effects at half resolution on xbox one, 900p cutscenes on xbox one.

Funny how, instead of decreasing, the gap actually widened when the xbox one ran at 1080p like the PS4. Oh, and Tomb Raider came after launch, so it had more time to be made. Did I mention Tomb Raider is a DirectX game? Which means it is easier to port to another DirectX machine than to the PS4, where the code has to be rewritten to OpenGL.

This ^^^ is all I care about: the end product, the end result. They show an up-to-30FPS gap in performance while also having better effects and textures. That is huge and contradicts all your arguments.

#329 Edited by silversix_ (13546 posts) -

@silversix_ said:

@Strutten said:

@silversix_: Read the article. If you read the whole thing, he is actually saying he is certain that x1 will catch up with ps4. None of that bs he spews out is remotely true (beyond me how come most trolls are still here). I sincerely hope you guys don't react like this outside sw.

It won't catch up. Do you know what an HD7770 is? There's no progress. The only thing that will happen is PS4 surpassing the Bone even more than it does now. The Order. Watch for The Order, lol; in the first year ps4 will already have a title that isn't possible on the bone... imagine in 3 years.

Radeon HD 7770 with one Rasterizer unit.

Radeon HD 7790 with two Rasterizer units.

http://www.eurogamer.net/articles/digitalfoundry-microsoft-to-unlock-more-gpu-power-for-xbox-one-developers

"In addition to asynchronous compute queues, the Xbox One hardware supports two concurrent render pipes," Goossen pointed out. "The two render pipes can allow the hardware to render title content at high priority while concurrently rendering system content at low priority. The GPU hardware scheduler is designed to maximise throughput and automatically fills 'holes' in the high-priority processing. This can allow the system rendering to make use of the ROPs for fill, for example, while the title is simultaneously doing synchronous compute operations on the compute units."

Microsoft is unlocking the second render pipe. A single Rasterizer unit and 68 GB/s bandwidth with less-than-optimal ESRAM usage is similar to the 7770's single Rasterizer unit and 72 GB/s bandwidth setup.

--------------

http://gamingbolt.com/xbox-ones-esram-too-small-to-output-games-at-1080p-but-will-catch-up-to-ps4-rebellion-games#SBX8QyXmrlJEyBW1.99

Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines. The Xbox One is a bit more multimedia, a bit more hub-centric so its a bit more complex. There’s stuff you can and can’t do because it’s a sort of multimedia hub. PS4 doesn’t have that. PS4 is just a games machine.”

ARE YOU FUCKING SERIOUS WITH YOUR LINKS EVERY TIME YOU QUOTE ME? GTFO. I'd throw a rock at you if it was possible.

#330 Posted by highking_kallor (445 posts) -
#331 Edited by ronvalencia (15109 posts) -

@tormentos said:

@ronvalencia said:

My point between R7-265's 32 ROPS and R9-260X's 16 ROPS doesn't equal 2X frame rates and it's impact is minimal at 1920x1080p.

Notice R9-280X's fps result with it's 32 ROPS i.e. it's 32 ROPS is still able to drive high fps.

PS4's 32 ROPS doesn't operate at same level as R9-280X's 32 ROPS i.e. PS4's 32 ROPS operates nearly like 7790's 16 ROPS i.e. R7-265 ~= PS4. Again,R7-265's 32 ROPS impact is minimal at 1920x1080p.

The only time 32 ROPS might get useful is higher MSAA settings with matching memory bandwidth i.e. AMD's ROPS contains MSAA processors.

Your code name arguments doesn't address the common building blocks that forms any GCN GPUs.

As for COD Ghost... If you nuke X1's second rasterizer unit and run mostly on 68GB/s VRAM, you get 7770 like result. Without tiling/AMD PRT, the GPU's TMUs will not be effectively fetching textures from 32 MB ESRAM i.e. it's too small for non-tiling 3D engines.

AMD PRT/Tiling+32MB ESRAM combo is a workaround to replicate the prototype 7850's simple 153.6 GB/s memory bandwidth. X1 has a very specific/narrow path for a prototype 7850 (12 CU/1.3 TFLOPS/153.6 GB/s VRAM) like replication and 7770 doesn't have that option.

Unlike you, I can resolve both COD Ghost's like results and Rebellion like statements.

Oh please dude, stop with the excuses...

Here are some facts:

BF4: 900p on PS4, 10FPS faster across the board.

Ghosts: 1080p native on PS4, 720p on xbox one.

AC4: 1080p on PS4, 900p on xbox one, with worse AA and worse image quality on xbox one.

Tomb Raider: 1080p, up to 60 FPS unlocked, a 20 to 30 FPS advantage over the xbox one; lower quality textures on xbox one, lower quality effects on xbox one, effects at half resolution on xbox one, 900p cutscenes on xbox one.

Funny how, instead of decreasing, the gap actually widened when the xbox one ran at 1080p like the PS4. Oh, and Tomb Raider came after launch, so it had more time to be made. Did I mention Tomb Raider is a DirectX game? Which means it is easier to port to another DirectX machine than to the PS4, where the code has to be rewritten to OpenGL.

This ^^^ is all I care about: the end product, the end result. They show an up-to-30FPS gap in performance while also having better effects and textures. That is huge and contradicts all your arguments.

You didn't read Rebellion's post; they are very specific about the optimization path.

The key part is to get the GPU's TMUs reading from ESRAM instead of from main memory, i.e. the engine has to be reprogrammed for AMD PRT/tiling tricks. Without this optimization path, X1's results would be like a 7770's.

ALU-bound games wouldn't be helped by AMD PRT/tiling tricks or the second rasterizer unit, but AMD PRT/tiling/ESRAM tricks help the CUs' TMUs, and TMU performance has influence on ALU performance.

While Tomb Raider is a DirectX game, is it a DirectX 11.2 game with tiled-resource support?

I don't see any DirectX 11.2 games. Rebellion's new game with tiling features seems to be a DirectX 11.2-type game.

#332 Edited by tormentos (16405 posts) -

You didn't read Rebellion's post; they are very specific about the optimization path.

The key part is to get the GPU's TMUs reading from ESRAM instead of from main memory, i.e. the engine has to be reprogrammed for AMD PRT/tiling tricks. Without this optimization path, X1's results would be like a 7770's.

ALU-bound games wouldn't be helped by AMD PRT/tiling tricks or the second rasterizer unit, but AMD PRT/tiling/ESRAM tricks help the CUs' TMUs, and TMU performance has influence on ALU performance.

While Tomb Raider is a DirectX game, is it a DirectX 11.2 game with tiled-resource support?

I don't see any DirectX 11.2 games. Rebellion's new game with tiling features seems to be a DirectX 11.2-type game.

Yeah, how many times have you selectively quoted developers saying the gap wasn't big? How many times did you quote Activision saying Ghosts would be 1080p on Xbox One? Developers are always diplomatic when it comes to hardware.

He says the Xbox One will catch up, which is impossible; the 7770 will never catch the 7850 no matter what developers do. This is a fact and you know it well.

It is easy to see ESRAM was being used: 68 GB/s from the main memory isn't enough to feed the GPU and CPU, and I already proved there isn't a second 30 GB/s line between the CPU and GPU on Xbox One. The 30 GB/s is used by the CPU and system, shared with the GPU as well, and eats from the main pool.

It doesn't matter if it is a tiled-resources game or not; it is a DirectX game, which should be easier to code for a DirectX console, unlike the PS4 version, which has to be recoded for OpenGL, making it more difficult. And still the PS4 version shows a huge gap, not only in frames but also in quality of textures and effects.

Tiling is a problem and a headache; it's the reason most games on 360 aren't 720p with 4xAA.

The gap is huge, up to a 100% difference in frames, with better-quality textures and effects, and always 1080p, unlike the Xbox One version, which switches to 900p.
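For what it's worth, the 360 tiling headache is simple arithmetic. With the usual assumptions (32-bit color plus 32-bit depth/stencil per sample; this is my own illustration, not from the thread), a 720p 4xMSAA framebuffer blows past the 360's 10 MB EDRAM:

```python
import math

W, H = 1280, 720
SAMPLES = 4                       # 4x MSAA
BYTES_PER_SAMPLE = 4 + 4          # 32-bit color + 32-bit depth/stencil
EDRAM = 10 * 1024 * 1024          # Xbox 360's 10 MB of EDRAM

framebuffer = W * H * SAMPLES * BYTES_PER_SAMPLE  # 29,491,200 bytes (~28.1 MB)
tiles = math.ceil(framebuffer / EDRAM)            # scene must be rendered in 3 tiles
print(framebuffer, tiles)
```

Rendering the scene three times over (once per tile) is exactly the cost that pushed many 360 titles to sub-720p or 2xAA instead.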

#333 Posted by tormentos (16405 posts) -

The whole point of Mantle is to lower the overhead of the CPU communicating with the GPU, freeing up CPU resources and letting the CPU feed data to the GPU better. That is why Intel-based systems with AMD GPUs see little to no difference: those CPUs already have the processing power to feed the GPUs. CPUs matter depending on what you are doing, and when games require a lot of non-GPU workloads, things like physics and many AI NPCs, we will see these console CPUs choke, as they already do in BF4. Also, the consoles only have 6 cores usable for games, their architecture is weaker than CPUs from 6 years ago, and their clock rates are low on top of that.

It's unknown how many cores the PS4 is actually using for games. The 2-core reservation came from an old Killzone demo done on unfinished hardware; it was never confirmed by Sony.

The PS4 doesn't need an i7; it is not running Windows, and the PS4 can use its GPU for heavy compute without hurting the graphics.

I remember when I posted benchmarks: you were one of those who claimed the PS4 would be 5 to 10 frames faster than the Xbox One at the same resolution. Look at how big the gap has been, up to 100% in Tomb Raider, while also having better textures and better effects. lol

#334 Posted by killzowned24 (7286 posts) -

@Strutten said:

@R3FURBISHED: Dude, let it go, it's rabid cows... even the article is way out of context... just read the article or my comment above. Cows only read the topic title and jump the gun instantly... they were worried six months ago but not anymore, since the updates and new SDK... anyway, PS might still be stronger overall, but this hatred/bashing is way out of hand...

Saw how great Quantum Break looks. Not worried about trolling cows. Of course, I'm a manticore myself; I just don't have the money for all the systems. I will get a PS4 at some point in the future, but for right now I'm happy with my Xbone and 360 purchase.

I wouldn't put too much faith in devs who managed to have the lowest resolution game on 360.

#335 Edited by 04dcarraher (18933 posts) -

@tormentos:

It was confirmed by Sony that the OS and features allocate 2 cores and 3.5 GB of memory. Even if all eight cores were available for games, they would still be slower than 5-year-old 3 GHz quad cores.

When the PS4's GPU has to do heavy compute loads such as real-time physics, it does hurt the bottom line in graphics and performance, so the PS4 cannot allocate those cycles without losing something in the end. Tomb Raider's TressFX was toned down in the new console version, and even so there is massive frame fluctuation in the PS4 version.

The main problem with the X1 is that developers have to use the ESRAM to overcome the DDR3, and when they don't use it you see the massive differences in resolution and performance. Also, the X1's GPU has half the ROPs and is overall thirty-some percent slower, which hurts its ability to render games at the same levels as the PS4. That is why in games like BF4, where both consoles are being pushed fully, you see a 30% difference in resolution and slightly higher FPS averages, and in multiplayer both consoles suffer because of their CPUs.

Tomb Raider on both consoles runs at 1080p. The X1 runs the game near the set standard of 30 fps while only dipping to 24, while the PS4 framerate is all over the place, hitting a max of 60, averaging the low 50s, and dropping to 33 fps. Going from 50-60 fps down to the low 30s is worse than staying at 30 and dropping to the mid 20s. Also, let's not forget that two different developers tackled the game, one per console; you cannot really gauge the results, because the quality of the coding could favor the PS4 given its straightforward design versus the X1's.

Even in AC4, both consoles are locked at 30 fps while the PS4 runs at 1080p and the X1 at 900p, showing only roughly a thirty-some percent difference in rendering ability.

We all know the PS4 is the stronger console; what matters is how well developers code for and use each system's strengths and weaknesses. The X1 has drawn the short straw because of the need to use the ESRAM and the fact that its GPU is more than a third slower than the PS4's. It's sort of the same thing that happened with the PS3, where developers didn't code for and use all of the PS3's resources. But the X1 will still never see results equal to the PS4's, because of the processing-power difference between the GPUs.
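The "thirty-some percent" figures being argued over can be sanity-checked with standard GCN arithmetic (64 lanes per CU, 2 FLOPs per lane per clock for FMA; the 853 MHz and 800 MHz clocks are the commonly reported specs). A quick sketch:

```python
def gflops(cus, clock_ghz):
    # GCN: 64 lanes per CU, 2 FLOPs per lane per clock (fused multiply-add)
    return cus * 64 * 2 * clock_ghz

x1  = gflops(12, 0.853)   # ~1310 GFLOPS
ps4 = gflops(18, 0.800)   # ~1843 GFLOPS
print(ps4 / x1)           # ~1.41x compute advantage for PS4

# The resolution gap in BF4/AC4 terms:
pixels_1080p = 1920 * 1080
pixels_900p  = 1600 * 900
print(pixels_1080p / pixels_900p)  # 1.44x more pixels at 1080p
```

The ~1.4x compute ratio lining up with the ~1.44x pixel ratio is a plausible reading of why the multiplatform gap keeps landing around 900p vs 1080p, though real performance depends on far more than peak FLOPS.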

#336 Posted by tormentos (16405 posts) -

@tormentos:

It was confirmed by Sony that the OS and features allocate 2 cores and 3.5 GB of memory. Even if all eight cores were available for games, they would still be slower than 5-year-old 3 GHz quad cores.

When the PS4's GPU has to do heavy compute loads such as real-time physics, it does hurt the bottom line in graphics and performance, so the PS4 cannot allocate those cycles without losing something in the end. Tomb Raider's TressFX was toned down in the new console version, and even so there is massive frame fluctuation in the PS4 version.

The main problem with the X1 is that developers have to use the ESRAM to overcome the DDR3, and when they don't use it you see the massive differences in resolution and performance. Also, the X1's GPU has half the ROPs and is overall thirty-some percent slower, which hurts its ability to render games at the same levels as the PS4. That is why in games like BF4, where both consoles are being pushed fully, you see a 30% difference in resolution and slightly higher FPS averages, and in multiplayer both consoles suffer because of their CPUs.

Tomb Raider on both consoles runs at 1080p. The X1 runs the game near the set standard of 30 fps while only dipping to 24, while the PS4 framerate is all over the place, hitting a max of 60, averaging the low 50s, and dropping to 33 fps. Going from 50-60 fps down to the low 30s is worse than staying at 30 and dropping to the mid 20s. Also, let's not forget that two different developers tackled the game, one per console; you cannot really gauge the results, because the quality of the coding could favor the PS4 given its straightforward design versus the X1's.

Even in AC4, both consoles are locked at 30 fps while the PS4 runs at 1080p and the X1 at 900p, showing only roughly a thirty-some percent difference in rendering ability.

We all know the PS4 is the stronger console; what matters is how well developers code for and use each system's strengths and weaknesses. The X1 has drawn the short straw because of the need to use the ESRAM and the fact that its GPU is more than a third slower than the PS4's. It's sort of the same thing that happened with the PS3, where developers didn't code for and use all of the PS3's resources. But the X1 will still never see results equal to the PS4's, because of the processing-power difference between the GPUs.

Please link me to it, because the whole "2 cores only" claim was stated by DF and taken from a Guerrilla post-mortem, and Sony says up to 5 GB can be used for games. Oh, and the OS reservation can shrink; it was implemented just in case MS had something big that needed those kinds of resources. The PS3 suffered big time because Sony reduced the OS too much and it wasn't enough for party chat, so this time Sony played it safe.

Please prove to me that Tomb Raider is not using ESRAM. In fact, there isn't a single game on Xbox One that doesn't use ESRAM; 68 GB/s from the main DDR3 pool isn't enough for the GPU, let alone for the GPU and CPU.

Oh please, the PS4 dropped to 33 FPS just once while the Xbox One was at 24. The PS4 version never goes under 30 and averages 50 FPS, which is 20 more than the Xbox One; that alone kills your whole argument and proves you wrong. Worse, cut-scenes on Xbox One are 900p, textures are half resolution in many places, effects are done at half resolution in many places, and several effects are lower quality. That is a huge, huge blow.

Considering you claimed the difference would be 5 to 10 FPS, that says a lot. Oh, and Tomb Raider being a DirectX game makes it even worse, because it has to be recoded for OpenGL on PS4 and it still performs better on PS4.

#337 Edited by 04dcarraher (18933 posts) -

Sony’s PlayStation 4 design docs reveal that the "PlayStation 4 OS uses two out of the 8 cores, and a whopping 3.5 GB of RAM." "Also, 2 of the Jaguar CPU's 8 cores are always going to be off limits as well, meaning developers will only be able to use 75% of the CPU's resources." That 512 MB of flexible memory that brings the PS4's pool to 5 GB is a paging file. The OS and features will not shrink to the degree the PS3's OS did, because of the complexity and all the features embedded into it.

Please prove that the one dev spent as much time and effort coding TR as the other dev did for the PS4. Fact is, even a piss-poor 7770 running the PC version can run TR beyond 34 fps, averaging 40 fps at 1920x1200 with FXAA.

You missed the point: going from 50-60 and rapidly into the 40s and 30s is worse than going from 30 to 25, which means they should have set a framerate limit on PS4. And again, the X1 runs TR at native 1080p: "Tomb Raider: Definitive Edition both run at native 1080p." Also, saying the X1 is using half-resolution effects is BS; the differences are slight at best. "Inconsistency in the PS4 experience": the massive framerate drops are an issue with the PS4.

I never claimed 5-10 fps at the same resolution... You have me mixed up with someone else, I believe, since Crypt_MX is the one who stated the 5-10 fps: "The REAL power difference can be best judged by comparing the two lowest framerates which are 24 and 33, giving the PS4 a 9fps advantage (Which is what I've been guessing since day one, PS4=5-10 more fps)."

Just like you claim I said the PS4 won't use more than 4 GB, when in fact I was saying the GPU will not use 4 GB but only 2-3 GB. And every time, you ignore that fact.

#338 Posted by xdluffy (23 posts) -

1080p and 60fps is nice but hardly necessary. I have an Xbox One, and as long as it runs all the multiplats, even if they're a bit under 1080p, I'm good; it's not like the PS4 versions look loads better, the difference is barely noticeable, and besides, if you're playing for graphics, get a PC, not a PS4. Also, comparing games like AC4 and BF4: the PS4 runs them at 1080p yet they look better on the Xbox One, which kinda says something about how overblown the whole 1080p thing is. At the end of the day it's about the exclusives for me, and PlayStation just doesn't hold a candle to the Xbox titles. You guys have the best indies though, so give yourselves a pat on the back.

#339 Posted by GrenadeLauncher (3164 posts) -

Nice, getting slowly more hyped for this. Always loved MGS.

@xdluffy said:

1080p and 60fps is nice but hardly necessary. I have an Xbox One, and as long as it runs all the multiplats, even if they're a bit under 1080p, I'm good; it's not like the PS4 versions look loads better, the difference is barely noticeable, and besides, if you're playing for graphics, get a PC, not a PS4. Also, comparing games like AC4 and BF4: the PS4 runs them at 1080p yet they look better on the Xbox One, which kinda says something about how overblown the whole 1080p thing is. At the end of the day it's about the exclusives for me, and PlayStation just doesn't hold a candle to the Xbox titles. You guys have the best indies though, so give yourselves a pat on the back.

Here's how a delusional lemming deals with grief.

#340 Posted by tormentos (16405 posts) -

Sony’s PlayStation 4 design docs reveal that the "PlayStation 4 OS uses two out of the 8 cores, and a whopping 3.5 GB of RAM." "Also, 2 of the Jaguar CPU's 8 cores are always going to be off limits as well, meaning developers will only be able to use 75% of the CPU's resources." That 512 MB of flexible memory that brings the PS4's pool to 5 GB is a paging file. The OS and features will not shrink to the degree the PS3's OS did, because of the complexity and all the features embedded into it.

Please prove that the one dev spent as much time and effort coding TR as the other dev did for the PS4. Fact is, even a piss-poor 7770 running the PC version can run TR beyond 34 fps, averaging 40 fps at 1920x1200 with FXAA.

You missed the point: going from 50-60 and rapidly into the 40s and 30s is worse than going from 30 to 25, which means they should have set a framerate limit on PS4. And again, the X1 runs TR at native 1080p: "Tomb Raider: Definitive Edition both run at native 1080p." Also, saying the X1 is using half-resolution effects is BS; the differences are slight at best. "Inconsistency in the PS4 experience": the massive framerate drops are an issue with the PS4.

I never claimed 5-10 fps at the same resolution... You have me mixed up with someone else, I believe, since Crypt_MX is the one who stated the 5-10 fps: "The REAL power difference can be best judged by comparing the two lowest framerates which are 24 and 33, giving the PS4 a 9fps advantage (Which is what I've been guessing since day one, PS4=5-10 more fps)."

Just like you claim I said the PS4 won't use more than 4 GB, when in fact I was saying the GPU will not use 4 GB but only 2-3 GB. And every time, you ignore that fact.

Once again, link...

Because the documents didn't say that; it was a Guerrilla post-mortem, and only DF claimed it. The rest of the sites just reused the news.

Dude, the only reason Sony reserved so much memory was to play it safe, period. The PS4 OS runs on the same source the PS3's does, and it needed 50 MB on PS3; it's not that demanding. Sony played it safe. You don't need 3 GB to run Netflix and a bunch of apps; cell phones with 1 GB do it all the time using far weaker CPUs, too.

That depends on the quality settings. At the minimum setting, which is High, it can; at the middle setting, which is Ultra, the 7770 runs at 29 FPS and drops to 21 FPS. Sound familiar?

The Xbox One version does 30 and drops to 24 FPS, but unlike the 7770 test, which used the exact same settings on all video cards when it got 29 FPS, on Xbox One there are lower-resolution textures, lower-quality effects, some effects done at half resolution, and cut-scenes that change to 900p because the Xbox One can't keep up even in cut-scenes, since they are done in-engine and are not videos.

That is performance below the 7770. Why do you think that is? Oh yeah, like I've told you a thousand times, the Xbox One has a GPU reservation of 10%, which leaves it at 1.18 TF, even lower than the 7770. Add to that that the test system the 7770 ran on wasn't using an 8-core Jaguar; it was using a damn

  • Intel Core i7-3960X Extreme Edition (3.30GHz)

So yeah, when you join all those factors, the Xbox One will underperform the 7770. Worse, the Xbox One wasn't even modified to use compute to aid the CPU like the PS4 was.
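The 1.18 TF figure is simple arithmetic on the commonly cited specs, assuming the claimed 10% GPU reservation is accurate; a quick check:

```python
def gflops(cus, clock_ghz):
    # GCN: 64 lanes per CU, 2 FLOPs per lane per clock (fused multiply-add)
    return cus * 64 * 2 * clock_ghz

x1_total     = gflops(12, 0.853)   # ~1310 GFLOPS peak
x1_for_games = x1_total * 0.90     # with the claimed 10% system reservation
hd7770       = gflops(10, 1.000)   # 1280 GFLOPS

print(x1_for_games, hd7770)  # ~1179 vs 1280: below a stock 7770, as claimed
```

That said, peak FLOPS ignores the bandwidth and driver-overhead differences being argued elsewhere in the thread, so treat it as one data point, not the whole picture.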

To begin with, let's address the differences between the two versions of the Definitive Edition on offer. PlayStation 4 users get a comfortably delivered 1080p presentation backed up with a post-process FXAA solution that has minimal impact on texture quality, sporting decent coverage across the scene, bar some shimmer around more finely detailed objects. Meanwhile the situation is more interesting on the Xbox One: the anti-aliasing solution remains unchanged, but we see the inclusion of what looks like a variable resolution framebuffer in some scenes, while some cut-scenes are rendered at a locked 900p, explaining the additional blur in some of our Xbox One screenshots. Curiously, the drop in resolution doesn't seem to occur during gameplay - it's only reserved for select cinematics - suggesting that keeping performance consistent during these sequences was a priority for Xbox One developer United Front Games.

For the most part the main graphical bells and whistles are lavished equally across both consoles, although intriguingly there are a few areas that do see Xbox One cutbacks. As demonstrated in our head-to-head video below (and in our vast Tomb Raider comparison gallery), alpha-based effects in certain areas give the appearance of rendering at half resolution - though other examples do look much cleaner. We also see a lower-quality depth of field in cut-scenes, and reduced levels of anisotropic filtering on artwork during gameplay. Curiously, there are also a few lower-resolution textures in places on Xbox One, but this seems to be down to a bug (perhaps on level of detail transitions) as opposed to a conscious downgrade.

http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-edition-next-gen-face-off

Owned...lol

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaa. So because the PS4 dropped once to 33 FPS, we should somehow take that and compare it to the Xbox One's minimum drop of 24 to claim the difference is 9 FPS?

When neither game spent more than 2 seconds at those frames, and on average both spend their time much higher? Hahaha, yeah, let's ignore that what has always been used in the industry is the average, not the minimum, to determine frame differences.

So since the 7770 drops to 21 and the 7850 to 28, the real difference is 7 frames between both cards, even though most of the time they run with a 13 FPS difference on average?

My god, you really are butthurt about losing your argument. You claimed 9 FPS or even worse; the average gap is 20, and in many instances it grows to 30 FPS, but I guess you didn't see those moments, right? Which, funnily enough, are far more abundant than the one drop to 33 FPS.

And yeah, you claimed it at the same resolution, and you even held on to a supposed PS4 reservation. In many instances there is a difference not only in frames but also in quality, and the difference was as small as 9 FPS for just one moment; most of the time it is 20, and sometimes even 30, which is a devastating 100% difference in frames.

#341 Edited by tormentos (16405 posts) -

@xdluffy said:

1080p and 60fps is nice but hardly necessary. I have an xbox one and aslong as it runs all the multi plats even if theyre a bit under 1080p im good its not like the ps4 versions look loads better, the difference is barely noticeable and besides if youre playing for graphics get a pc not a ps4. Also comparing games like AC4 and BF4, ps4 runs them at 1080p yet they look better on the xbox one, kinda says something about how overblown the whole 1080p thing is. At the end of the day its about the exclusives for me and playstation just doesnt hold a candle to the xbox titles. You guys have the best indies though so give yourselves a pat on the back

Hahhaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa....

Which is funny, because a PC with higher-than-PS4 settings will cost you more than a PS4, but a PC with stronger-than-Xbox-One settings will probably cost you what the Xbox One costs, or a little more.

#342 Edited by SamiRDuran (2656 posts) -

ps4 is weak, outdated hardware; it cannot render any visually impressive game at 1080p 60fps.

#343 Edited by MonsieurX (28649 posts) -

ps4 is weak, outdated hardware; it cannot render any visually impressive game at 1080p 60fps.

...yet

#344 Posted by Suppaman100 (3573 posts) -

@Shewgenja said:

@tymeservesfate said:

did any cows actually read the article yet? this thread is debunked by the title of the article alone lol:

"Xbox One’s eSRAM Too Small to Output Games At 1080p But Will Catch up to PS4"

not to mention what was said in the actual article:

"Will the process become easier over time as understanding of the hardware improves? “Definitely, yeah. They are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines. The Xbox One is a bit more multimedia, a bit more hub-centric so its a bit more complex. There’s stuff you can and can’t do because it’s a sort of multimedia hub. PS4 doesn’t have that. PS4 is just a games machine.”"

smh, play fair cows...stop trolling.

I might start calling it the KuturagiBone at this point. You are DCing like a Sony fanboy circa 2006 so hard right now.

saying what the article actually says is damage controlling? well, you're a special kind of stupid, aren't you? i didn't even do anything but copy and paste the title and one paragraph... unless you're saying the author and the developer he's interviewing are DCing, then i'm not sure what you're going on about lol.

Weak damage control as usual.

Just give up, the xbone lost, everyone knows it. The hardware sucks, multiplats suck, exclusives suck, the interface sucks, Kinect sucks, and sales suck compared to PS4.

/thread

#345 Edited by Suppaman100 (3573 posts) -

@xdluffy said:

1080p and 60fps is nice but hardly necessary. I have an xbox one and aslong as it runs all the multi plats even if theyre a bit under 1080p im good its not like the ps4 versions look loads better, the difference is barely noticeable and besides if youre playing for graphics get a pc not a ps4. Also comparing games like AC4 and BF4, ps4 runs them at 1080p yet they look better on the xbox one, kinda says something about how overblown the whole 1080p thing is. At the end of the day its about the exclusives for me and playstation just doesnt hold a candle to the xbox titles. You guys have the best indies though so give yourselves a pat on the back

#346 Edited by EZs (1309 posts) -

@xdluffy said:

1080p and 60fps is nice but hardly necessary. I have an xbox one and aslong as it runs all the multi plats even if theyre a bit under 1080p im good its not like the ps4 versions look loads better, the difference is barely noticeable and besides if youre playing for graphics get a pc not a ps4. Also comparing games like AC4 and BF4, ps4 runs them at 1080p yet they look better on the xbox one, kinda says something about how overblown the whole 1080p thing is. At the end of the day its about the exclusives for me and playstation just doesnt hold a candle to the xbox titles. You guys have the best indies though so give yourselves a pat on the back

Let me guess, he's on stage 7?

#347 Posted by Suppaman100 (3573 posts) -

@EZs said:

@Suppaman100 said:

@xdluffy said:

1080p and 60fps is nice but hardly necessary. I have an xbox one and aslong as it runs all the multi plats even if theyre a bit under 1080p im good its not like the ps4 versions look loads better, the difference is barely noticeable and besides if youre playing for graphics get a pc not a ps4. Also comparing games like AC4 and BF4, ps4 runs them at 1080p yet they look better on the xbox one, kinda says something about how overblown the whole 1080p thing is. At the end of the day its about the exclusives for me and playstation just doesnt hold a candle to the xbox titles. You guys have the best indies though so give yourselves a pat on the back

Let me guess, he's on stage 7?

Could be, but I definitely see some anger issues in his post.

#348 Posted by clone01 (24127 posts) -

So much stupid... It hurts.

For the last time, people, resolution and framerate aren't the only things that matter. You know what else can do 1080p and 60 fps with "room to spare"? My 5-year-old laptop. What else? The Xbox 360 and PS3. What else? The Wii U. What else? My old PC from like 6 years ago. It depends on the game. Some games are easier to run than others. Blanket statements about consoles like "PS4 can do 1080p and 60 fps" or "X1 can't do 1080p" are fucking stupid, because there's far more to it than that. No matter how many games run 1080p/60fps on PS4, there will be games that it can't handle that well. No matter how many games don't run 1080p/60fps on X1, there will be games that can, and do. It's going to vary from game to game, as it always has and, in fact, must.

What is so hard to understand about this concept?

While I agree with everything you said...look where you are. This is a cesspool devoid of logic, understanding, or intelligent conversation.

#349 Posted by nicecall (428 posts) -

didn't the ps3 and 360 do 1080p? or was it all upscaled from 720p to 1080p? i just remember almost all my game cases saying 1080p for most of the games...

seems weird that they have to make it seem like a great achievement that next-gen consoles can do 1080p when last gen could also do it.

#350 Posted by Gue1 (9099 posts) -

Then why aren't all games 1080p/60fps on the PS4?

it's early in the gen and devs are still tapping into its power, but it's still a way higher framerate and resolution than every single Xbone game.