Xbox One’s Driver Update To Bring 10% Performance Boost

This topic is locked from further discussion.


#101 TheShensolidus
Member since 2013 • 224 Posts

Oh Thuway - an insider who regularly posts bad info. No, there is no major performance update coming right now. It would take a major, NXE-style update for this. In fact, where would this 10% be applicable? The OS? Games?

As someone who works closely with both MS and on their console, I know for a fact this makes absolutely no sense whatsoever.


#102  Edited By lostrib
Member since 2009 • 49999 Posts

@theshensolidus said:

Oh Thuway - an insider who regularly posts bad info. No, there is no major performance update coming right now. It would take a major, NXE-style update for this. In fact, where would this 10% be applicable? The OS? Games?

As someone who works closely with both MS and on their console, I know for a fact this makes absolutely no sense whatsoever.

Prove it


#103 Jacobistheman
Member since 2007 • 3975 Posts

@theshensolidus said:

Oh Thuway - an insider who regularly posts bad info. No, there is no major performance update coming right now. It would take a major, NXE-style update for this. In fact, where would this 10% be applicable? The OS? Games?

As someone who works closely with both MS and on their console, I know for a fact this makes absolutely no sense whatsoever.

BS. You wouldn't leak this kind of information if you actually worked somewhere you had access to it.


#104  Edited By Tighaman
Member since 2006 • 1038 Posts

@theshensolidus: You still on that BS? You are worse than misterxmedia; at least he has evidence to back up his claims, but you talk like you are a dev yet you never come here with any real news, no updates, no real info, so shut the hell up and keep it moving, homie.


#105 darkangel115
Member since 2013 • 4562 Posts

MS has a percentage of performance set aside for Kinect. Everyone knows this. All they need to do is allow devs to disable Kinect and allocate those resources to games to give a performance boost. You guys do realize that both the PS4 and X1 don't operate even close to 100% of their hardware abilities, right?
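For a rough sense of what reclaiming such a reserve would mean, here is a back-of-envelope sketch. The 10% figure is the one floated in this thread, and the fixed-reserve model is an assumption for illustration, not a confirmed mechanism:

```python
# Hypothetical illustration: what a fixed 10% GPU-time reserve means for a
# frame budget. The 10% figure comes from the thread; the reserve mechanism
# here is an assumption, not a confirmed spec.

def usable_frame_time_ms(target_fps: float, reserve_fraction: float) -> float:
    """GPU time per frame left for the game after a fixed system reserve."""
    frame_budget = 1000.0 / target_fps          # total ms per frame
    return frame_budget * (1.0 - reserve_fraction)

with_reserve = usable_frame_time_ms(30.0, 0.10)     # reserve active
without_reserve = usable_frame_time_ms(30.0, 0.00)  # reserve reclaimed

print(f"30 fps budget with 10% reserve: {with_reserve:.2f} ms")   # 30.00 ms
print(f"30 fps budget, reserve freed:   {without_reserve:.2f} ms")  # 33.33 ms
```

So at 30 fps, freeing a 10% reserve buys back roughly 3.3 ms of GPU time per frame.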


#106 WitIsWisdom
Member since 2007 • 9583 Posts

lol... 10% increase from server updates and 400% more power from the cloud.

and then another 100% increase when it eats a Senzu bean, another 100% if it eats the Mushroom, another 100% with a Phoenix Down, and who could forget the 100% increase it will gain from putting two Viagra and a glass of water in the disc drive?


#107  Edited By MrXboxOne
Member since 2013 • 799 Posts

@I_can_haz said:

10% of shit is still shit.

Does it crush your spirit to know Gamespot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game informer and others (ALL the top gaming website experts) award Ryse as "having hands down the best graphics of next gen."?

Even your own poll says Ryse. Your console is weak and worthless.... I know, I own a PS4, its a POS


#108 LJS9502_basic
Member since 2003 • 178860 Posts

@mrxboxone said:

@I_can_haz said:

10% of shit is still shit.

Does it crush your spirit to know Gamespot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game informer and others (ALL the top gaming website experts) award Ryse as "having hands down the best graphics of next gen."?

Even your own poll says Ryse. Your console is weak and worthless.... I know, I own a PS4, its a POS

LOL, appeal to so-called authority. No true gamer would ever call another system a POS.


#109  Edited By ronvalencia
Member since 2008 • 29612 Posts

@acp_45 said:

"DirectX 11.X's superset features supports most of AMD Radeon HD "Graphics Core Next" and it's 32MB eSRAM features and it still waiting for the driver update to support it's ACE units(ref 2).”

Move engines: only the Xbox One has them, with a total of 4 move-engine processors. Move engines greatly reduce the overhead of transporting data between the CPU and the GPU, freeing up clock cycles for more important things (improving CPU and GPU performance). They also help transfer data to the SRAM.

You are right, there is only 32mb that can be used. Thank you for clearing that up.

The RAM and cache situation is a bit more complicated than you would like to lead people to believe. First off, CPUs love low latency and GPUs love high bandwidth. Because of this, Sony has crippled the PS4's CPU with GDDR5 and Microsoft has crippled the Xbox One's GPU with DDR3. What does this mean? It means better CPU performance in the Xbox One and better GPU performance in the PS4. But...

The Xbox One has a higher CPU and GPU clock speed than the PS4, and its higher GPU speed actually makes it perform better than if they had just unlocked its two additional compute units, since additional CUs don't add performance linearly (diminishing returns per CU). As everyone should know, GPUs, ESPECIALLY AMD's, are highly CPU-dependent (this is why high-end i7s are used for GPU benchmarks: to eliminate CPU bottlenecks), which means that the PS4's GPU may turn out to be bottlenecked by its CPU. As for Microsoft's bandwidth problem, they added the high-speed SRAM as a cache to help solve this, and the GPU can actually access data from both the SRAM and the DDR3 at the same time, making its total accessible bandwidth larger than either of them separately. The Xbox One's total bandwidth is estimated to be near (possibly still below) that of the PS4.

....(cut for space)


On the "PS4 is faster" comparison... Calculating the speedup when comparing two machines is incredibly complex and takes a lot of factors into account, which is why the actual speedup is currently unknown, but tech sites are calling it close to a wash (based on how current x86 PC machines work).

Peace bro.

Like to hear more from you :)

BF4 runs pretty well on my AMD Radeon HD R9 290.

All GCNs have move engines, i.e. they are DMA engines. The main difference with Xbox One's version is the support for JPEG and LZ compression.

The complexity of Xbox One's memory model is similar to Xbox 360's, but with additional flexibility, e.g. the ability to spill render targets over to DDR3.

PS4's GCN is superior to X1's GCN solution, e.g. the prototype 7850 with 12 CUs wasn't able to match the retail 7850 with 16 CUs, and both of them have the same memory bandwidth and clock speed. GDDR5 is fine for CPUs, i.e. Intel is using it for Xeon Phi.

PC's DirectX 11.2 is nowhere near the metal nor lightweight, hence why AMD's Mantle exists on PCs. PS4 doesn't need AMD's Mantle since it has its own efficient APIs. AMD's Mantle is similar to PS4's APIs.

It seems you haven't seen AMD's APU13 lectures on why an Intel Core i7 is not required for Mantle-based games, i.e. an AMD FX 8-core CPU under-clocked to 2 GHz was still limited by the Radeon HD 290X. One can conclude that PS4's 8-core CPU would be more than enough for its lesser 18-CU-enabled GCN.
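For anyone wanting to sanity-check the CU-versus-clock comparisons in this exchange: peak FP32 throughput for a GCN part is CUs x 64 lanes x 2 ops per clock x clock. A quick sketch, using the clock and CU figures quoted in the thread (treat them as approximate, not vendor-verified):

```python
# Peak FP32 throughput for a GCN GPU: CUs x 64 lanes x 2 ops (FMA) x clock.
# CU counts and clocks below are the commonly cited figures from this thread.

def gcn_peak_tflops(cus: int, clock_ghz: float) -> float:
    """Theoretical peak single-precision TFLOPS for a GCN configuration."""
    return cus * 64 * 2 * clock_ghz / 1000.0

print(gcn_peak_tflops(16, 0.860))  # retail 7850:        ~1.76 TFLOPS
print(gcn_peak_tflops(12, 0.860))  # 12-CU prototype:    ~1.32 TFLOPS
print(gcn_peak_tflops(12, 0.853))  # Xbox One (12 CU):   ~1.31 TFLOPS
print(gcn_peak_tflops(18, 0.800))  # PS4 (18 CU):        ~1.84 TFLOPS
```

Note that these are paper peaks; as the prototype-7850 comparison above suggests, real performance also depends on bandwidth and how well the CUs can be fed.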


#110  Edited By I_can_haz
Member since 2013 • 6511 Posts

@mrxboxone said:

@I_can_haz said:

10% of shit is still shit.

Does it crush your spirit to know Gamespot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game informer and others (ALL the top gaming website experts) award Ryse as "having hands down the best graphics of next gen."?

Even your own poll says Ryse. Your console is weak and worthless.... I know, I own a PS4, its a POS

Lying filth. Get out of here with your POS console that can't run COD and BF4 in 720p without frame drops.

"b..bu...bu..but teh 10% boost!!!1"


#111 Bigboi500
Member since 2007 • 35550 Posts

Thunder/Freedomfree and Brofists/Battlefieldfan should be locked in a closet together to find out which one comes out alive.


#112 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

@tormentos: http://kotaku.com/xbox-one-call-of-duty-offers-better-framerate-than-ps4-1463163114

COD:Ghosts was running at a steady 60fps. Read what I have written.

Don’t bring BF4 in here. That is the worst optimized multi-platform game I’ve seen. Even the PC version has problems. This is because of EA’s greediness. Everyone knows that. I can’t even play that game anymore.

I claimed it theoretically. I didn’t say it was 218GB/s. But theoretically.

"DirectX 11.X's superset features supports most of AMD Radeon HD "Graphics Core Next" and it's 32MB eSRAM features and it still waiting for the driver update to support it's ACE units(ref 2).”

Move engines: only the Xbox One has them, with a total of 4 move-engine processors. Move engines greatly reduce the overhead of transporting data between the CPU and the GPU, freeing up clock cycles for more important things (improving CPU and GPU performance). They also help transfer data to the SRAM.

You are right, there is only 32mb that can be used. Thank you for clearing that up.

The RAM and cache situation is a bit more complicated than you would like to lead people to believe. First off, CPUs love low latency and GPUs love high bandwidth. Because of this, Sony has crippled the PS4's CPU with GDDR5 and Microsoft has crippled the Xbox One's GPU with DDR3. What does this mean? It means better CPU performance in the Xbox One and better GPU performance in the PS4. But...

The Xbox One has a higher CPU and GPU clock speed than the PS4, and its higher GPU speed actually makes it perform better than if they had just unlocked its two additional compute units, since additional CUs don't add performance linearly (diminishing returns per CU). As everyone should know, GPUs, ESPECIALLY AMD's, are highly CPU-dependent (this is why high-end i7s are used for GPU benchmarks: to eliminate CPU bottlenecks), which means that the PS4's GPU may turn out to be bottlenecked by its CPU. As for Microsoft's bandwidth problem, they added the high-speed SRAM as a cache to help solve this, and the GPU can actually access data from both the SRAM and the DDR3 at the same time, making its total accessible bandwidth larger than either of them separately. The Xbox One's total bandwidth is estimated to be near (possibly still below) that of the PS4.

MS did not have the same luxury. They did not have access to mass quantities of GDDR5, and that forced them into ESRAM, which took up most of the GPU space. The largest GPU that would fit was comparable to the smaller 7790. MS was forced to add other methods and means to, essentially, fake-widen their highway (think of a bypass that lets cars, well, bypass the highway) and assist the GPU. These come in the form of 4 smaller command processors (compute processors and 2 graphics processors) of Microsoft's custom design, which faster, light/medium work can be offloaded to, allowing the GPU to do the heavy hitting. "Tricks..." that's all this game is. Now, it may not mean they can hit harder than the much more direct PS4 approach, but it certainly means much, much more than a mere 7790. It's all about how you distribute the data.

Ignorant kids automatically consider the PS4's 50% faster performance (just as with PS3) to be endgame in the contest. Not even close. That 50% performance boost (actually slightly less, given Microsoft's speed boost) could work out to serve as a mere portion of headroom.

Sony or Microsoft might one day choose to enable the CUs in future console versions. This is unknown; typically console manufacturers don't update core specs post-launch, but consoles have been trending toward greater upgradeability over the past two generations. It's not impossible that this could change.

Resolution: PS4 doesn't even run BF4 at 1080p60; it runs at 900p60. I guess that means the PS4 sucks too, huh? Or maybe, just maybe, it's like the last generation and it will take time for the devs to figure things out. LOGIC EXPLAINS.

Things like tiled resources were a Microsoft thing first, stolen and implemented by OpenGL coders for fame, standardized later, and then (and this is the important part) never incorporated into hardware-compatible solutions. In DirectX 11.1/11.2 all of these features are either directly processed on dedicated graphics hardware or can easily be loaded into hardware compute systems with little loss in efficiency.

It is claimed DX11.2 will come very close to writing to the metal; that would be a huge advantage, unless Sony opted to use Mantle.

I don’t know why you are going crazy with your swearing.

If there is something wrong with what I have said then I would like to know why and then I will change it in the future.

I’m not biased. I am open-minded. Open to both sides.

Please, you don’t know me.

Don’t state something that is completely irrelevant.

So if I were to say something like "the Xbox One has a custom gaming audio block capable of 512 audio channels, whereas the PS4 has an audio block capable of only 200 channels; games hardly ever use more than 100 channels at the same time, and they're usually mixed within the game engine into a single channel for the soundcard. It's a nice feat but I doubt it makes much meaningful difference," would this make me a Sony fanboy? 'Cause this is true: in terms of audio the Xbox One blows the PS4 out of the water, but it's irrelevant because games barely use 100 audio channels. I have a PS4 and an Xbox One. I do not choose sides; I choose the facts.

I'm just stating facts. If your facts show differently, then I will take them warm-heartedly.

On the "PS4 is faster" comparison... Calculating the speedup when comparing two machines is incredibly complex and takes a lot of factors into account, which is why the actual speedup is currently unknown, but tech sites are calling it close to a wash (based on how current x86 PC machines work).

Peace bro.

Like to hear more from you :)

http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-ghosts-next-gen-face-off

Both have frame drops; in fact the Xbox One drops into the 30s at one point, where no such case occurred on the PS4.

That's funny, because BF4 is more demanding than Ghosts, and the Xbox One version runs a consistent 10 FPS slower than the PS4 version. It's funny that against the Xbox One you claim the game is unoptimized, but then you use it later in your argument against the PS4 because it's 900p.

2 - It's irrelevant because you use theoretical numbers, which are meaningless. The PS4 was already measured using 172 GB/s out of its 176 GB/s, an incredibly higher percentage than what the Xbox One is effectively getting from ESRAM.

3 - Move engines are also on the PS4; they are part of GCN and are called DMA engines. That's how little you know. The Xbox One just has 2 or 3 more, much like the Xbox One has 2 ACEs while the PS4 has 8 ACEs with a volatile bit and 64 compute queues, way over what the Xbox One has, because it's modified for compute.

4 - I know about low latency. I also know DDR2 had less latency than DDR3, and that didn't stop PCs from moving to DDR3, because of the speed benefit.

There aren't even tests on PC that you can find proving GDDR5 has way more latency for CPUs than DDR3. In fact the only one posted here, by a poster whose name I can't remember, showed the same latency for both memories, and that test was done by Hynix, which is a memory manufacturer.

Please show me a link where it shows Sony crippled the PS4's CPU by using GDDR5, because I just posted one that clearly shows the PS4's CPU actually outperforming the Xbox One's CPU on an engine, and you simply ignored it... lol

5 - The higher CPU and GPU clocks are meaningless, especially the GPU one. The R7 260 has 14 CUs at 1.1 GHz and yet it performs worse than the 7850, which has 16 CUs at only 860 MHz.

By the way, MS lied when they claimed the speed bump was better than having those 2 extra CUs at 800 MHz. This is proven by the 7790 itself: it has 14 CUs at 1.0 GHz for 1.79 TF, yet it loses to the 7850, which has 16 CUs (2 more than the 7790) but is clocked 140 MHz lower at 860 MHz. The Xbox One's GPU clock is not even 60 MHz faster than the PS4's. Oh hell, the extreme example is right there: the R7 260 is a rebadged 7790 with 100 MHz more speed; it gets 2 more frames per second in BF4 over the 7790, and both underperform versus the 7850 at 860 MHz.

So yeah, 2 more CUs would have been better than just a speed bump. The bump was the only thing MS could do at the last minute, and that is what they did. They could not have made the Xbox One's GPU with 14 enabled CUs without sending the Xbox One into a huge shortage, because those 2 disabled CUs are there for a reason.

Making those chips is a complex process and not all chips come out with 14 working CUs. If they set the goal at 14 CUs, every chip manufactured with fewer than 14 working CUs has to be thrown away, period.

The PS4's CPU is not only performing better than the Xbox One's now; later down the road, when developers start to push GPU compute, they will free up even more CPU time, and MS will have an even bigger problem. You are arguing crap spewed by fanboys out there; it shows you don't have a clue. In fact I have already posted tests for the CPU, and you ignored them and continue to claim, without anything to back you up, that the PS4's CPU will be a bottleneck.

When compute kicks into gear, the Xbox One will suffer even more. The Xbox One can't run compute and graphics without one hurting the other; the PS4 can, and will use it. For example, the physics in Killzone runs on the CPU; later it will run on the GPU without hurting the graphics, and the CPU will have more free cycles and work even better.

6 - WTF do you mean MS doesn't have the same luxury as Sony of having GDDR5? MS has sh** tons of money; if they wanted GDDR5 they would have had it. GDDR5 is not exclusive to the PS4 or Sony in any damn way. What MS did here was simple: they wanted 8GB of RAM, and GDDR5 was too expensive for them because they wanted to pack in Kinect, period. So they chose ESRAM, which is cheap and allows them to use cheap DDR3 as well.

7 - The Xbox One is not even at the standard of the 7790; it has 2 fewer CUs and a 147 MHz lower clock speed for 1.3 TF, while the 7790 has 1.79.

8 - The disabled CUs inside the Xbox One and PS4 can't be turned on, just like Cell's disabled 8th SPE couldn't, because every single Xbox One or PS4 chip with fewer than 14 or 20 working CUs respectively would not work and would fail.

So if 60% of Xbox Ones had GPUs with only 13 or 12 working CUs, that would mean that as soon as you updated your console it would fail, because not all chips have 14 working CUs.

And that is the sole reason the Xbox One has 12 CUs with 2 disabled: so that any chip that comes out with 14, 13, or 12 working CUs can be used. The same applies to the PS4.

9 - BF4 is 900p on PS4, sure, but it also runs 10 FPS faster than the Xbox One version. So it not only has a 50% pixel advantage, it also has 10 FPS more.

10 - Where in hell did you pull from that coders stole tiled resources from MS? Link please.

There is no fun in debating if you are not going to back things up. Tiled Resources, or PRT, is also on the PS4 and was supported by OpenGL way before DX.

11 - OK, by now I am 100% sure that you don't know sh*8 about what you're saying. The reason is simple: why in f**king hell would Sony need to use Mantle for the PS4? Coding to the metal has always been a Sony thing, hell, before MS even dreamed of it, and before PC got Mantle too. DX does not allow to-the-metal coding; it has gotten closer, but not to the metal. The PS4 has its own set of tools; that doesn't mean Mantle at all. Mantle is an API that goes around DX, which is something MS would not like much on PC, and its whole existence is so that certain GPUs on PC have something like what consoles have.

Mantle is a redundancy on PS4.

12 - Oh, now you want to talk about SHAPE. Dude, SHAPE is almost all for Kinect; very little is left for developers to use. Its 15 GFLOPS is nothing. Oh, and the PS4 has AMD audio hardware: it has its own sound block, but unlike the Xbox One's it is not mostly for a camera and a mic.

Hell, it's even funny that an Xbox fan even cares about audio: for 8 years the PS3 has had 7.1 lossless sound while MS has been stuck with 5.1.
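A quick way to sanity-check the measured-versus-theoretical bandwidth argument in point 2 is to express each figure as a percentage of peak. The 172/176 GB/s numbers are the ones cited in this thread; the eSRAM figures below are illustrative placeholders, not confirmed measurements:

```python
# Efficiency of measured bandwidth against theoretical peak, using figures
# quoted in this thread. The eSRAM numbers are illustrative assumptions:
# ~204 GB/s is the often-cited post-upclock peak, ~140 GB/s a cited
# real-world estimate; neither is verified here.

def efficiency(measured_gbs: float, theoretical_gbs: float) -> float:
    """Measured bandwidth as a percentage of the theoretical peak."""
    return 100.0 * measured_gbs / theoretical_gbs

print(f"PS4 GDDR5:         {efficiency(172.0, 176.0):.1f}%")  # ~97.7%
print(f"X1 eSRAM estimate: {efficiency(140.0, 204.0):.1f}%")  # ~68.6%
```

The point being made: a peak number means little on its own; what matters is the fraction of it that workloads actually sustain.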


#113  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said:


5 - Totally irrelevant again: the PS4's CPU is proving to have an edge on the Xbox One's CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single-core CPU test... LOL... Have you realized PS4's CPU-to-main-memory connection peaks at around 20 GB/s while X1's is about 55 GB/s (from a 68 GB/s peak)?

Even if you combine the peak 20 GB/s (main memory writes) + 10 GB/s (Onion/Onion+ link, write direction to the GPU), it's still inferior to X1's CPU memory writes.

On multi-core texture generation, X1 would be faster. Your cited benchmark can backfire on you.

On a gaming PC and the PS4, texture generation should be done on the GPGPU, which is linked to higher memory bandwidth and a higher CU count.


#114 I_can_haz
Member since 2013 • 6511 Posts

@Bigboi500 said:

Thunder/Freedomfree and Brofists/Battlefieldfan should be locked in a closet together and find out which one comes out alive.

You forgot to add mrxboxone


#115 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:
@tormentos said:


5 - Totally irrelevant again: the PS4's CPU is proving to have an edge on the Xbox One's CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single-core CPU test... LOL... Have you realized PS4's CPU-to-main-memory connection peaks at around 20 GB/s while X1's is about 55 GB/s (from a 68 GB/s peak)?

Even if you combine the peak 20 GB/s (main memory writes) + 10 GB/s (Onion/Onion+ link, write direction to the GPU), it's still inferior to X1's CPU memory writes.

On multi-core texture generation, X1 would be faster. Your cited benchmark can backfire on you.

On a gaming PC and the PS4, texture generation should be done on the GPGPU, which is linked to higher memory bandwidth and a higher CU count.

I have a link with information and tests; you have only your biased lemming opinion. All the crap you say is useless.

Ghosts: 1080p vs 720p. So much for all your crappy secret sauce and theories. But but JIT compression, but but tiled resources, but but 133 GB/s alpha blending... lol

Quote me again when the Xbox One is pulling 1080p in an FPS; otherwise you're wasting your time.

My link >>>>>>>>>>>>>>>> your opinion. But but the Xbox One is faster... it's not showing in that test.


#116  Edited By deactivated-57d8401f17c55
Member since 2012 • 7221 Posts

@ronvalencia said:
@tormentos said:


5 - Totally irrelevant again: the PS4's CPU is proving to have an edge on the Xbox One's CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single-core CPU test... LOL... Have you realized PS4's CPU-to-main-memory connection peaks at around 20 GB/s while X1's is about 55 GB/s (from a 68 GB/s peak)?

Even if you combine the peak 20 GB/s (main memory writes) + 10 GB/s (Onion/Onion+ link, write direction to the GPU), it's still inferior to X1's CPU memory writes.

20 GB/s is more than enough. The Xbox will only use that much bandwidth with the GPU, so it means nothing.

What difference does it make that it's a single-core test? It's the same damn core.
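A back-of-envelope check supports this point: at the Substance Engine rates quoted above (14 MB/s on PS4, 12 MB/s on X1), texture generation uses a tiny fraction of either console's CPU memory bandwidth, so the benchmark is almost certainly compute-bound rather than bandwidth-bound:

```python
# Sanity check: the texture-generation rates quoted in this thread against
# the CPU-to-memory bandwidth figures quoted in this thread. All numbers are
# the posters' figures, not independently verified specs.

MB_PER_GB = 1024

def bandwidth_utilisation_pct(rate_mb_s: float, bus_gb_s: float) -> float:
    """Fraction of a memory bus consumed by a given data-generation rate."""
    return 100.0 * rate_mb_s / (bus_gb_s * MB_PER_GB)

print(f"PS4: {bandwidth_utilisation_pct(14, 20):.3f}% of ~20 GB/s")  # ~0.068%
print(f"X1:  {bandwidth_utilisation_pct(12, 55):.3f}% of ~55 GB/s")  # ~0.021%
```

At well under a tenth of a percent of either bus, the difference in CPU memory bandwidth cannot be what decides this benchmark.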


#117 MrXboxOne
Member since 2013 • 799 Posts

@I_can_haz said:

@mrxboxone said:

@I_can_haz said:

10% of shit is still shit.

Does it crush your spirit to know Gamespot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game informer and others (ALL the top gaming website experts) award Ryse as "having hands down the best graphics of next gen."?

Even your own poll says Ryse. Your console is weak and worthless.... I know, I own a PS4, its a POS

Lying filth. Get out of here with your POS console that can't run COD and BF4 in 720p without frame drops.

"b..bu...bu..but teh 10% boost!!!1"

Not lying..... Enjoy second rate graphics LOL.

B..bu...bu...bu....bu but all that power tho and nothing to sho fo it!


#118 DefconRave
Member since 2013 • 806 Posts

lol cloud gaming, no wonder they wanted a 24/7 always online requirement.


#119  Edited By remiks00
Member since 2006 • 4249 Posts

@theshensolidus: Shensolidus has been right about a couple of things in the past, if you haven't noticed, especially on his previous account. He's been trying his best to fly under the radar for a while now. He was also one of the first people to reveal that the Day One update for the Xbone was going to be over 2GB, back when the PS4 caught flak over its meager 300MB update. Don't be ignorant...

It seems that a lot of you guys try to discredit him too prematurely.


#120 tormentos
Member since 2003 • 33784 Posts

@mrxboxone said:

Not lying..... Enjoy second rate graphics LOL.

B..bu...bu...bu....bu but all that power tho and nothing to sho fo it!

Is it me, or do you have more PS4 games there than Xbox One games?


#121 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: LOL,

I don’t respect anything from you, tormentos. You go around calling anyone a lem when they are trying to share what they’ve heard and read. You clearly have no friends that have an Xbox. I can tell you that I have friends on both consoles. Why? Because I don’t hate on something that someone else has. Probably because I have a PC, an Xbox One, and a PS4. Playing Metro: Last Light on PC atm. :P

You said, "BF4 is 900p on PS4 sure but it also runs 10 FPS faster than the xbox one version, so not only has a 50% pixel advantage it also has 10 FPS more." Basic math? 900p is not 50% more than 720p. MrXboxOne just destroyed you. Anyway, you call me an Xbox One fan when I also enjoy playing my PS4. PFFT. You clearly don’t know me. I know what I have heard and read, so I will simply state it as my opinion. You, sir, are a hardcore Sony fan, and I’m pretty sure you hate seeing good things happen to the Xbox One. All in all, the industry is entirely dependent on ALL THREE NEXT GEN CONSOLES. Personally, I think you would be terrible at attracting someone who doesn’t have your console to it. I luckily have it and I know how awesome the PS4 is. You are repulsive and a disgrace to the Sony race.

I clearly don’t know anything by your standards. -_-

ALSO:

The thing is... I didn’t ignore your link to the speed test. In fact, I told you that "calculating the speedup when comparing two machines is incredibly complex and takes a lot of factors into account, which is why the actual speedup is currently unknown, but tech sites are calling it close to a wash (based on how current x86 PC machines work)."

Here is more to our debate...

In a game, whenever a textured object is displayed, say a high-resolution brick texture, the entire texture is compressed and stored in RAM, taking up its respective amount of RAM. Textures are by far the greediest RAM consumers compared to everything else that needs to be stored in RAM. With partially resident textures, or tiled resources, the texture is split up into smaller tiles, allowing you to load only the tiles necessary to be displayed at a particular detail level.

While texture tiling has been done in software before, it had certain limitations. Moving it to hardware removes those limitations. The benefits are impressive enough that texture data which previously took up 3GB of RAM can be stored in only 16MB! Not only does it offer a drastic reduction in size, it also allows more detailed worlds than before, since developers now have a lot more texture storage available.

In addition to textures, and presumably the reason Microsoft prefers to call this partially resident resources, this technique can also be applied to other areas, such as shadows through shadow mapping.

For the Xbox One this is particularly important, since using this technique the 32MB of eSRAM can theoretically store up to 6GB worth of tiled textures, going by those numbers. Couple the eSRAM's ultra-fast bandwidth with tiled-texture streaming middleware like Granite, and the eSRAM just became orders of magnitude more important for your next-gen gaming. Between software developments such as this and the implications of the data move engines with LZ encode/decode compression for making cloud gaming practical on common broadband connections, Microsoft's design choice of going with embedded eSRAM for the Xbox One is beginning to make a lot more sense. Pretty amazing, huh?
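The residency arithmetic behind these figures can be sketched quickly. Direct3D 11.2 tiled resources use 64 KB tiles; the 3 GB and 16 MB figures echo the post above, so this is an illustration of the claim rather than a measurement:

```python
# Residency math for tiled resources (D3D11.2 tiles are 64 KB each).
# The 3 GB virtual texture set and 16 MB residency budget are the figures
# quoted in the post above, used here purely for illustration.

TILE_BYTES = 64 * 1024  # Direct3D 11.2 standard tile size

def resident_tiles(budget_mb: float) -> int:
    """How many 64 KB tiles fit in a given residency budget."""
    return int(budget_mb * 1024 * 1024) // TILE_BYTES

budget_mb = 16.0
full_texture_gb = 3.0

print(resident_tiles(budget_mb))  # 256 tiles resident at once
ratio = (full_texture_gb * 1024) / budget_mb
print(f"virtual:resident ratio ~ {ratio:.0f}:1")  # ~192:1
```

In other words, the 3GB-to-16MB claim amounts to keeping roughly 1/192 of the full texture set resident at any moment, which only works if tiles can be swapped in fast enough as the view changes.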

So yeah, 100GB over broadband connections would take insanely long; you're fricken right.

BUT STILL NOTHING TO DO WITH WHAT I HAVE TO SAY AND WHAT WAS IMPORTANT. SADLY, YOU COULDN’T FIGURE IT OUT.

Direct X 11.2 is the first DirectX to support hardware-based tiled resources.

The Xbox One’s ESRAM does go straight to the GPU. MS has actually been using this type of EDRAM/eSRAM approach since the Xbox 360. For example, beginning with titles as early as Kameo, it used the 360's EDRAM and GPUONLY for particle effects without ever touching the CPU. This frees up the CPU and CPU/RAM/GPU bandwidth to do other processes. BAM.

The way it works is that it needs to spit out unused tiles and re-load them as needed or quickly replacing lower res tiles with higher res tiles and vice versa, very fast. The eSRAM is basically a scratchpad for this, built on chip, so the big advantage is that you don't have to deal with the latency issues of GPU/RAM access or tie up resources when going back and forth. You also don't have to deal with the bottlenecks of the software implementation. It doesn't suck up additional resources from your CPU. I think the point you are missing here is that there's no need for your GPU to go back to your ram pool if you can do this within the confines of the eSRAM. The eSRAM is the dedicated hardware for tiled resources and DirectX 11.2 contains the APIs to take advantage of it. This technique is a perfect fit to leverage the 32Mb of on-chip eSRAM. There is a difference versus storing your tiles in RAM and on-chip eSRAM.

It's a big deal and a real "game changer", I believe. Given the chance, most developers would much prefer tiled rendering to what we use today. The problem with tiled rendering was that, until now, there was no hardware support for it.

The hardware-level implementation of this functionality is not in the memory controller (which is unique to the Xbox One due to its RAM configuration) but in the main APU itself (according to AMD's talk on Jaguar-based APUs), which is shared between the PS4 and Xbox One. Microsoft has implemented API calls in DirectX to use this hardware directly, skipping overhead that developers previously had to handle themselves in DX10 and DX11, as opposed to other APIs (OpenGL).

In terms of cloud gaming, developers are probably not ready yet to require online connectivity for their non-multiplayer-focused games, but remote cloud storage and potential cloud processing are great for the entire industry.

THE MOVE ENGINES:

The data move engines on the Xbox One differ from the DMA units on the PS4. For one, the Xbox One has four of them, each with slightly different capabilities, as noted in the diagram. The thing is that the PS4's DMA doesn't support compression, just decompression. That makes a difference when you're talking about two-way transfers, and it's an important distinction if you plan on doing real-time offloading to and from a server.
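The bandwidth argument for compressing before you transfer is easy to demonstrate with any LZ-family codec. Here zlib stands in for whatever the move engines implement in silicon, and the payload and ratio are purely illustrative:

```python
import zlib

# Compressing before a transfer trades cheap compression work (CPU here,
# fixed-function hardware on the console) for fewer bytes on the wire.
payload = b"repetitive game state " * 1024   # ~22 KB of compressible data

compressed = zlib.compress(payload, level=6)
ratio = len(compressed) / len(payload)

assert zlib.decompress(compressed) == payload  # round-trips losslessly
print(len(payload), len(compressed))
print(f"sent {ratio:.1%} of the original bytes")
```

The asymmetry the post points at is that compress-on-send needs an encoder on the sending side; decode-only hardware can still receive compressed data but can't shrink its own outgoing traffic without burning CPU or GPU time.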

While the PS4 has its own advantages, with additional CUs and raw horsepower, it can't emulate a missing slab of silicon memory, missing hardware compression capabilities baked into the silicon, or a different hardware configuration, to match some of the unique advantages the Xbox One has. They both have their advantages. Sony gave the PS4 more raw power and more programmable compute units, but their functions are different. If developers want to match some of the X1 capabilities that come from the on-chip eSRAM and its dedicated compression hardware, the PS4 will have to rely on software emulation, whether by tasking the CPU or by dedicating some of its CUs to it (which would naturally take away from its GPU capabilities). The Xbox One has dedicated hardware for specific applications, and vice versa.

So it seems like Microsoft did its homework on all things graphics and built a great console.

The Xbox One's architectural design is built for tiled rendering and, surprisingly, for cloud computing. The PS4 is simply built for raw power; that seems to have been Sony's only answer, and one Microsoft could have matched if it had been the only option.

You said it yourself, MS is rich.

Maybe they didn’t choose GDDR5 for a reason. (Funny how you took my sarcasm in my previous post.)

Interestingly, there's an interview with Hihopgamer and Chris Doran regarding the Xbox One and PS4, and apparently both consoles start to chug when using 5-6 GB of RAM at 60 fps. So developers have no trouble filling up that RAM. So yeah, MS did do its homework, and just like on the 360 they're using the eSRAM to address some of the most common bottlenecks in game development.

If Epic decides to marry its streaming tech with MS's hardware implementation of partially resident textures, it should not only save a lot of that RAM, bandwidth, and CPU, but probably also fix some of the issues that plagued Unreal Engine 3.

I think both the PS4's and Xbox One's builds will work fine. The battle is all in YOUR head.

The thing is, I don’t give a crap about these specs because I would choose gameplay over graphics any day.

If you would choose graphics over gameplay, then buy yourself a $3,000 gaming rig.

Avatar image for PAL360
PAL360

30570

Forum Posts

0

Wiki Points

0

Followers

Reviews: 31

User Lists: 0

#122 PAL360
Member since 2007 • 30570 Posts

That's cool, I guess.

I think Sony and MS should find a way to free up some RAM for games. 3GB (3.5GB on PS4) just for the OS seems like a lot!

Avatar image for urbansys
urbansys

235

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#123  Edited By urbansys
Member since 2003 • 235 Posts

@farrell2k: Why? Because you can't understand the tools devs have at their disposal with cloud tech. Ask yourself why Sony is desperately trying to use the cloud as well. Sadly, they don't have the money that MS has to truly implement it.

Avatar image for tdkmillsy
tdkmillsy

6054

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#124 tdkmillsy
Member since 2003 • 6054 Posts

Why is it so impossible for drivers to improve things by 10%? They do it all the time on PC.

Microsoft were behind on development. This is good news, and actually expected.

There are rumours on other sites that it's more like 50%, but I think 10% is more likely.
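A 10% gain is easy to put in frame-time terms. A back-of-the-envelope sketch (the 10% figure is just the rumour, and the fully-GPU-bound assumption is illustrative):

```python
# Rough frame-time arithmetic for a rumoured 10% GPU performance gain.
def fps_after_speedup(fps_before, speedup):
    """If the GPU does the same work 'speedup' times faster, frame time
    shrinks proportionally, so fps rises by the same factor (assuming the
    game is entirely GPU-bound, which real games rarely are)."""
    frame_time_ms = 1000.0 / fps_before
    return 1000.0 / (frame_time_ms / speedup)

print(round(fps_after_speedup(30.0, 1.10), 1))  # 33.0
```

So even taken at face value, 10% turns a 30 fps game into a 33 fps one; it doesn't buy a resolution jump.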

Cloud computing is already helping add to the experience in games like Forza 5. Both Sony and Microsoft are investing big in this area; they must have plans and see the potential in using it.

Tiled textures now, cloud later

Avatar image for stereointegrity
stereointegrity

12151

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#125 stereointegrity
Member since 2007 • 12151 Posts

@acp_45 said:


Still trying to make the X1 seem stronger when it's not.....

Avatar image for BattlefieldFan3
BattlefieldFan3

361

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#126  Edited By BattlefieldFan3
Member since 2012 • 361 Posts

@urbansys said:

@farrell2k: Why? Because you can't understand the tools devs have at their disposal with cloud tech. Ask yourself why Sony is desperately trying to use the cloud as well. Sadly, they don't have the money that MS has to truly implement it.

Just like SONY, with only $10 mil, couldn't out-engineer Microsoft's engineering team that was funded $6 billion in R&D for the Xbone, right?

SONY has Cerny, the smartest man alive in the gaming industry. Microsoft has mediocre engineers.

It doesn't matter how much money you spend on the average schmo down the block. He ain't gonna do anything significant with his life.

Avatar image for BattlefieldFan3
BattlefieldFan3

361

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#128 BattlefieldFan3
Member since 2012 • 361 Posts

@farrell2k said:

@BattlefieldFan3 said:

@urbansys said:

@farrell2k: Why? Because you can't understand the tools devs have at their disposal with cloud tech. Ask yourself why Sony is desperately trying to use the cloud as well. Sadly, they don't have the money that MS has to truly implement it.

Just like SONY, with only $10 mil, couldn't out-engineer Microsoft's engineering team that was funded $6 billion in R&D for the Xbone, right?

SONY has Cerny, the smartest man alive in the gaming industry. Microsoft has mediocre engineers.

It doesn't matter how much money you spend on the average schmo down the block. He ain't gonna do anything significant with his life.

Talk about mediocre engineers. Wasn't this Cerny fellow the one responsible for Knack?

Knack isn't a technical failure. It was a failure in terms of gameplay. There's a difference.

You know what's a technical failure? Being funded $6 billion in R&D and making the weaksauce monstrosity known as the Xbone.

Avatar image for enviouseyezonme
EnviousEyezOnMe

272

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#129 EnviousEyezOnMe
Member since 2013 • 272 Posts

@acp_45 said:


This is all that needs to be said. I would put this post in my sig if it fit......LoL

Avatar image for ignman23
IGNMAN23

25

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#130 IGNMAN23
Member since 2013 • 25 Posts

@farrell2k: hey, tell me this: why is The Order 800p, and why is Driveclub still not 1080p/60fps?

Avatar image for mrxboxone
MrXboxOne

799

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#131  Edited By MrXboxOne
Member since 2013 • 799 Posts

@tormentos said:

@mrxboxone said:

Not lying..... Enjoy second rate graphics LOL.

B..bu...bu...bu....bu but all that power tho and nothing to sho fo it!

Is it me, or do you have more PS4 games there than Xbox One games?

It's just you...... those are called Blu-ray movies lol.

Avatar image for I_can_haz
I_can_haz

6511

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#132  Edited By I_can_haz
Member since 2013 • 6511 Posts

@mrxboxone: PS4 has "nothing to sho fo it", and yet you have more PS4 games than XBone games. Hahahahaha, you just owned yourself, idiot.

Avatar image for Tighaman
Tighaman

1038

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#134 Tighaman
Member since 2006 • 1038 Posts

@acp_45: Thank you. I've been trying to tell this guy TORM for the longest time that he needs to read more. Hopefully he will shut up and do some reading outside of System Wars.

Avatar image for mrxboxone
MrXboxOne

799

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#135 MrXboxOne
Member since 2013 • 799 Posts

@I_can_haz said:

@mrxboxone: PS4 has "nothing to sho fo it", and yet you have more PS4 games than XBone games. Hahahahaha, you just owned yourself, idiot.

Those are Blu-rays, dumbass. So delusional you are.

Does it crush your spirit to know that Gamespot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game Informer and others (ALL the top gaming website experts) all award Ryse "having hands down the best graphics of next gen"?

Even your own poll says Ryse. Your console is weak and worthless....

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#136  Edited By tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@acp_45: Thank you. I've been trying to tell this guy TORM for the longest time that he needs to read more. Hopefully he will shut up and do some reading outside of System Wars.

Oh dude, shut up. You ride the secret sauce like every other moron here. Where the hell do you think MS will get 10% more power from a damn driver update? Well, I guess Sony can get the same 10% from a driver update, since both GPUs are from the same family.

This will be nothing. 10% will not even help the Xbox One close the 1080p-720p gap. It's funny how some of you downplay the 40% (or more) power advantage the PS4 has, yet want to believe that 10% of the Xbox One's total power means anything.

@mrxboxone said:

@I_can_haz said:

@mrxboxone: PS4 has "nothing to sho fo it", and yet you have more PS4 games than XBone games. Hahahahaha, you just owned yourself, idiot.

Those are Blu-rays, dumbass. So delusional you are.

Does it crush your spirit to know that Gamespot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game Informer and others (ALL the top gaming website experts) all award Ryse "having hands down the best graphics of next gen"?

Even your own poll says Ryse. Your console is weak and worthless....

Yeah, because you totally didn't hide your games and set up the picture to look just the way you wanted, right?

Does it crush you that the Xbox One has 19 games, and that they are lower rated overall versus the 29 games the PS4 has?

Not only does the PS4 have more games, they are also higher rated... hahaha

The Xbox has no games... lol

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#137 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: LOL dude, if you are going to compare 19 games vs 29 games...

What if Microsoft released an extra ten games, all rated at least 70 on Metacritic (and maybe one 50)? That would boost its average past the PS4's.

The last thing I heard, most Ponies were saying reviews aren't important when Knack got destroyed, but then when Ryse's came in you all went ballistic. Shows you.

You are a fool and you are stubborn.

Both consoles are good, but you make yours look worse because you are highly repulsive.

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#138  Edited By deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@stereointegrity: DID YOU EVEN READ WHAT I SAID IN MY POST?

If you did, then you would probably agree that most people don't know anything about what MS or Sony is doing.

If you still think the Xbox One is a crap console in terms of specs compared to the PS4, then you probably didn't understand a thing that I just posted.

You are stubborn.

Sony went for RAW PERFORMANCE, which is cheaper than the asymmetrical build that MS chose. Sony used off-the-shelf parts and just put them in their console. They didn't have any problem with that. It was simple and easy. MS, on the other hand, HAD A LOT OF HOMEWORK TO DO. They had to keep Kinect, the Azure cloud, gaming, etc. in mind, which is why everything in the Xbox One has been tweaked, EVERYTHING, so that it would all work right.

How can I make the console look better than it is, when what I'm saying is exactly what it's capable of?

I'm sorry, but it is definitely stronger than you think it is.

LOL.

If you disagree, then you probably didn't understand a thing I wrote. What Microsoft has done benefits the whole industry, meaning SONY might use what they are doing in the future. You are incapable of seeing what is good and what is bad. Fool.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#139 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

@tormentos: LOL,

I don’t respect anything from you tormentos. You go around calling anyone a lem when they are trying to speak what they’ve heard and read. You clearly have no friends that have an Xbox. I can tell you that I friends on both consoles. WHY ? Cause I don’t hate on something that someone else has. Probably, because I have PC, Xbox One, PS4. Playing Metro: Last Light on PC atm. :P

The way you said, " BF4 is 900p on PS4 sure but it also runs 10 FPS faster than the xbox one version,so not only has a 50% pixel advantage it also has 10 FPS more.MrXboxOne just destroyed you. Anyway, you calling me an Xbox One fan when I also enjoy playing my PS4. “ Basic math ? 900p is not 50% more than 720p. Plus calling me an Xbox One fan. PFFT. You clearly don’t know me. I know what I have heard and read. So I will simply state it in my opinion. You sir, are hardcore Sony fan and I’m pretty sure that you hate seeing good things happen to the Xbox One. All in all, the industry is entirely dependent on ALL THREE NEXT GEN CONSOLES. Personally, I think you would be terrible at attracting people to your console to someone who doesn’t have it. I luckily have it and I know how awesome the PS4 is. You are repulsive and a disgrace to the Sony race.

I clearly don’t know anything by your standards. -_-.

ALSO:

The thing is... I didn’t ignore your link to the speed test. In fact, I told you that "calculating speedup when comparing two machines is incredibly complex and takes a lot of things into account, which is why the actual speedup is currently unknown, but tech sites are calling it close to a wash (based on how current x86 PC machines work)."

Here is more to our debate...

In a game, whenever a textured object is displayed, say a high resolution brick texture, the entire texture is compressed and stored in RAM, taking up its respective amount of RAM. Textures are by far the greediest RAM consumers compared to everything else that needs to be stored there. With partial resident textures, or tiled resources, the texture is split up into smaller tiles, allowing you to load only the tiles necessary to display a particular detail level.

While texture tiling has been done in software before, it had certain limitations. Moving it to hardware removes them. The benefits are impressive enough that texture data which previously took up 3GB of RAM can be stored in only 16MB of RAM! Not only is that a drastic reduction in size, it can also allow more detailed worlds than before, since developers now have a lot more texture storage available.
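Rough sketch of why that works, assuming the DirectX 11.2 64KB tile size (the scene numbers below are made up for illustration):

```python
# Back-of-envelope sketch of why tiled resources (PRT) save memory.
# Assumes the DirectX 11.2 tile size of 64KB and a BC3-compressed
# texture (1 byte per texel); the visible-tile count is invented.

TILE_BYTES = 64 * 1024  # D3D11.2 fixed tile size

def full_texture_bytes(width, height, bytes_per_texel=1.0):
    """Memory if the whole texture must be resident (ignoring mips)."""
    return int(width * height * bytes_per_texel)

def resident_bytes(visible_tiles):
    """Memory if only the tiles actually sampled are resident."""
    return visible_tiles * TILE_BYTES

# A 16384x16384 BC3 texture is 256MB fully resident...
full = full_texture_bytes(16384, 16384)
assert full == 256 * 1024 * 1024

# ...but if the camera only touches ~200 tiles this frame,
# the resident set is just 12.5MB.
print(resident_bytes(200) / (1024 * 1024))  # 12.5
```

The savings scale with how little of the texture is actually on screen, which is why the demo numbers sound so extreme.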

Beyond textures (and presumably the reason Microsoft prefers to call this "partial resident resources"), the technique can also be applied to other areas, such as shadows through shadow mapping.

For the Xbox One this is particularly important, since using this technique the 32MB of eSRAM can theoretically address up to 6GB worth of tiled textures going by those numbers. Couple the eSRAM's ultra fast bandwidth with tiled texture streaming middleware like Granite, and the eSRAM just became orders of magnitude more important for your next gen gaming. Between software developments such as this and the implications of the data move engines' LZ encode/decode compression capabilities for making cloud gaming practical on common broadband connections, Microsoft's design choice of going with embedded eSRAM for the Xbox One is beginning to make a lot more sense. Pretty amazing, huh?

So yeah, 100GB on broadband connections will take insanely long, you're fricken right.

BUT STILL NOTHING TO DO WITH WHAT I HAVE TO SAY AND WHAT WAS IMPORTANT. SADLY, YOU COULDN’T FIGURE IT OUT.

DirectX 11.2 is the first DirectX to support hardware-based tiled resources.

The Xbox One’s eSRAM does go straight to the GPU. MS has actually been using this type of eDRAM/eSRAM approach since the Xbox 360. For example, beginning with titles as early as Kameo, it used the 360's eDRAM and GPU ONLY for particle effects, without ever touching the CPU. This frees up the CPU and CPU/RAM/GPU bandwidth for other processes. BAM.

The way it works is that it needs to evict unused tiles and re-load them as needed, quickly swapping lower res tiles for higher res tiles and vice versa. The eSRAM is basically a scratchpad for this, built on chip, so the big advantage is that you don't have to deal with the latency of GPU/RAM access or tie up resources going back and forth. You also don't have to deal with the bottlenecks of the software implementation, and it doesn't suck up additional CPU resources. I think the point you are missing here is that there's no need for your GPU to go back to your RAM pool if you can do this within the confines of the eSRAM. The eSRAM is the dedicated hardware for tiled resources, and DirectX 11.2 contains the APIs to take advantage of it. This technique is a perfect fit to leverage the 32MB of on-chip eSRAM. There is a difference between storing your tiles in RAM and in on-chip eSRAM.

It's a big deal and a real "game changer", I believe. Given the chance, most developers would prefer tiled rendering to what we use today; the problem was that there wasn't any hardware support for it.

The hardware-level implementation of this functionality is not in the memory controller (which is unique to the Xbox One due to its RAM configuration) but in the main APU itself (according to AMD's talk on Jaguar-based APUs), which is shared between the PS4 and Xbox One. Microsoft has implemented API calls in DirectX to utilize this hardware directly, skipping overhead that developers had to deal with in DX10 and DX11, as opposed to other APIs (OpenGL).

In terms of cloud gaming, developers are probably not ready yet to require online connectivity for their non-MP focused games, but remote cloud storage and potential processing is great for the entire industry.

THE MOVE ENGINES:

The data move engines on the Xbox One differ from the DMA on the PS4. For one, the Xbox One has four of them, each with slightly different capabilities, as noted in the diagram. The thing is that the PS4's DMA doesn't support compression, just decompression. This makes a difference when you're talking about two-way transfers, and it's an important distinction especially if you plan on doing real-time offloading to and from a server.
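To show what LZ compression on a transfer path buys you, here's the same idea done in software with Python's zlib (the move engines do this in fixed-function hardware without burning CPU or GPU time; the data here is synthetic):

```python
# Software stand-in for hardware LZ compression on a data path:
# compress before transfer, decompress on arrival. The payload is
# deliberately repetitive, standing in for texture/asset data.
import zlib

payload = (b"BRICK_TILE_ROW_" * 100)[:1024] * 1024  # 1MiB buffer

compressed = zlib.compress(payload, level=6)
ratio = len(compressed) / len(payload)

# Repetitive data shrinks a lot, so the same bus moves more per second.
assert ratio < 0.1
assert zlib.decompress(compressed) == payload
print(f"{len(payload)} -> {len(compressed)} bytes")
```

Real-world game data compresses far less dramatically than this toy buffer, but the principle (effective bandwidth goes up by roughly the compression ratio) is the same.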

While the PS4 has its own advantages, with additional CUs and raw horsepower, it can't emulate a missing slab of silicon memory, missing hardware compression capabilities built into the silicon, or a different hardware configuration to match some of the unique advantages the Xbox One has. They both have their advantages. Sony gave the PS4 more raw power and more programmable compute units, but their functions are different. It's going to have to rely on some software emulation, whether by tasking the CPU or dedicating some of its CUs to it (which would naturally take away from its GPU capabilities), if developers want to match some of the X1 capabilities that come from the on-chip eSRAM and its dedicated compression hardware. Each console has dedicated hardware for specific applications.

So it seems like Microsoft did its homework on all things graphics and built a great console.

The Xbox One's architectural design is built for tiled rendering and, surprisingly, for cloud computing. The PS4 is simply built for raw power, and it seems as if that was Sony's only answer, which Microsoft could have pulled off too if that had been the only option.

You said it yourself, MS is rich.

Maybe they didn’t choose GDDR5 for a reason. (Funny how you took my sarcasm in my previous post.)

Interestingly, there's an interview with Hihopgamer and Chris Doran regarding the Xbox One and PS4, and apparently both consoles start to chug when using 5-6GB of RAM at 60fps, so developers have no trouble filling up that RAM. So yeah, MS did its homework, and just like on the 360 they're using the eSRAM to address some of the most common bottlenecks in game development.

If Epic decides to marry its streaming tech with support for MS's hardware-based implementation of partial resident textures, not only should it save a lot of RAM, bandwidth and CPU, it would probably also fix some of the issues that plagued the Unreal 3 engine.

I think both the PS4 and XBOX ONE’s builds will work fine. The battle is all in YOUR head.

The thing is, I don’t give a crap about these specs because I would choose gameplay over graphics any day.

If you would choose graphics over gameplay then buy yourself a $3000 gaming rig.

Well the feeling is mutual, since you don't know sh** about what you're talking about.

1- 900p vs 720p is about 56% more pixels, and 1080p has more than double the pixels of 720p, so the difference in Ghosts is actually more than 100%.

2- Take your variables elsewhere. The test was performed using an engine that is used in games, and the PS4 was faster. Most developers use middleware tools like this one in their games, and it actually performed better on PS4, period; the end result is what matters. The CPU on the PS4 was able to beat the one on the Xbox One, for whichever reason. Call it what you want, the Xbox One CPU was behind, so yeah, you are wrong. Clock speed isn't everything when it comes to CPUs, you should have known that. The PS4 is a true HSA design with hUMA, unlike the Xbox One.

3- PRT is also on the PS4, period. There is no way around it, there is no advantage there, and it is not MS exclusive. Tiled resources, PRT, hell, it has like two more names; it's on the PS4, it is not an advantage for the Xbox One, and OpenGL used it way before MS put it in DX.

4- If you can fit 6GB of textures in 32MB of RAM, how much can you fit in 5GB? The PS4 doesn't need eSRAM; it has a direct link to its memory, unlike the Xbox One. PRTs can be stored directly in the PS4's memory without having to move them anywhere.

5- Streaming textures from the cloud would mean MS has to serve part of the game's assets from the cloud. Loading textures from the cloud would be a disaster, because any latency will make everything pop in all of a sudden, cause warping, or stop things loading at all: invisible cars, invisible buildings. It is a joke.

6- DX is late to the tiled resources party; OpenGL has been there since 2011, and guess what, the PS4 has it. DX is irrelevant; anything it can do, other software like OpenGL can do too.

7- Particle effects these days are done by all the GPUs out there, not by the CPU, so your point is irrelevant again; Cell was doing more than that on the PS3. On the PS4 the GPU handles that using its memory, period. Even if the eSRAM weren't there, MS would still generate those particles on the GPU using main memory. WTF man, stop sucking up to the eSRAM. It is just a bypass to help with the bandwidth limitations that come from MS using DDR3, nothing more, nothing less. It is not the Xbox One's secret sauce, and it surely introduced complexity into the unit, because not everything is suitable to live there.

8- For the 10th time, the PS4 can use tiled resources. It is not an advantage of the Xbox One. Get that into your head.

9- Please dude, DMEs will change sh**. It is 2014, both units are out, and the version of Ghosts on PS4 runs at 1080p while the Xbox One version is 720p. DMEs, tiled resources, eSRAM, none of that crap changed anything. The Xbox One has several games that are 720p, and this is launch. The only game on PS4 that is not 1080p is BF4, which runs with 50% more pixels on PS4 while also running 10 FPS faster across the board than the Xbox One version. The time for secret sauces is over; the Xbox One will never catch the PS4, it can't, period.

10- The Xbox One was built like a sh** console. It runs hotter than the PS4 while gaming even though the PS4 has the stronger GPU; it is way bigger than the PS4 yet it doesn't even have a built-in power supply; and it is only doing 1080p on crappy sports games or racing games, because it couldn't manage 1080p even in a damn fighting game.

11- There wasn't any sarcasm in your post. You don't know sh** about what you're talking about, and you are just quoting other morons who go on about the secret sauce. eSRAM, DMEs, tiled resources, all that is sh** and has helped the Xbox One with nothing. The Xbox One is being badly outdone in multiplatform games by the PS4, and it will continue to be that way. The cloud will change nothing, and neither will eSRAM.

12- Oh, by the way, anything that has to do with the cloud means you mandatorily need to be online, because as you know the cloud doesn't work offline. That means any game that requires the cloud requires online, and that was one of the things many people complained about at E3, that the Xbox One required it just for games. Take that into account, because that will have a cost too.
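For what it's worth, the pixel math behind point 1 checks out; here it is as a quick snippet:

```python
# Pixel-count comparison behind the 720p/900p/1080p argument.
pixels = {
    "720p":  1280 * 720,   # 921,600
    "900p":  1600 * 900,   # 1,440,000
    "1080p": 1920 * 1080,  # 2,073,600
}

# 900p has 56.25% more pixels than 720p...
print(pixels["900p"] / pixels["720p"] - 1)   # 0.5625

# ...and 1080p has 125% more, i.e. well over double.
print(pixels["1080p"] / pixels["720p"] - 1)  # 1.25
```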

#140 MrXboxOne
Member since 2013 • 799 Posts

@tormentos said:

@Tighaman said:

@acp_45: Thank you. I've been trying to tell this guy TORM for the longest time that he needs to read more. Hopefully he will shut up and read outside of System Wars.

Oh dude, shut up. You ride secret sauce like any other moron here. Where in hell do you think MS will get 10% more power from a damn driver update? Well, I guess Sony can get the same 10% from a driver update, since both GPUs are from the same family.

This will be nothing. 10% will not even help the Xbox One close the 1080p-720p gap. It's funny how some of you downplay the 40% or more power advantage the PS4 has, yet want to believe that 10% of the Xbox One's total power means anything.

@mrxboxone said:

@I_can_haz said:

@mrxboxone: PS4 has "nothing to show for it" and yet you have more PS4 games than XBone games. Hahahahaha, you just selfowned yourself, idiot.

Those are Blu-rays, dumbass. So delusional you are.

Does it crush your spirit to know that GameSpot, IGN, Eurogamer, Adam Sessler, GameTrailers, Rev3gaming, Digital Foundry, Game Informer and others (ALL the top gaming website experts) all award Ryse "hands down the best graphics of next gen"?

Even your own poll says Ryse. Your console is weak and worthless....

Yeah, because you totally wouldn't hide your games and set up the picture to look the way you wanted, right?

Does it crush you that the Xbox One has 19 games and that they are lower rated overall vs the 29 games the PS4 has?

Not only does the PS4 have more games, they are also higher rated.. hahaha

The Xbox has no games... lol

LOL, you really think anyone cares about 27 indie games? I don't. 90% of PS4 gamers don't even buy them unless they're free, lol.

Forza: the only AAAE on a next gen platform, rated higher than anything on PS4. KI, DR3 and Ryse are all as good as or better than the PS4 games....... Enjoy that fun fact. Good luck twisting that.

Does it crush your manhood to know I just wrecked you? LOL

#141 Edited By urbansys
Member since 2003 • 235 Posts

@tormentos said:

@ronvalencia said:
@tormentos said:


5- Totally irrelevant again; the PS4 CPU is proving to have an edge on the Xbox One CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single core CPU test... LOL... Have you realized that the PS4's CPU-to-main-memory connection is around 20 GB/s peak, while the X1's is about 55 GB/s (out of a peak of 68 GB/s)?

Even if you combine the peak 20 GB/s (main memory write) with the 10 GB/s (Onion/Onion+ link, write direction to the GPU), it is still inferior to the X1's CPU memory writes.

On multi-core CPU texture generation, the X1 would be faster. Your cited benchmark can backfire on you.

On gaming PCs and the PS4, texture generation should be done on the GPGPU, where it's linked to higher memory bandwidth and a higher CU count.

I have a link with information and a test; you have only your biased lemming opinion. All the crap you say is useless.

Ghosts is 1080p vs 720p, so much for all your crappy secret sauce and theories. But but JIT compression, but but tiled resources, but but 133GB/s alpha blending... lol

Quote me again when the Xbox One is pulling 1080p in an FPS; otherwise you're wasting your time.

My link >>>>>>>>>>>>>>>> your opinion. But but the Xbox One is faster... it sure isn't showing in that test.

He quoted you specifics and all you can talk about is the res difference. By the way, how are those fps dips in CoD Ghosts on that PS4....

Sure sign you've stumped a fanboy right there, bro.

#142 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos:

PS4 titles such as Killzone: Shadow Fall are using 800MB for render targets alone.

At Microsoft’s BUILD event this year, the team showed the hardware based tiled resources support added in DX11.2.

3GB of textures were able to be stored in 16MB of RAM.

Hardware tiled resources offer a bunch of improvements over the shader-based method available in Granite 1.8. Especially when using HQ filtering modes, such as full anisotropic trilinear filtering with a high degree of anisotropy, there are clear advantages.

Firstly, the shader can be simplified, reducing the instruction count from around 30 to 10-15 shader instructions in the case of anisotropic filtering.

Secondly, since no overlap needs to be used on tile borders, cache use and compression can be improved.

Finally, streaming throughput is improved by 33%, as no mipmaps have to be generated for uploaded tiles.
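That 33% figure lines up with the usual mip-chain math (my own illustration, not from the Granite docs): each mip level is a quarter the size of the one above it, so the full chain adds roughly a third on top of the base level.

```python
# Why skipping mip generation for uploaded tiles saves ~1/3 of the work:
# the mip chain below level 0 is a geometric series 1/4 + 1/16 + ...,
# which converges to 1/3 of the base level's size.
def mip_chain_overhead(levels):
    """Extra data in `levels` mips, as a fraction of the base level."""
    return sum(0.25 ** i for i in range(1, levels + 1))

print(mip_chain_overhead(10))  # ≈ 0.3333
```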

The eSRAM is the dedicated hardware for tiled resources and DirectX 11.2 contains the APIs to take advantage of it. AS I’VE SAID BEFORE. -_-

Microsoft has implemented APIs in DirectX 11.2 so that developers don't have to write their own implementation from scratch.

The more exciting implications come from combining tiled textures with the cloud. Developers could go crazy, since they wouldn't have to store these massive textures on a disc. I would imagine the possibility of actually streaming the tiles straight from the cloud in real time, thanks to the LZ encode/decode capabilities of the data move engines, straight into the eSRAM to be fed to the GPU. Or using the cloud to process your procedural textures for free rather than depending on your CPU. This idea is amazing.

The eSRAM and data move engines are not simply a workaround for bandwidth, and labeling them as such (which I have seen numerous times) is disingenuous to the X1's design. They were specifically included for applications beyond mitigating bandwidth limitations: tiled textures, shadow mapping, cloud offloading, cloud streaming, and of course a very fast scratchpad for the GPU. Simply put, eSRAM is superior to both GDDR5 and DDR3 for certain applications. That's why it's there, not just to boost bandwidth.

The PS4 supports the hardware implementation of PRT too; the difference is that it would need to dedicate part of its RAM and RAM/GPU bandwidth to it while dealing with the GPU/RAM latency.

DirectX 11.2 introduces new technology called tiled resources. Essentially, what tiled resources does is increase and decrease graphics fidelity based on location and on what the player is actively viewing.

To simplify: imagine your house rendered as a video game. The room you are in and the rooms visible to you are rendered as usual, but as you approach something it maintains the same quality because its render quality has been increased, whereas the objects you are moving away from, and the room you no longer occupy, have had their render quality decreased.

It's like an automatic light dimmer, for video games.
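A toy version of that house example (the distance thresholds are invented purely for illustration):

```python
# Pick a mip/detail tier from distance, so nearby tiles load at full
# resolution and far ones degrade - the core idea behind the "house"
# analogy. Mip 0 is sharpest; each doubling of distance drops one mip.
import math

def mip_for_distance(distance, base=4.0, max_mip=6):
    if distance <= base:
        return 0
    return min(max_mip, int(math.log2(distance / base)) + 1)

print(mip_for_distance(2.0))    # 0  (room you're in: full quality)
print(mip_for_distance(16.0))   # 3  (down the hall: coarser tiles)
print(mip_for_distance(500.0))  # 6  (clamped: lowest resolution)
```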

OpenGL has a very similar (yet not complete) extension.

The PS4 puts that load on the CPU and GPU, causing optimization troubles that will only get worse as complexity increases.

The Xbox One shines here, as CPU cache/register/eSRAM snooping allows compute to be offloaded to and retrieved from specialized CUs seamlessly, alleviating CPU/GPU load.

The 1.3 TFLOP machine can address 6GB through 32MB, on call!

That is much better than 1.8 TFLOPs rendering full texture loads!

50% more TFLOPs does not necessarily mean the PS4 will have 50% more graphics ability.

The PS4 is stronger raw, hands down. This won't make Xbox One fans happy, but that's the truth of it; it just isn't really important. Now let's look at the diminishing returns of raw GPU performance. Once the PS4's raw compute limit is reached, games will cease to increase in detail, complexity and performance. 1.8 TFLOPs today is nothing great, especially for an off-the-shelf card. That's why games are buggy atm. That's why Crytek says CryEngine could already max out these consoles.

So why does a dev with a graphics engine that can already max out a next gen console go exclusive with the supposedly less powerful console?

No, not because MS paid them. 'Cause that would be FANBOY LOGIC.

The answer is because the PS4 can't render it at a stable FPS. Once the PS4 hits its performance cap there is no workaround; you just can't do anything about it. The Xbox One's performance cap is raised by factors of (I DON'T EVEN FREAKING KNOW) by tiled resources. We don't even know by how much yet, since big devs are only now starting to use it. The Xbox One has a low level API that runs seamlessly with the high level DX11.2 and tiled resources. Game changer.

The PS4 does not have access to it; they are using OpenGL to try to duplicate it, but it's nowhere close to what the Xbox One does with it.

32MB = 6+GB of textures on call. Think about the implications of that number for a minute, once it starts getting used widespread and efficiently.

When MS said they would let the games do the talking, they meant it, because once more games launch using this resource there will be no denying the proof. Ryse is just the first wave.

Now what happens when a 1.8 TFLOP machine runs out of headroom to render, while a 1.3 TFLOP (raw) machine can convert GBs of data on the fly using only MBs of memory address space?

MS didn't care about GDDR5, as its cost would have put too much loss on Xbox One sales, when tiled resources (referred to as TR from this point on) significantly change the game.

Host/guest GPUs: it is R280X based, but with DDR3 memory it won't hit the same TFLOP number. They just wanted the processing power of the GPU and the latency of DDR3. Used in conjunction with TR, it's crazy.

An 8GB texture render takes 100% GPU load on the PS4, while with TR the XB1 can do it with 48MB once devs utilize it. Ryse is just the beginning!

http://www.youtube.com/watch?feature=player_embedded&v=EswYdzsHKMc Watch this please. It’s pretty cool.

When devs start utilizing TR to its full potential, by the time that 1.8 TFLOP PS4 has reached its computational limit because of memory space and compute, the Xbox One will be able to render exponentially more data and exponentially more fidelity, because it can take far more data and render it using only a fraction of the memory. 32MB rendering 6GB and more!

Microsoft created a hardware accelerated version of an existing technique, and it offers a lot of improvements, because now DirectX takes care of a lot of the problems programmers used to have to deal with when implementing tiled resources (mainly blending issues).

How many TFLOPs does it take to call on 32MB? So not only is the GPU load significantly lower, it means the GPU can call on significantly more data using far fewer resources.

http://www.youtube.com/watch?v=QB0VKmk5bmI&feature=player_detailpage Watch this please. Yet again, COOL!

So that is one of the many techs inside that will separate the Xbox One from the PS4. Developers still do not know how powerful this tech is, because it cannot be calculated from raw TFLOP numbers. The hardware argument used around most of the net is completely moot, as it only considers the raw GPU specs and not the APU as a whole, and both systems are running custom APUs with key differences.

The important thing is that both will have the same type of technique, but one will have the more refined version.

But the part that is interesting is the cloud. If this could happen, then hardware would become less and less relevant. See this as something good.

Not because you want to glorify your console.

Comparing multigenerational games like BF4 or COD: Ghosts is irrelevant, because they are most likely not using this technique, since those games also run on PS3 and Xbox 360, which don't support it. They're poorly optimized on all platforms, not just Xbox One. Exclusives on PS4 and Xbox One will rely more on the technique.

If you are going on about 1080p then you are easily IMPRESSED. I was expecting 4K.

But it isn’t important to me.

Anyway both consoles are good.

So why are you downplaying something like this, when it’s good news.

#143 Edited By MlauTheDaft
Member since 2011 • 5189 Posts

@ronvalencia said:
@tormentos said:


5- Totally irrelevant again; the PS4 CPU is proving to have an edge on the Xbox One CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single core CPU test... LOL... Have you realized that the PS4's CPU-to-main-memory connection is around 20 GB/s peak, while the X1's is about 55 GB/s (out of a peak of 68 GB/s)?

Even if you combine the peak 20 GB/s (main memory write) with the 10 GB/s (Onion/Onion+ link, write direction to the GPU), it is still inferior to the X1's CPU memory writes.

On multi-core CPU texture generation, the X1 would be faster. Your cited benchmark can backfire on you.

On gaming PCs and the PS4, texture generation should be done on the GPGPU, where it's linked to higher memory bandwidth and a higher CU count.

It's a little vague, isn't it?

It says 1 CPU, not 1 core.

#144 Edited By nicecall
Member since 2013 • 528 Posts

Those PS4-to-Xbox comparisons look like they used an old RCA video cable to capture the PS4 screenshots, either that or used Photoshop and blurred the image like 10x over. It's weird that random people would waste time modifying screenshots to make themselves feel better about their Xbox One.

#145 Edited By stereointegrity
Member since 2007 • 12151 Posts

@ronvalencia said:
@tormentos said:


5- Totally irrelevant again; the PS4 CPU is proving to have an edge on the Xbox One CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single core CPU test... LOL... Have you realized that the PS4's CPU-to-main-memory connection is around 20 GB/s peak, while the X1's is about 55 GB/s (out of a peak of 68 GB/s)?

Even if you combine the peak 20 GB/s (main memory write) with the 10 GB/s (Onion/Onion+ link, write direction to the GPU), it is still inferior to the X1's CPU memory writes.

On multi-core CPU texture generation, the X1 would be faster. Your cited benchmark can backfire on you.

On gaming PCs and the PS4, texture generation should be done on the GPGPU, where it's linked to higher memory bandwidth and a higher CU count.

The X1 is 30, not 55... and that's only for reads into RAM. MS wanted to say hUMA and HSA, but per their own layouts it's not there, and AMD even said it wasn't.

#146 HaloinventedFPS
Member since 2010 • 4738 Posts

Isn't this just because 10% of the Xbone's power was reserved for Kinect? I'm going to assume Kinect doesn't really need that much and they gave it back to the Xbone. There is no way you can just pull 10% more power out of your ass.

#147 Edited By iwasgood2u
Member since 2009 • 831 Posts

New firmware to bring in 10% more lemmings

#148 SystemWarsMan
Member since 2007 • 913 Posts

Shen, lol, still lying about being a dev? I thought the UI wasn't working as intended. The guy is a straight-up liar and an MS hater.

#149 Edited By Douevenlift_bro
Member since 2013 • 6804 Posts

@farrell2k said:

Great, so it will still be 40% slower than the PS4 and 50-60% slower than $400 PC hardware. You XB720 owners got such a wonderful deal. What was it PT Barnum said about suckers?

LOOL TLHBO.

#150 Gargus
Member since 2006 • 2147 Posts