PS4 RAM better than Xbox One RAM, based on AMD's website


This topic is locked from further discussion.


#1 zero_cool098
Member since 2006 • 1886 Posts

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Can anyone say otherwise?


#2 Truth_Hurts_U
Member since 2006 • 9703 Posts

Ownage... :)

PS4 > XBO

$399 > $499

Victory!


#3 rjdofu
Member since 2008 • 9171 Posts
You just notice this :|? Where've you been all this time? Up the mountain?

#4 Truth_Hurts_U
Member since 2006 • 9703 Posts

[QUOTE="rjdofu"]You just notice this :|? Where've you been all this time? Up the mountain?[/QUOTE]

Buttt... Butt... Forza Motor SPORT!!!

:lol:

He's just trying to own anyone saying stuff is gonna look better on XBO over PS4.


#5 rjdofu
Member since 2008 • 9171 Posts

[QUOTE="rjdofu"]You just notice this :|? Where've you been all this time? Up the mountain?Truth_Hurts_U

Buttt... Butt... Forza Motor SPORT!!!

:lol:

He's just trying to own anyone saying stuff is gonna look better on XBO over PS4.

It's teh cloudz bro ;).

#6 SRTtoZ
Member since 2009 • 4800 Posts

GDDR5 is better than DDR3. GDDR5 is also more costly, but MS needed to cut corners when they wanted to include a motion sensor in the package.


#7 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="zero_cool098"]
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Can anyone say otherwise?
[/QUOTE]

X1 has 32MB ESRAM to patch the performance gap with GDDR5.


#8 zero_cool098
Member since 2006 • 1886 Posts

[QUOTE="zero_cool098"]

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Anyone can say anything otherwise?

ronvalencia

X1 has 32MB ESRAM to patch the performance gap with GDDR5.

Can you back that up with a good article? So far, all I've found says 32MB of ESRAM won't be enough, per this link: http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram (not exactly the most credible-looking site). Can you show one that actually supports it in a positive light?


#9 T-Bone91
Member since 2013 • 283 Posts
[QUOTE="rjdofu"]You just notice this :|? Where've you been all this time? Up the mountain?[/QUOTE]

All lemmings are up on the cloudz. Hence the hallucinations resulting in idiotic threads due to lack of oxygen.

#10 WilliamRLBaker
Member since 2006 • 28915 Posts

[QUOTE="SRTtoZ"]GDDR5 is better than DDR3. GDDR5 is also more costly, but MS needed to cut corners when they wanted to include a motion sensor in the package.[/QUOTE]

That would be incorrect. GDDR5 is good for some things and bad for others; DDR3 is good for some things and bad for others. There is no measuring stick by which GDDR5 is suddenly, flat-out better than DDR3, otherwise we'd have seen PCs using it as main RAM years ago when GDDR5 was released. Instead, DDR4 is being worked on and is expected to reach the market in 2013.

#11 zero_cool098
Member since 2006 • 1886 Posts

[QUOTE="SRTtoZ"]

GDDR5 is better than DDR3. GDDR5 is also more costly but MS needed to cut corners when they wanted to include a motion sensor in the package.

WilliamRLBaker

That would be incorrect, GDDR5 is good for some things, Bad for others. DDR3 is good for some things bad for others there is not some measuring stick where GDDR5 is all of a sudden down pat better than DDR3 otherwise we'd have seen PC's using it as main ram years ago when GDDR5 was released. Instead DDR4 is being worked upon and is expected to be released in 2013 to the market.

perhaps for a multi purpose traditional pc that's true, but we are talking about machines dedicated for gaming and the charts supports it. As for DDR4 it doesn't matter since the xboxone has ddr3 and its a comparison between xbox1 vs ps4


#12 WilliamRLBaker
Member since 2006 • 28915 Posts

[QUOTE="WilliamRLBaker"][QUOTE="SRTtoZ"]

GDDR5 is better than DDR3. GDDR5 is also more costly but MS needed to cut corners when they wanted to include a motion sensor in the package.

zero_cool098

That would be incorrect, GDDR5 is good for some things, Bad for others. DDR3 is good for some things bad for others there is not some measuring stick where GDDR5 is all of a sudden down pat better than DDR3 otherwise we'd have seen PC's using it as main ram years ago when GDDR5 was released. Instead DDR4 is being worked upon and is expected to be released in 2013 to the market.

perhaps for a multi purpose traditional pc that's true, but we are talking about machines dedicated for gaming and the charts supports it. As for DDR4 it doesn't matter since the xboxone has ddr3 and its a comparison between xbox1 vs ps4

because as every one knows these consoles don't have cpu's in them and don't use the non-existant cpu for physics, and other hard hitting computations that would benefit from DDR3, these consoles also don't have multiple purposes built into them only gaming ....lol.....*laughs hard* but but but but but DDR4 doesn't matter in a console debate... Its a perfectly applicable argument when it comes to countering a claim that GDDR5 is across the board better than DDR3 when its not, Otherwise as I said we'd have seen a transition from DDR3 to GDDR5 years ago, but we didn't because GDDR5 isn't across the board better than DDR3, and why the next iteration will be DDR4. GDDR5 in this instance possibly is better in some ways than DDR3 but the claim of end all be all better is fallacy.

#13 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="zero_cool098"]

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Anyone can say anything otherwise?

zero_cool098

X1 has 32MB ESRAM to patch the performance gap with GDDR5.

can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.
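
As a rough back-of-the-envelope sketch of that claim: the 50GB/s-per-direction figure is from the quote above, while the dual-channel DDR3-1600 number behind the cache is an assumption about a typical Haswell system, not something stated in the article.

```python
# Rough sketch, not an official figure: if reads and writes to the eDRAM can
# overlap and cache misses stream from DDR3 in parallel, the peak combined
# traffic lands roughly in the 100 - 130 GB/s band Intel quotes for an
# "equivalent" GDDR interface.
edram_per_direction_gbs = 50.0   # from the AnandTech quote above
ddr3_gbs = 25.6                  # assumed dual-channel DDR3-1600 behind the cache

combined_peak = 2 * edram_per_direction_gbs + ddr3_gbs
print(f"~{combined_peak:.0f} GB/s combined peak")  # ~126 GB/s
```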


#14 killzowned24
Member since 2007 • 7345 Posts

:P

 

http://www.youtube.com/watch?v=HSyRUL3pNkM


#15 legalize82
Member since 2013 • 2293 Posts

[QUOTE="zero_cool098"]

[QUOTE="ronvalencia"] X1 has 32MB ESRAM to patch the performance gap with GDDR5.

ronvalencia

can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

 

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

 

u never give up huh. admit it ps4 is a beats compared to xbox one

 

let it go


#16 btk2k2
Member since 2003 • 440 Posts

[QUOTE="zero_cool098"]

[QUOTE="ronvalencia"] X1 has 32MB ESRAM to patch the performance gap with GDDR5.

ronvalencia

can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

So you are saying that AMD and MS are using a similar system to the one Intel is using with Crystalwell? Do you have any links to show that the memory interface on the X1 APU is comparable to the one used on this Intel APU?

#17 legalize82
Member since 2013 • 2293 Posts

[QUOTE="ronvalencia"]

[QUOTE="zero_cool098"] can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

btk2k2

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

 

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

So you are saying that AMD and MS are using a similar system to the one Intel is using with Crystalwell? Do you have any links to show that the memory interface on the X1 APU is comparable to the one used on this Intel APU?

 

its not and he knows it but he never gives up.

xbox one is just a mainstream console


#18 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="zero_cool098"] can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

btk2k2

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

So you are saying that AMD and MS are using a similar system to the one Intel is using with Crystalwell? Do you have any links to show that the memory interface on the X1 APU is comparable to the one used on this Intel APU?

1. AMD PRT makes the smaller/faster memory into a hardware-managed last-level cache.

2. SRAM has lower latency than DRAM, e.g. no refresh overheads.

3. Intel's eDRAM has 50GB/s per direction, while AMD's eSRAM has 102 GB/s per direction.

4. AMD's eSRAM is GPU-centric.

DX11.2's tiled resources go together with the ESRAM.

From http://semiaccurate.com/2013/06/18/a-glimpse-of-future-amd-graphics-offerings/

AMD has some future PC GCNs (e.g. Venus PRO MCM) with embedded DRAM.

AMD is betting on 3 horses:

1. external GDDR5

2. external GDDR6

3. embedded DRAM, embedded SRAM, or stacked embedded memory.
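
For a hedged sense of scale: the 102GB/s per-direction ESRAM figure is from the list above, while the 68.3GB/s DDR3 and 176GB/s GDDR5 numbers are the widely reported launch specs, and all of these are peak figures, not sustained ones.

```python
# Peak-bandwidth sketch under the assumption that X1 render targets sit in the
# 32 MB ESRAM while textures stream from DDR3, versus the PS4's single pool.
XB1_DDR3_GBS  = 68.3    # widely reported 256-bit DDR3-2133 figure (assumption)
XB1_ESRAM_GBS = 102.0   # per direction, per the list above
PS4_GDDR5_GBS = 176.0   # widely reported GDDR5 figure (assumption)

print(f"X1 best case:  ~{XB1_DDR3_GBS + XB1_ESRAM_GBS:.0f} GB/s, only for data that fits the 32 MB ESRAM")
print(f"X1 worst case: ~{XB1_DDR3_GBS:.0f} GB/s when the working set misses the ESRAM")
print(f"PS4:           ~{PS4_GDDR5_GBS:.0f} GB/s uniform across the whole 8 GB")
```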


#19 SecretPolice
Member since 2007 • 44277 Posts

NEO's holistic architecture > PS4's piecemeal architecture. :P


#20 ronvalencia
Member since 2008 • 29612 Posts
[QUOTE="zero_cool098"]

[QUOTE="WilliamRLBaker"] That would be incorrect, GDDR5 is good for some things, Bad for others. DDR3 is good for some things bad for others there is not some measuring stick where GDDR5 is all of a sudden down pat better than DDR3 otherwise we'd have seen PC's using it as main ram years ago when GDDR5 was released. Instead DDR4 is being worked upon and is expected to be released in 2013 to the market.WilliamRLBaker

perhaps for a multi purpose traditional pc that's true, but we are talking about machines dedicated for gaming and the charts supports it. As for DDR4 it doesn't matter since the xboxone has ddr3 and its a comparison between xbox1 vs ps4

because as every one knows these consoles don't have cpu's in them and don't use the non-existant cpu for physics, and other hard hitting computations that would benefit from DDR3, these consoles also don't have multiple purposes built into them only gaming ....lol.....*laughs hard* but but but but but DDR4 doesn't matter in a console debate... Its a perfectly applicable argument when it comes to countering a claim that GDDR5 is across the board better than DDR3 when its not, Otherwise as I said we'd have seen a transition from DDR3 to GDDR5 years ago, but we didn't because GDDR5 isn't across the board better than DDR3, and why the next iteration will be DDR4. GDDR5 in this instance possibly is better in some ways than DDR3 but the claim of end all be all better is fallacy.

The CPU's cache hides most of the latency between the CPU and external memory, hence the very slow migration. Without AMD PRT or DX11.2's tiled resources, current texture-related workloads are not cache friendly.
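
A minimal sketch of the standard average-memory-access-time model behind that point; the hit rates and latencies below are illustrative assumptions, not measured console figures.

```python
# AMAT = hit time + miss rate * miss penalty. With a high hit rate the CPU
# rarely pays the full DRAM latency, so DDR3-vs-GDDR5 latency differences are
# largely hidden; texture streaming with poor locality is where this breaks down.
def amat_ns(hit_rate, cache_hit_ns, dram_ns):
    return cache_hit_ns + (1.0 - hit_rate) * dram_ns

for hit_rate in (0.95, 0.50):            # cache-friendly vs cache-unfriendly workload
    for dram_ns in (60.0, 90.0):         # assumed DDR3-ish vs GDDR5-ish latencies
        print(f"hit rate {hit_rate:.0%}, DRAM {dram_ns:.0f} ns -> "
              f"AMAT {amat_ns(hit_rate, 10.0, dram_ns):.1f} ns")
```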

#21 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="btk2k2"][QUOTE="ronvalencia"]

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

 

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

legalize82

So you are saying that AMD and MS are using a similar system to the one Intel is using with Crystalwell? Do you have any links to show that the memory interface on the X1 APU is comparable to the one used on this Intel APU?

 

its not and he knows it but he never gives up.

xbox one is just a mainstream console

By the time of Q4 2013, both consoles are mainstream. AMD already displaced their top 7870 GE (Pitcairn XT) with 7870 XT (Tahiti LE)..

#22 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="zero_cool098"]
http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Can anyone say otherwise?
[/QUOTE]

Those charts are really misleading. At best, it's 25% faster (and that was just one game). But, if you look at the chart, it looks like it is 2-3 times as fast.


#23 Bruce_Benzing
Member since 2012 • 1731 Posts

[QUOTE="ronvalencia"]

[QUOTE="zero_cool098"] can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

legalize82

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

 

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

 

u never give up huh. admit it ps4 is a beats compared to xbox one

 

let it go

I love when Cows still pretend they have an iota of a clue what he's taling about. Moove along and leave the discussion to the people that can actually comprehend...


#24 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="zero_cool098"]

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Anyone can say anything otherwise?

StormyJoe

Those charts are really misleading. At best, it's 25% faster (and that was just one game). But, if you look at the chart, it looks like it is 2-3 times as fast.

Did you actually read the side labels on the chart? For example, 120%, 140%

48593.png


#25 btk2k2
Member since 2003 • 440 Posts

[QUOTE="btk2k2"][QUOTE="ronvalencia"]

http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

...

It turns out that for current workloads, Intel didnt see much benefit beyond a 32MB eDRAM however it wanted the design to be future proof

....

Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package.

ronvalencia

So you are saying that AMD and MS are using a similar system to the one Intel is using with Crystalwell? Do you have any links to show that the memory interface on the X1 APU is comparable to the one used on this Intel APU?

1. AMD PRT makes the smaller/faster memory into hardware managed last level cache.

2. SRAM has lower latency than DRAM e.g. no refresh overheads.

3. Intel's eDRAM has 50GB/s per direction, while AMD's eSRAM has 102 GB/s per direction.

4. AMD's eSRAM is GPU centeric.

DX11.2's tile resource goes together with ESRAM.

From http://semiaccurate.com/2013/06/18/a-glimpse-of-future-amd-graphics-offerings/

AMD has some future PC GCNs (e.g. Venus PRO MCM) with embedded DRAMs.

AMD betting on 3 horses

1. external GDDR5

2. external GDDR6

3. Embedded DRAM or Embedded SRAM or stacked embedded memory.

None of what you have written actually answers the question I asked. Is the system AMD/MS are using on the X1 comparable to the system Intel are using on the architectural level? If they are can you provide a link to back up this claim? 1) PRT is available on both consoles so it is not an advantage to the X1 2) Granted and agreed 3) ESRAM <> EDRAM and do we know if the interface AMD is using is as good as Intels? This is not a like for like comparison so is there anything in the wild made by AMD that shows itself to be similar to the Intel implimentation? 4) A meaningless fact without context. No different to saying that the PS4s GDDR5 is GPU centric AMD's future SOC's have no bearing on what they are producing now with the X1 and the PS4.

#26 Gargus
Member since 2006 • 2147 Posts

[QUOTE="SRTtoZ"]GDDR5 is better than DDR3. GDDR5 is also more costly, but MS needed to cut corners when they wanted to include a motion sensor in the package.[/QUOTE]

It's only more costly to the end user.

Sony sells the same kind of camera separately for only 60 dollars, and a PS4 plus the camera is still 40 dollars cheaper than an Xbox One with its camera, and the PS4 has hardware advantages.

Microsoft just wanted to charge more, is all; the simplest answer is usually the correct one.


#27 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ronvalencia"]

[QUOTE="btk2k2"] So you are saying that AMD and MS are using a similar system to the one Intel is using with Crystalwell? Do you have any links to show that the memory interface on the X1 APU is comparable to the one used on this Intel APU?btk2k2

1. AMD PRT makes the smaller/faster memory into hardware managed last level cache.

2. SRAM has lower latency than DRAM e.g. no refresh overheads.

3. Intel's eDRAM has 50GB/s per direction, while AMD's eSRAM has 102 GB/s per direction.

4. AMD's eSRAM is GPU centeric.

DX11.2's tile resource goes together with ESRAM.

From http://semiaccurate.com/2013/06/18/a-glimpse-of-future-amd-graphics-offerings/

AMD has some future PC GCNs (e.g. Venus PRO MCM) with embedded DRAMs.

AMD betting on 3 horses

1. external GDDR5

2. external GDDR6

3. Embedded DRAM or Embedded SRAM or stacked embedded memory.

None of what you have written actually answers the question I asked. Is the system AMD/MS are using on the X1 comparable to the system Intel are using on the architectural level? If they are can you provide a link to back up this claim? 1) PRT is available on both consoles so it is not an advantage to the X1 2) Granted and agreed 3) ESRAM <> EDRAM and do we know if the interface AMD is using is as good as Intels? This is not a like for like comparison so is there anything in the wild made by AMD that shows itself to be similar to the Intel implimentation? 4) A meaningless fact without context. No different to saying that the PS4s GDDR5 is GPU centric AMD's future SOC's have no bearing on what they are producing now with the X1 and the PS4.

1. The key statement for AMD PRT, from AMD:

"Partially Resident Textures (PRT) enables future games to utilize ultra-high resolution textures with the same performance as today's small and often repetitive textures."

This basically says the GPU will continue to operate at high performance without being pulled down by memory-related bottlenecks. Relative to the X1, the PS4 has fewer memory bottlenecks, i.e. a fast, single-speed memory setup.

The PC also has a slow-plus-fast memory setup, hence why Intel, AMD and NVIDIA are supporting the related DX11.2 extension. AMD PRT is just a workaround for systems with memory-related bottlenecks.

To keep high frame rates with traditional game engines, the X1 would have to lower its texture quality. AMD PRT + ESRAM was designed to work around this problem. On the PC, the smaller/faster GDDR5 VRAM would be used in place of the ESRAM.

The PS4 doesn't have this problem, hence the workaround has very little value for it.

3. AMD/MS/VGLeaks' theoretical number is higher than Intel's theoretical number.

4. The context was against Intel's eDRAM, not GDDR5.

I frame my statements outside of X1 vs PS4.
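
A toy software sketch of the partially-resident-texture idea being described, just to make the mechanism concrete; it is not AMD's or Microsoft's actual implementation, and the tile IDs and pool capacity are made-up numbers.

```python
from collections import OrderedDict

class TilePool:
    """Keeps only recently used tiles of a huge virtual texture in a small
    pool of fast memory (standing in for ESRAM/VRAM), LRU-evicting the rest."""
    def __init__(self, capacity_tiles):
        self.capacity = capacity_tiles
        self.resident = OrderedDict()          # tile_id -> tile data, in LRU order
        self.misses = 0

    def sample(self, tile_id):
        if tile_id in self.resident:           # hit: tile already in fast memory
            self.resident.move_to_end(tile_id)
            return self.resident[tile_id]
        self.misses += 1                       # miss: stream tile from slow memory
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)  # evict the least recently used tile
        self.resident[tile_id] = f"tile {tile_id}"
        return self.resident[tile_id]

pool = TilePool(capacity_tiles=4)              # stands in for a small ESRAM budget
for tile in [0, 1, 0, 2, 3, 0, 1, 4, 0, 2]:    # repetitive texture reads -> mostly hits
    pool.sample(tile)
print("resident:", list(pool.resident), "misses:", pool.misses)
```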


#28 Heil68
Member since 2004 • 60721 Posts

Technology --> Innovation --> Value

The most technologically advanced console in the world, the PS4. The PlayStation®4 system is the best place to play, with dynamic, connected gaming, powerful graphics and speed, intelligent personalization, deeply integrated social capabilities, and innovative second-screen features. Combining unparalleled content, immersive gaming experiences, all of your favorite digital entertainment apps, and PlayStation® exclusives, the PS4 system focuses on the gamers.

#29 FoxbatAlpha
Member since 2009 • 10669 Posts
OLD.....

#30 shawn30
Member since 2006 • 4409 Posts

Great. I will happily await the games that clearly show off this amazing graphical difference between the PS4 and Xbox One. Until then I will choose the system that offers the most things I am interested in, and not just better graphics and the number 4 instead of the number 3. Xbox One Day 1 :)


#31 Douevenlift_bro
Member since 2013 • 6804 Posts

[QUOTE="rjdofu"]You just notice this :|? Where've you been all this time? Up the mountain?[/QUOTE]
But Titanfall... Have you SEEN it??


#32 lostrib
Member since 2009 • 49999 Posts

dammit! aren't we done with this shit


#33 navyguy21
Member since 2003 • 17458 Posts
I think it was clear from the beginning.....

#34 robybaggio
Member since 2004 • 562 Posts
Does PS3 v.2 have Cloud and Titanfall? I thought so. Carry on.

#35 timbers_WSU
Member since 2012 • 6076 Posts

I just want the systems to come out. Cows are gonna end up pretty disappointed with the graphics gap. Of course they will deny it but that's another story. I am just tired of the 360 and PS3 and want something new.


#36 robybaggio
Member since 2004 • 562 Posts

[QUOTE="rjdofu"]You just notice this :|? Where've you been all this time? Up the mountain?Douevenlift_bro

But Titanfall... Have you SEEN it??

You won't. :D


#37 lostrib
Member since 2009 • 49999 Posts

[QUOTE="robybaggio"]Does PS3 v.2 have Cloud and Titanfall? I thought so. Carry on.[/QUOTE]

It can do "The Cloud" if needed. And why are you bragging about a multiplat?


#38 starwarsjunky
Member since 2009 • 24765 Posts

[QUOTE="StormyJoe"]

[QUOTE="zero_cool098"]

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Anyone can say anything otherwise?

ronvalencia

Those charts are really misleading. At best, it's 25% faster (and that was just one game). But, if you look at the chart, it looks like it is 2-3 times as fast.

Did you actually read the side labels on the chart? For example, 120%, 140%

48593.png

ouch. didn't realize it was THAT much better. ~25% better when using just HALF as much(1gb vs 2gb)?!?! thats nuckin futs!

#39 StormyJoe
Member since 2011 • 7806 Posts

[QUOTE="StormyJoe"]

[QUOTE="zero_cool098"]

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Anyone can say anything otherwise?

ronvalencia

Those charts are really misleading. At best, it's 25% faster (and that was just one game). But, if you look at the chart, it looks like it is 2-3 times as fast.

Did you actually read the side labels on the chart? For example, 120%, 140%

48593.png

Perhaps I am confused. That chart represents GDDR5 vs DDR3 for threee different GPUs (6770, 6750, and 6730). Isn't this apples and oranges when comparing the PS4 and XB1? I thought they used the same GPU (AMD Radeon 7000 series), but the PS4 has 18 cores instead of the 12 the XB1 has?


#40 clr84651
Member since 2010 • 5643 Posts

PS4 has better specs than X1. We'll see how they're used in game development to improve PS4 exclusives.


#41 StormyJoe
Member since 2011 • 7806 Posts

From the link below:

"

The PS4, in comparison, has an 8-core Jaguar AMD CPU, with a GPU thats around the same level as the Radeon 7870 (which is significantly more powerful than the 7790). The PS4 has 8GB of GDDR5 RAM, providing 176GB/s of bandwidth to both the CPU and GPU. The Xbox One mostly ameliorates this difference with 32MB of high-speed SRAM on the GPU, but it will be a more complex architecture to take advantage of.

In both consoles, the CPU and GPU will be on the same die (an AMD APU). Just as the PS4 has 8GB of high-speed memory that is shared by the CPU and GPU, the Xbox One, by virtue of being based on the same APU heterogeneous system architecture (HSA), will probably be the same. In short, while there are small hardware differences between the consoles, they will ultimately have very similar performance characteristics. The PS4, with its one, big block of fast RAM, and bigger GPU, probably has the edge."

http://www.extremetech.com/gaming/156273-xbox-720-vs-ps4-vs-pc-how-the-hardware-specs-compare

Most non-fanboy sites I have searched say the same thing. The PS4 will probably be slightly more powerful. I have yet to read an article by a tech site that claims, for certain, that this "50% more powerful" figure is a reality, especially since the XB1 has the 32 MB of SRAM.

I am not saying that the "50% more powerful" figure isn't true, I just have not read an article that states that figure as fact.
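
For what it's worth, the 176GB/s figure in that quote falls straight out of bus width times transfer rate; the bus widths and transfer rates below are the widely reported launch specs rather than anything stated in the quoted article.

```python
# GB/s = (bus width in bits / 8 bits per byte) * effective transfer rate in GT/s
def bandwidth_gbs(bus_width_bits, transfer_rate_gts):
    return bus_width_bits / 8 * transfer_rate_gts

print(f"PS4 GDDR5: {bandwidth_gbs(256, 5.5):.0f} GB/s")    # 256-bit at 5.5 GT/s -> 176 GB/s
print(f"XB1 DDR3:  {bandwidth_gbs(256, 2.133):.1f} GB/s")  # 256-bit DDR3-2133 -> ~68.3 GB/s
```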


#42 Truth_Hurts_U
Member since 2006 • 9703 Posts

It's 33% more powerful... People keep saying 50% because they don't know math.

SOOOOO STOP SAYING 50%!!!

God...
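
Here's the arithmetic, using the widely reported launch shader counts (1152 vs 768 GCN shaders) and assuming equal clocks, which is the comparison this thread is arguing about; which percentage you get depends on which console you take as the baseline.

```python
ps4_shaders, xb1_shaders = 1152, 768    # widely reported launch specs (assumption)

print(f"PS4 over XB1: {ps4_shaders / xb1_shaders - 1:.0%} more shader throughput")   # 50%
print(f"XB1 under PS4: {1 - xb1_shaders / ps4_shaders:.0%} less shader throughput")  # 33%
# 1152/768 = 1.5x and 768/1152 = 0.67x describe the same gap; the percentage you
# quote depends on which console you use as the baseline.
```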


#43 tormentos
Member since 2003 • 33784 Posts

[QUOTE="ronvalencia"]X1 has 32MB ESRAM to patch the performance gap with GDDR5.[/QUOTE]

Yes, 32 MB, which will not be enough.

Having 32MB of fast RAM is not the same as having 8GB of fast RAM. No matter what, only the connection to the ESRAM is fast; the rest is slow-ass DDR3.


#44 tormentos
Member since 2003 • 33784 Posts

[QUOTE="ronvalencia"]

[QUOTE="zero_cool098"]

http://www.amd.com/us/products/technologies/gddr5/Pages/gddr5.aspx#2

Anyone can say anything otherwise?

zero_cool098

X1 has 32MB ESRAM to patch the performance gap with GDDR5.

can you back that up with some good article? so far all I found is that 32mb ESRAM wont be enough as per this link http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram not exactly the best in credibility by how the website looks, but can you show one that actually support it on a positive light?

 

Anandtech PC site say the same it will not be enough.


#45 messedupworld
Member since 2013 • 101 Posts
[QUOTE="Heil68"]Technology-->Innovation-->Value The most technologically advanced console in the world, the PS4. The PlayStation®4 system is the best place to play with dynamic, connected gaming, powerful graphics and speed, intelligent personalization, deeply integrated social capabilities, and innovative second-screen features. Combining unparalleled content, immersive gaming experiences, all of your favorite digital entertainment apps, and PlayStation® exclusives, the PS4 system focuses on the gamers.

Another day, another dollar eh Heli68.

#46 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts
xbox one got shitted on

#47 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts
[QUOTE="SRTtoZ"]

GDDR5 is better than DDR3.  GDDR5 is also more costly but MS needed to cut corners when they wanted to include a motion sensor in the package.

WilliamRLBaker
That would be incorrect, GDDR5 is good for some things, Bad for others. DDR3 is good for some things bad for others there is not some measuring stick where GDDR5 is all of a sudden down pat better than DDR3 otherwise we'd have seen PC's using it as main ram years ago when GDDR5 was released. Instead DDR4 is being worked upon and is expected to be released in 2013 to the market.

in this case, GDDR5 is noticeably better then the DDR3 ram setup in the Xbox One, The gddr5 in the ps4 doesnt suffer from latency issues and so the ps4 has the best of both worlds (fast bandwidth/low latency)

#48 deactivated-5ba16896d1cc2
Member since 2013 • 2504 Posts

[QUOTE="StormyJoe"]
From the link below:

"The PS4, in comparison, has an 8-core Jaguar AMD CPU, with a GPU that's around the same level as the Radeon 7870 (which is significantly more powerful than the 7790). The PS4 has 8GB of GDDR5 RAM, providing 176GB/s of bandwidth to both the CPU and GPU. The Xbox One mostly ameliorates this difference with 32MB of high-speed SRAM on the GPU, but it will be a more complex architecture to take advantage of.

In both consoles, the CPU and GPU will be on the same die (an AMD APU). Just as the PS4 has 8GB of high-speed memory that is shared by the CPU and GPU, the Xbox One, by virtue of being based on the same APU heterogeneous system architecture (HSA), will probably be the same. In short, while there are small hardware differences between the consoles, they will ultimately have very similar performance characteristics. The PS4, with its one, big block of fast RAM, and bigger GPU, probably has the edge."

http://www.extremetech.com/gaming/156273-xbox-720-vs-ps4-vs-pc-how-the-hardware-specs-compare

Most non-fanboy sites I have searched say the same thing. The PS4 will probably be slightly more powerful. I have yet to read an article by a tech site that claims, for certain, that this "50% more powerful" figure is a reality, especially since the XB1 has the 32 MB of SRAM.

I am not saying that the "50% more powerful" figure isn't true, I just have not read an article that states that figure as fact.
[/QUOTE]

The PS4 GPU has 50% more shader performance, and around 33% more overall performance, than the Xbox One GPU... but the PS4 GPU will blow away the Xbox One GPU in compute.

#49 tormentos
Member since 2003 • 33784 Posts

 

[QUOTE="ronvalencia"]
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

...

"It turns out that for current workloads, Intel didn't see much benefit beyond a 32MB eDRAM; however, it wanted the design to be future proof."

...

"Intel claims that it would take a 100 - 130GB/s GDDR memory interface to deliver similar effective performance to Crystalwell since the latter is a cache. Accessing the same data (e.g. texture reads) over and over again is greatly benefitted by having a large L4 cache on package."
[/QUOTE]

From the same article:

"Unlike previous eDRAM implementations in game consoles, Crystalwell is true 4th level cache in the memory hierarchy. It acts as a victim buffer to the L3 cache, meaning anything evicted from L3 cache immediately goes into the L4 cache. Both CPU and GPU requests are cached. The cache can dynamically allocate its partitioning between CPU and GPU use. If you don't use the GPU at all (e.g. discrete GPU installed), Crystalwell will still work on caching CPU requests. That's right, Haswell CPUs equipped with Crystalwell effectively have a 128MB L4 cache."

"The eDRAM clock tops out at 1.6GHz."

"There's only a single size of eDRAM offered this generation: 128MB. Since it's a cache and not a buffer (and a giant one at that), Intel found that hit rate rarely dropped below 95%. It turns out that for current workloads, Intel didn't see much benefit beyond a 32MB eDRAM; however, it wanted the design to be future proof. Intel doubled the size to deal with any increases in game complexity, and doubled it again just to be sure."

eDRAM and ESRAM are not the same, which is another detail you miss.

The eDRAM on the Xbox 360 is even faster than the ESRAM on the Xbox One: it's 256GB/s. Sure, it doesn't have all the advantages the ESRAM has, but it is faster. Also, the eDRAM you quote is clocked at 1.6GHz, while the Xbox One's, I think, runs at 800MHz.

By the way, Intel = AMD since when?

Ronvalencia the selective reader.

 


#50 Mrmedia01
Member since 2007 • 1917 Posts

It's a no-brainer: the PS4 has a big RAM advantage.