GDDR5 vs DDR3


This topic is locked from further discussion.


#1 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

Please no fanboys.

Just speak sense. No Fanboy logic.

Be open-minded and try not to be biased toward PS4 or Xbox One.

Thanks

Why did Sony go for GDDR5 RAM, and why did MS go with DDR3?

They are both rich, and not stupid at all. A lot of money could be at stake, so why make these choices?

The following doesn't take the Xbox One's ESRAM into account.

A computer (server or console) doesn't need a large pool of DDR5 memory to perform at a high level.

The only reason Sony went with one unified pool of DDR5 memory is that they have the CPU and GPU on the same package, sharing a single memory controller.

MS had the same dilemma as Sony: one memory controller limits you to using only one type of memory, since each memory type has its own controller.

So you either go with cheaper DDR3 or you go with more expensive DDR5.

The problem with using all DDR3 is that your GPU needs faster memory to do its work. Sony solved this problem by just going all DDR5. 4 GB would have been the economical choice, but they wanted 8 GB, mainly with system functions in mind.

MS decided to solve the problem another way: go with 8 GB of the more economical (and more easily available) memory and deal with the bandwidth problem with DMEs and a fast L4 cache. The DMEs + L4 cache are like an HOV lane for GPU work only.
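To make the "HOV lane" idea concrete, here is a minimal sketch of the kind of placement decision a developer faces. This is not MS's actual API; the 32 MB budget matches the Xbox One's ESRAM size, while the buffer names and sizes are hypothetical. The point is that only a handful of bandwidth-hungry render targets fit in the small fast pool, and everything else stays in DDR3.

```python
# Hypothetical sketch: deciding which buffers go into a small, fast on-chip
# pool (ESRAM-style, 32 MB assumed) versus the large DDR3 pool.
ESRAM_BUDGET_MB = 32

buffers = [
    # (name, size in MB, bandwidth-hungry?)  -- illustrative values only
    ("color_target_1080p", 8, True),
    ("depth_stencil_1080p", 8, True),
    ("gbuffer_normals", 8, True),
    ("shadow_map_2k", 16, True),
    ("streamed_textures", 2048, False),
    ("static_meshes", 1024, False),
]

def place_buffers(buffers, budget_mb):
    """Greedy placement: bandwidth-hungry buffers go to the fast pool while they fit."""
    fast, slow, used = [], [], 0
    for name, size, hungry in buffers:
        if hungry and used + size <= budget_mb:
            fast.append(name)
            used += size
        else:
            slow.append(name)
    return fast, slow, used

fast, slow, used = place_buffers(buffers, ESRAM_BUDGET_MB)
print(f"fast pool ({used}/{ESRAM_BUDGET_MB} MB): {fast}")
print(f"main RAM: {slow}")
```

Even in this toy version the shadow map no longer fits once three targets are placed, which is the sort of juggling the later posts mean when they say ESRAM complicates things for developers.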

So we have two companies that faced the same problem but chose to tackle it in different ways achieving the same results.

Sony chose the brute force method and MS chose to make a few adjustments.

Sony went with 8 GB mainly for system functions; going from 4 GB to 8 GB will have a negligible effect on graphics.

PS4 graphics would have been fine with 4 GB of memory, but with all they want to do with the OS (new dash, sharing, Gaikai, etc.), more memory was necessary.

I personally think GDDR5 is sort of a bad choice in a way. The PS4 suffers from a bit of GDDR5 latency, although Mark Cerny said in an article that it's "NOT MUCH OF A FACTOR". But in the article he only talks about the GPU, where higher latency won't be a problem at all. What I was wondering is why he didn't talk about the CPU. The OS might be affected by latency, logically.

8 GB of GDDR5 was included for more storage space. The PS4's GPU won't come close to using half of it; the other 4 GB is for game data and OS functions. You'll have game logic and data just sitting in half of your expensive memory while your GPU barely needs 3 GB of it, and I'm being generous. GDDR5 is basically cache memory: whatever goes in there doesn't sit there for very long.

It's really like buying a 1 TB SSD and storing mostly family pictures and documents on it.

But it still fixes the problem.

I don't want to sound biased at all. I love my PS4. But I think MS came up with the better idea: an efficient, well-designed solution to the same problem, engineering high-speed traffic lanes to its memory.

Adding ESRAM to the Xbox One introduces a few complications, not for the hardware but for the developers.

The ESRAM is making things a bit harder for the devs.

Reportedly, tests showed that the ESRAM does boost the Xbox One's bandwidth to 192 GB/s.

Not sure what the PS4's is, but I think 192 GB/s is still fast even if the PS4's is higher.

Apparently, some game developers who had access to the console before it was launched claimed that the Xbox One’s bandwidth is faster than even MS thought.

But GDDR5 is still very tough. I wonder if DDR3 plus ESRAM actually comes close to it.


#2  Edited By clyde46
Member since 2005 • 49061 Posts


#3 Heil68
Member since 2004 • 60681 Posts

DDR5 is better.


#4 Floppy_Jim
Member since 2007 • 25930 Posts

@Heil68 said:

DDR5 is better.

GDDR5, chum. The G stands for "Great".


#5 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@Heil68: Yes, but did you read anything I just posted.

It’s irrelevant.


#6 Heil68
Member since 2004 • 60681 Posts

@acp_45 said:

@Heil68: Yes, but did you read anything I just posted?

It’s irrelevant.

It's not irrelevant, because that's what SONY chose.


#7 way2funny
Member since 2003 • 4570 Posts

@acp_45 said:

So you either go with cheaper DDR3 or you go with more expensive DDR5.

DDR5 is not the same as GDDR5


#8 lostrib
Member since 2009 • 49999 Posts

hasn't this shit been done to death?


#9  Edited By Scipio8
Member since 2013 • 937 Posts

@Heil68:

LOL, there is no DDR5 yet; DDR4 is only starting to appear. Both GDDR5 and DDR3 are from the same family.


#10  Edited By MonsieurX
Member since 2008 • 39858 Posts

lol brofists thread


#11 stereointegrity
Member since 2007 • 12151 Posts

PS4 doesn't have latency issues due to GDDR5:

http://www.redgamingtech.com/ps4-vs-xbox-one-gddr5-vs-ddr3-latency/


#12  Edited By DarthaPerkinjan
Member since 2005 • 1318 Posts

Hermits use DDR3 in their 1440p ultra settings 120fps PCs, it must be great


#13  Edited By lostrib
Member since 2009 • 49999 Posts

@DarthaPerkinjan said:

Hermits use DDR3 in their 1440p ultra settings 120fps PCs, it must be great

That's system RAM. Most gaming GPUs use GDDR5 for their memory.


#15 NFJSupreme
Member since 2005 • 6605 Posts

If you look at the design for each console you would understand why both use the type of RAM that they use.


#16 Telekill
Member since 2003 • 12061 Posts

You guys are still debating this crap? How about you let it rest and just enjoy the games you have for whichever system you own?


#17  Edited By AzatiS
Member since 2004 • 14969 Posts

@acp_45: Too much blabla.

The facts are clear.

PS4 currently outperforms X1 on major multiplatforms. This time it's not about weird architectures or broken hardware... it's about power.

When the same titles at the same graphical quality run at 1080p on PS4, there is nothing to discuss here; that's all that matters in the end. If we add another fact, that the PS4 costs $100 less while being capable of better visuals... then things are getting complicated for X1. That's all there is to it.

Imho what is happening to M$ with the X1 is because of the Wii's success. They tried to introduce something between a casual system with motion controls and good graphics, plus multimedia capabilities (everything the Wii lacked, even the Wii U)... they tried to establish a system for everyone, but they failed.

That's what I call a bad decision, personally. Trying to be something you were never meant to be (all in one, hence the "One" name imho)... a gimmick-control console but at the same time a powerful console, aiming for the super casual audience as well as the hardcore, etc. Backfired in da face. That's how I see it. There's nothing else to discuss: why DDR3, why the weaker GPU, why Kinect, why $500... Too late for that!


#18  Edited By AzatiS
Member since 2004 • 14969 Posts

@DarthaPerkinjan said:

Hermits use DDR3 in their 1440p ultra settings 120fps PCs, it must be great

In fact I'm using 16 GB of DDR3 at 1600 MHz PLUS 3 GB of dedicated GDDR5 graphics memory on my GPU. The X1 is using DDR3 as graphics memory!! That's ridiculous for a $500 video gaming product in 2013... Even $100 GPUs nowadays provide solid GDDR5.


#19 ShepardCommandr
Member since 2013 • 4939 Posts

@Heil68 said:

DDR5 is better.

That's all you need to know


#20 deactivated-58abb194ab6fb
Member since 2010 • 3984 Posts

@Telekill said:

You guys are still debating this crap? How about you let it rest and just enjoy the games you have for whichever system you own?

Amen. This is irrelevant to you unless you are actually making games for one of these consoles. If you're just playing games, it doesn't matter what the RAM is; just shut up, play the game, and enjoy it. End of story.


#21  Edited By Pray_to_me
Member since 2011 • 4041 Posts

Also, DDR2 has lower latency than DDR3; are you going to upgrade your PC RAM to DDR2?


#22  Edited By lostrib
Member since 2009 • 49999 Posts

@ShepardCommandr said:

@Heil68 said:

DDR5 is better.

That's all you need to know

How can nonexistent RAM be better?


#23 deactivated-5f19d4c9d7318
Member since 2008 • 4166 Posts

@Pray_to_me said:

Also, DDR2 has lower latency than DDR3; are you going to upgrade your PC RAM to DDR2?

Lol, it's ironic listening to lems being so critical of latency when they were all hyping cloud-based graphics only a couple of months back.


#24  Edited By lglz1337
Member since 2013 • 4959 Posts

GDDR5 >>>>> DDR3, TC, no matter what you try to discuss.


#25  Edited By navyguy21
Member since 2003 • 17401 Posts

This is the simple truth to anyone with a background in technology or at least an advanced understanding:

GDDR5 is faster, simple as that

DDR3 is super slow for graphics, but perfect for multitasking

GDDR5 was built for graphical tasks

DDR3 with move engines and ESRAM is a perfect solution to the bandwidth difference, but complicates game development

GDDR5 isn't magical and doesn't make a system more powerful

ESRAM and move engines cannot and will not make up for the power differences in the consoles

The difference in power is SMALLER this gen than last gen

PS4 has the simpler, more efficient hardware for gaming

XB1 is better at multitasking and always will be because of DDR3

XB1 will be fine this gen when compared to PS4 games.

PS4 exclusives will look better.........again

Multiplats will be the same after drivers are updated/code is optimized for ESRAM, but will have more stable framerates on PS4

The fact that drivers were outdated proves XB1's design was rushed and adjusted on the fly too late.

Increasing GPU and CPU speed so late basically locked XB1's launch titles at 720p, along with the outdated drivers

Sony has the better collection of devs that push hardware, so PS4 will have MORE great looking games.

In the end, the consoles will be equal, its just a matter of which you prefer.


#26  Edited By BattlefieldFan3
Member since 2012 • 361 Posts

@navyguy21 said:

XB1 is better at multitasking and always will be because of DDR3

Funny how you say Xbone is better at multitasking when it can't install more than one game without lagging and stuttering the entire system.

The PS4 can install 11 games at once without a single hiccup.


#27 MonsieurX
Member since 2008 • 39858 Posts

@BattlefieldFan3 said:


Funny how you say Xbone is better at multitasking when it can't install more than one game without lagging and stuttering the entire system.

The PS4 can install 11 games at once without a single hiccup.

Calling bullshit on that.


#28  Edited By 04dcarraher
Member since 2004 • 23824 Posts

DDR3 vs GDDR5 involves multiple factors that determine how well each does across different tasks.

DDR3 and GDDR5 are related: GDDR5 is basically higher-clocked, modified DDR3 with a controller that doubles the I/O of DDR3. GDDR5 handles input and output on the same cycle, while normal memory takes two cycles.

DDR3 benefits from low latency at the expense of bandwidth; GDDR5's case is the opposite. Timings for GDDR5 are unbelievably slow in relation to DDR3, but the speed of VRAM is blazing fast in comparison with DDR3; this has resulted from the relative workloads that a CPU and a GPU undertake. Latency isn't much of an issue for GPUs, since their parallel nature allows them to move to other calculations when latency cycles cause a stall in the current workload/thread. The performance of a graphics card, for instance, is greatly affected by altering the internal bandwidth, i.e. the bus.

The DDR3 in the X1 is on a 256-bit bus moving 68 GB/s, buffered by the ESRAM, which is a 1024-bit memory with 170+ GB/s that can read/write 4x faster than GDDR5 cycle for cycle. Also remember that the GDDR3 used on GPUs like the 8800GT, 8800GTS, or the AMD 4850 series had bandwidth under 68 GB/s. With the 256-bit DDR3, plus the ESRAM as a cache/buffer for the bus, the X1 should be able to handle general multitasking and gaming just fine within its hardware limits.

Now, with the GDDR5 in the PS4, the only thing they have to worry about is the higher latency that comes with GDDR5's high clock rates. But prefetching to the GPU, and using the bidirectional 10 GB/s bus between the CPU and GPU, will let the GPU process certain things for the CPU quicker than the CPU grabbing data from GDDR5 and waiting longer than it would with DDR3.
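For reference, a quick back-of-the-envelope check of the main-memory numbers above. The data rates are assumptions based on the published console specs (DDR3-2133 on Xbox One, 5.5 Gbps GDDR5 on PS4); the ESRAM path is left out because its effective figure depends on how reads and writes overlap.

```python
# Peak bandwidth (GB/s) = bus width in bytes x transfers per second (GT/s).
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return bus_width_bits / 8 * data_rate_gtps

print(f"XB1 256-bit DDR3-2133   : {peak_bandwidth_gbs(256, 2.133):.1f} GB/s")  # ~68 GB/s
print(f"PS4 256-bit GDDR5 5.5GT : {peak_bandwidth_gbs(256, 5.5):.1f} GB/s")    # ~176 GB/s
```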


#29  Edited By remiks00
Member since 2006 • 4249 Posts

@navyguy21: I concur, sir. I'm not sure about the power gap though; lots of misinformation about this topic to be sure (outdated drivers, etc.), and no one really knows the truth; only time will tell. But at least you're not an idiot poster, and I respect that.

------------------------------------------------------------

*looks at acp_45's post history* ........sigh, another troll bait thread.

This has been done to death. There's so much misinformation floating around everywhere about this crap that only time will tell which design was better. For now, the PS4 flat out appears to be the better solution for gaming compared to the other consoles; plain and simple. And that's okay! No one is saying that the other consoles don't offer exclusives or fun games that are worth playing, which they do. I like Killer Instinct, and I'm a big Dead Rising fan, even though it's not that great of a series lol. (Frank West was cool).

I digress.

All these tiled resources, PRT, ESRAM, latency, blah blah blah blah theories have yet to come to fruition enough to even be worth a damn in a debate. Xbone was clearly designed as an ALL-in-one machine with a heavy focus on Kinect (because they really wanted to penetrate the casual market like the Wii). PS4 was designed as a gaming machine that also has multimedia capability (which will be patched in).

So until then, all this crap is nothing but mere farts in the wind. Enjoy the games, gamers.


#30  Edited By Gaming-Planet
Member since 2008 • 21064 Posts

@Floppy_Jim said:

@Heil68 said:

DDR5 is better.

GDDR5, chum. The G stands for "Great".

Stands for Graphics...


#31  Edited By MdBrOtha04
Member since 2003 • 1828 Posts

Let's see what AMD thinks on the matter.

http://www.youtube.com/watch?v=HSyRUL3pNkM


#32  Edited By navyguy21
Member since 2003 • 17401 Posts

@BattlefieldFan3: I said all of that and that's the only thing you respond to?

Should I use analogies?

GDDR5 is like a Ferrari, DDR3 is a Porsche 911.

Game development is the Autobahn, CPU processing is off-road.

Without a doubt, the Ferrari would smoke the Porsche doing what it was built to do: GAMING.

Put that car in a CPU environment (unpredictable, short-notice turns, dips, etc.) and it stumbles.

ESRAM is like a supercharger. It gives the Porsche what it needs to be competitive on the Autobahn, but it still has disadvantages.

Sony put in memory controllers, which are essentially off-road tires on a Ferrari.

Does that help?


#33  Edited By brut_fruit
Member since 2006 • 125 Posts

spin! spin!! spin!!!

All the flagship GPUs from AMD (R9 xxx) and Nvidia (GTX 7xx) use GDDR5; if DDR3 was better they would've used that.

GDDR5>>>>DDR3 deal with it lemmings


#34  Edited By MdBrOtha04
Member since 2003 • 1828 Posts

DDR3 is probably better for all the OS functions on the Xbox One, but not for the games.


#35  Edited By Dilrod
Member since 2003 • 4257 Posts

@MonsieurX I can at least confirm downloading and installing 4 games simultaneously on my PS4 while switching back and forth between playing a fifth game, surfing YouTube videos, and playing the games that are still installing, without any lag. I'm on mobile now, but I created a topic about this with a picture. On my Xbox One, I got video latency and sound skipping when trying to watch videos or even play COD Ghosts while it was installing.


#36 remiks00
Member since 2006 • 4249 Posts

@Dilrod said:

@MonsieurX I can at least confirm downloading and installing 4 games simultaneously on my PS4 while switching back and forth between playing a fifth game, surfing YouTube videos, and playing the games that are still installing, without any lag. I'm on mobile now, but I created a topic about this with a picture. On my Xbox One, I got video latency and sound skipping when trying to watch videos or even play COD Ghosts while it was installing.

Damn. I can confirm the PS4 portion, as I have been able to do that as well. My friend's Xbox One did lag video from IE while it was snapped on the right. Could be a browser issue though...


#37 navyguy21
Member since 2003 • 17401 Posts

@brut_fruit said:

spin! spin!! spin!!!

all the flagship gpu's from AMD r9XXX & Nvidia gtx7XX use GDDR5 if its ddr3 was better they would've used that

GDDR5>>>>DDR3 deal with it lemmings

Who said it wasn't better?


#38 Newhopes
Member since 2009 • 4775 Posts

Shows how much the average idiot on those forums knows...


#39  Edited By zarshack
Member since 2009 • 9936 Posts

@Floppy_Jim said:

@Heil68 said:

DDR5 is better.

GDDR5, chum. The G stands for "Great".

So that's the "Greatness" Sony was referring to!


#40 jake44
Member since 2003 • 2085 Posts

Did we go back in time 6 months?


#41  Edited By deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@lostrib: Yes, but those GPUs are crazy compared to these consoles.


#42 lostrib
Member since 2009 • 49999 Posts

@acp_45 said:

@lostrib: Are you crazy?

The first GPU with GDDR5 was only recently announced.

If I’m wrong. :/

...


#43  Edited By AzatiS
Member since 2004 • 14969 Posts

All you need to know

The PS4 runs 1080p on the same title where the X1 manages 720p.

And that's all you need to know!! Everything else doesn't matter; it's just blabla.

Lemmings are acting like cows did 8 years ago... Cell does this and does that and it's 8 cores, while in action, games were slightly better on X360 than on PS3, at least for the first 2 to 4 years.

The proof is out there; why do we need to discuss something like that when we've got both systems running the same games?


#44 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@AzatiS: Sony went for the cheaper APU. They went for raw performance using off-the-shelf parts, basically just adding them to their console and customizing what they added only minimally.

MS went for a more asymmetrical approach, and that's a lot more customized and expensive. I might be wrong about "asymmetrical" but I think I've seen it somewhere.

@04dcarraher: That explains why Sony evaded the CPU part in the article. It shows you how little journalists know about the hardware, technically.

@brut_fruit: I love how I didn't say anything about DDR3 being stronger, or anything about the GPU part at all.


#45 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@remiks00: MISINFORMATION.

Sony itself evaded the importance of the CPU in an interview.

MS ENGINEERS VS DIGITAL FOUNDRY.

Really, misinformation.


#46 blackace
Member since 2002 • 23576 Posts

Don't care, as long as there are great games to be played on both systems. It's all about the games in the end.


#47 stereointegrity
Member since 2007 • 12151 Posts

@acp_45 said:

@AzatiS: Sony went for the cheaper APU. They went for raw performance using off-the-shelf parts, basically just adding them to their console and customizing what they added only minimally.

MS went for a more asymmetrical approach, and that's a lot more customized and expensive. I might be wrong about "asymmetrical" but I think I've seen it somewhere.

Microsoft went for the same exact APU Sony did; they just needed to add the ESRAM to fix the lack of GPU bandwidth...

What do you even mean by asymmetrical?

An APU is an APU is an APU... the only difference between these two is the ESRAM...


#48  Edited By btk2k2
Member since 2003 • 440 Posts

Not this crap again...

The latency for GDDR5 and DDR3 is practically the same. GDDR5 takes more cycles to do its thing (known as the CAS latency) but it also does more cycles in a given period of time. The net result is about the same.

To use a car analogy, it is like a gearbox: in a low gear you need to run the engine at a higher RPM to maintain a given velocity, but in a high gear you can maintain that velocity at a lower RPM. In this case the velocity is the latency in nanoseconds, the RPM is the clock speed of the memory, and the ratio of the gearbox is the CAS latency.

5500 MHz GDDR5 with a CAS latency of 15 has about the same latency as 2133 MHz DDR3 with a CAS latency of 6; most DDR3 has a CAS latency of between 7 and 9, making the actual latency in nanoseconds higher on your average stick of DDR3 than on your average module of GDDR5.
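The gearbox arithmetic, spelled out using the post's own simplified model (treating the quoted MHz figure as the clock the CAS count is paid against, which glosses over the finer points of real DRAM timing):

```python
# Absolute CAS latency in nanoseconds = CAS cycles / clock in MHz * 1000.
def latency_ns(clock_mhz: float, cas_cycles: int) -> float:
    return cas_cycles / clock_mhz * 1000

print(f"GDDR5 5500 MHz, CL15: {latency_ns(5500, 15):.2f} ns")  # ~2.73 ns
print(f"DDR3  2133 MHz, CL6 : {latency_ns(2133, 6):.2f} ns")   # ~2.81 ns
print(f"DDR3  2133 MHz, CL9 : {latency_ns(2133, 9):.2f} ns")   # ~4.22 ns
```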

If you want to continue to spout misinformation about GDDR5 and DDR3 memory latency, and come up with analogies trying to show that GDDR5 is good for graphics but bad for CPU workloads while DDR3 is better for CPU workloads but weaker for graphics, then you are being wilfully ignorant and probably will not listen to reason. If you are one of those posters, then enjoy revelling in your own ignorance and stupidity; have fun with it while the rest of us actually learn something.


#49 Scipio8
Member since 2013 • 937 Posts

For price/performance/TDP DDR3 is better. The PS4 is an unbalanced system.


#50  Edited By deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@btk2k2: Judging me based on what I've heard and the information I have does not make me ignorant. I dig into a lot of topics.

Everything from iPhone vs. other flagship phones, to freaking iOS vs. Windows, and now the next Linux push.

How can you say that I'm ignorant and will not see reason?

Here's what I know. If you disagree, tell me.

DDR3 runs at a higher voltage than GDDR5 (typically 1.25-1.65 V versus GDDR5, which is at ~1 V).

DDR3 uses a 64-bit memory controller per channel (a 128-bit bus for dual channel and 256-bit for quad channel), whereas GDDR5 is paired with 32-bit controllers (16 bits each for input and output).

The GPU can utilise any number of 32-bit I/Os (at the cost of die size) depending on the application: 2 for a 64-bit bus, 4 for 128-bit, 6 for 192-bit, 8 for 256-bit, and so on.
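A small sketch of that controller arithmetic (the 64-bit DDR3 and 32-bit GDDR5 channel widths are the standard figures; everything else here is illustrative):

```python
# Total bus width is just per-channel width x number of channels.
DDR3_CHANNEL_BITS = 64   # CPU-side DDR3 channel width
GDDR5_CHANNEL_BITS = 32  # GPU-side GDDR5 channel width

def bus_width_bits(channel_bits: int, channels: int) -> int:
    return channel_bits * channels

print("dual-channel DDR3:", bus_width_bits(DDR3_CHANNEL_BITS, 2), "bit")  # 128
print("quad-channel DDR3:", bus_width_bits(DDR3_CHANNEL_BITS, 4), "bit")  # 256
for n in (2, 4, 6, 8):
    print(f"{n} GDDR5 channels -> {bus_width_bits(GDDR5_CHANNEL_BITS, n)}-bit bus")
```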

The memory is also fundamentally set up specifically for the application it uses:

System memory (DDR3) benefits from low latency (tight timings) at the expense of bandwidth; GDDR5's case is the opposite. Timings for GDDR5 would seem a little slow in relation to DDR3, but the speed of VRAM is blazing fast in comparison with desktop RAM; this has resulted from the relative workloads that a CPU and a GPU undertake.

Latency isn't much of an issue with GPUs, since their parallel nature allows them to move on to other calculations when latency cycles cause a stall in the current workload.

There is, however, still a bit more latency on GDDR5, notably at least 9 nanoseconds, which isn't much of a difference.

I made this post only to see what people think of this choice, because most people in these forums spit out their console's specs without knowing what's really going on.

I see a lot of Xbox One fanboys throwing ESRAM around, stating that it's faster and that the Xbox One is therefore better.

They are right about it being faster, but the fact is that the ESRAM is making things harder for developers, and they haven't found an efficient solution to it yet.

I see a lot of people throwing CUs around, as if the PS4 having more of them means it's the better console.

But there is a lot more to it than you think.

Also note that both companies might still have something they haven't really talked about, notably the hardware-based Tiled Resources in DirectX 11.2, which they showed off at the MS Build event in 2013.

Which could be something huge.

Just think about this quickly.

They were able to render a world with 3 GB worth of texture data using only 16 MB of memory with Tiled Resources turned on. Now combine that with cloud gaming.
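A back-of-the-envelope version of that demo's numbers (the 64 KB tile size is the standard Direct3D tiled-resource tile size; the 3 GB and 16 MB figures are the ones quoted above):

```python
# Only the tiles actually needed for the current view are kept resident.
TILE_KB = 64
VIRTUAL_TEXTURE_GB = 3
RESIDENT_BUDGET_MB = 16

total_tiles = VIRTUAL_TEXTURE_GB * 1024 * 1024 // TILE_KB  # tiles the texture is split into
resident_tiles = RESIDENT_BUDGET_MB * 1024 // TILE_KB      # tiles that fit in the budget

print(f"virtual tiles : {total_tiles}")                       # 49152
print(f"resident tiles: {resident_tiles}")                    # 256
print(f"resident share: {resident_tiles / total_tiles:.2%}")  # ~0.52%
```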

This isn’t 100% sure but it’s highly possible.

This wouldn't only be good news for MS but for the entire industry.

All in all, there are so many factors that could count for both companies.

I just hate the controversies these days, because they make no sense.