iamrob7's forum posts


#1 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="xboxiphoneps3"] you clearly dont know what you are talking about... the PS4 CPU cores will be clocked at 2.0 ghz each, AND developers can use all of that RAM right away, it wont take "years" for them to finally take advantage of all it... they can take advantage of all the ram right nowtormentos
He actually believes that sony will not use the ram because PC developers are not using it...:lol: I would love to see sony using 5 or 6 GB of ram and trowing incredible textures,effects and things at once,i would love to see how those 660TI and most mid to high range GPU start to choke.. Is something they never think about they think that as long as the GPU is strong nothing can hurt them..:lol: http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/6 They don't want to understand what happen when a GPU get Vram limited.. http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/15 Look at the huge blow that GPU take when they are ram starved,in both cases the 650ti or the 7850 the result was the same,as long as the game doesn't demand allot of ram is ok,as soon as a game ask to much ram the impact is latent.. 33FPS difference between both models of 7850 just because one had 1GB more of ram,in the same resolution.
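
To put rough numbers on that VRAM-starvation point, here is a minimal Python sketch; the bandwidth and overflow figures are illustrative approximations, not measurements from the linked reviews:

```python
# Rough sketch of why overflowing VRAM hurts so much (illustrative
# figures, not taken from the AnandTech charts linked above).

vram_bandwidth_gbs = 154.0  # e.g. a Radeon HD 7850's GDDR5, roughly
pcie3_x16_gbs = 15.75       # theoretical PCIe 3.0 x16 throughput

overflow_gb = 0.5  # per-frame texture data that no longer fits in VRAM

# Streaming the overflow over PCIe instead of reading it from VRAM:
stall_ms = overflow_gb / pcie3_x16_gbs * 1000
print(f"~{stall_ms:.0f} ms of extra transfer per frame")  # ~32 ms

# 32 ms on top of a 16 ms frame budget: the framerate collapses, which
# is the cliff the 1GB cards fall off in the benchmarks above.
```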

 

You don't understand that the resolution of the textures you are using is not going to be above the resolution the game actually runs at, i.e. 1080p.  So in a 1080p game the best texture resolution you will get is 1080p.  1080p textures, with all the current graphical bells and whistles in-game on top of them, use at most 3GB of RAM in total.
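
To make the scale concrete, a back-of-envelope sketch in Python follows; the byte counts are illustrative assumptions rather than figures from any particular engine:

```python
# Back-of-envelope framebuffer arithmetic at 1080p (illustrative byte
# counts; real engines vary widely in render-target layout).

w, h = 1920, 1080
px = w * h                    # ~2.07 million pixels

backbuffer = px * 4           # 32-bit RGBA colour buffer
depth = px * 4                # 32-bit depth/stencil buffer
gbuffer = px * 4 * 4          # four 32-bit targets for deferred shading
msaa_4x = backbuffer * 4      # colour buffer again if 4x MSAA is on

total_mb = (backbuffer + depth + gbuffer + msaa_4x) / 1024**2
print(f"~{total_mb:.0f} MB")  # roughly 80 MB

# The render targets themselves are tiny; it's texture assets that fill
# the remaining gigabytes, and current games top out around 3GB total.
```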

 

No game will be released for a long time that uses anything like the full 8GB available on the PS4.  If ever.  By the time the 8GB becomes relevant, the PC equivalent will be a generation ahead or more.  

 

As for the 650 Ti: of course GPU memory is important, RAM is important.  The point is that 8GB over 3GB is going to be meaningless for the PS4 for the foreseeable future.  3GB over 1GB is a completely different thing; that's a big advantage in plenty of games.


#2 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="Silenthps"]It's not just semantics, GDDR RAM's architecture is vastly different than DDR SDRAM

Silenthps

 

That has absolutely no bearing on anything I've said.  I'm not disputing the differences between DDR and GDDR RAM in any way, shape or form.  In fact I've made them clear repeatedly in this thread.

 

It's like you don't understand anything I said in that post at all, and I don't have the inclination to repeat myself.

 

I didn't say you were disputing it, and I understood what you wrote. By calling it DDR5 RAM, all you're doing is creating more confusion.

 

What confusion am I creating?  What problem is it going to cause?  Please explain the major issue that using "DDR5" to describe GDDR5 being used system-wide is going to cause, when that is how most people interpret it anyway.  GDDR5 being used system-wide is an advancement on the current PC setup; so would DDR5 be, if it existed.  The net effect of either is the same for the purposes of any discussion or interpretation, especially on this board.


#3 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]Wrong consistently.  I don't "want to know what the PS4 will do with the extra ram", I know exactly how RAM is used.  Here is my point 1080p@30 games at their most extreme at 1080p use 3GB RAM total, that's for super bastard settings in the most advanced games at higher framerates thant 30FPS.  Higher quality textures = Higher resolution.  PS4 games are going to be 1080p on release are they not?  So this 8GB DDR5 is not going to be used in a game for years to come, not until they come out with graphical possibilities far beyond what we have now.  Planetside 2 has battles featuring 2000 players, 3GB usage. Crysis 3 uses every graphical trick available right now, 3GB top end usage.  Mid range PC GPU's these days have 2-3GB GDDR5 on top of their system memory.  That's 6 months before the PS4 even releases.  

 

The PS4 has a mid-range PC gaming GPU.  It can't do anything beyond Crysis 3 at 1080p; there is nothing it can handle that would use anything like 8GB of DDR5 right now.  Not even close.

 

So like I said, as is patently obvious, by the time 8GB of DDR5 is relevant and even 60% utilised, the PC equivalent will be years ahead.  It's a meaningless statistic that "cows" are clinging to desperately, because their GPU is average and their CPU is highly questionable.

tormentos

That mentality is so damn wrong it's not even funny. So what was the most graphical game of 2005? You mean to tell me that the Xbox 360 and PS3 have not kicked the living crap out of the most graphical games of 2005-2006? Nothing on the 7800 GTX runs like Halo 4 or Uncharted 3, nothing... not even at 1024x768... You will see how PS4 graphics actually surpass those of Crysis 3. Power is wasted on PC; rather than getting more visuals, in most games the power is used to get more frames. For example, the 7850 will run any game out now on max; the only difference between it and a 7970 GHz Edition is frames per second, as they output the same quality at the same settings. You will learn quickly enough when the PS4 starts to be pushed and you see PS4 games pull away from Crysis 3. Comparing Crysis 3 to Killzone on PS4 is a joke; it's an unfinished launch game. That is like comparing Resistance 1 to Halo 4...

 

http://www.youtube.com/watch?v=jHWPGmf_A_0

 

Crysis 2 on an X1950 Pro, a graphics card from 2006.  Uncharted 3 and Halo 4 use graphical features that didn't exist in 2006.  Can a 2006 PC run them?  I don't see why not, if it can run Crysis 2 just as well as, if not better than, the consoles.

 

Secondly, a 7850 most certainly will not "run any game out now on max" unless a horribly unplayable framerate is acceptable to you.  In which case an X1950 from 2006 can run any modern game on MAX settings, if framerate is irrelevant to you.

 


#4 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

Right now PC gamers are playing at 1080p and above at 60 FPS.  That's what I play at, and no game uses anything like 8GB of RAM.  The most RAM-heavy game I play is DayZ, the Arma 2 mod, coming in at around 2GB of RAM.  Most likely because it is an unoptimised mod for another game.

 

Crysis 3 on ultra at 1080p uses around 1.5GB of RAM.

 

My GPU, a GTX 580 SC, has 1.5GB of DDR5 RAM.  More than enough to cover any game currently at 1080p.  I am on the verge of getting a new GPU, and even the medium-spec GPUs nowadays have 2-3GB of DDR5 RAM.  I'm looking at a GTX 680 with 4GB of DDR5.  Now that is a pretty decent PC, but right now that 4GB of DDR5 will only be relevant if I have a multi-monitor setup running at some obscene resolution.

 

I see a lot of talk of this 8GB of DDR5 RAM on the PS4.  Firstly, the GPU and CPU are far more important features for gaming.  The 8GB is really only for longevity; it is not going to be used for years.  The games coming out on the PS4 will use 1-2GB of it at most for 1080p games (which, as has already been stated, will be the standard for the PS4).

 

So this 8GB of DDR5 RAM is actually completely irrelevant and will be for a number of years, until games start actually using all that RAM.  In the meantime it will just sit there in the console doing nothing, waiting for the day it becomes relevant.  By that time the PC will be running DDR8 or DDR9, or perhaps something beyond that whole concept.

 

TL;DR: by the time the 8GB of DDR5 becomes actually useful, the standard gaming PC will be another generation ahead in terms of technology.  It will likely feature more RAM of a more advanced variety.

 

The PS4's GPU seems to be the equivalent of a mid-range gaming PC's, and the CPU is an open question.  I know that playing Planetside 2 right now, my computer is CPU-limited in terms of framerate, as opposed to GPU-limited.  In large battles my framerate dips to 35-40 FPS, which is borderline unacceptable for me.  That is down to the large number of people in a battle, sometimes 500-600 people fighting in a single area.  I have a 3770K (4 cores) @ 4.6 GHz.  Will 8 cores @ 1.6 GHz really work?  I don't know.
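
As a purely illustrative bit of arithmetic on that question (it ignores IPC and how well a game actually threads, both of which matter enormously):

```python
# Naive aggregate-clock comparison (ignores IPC, architecture and how
# well a game scales across cores, all of which matter a lot).

pc_cores, pc_ghz = 4, 4.6     # the i7-3770K described above (HT ignored)
ps4_cores, ps4_ghz = 8, 1.6   # the PS4 figure being debated in this thread

print(f"PC:  {pc_cores * pc_ghz:.1f} GHz-cores")    # 18.4
print(f"PS4: {ps4_cores * ps4_ghz:.1f} GHz-cores")  # 12.8

# Even granting perfect 8-way scaling the aggregate is lower, and for a
# poorly threaded game the per-core gap (4.6 vs 1.6 GHz) dominates.
```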

 

What I'm pleased about with the new consoles is that they are very similar to PCs now; the shared architecture means that dodgy PC ports should be a thing of the past.  I do, however, think that the latest console specs, whilst providing a big initial boost over the previous generation, will ultimately leave the consoles further behind the PC than the current generation is.  The last generation of consoles was far closer to a top-end PC at release than this generation will be.

 

xboxiphoneps3

You clearly don't know what you are talking about... the PS4 CPU cores will be clocked at 2.0 GHz each, AND developers can use all of that RAM right away; it won't take "years" for them to finally take advantage of it... they can take advantage of all the RAM right now.

 

You didn't understand a single thing I wrote, did you?  I didn't say they COULDN'T use that 8GB of RAM immediately; sure they could.  My point is that they WON'T be using it for years.  That's because 1080p games with all the graphical add-ons available right now use at most 3GB of RAM.  By the time they use 8GB, the equivalent PC alternative will be a generation ahead.

 

 

Also, LOL at the 2.0 GHz CPU: still less than half the clock speed of a PC I built over a year before the PS4's release.  Will the 8 cores, used properly, make up for it?  That remains to be seen; there's a big question mark over it.


#5 iamrob7
Member since 2007 • 2138 Posts

Most PC-exclusive games have bad graphics because the average PC is a lot weaker than the current-gen consoles, let alone next-gen, so comparing PCs to consoles is like comparing animals to humans. I think a high-end gaming PC can do a lot more than Crysis 3. CryEngine 1, 2 and 3 were made with consoles in mind.

Alienware_fan

 

Wat


#6 iamrob7
Member since 2007 • 2138 Posts

Realistically, PC gaming does cost more, and it certainly did when the older console generation came out.  The gap this time around is likely to be a lot closer, though.  Also, when you factor in the price of games etc., PC gaming's cost comes down quite a bit over time.

 

All in all though, you pay more for the better gaming platform and the better gaming experience.  Just not as much as is sometimes alleged.  Seems fair to me.


#7 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"] 

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as "DDR5" has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.

Silenthps

It's not just semantics, GDDR RAM's architecture is vastly different from DDR SDRAM

 

That has absolutely no bearing on anything I've said.  I'm not disputing the differences between DDR and GDDR RAM in any way, shape or form.  In fact I've made them clear repeatedly in this thread.

 

It's like you don't understand anything I said in that post at all, and I don't have the inclination to repeat myself.

 


#8 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="way2funny"]

No, it's GDDR5 that's being used system-wide, NOT DDR5. The G is NOT there to denote whether it's being used system-wide or not. The G is there to denote that it is functionally different from DDR3. Sony just decided to use GDDR5 as system-wide memory. It's still the same memory; now the CPU just has direct access to it. And FYI, DDR3 is a lot better than GDDR5 for CPU workloads. There's a reason we use GDDR5 in graphics and DDR3 with everything else.

Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware.

way2funny

 

Are you not reading what I just wrote in the last two comments?  That's EXACTLY what I am saying.  Two comments ago I specifically said GDDR5 is "DDR5" because DDR5 doesn't exist.  What exactly am I saying that is not getting through to you?  Please explain to me which part of my previous comments you didn't manage to take in.  Was it the part where I said DDR5 doesn't exist and the only thing in the PS4 is GDDR5?  How did that not sink in?

 

I'll say it one more time: I put DDR5 in the title to highlight that I was aware it is being used as system-wide memory, because that's how it has been reported EVERYWHERE.  Dealing with a single person who is obsessing over meaningless semantics is a lot simpler than having to answer 20 responses from consolites who believe I am ignoring the fact that it is being used system-wide and not just on the GPU.  Although you are pushing it now.  Labels are used to express meaning, and the reality is that far more people interpret it the way I've used it, whether it's correct or not, as that's how it has been reported.

 

edit: Let me just add this, as I re-read this line and it is ridiculous:

 

"Just because consumers like you don't understand that doesnt mean you get to change the name, and therefore the meaning, of hardware."

 

It has literally ZERO bearing on my point or any of the points I've made in this thread.  None whatsoever.  Please do explain to me how that affects the meaning of my point.

Right, I'm glad you are aware that it is being used as system memory, but it is NOT DDR5 and is not called DDR5. It's called GDDR5 no matter where and how you use it.

 

Meaning is about consensus and relevance.  That's how language is developed and used.  Take the word "gay": its original meaning is essentially happiness, yet that meaning is completely different now; in fact, after 50 years or so it now has an additional meaning in the dictionary.  Why?  Because the consensus used it differently.  People used it to express and label something: homosexuality.  This is how all language has evolved.

 

Years ago I did a physics degree; there are endless concepts and labels which are applied and used completely differently in normal conversation so as to express an appropriate meaning in a given discussion, particularly with someone who lacks the technical background.  That's how language develops and words acquire different meanings.  It is how language is supposed to be used: to express your meaning in the best way possible.

Your insistence on me using GDDR5 instead of DDR5 in the title reminds me of some academics who couldn't stand to see a scientific term misused from its written definition, even if the misuse actually produced a greater understanding of the underlying concept.  I'm not suggesting that they were wrong or that they didn't have a valid point, but for me it's entirely dependent on circumstance.  Only when the misuse of a concept/word actually affects the meaning behind the point you are getting across is it worth arguing or getting upset about.  That's not the case here, and hence I don't really understand your motivation.  You understood what my point was, presumably?  You also understood that using DDR5 as opposed to GDDR5 misled nobody in any meaningful way?  You understand that GDDR5 has not been used as system memory before?  So it is not unreasonable to use a term to describe GDDR5 being used as system memory, seeing as "DDR5" has been widely reported and will be understood by most people.  With popular associations in mind, why would DDR5 not be a reasonable way to express the meaning of GDDR5 used as system-wide memory?  It seems to me like a good way to express it.  DDR5 doesn't exist as a different entity, so there can be no confusion.

 

I understood why the first guy picked on it: he found the post upsetting and had no other viable response but to pick at petty semantics.  Your motivation seems to be different, though; it's the actual semantics that you are caught up on.  Perhaps because I responded to your initial post about something else, highlighting its irrelevance.  Maybe because you had no response to that, you decided to pick on the semantics, I'm not sure.  People such as yourself will always puzzle me, but each to their own, eh.


#9 iamrob7
Member since 2007 • 2138 Posts

[QUOTE="iamrob7"]

[QUOTE="way2funny"]

Right, it's GDDR5; there's no such thing as DDR5. They are NOT calling it DDR5 because it's GDDR5, so they are calling it GDDR5.

Like its predecessor, GDDR4, GDDR5 is based on DDR3 SDRAM memory.

The G doesn't just mean it's on the graphics card; it means it's functionally different.

[image: PS4Specs.png]

way2funny

 

The lack of a G is to denote whether the RAM is being used system-wide or not.  Calling it DDR5 makes it clear that I understand it is being used system-wide, as opposed to labelling it GDDR5, and saves on responses from people saying it is not just GPU memory.  The reason it has been reported as DDR5 everywhere under the sun is to highlight the fact that it will be used as system-wide memory, as opposed to just on the GPU.  Effectively it is a new form of system memory, as GDDR5 has not been used for this purpose before.  So the simplest solution for me is to label it DDR5 in the title, as that's how it has been reported absolutely everywhere.

No, it's GDDR5 that's being used system-wide, NOT DDR5. The G is NOT there to denote whether it's being used system-wide or not. The G is there to denote that it is functionally different from DDR3. Sony just decided to use GDDR5 as system-wide memory. It's still the same memory; now the CPU just has direct access to it. And FYI, DDR3 is a lot better than GDDR5 for CPU workloads. There's a reason we use GDDR5 in graphics and DDR3 with everything else.

Just because consumers like you don't understand that doesn't mean you get to change the name, and therefore the meaning, of hardware.

 

Are you not reading what I just wrote in the last two comments?  That's EXACTLY what I am saying.  Two comments ago I specifically said GDDR5 is "DDR5" because DDR5 doesn't exist.  What exactly am I saying that is not getting through to you?  Please explain to me which part of my previous comments you didn't manage to take in.  Was it the part where I said DDR5 doesn't exist and the only thing in the PS4 is GDDR5?  How did that not sink in?

 

I'll say it one more time: I put DDR5 in the title to highlight that I was aware it is being used as system-wide memory, because that's how it has been reported EVERYWHERE.  Dealing with a single person who is obsessing over meaningless semantics is a lot simpler than having to answer 20 responses from consolites who believe I am ignoring the fact that it is being used system-wide and not just on the GPU.  Although you are pushing it now.  Labels are used to express meaning, and the reality is that far more people interpret it the way I've used it, whether it's correct or not, as that's how it has been reported.

 

edit: Let me just add this, as I re-read this line and it is ridiculous:

 

"Just because consumers like you don't understand that doesnt mean you get to change the name, and therefore the meaning, of hardware."

 

It has literally ZERO bearing on my point or any of the points I've made in this thread.  None whatsoever.  Please do explain to me how that affects the meaning of my point.
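
For rough context on the bandwidth side of way2funny's GDDR5-versus-DDR3 point, here is a small Python sketch; the configurations are assumed (a generic dual-channel DDR3-1600 setup and the widely reported 256-bit PS4 memory bus), not taken from any official spec sheet:

```python
# Rough peak-bandwidth arithmetic behind the GDDR5-vs-DDR3 trade-off
# (illustrative configurations; exact timings vary by part).

# DDR3-1600, dual channel: 1600 MT/s x 8 bytes/transfer x 2 channels
ddr3_gbs = 1600e6 * 8 * 2 / 1e9        # ~25.6 GB/s

# GDDR5 at 5500 MT/s on a 256-bit bus (the widely reported PS4 setup)
gddr5_gbs = 5500e6 * (256 // 8) / 1e9  # ~176 GB/s

print(f"DDR3  ~{ddr3_gbs:.0f} GB/s")
print(f"GDDR5 ~{gddr5_gbs:.0f} GB/s")

# The flip side, not shown here: GDDR5 trades looser timings (higher
# access latency) for that bandwidth, which is why it suits GPUs better
# than latency-sensitive CPU workloads.
```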


#10 iamrob7
Member since 2007 • 2138 Posts

I just wanted to point out that graphics cards can only tell you how much RAM has been allocated, NOT how much is actually used.

 

It's possible for Crysis 3 to allocate over 2GB for various reasons but never use that amount.  In fact, Crysis 3 maxed out runs fine on my GTX 680s in SLI at 2560x1440, so it obviously doesn't need over 2GB.

Kinthalis

 

Good point, although effectively, if it has been allocated then it is still "using" it.
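
As an aside on how such numbers get read in practice, here is a hypothetical sketch using NVIDIA's NVML Python bindings (nvidia-ml-py); note that the figure NVML reports as "used" is really allocated memory, which is exactly the distinction Kinthalis is making:

```python
# Hypothetical sketch: reading GPU memory counters via NVIDIA's NVML
# bindings (pip install nvidia-ml-py). NVML's "used" figure is really
# *allocated* memory; it cannot tell you how much of that a game
# actively touches.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"total:     {info.total / 1024**2:.0f} MB")
print(f"allocated: {info.used / 1024**2:.0f} MB")  # reported as 'used'

pynvml.nvmlShutdown()
```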