Evil Within needs 4 GB of VRAM to look shiny

#101  Edited By clyde46
Member since 2005 • 49061 Posts

You may want to hold off on those 970s, ladies and jellyspoons.

http://wccftech.com/nvidia-geforce-gtx-980-ti-gtx-titan-x-coming/

#102 wis3boi
Member since 2005 • 32507 Posts

@kingtito said:

...Am I supposed to run out and purchase another $600/700 780 Ti just to meet that requirement? Lame.

No, because adding cards in SLI/CrossFire doesn't double your VRAM :P
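
That mirroring behaviour is easy to sanity-check with a back-of-the-envelope calculation. The sketch below is purely illustrative (the card list and the alternate-frame-rendering assumption are mine, not from this thread): under AFR each GPU keeps its own full copy of the working set, so the usable VRAM is the smallest single card, not the sum.

```python
# Rough illustration: under SLI/CrossFire alternate-frame rendering (AFR),
# textures and render targets are duplicated on every GPU, so the pool you
# can actually address is limited by the smallest card, not the total.

def effective_vram_gb(cards_gb):
    """Usable VRAM for an AFR multi-GPU setup (assumes full duplication)."""
    return min(cards_gb)

if __name__ == "__main__":
    two_780tis = [3, 3]          # hypothetical pair of 3GB GTX 780 Tis
    print(sum(two_780tis), "GB installed,",
          effective_vram_gb(two_780tis), "GB usable")   # 6 GB installed, 3 GB usable
```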

#103 Wasdie  Moderator
Member since 2003 • 53622 Posts

@04dcarraher said:

i7s may be 4 years old, but an i7 from 4 years ago is slower than an i5 from 2-3 years ago. Add to that the fact that i7s tend to be $300+.

A GTX 670 performance-wise outclasses the console GPUs.

No, the X1 and PS4 only have 5GB or less of memory to use. And the system/games store data the same way PCs do, so you are looking at 2-3GB for game assets and 2-3GB for VRAM in typical usage.

These consoles are still the bottleneck for games because of the lack of processing power... but are still a massive leap over the 360/PS3.

But the problem here is the id Tech 5 engine; that said, as we've seen, plenty of games this year have had overestimated requirements.

Wait, 2-3GB for game assets and 2-3GB of VRAM in a console? That makes no sense.

Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear, devs actually have to use more than one fucking core; it's 2014 and the fact that most PC games still push the majority of processing to one core is pathetic), no RAM bottleneck, and a GPU bottleneck at this point means nothing: you can just apply downscaling algorithms to assets, reduce the LoDs, reduce the quality of the shadows and other rendering assets, knock down the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.

The PS4/Xbox One are not bottlenecks right now.

As you just said, an i7 from 4 years ago is slower than an i5 from 2-3 years ago; actually, it's slower than last year's i5. Since they didn't specify a model, we can assume the recommended i7 is from the Nehalem generation, i.e. the i7 920. I don't see how that is a high requirement for a recommended spec at all.
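
To make the "just downscale the assets" argument concrete, here is a rough sketch with invented numbers (texture count, format and sizes are illustrative only): dropping the top mip level of every texture roughly quarters the texture budget, which is exactly the kind of lever a console version can pull without touching gameplay.

```python
# Illustrative only: a square texture with a full mip chain costs roughly
# 4/3 of its base level. Dropping the top mip level (e.g. 4096 -> 2048)
# therefore cuts texture memory to about a quarter.

def texture_mb(size_px, bytes_per_px=0.5, mips=True):
    """Approximate memory for one square texture (0.5 B/px ~ BC1/DXT1)."""
    base = size_px * size_px * bytes_per_px / (1024 * 1024)
    return base * 4 / 3 if mips else base

ultra = 100 * texture_mb(4096)   # 100 hypothetical 4K textures
high  = 100 * texture_mb(2048)   # same set with the top mip dropped
print(f"ultra ~{ultra:.0f} MB, high ~{high:.0f} MB")   # ~1067 MB vs ~267 MB
```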

#104 monstersfa
Member since 2014 • 398 Posts

@clyde46 said:

@monstersfa said:

@clyde46 said:
@monstersfa said:

Res settings weren't shown in the video.

They were in the description.....

Anything could be in the description.

Are you being difficult on purpose or are you normally this stupid?

How am I supposed to know if he's being honest? If everyone went around being honest then everyone would have tons of enemies.

#105 monstersfa
Member since 2014 • 398 Posts

Speaking of VRAM, here's an excerpt from IGN's Mordor review. Sounds a little crazy:

On the PC side Mordor also compares to the Batman games, in that it’s of good quality. There are even some enhanced graphics settings, including an ultra-high texture setting that requires a full 6GB of video memory. My only issue with it is some awkward menu controls, but most of those are customizable and those that aren’t aren’t too inconvenient to get used to.

#106  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@Wasdie said:

Wait 2-3 gb for game assets and 2-3gb of vRAM in a console? That makes no sense.

Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear devs actually have to use more than one fucking core, it's 2014 and the fact most PC game still push the majority of processing to one core is pathetic), no RAM bottleneck, and GPU bottleneck at this point means nothing as you can just apply downscaling algorithms to assets, reduce the LoDs, reduce the quality of the shadows and other rendering assets, knock out the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.

The PS4/Xbox One are not bottlenecks right now.

It makes perfect sense... the game "system" (excluding the OS and its features) allocates 2-3GB for game data and then 2-3GB for VRAM. What's so hard to understand?

Are you serious? The Jaguar architecture, let alone the 1.6GHz clock rate, is somewhat of a bottleneck now and will be even more so later on, and then only 6 cores are available, which actually process less data than a 5-year-old mid-range quad core...

Yes, the PS4 and X1 are bottlenecks because of their CPU and GPU limits, just like the 360/PS3 were from 2007 onward. These consoles stepped out into the world 2-3 years behind the hardware curve, and it shows, with sub-1080p resolutions and frame rates that aren't stable even with the resources available.
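
A minimal sketch of the unified-pool arithmetic being argued about here (the OS reserve and the split ratios are the thread's own figures, not official numbers): with ~8GB total and ~3GB reserved, whatever a game grabs for CPU-side data comes directly out of what is left for VRAM.

```python
# Shared-memory budget sketch using the figures thrown around in this thread.
TOTAL_GB = 8.0
OS_RESERVE_GB = 3.0          # thread claims 3.0 (PS4) / 3.5 (Xbone); not official

def split_budget(game_data_gb, os_reserve_gb=OS_RESERVE_GB, total_gb=TOTAL_GB):
    """Return (game_data, vram_left) for a unified-memory console."""
    available = total_gb - os_reserve_gb
    if game_data_gb > available:
        raise ValueError("game data alone exceeds the available pool")
    return game_data_gb, available - game_data_gb

for game_data in (2.0, 2.5, 3.0):
    data, vram = split_budget(game_data)
    print(f"{data:.1f} GB game data -> {vram:.1f} GB left for VRAM")
```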

#107 mikhail
Member since 2003 • 2697 Posts

@04dcarraher said:

@Wasdie said:

Wait 2-3 gb for game assets and 2-3gb of vRAM in a console? That makes no sense.

Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear devs actually have to use more than one fucking core, it's 2014 and the fact most PC game still push the majority of processing to one core is pathetic), no RAM bottleneck, and GPU bottleneck at this point means nothing as you can just apply downscaling algorithms to assets, reduce the LoDs, reduce the quality of the shadows and other rendering assets, knock out the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.

The PS4/Xbox One are not bottlenecks right now.

It makes perfect sense... the game "system" (excluding the OS and its features) allocates 2-3GB for game data and then 2-3GB for VRAM. What's so hard to understand?

Are you serious? The Jaguar architecture, let alone the 1.6GHz clock rate, is somewhat of a bottleneck now and will be even more so later on, and then only 6 cores are available, which actually process less data than a 5-year-old mid-range quad core...

Yes, the PS4 and X1 are bottlenecks because of their CPU and GPU limits, just like the 360/PS3 were from 2007 onward. These consoles stepped out into the world 2-3 years behind the hardware curve, and it shows, with sub-1080p resolutions and frame rates that aren't stable even with the resources available.

This is absolutely right on - you cannot directly compare the low-power, laptop-grade hardware of the current consoles to full-power desktop CPUs and discrete graphics cards. Well, you could, but you would sound dumb and uninformed. All that extra available VRAM doesn't mean a damn thing if the rest of the hardware isn't powerful enough to process the data fast enough. This is why we see things like the "definitive" edition of Sleeping Dogs running at 30fps on the new consoles when that game could be maxed out with the same textures and better effects at 60fps on mid-range PC GPUs from 2012.

#108 ShadowDeathX
Member since 2006 • 11698 Posts

Anyone know if this game will be running on OpenGL? I'm going to assume it is.

#109 Wasdie  Moderator
Member since 2003 • 53622 Posts

@04dcarraher said:

@Wasdie said:

Wait 2-3 gb for game assets and 2-3gb of vRAM in a console? That makes no sense.

Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear devs actually have to use more than one fucking core, it's 2014 and the fact most PC game still push the majority of processing to one core is pathetic), no RAM bottleneck, and GPU bottleneck at this point means nothing as you can just apply downscaling algorithms to assets, reduce the LoDs, reduce the quality of the shadows and other rendering assets, knock out the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.

The PS4/Xbox One are not bottlenecks right now.

It makes perfect sense... the game "system" (excluding the OS and its features) allocates 2-3GB for game data and then 2-3GB for VRAM. What's so hard to understand?

Are you serious? The Jaguar architecture, let alone the 1.6GHz clock rate, is somewhat of a bottleneck now and will be even more so later on, and then only 6 cores are available, which actually process less data than a 5-year-old mid-range quad core...

Yes, the PS4 and X1 are bottlenecks because of their CPU and GPU limits, just like the 360/PS3 were from 2007 onward. These consoles stepped out into the world 2-3 years behind the hardware curve, and it shows, with sub-1080p resolutions and frame rates that aren't stable even with the resources available.

You vastly overestimate how much CPU power is needed for games. It's such a vast overestimate that I question how much you really know about game programming and what it takes to make things run. I have a 3770K. If a game uses 15% of my CPU, I would be impressed. Most sit at 25% of a single core. We're talking about fractions of my CPU being used.

The lower framerates and resolutions are a result of the GPU, as the PS4/Xbox One's GPUs are significantly weaker than what PCs have. However, graphics don't limit games; you can downscale graphics easily. What limits games more is RAM and CPU bottlenecks, neither of which the PS4/Xbox One has.
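
For readers puzzled by the percentages being traded here: on a hyper-threaded quad core like a 3770K, task managers usually report usage as a share of all logical cores, so a game hammering one core barely registers. A quick sketch of that arithmetic (the thread count is an assumption about a 3770K-class chip):

```python
# How "25% of a single core" looks as total CPU usage on a 4-core/8-thread CPU.
LOGICAL_CORES = 8   # e.g. an i7-3770K with Hyper-Threading

def total_cpu_percent(per_core_loads):
    """Average load across all logical cores, as a task manager reports it."""
    loads = list(per_core_loads) + [0.0] * (LOGICAL_CORES - len(per_core_loads))
    return 100.0 * sum(loads) / LOGICAL_CORES

print(total_cpu_percent([0.25]))        # one core at 25%  -> ~3% total
print(total_cpu_percent([1.0]))         # one core maxed   -> 12.5% total
print(total_cpu_percent([1.0] * 4))     # four cores maxed -> 50% total
```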

#110  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts

@mikhail said:

@04dcarraher said:

@Wasdie said:

Wait 2-3 gb for game assets and 2-3gb of vRAM in a console? That makes no sense.

Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear devs actually have to use more than one fucking core, it's 2014 and the fact most PC game still push the majority of processing to one core is pathetic), no RAM bottleneck, and GPU bottleneck at this point means nothing as you can just apply downscaling algorithms to assets, reduce the LoDs, reduce the quality of the shadows and other rendering assets, knock out the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.

The PS4/Xbox One are not bottlenecks right now.

It makes perfect sense... the game "system" (excluding the OS and its features) allocates 2-3GB for game data and then 2-3GB for VRAM. What's so hard to understand?

Are you serious? The Jaguar architecture, let alone the 1.6GHz clock rate, is somewhat of a bottleneck now and will be even more so later on, and then only 6 cores are available, which actually process less data than a 5-year-old mid-range quad core...

Yes, the PS4 and X1 are bottlenecks because of their CPU and GPU limits, just like the 360/PS3 were from 2007 onward. These consoles stepped out into the world 2-3 years behind the hardware curve, and it shows, with sub-1080p resolutions and frame rates that aren't stable even with the resources available.

This is absolutely right on - you cannot directly compare the low-power laptop grade hardware of the current consoles to full power desktop CPUs and discrete graphics cards. Well, you could, but you would sound dumb and uninformed. All that extra available VRAM doesn't mean a damn thing if the rest of the hardware isn't powerful enough to process the data fast enough. This is why we see things like the "definitive" edition of Sleeping Dogs running at 30fps on the new consoles when that game could be maxed out with the same textures and better effects at 60 fps on mid-range PC GPUs from 2012.

Wait, dumb and uninformed? Do you realize that even the most processor-intensive games barely crack 25% total CPU usage on any modern CPU? Obviously not. Also, 30fps and resolution have little to do with CPU power, and as I said before, you can downscale assets and rendering; those things don't limit game design. Fewer polygons on a model and less detail in the distance isn't the same as being physically unable to implement a mechanic because you ran out of RAM.

I probably have a more powerful PC than 99% of the people here and game almost exclusively on my PC, yet I don't have my head so far up my ass that I throw out general knowledge just to bash the consoles and make myself feel better for owning that PC.

There are also two other trends you'll see with PC games. Trend 1: games will require a minimum of 4-core processors. Why? The PS4/Xbox One's CPUs are powerful enough for games, but their single-core performance is not all that great. This forces developers to utilize multiple cores. Both consoles went for a low-power, low-heat solution by using more weak cores rather than fewer powerful cores (this saves on heat and thus allows for a smaller form factor and longer lifespan). This is why we see i5s and i7s as the required and recommended specs. Now, this won't always hold true, especially for more mature engines that were around last generation (CryEngine, Unreal 3, Frostbite), but expect newer engines to almost always require a minimum of 4 cores.
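
A toy model of why many weak cores push engines to go wide (the clock ratios and the per-frame workload below are invented purely for illustration, not measurements): the same per-frame CPU work that fits comfortably in 16.6ms on one fast desktop core blows the budget on a single 1.6GHz Jaguar core, but fits again once it is spread over several cores.

```python
# Toy frame-budget model: work is expressed in "desktop-core milliseconds".
FRAME_BUDGET_MS = 16.6            # 60 fps target

def frame_time_ms(work_ms, cores, relative_core_speed, parallel_fraction=0.9):
    """Amdahl-style estimate: part of the work is serial, the rest scales."""
    serial = work_ms * (1 - parallel_fraction) / relative_core_speed
    parallel = work_ms * parallel_fraction / (relative_core_speed * cores)
    return serial + parallel

work = 12.0                        # hypothetical per-frame CPU work
print(frame_time_ms(work, cores=1, relative_core_speed=1.0))   # fast desktop core: ~12 ms
print(frame_time_ms(work, cores=1, relative_core_speed=0.35))  # one Jaguar-class core: ~34 ms
print(frame_time_ms(work, cores=6, relative_core_speed=0.35))  # six Jaguar-class cores: ~8.6 ms
```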

There is another trend you'll see. Until DX12 arrives, DX9 and DX11 have a major CPU bottleneck that really hurts draw calls as well as a few other graphics-processing tasks. Consoles can issue draw calls way faster than a comparable PC because their low-level APIs let code basically call the hardware directly instead of being interpreted through a software API running on the CPU. This is why we see specs recommend CPUs in the i7 range yet only require i5s. Once DX12 hits and games start adopting it, the larger CPU bottlenecks should go away on the PC, allowing even weaker CPUs to be used. It's a win-win.
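
The draw-call argument is just arithmetic over the per-frame CPU budget. A hedged sketch (the per-call costs and the share of the frame spent submitting draws are rough ballpark assumptions, not benchmarks) of why lowering the CPU cost per call lets far more draw calls fit in a 60fps frame:

```python
# How many draw calls fit in one frame, given a CPU-side cost per call.
FRAME_BUDGET_MS = 16.6      # 60 fps
SUBMIT_SHARE = 0.25         # assume a quarter of the frame is spent submitting draws

def max_draw_calls(cost_per_call_us):
    budget_us = FRAME_BUDGET_MS * 1000 * SUBMIT_SHARE
    return int(budget_us // cost_per_call_us)

# Purely illustrative per-call costs (driver + API overhead):
print("high-overhead API:", max_draw_calls(2.0))    # ~2 us/call    -> ~2075 calls
print("low-overhead API: ", max_draw_calls(0.25))   # ~0.25 us/call -> ~16600 calls
```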

#111 Wasdie  Moderator
Member since 2003 • 53622 Posts

@ShadowDeathX said:

Anyone know if this game will be running on OpenGL? I'm going to assume it is.

Id Tech 5 on the PC uses OpenGL.

#112 cfisher2833
Member since 2011 • 2150 Posts

And yet the textures in the game still look like complete ass. The final nail in id Tech 5's coffin? I think so.

#113 monstersfa
Member since 2014 • 398 Posts

@Wasdie said:

@mikhail said:

@04dcarraher said:

@Wasdie said:

Wait 2-3 gb for game assets and 2-3gb of vRAM in a console? That makes no sense.

Consoles are not a bottleneck for games because of lack of processing power. What bottleneck? There is no CPU bottleneck (oh dear devs actually have to use more than one fucking core, it's 2014 and the fact most PC game still push the majority of processing to one core is pathetic), no RAM bottleneck, and GPU bottleneck at this point means nothing as you can just apply downscaling algorithms to assets, reduce the LoDs, reduce the quality of the shadows and other rendering assets, knock out the number of dynamic lights (which are rarely important for gameplay), and fit even a game as ridiculous as Star Citizen on a PS4/Xbox One if you're willing to sacrifice the fidelity. It's a non-issue.

The PS4/Xbox One are not bottlenecks right now.

It makes perfect sense... the game "system" (excluding the OS and its features) allocates 2-3GB for game data and then 2-3GB for VRAM. What's so hard to understand?

Are you serious? The Jaguar architecture, let alone the 1.6GHz clock rate, is somewhat of a bottleneck now and will be even more so later on, and then only 6 cores are available, which actually process less data than a 5-year-old mid-range quad core...

Yes, the PS4 and X1 are bottlenecks because of their CPU and GPU limits, just like the 360/PS3 were from 2007 onward. These consoles stepped out into the world 2-3 years behind the hardware curve, and it shows, with sub-1080p resolutions and frame rates that aren't stable even with the resources available.

This is absolutely right on - you cannot directly compare the low-power laptop grade hardware of the current consoles to full power desktop CPUs and discrete graphics cards. Well, you could, but you would sound dumb and uninformed. All that extra available VRAM doesn't mean a damn thing if the rest of the hardware isn't powerful enough to process the data fast enough. This is why we see things like the "definitive" edition of Sleeping Dogs running at 30fps on the new consoles when that game could be maxed out with the same textures and better effects at 60 fps on mid-range PC GPUs from 2012.

Wait, dumb and uninformed? Do you realize that even the most processor-intensive games barely crack 25% total CPU usage on any modern CPU? Obviously not. Also, 30fps and resolution have little to do with CPU power, and as I said before, you can downscale assets and rendering; those things don't limit game design.

I probably have a more powerful PC than 99% of the people here and game almost exclusively on my PC, yet I don't have my head so far up my ass that I throw out general knowledge just to bash the consoles and make myself feel better for owning that PC. Get some perspective.

Damn! Well said!

#114 scatteh316
Member since 2004 • 10273 Posts

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

#115 scatteh316
Member since 2004 • 10273 Posts

@kingtito said:

All of the specs are fine except for one... 4GB of VRAM. I bought the 780 Ti just a few months ago and THAT doesn't even have 4GB of VRAM. Am I supposed to run out and purchase another $600/700 780 Ti just to meet that requirement? Lame.

No, because adding a second card doesn't double your VRAM.

#116 RyviusARC
Member since 2011 • 5708 Posts

Shadow of Mordor says that to play with Ultra textures at 1080p you need 6GB of VRAM... I'm calling BS on that.

#117 Wasdie  Moderator
Member since 2003 • 53622 Posts

@scatteh316 said:

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

OK, so I was wrong there, but even 4.5GB of RAM is enough. Even the most graphically intensive games on the PC don't use much over 4GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.
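
A minimal sketch of the streaming idea being described here (the asset names, sizes, and LRU eviction policy are hypothetical illustrations, not how any particular engine does it): keep only the assets currently in use resident, evict the least-recently-used ones when the budget is exceeded, and pull the rest in from disk on demand.

```python
from collections import OrderedDict

class AssetStreamer:
    """Tiny LRU-style residency manager: evicts old assets to stay in budget."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()          # name -> size_mb

    def request(self, name, size_mb):
        if name in self.resident:              # already resident: mark as recently used
            self.resident.move_to_end(name)
            return
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            evicted, _ = self.resident.popitem(last=False)   # evict least recently used
            print("evict", evicted)
        print("stream in", name)                # pretend we read it from disk here
        self.resident[name] = size_mb

streamer = AssetStreamer(budget_mb=3000)        # hypothetical 3 GB texture budget
for asset, size in [("city_block_a", 1200), ("city_block_b", 1200),
                    ("city_block_c", 1200), ("city_block_a", 1200)]:
    streamer.request(asset, size)
```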

#118  Edited By RyviusARC
Member since 2011 • 5708 Posts

@Wasdie said:

@scatteh316 said:

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

OK, so I was wrong there, but even 4.5GB of RAM is enough. Even the most graphically intensive games on the PC don't use much over 4GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.

Keep in mind the consoles have to share that RAM between system memory and video memory. So most likely the consoles will be stuck with around 2-3GB of VRAM.

And they sometimes may be bottlenecked enough that they can't even handle such high textures.

#119 digitm64
Member since 2013 • 470 Posts

4GB of VRAM is crazy. The Vanishing of Ethan Carter looks better and uses less. Bethesda really has to ditch the id Tech engine.

#120 Wasdie  Moderator
Member since 2003 • 53622 Posts

@RyviusARC said:

@Wasdie said:

@scatteh316 said:

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

OK, so I was wrong there, but even 4.5GB of RAM is enough. Even the most graphically intensive games on the PC don't use much over 4GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.

Keep in mind the consoles have to share that RAM between system memory and video memory. So most likely the consoles will be stuck with around 2.5-3GB of VRAM.

And even then, they may sometimes be bottlenecked so that they can't even handle such high textures.

Just like CPU power, it doesn't take much RAM for the system side of things in games. Most memory for games goes to VRAM; graphics are by far the largest consumer of system resources. I would say the majority of RAM dedicated to the game is going to be used to store graphics data. As I said, they can get around those bottlenecks now with streaming tech. There is no reason to load assets you're not using into memory. Stream that stuff in on demand.

On the PC side of things, a 4-year-old quad-core processor is doing just fine. There are a few exceptions, but they are always a programming problem. Planetside 2 comes to mind. Even with an overclocked 3770K I get framerate dips because of that game's horrid CPU utilization. That's partly due to the nature of networking (lots of CPU time wasted literally waiting for network operations) but mostly due to how the game is programmed around a single core. They've made it a bit better, but the game does not really utilize multiple cores well at all.

#121  Edited By Jankarcop
Member since 2011 • 11058 Posts

@monstersfa said:

Damn! Well said!

You've been getting too obvious lately.

#122  Edited By RyviusARC
Member since 2011 • 5708 Posts

@Wasdie said:

@RyviusARC said:

@Wasdie said:

@scatteh316 said:

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

OK, so I was wrong there, but even 4.5GB of RAM is enough. Even the most graphically intensive games on the PC don't use much over 4GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.

Keep in mind the consoles have to share that RAM between system memory and video memory. So most likely the consoles will be stuck with around 2.5-3GB of VRAM.

And even then, they may sometimes be bottlenecked so that they can't even handle such high textures.

Just like CPU power, it doesn't take much RAM for the system side of things for games. Most memory for games is with the vRAM. Graphics are by far the largest consumer of system resource. I would say the majority of RAM dedicated to the game is going to be used to store graphic data. As I said, they can get around those bottlenecks now with streaming tech. There is no reason to load assets you're not using into memory. Stream that stuff in on demand.

On the PC side of things, a 4 year old quad-core processor is doing just fine. There are a few exceptions but they are always a problem with programming. Planetside 2 comes to mind. Even with an overclocked 3770k I get framerate dips because of the horrid CPU utilization of that game. That's partly due to the nature of networking (lots of CPU time wasted literally waiting for network operations) but mostly due to how the game is programmed around a single core. They've made it a bit better, but the game does not really utilize multiple cores well at all.

Then why did the PS4 and Xbox One not use the highest texture setting for Watch Dogs?

If they don't need to allocate much RAM to system memory, why couldn't they handle the 3GB of VRAM for textures?

#123 Mr_Huggles_dog
Member since 2014 • 7805 Posts

So much for the "Best versions of multiplatform games are on PC" thought process.

#124 monstersfa
Member since 2014 • 398 Posts

@Jankarcop said:

@monstersfa said:

Damn! Well said!

You've been getting too obvious lately.

Not sure what you're getting at here.

#125  Edited By Jankarcop
Member since 2011 • 11058 Posts

@mr_huggles_dog said:

So much for the "Best versions of multiplatform games are on PC" thought process.

1. Can you show me how you came to the conclusion that this game will not look and run better on high-end PCs? How does it differ from nearly all other multiplats which all look and run better on PC? Link? Citation?

2. Even if one game did finally end up better on consoles, you really think it ends a thought process concerning 99% of other games?

#126 Mr_Huggles_dog
Member since 2014 • 7805 Posts

@Jankarcop said:

@mr_huggles_dog said:

So much for the "Best versions of multiplatform games are on PC" thought process.

1. Can you show me how you came to the conclusion that this game will not look and run better on high-end PCs? How does it differ from nearly all other multiplats which all look and run better on PC? Link? Citation?

2. Even if one game did finally end up better on consoles, you really think it ends a thought process concerning 99% of other games?

That's your opinion. I don't see PS4 gamers constantly complaining about their games being crappy ports.

I, for one, find every game I've played on my PS4 to look great... run great... sound great... all that. While I spent $400 on a PS4 and get a good-looking game that runs great, you're going to need a $1000 PC to run this game.

An i7... a GPU with 4GB of VRAM? Yeah... and we both know that if that's the case, it's gonna need something beefier than a GTX 670 to really get good frames and look like it's supposed to.

#127  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@Wasdie said:

You vastly overestimate how much CPU power is needed for games. It's such a vast overestimate that I question how much you really know about game programming and what it takes to make things run. I have a 3770K. If a game uses 15% of my CPU, I would be impressed. Most sit at 25% of a single core. We're talking about fractions of my CPU being used.

The lower framerates and resolutions are a result of the GPU, as the PS4/Xbox One's GPUs are significantly weaker than what PCs have. However, graphics don't limit games; you can downscale graphics easily. What limits games more is RAM and CPU bottlenecks, neither of which the PS4/Xbox One has.

You are underestimating the lack of processing power. You're also wrong, because plenty of games do use more than 15% or 25% of your CPU power, namely the ones running more than a single thread at 80-100%. And plenty of prime examples of the CPU bottlenecking the GPU in games prove that fact. Take a Phenom II X4 in BF3 or 4, limit it to two cores, and watch the fps get cut in half.

Now: "According to developer DICE, BF4 already uses up to 95 percent of available CPU power on next-gen consoles."

"According to Frostbite technical director Johan Andersson, the game uses 90 to 95 percent of the available CPU power on the PS4 and Xbox One. You can check out an in-depth Q&A at AMD with him and other developers. While the next-gen consoles are clearly more powerful than the previous-generation hardware, Sony and Microsoft have decided to focus more on the GPU than the CPU. Both the PS4 and Xbox One have an 8-core CPU clocked at around 1.6GHz, which doesn't sound like a lot. Furthermore, developers only have access to 6 CPU cores; the last two are reserved for the OS. While it's no surprise that a game as complex as Battlefield 4 uses almost all of the available CPU power, we're surprised that developers are already almost hitting the limit."

Also, the fps issues in games I'm talking about are from the lack of CPU processing power; not all fps issues are from a lack of GPU power. Sudden changes in direction and lots of multiplayer action are prime examples of the CPU not being able to keep up with the data flow to the GPU. Like you said, graphics can be tweaked to fit the hardware, and yet we see instances of fps drops and unstable averages because of the CPU, not the GPU.
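
To put the "90-95% of available CPU" quote in perspective, here is a deliberately crude throughput comparison. The instructions-per-clock figures are invented placeholders purely to show the shape of the argument, not measured values:

```python
# Very rough relative CPU throughput: cores * clock * assumed IPC.
def relative_throughput(cores, clock_ghz, ipc):
    return cores * clock_ghz * ipc

console_game_cpu = relative_throughput(cores=6, clock_ghz=1.6, ipc=1.0)   # Jaguar cores available to games
desktop_quad     = relative_throughput(cores=4, clock_ghz=3.4, ipc=2.0)   # hypothetical desktop quad core

ratio = desktop_quad / console_game_cpu
print(f"console: {console_game_cpu:.1f}, desktop: {desktop_quad:.1f}, ratio ~{ratio:.1f}x")
# If a game already eats ~95% of the console CPU, the same workload would sit
# at roughly 95% / ratio on the desktop chip in this toy model (~34%).
print(f"same workload on desktop ~{95 / ratio:.0f}% CPU")
```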

#128  Edited By 04dcarraher
Member since 2004 • 23832 Posts
@RyviusARC said:

@Wasdie said:

@RyviusARC said:

@Wasdie said:

@scatteh316 said:

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

OK, so I was wrong there, but even 4.5GB of RAM is enough. Even the most graphically intensive games on the PC don't use much over 4GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.

Keep in mind the consoles have to share that RAM between system memory and video memory. So most likely the consoles will be stuck with around 2.5-3GB of VRAM.

And even then, they may sometimes be bottlenecked so that they can't even handle such high textures.

Just like CPU power, it doesn't take much RAM for the system side of things for games. Most memory for games is with the vRAM. Graphics are by far the largest consumer of system resource. I would say the majority of RAM dedicated to the game is going to be used to store graphic data. As I said, they can get around those bottlenecks now with streaming tech. There is no reason to load assets you're not using into memory. Stream that stuff in on demand.

On the PC side of things, a 4 year old quad-core processor is doing just fine. There are a few exceptions but they are always a problem with programming. Planetside 2 comes to mind. Even with an overclocked 3770k I get framerate dips because of the horrid CPU utilization of that game. That's partly due to the nature of networking (lots of CPU time wasted literally waiting for network operations) but mostly due to how the game is programmed around a single core. They've made it a bit better, but the game does not really utilize multiple cores well at all.

Then Why did the PS4 and Xbox One not use the highest texture setting for Watchdogs?

If they don't need to allocate much RAM to system memory why couldn't they handle the 3GB vRAM for textures?

lol, ISS used all 4.5GB on the PS4, and Killzone used 3GB for VRAM and 1.5GB for the other game assets. Open-world games use more memory to cache game data, hence the lack of the best textures in Watch Dogs. These new consoles store data just like PCs do... and they have a shared memory pool, having to split that usage within less than 5GB.

#129 Liquid_
Member since 2003 • 3832 Posts

Welp, I have an i7 but not 4GB of VRAM... kinda dumb.

#130 sukraj
Member since 2008 • 27859 Posts

@Flubbbs said:

that IDTech 5

me love dat tech

#131 KungfuKitten
Member since 2006 • 27389 Posts

Isn't this an indoor game?
The Witcher 3 at least has an excuse to require high specs.

#133  Edited By RyviusARC
Member since 2011 • 5708 Posts

@04dcarraher said:
@RyviusARC said:

@Wasdie said:

@RyviusARC said:

@Wasdie said:

@scatteh316 said:

@Wasdie: It's actually less than 6GB of usable RAM for the Xbone and PS4.

The PS4's OS uses 3GB and the Xbone's uses 3.5GB; that leaves the PS4 with 5GB and the Xbone with 4.5GB.

OK, so I was wrong there, but even 4.5GB of RAM is enough. Even the most graphically intensive games on the PC don't use much over 4GB of total RAM. Also, since the CPUs of the PS4/Xbox One aren't terrible and reading data from hard drives and disc drives is faster this generation, streaming technologies are going to be used more often. Stuff like tiled resources is going to be very useful in the coming years, even on PCs.

Keep in mind the consoles have to share that RAM between system memory and video memory. So most likely the consoles will be stuck with around 2.5-3GB of VRAM.

And even then, they may sometimes be bottlenecked so that they can't even handle such high textures.

Just like CPU power, it doesn't take much RAM for the system side of things for games. Most memory for games is with the vRAM. Graphics are by far the largest consumer of system resource. I would say the majority of RAM dedicated to the game is going to be used to store graphic data. As I said, they can get around those bottlenecks now with streaming tech. There is no reason to load assets you're not using into memory. Stream that stuff in on demand.

On the PC side of things, a 4 year old quad-core processor is doing just fine. There are a few exceptions but they are always a problem with programming. Planetside 2 comes to mind. Even with an overclocked 3770k I get framerate dips because of the horrid CPU utilization of that game. That's partly due to the nature of networking (lots of CPU time wasted literally waiting for network operations) but mostly due to how the game is programmed around a single core. They've made it a bit better, but the game does not really utilize multiple cores well at all.

Then Why did the PS4 and Xbox One not use the highest texture setting for Watchdogs?

If they don't need to allocate much RAM to system memory why couldn't they handle the 3GB vRAM for textures?

lol, ISS used all 4.5GB on the PS4, and Killzone used 3GB for VRAM and 1.5GB for the other game assets. Open-world games use more memory to cache game data, hence the lack of the best textures in Watch Dogs. These new consoles store data just like PCs do... and they have a shared memory pool, having to split that usage within less than 5GB.

I know that. I was just reminding Wasdie that a good chunk of the available RAM has to go towards system memory.

So most likely the consoles won't use the best textures in The Evil Within if the PC version really requires 4GB of VRAM.

Just like how Shadow of Mordor will probably use medium textures on consoles rather than high or ultra.

#135  Edited By tormentos
Member since 2003 • 33784 Posts

@RyviusARC said:

I know that. I was just reminding Wasdie that a good chunk of the available RAM has to go towards system memory.

So most likely the consoles won't use the best textures in Evil Within if the PC version really requires 4GB of vRAM.

Just like how Shadow of Mordor will probably use medium textures on consoles rather than high or ultra.

The PS4 uses 3GB for the system, 500MB that can go either way, and 4.5GB for video; that's more than double what most 7870s and 660 Tis have, and 1.5GB more than many 7950s.

This is one scenario I pictured a long time ago, and hermits like you refused to admit it: GPUs with 2GB of RAM will suddenly run into trouble when games start demanding more. Suddenly GPUs that used to run ultra at 1080p won't any more; memory will be a limitation. And before you say anything, memory can be just as big a hit to performance as a lack of power.

And all of a sudden, cards like the 660 Ti and R9 270, which ran most games with ultra-quality textures at 1080p, suddenly can't...

Medium requires 2GB, high 3GB or more, and ultra requires 6GB... lol

I called it ages ago...
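
Taking the tier numbers in this post at face value, the "suddenly can't" effect is just a budget check. A small sketch (the tier table uses the figures being thrown around in this thread, and the card list is an illustrative assumption, not an official spec):

```python
# Which texture tier fits on which card, using the VRAM figures from this thread.
TIER_VRAM_GB = {"medium": 2, "high": 3, "ultra": 6}

def best_tier(card_vram_gb):
    """Highest tier whose stated VRAM requirement fits on the card."""
    fitting = [t for t, need in TIER_VRAM_GB.items() if need <= card_vram_gb]
    return max(fitting, key=TIER_VRAM_GB.get) if fitting else "below medium"

for card, vram in [("660 Ti 2GB", 2), ("7950 3GB", 3),
                   ("780 Ti 3GB", 3), ("Titan 6GB", 6)]:
    print(f"{card}: {best_tier(vram)}")
```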

#136  Edited By RyviusARC
Member since 2011 • 5708 Posts

@tormentos said:

@RyviusARC said:

I know that. I was just reminding Wasdie that a good chunk of the available RAM has to go towards system memory.

So most likely the consoles won't use the best textures in Evil Within if the PC version really requires 4GB of vRAM.

Just like how Shadow of Mordor will probably use medium textures on consoles rather than high or ultra.

The PS4 uses 3GB for the system, 500MB that can go either way, and 4.5GB for video; that's more than double what most 7870s and 660 Tis have, and 1.5GB more than many 7950s.

This is one scenario I pictured a long time ago, and hermits like you refused to admit it: GPUs with 2GB of RAM will suddenly run into trouble when games start demanding more. Suddenly GPUs that used to run ultra at 1080p won't any more; memory will be a limitation. And before you say anything, memory can be just as big a hit to performance as a lack of power.

And all of a sudden, cards like the 660 Ti and R9 270, which ran most games with ultra-quality textures at 1080p, suddenly can't...

Medium requires 2GB, high 3GB or more, and ultra requires 6GB... lol

I called it ages ago...

Nah, the PS4 uses 3.5GB for the OS; the rest is split between system memory and video memory.

So the PS4 will most likely use medium textures if those settings are to be believed.

If the PS4 couldn't do ultra textures for Watch Dogs, then it probably won't handle high textures for Shadow of Mordor.

If it's anything like Titanfall, the textures will look like crap even at the highest setting and there will be almost no visible difference between high and ultra.

#137  Edited By miiiiv
Member since 2013 • 943 Posts
@tormentos said:

@RyviusARC said:

I know that. I was just reminding Wasdie that a good chunk of the available RAM has to go towards system memory.

So most likely the consoles won't use the best textures in Evil Within if the PC version really requires 4GB of vRAM.

Just like how Shadow of Mordor will probably use medium textures on consoles rather than high or ultra.

The PS4 uses 3GB for the system, 500MB that can go either way, and 4.5GB for video; that's more than double what most 7870s and 660 Tis have, and 1.5GB more than many 7950s.

This is one scenario I pictured a long time ago, and hermits like you refused to admit it: GPUs with 2GB of RAM will suddenly run into trouble when games start demanding more. Suddenly GPUs that used to run ultra at 1080p won't any more; memory will be a limitation. And before you say anything, memory can be just as big a hit to performance as a lack of power.

And all of a sudden, cards like the 660 Ti and R9 270, which ran most games with ultra-quality textures at 1080p, suddenly can't...

Medium requires 2GB, high 3GB or more, and ultra requires 6GB... lol

I called it ages ago...

Yes, 2GB of VRAM won't be enough to max newer games at 1080p, but so far it seems that a GTX 680, for example, is bottlenecked by its power rather than a lack of VRAM, and that card is much faster than the PS4's GPU. So despite having more available memory, the PS4 lacks the power to run upcoming games at the equivalent of max settings on PC; it can't even run some existing games at max settings. I'm not saying there will never be any cases where the PS4 could have an advantage over a PC with 2GB of VRAM.

#138  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

I see a few different numbers here regarding how much of the 8GB of GDDR5 is used on the PS4 for games. So I am just gonna drop this slide here.
http://www.officialplaystationmagazine.co.uk/2014/04/15/sucker-punch-explains-how-it-used-ps4s-8gb-ram-in-infamous-second-son/#null


#139  Edited By Dasein808
Member since 2008 • 839 Posts

@GoldenElementXL said:
[image]

2014...

That console "experience."

"Bu-bu-but, in 2005 The Xbox360 was > than Cray."

#140 bobbetybob
Member since 2005 • 19370 Posts

I wonder if it's like Ryse, where the VRAM is just to run the game at 4K, because 4GB is a hell of a lot for a game that looks like this and is pretty linear.
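
The 4K theory can be sanity-checked: render-target memory scales linearly with pixel count, so going from 1080p to 4K roughly quadruples it, although on its own that usually accounts for hundreds of megabytes rather than multiple gigabytes. A rough sketch (buffer counts and formats below are assumptions for illustration):

```python
# Approximate render-target memory at a given resolution.
def render_targets_mb(width, height, buffers=6, bytes_per_px=8):
    """buffers ~ G-buffer layers + HDR target; 8 B/px ~ one RGBA16F surface."""
    return width * height * buffers * bytes_per_px / (1024 * 1024)

print(f"1080p: ~{render_targets_mb(1920, 1080):.0f} MB")   # ~95 MB
print(f"4K:    ~{render_targets_mb(3840, 2160):.0f} MB")   # ~380 MB
```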

#141 Alcapello
Member since 2014 • 1396 Posts

No wonder the PC is having problems getting games like GTA5, Bloodborne, and other major titles. It's just too weak and unstable compared to the PS4.

Hey, at least the new RollerCoaster Tycoon is coming to PC!!! Only... the graphics are now cartoony LOL (which isn't bad, just saying) LOL

#142  Edited By tormentos
Member since 2003 • 33784 Posts

@RyviusARC said:

Nah the PS4 uses 3.5 for the OS. the rest is split between system memory and video memory.

So the PS4 will most likely use Medium textures if those settings are to be believed.

If the PS4 couldn't do ultra textures for WatchDogs then it probably won't handle high textures for Shadow of Mordor.

If it's anything like Titanfall the textures will look like crap even at the highest and there will me almost no visible difference between high and ultra.

No, it's not 3.5GB for the OS; that figure also includes system features, as already confirmed by the Infamous developer.

Oh, the problem for the PS4 is not RAM, it's power; the power needed wasn't there. On the 7870 there is more power, but RAM is a limiting factor. I called this ages ago and hermits denied it... hahah

I even posted that screen there showing how the 2GB 7850 outdid the 1GB version of the same card in Skyrim once you installed the HD textures; RAM was crippling the 1GB model versus the 2GB one on a card with the exact same power.

#143 parkurtommo
Member since 2009 • 28295 Posts

@kingtito said:

All of the specs are fine except for one... 4GB of VRAM. I bought the 780 Ti just a few months ago and THAT doesn't even have 4GB of VRAM. Am I supposed to run out and purchase another $600/700 780 Ti just to meet that requirement? Lame.

Shoulda waited for the 970, bruh! Considering how long ago the 700 series launched, we could easily expect the next generation in Q3 of this year, and it was totally worth the wait. I was going to upgrade to a 760 this year, but when I heard the rumors of the 800 series I literally waited months, and now I will be getting a card that is like 2 times better for a similar price (the 970, with 4GB too).

#144 LustForSoul
Member since 2011 • 6404 Posts

That shit resolution is enough proof to show they don't know what the hell they're doing.

#145  Edited By DEadliNE-Zero0
Member since 2014 • 6607 Posts

@alcapello said:

No wonder PC having problems getting games like GTA5, Bloodborne, and other major titles. It just too weak and unstable compare to the PS4.

Hey at least the new roller coaster tycoon coming to PC!!! Only.....the graphic are now cartoon LOL (which isn't bad, just saying) LOL

What the ****?

Anyway, like I said in the other thread, there's no way in hell SoM needs a 3GB card to run at high settings. Not unless the PS4 version is running at 720p, or at medium settings at best.

#146 Alcapello
Member since 2014 • 1396 Posts

@deadline-zero0 said:

@alcapello said:

No wonder PC having problems getting games like GTA5, Bloodborne, and other major titles. It just too weak and unstable compare to the PS4.

Hey at least the new roller coaster tycoon coming to PC!!! Only.....the graphic are now cartoon LOL (which isn't bad, just saying) LOL

What the ****?

Anyway, like i said in teh other thread, there's no way in hell SoM needs a 3GB card tor un at high settings. Not unless the PS4 version's running at 720p, or medium settngs at best

Face it, PC gaming sucks balls this gen compared to the PS4.

It has no games; all the big games are dangling in front of the watering mouths of PC gamers.

Sims, Coaster tycoon, WoW, sims, tycoon, WOW.

#147 lostrib
Member since 2009 • 49999 Posts

@alcapello said:

@deadline-zero0 said:

@alcapello said:

No wonder PC having problems getting games like GTA5, Bloodborne, and other major titles. It just too weak and unstable compare to the PS4.

Hey at least the new roller coaster tycoon coming to PC!!! Only.....the graphic are now cartoon LOL (which isn't bad, just saying) LOL

What the ****?

Anyway, like i said in teh other thread, there's no way in hell SoM needs a 3GB card tor un at high settings. Not unless the PS4 version's running at 720p, or medium settngs at best

Face it, PC gaming suck balls this gen compare to the PS4.

It has no game, all the big game are danging in front of the watery mouth of PC gamers.

Sims, Coaster tycoon, WoW, sims, tycoon, WOW.

You're trying too hard

#148 Alcapello
Member since 2014 • 1396 Posts

@lostrib said:

@alcapello said:

@deadline-zero0 said:

@alcapello said:

No wonder PC having problems getting games like GTA5, Bloodborne, and other major titles. It just too weak and unstable compare to the PS4.

Hey at least the new roller coaster tycoon coming to PC!!! Only.....the graphic are now cartoon LOL (which isn't bad, just saying) LOL

What the ****?

Anyway, like i said in teh other thread, there's no way in hell SoM needs a 3GB card tor un at high settings. Not unless the PS4 version's running at 720p, or medium settngs at best

Face it, PC gaming suck balls this gen compare to the PS4.

It has no game, all the big game are danging in front of the watery mouth of PC gamers.

Sims, Coaster tycoon, WoW, sims, tycoon, WOW.

You're trying too hard

Still I got a point.

#149 lostrib
Member since 2009 • 49999 Posts

@alcapello said:

@lostrib said:

@alcapello said:

@deadline-zero0 said:

@alcapello said:

No wonder PC having problems getting games like GTA5, Bloodborne, and other major titles. It just too weak and unstable compare to the PS4.

Hey at least the new roller coaster tycoon coming to PC!!! Only.....the graphic are now cartoon LOL (which isn't bad, just saying) LOL

What the ****?

Anyway, like i said in teh other thread, there's no way in hell SoM needs a 3GB card tor un at high settings. Not unless the PS4 version's running at 720p, or medium settngs at best

Face it, PC gaming suck balls this gen compare to the PS4.

It has no game, all the big game are danging in front of the watery mouth of PC gamers.

Sims, Coaster tycoon, WoW, sims, tycoon, WOW.

You're trying too hard

Still I got a point.

No, you don't. Hence why you are trying too hard

#150  Edited By DEadliNE-Zero0
Member since 2014 • 6607 Posts

@alcapello said:

@deadline-zero0 said:

@alcapello said:

No wonder PC having problems getting games like GTA5, Bloodborne, and other major titles. It just too weak and unstable compare to the PS4.

Hey at least the new roller coaster tycoon coming to PC!!! Only.....the graphic are now cartoon LOL (which isn't bad, just saying) LOL

What the ****?

Anyway, like i said in teh other thread, there's no way in hell SoM needs a 3GB card tor un at high settings. Not unless the PS4 version's running at 720p, or medium settngs at best

Face it, PC gaming suck balls this gen compare to the PS4.

It has no game, all the big game are danging in front of the watery mouth of PC gamers.

Sims, Coaster tycoon, WoW, sims, tycoon, WOW.