Evil Within needs 4 GB of VRAM to look shiny


#151 7MDMA
Member since 2014 • 315 Posts

@alcapello said:

Still, I got a point.

Posting in a thread about a game that will have higher settings on PC and take advantage of powerful hardware, while bragging about playing it on lower settings on PS4?


#152 Wasdie  Moderator
Member since 2003 • 53622 Posts

@GoldenElementXL said:

I see a few different numbers here regarding how much of the 8GB of GDDR5 is used on the PS4 for games. So I am just gonna drop this slide here.

http://www.officialplaystationmagazine.co.uk/2014/04/15/sucker-punch-explains-how-it-used-ps4s-8gb-ram-in-infamous-second-son/#null

This right here is exactly what I'm talking about when I say they tweak LoD values. This isn't a RAM thing. The bush is still loaded into memory; they just aren't rendering it beyond a certain distance.
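The tweak described here can be sketched in a few lines; the scene objects and `draw_distance` cutoff are hypothetical, but they show why culling by distance saves GPU work without touching RAM:

```python
# Sketch of the LoD/draw-distance tweak described above: the asset stays
# resident in memory; the renderer simply skips it past a cutoff. The
# object format and draw_distance value are made up for illustration.

def visible_objects(objects, camera_pos, draw_distance):
    """Return only the objects close enough to be rendered this frame."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Filtering here changes what is drawn, not what is loaded in RAM.
    return [o for o in objects if dist(o["pos"], camera_pos) <= draw_distance]

scene = [{"name": "bush", "pos": (0.0, 0.0, 50.0)},
         {"name": "tree", "pos": (0.0, 0.0, 200.0)}]
print([o["name"] for o in visible_objects(scene, (0.0, 0.0, 0.0), 100.0)])
# -> ['bush']: the tree is still in memory, just not drawn this frame
```

Real engines usually swap in lower-detail meshes at intermediate distances rather than hard-culling, but the memory story is the same.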


#153 REKThard
Member since 2014 • 479 Posts

I'm calling this BS. Wolfenstein: The New Order on my old card (which is very shit) runs at 1 fps all the time; then I renamed the game's exe and it runs on max settings just fine (don't know the fps, I think it's 30-40).


#154  Edited By 7MDMA
Member since 2014 • 315 Posts

It is BS; same with Shadow of Mordor requiring 6 GB for max textures. Unless the max option is greyed out in your settings, I highly doubt enabling it (if you have a powerful card with less video RAM) will lead to a substantial performance hit, maybe some barely noticeable micro stutter.


#155  Edited By Wasdie  Moderator
Member since 2003 • 53622 Posts
@04dcarraher said:

You are underestimating the lack of processing power. Also, you are wrong, because plenty of games do use more than 15% or 25% of your CPU power, hence the ones using more than a single thread at 80-100%. And plenty of prime examples of CPU bottlenecks to the GPU in games prove that fact. Take a Phenom II X4 to BF3 or 4, limit it to two cores, and see the fps get cut in half.

Now "According to developer DICE, BF4 already uses up to 95 percent of available CPU power on next-gen consoles."

"According to Frostbite technical director Johan Andersson, the game uses 90 to 95 percent of the available CPU power on the PS4 and Xbox One. You can check out an in-depth Q&A at AMD with him and other developers. While the next-gen consoles are clearly more powerful than the previous generation hardware, Sony and Microsoft have decided to focus more on the GPU than the CPU. Both the PS4 and Xbox One have an 8-core CPU clocked at around 1.6 GHz, which doesn't sound like a lot. Furthermore, developers only have access to 6 CPU cores; the last two are reserved for the OS. While it's no surprise that a game as complex as Battlefield 4 uses almost all of the available CPU power, we're surprised that developers are already almost hitting the limit."

Also, all the fps issues in games I'm talking about are from the lack of CPU processing power; not all fps issues are from a lack of GPU power. Sudden changes in direction and lots of multiplayer action are prime examples of the CPU not being able to keep up with the data flow for the GPU. Like you said, graphics can be tweaked to fit the hardware, yet we see instances of fps drops and unstable averages because of the CPU, not the GPU.

"Uses 95 percent of available CPU power on next-gen consoles" is a context-less quote. How efficient exactly is that CPU usage? How often are threads just hung up waiting? I don't believe for a second that a game built during the development of the PS4 and rushed out the door pretty much for launch (BF4's QA was some of the worst in recent memory) was actually properly utilizing the resources available. Also, DICE's tech team is mostly a PC developer, so there is quite a bit of bias there.

From what I can tell, most other devs are running into problems with multithreading. Properly utilizing multiple cores isn't easy, and up until now almost every game was built around 1 main thread that ran on one main core. It's not hard to believe devs are having a hard time adjusting. You said it yourself: games using one core to 80-100%. That's not going to fly on a next-gen console. While the entire CPU on the PS4/Xbox One is not a bottleneck, the per-core performance alone is not a lot, and games absolutely must properly utilize multiple cores if they don't want the CPU to be bottlenecking their game.
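The one-main-thread pattern can be contrasted with a multi-core split in a toy sketch (the entity workload and names are invented; this shows the structure, not an engine implementation):

```python
# Toy illustration of the "one main thread" vs. multi-core point above:
# the same per-frame workload run serially, then split across workers.
# Names and workload are made up; a real engine uses a native job system.
from concurrent.futures import ThreadPoolExecutor

def simulate_entity(entity_id):
    # Stand-in for per-entity game logic (AI, physics, animation).
    return entity_id * 2

entities = list(range(8))

# Old style: one main loop on one core does everything.
serial = [simulate_entity(e) for e in entities]

# Multi-core style: entities handed out to a pool of worker threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(simulate_entity, entities))

assert serial == parallel  # same results, work spread across workers
```

In CPython the GIL means plain threads only help for I/O-bound work, so this sketch shows the structure rather than a speedup; getting real scaling out of that structure is exactly the adjustment the post describes.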

I am not underestimating the lack of processing power. I've watched laptops from 2-3 years ago running mobile i5s and i7s play processor-intensive games like Civilization 5 and Rome 2, only having to turn down graphics settings because the graphics card is weak.

You're forgetting that DX is a huge CPU bottleneck on the PC and right now throwing a slower/weaker CPU at a graphically intensive game really hurts graphic performance. This is a bottleneck that barely exists on the game consoles. There is a reason why DX12 is trying to give lower level access to games so they can get around this bottleneck. DX12 still isn't to the level of a console API but it's far closer than what they have now. It's especially helpful with APUs and tablet-CPUs.

Here's a good example of how much an API change can benefit a game. DX11 to DX12 on a Surface Pro 3 with Intel HD4400 graphics, which is the i5 version of the Surface Pro 3, saw a 40% increase in FPS and a 50% decrease in power consumption while doing 50,000 draw calls. That's huge and that bottleneck exists even more with DX9 (which too many games still use). This is a huge factor to this entire argument.
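To see why those draw-call numbers matter, here is some rough frame-budget arithmetic; the per-call costs below are assumptions for illustration, and only the 50,000-call figure comes from the post:

```python
# Back-of-the-envelope frame-budget math for draw-call overhead.
# The 50,000 draw calls figure is from the post; the per-call CPU costs
# are hypothetical, chosen only to show how the budget gets eaten.
frame_budget_ms = 1000 / 60          # ~16.7 ms per frame at 60 fps
draw_calls = 50_000

cost_dx11_us = 0.25                  # assumed per-call cost under DX11
cost_dx12_us = 0.10                  # assumed per-call cost under DX12

dx11_ms = draw_calls * cost_dx11_us / 1000   # 12.5 ms just on submission
dx12_ms = draw_calls * cost_dx12_us / 1000   # 5.0 ms, leaving headroom

print(f"DX11 submission: {dx11_ms:.1f} ms of {frame_budget_ms:.1f} ms")
print(f"DX12 submission: {dx12_ms:.1f} ms of {frame_budget_ms:.1f} ms")
```

The point of the sketch: with tens of thousands of calls, even fractions of a microsecond per call decide whether submission alone blows the 60 fps budget.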

CPU power isn't nearly as important when it comes to the consoles. It's not a bottleneck like you are saying. The only real bottleneck on the game consoles is the GPU.


#156  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@Wasdie: ... Frostbite is a multithreaded engine, and MP suffers fps drops when a lot of action is going on on large servers, especially with physics involved. BF4 used all available cores because if it didn't, it would never have been able to make the 60 fps target on 1 or 2 cores at 1.6 GHz... Also, DX overhead is not an issue with desktop Intel quad-core CPUs with dedicated GPUs, as proved with Mantle; it mainly helped AMD CPUs a lot because they lack the brute force.

Fact is, the console CPUs lack the power to handle all the normal tasks CPUs do, and devs will move certain CPU-based jobs, such as physics, to the GPU to save resources. Also, I don't think you understand how slow these CPUs are. The Jaguar architecture per clock is only on average 15% faster than AMD's Bobcat series... Tests done with Bobcat show even AMD's old Athlon X2s at the same clock rate were roughly 20% faster, and the Athlon II (K10) was 50% faster. Which means that an Athlon II X4 at 2.4 GHz is still faster than a six-core Jaguar at 1.6 GHz.
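Taking the post's ratios at face value, that comparison reduces to simple arithmetic (illustrative units, not benchmarks):

```python
# Putting the claimed per-clock ratios into numbers. Bobcat = 1.0 unit
# per core per GHz; every figure below is the post's estimate, not a
# measurement, and perfect multi-core scaling is assumed for both chips.
per_clock = {"bobcat": 1.00, "jaguar": 1.15, "athlon2_k10": 1.50}

jaguar_6c = 6 * 1.6 * per_clock["jaguar"]        # six cores at 1.6 GHz
athlon2_x4 = 4 * 2.4 * per_clock["athlon2_k10"]  # four cores at 2.4 GHz

print(f"Jaguar 6c:    {jaguar_6c:.2f} units")    # ~11.0
print(f"Athlon II X4: {athlon2_x4:.2f} units")   # 14.4, the old quad wins
```

Under these assumptions the six Jaguar cores land around 11 units against 14.4 for the old quad-core, which is the post's conclusion in numbers.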

Also, about the Surface Pro 3: its CPUs are all dual cores with HT, and the base clocks are all under 2 GHz (the lowest is 1.3 GHz). That test was done with the 1.9 GHz dual-core i5; of course DX12 is going to help a lot when the CPU lacks the processing power... To give an idea how slow the i5 4300U is: it's only half as strong as the laptop i7 2670QM, and even AMD's A10 6800K at 4.1 GHz is slower than that i7 2670 by around 20-25%. So now imagine a CPU slower still than that i5 4300...


#157 DEadliNE-Zero0
Member since 2014 • 6607 Posts

I can't think of any game that requires 6GB of VRAM, unless it's for a triple monitor setup, if that even has anything to do with it.

Isn't 4GB enough for 4K nowadays?


#158  Edited By kingtito
Member since 2003 • 11775 Posts

@parkurtommo said:

@kingtito said:

All of the specs are fine except for 1...4GB of VRAM. I bought the 780ti just a few months ago and THAT doesn't even have 4GB of VRAM. Am I supposed to run out and purchase another $600/700 780ti just to meet that requirement? Lame

Shoulda waited for the 970 bruh! Considering how long ago the 700 series was launched we could easily expect the next generation at the 3rd Q of this year. And it was totally worth the wait. Like I was going to upgrade to a 760 this year but when I heard the rumors of the 800 series I literally waited months, and now I will be getting a card that is like 2 times better for a similar price (the 970, with 4gb too).

I upgraded to play BF4, i know i know, dumb reason hahah, but I'm a Battlefield fan. I actually wanted to get the 290x but they were sold out everywhere. I was able to get the 780ti as soon as it launched from Amazon. I got excited cause the wife gave me the ok so I wasn't even thinking about future cards. I didn't want her to change her mind lol. I'm happy with my card so no regrets. I try to upgrade every year or 2 but it's been a while and I was tired of my 560ti. Next upgrade will be my MB, CPU and RAM.


#159  Edited By parkurtommo
Member since 2009 • 28295 Posts

@kingtito said:

@parkurtommo said:

@kingtito said:

All of the specs are fine except for 1...4GB of VRAM. I bought the 780ti just a few months ago and THAT doesn't even have 4GB of VRAM. Am I supposed to run out and purchase another $600/700 780ti just to meet that requirement? Lame

Shoulda waited for the 970 bruh! Considering how long ago the 700 series was launched we could easily expect the next generation at the 3rd Q of this year. And it was totally worth the wait. Like I was going to upgrade to a 760 this year but when I heard the rumors of the 800 series I literally waited months, and now I will be getting a card that is like 2 times better for a similar price (the 970, with 4gb too).

I upgraded to play BF4, i know i know, dumb reason hahah, but I'm a Battlefield fan. I actually wanted to get the 290x but they were sold out everywhere. I was able to get the 780ti as soon as it launched from Amazon. I got excited cause the wife gave me the ok so I wasn't even thinking about future cards. I didn't want her to change her mind lol. I'm happy with my card so no regrets. I try to upgrade every year or 2 but it's been a while and I was tired of my 560ti. Next upgrade will be my MB, CPU and RAM.

I'm upgrading from a 560ti too. The 780 Ti was a great choice, albeit expensive; it's just that the lack of at least 4 GB of VRAM is a bummer. I guess I will just make an entirely new build from scratch in about 2 or so years. This one was my first, and I'm surprised it's still working lol.


#160  Edited By kingtito
Member since 2003 • 11775 Posts

@parkurtommo said:

@kingtito said:

@parkurtommo said:

@kingtito said:

All of the specs are fine except for 1...4GB of VRAM. I bought the 780ti just a few months ago and THAT doesn't even have 4GB of VRAM. Am I supposed to run out and purchase another $600/700 780ti just to meet that requirement? Lame

Shoulda waited for the 970 bruh! Considering how long ago the 700 series was launched we could easily expect the next generation at the 3rd Q of this year. And it was totally worth the wait. Like I was going to upgrade to a 760 this year but when I heard the rumors of the 800 series I literally waited months, and now I will be getting a card that is like 2 times better for a similar price (the 970, with 4gb too).

I upgraded to play BF4, i know i know, dumb reason hahah, but I'm a Battlefield fan. I actually wanted to get the 290x but they were sold out everywhere. I was able to get the 780ti as soon as it launched from Amazon. I got excited cause the wife gave me the ok so I wasn't even thinking about future cards. I didn't want her to change her mind lol. I'm happy with my card so no regrets. I try to upgrade every year or 2 but it's been a while and I was tired of my 560ti. Next upgrade will be my MB, CPU and RAM.

I'm upgrading from a 560ti too. The 780 Ti was a great choice, albeit expensive; it's just that the lack of at least 4 GB of VRAM is a bummer. I guess I will just make an entirely new build from scratch in about 2 or so years. This one was my first, and I'm surprised it's still working lol.

I used to re-install my OS every 6 months to a year; that got tiresome after a while, but it kept my system running tip-top. I've always built my computers, going all the way back to the old 486 50 MHz days. The good old days when 1 MB of RAM was $100 lol.


#161 brodelin
Member since 2005 • 583 Posts

Guys, if you think that 4 GB of VRAM is high, then check out the system requirements for LotR: Shadow of Mordor. If you want to max out the textures you'll need 6 GB of VRAM. That is insane!

Link to Neogaf

I believe it's the future of current-gen multiplat games to require more VRAM, since their console builds have more to work with than last-gen consoles.


#162 deactivated-5920bf77daa85
Member since 2004 • 3270 Posts

Not really interested - I had to look up wtf that game was. Still, those recommended specs seem a bit suspicious, as they specify intel/nvidia only.


#163 04dcarraher
Member since 2004 • 23832 Posts
@brodelin said:

I believe it's the future of current-gen multiplat games to require more VRAM, since their console builds have more to work with than last-gen consoles.

The PS4 has only 4.5 GB to work with, and that 4.5 GB has to be split between game cache and VRAM, so these consoles are not going to use 4+ GB for VRAM; expect only 2-3 GB typical usage.
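That split is simple budget arithmetic; in this sketch only the 8 GB total and ~4.5 GB game-available figures come from the thread, the rest is a hypothetical example:

```python
# PS4 memory-budget sketch. The 8 GB total and ~4.5 GB game-available
# figures come from the thread; the cache/VRAM split is a hypothetical
# example, since each game divides its budget differently.
total_gb = 8.0
os_reserved_gb = 3.5                      # OS + system reservation
available_gb = total_gb - os_reserved_gb  # 4.5 GB left for the game

game_cache_gb = 2.0                       # assumed CPU-side game data
vram_like_gb = available_gb - game_cache_gb  # 2.5 GB acting as "VRAM"

print(f"Effective VRAM-like budget: {vram_like_gb} GB")
```

With any plausible cache figure the graphics share lands in the 2-3 GB range the post claims, not the full 4.5 GB.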


#164  Edited By mikhail
Member since 2003 • 2697 Posts

People are holding onto this whole "mah GDDR5!" thing on PS4 as if it's some major game changer, without even considering how underpowered the rest of the system is. If the PS4's access to all this VRAM is such a big deal, why does it still struggle to achieve even a paltry 60 fps at 1080p in most games? Hell, even Sleeping Dogs is going to be running at 30 fps, a game that my four-year-old GPU ran in 2012 at 1080p/60 with no problem whatsoever on 2 GB of VRAM, and yes, that was with the high-res texture pack. Let me guess... optimization will magically cure all of this over time?

Then there's Wolfenstein: The New Order on consoles, another recent idTech 5 game. My 780 Ti with 3 GB of VRAM (yes, GDDR5) coupled with a high-end i5 ran Wolfenstein at 1080p/60 with zero problems and everything maxed out. Yet we see the console versions having to dynamically reduce vertical resolution on both platforms just to maintain 60 fps... why is that? If the PS4's 8 GB of unified GDDR5 is so magically powerful, why does the game need to drop its resolution just to maintain a 60 fps lock? My GPU has "only" 3 GB of dedicated VRAM, yet I don't have this problem... hmmm...
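The dynamic vertical-resolution scaling mentioned above works roughly like this sketch (the thresholds, step size and bounds are invented for illustration):

```python
# Sketch of dynamic vertical-resolution scaling: when a frame misses
# the 60 fps budget, render fewer rows; with headroom, scale back up.
# Thresholds, step and bounds here are invented for illustration.

def adjust_height(height, frame_ms, budget_ms=1000 / 60,
                  step=72, lo=504, hi=1080):
    if frame_ms > budget_ms:           # over budget: drop resolution
        return max(lo, height - step)
    if frame_ms < budget_ms * 0.85:    # clear headroom: raise it again
        return min(hi, height + step)
    return height                      # close to budget: hold steady

h = 1080
for ms in [18.0, 18.0, 15.0, 12.0]:    # two slow frames, then fast ones
    h = adjust_height(h, ms)
print(h)                               # 1008: dipped to 936, recovered once
```

The design trade-off is exactly what the post complains about: frame rate stays locked at 60 by quietly spending image resolution instead.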


#165 clyde46
Member since 2005 • 49061 Posts

@deadline-zero0 said:

I can't think of any game that requires 6GB of VRAM, unless it's for a triple monitor setup, if that even has anything to do with it.

Isn't 4GB enough for 4K nowadays?

These are games built without being hamstrung by the PS3/360. I reckon we will see these specs creep up. That being said, since when has "recommended specs" ever been right?


#166  Edited By 04dcarraher
Member since 2004 • 23832 Posts
@mikhail said:

People are holding onto this whole "mah GDDR5!" thing on PS4 as if it's some major game changer, without even considering how underpowered the rest of the system is. If the PS4's access to all this VRAM is such a big deal, why does it still struggle to achieve even a paltry 60 fps at 1080p in most games? Hell, even Sleeping Dogs is going to be running at 30 fps, a game that my four-year-old GPU ran in 2012 at 1080p/60 with no problem whatsoever on 2 GB of VRAM, and yes, that was with the high-res texture pack. Let me guess... optimization will magically cure all of this over time?

Then there's Wolfenstein: The New Order on consoles, another recent idTech 5 game. My 780 Ti with 3 GB of VRAM (yes, GDDR5) coupled with a high-end i5 ran Wolfenstein at 1080p/60 with zero problems and everything maxed out. Yet we see the console versions having to dynamically reduce vertical resolution on both platforms just to maintain 60 fps... why is that? If the PS4's 8 GB of unified GDDR5 is so magically powerful, why does the game need to drop its resolution just to maintain a 60 fps lock? My GPU has "only" 3 GB of dedicated VRAM, yet I don't have this problem... hmmm...

It's the lack of GPU power and memory to go around; however, the CPU is also lacking. And that 8 GB of GDDR5... only 4.5 GB is usable under normal use, and that has to be split between game cache and VRAM...


#167 parkurtommo
Member since 2009 • 28295 Posts

@clyde46 said:

@deadline-zero0 said:

I can't think of any game that requires 6GB of VRAM, unless it's for a triple monitor setup, if that even has anything to do with it.

Isn't 4GB enough for 4K nowadays?

These are games built without being hamstrung by the PS3/360. I reckon we will see these specs creep up. That being said, since when has "recommended specs" ever been right?

Yeah recommended specs sometimes just means "you need this for medium settings" or "you need this for maximum settings at a constant 60 fps". But it can still be helpful, because "required specs" certainly does **** all.


#168 SexyJazzCat
Member since 2013 • 2796 Posts

@Wasdie: Shit, parts are getting out of date but remain expensive.


#169  Edited By miiiiv
Member since 2013 • 943 Posts

I think it's BS that it would need 4 GB of VRAM at 1080p. Just as Watch Dogs was said to use 3 GB of VRAM, yet it runs better on a GTX 690 (2 GB usable VRAM) than it does on a 6 GB Titan or a 3 GB GTX 780 Ti at 1600p with 4x MSAA. That's something that wouldn't happen if the 690 was bottlenecked by its VRAM.


#170  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@miiiiv: Gotta find a way to sell new cards. Tech has outpaced gaming it seems.


#171 mister_XYZ
Member since 2014 • 45 Posts

WOW, insane requirements. Hope I can run it on my rig :(


#172 with_teeth26
Member since 2007 • 11511 Posts

ID Tech 5 is such a piece of shit. Takes twice the HDD space and much more powerful hardware with visuals that look slightly worse than the latest builds of UE3.


#173  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

It's the lack of GPU power and memory to go around; however, the CPU is also lacking. And that 8 GB of GDDR5... only 4.5 GB is usable under normal use, and that has to be split between game cache and VRAM...

It has 4.5 GB for VRAM, which is more than double what most mid-range cards have. Even 3 GB of VRAM on the PS4 would have been more than enough; most 7870s, 660 Tis and similar GPUs have 2 GB of RAM, the 7950 mostly has 3 GB, and so do the similar ones. 4 GB on PC is basically rare; the most common by far is still 1 GB, followed closely by 2 GB.

The PS4 has more RAM than it actually needs for a GPU of its class. But it's funny I argue this with you as well; I told you that when games on PC started to demand more and more RAM, those 1 GB and 2 GB cards would suffer against the PS4. Lack of RAM can impact performance just as badly as lack of power can. I already proved this with AnandTech's benchmark of the 1 GB 7850 vs the 2 GB 7850: on Skyrim with HD textures the 1 GB 7850 choked, and the 2 GB one almost doubled it in performance, two GPUs that are otherwise the same.

But you would not listen. My R270 can't run a certain game coming now because it is asking for 6 GB of VRAM for ultra settings, even at 1080p; for me, not even high, but medium. All of a sudden my card, which ran almost all games on ultra, can't any more because of video memory.


#174  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@tormentos said:

@04dcarraher said:

It's the lack of GPU power and memory to go around; however, the CPU is also lacking. And that 8 GB of GDDR5... only 4.5 GB is usable under normal use, and that has to be split between game cache and VRAM...

It has 4.5 GB for VRAM, which is more than double what most mid-range cards have. Even 3 GB of VRAM on the PS4 would have been more than enough; most 7870s, 660 Tis and similar GPUs have 2 GB of RAM, the 7950 mostly has 3 GB, and so do the similar ones. 4 GB on PC is basically rare; the most common by far is still 1 GB, followed closely by 2 GB.

The PS4 has more RAM than it actually needs for a GPU of its class. But it's funny I argue this with you as well; I told you that when games on PC started to demand more and more RAM, those 1 GB and 2 GB cards would suffer against the PS4. Lack of RAM can impact performance just as badly as lack of power can. I already proved this with AnandTech's benchmark of the 1 GB 7850 vs the 2 GB 7850: on Skyrim with HD textures the 1 GB 7850 choked, and the 2 GB one almost doubled it in performance, two GPUs that are otherwise the same.

But you would not listen. My R270 can't run a certain game coming now because it is asking for 6 GB of VRAM for ultra settings, even at 1080p; for me, not even high, but medium. All of a sudden my card, which ran almost all games on ultra, can't any more because of video memory.

lol, again, stop with the dumb fanboyism... The more open the game, the more is used for game cache, which means less for VRAM. You think the PS4 has 4.5 GB for VRAM? God, you're dumb... the PS4 has 4.5 GB TOTAL for both VRAM and game cache. The GPU will only use 2-3 GB for VRAM, not 4+ GB, lol.

And then again, it's funny that you think these upcoming games will need that much memory... Fact is, the requirements are way over-exaggerated or really badly coded. Also, you want to try a more up-to-date Skyrim benchmark? Oh wow, look at this: 1080p, ultra settings with 4x AA and the high-res pack, but yet the 1 GB card performs on par with the 2 GB model... See what happens after a patch or two and updated drivers...


#175  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

lol, again, stop with the dumb fanboyism... The more open the game, the more is used for game cache, which means less for VRAM. And then again, it's funny that you think these upcoming games will need that much memory... Fact is, the requirements are way over-exaggerated. Also, you want to try a more up-to-date Skyrim benchmark? Oh wow, look at this: 1080p, ultra settings with 4x AA and the high-res pack, but yet the 1 GB card performs on par with the 2 GB model... See what happens after a patch or two and updated drivers...

By the way, the test I posted uses 4x MSAA, not just 4x AA, and at a resolution a little higher than 1080p: 1200p. And basically the 2 GB version runs faster at 1200p with 4x MSAA than the 1 GB does at 1080p without it.

http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/15

Now tell me ram doesn't impact performance again..

When the 2 GB version runs faster at a higher resolution with a bigger load than the 1 GB does at a lower resolution with a smaller load.

The same with the 650 Ti...

Regardless of whether we’re looking at AMD or NVIDIA cards, there’s only one benchmark where 2GB cards have a clear lead: Skyrim at 1920 with the high resolution texture pack. For our other 9 games the performance difference is miniscule at best.

http://www.anandtech.com/show/6359/the-nvidia-geforce-gtx-650-ti-review/6

Watching you try to make a double-standard argument is a joke: on one side you try to diminish the PS4's 4.5 GB of VRAM, yet on the other you claim there is no difference between a 1 GB card and a 2 GB card in memory footprint when the contrary has already been proven. It's like having a 7970 with 1 GB and believing it will not hurt it.


#176  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@tormentos said:

@04dcarraher said:

lol, again, stop with the dumb fanboyism... The more open the game, the more is used for game cache, which means less for VRAM. And then again, it's funny that you think these upcoming games will need that much memory... Fact is, the requirements are way over-exaggerated. Also, you want to try a more up-to-date Skyrim benchmark? Oh wow, look at this: 1080p, ultra settings with 4x AA and the high-res pack, but yet the 1 GB card performs on par with the 2 GB model... See what happens after a patch or two and updated drivers...


I post irrelevant things look at me!

lol, you're so dense; 4x AA is the same as MSAA. Again using an out-of-date review, poor attempt... And again, thinking the PS4 uses all available 4.5 GB of memory for VRAM is hilarious.

Another review, this time even newer, from May 2013: GTX 650 Ti 1 GB vs 2 GB, 1080p, 4x AA, maxed out...


#177  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

I am a sony hater who can't admit being wrong.

Owned like the butthurt hermit that you are.. lol

Link already posted in the post you ignored, sad hermit..

What's that 21 FPS difference from the same fu**ing GPU?

lol...


#178 04dcarraher
Member since 2004 • 23832 Posts

@tormentos: lol, it's really sad you keep on posting the same shit from the same outdated and irrelevant benches; you really must be desperate


#179 tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

@tormentos: lol, it's really sad you keep on posting the same shit from the same outdated and irrelevant benches; you really must be desperate

Yeah, keep hiding behind the "outdated" excuse; there is a reason why the 7970 doesn't have 2 GB of RAM.


#180  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@tormentos said:

@04dcarraher said:

@tormentos: lol, it's really sad you keep on posting the same shit from the same outdated and irrelevant benches; you really must be desperate

Yeah, keep hiding behind the "outdated" excuse; there is a reason why the 7970 doesn't have 2 GB of RAM.

No, the reason is the memory bus width; that determines the best ratio for memory, you goof. AMD always uses the higher ratio to set itself apart from Nvidia: a 384-bit bus fits 1.5 GB, 3 GB or 6 GB better than putting in 2 or 4 GB... Needing more than 2 GB for 1080p gaming is dumb, because there is no need to if you know how to code... and the proof is in the pudding with all these newer titles asking for i7s and 3 GB+ video cards, yet lower-class CPUs and video cards with less memory perform just fine and just as well. It's called over-inflation: artificial restrictions to promote hardware sales and to cover their own asses because of piss-poor coding. Even with insane resolutions and AA, the difference between 2 and 4 GB is virtually nothing. And by the time you actually need more than 2 GB, GPUs in those tiers aren't strong enough to handle those resolutions and settings anyway...
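There is concrete arithmetic behind the bus-width point; this sketch assumes the standard 32-bit GDDR5 chip interface and the common chip densities of the era:

```python
# The bus-width arithmetic behind "384-bit fits 1.5/3/6 GB": each GDDR5
# chip presents a 32-bit interface, so bus width fixes the chip count,
# and capacity is chips x per-chip density (common densities shown).
def capacities(bus_bits, densities_gbit=(1, 2, 4)):
    chips = bus_bits // 32                          # 32 bits per chip
    return [chips * d / 8 for d in densities_gbit]  # Gbit -> GB

print(capacities(384))  # [1.5, 3.0, 6.0] -> 7970 / 780 Ti class cards
print(capacities(256))  # [1.0, 2.0, 4.0] -> 7870 / GTX 770 class cards
```

So a 384-bit card naturally ships with 1.5, 3 or 6 GB, while even multiples like 2 or 4 GB fall out of the 256-bit configurations, which is the ratio argument in the post.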


#181 sukraj
Member since 2008 • 27859 Posts

2 more weeks before I get this bad boy.