PC gaming, 4K OR 60FPS. Not both.

Avatar image for Cranler
#101 Posted by Cranler (8809 posts) -

4K TVs have really dropped in price. A 55" 120 Hz 1080p LCD in 2010 was about the same price as a 55" 4K is now.

In another 3 or 4 years, 1080p TVs will be pretty much phased out.

Avatar image for Zen_Light
#102 Posted by Zen_Light (2143 posts) -

Secret sauce?

Avatar image for mgools
#103 Posted by mgools (1170 posts) -

@Cranler said:

4k tv's have really dropped in price. A 55" 120 hz 1080p lcd in 2010 was about the same price as a 55" 4k is now.

In another 3 or 4 years 1080p tv's will be pretty much phased out.

No need for 4K on the TV side. It is actually not a good thing right now. Broadcast TV is 720p and 1080i (and there are no plans to change this that I have heard of). Blu-ray is 1080p. A few videos from YouTube are 4K. What you end up with is video running at a non-native resolution, which causes a soft image. (The same goes for Xbox One and PS4 games, as they are upscaled.)

What we also run into at this point is a bandwidth and compression issue. Actually getting these larger bitrates to the home in good video quality is a task. 4K right now is more of a marketing thing than a usable thing.

On the PC gaming side, though, if games can support it and send it out natively, and your PC can push it, then things will look good. At some point there is a law of diminishing returns with resolution.

Avatar image for faizan_faizan
#104 Edited by faizan_faizan (7869 posts) -

@Couth_ said:

@miiiiv said:

In most games yes, if we are talking without aa, the 780ti manges stable 60 fps in most games at 1080p max settings but when speaking of 1080p, I assumed at least 4x aa. And there are several examples where the gtx 780ti drops below 60 fps with 4x aa at 1080p max settings like Crysis 3, Metro LL and Arma 3

As for 1440p without aa (I agree, at that res. aa isn't necessary) there are still some cases where the gtx 780ti doesn't manage a stable 60 fps, Crysis 3 and Arma 3 come to mind again.

Don't get me wrong, the gtx 780ti is an awesome graphics card but it certainly is not "destroying" 1080p, if it were it would easily run all games at 1080 with some amount of aa and never ever drop below 60 fps.

Why do I get the feeling you don't own any high end PC equipment. There isn't a single game out right now that the 780Ti won't get 60fps in at 1080p

Don't even believe what I am reading right now. You are really trying to suggest the highest end video game performance graphics card offered by Nvidia struggles at 1080p.. LOL

He's not wrong.

Crysis 3 doesn't get 60FPS with either AA solution. I'm sure you can get a solid 60FPS with AA turned off.

Avatar image for Cranler
#105 Posted by Cranler (8809 posts) -

@mgools said:

@Cranler said:

4k tv's have really dropped in price. A 55" 120 hz 1080p lcd in 2010 was about the same price as a 55" 4k is now.

In another 3 or 4 years 1080p tv's will be pretty much phased out.

No need for 4K on the TV side. It is actually not a good thing right now. Broadcast TV is 720P and 1080i (and no plans to change this that I have heard of). Blu-ray is 1080P. A few videos from Youtube are 4K. What you end up with is video running at non native resolution which causes a soft image. (Same will go for Xbox One and PS4 games as they are upscaled)

Also what we run into at this point is a bandwidth and compression issue. Actually getting these larger bitrates to the home in good video quality is a task. 4K right now is more of a marketing thing then a usable thing.

On the PC gaming side though, if games can support it, and send it out natively and your pc can push it then things will look good. At some point there is a law of diminishing returns with resolution.

It's never about what consumers need. At this point most tv buyers are getting 1080p tv's to watch 720p tv content.

TV manufacturers and stores will always make the most room for the expensive tv's. You think Best Buy and Samsung will be content selling 65" 1080p tv's for $300 3 years from now? You don't think they're going to continue pushing all the latest tech like they always have?

Avatar image for Cranler
#106 Posted by Cranler (8809 posts) -

@faizan_faizan said:

@Couth_ said:

@miiiiv said:

In most games yes, if we are talking without aa, the 780ti manges stable 60 fps in most games at 1080p max settings but when speaking of 1080p, I assumed at least 4x aa. And there are several examples where the gtx 780ti drops below 60 fps with 4x aa at 1080p max settings like Crysis 3, Metro LL and Arma 3

As for 1440p without aa (I agree, at that res. aa isn't necessary) there are still some cases where the gtx 780ti doesn't manage a stable 60 fps, Crysis 3 and Arma 3 come to mind again.

Don't get me wrong, the gtx 780ti is an awesome graphics card but it certainly is not "destroying" 1080p, if it were it would easily run all games at 1080 with some amount of aa and never ever drop below 60 fps.

Why do I get the feeling you don't own any high end PC equipment. There isn't a single game out right now that the 780Ti won't get 60fps in at 1080p

Don't even believe what I am reading right now. You are really trying to suggest the highest end video game performance graphics card offered by Nvidia struggles at 1080p.. LOL

He's not wrong.

Crysis 3 doesn't get 60FPS in either AA solutions. I'm sure you can get a solid 60FPS with AA turned off.

One thing though is that some of the most demanding settings in Crysis have very little visual impact. You can drop down a couple settings, save 30 fps and see very little difference.

Avatar image for mdign
#107 Edited by MDIGN (30 posts) -

As someone who currently plays in 4K, I can honestly say that most of these benchmarks will steer you wrong, because you don't need AA at 4K. I have some proof here: http://steamcommunity.com/id/MDIGN/screenshots/?appid=242700

I'd upload the pics to GameSpot, but the last time I tried that there was a large reduction in quality. Those are a mix of pre-rendered and real-time cut scenes. Before you ask, the ones with the jaggies are the pre-renders.

I also get above 60fps in most games (with no AA it's easier on the GPU than, for example, 1600p with 8x AA or 16x AA), even Battlefield 4, thanks to Mantle. My only problem is that this Seiki 4K UHD TV is limited to 30 Hz at 4K. However, I've overclocked the panel to do 32 Hz, which is slightly better, but still not great. It's not bad by any means though; to be honest, most people won't see any difference between 30 and 60 Hz (inb4 "well, my eyes can only see 120hz!" Yeah, so can mine, bud, but not everyone has eyes as good as ours). 4K gaming is here, if you have the budget. The most expensive thing will be the GPUs.

What I like about this TV though is that it's currently the best deal on the market. With a firmware update and some tinkering, I can run it at 1080p @120hz, 1440p @75hz, 1620p @ 60hz, 1800p @45hz, and basically anything in-between. It's bigger, better, and cheaper than monitors that cost double the amount for only 1440p @60hz.

The reason it doesn't do 4K @ 60 Hz is actually due to component limits. It's only using HDMI 1.4; however, newer models are supposed to be coming soon with HDMI 2.0 and 4K @ 60 Hz+.
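
For anyone curious where that limit comes from, here's a rough back-of-the-envelope sketch; the blanking raster and the 8b/10b-adjusted link figures below are approximations, not exact spec numbers:

def data_rate_gbps(raster_pixels, fps, bits_per_pixel=24):
    # uncompressed video payload in Gbit/s, ignoring audio and protocol overhead
    return raster_pixels * fps * bits_per_pixel / 1e9

raster_4k = 4400 * 2250          # 3840x2160 plus typical blanking intervals
hdmi_1_4 = 8.16                  # ~Gbit/s of video data (10.2 Gbit/s link, 8b/10b)
hdmi_2_0 = 14.4                  # ~Gbit/s of video data (18 Gbit/s link, 8b/10b)

for fps in (30, 60):
    rate = data_rate_gbps(raster_4k, fps)
    print(f"4K @ {fps} Hz: ~{rate:.1f} Gbit/s "
          f"(fits HDMI 1.4: {rate <= hdmi_1_4}, fits HDMI 2.0: {rate <= hdmi_2_0})")

4K at 30 Hz comes in around 7 Gbit/s, which squeezes under the HDMI 1.4 ceiling; at 60 Hz it roughly doubles, which is why HDMI 2.0 is needed.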

Avatar image for KHAndAnime
#108 Edited by KHAndAnime (16819 posts) -

@Cranler said:

@KHAndAnime said:

@Cranler said:

@KHAndAnime said:

I think I'll stick with 1080p/1200p, thanks. 1080P is crisp and jaggy free with any AA - so there's zero reason to upgrade (ever). I can't see a point 4k will ever be the standard - there's absolutely no reason for it. It's upping the resolution just for the sake - which might seem cool to people who don't understand resolution is low on the list of what makes good IQ, but it's not so useful for everyone else.

4K will become standard eventually. Manufacturers like to phase out the old cheap stuff which is what 1080p will be soon.

No, not for a very, very long time. There's no 4k-native content, none on the horizon, and there's no reason for it to even exist. People aren't going to randomly upgrade to 4k monitors even if they were more common and cheap - it takes serious hardware to run games at 4k resolution, nobody will want to pay tons of extra cash for a barely noticeable boost in sharpness. It's a niche for enthusiasts.

Netflix has 4k content.

1280x1024 was the most commonly used res by pc gamers at the beginning of the last console gen. 1080p didn't start becoming common until the 8800 gtx came out although 1680x1050 was just as if not more common for a long time.

Hardware keeps improving at a faster rate than software and with the new consoles being so outdated the hardware will get further ahead than ever. Soon we'll see $400 gpu's that can run the latest games at over 120fps at 1080p, so many will be inclined to upgrade.

4k is a big deal since you not only get sharper graphics but it's much better at eliminating jaggies than aa. Some games have awful aa implementation. In Tombraider I played without aa because both aa types blur the overall image.

You're forgetting that game graphics improve with time as well. People can barely run 4K games now - they won't be able to run the 4K games coming out within the next year because they'll be significantly more taxing. We just hit next-gen, so the stagnation in graphics development will soon be gone and you'll be struggling just to run games at 1080p again. 4K is extremely taxing compared to 1080p, yet it isn't significantly better at eliminating jaggies than AA. With 4x MSAA @ 1200p, I see zero jaggies - why would I ever need 4K? The 4K content on Netflix is worthless - it has already been established that you have to have a fairly large TV and sit close to it to even see the difference between 720p and 1080p. Don't fall for the number gimmick - more isn't better; you need to learn that resolution has a "sweet spot". What you should be striving for is monitors with better color accuracy, contrast, and black levels - those are far more important in achieving a good picture and realistic graphics. Ask around at avsforum.com.

Avatar image for santoron
#109 Posted by santoron (8581 posts) -

60 FPS for me. I have zero interest in 4k anywhere in my home, least of all my desktop. So much power required for a minute tangible benefit.

Avatar image for scatteh316
#110 Posted by scatteh316 (5699 posts) -

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

Avatar image for mdign
#111 Posted by MDIGN (30 posts) -

@scatteh316 said:

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

...I'm sorry, but do you know what you're talking about? That makes almost no logical sense.

Avatar image for KHAndAnime
#112 Posted by KHAndAnime (16819 posts) -

@mdign said:

@scatteh316 said:

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

...I'm sorry, but do you know what you're talking about? That makes almost no logical sense.

You need more video memory to max out games on a GTX 780, what makes no logical sense about that?

Avatar image for topgunmv
#113 Posted by topgunmv (10691 posts) -

@mdign said:

@scatteh316 said:

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

...I'm sorry, but do you know what you're talking about? That makes almost no logical sense.

Probably referring to video memory. Nvidia likes to cheap out on ram, so I'm going to guess that the 780 only has 3 gigs.

Avatar image for geraldwillkill
#114 Posted by geraldwillkill (1134 posts) -

4K or 60fps? Neither. I'm currently at 1080p on a 144Hz monitor and it's amazing when I reach 60+ fps. I will probably never get a 4K monitor for gaming purposes.

Avatar image for Couth_
#115 Posted by Couth_ (10369 posts) -

@faizan_faizan said:

He's not wrong.

I'm specifically talking about no AA; also, your second pic is 2560x1600, which is an even higher resolution than 1440p.

Avatar image for Cranler
#116 Posted by Cranler (8809 posts) -

@KHAndAnime said:

@Cranler said:

@KHAndAnime said:

@Cranler said:

@KHAndAnime said:

I think I'll stick with 1080p/1200p, thanks. 1080P is crisp and jaggy free with any AA - so there's zero reason to upgrade (ever). I can't see a point 4k will ever be the standard - there's absolutely no reason for it. It's upping the resolution just for the sake - which might seem cool to people who don't understand resolution is low on the list of what makes good IQ, but it's not so useful for everyone else.

4K will become standard eventually. Manufacturers like to phase out the old cheap stuff which is what 1080p will be soon.

No, not for a very, very long time. There's no 4k-native content, none on the horizon, and there's no reason for it to even exist. People aren't going to randomly upgrade to 4k monitors even if they were more common and cheap - it takes serious hardware to run games at 4k resolution, nobody will want to pay tons of extra cash for a barely noticeable boost in sharpness. It's a niche for enthusiasts.

Netflix has 4k content.

1280x1024 was the most commonly used res by pc gamers at the beginning of the last console gen. 1080p didn't start becoming common until the 8800 gtx came out although 1680x1050 was just as if not more common for a long time.

Hardware keeps improving at a faster rate than software and with the new consoles being so outdated the hardware will get further ahead than ever. Soon we'll see $400 gpu's that can run the latest games at over 120fps at 1080p, so many will be inclined to upgrade.

4k is a big deal since you not only get sharper graphics but it's much better at eliminating jaggies than aa. Some games have awful aa implementation. In Tombraider I played without aa because both aa types blur the overall image.

You're forgetting that game graphics improve with time as well. People can barely run 4k games now - they won't be able to run the 4k games coming out within the next year because they'll be significantly more taxing. We just hit next-gen so the stagnation in graphics development will be soon gone and you'll be struggling to just run games at 1080P again. 4k is extremely taxing compared to 1080P, yet 4k isn't significantly better at eliminating jaggies than AA. With 4x MSAA @ 1200P, I see zero jaggies - why would I ever need 4k? The 4k content on Netflix is worthless - it has already been established you have to have a fairly large TV and have to be close to it to even see the difference between 720p and 1080p. Don't fall for the number gimmick - more isn't better, you need to learn that resolution has a "sweet spot". What you should be striving for is monitors with better color accuracy, contrast, and black levels - those are far more important in achieving a good picture and realistic graphics. Ask around at avsforum.com.

Graphics have always been improving. There were times when graphics improved at a much faster rate than they do today yet the average pc gaming resolution has always been increasing.

People could barely run Oblivion at 720p in 2006. Graphics have improved a lot since then yet we're at 1080p now as the standard desktop gaming res.

Upcoming games will be more demanding because of the new console gen, but there are more powerful GPUs on the way to counter it. Graphics progression will slow down again in two years while GPU progression will most likely remain steady. High-end GPUs in 2016 will smoke any game at 1080p, so people will switch to 1440p; then that will become too easy for the high-end GPUs and 4K will be next.

Like I said before, many games' AA implementation blurs the entire image. A resolution high enough to eliminate jaggies would be ideal.

How did you come to the conclusion that Netflix 4k content is useless? I saw a review where they said it's easy to notice the difference over Netflix at 1080p.

How did they run the 720p vs 1080p tests? The only way for a proper test would be the exact same content on same size tv by same manufacturer. But what content comes in both 720p and 1080p?

I can see the difference from 10 ft away on a 46". I can easily see the difference between Netflix 720p, 1080p and then blu ray 1080p.

Avatar image for EducatingU_PCMR
#117 Edited by EducatingU_PCMR (1463 posts) -

You'd need 780s with 6GB VRAM for 4K; 3GB isn't going to cut it. Just another reason to laugh at Ngreedia: "high end" cards with 2GB RAM (770/680) and "enthusiast models" (780/780Ti) with 3GB, LOL.

You'd spend less with two 290s, which BTW have dropped in price for some models, $780 for two, instead of paying $1,160 for two 6GB 780s.

4GB of VRAM is about the sweet spot for 4K.
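
For a very rough sense of scale, here's a minimal sketch of what just the render targets cost at 4K versus 1080p; it assumes plain 32-bit color and depth buffers, and real engines pile G-buffers, shadow maps, and above all textures on top of this, which is where the gigabytes actually go:

def target_mb(width, height, bytes_per_pixel=4):
    # one 32-bit render target at the given resolution, in MiB
    return width * height * bytes_per_pixel / 1024**2

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    color = target_mb(w, h)                 # single color buffer
    minimal = 3 * color + target_mb(w, h)   # triple-buffered color + depth/stencil
    print(f"{name}: {color:.0f} MB per target, ~{minimal:.0f} MB for a bare minimum chain")

Each 4K target is roughly four times its 1080p counterpart, so every extra buffer a renderer keeps around scales the same way.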

Avatar image for Truth_Hurts_U
#118 Edited by Truth_Hurts_U (9697 posts) -

144 Hz with max settings at 1080 res > 4k

http://isthisretina.com/

A 24-inch monitor = 3 feet 1 inch before you stop seeing the pixels, which is almost perfect for someone sitting at a desk 2.5 feet away.

A 2K monitor would be as high as I would go. 4K is overkill unless you're doing triple/quad displays.
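
If anyone wants to reproduce that figure, here's a small sketch of the kind of math isthisretina.com does; it assumes roughly 1 arcminute of visual acuity (about 20/20 vision), which is a simplification:

import math

def retina_distance_inches(h_px, v_px, diagonal_inches, acuity_arcmin=1.0):
    ppi = math.hypot(h_px, v_px) / diagonal_inches      # pixels per inch
    pixel_size = 1.0 / ppi                               # inches per pixel
    # distance at which one pixel subtends the eye's resolving angle
    return pixel_size / math.tan(math.radians(acuity_arcmin / 60.0))

d1080 = retina_distance_inches(1920, 1080, 24)
d4k = retina_distance_inches(3840, 2160, 24)
print(f'24" 1080p: ~{d1080:.0f} in (~{d1080 / 12:.1f} ft)')   # roughly 3 ft 1 in
print(f'24" 4K:    ~{d4k:.0f} in (~{d4k / 12:.1f} ft)')

The 1080p case lands right around the 3 feet 1 inch quoted above; the 4K case only matters if you sit well under two feet from a 24-inch panel.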

Avatar image for miiiiv
#119 Edited by miiiiv (897 posts) -
@Couth_ said:

@faizan_faizan said:

He's not wrong.

I'm specifially talking about no AA, also your second pic is 2560x1600 which is an even higher resolution than 1440p

Here are two benchmarks: the 780 Ti manages 55 fps average in Crysis 3 at 1080p without AA, and at 1440p without AA it averages 31 fps. The OC'd models fare a bit better, but keep in mind that the minimum frame rate is almost always considerably lower than the average fps.
And in Metro LL at 1440p without AA, the average frame rate is 62fps and minimum 48fps.

And my graphics solution is more powerful than the 780 Ti, so I should know: when I run Crysis 3 at max settings, 1080p with 4x AA, I get drops into the high 40s during the most demanding parts. I usually don't reduce the AA to get better performance but rather dial back a couple of the settings, even though I'm mostly okay with 45+ fps.

Avatar image for thereal25
#120 Posted by thereal25 (1152 posts) -

Yep, I don't think it'd be worth it to attempt 4K anytime soon (unless you're really wealthy).

The thing is, even if you did somehow manage to get the latest games running at or above 60fps average on high/ultra settings - what about tomorrow's games?

In other words, buying a FUTURE PROOF 4K gaming setup would be near impossible at the moment.

Avatar image for jhonMalcovich
#121 Posted by jhonMalcovich (6905 posts) -

@scatteh316 said:

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

Google what SLI means

Avatar image for deactivated-583e460ca986b
#122 Posted by deactivated-583e460ca986b (7240 posts) -

@jhonMalcovich said:

@scatteh316 said:

You don't have the memory to max out games at 4k on GTX 780's.

Google what SLI means

Why? Vram doesn't stack. If you don't have the 6 GB 780's then 4K isn't happening.

Avatar image for jhonMalcovich
#123 Edited by jhonMalcovich (6905 posts) -

@GoldenElementXL said:

@jhonMalcovich said:

@scatteh316 said:

You don't have the memory to max out games at 4k on GTX 780's.

Google what SLI means

Why? Vram doesn't stack. If you don't have the 6 GB 780's then 4K isn't happening.

I know that it's split between the cards, but still, overall it uses 6GB, not 3GB. And as I already showed with several benchmarks, GTX 780 SLI does 60fps in most games except a few very heavy ones, like Crysis 3 and Metro Last Light (30fps at 4K).

Avatar image for deactivated-583e460ca986b
#124 Posted by deactivated-583e460ca986b (7240 posts) -

@jhonMalcovich said:

@GoldenElementXL said:

@jhonMalcovich said:

Google what SLI means

Why? Vram doesn't stack. If you don't have the 6 GB 780's then 4K isn't happening.

I know that it's splitting, but still,in overall, it uses 6GB, not 3GB. And as I already showed several benchmarks, GTX780 SLI is doing 60fps in most games except a few very heavy ones, like Crysis 3 and Metro Last Light (30fps at 4k)

Um, no. It uses 3GB.

I think you need to read these.

http://www.tomshardware.com/answers/id-1868829/sli-increase-vram-usage.html
http://www.overclock.net/t/1365868/please-does-vram-stack-in-sli-crossfire-or-not
http://forums.aria.co.uk/showthread.php/125719-Vram-stacking

And what settings are being used in those benchmarks you posted? Because every other benchmark that has been posted in this thread does not support your claim.

Avatar image for jhonMalcovich
#125 Edited by jhonMalcovich (6905 posts) -

@GoldenElementXL said:

@jhonMalcovich said:

@GoldenElementXL said:

@jhonMalcovich said:

Google what SLI means

Why? Vram doesn't stack. If you don't have the 6 GB 780's then 4K isn't happening.

I know that it's splitting, but still,in overall, it uses 6GB, not 3GB. And as I already showed several benchmarks, GTX780 SLI is doing 60fps in most games except a few very heavy ones, like Crysis 3 and Metro Last Light (30fps at 4k)

Um, no. It uses 3GB.

I think you need to read these.

http://www.tomshardware.com/answers/id-1868829/sli-increase-vram-usage.html

http://www.overclock.net/t/1365868/please-does-vram-stack-in-sli-crossfire-or-not

http://forums.aria.co.uk/showthread.php/125719-Vram-stacking

And what settings are being used in those benchmarks you posted? Because every other benchmark that has been posted in this thread does not support your claim.

It doesn't contradict what I am saying. Every video card in SLI will use its own VRAM, so overall 6GB are used, not 3GB. Video cards in SLI don't share VRAM among themselves, but each is still capable of accessing its own VRAM.

2-way SLI:

780 (1) uses 3Gb

780 (2) uses 3Gb

It's not as efficient as having 6GB of unified memory, but it still eases the overall load.

Avatar image for deactivated-583e460ca986b
#126 Posted by deactivated-583e460ca986b (7240 posts) -

@jhonMalcovich said:

@GoldenElementXL said:

@jhonMalcovich said:

I know that it's splitting, but still,in overall, it uses 6GB, not 3GB. And as I already showed several benchmarks, GTX780 SLI is doing 60fps in most games except a few very heavy ones, like Crysis 3 and Metro Last Light (30fps at 4k)

Um, no. It uses 3GB.

I think you need to read these.

http://www.tomshardware.com/answers/id-1868829/sli-increase-vram-usage.html

http://www.overclock.net/t/1365868/please-does-vram-stack-in-sli-crossfire-or-not

http://forums.aria.co.uk/showthread.php/125719-Vram-stacking

And what settings are being used in those benchmarks you posted? Because every other benchmark that has been posted in this thread does not support your claim.

It doesn't contradict what I am saying. Every video card in SLI will use its own VRAM, so in overall, 6GB are used, not 3GB. Video Cards in SLI don't share VRAM among themselves, but still capable of accessing its own VRAM.

2-way SLI:

780 (1) uses 3Gb

780 (2) uses 3Gb

So you didn't read any of those threads. Nice.

In the case of GPU VRAM, 3+3=3. The same resources are stored in both GPUs' memory so they can alternate drawing the frames. 3GB of VRAM is the most one 780, or four 780s, will ever be able to use.

Avatar image for jhonMalcovich
#127 Edited by jhonMalcovich (6905 posts) -

@GoldenElementXL said:

@jhonMalcovich said:

@GoldenElementXL said:

@jhonMalcovich said:

I know that it's splitting, but still,in overall, it uses 6GB, not 3GB. And as I already showed several benchmarks, GTX780 SLI is doing 60fps in most games except a few very heavy ones, like Crysis 3 and Metro Last Light (30fps at 4k)

Um, no. It uses 3GB.

I think you need to read these.

http://www.tomshardware.com/answers/id-1868829/sli-increase-vram-usage.html

http://www.overclock.net/t/1365868/please-does-vram-stack-in-sli-crossfire-or-not

http://forums.aria.co.uk/showthread.php/125719-Vram-stacking

And what settings are being used in those benchmarks you posted? Because every other benchmark that has been posted in this thread does not support your claim.

It doesn't contradict what I am saying. Every video card in SLI will use its own VRAM, so in overall, 6GB are used, not 3GB. Video Cards in SLI don't share VRAM among themselves, but still capable of accessing its own VRAM.

2-way SLI:

780 (1) uses 3Gb

780 (2) uses 3Gb

So you didn't read any of those threads. Nice.

In the case of GPU Vram 3+3=3. The same resources are stored on both GPU's memory so they can alternate drawing the frames. 3 GB of vram will be the most 1 780, or 4 780's will ever be able to use.

I just don't know why we are discussing this. Video cards in SLI alternate with each other to render frames, which is why SLI setups usually almost double FPS in games. So technically, 6GB are used, 3GB per GPU, one frame per GPU.

GTX 780 SLI is capable of 4K at 45-60fps in MOST GAMES. Period. Post benchmarks where it isn't, if it's not Crysis 3 or Metro. I agreed from the start that SLI memory usage doesn't act as unified memory. So what's the point of discussing this? Saying that 780 SLI is not capable of 4K is factually wrong, and the benchmarks show that.

Avatar image for deactivated-583e460ca986b
#128 Edited by deactivated-583e460ca986b (7240 posts) -

@jhonMalcovich said:

@GoldenElementXL said:

So you didn't read any of those threads. Nice.

In the case of GPU Vram 3+3=3. The same resources are stored on both GPU's memory so they can alternate drawing the frames. 3 GB of vram will be the most 1 780, or 4 780's will ever be able to use.

I just don't know why are we discussing this. Video cards in sli alternate between each others to render frames. SO this is why sli setups usually almost double FPS in games. So technically, 6GB are used, 3GB per GPU, one frame per GPU.

GTX780 SLI is capable of 4k and 45-60fps in MOST GAMES. Period. Post me benchmarks when it doesn't, if it's not Crysis 3 or Metro.

We are still discussing this because you aren't getting something. I don't know if I can explain this to you a better way, but you are wrong about how VRAM is used in SLI. The same information is being stored in each GPU's VRAM. Then they alternate generating each frame. The computational power is what increases when you add another GPU, and is why you see a performance increase. But for this to happen, each GPU has to have the same information stored. So if you see GPU 1 using 2.2 GB of VRAM, you will see GPU 2 using the identical amount. That is because it has to mirror the information that is stored on GPU 1 in order to alternate frames in an efficient manner. Their VRAM is in no way used in an independent manner. EVER. So if you are using 3GB GTX 780s in SLI, the most VRAM that can ever be used is 3GB.

As for your benchmark request: here. And these benchmarks are for 780 Tis in SLI, so 780s would perform worse.

Avatar image for jhonMalcovich
#129 Edited by jhonMalcovich (6905 posts) -

@GoldenElementXL said:

@jhonMalcovich said:

@GoldenElementXL said:

So you didn't read any of those threads. Nice.

In the case of GPU Vram 3+3=3. The same resources are stored on both GPU's memory so they can alternate drawing the frames. 3 GB of vram will be the most 1 780, or 4 780's will ever be able to use.

I just don't know why are we discussing this. Video cards in sli alternate between each others to render frames. SO this is why sli setups usually almost double FPS in games. So technically, 6GB are used, 3GB per GPU, one frame per GPU.

GTX780 SLI is capable of 4k and 45-60fps in MOST GAMES. Period. Post me benchmarks when it doesn't, if it's not Crysis 3 or Metro.

We are still discussing this because you aren't getting something. I don't know if I can explain this to you a better way. But you are wrong about how VRAM is used in sli. The same information is being stored in each GPU's VRAM. Then they alternate generating each frame. The computational power is what is increasing when you add another GPU and is why you see a performance increase. But for this to happen each GPU has to have the same information stored. So if you see GPU 1 using 2.2 GB of VRAM you will see GPU 2 using the identical amount. That is because it has to mirror the information that is stored on GPU 1 in order to alternate frames in a efficient manner. Their VRAM is in no way used in a independent manner. EVER. So if you are using 3 GB GTX 780's in sli, the most VRAM that can ever be used is 3GB.

As for your benchmark request. Here. And these benchmarks are for 780 Ti's in sli so 780's would perform worse.

That benchmark is made at max settings. Turning off AA, which you don't need at 4K anyway, will provide a boost of 10fps or more. Juggling a few settings also helps. So I still say GTX 780 SLI is pretty viable for 4K ALREADY, for people like me who already own that GPU. I mean, what do I lose in going SLI? Nothing. And the difference between the 780 and 780 Ti at 4K is marginal, 1-4fps.

And by the way, I am not really sure that memory size matters that much in SLI if GTX 780 Ti SLI (3GB) is pretty much equal to Titan Black SLI (6GB) in benchmarks.

Avatar image for EducatingU_PCMR
#130 Posted by EducatingU_PCMR (1463 posts) -

The buffer is mirrored; it's the same info in GPU 1's VRAM and GPU 2's VRAM, so effectively 3GB of VRAM for game assets.

3GB won't be future proof for 4K, even without AA.
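
A toy illustration of the point, purely as a sketch; the numbers are made up and the scaling factor is an idealization (real SLI scaling is less than linear):

def usable_vram_gb(cards_gb):
    # Alternate-frame rendering mirrors the working set on every card,
    # so the usable pool is the per-card amount, not the sum.
    return min(cards_gb)

def ideal_fps_scaling(cards_gb):
    # Throughput is what (roughly) scales with the number of GPUs.
    return len(cards_gb)

sli_780s = [3, 3]                                        # two 3 GB GTX 780s
print("usable VRAM:", usable_vram_gb(sli_780s), "GB")    # 3, not 6
print("ideal frame-rate scaling: ~", ideal_fps_scaling(sli_780s), "x")

In other words, adding a second card buys compute, not memory headroom.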

Avatar image for deactivated-583e460ca986b
#131 Edited by deactivated-583e460ca986b (7240 posts) -

@jhonMalcovich said:

@GoldenElementXL said:

We are still discussing this because you aren't getting something. I don't know if I can explain this to you a better way. But you are wrong about how VRAM is used in sli. The same information is being stored in each GPU's VRAM. Then they alternate generating each frame. The computational power is what is increasing when you add another GPU and is why you see a performance increase. But for this to happen each GPU has to have the same information stored. So if you see GPU 1 using 2.2 GB of VRAM you will see GPU 2 using the identical amount. That is because it has to mirror the information that is stored on GPU 1 in order to alternate frames in a efficient manner. Their VRAM is in no way used in a independent manner. EVER. So if you are using 3 GB GTX 780's in sli, the most VRAM that can ever be used is 3GB.

As for your benchmark request. Here. And these benchmarks are for 780 Ti's in sli so 780's would perform worse.

That benchmark is made at max settings. Turning off AA, which you don't need at 4k anyway, will provide a boost of 10fps or more. Jugling with a few setting also helps. So I still say GTX780 SLI is pretty viable for 4k ALREADY, for the people who already own that GPU like me. I mean what I lose in making a sli, nothing. And difference between 780 and 780it at 4k is marginal, 1-4fps.

And by the way, I am not really sure if memory size matters that much as a sli if GTX780Ti SLI (3gb) is pretty equal to Titan Black Sli (6GB) in benchmarks.

Using your benchmark source (with custom settings at 4K) the difference between the 780 and the 780ti is more like 7-8 frames but I get your point. The difference isn't huge.

The way games are made right now allows the 780 Ti to be close to (and sometimes better than) the Titan Black in benchmarks. But you will start to see PC games that use more VRAM, which will start to hurt the 780 Ti's 3GB of VRAM. I know, I went through this same thing with my old build with a GTX 690. I was told time and time again the card was overkill for any game, even at 1440p. Then games like Titanfall, Call of Duty and Watch Dogs started asking for 3 GB of VRAM at 1080p, so I had to start turning things down that I really didn't want to. Optimized or not, this is the future. So to avoid this, I went with Titan Blacks instead of 780 Tis. Overkill? Nope. I still can't max some games at 1440p, and 60fps is very important imo. And you can't just turn off AA at 1440p, in my experience. I can't say either way for 4K since I have never played at that resolution, but it makes sense that it would need very little, if any.

If you are ready to jump into 4K then go ahead and let us know what you think. But for what I want out of PC gaming, I don't think we are there yet. Maybe I'll rethink this when my upgrade is done. We'll see I guess.

Avatar image for mgools
#132 Edited by mgools (1170 posts) -

@Cranler said:

@mgools said:

@Cranler said:

4k tv's have really dropped in price. A 55" 120 hz 1080p lcd in 2010 was about the same price as a 55" 4k is now.

In another 3 or 4 years 1080p tv's will be pretty much phased out.

No need for 4K on the TV side. It is actually not a good thing right now. Broadcast TV is 720P and 1080i (and no plans to change this that I have heard of). Blu-ray is 1080P. A few videos from Youtube are 4K. What you end up with is video running at non native resolution which causes a soft image. (Same will go for Xbox One and PS4 games as they are upscaled)

Also what we run into at this point is a bandwidth and compression issue. Actually getting these larger bitrates to the home in good video quality is a task. 4K right now is more of a marketing thing then a usable thing.

On the PC gaming side though, if games can support it, and send it out natively and your pc can push it then things will look good. At some point there is a law of diminishing returns with resolution.

It's never about what consumers need. At this point most tv buyers are getting 1080p tv's to watch 720p tv content.

TV manufacturers and stores will always make the most room for the expensive tv's. You think Best Buy and Samsung will be content selling 65" 1080p tv's for $300 3 years from now? You don't think they're going to continue pushing all the latest tech like they always have?

Give me 1080p OLED instead of staying with a lesser tech like LCD (with LED backlighting) or plasma, even if the lesser tech is 4K. I was at a store the other day where they had a 1080p OLED right next to the 4K TV display, and the OLED was so much better quality.

Avatar image for Cranler
#133 Edited by Cranler (8809 posts) -

@mgools said:

@Cranler said:

@mgools said:

@Cranler said:

4k tv's have really dropped in price. A 55" 120 hz 1080p lcd in 2010 was about the same price as a 55" 4k is now.

In another 3 or 4 years 1080p tv's will be pretty much phased out.

No need for 4K on the TV side. It is actually not a good thing right now. Broadcast TV is 720P and 1080i (and no plans to change this that I have heard of). Blu-ray is 1080P. A few videos from Youtube are 4K. What you end up with is video running at non native resolution which causes a soft image. (Same will go for Xbox One and PS4 games as they are upscaled)

Also what we run into at this point is a bandwidth and compression issue. Actually getting these larger bitrates to the home in good video quality is a task. 4K right now is more of a marketing thing then a usable thing.

On the PC gaming side though, if games can support it, and send it out natively and your pc can push it then things will look good. At some point there is a law of diminishing returns with resolution.

It's never about what consumers need. At this point most tv buyers are getting 1080p tv's to watch 720p tv content.

TV manufacturers and stores will always make the most room for the expensive tv's. You think Best Buy and Samsung will be content selling 65" 1080p tv's for $300 3 years from now? You don't think they're going to continue pushing all the latest tech like they always have?

Give me 1080P OLED instead of staying with a lesser tech in LCD (with LED backlighting) or plasma. Even if it is 4K. Was at a store the other day where they had an OLED 1080P right next to the 4K TV display, and the OLED was so much better quality.

OLED 4K would be the ideal. OLEDs seem to be having a lot of production issues, resulting in a very small selection, and no OLED gaming monitors are available.

Avatar image for GTSaiyanjin2
#134 Edited by GTSaiyanjin2 (6017 posts) -

I've seen a few 4K monitor reviews, and the input lag was horrendous... Off the top of my head, the lag was 90ms, which is more than five frames of lag at 60fps. That would make almost any game unplayable. I'll wait to see more 4K monitors designed for gamers before I even think about getting one.
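
For context, here's a quick conversion from a lag figure to "frames behind" at a given refresh rate; 90 ms is the number mentioned above, and the refresh rates are just examples:

def frames_of_lag(lag_ms, refresh_hz):
    # how many refresh intervals fit inside the display's input lag
    return lag_ms / (1000.0 / refresh_hz)

for hz in (30, 60):
    print(f"90 ms at {hz} Hz = {frames_of_lag(90, hz):.1f} frames behind")

At 60 Hz that works out to roughly 5.4 frames of delay, and even at 30 Hz it's close to 3.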

Avatar image for mdign
#135 Edited by MDIGN (30 posts) -

@KHAndAnime said:

@mdign said:

@scatteh316 said:

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

...I'm sorry, but do you know what you're talking about? That makes almost no logical sense.

You need more video memory to max out games on a GTX 780, what makes no logical sense about that?

@topgunmv said:

@mdign said:

@scatteh316 said:

@jhonMalcovich said:

@clyde46:

Actually, not true. I made a research. GTX Titan Z (3000$), AMD R9 295 X2 (1500 USD), GTX 780 Sli (1100 USD) can do both 4k and 60fps in most games.

Later this year, I will buy a second GTX780 adn will be able to make a sli and play games at 4k, 50-60fps :D

So yeh, by making a SLI 4k is pretty afordable considering that is quite recent feature. In 3-4 years, a sinlge high-end gpu will be able to do 4k and 60fps without any hitch.

You don't have the memory to max out games at 4k on GTX 780's.

...I'm sorry, but do you know what you're talking about? That makes almost no logical sense.

Probably referring to video memory. Nvidia likes to cheap out on ram, so I'm going to guess that the 780 only has 3 gigs.

I'm running a GPU with only 2GB of VRAM and have no problem maxing games at 4K. However, this is an AMD GPU, so that may have something to do with it. You can still max games at 4K with 3GB though; the main thing is the GPU itself. For some reason people have gotten the misconception that having insane amounts of RAM on the graphics card will magically improve performance at 4K. Yes, it helps, but it's far from the deciding factor.

Avatar image for deactivated-583e460ca986b
#136 Posted by deactivated-583e460ca986b (7240 posts) -

@mdign: OK what do you mean by "maxing"? What frame rate are you playing at? And what games are these? Solitaire and Mine Sweeper?

Avatar image for illage2
#137 Edited by illage2 (642 posts) -

Who says we can't have 4K and 60fps? SLI two GTX Titan Blacks with an SSD, the fastest multicore processor on the market, and 32GB of RAM. 4K & 60fps, easily.

Avatar image for bezza2011
#138 Posted by bezza2011 (2729 posts) -

Yes, you can game at 4K at 60fps, or at least just under it. You do have to buy a rig which would cost you $3000, but it can be done :S and it's well worth it if you want to see the future now. But maybe a $300 rig made at the beginning of next year, once the new hardware comes out, could work too; as long as you're smart in your build, you'll have a PC which will last a good few years.

Avatar image for Zelda187
#139 Edited by Zelda187 (1047 posts) -

@KHAndAnime said:

@Cranler said:

@KHAndAnime said:

I think I'll stick with 1080p/1200p, thanks. 1080P is crisp and jaggy free with any AA - so there's zero reason to upgrade (ever). I can't see a point 4k will ever be the standard - there's absolutely no reason for it. It's upping the resolution just for the sake - which might seem cool to people who don't understand resolution is low on the list of what makes good IQ, but it's not so useful for everyone else.

4K will become standard eventually. Manufacturers like to phase out the old cheap stuff which is what 1080p will be soon.

No, not for a very, very long time. There's no 4k-native content, none on the horizon, and there's no reason for it to even exist. People aren't going to randomly upgrade to 4k monitors even if they were more common and cheap - it takes serious hardware to run games at 4k resolution, nobody will want to pay tons of extra cash for a barely noticeable boost in sharpness. It's a niche for enthusiasts.

This

4K at this stage is an unnecessary luxury for people who have money to blow, absolutely no life outside of playing videogames or...both.

It would cost you at least $4-5 grand right now to buy a 4K monitor and put together a rig capable of sustaining adequate frame rates at that resolution. And like you said, for what? A minor, barely noticeable increase in sharpness? No human being with the slightest bit of common sense or without a job paying in excess of $100,000 a year would make the jump to 4K at this stage. It's utterly pointless.

Avatar image for faizan_faizan
#140 Posted by faizan_faizan (7869 posts) -

@Cranler said:

One thing though is that some of the most demanding settings in Crysis have very little visual impact. You can drop down a couple settings, save 30 fps and see very little difference.

I'm aware of that, but that's not the point.

@Couth_ said:

@faizan_faizan said:

He's not wrong.

I'm specifially talking about no AA, also your second pic is 2560x1600 which is an even higher resolution than 1440p

Did you even read my post, or his? Did you even bother to look at the graphs?

Avatar image for -Unreal-
#141 Posted by -Unreal- (24650 posts) -

I'd rather have 1440p at 60-120 FPS.

Avatar image for AzatiS
#142 Edited by AzatiS (11219 posts) -

@RyviusARC said:

I wouldn't mind a frame rate lower than 60fps in most games.

Also keep in mind that even the highest resolution console games are only 1920x1080.

For comparison:

1920x1080 = 2073600 pixels

4k - 3840x2160 = 8294400 pixels

It's a 4x increase in pixels which requires about 4x the power from a GPU.

Even having the current hardware to run higher end games at high settings at 4k shows the power gap between PC and consoles.

Just imagine that gap in another 3-4 years when 4k is a lot easier to run.

Actually, it's way higher than 4x the power from current GPUs.

Avatar image for mdign
#143 Posted by MDIGN (30 posts) -

@GoldenElementXL:

I'm going to assume that you're trolling, but I'll pretend you aren't and feed it anyway.

If you read my past post you'd see that I have played Injustice and some other games at 4k, and that I even linked screenshots. I get well above 60 FPS in Injustice, but the game is locked to 60fps. Inb4, "Dats not evn that hard too run dat game, play sumthing real like minecraft or minesweeper noob rofl" I also get 160-210fps in CS:GO and 100+fps in Skyrim max settings with 4k mods, Inb4 "Well Source games are like easy to run" I can do this over and over on different games, but I won't waste my time to prove facts to someone. Oh, and since you seem so inclined to mock me as if I'm a liar that doesn't know what he's talking about here's some pics (They're raw .bmps so they're too large to upload to GameSpot, or at least it acts up for me when I try) :
CS:GO 1

CS:GO 2

CS:GO 3

Injustice 1

Injustice 2

Injustice 3


These are all consumer 4K, which is slightly less than true 4K. All max settings with no AA, but full anisotropic filtering. As I said before, the only problem is my TV/monitor being stuck at 30 Hz for 4K content due to HDMI 1.4 restrictions.

Next question/statement is "Alright dude calm down, but how much did your rig cost? I bet it was like $2000+ right?"

Go ahead, ask.

Avatar image for deactivated-583e460ca986b
#144 Posted by deactivated-583e460ca986b (7240 posts) -

@mdign said:

@GoldenElementXL:

I'm going to assume that you're trolling, but I'll pretend you aren't and feed it anyway.

If you read my past post you'd see that I have played Injustice and some other games at 4k, and that I even linked screenshots. I get well above 60 FPS in Injustice, but the game is locked to 60fps. Inb4, "Dats not evn that hard too run dat game, play sumthing real like minecraft or minesweeper noob rofl" I also get 160-210fps in CS:GO and 100+fps in Skyrim max settings with 4k mods, Inb4 "Well Source games are like easy to run" I can do this over and over on different games, but I won't waste my time to prove facts to someone. Oh, and since you seem so inclined to mock me as if I'm a liar that doesn't know what he's talking about here's some pics (They're raw .bmps so they're too large to upload to GameSpot, or at least it acts up for me when I try) :

CS:GO 1

CS:GO 2

CS:GO 3

Injustice 1

Injustice 2

Injustice 3

These are all consumer 4K, which is slightly less than True 4K. All max settings with no AA, but full Anisotropic Filtering. As I said before, the only problem is my TV/monitor being stuck with 30hz for 4K content due to HDMI 1.4 resrictions.

Next question/statement is "Alright dude calm down, but how much did your rig cost? I bet it was like $2000+ right?"

Go ahead, ask.

I won't ask how much your rig costs because I have spent a pretty good amount on mine. (probably more than you did but who's keeping track..)

Now I won't say those games are easy to run. But let's see some screenshots with FRAPS running in games like BF4, Sleeping Dogs, Borderlands 2 or Bioshock Infinite. I won't even include Crysis 3 or any Metro games.


I am both not trolling and a pretty serious gamer. Both PC and console. If CS GO and Injustice are all you want to play at 4K then fine. But you will struggle with any multiplat title that releases this generation at 4K. Especially with 2 GB of VRAM.



Avatar image for mdign
#145 Edited by MDIGN (30 posts) -

@GoldenElementXL:

@GoldenElementXL said:

@mdign said:

I won't ask how much your rig costs because I have spent a pretty good amount on mine. (probably more than you did but who's keeping track..)

Now I wont say those games are easy to run. But lets see some screen shots with fraps running with games like BF4, Sleeping Dogs, Borderlands 2 or Bioshock Infinite. I won't even include Crysis 3 or any Metro games.

I am both not trolling and a pretty serious gamer. Both PC and console. If CS GO and Injustice are all you want to play at 4K then fine. But you will struggle with any multiplat title that releases this generation at 4K. Especially with 2 GB of VRAM.

*sigh* I've run the Battlefield: Hardline beta at 4K at 45fps in DirectX, and it boosts all the way up to 64fps with Mantle. I'll tell you the same thing I've told wannabe gamers who stick 16GB of RAM in their system thinking it'll have some massive effect on game performance over 8GB: most games won't use over 2GB of VRAM; the ones that do are games like Battlefield 4, where the game is optimized to use any and all VRAM available. Let's pretend I did run into some wall with a game: I haven't even overclocked my cards yet. I can ramp the GPU clocks a bit and call it a day. "Well, what if that isn't enough?" Then I can move the settings from 'ULTRA' to 'VERY HIGH' and enjoy a game with almost no difference in the graphics at 60fps. Seriously, most settings in Crysis 3 are resource hogs with minimal graphical benefit.

Video Card Performance: 2GB vs 4GB Memory

Also, I already told you twice I play way more than CS:GO and Injustice. I listed Skyrim as well. Not to mention the 1000+ PS2, Wii, GameCube and other games that I play at 4K. (It's really an awesome experience playing your favorite classics in upscaled or native 4K. If you can, you should try it.) I also play Divinity: Original Sin, FFXIV, Diablo III (which has a lot of problems like constant crashing, blue screens, and overheating), Dark Souls II, Arma III, etc. New games aren't even that demanding. It's only once or twice a year that we get a game worth using as a benchmark.

Do you really want me to go through the trouble of downloading and/or buying those games just to prove you wrong... again? I mean, let's be rational here: I'm only here to provide 'proof of concept' that 4K gaming is already here and playable. I'm not here to brag about my uber-powerful rig and make you bow before me or whatever crazy elitists do. I'm not saying my rig is a permanent solution either. "Well, your system won't be able to max out next year's games at 4K." I hope so; by that time we should have better cards than mine along with better TVs and monitors. However, if you really want me to do this, fine.

Also, it doesn't matter how much you've spent on your rig. Let's not turn this into an e-peen slinging competition, because if that's how you want to play it I've already won considering your rig can't touch 4K apparently. Performance > Price Always.

Anyway, enough of my 'opinions'; here's a video. You can skip to about 6:00 if you want. (I don't watch his vids, but he's just here to prove a point.)


Avatar image for RyviusARC
#146 Posted by RyviusARC (5470 posts) -

@AzatiS said:

@RyviusARC said:

I wouldn't mind a frame rate lower than 60fps in most games.

Also keep in mind that even the highest resolution console games are only 1920x1080.

For comparison:

1920x1080 = 2073600 pixels

4k - 3840x2160 = 8294400 pixels

It's a 4x increase in pixels which requires about 4x the power from a GPU.

Even having the current hardware to run higher end games at high settings at 4k shows the power gap between PC and consoles.

Just imagine that gap in another 3-4 years when 4k is a lot easier to run.

Actually is way higher than 4x power from current GPUs

How so?

Avatar image for deactivated-583e460ca986b
#147 Edited by deactivated-583e460ca986b (7240 posts) -

@mdign:

OK, let's go.

First off. My rig.

Second. Vram usage benchmarks.

And I said Battlefield 4 not 3.

And now your "maxed" claim has turned into "very high."

Nice!!

Avatar image for mdign
#148 Edited by MDIGN (30 posts) -

@GoldenElementXL said:

@mdign:

OK lets go.

First off. My rig.

Second. Vram usage benchmarks.

And I said Battlefield 4 not 3.

And now your "maxed" claim has turned into "very high."

Nice!!

Lmao, wow. I'm done with you troll/elitist whatever you want to call yourself. I've already told you what this was about and you've turned it around into some PC rig e-peen measuring contest.

Second. If you actually read or even remotely knew anything about programming and hardware you'd know this:"Most games won't use over 2GBs of VRAM, and the ones that do are games like Battlefield 4 where the game is optimized to use any and all VRAM available as it sees fit." Wow, wait that's an exact quote from me that you didn't read.

Third. Those aren't my benchmarks; I just grabbed them to show you that RAM =/= performance, AKA the whole reason you started this debate.

You didn't bother reading anything or using logic, therefore I won't entertain this garbage.

Also, tell me again how I turned "maxed" into "very high"? I want the exact set of quotes too.

Avatar image for Evo_nine
#149 Edited by Evo_nine (2161 posts) -

You can't game at 4K/60fps on TVs anyway until they start bringing out GPUs with HDMI 2.0.

Until then it's 4K/30fps.

1080p/60fps is enough for me for the time being.

Avatar image for deactivated-583e460ca986b
#150 Edited by deactivated-583e460ca986b (7240 posts) -

@mdign said:

@GoldenElementXL said:

@mdign:

OK lets go.

First off. My rig.

Second. Vram usage benchmarks.

And I said Battlefield 4 not 3.

And now your "maxed" claim has turned into "very high."

Nice!!

Lmao, wow. I'm done with you troll/elitist whatever you want to call yourself. I've already told you what this was about and you've turned it around into some PC rig e-peen measuring contest.

Second. If you actually read or even remotely knew anything about programming and hardware you'd know this:"Most games won't use over 2GBs of VRAM, and the ones that do are games like Battlefield 4 where the game is optimized to use any and all VRAM available as it sees fit." Wow, wait that's an exact quote from me that you didn't read.

Third. Those aren't my marks, I just grabbed them to show you that RAM =/= performance. AKA the whole reason you started this debate.

You didn't bother reading anything or using logic, therefore I won't entertain this garbage.

Also, tell me again how I turned maxed to very high? I want the exact set of quotes too.

You posted this in your last response to me. - "Then I can move the settings from 'ULTRA' to 'VERY HIGH' and enjoy a game with almost no difference in the graphics at 60fps."

And as for your "You've turned it around into some PC rig e-peen measuring contest."

You said this in your last post. - "because if that's how you want to play it I've already won considering your rig can't touch 4K"

Just scroll up if you want to see what you've said.

You call my own personal experience with PC gaming/benchmarks "garbage." To prove me wrong, go to the PC benchmark thread in the PC gaming discussion board. Run the bench and post your score. If you score higher than me, then you can claim you're right. Otherwise you are full of it. I have posted proof that VRAM is important when gaming in 4K. All you have done is post bullsh*t with nothing to back it up.

As for programming and hardware knowledge: let's just say you don't want to go there.