NVIDIA RTX 3000 series great price. But NVIDIA skimping on Video Memory?

Xtasy26

Poll NVIDIA RTX 3000 series great price. But NVIDIA skimping on Video Memory? (35 votes)

Yes. 51%
No. 46%

I was absolutely surprised by the price/performance of the new RTX 3000 series, only to be disappointed to find out that the RTX 3070 offers just 8GB and the RTX 3080 10GB. 8GB seems so 2016. I am looking to jump to 4K, and that's definitely not enough to future-proof it. As much as I like my GTX 1060 6GB, I can't max out DOOM Eternal even at 1080p: running it at Ultra Nightmare requires 6.7GB at 1080p, as stated by Tom's Hardware (Ultra Nightmare needs 6766 MiB at 1080p), so while the GTX 1060 can run it at decent frame rates, it lacks the memory to do so. The RX 580 8GB doesn't have this problem, which makes me think I should have gone AMD with 8GB of video memory. I went with the 1060 6GB because at the time it was widely available and nVidia had the best price/performance/power. I want to jump to 4K, but games are already pushing 11GB maxed out. Heck, some games are pushing 13GB maxed out at 4K.


I really want to get an RTX 3000 series card (above the 3070), but its memory is making me want to wait and see what comes from AMD. I don't want to get stuck in the same situation where I run out of video memory at 4K a couple of years down the road.

So, what do you all say: is nVidia skimping on video memory?


#1 hardwenzen  Online
Member since 2005 • 3582 Posts

Perfect for 1440p, so I am fine with the 8GB.


#2 rmpumper
Member since 2016 • 1142 Posts

Not so much skimping as waiting for AMD to reveal what they have been cooking up in order to pull out their 16GB 3070Ti at $599 and maybe a 20GB 3080Ti at $899.


#3 Random_Matt
Member since 2013 • 5588 Posts

It is not a great price; it is that way on purpose. FE cards will sell out in minutes and AIB prices will rise; they are already creeping up. I'm grabbing one as soon as I can, but Nvidia is conning each and every one of you.


#4 JasonOfA36  Online
Member since 2016 • 2325 Posts

It'd be fine. GDDR6 is fast enough that you won't see any huge problems with VRAM limitations. Also, the graph from the VII is a bit misleading: that's cached VRAM vs. utilized VRAM. Utilized VRAM at 4K in recent AAA titles hits about 8-9GB, and if you have more VRAM, the game uses the rest as cache.
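The allocated-vs.-utilized distinction several posts here circle around can be shown with a toy model. Everything in it is hypothetical, purely for illustration (the 90% caching rule and the gigabyte figures are made up; real engines behave differently):

```python
# Toy model of "allocated vs. utilized" VRAM.
# A greedy engine fills spare VRAM with cached textures it rarely touches;
# only the per-frame working set is what the game actively uses.

def vram_report(card_capacity_gb, working_set_gb, cache_greedy=True):
    """Return (allocated, utilized) VRAM in GB for a hypothetical game."""
    utilized = working_set_gb
    if cache_greedy:
        # Greedy engines claim most of the card as a texture cache.
        allocated = min(card_capacity_gb * 0.9, working_set_gb * 2)
    else:
        allocated = working_set_gb
    return round(allocated, 1), round(utilized, 1)

# Same hypothetical game (8.5GB working set at 4K) on a 16GB and a 10GB card:
print(vram_report(16, 8.5))  # → (14.4, 8.5): allocation balloons on the big card
print(vram_report(10, 8.5))  # → (9.0, 8.5): utilization stays the same
```

The point of the sketch: the "allocated" number a monitoring overlay reports scales with card capacity, while the working set the game actually touches per frame does not.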


#5 DragonfireXZ95
Member since 2005 • 26134 Posts

@Random_Matt said:

It is not a great price; it is that way on purpose. FE cards will sell out in minutes and AIB prices will rise; they are already creeping up. I'm grabbing one as soon as I can, but Nvidia is conning each and every one of you.

You mean kind of like how Sony said they'd be down 4 million Playstation 5 consoles at launch? Because they are conning everyone and waiting for the potential scalping?


#6 Eoten
Member since 2020 • 1035 Posts

@DragonfireXZ95 said:
@Random_Matt said:

It is not a great price; it is that way on purpose. FE cards will sell out in minutes and AIB prices will rise; they are already creeping up. I'm grabbing one as soon as I can, but Nvidia is conning each and every one of you.

You mean kind of like how Sony said they'd be down 4 million Playstation 5 consoles at launch? Because they are conning everyone and waiting for the potential scalping?

Is Sony getting on board with scalping? Nintendo seems to be enabling them on purpose as well.


#7 DragonfireXZ95
Member since 2005 • 26134 Posts

@eoten said:
@DragonfireXZ95 said:
@Random_Matt said:

It is not a great price; it is that way on purpose. FE cards will sell out in minutes and AIB prices will rise; they are already creeping up. I'm grabbing one as soon as I can, but Nvidia is conning each and every one of you.

You mean kind of like how Sony said they'd be down 4 million Playstation 5 consoles at launch? Because they are conning everyone and waiting for the potential scalping?

Is Sony getting on board with scalping? Nintendo seems to be enabling them on purpose as well.

I don't know. I'm just saying that because it's a stupid conspiracy theory and it sounds stupid to say.


#8 ButDuuude
Member since 2013 • 1701 Posts

@DragonfireXZ95: Sony said that’s not true.

https://www.digitaltrends.com/gaming/sony-denies-ps5-production-problems/?amp


#9 MirkoS77
Member since 2011 • 15503 Posts

Why I’m dropping a hefty dime on the 3090.


#10 WESTBLADE85
Member since 2013 • 237 Posts

LOL @ AMD's Radeon division and their downright misleading marketing. Really sad that people still fall for it.

To cut it short: the RTX 2080 (vanilla) outperforms it at 4K most of the time despite having "only" 8GB of GDDR6 VRAM.

-------------------------

RTX 3070 - Basically the new 1440p "sweetspot", so... no?
RTX 3080 - Will tell you if I run into any issues... :P


#11 R4gn4r0k
Member since 2004 • 35180 Posts

Imagine spending $700+ on one part of your PC and not being able to run all games at 4K because Nvidia cheaped out on memory.

Hope this is not 970 bullshit all over again, Nvidia


#12  Edited By BassMan
Member since 2002 • 12569 Posts

I never had an issue with VRAM at 4K on the 2080 Ti. The next-gen consoles will also not be exceeding 10GB VRAM. So, I am not worried about it for the 3080. If it becomes an issue, I will lower settings as needed. I will most likely be upgrading again with the 4000 series anyway.


#13 DragonfireXZ95
Member since 2005 • 26134 Posts

@ButDuuude said:

@DragonfireXZ95: Sony said that’s not true.

https://www.digitaltrends.com/gaming/sony-denies-ps5-production-problems/?amp

Read my other comment below that.


#14  Edited By Vaasman
Member since 2008 • 14514 Posts

This is like arguing that all that matters about a CPU is the core clock speed.


#15 PC_Rocks
Member since 2018 • 4412 Posts

How much graphics memory do the next-gen consoles have?


#17 IvanGrozny
Member since 2015 • 1262 Posts

It's more than enough for 4K gaming. The next memory-size jump will come when 8K becomes the standard.


#18 DaVillain-  Moderator
Member since 2014 • 43694 Posts

@BassMan said:

I never had an issue with VRAM at 4K on the 2080 Ti. The next-gen consoles will also not be exceeding 10GB VRAM. So, I am not worried about it for the 3080. If it becomes an issue, I will lower settings as needed. I will most likely be upgrading again with the 4000 series anyway.

For 4K, yes, but the way I see it, the 3080 is still the 1440p card to go for, as that's its primary target. It's going to demolish games at 1440p ultra this gen if you ask me, so I've concluded I should just get a 3080 and do myself and my PC a favor. The 3070 isn't all it seems after looking at the benchmark leaks.


#19 GoldenElementXL
Member since 2016 • 4767 Posts

VRAM usage has been greatly exaggerated. Once we learned those MSI programs were showing “allocated” VRAM and not actual used VRAM, people started digging further. 10GB of VRAM is plenty with the speed of VRAM these days. ESPECIALLY if you’re using DLSS.


#20 Messiahbolical-
Member since 2009 • 5660 Posts

The price is trash. It's sad to see how well Nvidia's tactic worked of blowing the price up to outrageous levels then dropping them back down to still very high levels to make people think they're "cheap". Even though you're paying $700 for just a graphics card, not to mention the 2nd best(soon to be 3rd best when the Ti version comes out) Nvidia card when like 5 years ago the top tier Nvidia card was like $500.

Meanwhile in just a few months we got new consoles coming out for likely $400-500 that include a whole mid range GPU, 8 core Zen2 CPU, 1tb SSD(ps5's is state of the art), motherboard, 4k Blu Ray Player, fast af RAM, cooling, PSU, case, wifi card, controller... look at the value difference. Shit, I don't even expect the GPU in the consoles to be that much weaker than a RTX 2070 either, which is already $500 alone.

As much as I love PC gaming, it's crazy how PC gamers are such pushovers with their wallets in general. Companies could charge whatever they want and I guarantee people will still buy it. I see people spending $130 for a 2-pack of Corsair case fans. Not even big fans, just 120mm. Shit you can buy a whole nice ass ceiling fan from a reputable company for that much money. Not to mention the fact that they charge full price for graphics cards pretty much until the new ones launch is fucked. No reason people should have been still paying $1200 for a nearly 2 year old 2080Ti until now. Basically just rips off people who aren't tech nerds and don't know about the new cards coming out. IMO with the rapid release cycle of these kinds of products, prices should drop steadily little by little after 6-8 months of a GPU/CPU being out.


#21  Edited By pelvist
Member since 2010 • 7922 Posts

I'm ordering a 3080 as soon as the website lets me; I want to get one before there is any reason for prices to get inflated. I still need a new power supply too, but the prices of those are through the roof at the minute. I wish I'd bought one sooner, as now I don't know whether I should wait for prices to go down or just buy one now. I don't want to risk making that mistake with the 3080, as it's a lot more expensive than a PSU.


#22 Eoten
Member since 2020 • 1035 Posts

@Messiahbolical- said:

The price is trash. It's sad to see how well Nvidia's tactic worked of blowing the price up to outrageous levels then dropping them back down to still very high levels to make people think they're "cheap". Even though you're paying $700 for just a graphics card, not to mention the 2nd best(soon to be 3rd best when the Ti version comes out) Nvidia card when like 5 years ago the top tier Nvidia card was like $500.

Meanwhile in just a few months we got new consoles coming out for likely $400-500 that include a whole mid range GPU, 8 core Zen2 CPU, 1tb SSD(ps5's is state of the art), motherboard, 4k Blu Ray Player, fast af RAM, cooling, PSU, case, wifi card, controller... look at the value difference. Shit, I don't even expect the GPU in the consoles to be that much weaker than a RTX 2070 either, which is already $500 alone.

As much as I love PC gaming, it's crazy how PC gamers are such pushovers with their wallets in general. Companies could charge whatever they want and I guarantee people will still buy it. I see people spending $130 for a 2-pack of Corsair case fans. Not even big fans, just 120mm. Shit you can buy a whole nice ass ceiling fan from a reputable company for that much money. Not to mention the fact that they charge full price for graphics cards pretty much until the new ones launch is fucked. No reason people should have been still paying $1200 for a nearly 2 year old 2080Ti until now. Basically just rips off people who aren't tech nerds and don't know about the new cards coming out. IMO with the rapid release cycle of these kinds of products, prices should drop steadily little by little after 6-8 months of a GPU/CPU being out.

But here I am in a position where I can either buy the next Xbox console for $499, or wait until the next AMD GPU comes out later this year. The rest of my PC still outperforms anything the consoles offer, as will the GPU. I'll have more than the "pro" console revision that will no doubt come out in a few years, and I'll have it now. And to catch up to where this PC will be with that GPU upgrade, you'll have to buy ANOTHER Xbox (that Pro version) later at an additional $499.

So let's say I buy a GPU for $499, a motherboard for $75, a CPU for $200, RAM for $100, a 1TB SSD for $100, and an aftermarket cooler for $25. I can keep reusing my case, its fans, and its power supply for years to come. So even if Xbox takes a loss on hardware, you're still going to pay as much over the life of the console, and have a lesser system in the end than what I start out with. My current case was $50, the PSU was $75, and I added a 5-pack of 140mm fans for another $25 or so.


#23 lhughey
Member since 2006 • 4695 Posts

@rmpumper said:

Not so much skimping as waiting for AMD to reveal what they have been cooking up in order to pull out their 16GB 3070Ti at $599 and maybe a 20GB 3080Ti at $899.

I think you're right about them introducing Ti edition cards, but there's a big difference between 10GB and 16GB, which shows that 10GB is a little low. I think 12GB and 16GB would have been a better release for the 3070 and 3080, especially at these prices.


#24 DaVillain-  Moderator
Member since 2014 • 43694 Posts

@Messiahbolical- said:

The price is trash. It's sad to see how well Nvidia's tactic worked of blowing the price up to outrageous levels then dropping them back down to still very high levels to make people think they're "cheap". Even though you're paying $700 for just a graphics card, not to mention the 2nd best(soon to be 3rd best when the Ti version comes out) Nvidia card when like 5 years ago the top tier Nvidia card was like $500.

Meanwhile in just a few months we got new consoles coming out for likely $400-500 that include a whole mid range GPU, 8 core Zen2 CPU, 1tb SSD(ps5's is state of the art), motherboard, 4k Blu Ray Player, fast af RAM, cooling, PSU, case, wifi card, controller... look at the value difference. Shit, I don't even expect the GPU in the consoles to be that much weaker than a RTX 2070 either, which is already $500 alone.

As much as I love PC gaming, it's crazy how PC gamers are such pushovers with their wallets in general. Companies could charge whatever they want and I guarantee people will still buy it. I see people spending $130 for a 2-pack of Corsair case fans. Not even big fans, just 120mm. Shit you can buy a whole nice ass ceiling fan from a reputable company for that much money. Not to mention the fact that they charge full price for graphics cards pretty much until the new ones launch is fucked. No reason people should have been still paying $1200 for a nearly 2 year old 2080Ti until now. Basically just rips off people who aren't tech nerds and don't know about the new cards coming out. IMO with the rapid release cycle of these kinds of products, prices should drop steadily little by little after 6-8 months of a GPU/CPU being out.

PC gamers need to remember you only upgrade when you really need or want to; money will always be a factor, and most importantly, you get what you pay for. There's a reason AMD and Nvidia both exist: to give gamers options, whether you want the best of the best or something affordable that sacrifices raw GPU power. Nvidia, for their part, learned their lesson this time around with the prices; the RTX 3080 is priced fairly for what you're getting, and there's always the 3070 at a reasonable price as well.

Still, if you have a nice PC, I don't see any reason to buy an Xbox Series X at all now that MS supports PC; the only thing I see myself buying is a PS5 for exclusives, though I'll always buy Nintendo consoles for their exclusives. Buying the top-of-the-line GPU should only be for 4K/60fps+, and that's why Nvidia charges more. Do you need 4K? That's the question. I don't game in 4K, not interested, and 1440p is still the GOAT.


#25 enzyme36
Member since 2007 • 4570 Posts

Yes AMD card looks amazing... everyone skip on the 3080 and wait for the AMD conference. Stay off your PCs tomorrow morning


#26 Zaryia  Online
Member since 2016 • 13575 Posts

@enzyme36 said:

Yes AMD card looks amazing... everyone skip on the 3080 and wait for the AMD conference. Stay off your PCs tomorrow morning

Stay out of lines too. Covid-19 warning.


#27 Pedro
Member since 2002 • 42984 Posts

I rarely take advantage of my 16GB of VRAM. People just like seeing higher numbers regardless of real world utilization.


#28  Edited By enzyme36
Member since 2007 • 4570 Posts

@zaryia said:
@enzyme36 said:

Yes AMD card looks amazing... everyone skip on the 3080 and wait for the AMD conference. Stay off your PCs tomorrow morning

Stay out of lines too. Covid-19 warning.

Good advice... safety 1st



#30 Zero_epyon
Member since 2004 • 15105 Posts

Those 10GB of RAM will probably be way faster than the 16GB in the AMD cards. And with DirectStorage, games won't need to fill the GPU like that, as assets can be streamed in from super-fast SSDs.


#31 dxmcat
Member since 2007 • 3373 Posts

Doom? LOL oh plz.

Try FFXV with 4k assets, that shit eats up 10GB of my 1080ti.


#32 04dcarraher
Member since 2004 • 23533 Posts

@dxmcat said:

Doom? LOL oh plz.

Try FFXV with 4k assets, that shit eats up 10GB of my 1080ti.

It doesn't actually use it. The game sees no real performance gain past 6GB of VRAM usage; it's just cramming as much as it can into the VRAM. At 4K with the texture pack it only needs 8GB.


#33 dxmcat
Member since 2007 • 3373 Posts

I know the difference between committed and used. gtfo.


#34 Pedro
Member since 2002 • 42984 Posts

@Zero_epyon said:

Those 10GB of RAM will probably be way faster than the 16GB in the AMD cards. And with DirectStorage, games won't need to fill the GPU like that, as assets can be streamed in from super-fast SSDs.

Really? How much faster?


#35 DragonfireXZ95
Member since 2005 • 26134 Posts

@enzyme36 said:
@zaryia said:
@enzyme36 said:

Yes AMD card looks amazing... everyone skip on the 3080 and wait for the AMD conference. Stay off your PCs tomorrow morning

Stay out of lines too. Covid-19 warning.

Good advice... safety 1st

Lol, you're just trying to make sure you get a 3080. 😂


#36 Zero_epyon
Member since 2004 • 15105 Posts

@Pedro said:
@Zero_epyon said:

Those 10GB of RAM will probably be way faster than the 16GB in the AMD cards. And with DirectStorage, games won't need to fill the GPU like that, as assets can be streamed in from super-fast SSDs.

Really? How much faster?

I don't know since we don't know all that much about the new AMD cards. That's why I said probably.


#37 Xtasy26
Member since 2008 • 5412 Posts

@Zero_epyon said:

Those 10GB of RAM will probably be way faster than the 16GB in the AMD cards. And with DirectStorage, games won't need to fill the GPU like that, as assets can be streamed in from super-fast SSDs.

That's a lot of assumption. That's like saying the Fury X with its faster HBM memory was equivalent to the 980 Ti's 6GB of RAM. For a time it may have been, while AMD was optimizing their drivers to make use of HBM. But what about down the road, like in 2 years' time, when more and more games come out, especially with the newer consoles having more VRAM? I wouldn't be surprised if newer games easily push past 10GB at 4K, when we already have games pushing 10GB at 4K.

My 1060 6GB is getting pushed in certain games even at 1080p, like the above-mentioned Doom Eternal, where you can't even choose Ultra Nightmare settings because of the 6GB frame buffer, even though I'm pretty sure the 1060 has enough horsepower to run it.


#38 Xtasy26
Member since 2008 • 5412 Posts
@Messiahbolical- said:

The price is trash. It's sad to see how well Nvidia's tactic worked of blowing the price up to outrageous levels then dropping them back down to still very high levels to make people think they're "cheap". Even though you're paying $700 for just a graphics card, not to mention the 2nd best(soon to be 3rd best when the Ti version comes out) Nvidia card when like 5 years ago the top tier Nvidia card was like $500.

Meanwhile in just a few months we got new consoles coming out for likely $400-500 that include a whole mid range GPU, 8 core Zen2 CPU, 1tb SSD(ps5's is state of the art), motherboard, 4k Blu Ray Player, fast af RAM, cooling, PSU, case, wifi card, controller... look at the value difference. Shit, I don't even expect the GPU in the consoles to be that much weaker than a RTX 2070 either, which is already $500 alone.

As much as I love PC gaming, it's crazy how PC gamers are such pushovers with their wallets in general. Companies could charge whatever they want and I guarantee people will still buy it. I see people spending $130 for a 2-pack of Corsair case fans. Not even big fans, just 120mm. Shit you can buy a whole nice ass ceiling fan from a reputable company for that much money. Not to mention the fact that they charge full price for graphics cards pretty much until the new ones launch is fucked. No reason people should have been still paying $1200 for a nearly 2 year old 2080Ti until now. Basically just rips off people who aren't tech nerds and don't know about the new cards coming out. IMO with the rapid release cycle of these kinds of products, prices should drop steadily little by little after 6-8 months of a GPU/CPU being out.

That's a fair argument. But I think we should give credit where credit is due: the 3070 being faster than a 2080 Ti that cost over double the price is nothing to scoff at. It does seem like nVidia is taking PC gamers for suckers when the 970 launched at $330 and the 1070 launched at $379.99. A lot of the blame can be put on AMD not providing competition at the high end. For consoles, you have to consider that they are sold at a loss.

@jasonofa36 said:

It'd be fine. GDDR6 is fast enough that you won't see any huge problems with VRAM limitations. Also, the graph from the VII is a bit misleading: that's cached VRAM vs. utilized VRAM. Utilized VRAM at 4K in recent AAA titles hits about 8-9GB, and if you have more VRAM, the game uses the rest as cache.

That's not entirely accurate. In certain games you can see the utilized VRAM fill the frame buffer in particular scenes, hence the stutter; I've seen that with games before, and it shows up as a higher incidence of poor 1% low frame rates. It may not seem that bad now, but it will matter as more and more games utilize more VRAM, especially newer games several years down the road.

@BassMan said:

I never had an issue with VRAM at 4K on the 2080 Ti. The next-gen consoles will also not be exceeding 10GB VRAM. So, I am not worried about it for the 3080. If it becomes an issue, I will lower settings as needed. I will most likely be upgrading again with the 4000 series anyway.

Why would I "lower" settings if I am going to throw that much cash at a GPU? I would expect it to stay relevant for at least 2-3 years.


#39  Edited By BassMan
Member since 2002 • 12569 Posts

@Xtasy26: There is nothing indicating that 10GB VRAM will not be enough. So, I also expect it to stay relevant for 2-3 years. All I am saying is that I am not worried about it and I could lower settings if VRAM usage becomes a problem.

I don't like the idea of overpaying for something that is not necessary. VRAM is not cheap. So, if Nvidia decided to go with 10GB of VRAM, then they obviously feel that is sufficient for 4K gaming. Developers will optimize their games around that specification. DirectStorage may also be leveraged in games to ease the burden on VRAM.


#40 NoodleFighter
Member since 2011 • 11037 Posts

@BassMan: Watch Dogs Legion's system requirements recently came out, and it calls for 11GB of VRAM for 4K max settings with ray tracing. Although we haven't seen how the game actually performs yet.


#41 BassMan
Member since 2002 • 12569 Posts

@NoodleFighter said:

@BassMan: Watch Dogs Legion's system requirements recently came out, and it calls for 11GB of VRAM for 4K max settings with ray tracing. Although we haven't seen how the game actually performs yet.

That will be interesting to see for sure.


#42 Xtasy26
Member since 2008 • 5412 Posts

@BassMan said:

@Xtasy26: There is nothing indicating that 10GB VRAM will not be enough. So, I also expect it to stay relevant for 2-3 years. All I am saying is that I am not worried about it and I could lower settings if VRAM usage becomes a problem.

I don't like the idea of overpaying for something that is not necessary. VRAM is not cheap. So, if Nvidia decided to go with 10GB of VRAM, then they obviously feel that is sufficient for 4K gaming. Developers will optimize their games around that specification. DirectStorage may also be leveraged in games to ease the burden on VRAM.

Some games are pushing 10GB already; BF5 with ray tracing can in some cases use over 10GB at 4K.

That's a 2018 game. I can easily see more and more games pushing past 10GB at 4K in 2-3 years, especially with the newer consoles now having more VRAM than ever.


#43  Edited By JasonOfA36  Online
Member since 2016 • 2325 Posts

@Xtasy26: Most likely it's cached VRAM.

[embedded video]

Steve talks about this in the review. Basically, the memory bandwidth is enough to make up for the "skimped" VRAM.


#44  Edited By BassMan
Member since 2002 • 12569 Posts

@Xtasy26 said:
@BassMan said:

@Xtasy26: There is nothing indicating that 10GB VRAM will not be enough. So, I also expect it to stay relevant for 2-3 years. All I am saying is that I am not worried about it and I could lower settings if VRAM usage becomes a problem.

I don't like the idea of overpaying for something that is not necessary. VRAM is not cheap. So, if Nvidia decided to go with 10GB of VRAM, then they obviously feel that is sufficient for 4K gaming. Developers will optimize their games around that specification. DirectStorage may also be leveraged in games to ease the burden on VRAM.

Some games are pushing 10GB already; BF5 with ray tracing can in some cases use over 10GB at 4K.

That's a 2018 game. I can easily see more and more games pushing past 10GB at 4K in 2-3 years, especially with the newer consoles now having more VRAM than ever.

How much of that VRAM is actively being used by the game though and not just being unnecessarily buffered? BFV also has a GPU Memory Restriction option which prevents the game from exceeding the VRAM of your card.


#45 Xtasy26
Member since 2008 • 5412 Posts

@BassMan said:
@Xtasy26 said:
@BassMan said:

@Xtasy26: There is nothing indicating that 10GB VRAM will not be enough. So, I also expect it to stay relevant for 2-3 years. All I am saying is that I am not worried about it and I could lower settings if VRAM usage becomes a problem.

I don't like the idea of overpaying for something that is not necessary. VRAM is not cheap. So, if Nvidia decided to go with 10GB of VRAM, then they obviously feel that is sufficient for 4K gaming. Developers will optimize their games around that specification. DirectStorage may also be leveraged in games to ease the burden on VRAM.

Some games are pushing 10GB already; BF5 with ray tracing can in some cases use over 10GB at 4K.

That's a 2018 game. I can easily see more and more games pushing past 10GB at 4K in 2-3 years, especially with the newer consoles now having more VRAM than ever.

How much of that VRAM is actively being used by the game though and not just being unnecessarily buffered? BFV also has a GPU Memory Restriction option which prevents the game from exceeding the VRAM of your card.

It seems in certain scenes it's actually using 10GB. I haven't heard of BFV's engine over-buffering unnecessarily the way other engines do, and I am showing usage at the moment the scene is rendered. The fact that a 2018 game is pushing 10GB is troubling. What happens when games do start to use over 10GB and easily push past it in 2-3 years, especially with the newer consoles? We are back to the Fury X argument: 4GB was okay back in 2015, but 2-3 years down the road it wasn't enough.

@jasonofa36 said:

@Xtasy26: Most likely it's cached VRAM.

[embedded video]

Steve talks about this in the review. Basically, the memory bandwidth is enough to make up for the "skimped" VRAM.

He only talks about one game, Call of Duty, which we know allocates more than it needs. But he doesn't talk about other engines like the Ego engine used in Red Dead 2. We know that the RX 580 performs better than the 1060 6GB GDDR5X despite that engine historically favoring nVidia, as in GTA V (which is great for my 1060 6GB, as I get great frame rates there).

The 1060 6GB GDDR5X, even with its faster memory (similar to the RTX 3080 using faster GDDR6X), has much lower 1% minimum frame rates than the RX 580 8GB, and more stutter, because Red Dead 2 exceeds 6GB in some scenes.

As TechSpot stated: "we see rather poor 1% low performance from the GTX 1060 due to its 6GB buffer."

https://www.techspot.com/review/1990-geforce-1060-vs-radeon-580/


#46 JasonOfA36  Online
Member since 2016 • 2325 Posts

@Xtasy26: the 1060 doesn't use GDDR5X. And RDR 2 doesn't use the Ego engine. Stop, please, you're talking out of your ass.


#47 Pedro
Member since 2002 • 42984 Posts

Something to consider: most of the memory used in games is not actively utilized. With the newer tech from DirectX 12 Ultimate, the need for such large memory footprints would (theoretically) be reduced, because the data becomes more dynamic than static as it is quickly swapped in and out of RAM and VRAM with SSD DMA optimization.
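The swapping idea can be sketched as a toy LRU resident set standing in for VRAM. The asset names, sizes, and the 1000MB budget below are made up for illustration, and real DirectStorage/DMA paths are far more involved than this:

```python
from collections import OrderedDict

# Toy streaming cache: a bounded resident set stands in for VRAM,
# with least-recently-used assets evicted to make room for new ones
# "streamed" in from fast storage.

class StreamingCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB, LRU order

    def request(self, asset, size_mb):
        """Touch an asset; stream it in (evicting LRU entries) if not resident."""
        if asset in self.resident:
            self.resident.move_to_end(asset)   # cache hit, refresh LRU position
            return "hit"
        while sum(self.resident.values()) + size_mb > self.budget:
            self.resident.popitem(last=False)  # evict least recently used
        self.resident[asset] = size_mb         # "DMA" it in from the SSD
        return "streamed"

cache = StreamingCache(budget_mb=1000)
for name, size in [("rock", 400), ("tree", 400), ("rock", 400), ("car", 400)]:
    print(name, cache.request(name, size))
```

Even though the level references 1200MB of assets in total, the resident set never exceeds the 1000MB budget; faster storage just makes the "streamed" path cheap enough to hide.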


#48 xantufrog  Moderator  Online
Member since 2013 • 13901 Posts

I think the opposite. It's only OK on price, and the memory should be fine given its substantial throughput.


#49 Xtasy26
Member since 2008 • 5412 Posts

@jasonofa36 said:

@Xtasy26: the 1060 doesn't use GDDR5X. And RDR 2 doesn't use the Ego engine. Stop, please, you're talking out of your ass.

I meant they used the 9Gbps version of the 1060 (the one with faster memory). And actually, there are 1060 revisions that do use GDDR5X.

https://www.tomshardware.com/news/nvidia-geforce-gtx-1060-gddr5x-specs,37961.html

And I misspoke: I meant the RAGE engine.

https://en.wikipedia.org/wiki/Rockstar_Advanced_Game_Engine


#50 netracing
Member since 2018 • 574 Posts

As has been stated endlessly over the past few weeks, games will display their memory allocation, which will often just max out at the VRAM of the card, but that does not mean the card is utilising that much in-game.