RTX3080: 10GB VRAM is enough for 4K gaming


#1  Edited By R4gn4r0k
Member since 2004 • 46292 Posts

For those worried: yes, 10GB of VRAM is enough for 4K gaming

Call of Duty Warzone (with Ray Tracing enabled) was one of the games that pushed our VRAM usage to 9.3GB. On the other hand, Watch Dogs Legion required 9GB at 4K/Ultra (without Ray Tracing). Additionally, Marvel’s Avengers used 8.8GB during the scene in which our RTX2080Ti was using more than 10GB of VRAM (we don’t know whether Nixxes has improved things via a post-launch update; we’ll find out when we re-benchmark the RTX2080Ti in the next couple of days). Furthermore, Crysis Remastered used 8.7GB of VRAM, and Quantum Break used 6GB of VRAM. All the other games were using between 4 and 8.5GB of VRAM.

So, right now there isn’t any game in which the RTX3080 is bottlenecked by its VRAM. Even Cyberpunk 2077, a game that pushes next-gen graphics, had no VRAM issues at 4K/Ultra settings. However, we obviously can’t predict the future.

Source

Also, after testing 17 demanding games, DSO found that Nvidia likes blowing hot air: the RTX3080 isn't really that much better than the RTX2080Ti, as Nvidia claimed it was in its super limited benchmark scenarios.

In conclusion, the NVIDIA GeForce RTX3080 appears to be an amazing 1440p graphics card. For 4K/Ultra gaming, though, you’ll either need DLSS or highly-optimized PC games. Owners of OC’ed RTX2080Ti GPUs should also not worry about the RTX3080, as the performance difference between them is not that big.
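A practical footnote for anyone who wants to check their own headroom against these figures: a minimal sketch, assuming an Nvidia card with the stock nvidia-smi tool on the PATH, that polls the driver-reported VRAM usage while a game runs. Keep in mind the driver reports allocated memory, which can overstate what a game strictly needs.

```python
# Minimal sketch: poll GPU memory usage via nvidia-smi while a game is running.
# Assumes an Nvidia GPU with nvidia-smi on the PATH; the figure is what the
# driver reports as allocated, not necessarily what the game strictly needs.
import subprocess
import time

def vram_used_mib(gpu_index: int = 0) -> int:
    """Return VRAM currently in use (MiB) for the given GPU."""
    out = subprocess.check_output([
        "nvidia-smi",
        f"--id={gpu_index}",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ], text=True)
    return int(out.strip())

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used = vram_used_mib()
            peak = max(peak, used)
            print(f"VRAM in use: {used} MiB (peak {peak} MiB)", end="\r")
            time.sleep(1.0)
    except KeyboardInterrupt:
        print(f"\nPeak VRAM observed: {peak / 1024:.1f} GiB")
```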


#2 JasonOfA36
Member since 2016 • 3725 Posts

Of course it is. Only AMD fanboys say that 16GB is the minimum for 4K gaming now.


#3  Edited By Eoten
Member since 2020 • 8671 Posts

Neither GPU will run games at a full 60fps at 4K. Most people who game on PC couldn't care less about 4K performance, because we tend to have access to better gaming resolutions typically not available on a living room television. 4K performance in GPUs is marketing; it's an attempt to hype the completely irrelevant to sell more units.

Also... for console gamers paying attention: if a 3080 and 6800XT aren't able to maintain 60fps in any of these titles, then your PS5s and Xbox Series Xs won't either.

"4K gaming" is marketing hype, and those frame rates will only get lower and lower every year with true next-gen titles, as games become even more demanding.


#4 R4gn4r0k
Member since 2004 • 46292 Posts

@jasonofa36 said:

Of course it is. Only AMD fanboys say that 16GB is the minimum for 4K gaming now.

AMD overhyped their cards and Nvidia overhyped theirs; these companies really are made for each other.

Handpicked benchmarks pre-release, effective strategy.


#5 JasonOfA36
Member since 2016 • 3725 Posts

@R4gn4r0k: Eh, I don't consider the RTX 3080 overhyped. Both are definitely overselling what the cards can actually do.


#6 R4gn4r0k
Member since 2004 • 46292 Posts

@jasonofa36 said:

@R4gn4r0k: Eh, I don't consider the RTX 3080 overhyped. Both are definitely overselling what the cards can actually do.

Slightly overhyped perhaps. Cherrypicked benchmarks prerelease for sure.

I mean I don't just blame Nvidia for this. AMD did this too. I'm sure Sony and MS were overhyping their console capabilities as well (4K 60 with raytracing)

But I just thought I'd make a thread about the RTX3080 as it's such a hot product.


#7 RDNAv2
Member since 2019 • 230 Posts

AMD GPUs will age better.

Like always.


#8 firedrakes
Member since 2004 • 4365 Posts

Games display at 4K... they're not made in 4K. So you're going to need way more VRAM.


#9 R4gn4r0k
Member since 2004 • 46292 Posts

@firedrakes said:

Games display at 4K... they're not made in 4K. So you're going to need way more VRAM.

These will be the exceptions. The current 17 most demanding games don't require more than 10GB of VRAM at 4K.


#10 firedrakes
Member since 2004 • 4365 Posts

@R4gn4r0k: True, but again, the games aren't made in 4K. So it's simply whatever high-end assets they're using.


#11 Eoten
Member since 2020 • 8671 Posts

@R4gn4r0k said:
@jasonofa36 said:

@R4gn4r0k: Eh, I don't consider the RTX 3080 overhyped. Both are definitely overselling what the cards can actually do.

Slightly overhyped perhaps. Cherrypicked benchmarks prerelease for sure.

I mean I don't just blame Nvidia for this. AMD did this too. I'm sure Sony and MS were overhyping their console capabilities as well (4K 60 with raytracing)

But I just thought I'd make a thread about the RTX3080 as it's such a hot product.

They're all hot products. The 3080, the 6800XT, the PS5, and the Xbox Series X. All are currently sold out. And they were all overhyped to sell more consoles. The sad part is so many people still do not realize it, and will continue to defend "4K60 gaming" (lmao), as if it was ever going to happen. Sony does it because they want to sell more than just consoles, but also televisions. Nvidia does it to make people think they've come out with something ground breaking and must-have. AMD does it so they don't lose customers to Nvidia.

They all get away with it because let's face it, gamers as a whole aren't very bright. We shouldn't be demanding these companies provide us something unobtainable, but that they focus on improving what they can give us. Solid 1440P gaming.


#12  Edited By R4gn4r0k
Member since 2004 • 46292 Posts

@eoten said:

They're all hot products. The 3080, the 6800XT, the PS5, and the Xbox Series X. All are currently sold out. And they were all overhyped to sell more consoles. The sad part is so many people still do not realize it, and will continue to defend "4K60 gaming" (lmao), as if it was ever going to happen. Sony does it because they want to sell more than just consoles, but also televisions. Nvidia does it to make people think they've come out with something ground breaking and must-have. AMD does it so they don't lose customers to Nvidia.

They all get away with it because let's face it, gamers as a whole aren't very bright. We shouldn't be demanding these companies provide us something unobtainable, but that they focus on improving what they can give us. Solid 1440P gaming.

Gamers do have unrealistic expectations, which plays into this. I mean, last-gen hardware like the PS4 Pro, Xbox One X, or Nvidia's GTX series is already plenty strong, and more than capable of running demanding games with huge sprawling worlds with zero load times.

But people just demand stronger stuff.


#13 Eoten
Member since 2020 • 8671 Posts

@R4gn4r0k said:
@eoten said:

They're all hot products. The 3080, the 6800XT, the PS5, and the Xbox Series X. All are currently sold out. And they were all overhyped to sell more consoles. The sad part is so many people still do not realize it, and will continue to defend "4K60 gaming" (lmao), as if it was ever going to happen. Sony does it because they want to sell more than just consoles, but also televisions. Nvidia does it to make people think they've come out with something ground breaking and must-have. AMD does it so they don't lose customers to Nvidia.

They all get away with it because let's face it, gamers as a whole aren't very bright. We shouldn't be demanding these companies provide us something unobtainable, but that they focus on improving what they can give us. Solid 1440P gaming.

Gamers do have unrealistic expectations, which plays into this. I mean, last-gen hardware like the PS4 Pro, Xbox One X, or Nvidia's GTX series is already plenty strong, and more than capable of running demanding games with huge sprawling worlds with zero load times.

But people just demand stronger stuff.

Well, we are certainly past a point of diminishing returns with hardware. Look at how great games from 10 years ago still hold up: Skyrim is already 9 years old, Mass Effect is like 12, and they still look great. But go back just 5 years before that and you see a huge difference. Going forward that gap just gets narrower and narrower, as we are already past the threshold for human vision with the resolutions and frame rates modern games are capable of. There's not a whole heck of a lot more they can add into a game to make them look better.

In fact, my current GPU is almost 2 years old. It still plays most games on high settings, some of the more demanding ones on medium. It has me wondering if there's even a point in my buying a 3080 or 6800XT, since there's not a whole heck of a lot of tangible improvements to be had with it yet.


#14 R4gn4r0k
Member since 2004 • 46292 Posts

@eoten said:

Well, we are certainly past a point of diminishing returns with hardware. Look at how great games from 10 years ago still hold up: Skyrim is already 9 years old, Mass Effect is like 12, and they still look great. But go back just 5 years before that and you see a huge difference. Going forward that gap just gets narrower and narrower, as we are already past the threshold for human vision with the resolutions and frame rates modern games are capable of. There's not a whole heck of a lot more they can add into a game to make them look better.

In fact, my current GPU is almost 2 years old. It still plays most games on high settings, some of the more demanding ones on medium. It has me wondering if there's even a point in my buying a 3080 or 6800XT, since there's not a whole heck of a lot of tangible improvements to be had with it yet.

Upgrading a GPU used to be about getting new features, pushing graphics, or straight-up necessity.

But GPU power these days all seems to be about pushing resolutions.

If I just lower my resolution I can play all of the newest games easily on Ultra settings.

I have the same view on graphics too, and of course older games can look rough with the photorealistic graphics we have now, but it's nice to go back to older games and replay them at modern resolutions (4K) or high framerates.


#15 Eoten
Member since 2020 • 8671 Posts

@R4gn4r0k said:
@eoten said:

Well, we are certainly past a point of diminishing returns with hardware. Look at how great games from 10 years ago still hold up: Skyrim is already 9 years old, Mass Effect is like 12, and they still look great. But go back just 5 years before that and you see a huge difference. Going forward that gap just gets narrower and narrower, as we are already past the threshold for human vision with the resolutions and frame rates modern games are capable of. There's not a whole heck of a lot more they can add into a game to make them look better.

In fact, my current GPU is almost 2 years old. It still plays most games on high settings, some of the more demanding ones on medium. It has me wondering if there's even a point in my buying a 3080 or 6800XT, since there's not a whole heck of a lot of tangible improvements to be had with it yet.

Upgrading a GPU used to be about getting new features, pushing graphics, or straight-up necessity.

But GPU power these days all seems to be about pushing resolutions.

If I just lower my resolution I can play all of the newest games easily on Ultra settings.

I have the same view on graphics too, and of course older games can look rough with the photorealistic graphics we have now, but it's nice to go back to older games and replay them at modern resolutions (4K) or high framerates.

I prefer frame rates over anything else. Anything under 60 I consider unplayable for anything with a moving camera. For competitive shooters, I'll even reduce the graphics settings, eliminate shadows and reflections, and go for higher frame rates like 120. Playing a competitive shooter, I don't spend much time looking at reflections and shadows anyway; in fact, in many games reducing graphics gives you an advantage.
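A quick back-of-the-envelope note on those frame-rate targets: the per-frame time budget is just 1000 ms divided by the target fps, and it shrinks fast, which is why shadows and reflections are the first settings to go. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope frame-time budgets at common target frame rates.
# Purely arithmetic: budget_ms = 1000 / fps.
for fps in (30, 60, 120, 144, 240, 360):
    budget_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {budget_ms:5.2f} ms per frame")
# 60 fps leaves ~16.7 ms per frame; 120 fps halves that to ~8.3 ms,
# so every effect has to earn its place in a much tighter budget.
```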


#16 R4gn4r0k
Member since 2004 • 46292 Posts

@eoten said:

I prefer frame rates over anything else. Anything under 60 I consider unplayable for anything with a moving camera. For competitive shooters, I'll even reduce the graphics settings, eliminate shadows and reflections, and go for higher frame rates like 120. Playing a competitive shooter, I don't spend much time looking at reflections and shadows anyway; in fact, in many games reducing graphics gives you an advantage.

Yup, in fact the only reason why I would buy an RTX3000 card this year is because I want to get high framerates in the upcoming Battlefield, which will be more demanding (possibly 100v100 battles).

There are no other games I need one for.


#17 DaVillain  Moderator
Member since 2014 • 56107 Posts

Even 8GB VRAM is enough for 1440p IMO.


#18  Edited By remiks00
Member since 2006 • 4249 Posts

@eoten said:
@R4gn4r0k said:
@jasonofa36 said:

@R4gn4r0k: Eh, I don't consider the RTX 3080 overhyped. Both are definitely overselling what the cards can actually do.

Slightly overhyped perhaps. Cherrypicked benchmarks prerelease for sure.

I mean I don't just blame Nvidia for this. AMD did this too. I'm sure Sony and MS were overhyping their console capabilities as well (4K 60 with raytracing)

But I just thought I'd make a thread about the RTX3080 as it's such a hot product.

They're all hot products. The 3080, the 6800XT, the PS5, and the Xbox Series X. All are currently sold out. And they were all overhyped to sell more consoles. The sad part is so many people still do not realize it, and will continue to defend "4K60 gaming" (lmao), as if it was ever going to happen. Sony does it because they want to sell more than just consoles, but also televisions. Nvidia does it to make people think they've come out with something ground breaking and must-have. AMD does it so they don't lose customers to Nvidia.

They all get away with it because let's face it, gamers as a whole aren't very bright. We shouldn't be demanding these companies provide us something unobtainable, but that they focus on improving what they can give us. Solid 1440P gaming.

Well said.

@davillain- said:

Even 8GB VRAM is enough for 1440p IMO.

Indeed. I don't think I'll upgrade to a 4K monitor anytime soon, if ever. I really don't see the point. 27" 1440p/144+Hz is more than enough for me. In fact, I've been a bit intrigued by possibly getting a 1080p 240 or 360Hz monitor. I need to do more research on those to see if it'll be worth it :P


#19 R4gn4r0k
Member since 2004 • 46292 Posts

@remiks00: Going back to 1080p is hard after enjoying 1440p or 4K.

I believe you are fine with 1440p at 144hz. The only reason why I would ever upgrade my current monitor (I'm talking in 5+ years at the earliest) is HDR.


#20 ConanTheStoner
Member since 2011 • 23712 Posts

Yeah, think I bought my first 4k display in 2015? Have a 4k tv, two 4k monitors, yet still do all my gaming on a 1440p monitor lol. Would even kick a game over to the 1080p monitor if I really need insane performance.

Might dip into 4k more when I get a new GPU, but still expect it to be a while before it's my go-to gaming resolution.


#21 R4gn4r0k
Member since 2004 • 46292 Posts

@ConanTheStoner said:

Yeah, think I bought my first 4k display in 2015? Have a 4k tv, two 4k monitors, yet still do all my gaming on a 1440p monitor lol. Would even kick a game over to the 1080p monitor if I really need insane performance.

Might dip into 4k more when I get a new GPU, but still expect it to be a while before it's my go-to gaming resolution.

4K is totally possible on these cards thanks to DLSS. The way I'm reading it, it's virtually indistinguishable, and seeing as it's originally rendered at a lower resolution, it saves massively on performance.
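To put rough numbers on "saves massively on performance": shading cost scales roughly with pixel count, so rendering internally at a lower resolution and reconstructing upward cuts the per-frame work sharply. A minimal sketch, assuming 1440p as the internal resolution (the commonly cited Quality-mode input for a 4K output, not an official figure):

```python
# Rough pixel-count comparison: native 4K vs. a lower internal render resolution
# that an upscaler like DLSS might start from. 1440p is an assumed example input,
# not a spec; shading cost scales roughly (not exactly) with pixel count.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)    # 8,294,400 pixels
internal  = pixels(2560, 1440)    # 3,686,400 pixels

print(f"Native 4K pixels:   {native_4k:,}")
print(f"Internal pixels:    {internal:,}")
print(f"Shading work ratio: {native_4k / internal:.2f}x fewer pixels to shade")
# ~2.25x fewer pixels shaded per frame, which is where most of the speedup comes from.
```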


#22 ConanTheStoner
Member since 2011 • 23712 Posts
@remiks00 said:

I've been a bit intrigued by possibly getting a 1080p 240 or 360Hz monitor. I need to do more research on those to see if it'll be worth it :P

Been curious to try those insane refresh rates... just not enough to grab one for myself lol. Around 90+ frames, it already becomes difficult for me to tell the difference. Really can't tell between 120 and 144. Don't know if I'd notice 240 or 360 unless I'm playing some classic arena shooter.

I know people always meme about "I'm not a competitive gamer, don't need high frames, blah", but it seems at that point the meme would be a reality.


#23 ConanTheStoner
Member since 2011 • 23712 Posts
@R4gn4r0k said:

4K is totally possible on these cards thanks to DLSS. The way I'm reading it, it's virtually indistinguishable, and seeing as it's originally rendered at a lower resolution, it saves massively on performance.

Yeah, definitely gonna give it a go when I snag a 3090. If I can get the settings/performance I want at 4k, then hell yeah.


#24 Zero_epyon
Member since 2004 • 20105 Posts

It won't be enough until I can actually buy one...


#25 Pedro
Member since 2002 • 69479 Posts

But I was told that 10GB is simply not enough from one of our resident console gamer technology experts. There must be some kind of error. 😎


#26  Edited By pyro1245
Member since 2003 • 9397 Posts

Mmmm. I'd love a 3080 for playing at really high frame rates on a WQHD monitor with all the RTX and graphical goodness.

With 4K textures, of course for that up-close detail.


#27 mrbojangles25
Member since 2005 • 58308 Posts

@eoten: yup, 4K is just marketing. Sad to see so many rabid fanboys think it's actually something of substance, important, or an actual metric to measure how good your console is vs another console.


#28 Pedro
Member since 2002 • 69479 Posts

@mrbojangles25 said:

@eoten: yup, 4K is just marketing. Sad to see so many rabid fanboys think it's actually something of substance, important, or an actual metric to measure how good your console is vs another console.

Watch your step Mr. Man. Don't be messing with the very essence of next gen gaming with your rhetoric.


#29 appariti0n
Member since 2009 • 5013 Posts

3080 is a 4k 60 card, if you're cool with DLSS and "technically" not true 4k.

From the screenshots and videos I've seen, it looks pretty indistinguishable from true 4k, but I would need to see it in person to really judge.


#30 mrbojangles25
Member since 2005 • 58308 Posts

@Pedro said:
@mrbojangles25 said:

@eoten: yup, 4K is just marketing. Sad to see so many rabid fanboys think it's actually something of substance, important, or an actual metric to measure how good your console is vs another console.

Watch your step Mr. Man. Don't be messing with the very essence of next gen gaming with your rhetoric.

Doesn't matter, it's all about 8K now. 4K is soooooooo last-gen.


#31 R4gn4r0k
Member since 2004 • 46292 Posts

@Zero_epyon said:

It won't be enough until I can actually buy one...

yup, 0GB of ram isn't enough for 4K, I'm afraid.


#32 Pedro
Member since 2002 • 69479 Posts

@mrbojangles25 said:

Doesn't matter, it's all about 8K now. 4K is soooooooo last-gen.


#33 R4gn4r0k
Member since 2004 • 46292 Posts

@Pedro said:
@mrbojangles25 said:

Doesn't matter, it's all about 8K now. 4K is soooooooo last-gen.

This is what 4K looks like now, fact:


#34 Pedro
Member since 2002 • 69479 Posts

@R4gn4r0k: ****. I need to upgrade to 8K. I can't play games looking like that.


#35  Edited By Vaasman
Member since 2008 • 15569 Posts

@appariti0n said:

3080 is a 4k 60 card, if you're cool with DLSS and "technically" not true 4k.

From the screenshots and videos I've seen, it looks pretty indistinguishable from true 4k, but I would need to see it in person to really judge.

From what I have read, DLSS actually technically is true 4k. The nature of DLSS is that it's recreating a legitimate 4k image based on the info it gets, which is why (at quality setting and version 2+) it's a far more impressive and effectively unnoticeable upscaling technique compared to others. In some cases it's literally better than native.

Having personally played several current DLSS titles and seeing the difference it has made in performance for certain games, I maintain that AI image processing is the next big step for graphics, far more so than ray tracing.


#36 appariti0n
Member since 2009 • 5013 Posts

@Vaasman said:
@appariti0n said:

3080 is a 4k 60 card, if you're cool with DLSS and "technically" not true 4k.

From the screenshots and videos I've seen, it looks pretty indistinguishable from true 4k, but I would need to see it in person to really judge.

From what I have read, DLSS actually technically is true 4k. The nature of DLSS is that it's recreating a legitimate 4k image based on the info it gets, which is why (at quality setting and version 2+) it's a far more impressive and effectively unnoticeable upscaling technique compared to others. In some cases it's literally better than native.

Having personally played several current DLSS titles and seeing the difference it has made in performance for certain games, I maintain that AI image processing is the next big step for graphics, far more so than ray tracing.

I will say that some folks are lumping this in with resolution scaling, and that's not accurate at all.

Hence the "technically". Because apples to apples would be rendering at 4k natively with no help from DLSS.

You'd better believe I'll be running DLSS the entire time I have the card tho. I'm at 1080p right now anyhow LOL.


#37  Edited By BassMan
Member since 2002 • 17808 Posts

@Vaasman: DLSS is not native resolution. It renders at a lower resolution and the tensor cores use deep learning to fill in the extra detail based on the analysis of the image and then output it at the target resolution. It's essentially remastering the image on the fly.
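As a rough illustration of "renders at a lower resolution... then output at the target resolution", here is a minimal sketch using the commonly cited per-axis scale factors for the DLSS 2.x presets; the factors are an assumption based on public write-ups, not an official spec:

```python
# Internal render resolutions behind a 3840x2160 output, using commonly cited
# per-axis scale factors for DLSS 2.x presets (approximate, not an official spec).
OUTPUT = (3840, 2160)
PRESETS = {
    "Quality":           2 / 3,   # ~0.667 per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for name, scale in PRESETS.items():
    w = round(OUTPUT[0] * scale)
    h = round(OUTPUT[1] * scale)
    print(f"{name:<18} renders ~{w}x{h}, reconstructs to {OUTPUT[0]}x{OUTPUT[1]}")
```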


#38  Edited By Pedro
Member since 2002 • 69479 Posts

@BassMan said:

@Vaasman: DLSS is not native resolution. It renders at a lower resolution and the tensor cores use deep learning to fill in the extra detail based on the analysis of the image and then output it at the target resolution.

The output of all upscaling techniques targeting 4K is natively 4K. That is why he used the word technically.


#39 osan0
Member since 2004 • 17817 Posts

Aye, 'tis. I wouldn't let that be the deciding factor in which GPU to buy, for the vast majority of people.

I mean, if you get a 3080 and plan to get a 4080 in a couple of years it won't be an issue (except for the one odd outlier case... there always has to be one odd outlier case :P).

It may (and I do emphasise may... can't predict the future) become an issue if you are the type of person who likes to buy high-end and hold onto it for a long time AND insists on native 4K BUT isn't that fussy about the framerate. So, basically, if you want to buy a new GPU and hold onto it for the next 5+ years.


#40  Edited By Vaasman
Member since 2008 • 15569 Posts

@BassMan: I am aware the software is not rendering internally at 4k, but the point is that after the AI pass it outputs a genuine 4k image, instead of what previous methods had, which was more like a 1080p output and a smear over the top.

In all senses they're all technically 4k anyway, but DLSS 2.0 is far and away superior to previous methods because of legitimate image recreation as opposed to simple sharpening.


#41  Edited By BassMan
Member since 2002 • 17808 Posts

@Pedro said:
@BassMan said:

@Vaasman: DLSS is not native resolution. It renders at a lower resolution and the tensor cores use deep learning to fill in the extra detail based on the analysis of the image and then output it at the target resolution.

The output of all upscaling techniques targeting 4K is natively 4K. That is why he used the word technically.

Which still does not make sense. The output resolution is 4K, but the internal rendering resolution is not. It is being up-scaled to match the native resolution. So, it is not a true native resolution.


#42 Pedro
Member since 2002 • 69479 Posts

@BassMan said:

Which still does not make sense. The output resolution is 4K, but the internal rendering resolution is not. It is being up-scaled to match the native resolution. So, it is not a true native resolution.

Not really. The output from the framebuffer is a 4K image. That is the internal-to-external resolution. If this is all happening within the rasterization process of the render pipeline, it is a native 4K image.


#43  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@Vaasman: Yes, I consider DLSS more an alternative rendering method for that reason. Especially because although it is used to upscale, it doesn't have to be - it could be used to supersample above your native target or simply fill in details at a fixed resolution. E.g. improving texture complexity. It's being used primarily (ok, I think exclusively) one way right now, but isn't confined to that.


#44 BassMan
Member since 2002 • 17808 Posts

@Pedro said:
@BassMan said:

Which still does not make sense. The output resolution is 4K, but the internal rendering resolution is not. It is being up-scaled to match the native resolution. So, it is not a true native resolution.

Not really. The output from the framebuffer is a 4K image. That is the internal-to-external resolution. If this is all happening within the rasterization process of the render pipeline, it is a native 4K image.

DLSS is just a better way of filling in the blanks, so to speak. It is still an upscaling/reconstruction technique, which by default makes it not the native resolution. The internal rendering resolution has to match the native resolution of the display for it to be considered running at native resolution.
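To pin down the two numbers being argued over here, a tiny illustrative sketch separating the internal render resolution from the output (framebuffer) resolution; the class and function names are made up for the example, no real engine API is implied:

```python
# Illustrative sketch only: the two resolutions the "native" debate is about.
# Names are invented for the example; no real API is implied.
from dataclasses import dataclass

@dataclass
class Frame:
    internal_res: tuple[int, int]  # resolution the game actually shades at
    output_res: tuple[int, int]    # resolution the framebuffer sends to the display

def native_readings(frame: Frame, display_res: tuple[int, int]) -> dict[str, bool]:
    """Two reasonable readings of 'native 4K', which is exactly the disagreement here."""
    return {
        "output matches display": frame.output_res == display_res,
        "internal render matches display": frame.internal_res == display_res,
    }

# A DLSS-style frame: shaded at 1440p internally, reconstructed to a 2160p output.
frame = Frame(internal_res=(2560, 1440), output_res=(3840, 2160))
print(native_readings(frame, (3840, 2160)))
# -> {'output matches display': True, 'internal render matches display': False}
```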


#45 R4gn4r0k
Member since 2004 • 46292 Posts

@BassMan said:

DLSS is just a better way of filling in the blanks, so to speak. It is still an upscaling/reconstruction technique, which by default makes it not the native resolution. The internal rendering resolution has to match the native resolution of the display for it to be considered running at native resolution.

The only thing about reconstruction is that it can cause input lag.

Quantum Break and Days Gone were two examples on consoles that used reconstruction, and I felt they were totally unresponsive. QB was very blurry on top of it.

There is no input lag to speak of with DLSS reconstruction, is that correct?


#46  Edited By BassMan
Member since 2002 • 17808 Posts
@R4gn4r0k said:
@BassMan said:

DLSS is just a better way of filling in the blanks, so to speak. It is still an upscaling/reconstruction technique, which by default makes it not the native resolution. The internal rendering resolution has to match the native resolution of the display for it to be considered running at native resolution.

The only thing about reconstruction is that it can cause input lag.

Quantum Break and Days Gone were two examples on consoles that used reconstruction, and I felt they were totally unresponsive. QB was very blurry on top of it.

There is no input lag to speak of with DLSS reconstruction, is that correct?

I haven't played many games with DLSS on due to it being shit until 2.0. In the ones that I have played, I haven't noticed any major input lag, but I didn't run comparisons. They felt good enough. I just finished playing Days Gone on PS5 and it felt pretty good with the controller at 60fps. Quantum Blur was just shit unless you were running at native resolution, but then it was a bitch to run as it was not optimized.


#47 Pedro
Member since 2002 • 69479 Posts

@BassMan said:

DLSS is just a better way of filling in the blanks so to speak. It is still an up-scaling/reconstuction technique which by default makes it not the native resolution. The internal rendering resolution has to match the native resolution of the display for it to be considered running in native resolution.

When rendering anything to be displayed on the screen, there is a series of processes that occur to create a final output. The final output resolution is the "native" resolution of the process. It really doesn't matter what techniques and algorithms are used to achieve it; the output will still remain the "native" resolution of the process. This nitpicking over "native" resolution is really fanboy nonsense at heart.

Native resolution as a term relates more to displays than to actual images. Digital images are dynamic, and ones that are generated through a render pipeline are... natively (😉) dynamic. In that, the output can be anything within the constraints of the hardware/software.


#48  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@BassMan: what is your definition of "internal"?

Like machine-learning itself, you wouldn't take a neural network, add a layer, and say "nuh uh, the only learning in the algorithm that "counts" is the previous original layers in the network". Or perhaps a better (although more obscure) analogy: you wouldn't take a 2-stroke engine and say "no its TRUE power output and fuel efficiency is what it would be BEFORE the expansion chamber - the fact the expansion chamber recycles and enhances the output doesn't count". What's the horsepower at the crank or wheel? That's what people want to know. The ML algorithm has been added as an optional layer to the (I would argue "internal") rendering pipeline.


#49  Edited By BassMan
Member since 2002 • 17808 Posts
@xantufrog said:

@BassMan: what is your definition of "internal"?

Like machine-learning itself, you wouldn't take a neural network, add a layer, and say "nuh uh, the only learning in the algorithm that "counts" is the previous original layers in the network". Or perhaps a better (although more obscure) analogy: you wouldn't take a 2-stroke engine and say "no its TRUE power output and fuel efficiency is what it would be BEFORE the expansion chamber - the fact the expansion chamber recycles and enhances the output doesn't count". What's the horsepower at the crank or wheel? That's what people want to know. The ML algorithm has been added as an optional layer to the (I would argue "internal") rendering pipeline.

The internal resolution is like a naturally aspirated engine. There are no turbos or superchargers compensating for lack of power. They may achieve similar output horsepower (output resolution), but they are not the same.

Internal = Naturally Aspirated

DLSS = Turbo/Supercharged


#50 Pedro
Member since 2002 • 69479 Posts

@BassMan said:

The internal resolution is like a naturally aspirated engine. There are no turbos or superchargers generating the power. So, while you can achieve the same output horsepower (output resolution) both ways, they are not the same.

Internal = Naturally Aspirated

DLSS = Turbo/Supercharged

Natural does not exist in the process of rendering. 😎