Nvidia: By 2023 the first AAA game will require ray tracing GPUs


#1  Edited By NoodleFighter  Online
Member since 2011 • 11088 Posts

One of Nvidia's research team members, Morgan McGuire, claims that by 2023 you will need a ray tracing GPU to play AAA games. Makes sense, since by then there will be nothing but ray tracing cards on the market, and even if high-end games don't strictly require a ray tracing GPU, non ray tracing GPUs will be pretty old and too weak to run them anyway. I wonder when, and if, integrated graphics will ever get ray tracing.

I think ray tracing gets most of its hate from the cost of the Nvidia RTX cards, especially the 2080 Ti, but they will get cheaper once AMD and even Intel come out with their ray tracing GPUs. That means we should also see even more ray traced games, since AMD and Intel will partner with developers to implement ray tracing in their games too.

Developers like Quantic Dream also say it's going to be huge and are already looking into it for their games. Ray tracing is easier to implement than the methods we have now, so other than the performance hit, which will eventually be fixed as newer generations of cards arrive, I see no reason for devs not to adopt it.


#2 DaVillain-  Moderator  Online
Member since 2014 • 44596 Posts

Ray tracing has been around for decades. Real-time RT is pretty new though, so the hardware is excessively expensive and the results aren't that great. As it develops, though, as others have said on Twitter, it'll probably become a standard setting for all future AAA games (and we all know Cyberpunk 2077 will be using RT), just like tessellation and other shaders and stuff. The same goes for VR. Both VR and RT are at the infant stage until the tech matures enough for developers to fully utilize it.

That said, I use RT whenever the game has it. I think it's a great feature. I'm not saying it's radical, but I think it looks great in both Battlefield V and Metro Exodus, the two I've used so far. Nvidia is a graphics hardware and AI research company, but they got carried away pricing their flagship 2080 Ti over $1,000, which is why it's getting hated on. I'm not an Nvidia fanboy, but I do favor their top-notch GPUs. Nothing against AMD; their CPUs are something I support and love, but they can only do so much in terms of GPUs.


#3 lundy86_4
Member since 2003 • 57391 Posts

I could see that, but I also see developers not wanting to drop a potentially decent-sized part of their audience still on older cards... Though that has to happen eventually, I suppose. Tough to say.


#4  Edited By Phreek300
Member since 2007 • 672 Posts

Not if consumers can't afford the cards and devs don't program for it. Nvidia can f right off with a $1,200 price on cards at the top end as well. The Ti cards need to stay under $800 to sell well.


#5 FireEmblem_Man
Member since 2004 • 19892 Posts

@davillain-: Around 2022-2023, if Nintendo sticks with Nvidia and the Tegra SoC, I can already imagine Nvidia including some RT cores in a small form factor.


#6  Edited By lundy86_4
Member since 2003 • 57391 Posts

@Phreek300 said:

Not if consumers can't afford the cards and devs don't program for it. Nvidia can f right off with a $1,200 price on cards at the top end as well. The Ti cards need to stay under $800 to sell well.

I grabbed my 2080 for just shy of $1,100 CAD, IIRC. That's about the limit for GPUs in my case... Anything beyond that becomes serious fluff at a heavy monetary expense.

At the end of the day, the RTX line was heavily overpriced, which is why AMD needs to get their shit in check... As well as Nvidia, naturally lol.


#7  Edited By Phreek300
Member since 2007 • 672 Posts

@lundy86_4: AMD does need to get their house in order. Navi and RDNA are a good first step. The only way we send messages to companies is by not buying things we feel are overpriced. The fact that you could buy a 1080 Ti with only slightly less rasterization performance and no ray tracing says a lot.

We used to get Ti-level performance in the next generation's 70-class cards. Now it's the 80. So less of a jump at a 40% price hike is a huge nope from me, and from other people, I suspect.


#8  Edited By lundy86_4
Member since 2003 • 57391 Posts

@Phreek300 said:

@lundy86_4: AMD does need to get their house in order. Navi and RDNA are a good first step. The only way we send messages to companies is by not buying things we feel are overpriced. The fact that you could buy a 1080 Ti with only slightly less rasterization performance and no ray tracing says a lot.

We used to get Ti-level performance in the next generation's 70-class cards. Now it's the 80. So less of a jump at a 40% price hike is a huge nope from me, and from other people, I suspect.

Exactly. My bro got a 1080 Ti just before I bought my 2080, and got it for sub-$1,000. I bought my GPU about a month later and Nvidia had jacked up the prices of the older tech... The 2080 was actually much cheaper than the 1080 Ti by then (about $1,000 less, IIRC).

Yeah, the 70 series was always a reasonable price/performance alternative. I happily rocked a 1070 for a while. They've skewed price/performance heavily.


#9 NoodleFighter  Online
Member since 2011 • 11088 Posts

@davillain-: The people who want ray tracing to fail for reasons other than pricing are simply AMD fanboys. Looking at Wccftech, almost everyone bashing ray tracing is a blatant AMD fanboy. They use RT's infancy as proof that it's failing. Some are even twisting the context of the statement, as if a game requiring it in 2023 means games from now until then won't support it. We've already got a couple of games supporting RT and more on the way, but since it isn't required to play them, does that mean the RT support magically doesn't count? No. They're calling ray tracing a gimmick like PhysX, even though ray tracing isn't biased toward any hardware brand. This reminds me of when they called tessellation a gimmick just because AMD cards sucked at it.

@lundy86_4 said:
@Phreek300 said:

Not if consumers can't afford the cards and devs don't program for it. Nvidia can f right off with a $1,200 price on cards at the top end as well. The Ti cards need to stay under $800 to sell well.

I grabbed my 2080 for just shy of $1,100 CAD, IIRC. That's about the limit for GPUs in my case... Anything beyond that becomes serious fluff at a heavy monetary expense.

At the end of the day, the RTX line was heavily overpriced, which is why AMD needs to get their shit in check... As well as Nvidia, naturally lol.

Well, Intel's upcoming discrete graphics cards will also support ray tracing, so that should put some pressure on Nvidia, assuming they aren't priced as high as well. I'm worried about AMD's hybrid solution, because that tells me right off the bat that they can't implement as many (or as good) RT cores as Nvidia and are going to try to overcompensate with a software solution. Once we get RT cards in the $150-$200 range, it should be mainstream enough for devs to feel confident making it a requirement.


#10 lundy86_4
Member since 2003 • 57391 Posts

@NoodleFighter: I'm tepid on Intel. If they can come out strong, then I'm all for backing them. I genuinely hope they can bring more competition beyond the current two companies.

I agree that the low/mid-end needs to hit that price where even BB pre-builts can include them... Once we hit brick-and-mortar PCs with the cards, it's game over.


#11 MonsieurX
Member since 2008 • 39780 Posts

In 4 years? They barely know what's coming next year


#12  Edited By ronvalencia
Member since 2008 • 29612 Posts

@NoodleFighter:

AMD and Intel have confirmed support for Microsoft DXR, which is BVH-based ray tracing.

https://www.techspot.com/news/79904-intel-xe-gpus-feature-hardware-level-ray-tracing.html

https://www.pcgamesn.com/amd/ray-tracing-rdna-2-gpu-2020 AMD’s second-gen RDNA GPUs will feature hardware accelerated ray tracing in 2020

Microsoft DXR is not PhysX. Microsoft has Havok physics.
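To make the BVH point concrete: DXR's acceleration structures are bounding volume hierarchies, where a ray is tested against nested axis-aligned boxes before any triangle work happens. A rough sketch of the core ray-vs-box "slab" test, purely illustrative Python rather than any vendor's actual implementation:

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    origin:  ray origin (x, y, z)
    inv_dir: 1/direction per axis (precomputed; 0-components become inf)
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        # Shrink the [t_near, t_far] interval by this axis's slab.
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far  # overlapping intervals -> the ray hits the box

# A ray along +x from the origin hits a unit box spanning x in [2, 3]:
inf = float("inf")
print(ray_hits_aabb((0, 0, 0), (1.0, inf, inf), (2, -0.5, -0.5), (3, 0.5, 0.5)))  # → True
```

RT hardware runs enormous numbers of tests like this (plus triangle intersections) per frame in fixed-function units, which is what the "bound box and triangle intersect test hardware" mentioned elsewhere in this thread refers to.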


#13 AJStyles
Member since 2018 • 1430 Posts

I don’t know anything about anything.

Why don’t they just make a new ___PU?

You know? To like.... do stuff the GPU and CPU don’t do or whatever.


#14  Edited By ronvalencia
Member since 2008 • 29612 Posts

@NoodleFighter said:

@davillain-: The people who want ray tracing to fail for reasons other than pricing are simply AMD fanboys. Looking at Wccftech, almost everyone bashing ray tracing is a blatant AMD fanboy. They use RT's infancy as proof that it's failing. Some are even twisting the context of the statement, as if a game requiring it in 2023 means games from now until then won't support it. We've already got a couple of games supporting RT and more on the way, but since it isn't required to play them, does that mean the RT support magically doesn't count? No. They're calling ray tracing a gimmick like PhysX, even though ray tracing isn't biased toward any hardware brand. This reminds me of when they called tessellation a gimmick just because AMD cards sucked at it.

@lundy86_4 said:
@Phreek300 said:

Not if consumers can't afford the cards and devs don't program for it. Nvidia can f right off with a $1,200 price on cards at the top end as well. The Ti cards need to stay under $800 to sell well.

I grabbed my 2080 for just shy of $1,100 CAD, IIRC. That's about the limit for GPUs in my case... Anything beyond that becomes serious fluff at a heavy monetary expense.

At the end of the day, the RTX line was heavily overpriced, which is why AMD needs to get their shit in check... As well as Nvidia, naturally lol.

Well, Intel's upcoming discrete graphics cards will also support ray tracing, so that should put some pressure on Nvidia, assuming they aren't priced as high as well. I'm worried about AMD's hybrid solution, because that tells me right off the bat that they can't implement as many (or as good) RT cores as Nvidia and are going to try to overcompensate with a software solution. Once we get RT cards in the $150-$200 range, it should be mainstream enough for devs to feel confident making it a requirement.

AMD's 2017 RT patent differs from NVIDIA's RTX in how tree nodes are traversed, while both solutions have bounding-box and triangle intersection test hardware. Both are attached to the texture cache / memory fetch path.

NAVI CUs' texture filtering is twice the width of GCN CUs'.

AMD's inferior tessellation is due to a compute processing bias toward server workloads, which is the same problem as Bulldozer's server focus.

In terms of geometry, NAVI 10 and TU106 are similar, i.e. four prim units vs four GPCs.

NAVI 10 has dual shader engines with four prim units over a 256-bit bus, hence AMD positioned NAVI 10 like the 7870's dual shader engines prior to Hawaii GCN's quad shader engines. Sapphire has reserved RX 5800 and RX 5900 model numbers.

There's no point for AMD to release an RX 5800/RX 5900 without hardware-accelerated RT.


#15 BoxRekt
Member since 2019 • 2425 Posts

If you buy into this I have a bridge I want to sell you.

Nvidia makes new GPU with RT sales gimmick = Ray Tracing is the future, it will be REQUIREDZ!!!


#16  Edited By ronvalencia
Member since 2008 • 29612 Posts

@boxrekt said:

If you buy into this I have a bridge I want to sell you.

Nvidia makes new GPU with RT sales gimmick = Ray Tracing is the future, it will be REQUIREDZ!!!

DXR (built on a bounding volume hierarchy tree structure) is mandated by MS, and hardware-accelerated ray tracing comes with Xbox Scarlett and AMD's RDNA 2.

DXR is not PhysX when it's mandated by MS.

A bounding volume hierarchy (BVH) tree-search hardware accelerator also plays a role in MS's future Azure cloud plans.

https://en.wikipedia.org/wiki/Bounding_volume_hierarchy

A BVH search-tree hardware accelerator can be applied outside of graphics.

AMD, Intel, MS and NVIDIA all support BVH search-tree hardware acceleration. Those are the big four companies contributing to the PC industry's IP.
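The "applicable outside of graphics" claim follows because BVH traversal is just a pruned tree search: any workload that tests a query against many bounded objects (collision detection, culling, spatial databases) has the same shape. A toy sketch of the traversal loop, using 1-D intervals standing in for 3-D boxes (hypothetical structure, illustration only):

```python
def bvh_query(node, point, hits):
    """Walk a BVH, pruning subtrees whose bounds exclude the query point.

    node: dict with 'lo'/'hi' bounds, plus either 'items' (leaf)
          or 'left'/'right' children (inner node).
    """
    if point < node["lo"] or point > node["hi"]:
        return  # bounds miss: skip this entire subtree
    if "items" in node:
        hits.extend(x for x in node["items"] if x == point)
    else:
        bvh_query(node["left"], point, hits)
        bvh_query(node["right"], point, hits)

# Two leaves under one root; only the right leaf's bounds can contain 7.
tree = {
    "lo": 0, "hi": 10,
    "left": {"lo": 0, "hi": 4, "items": [1, 3]},
    "right": {"lo": 5, "hi": 10, "items": [7, 9]},
}
hits = []
bvh_query(tree, 7, hits)
print(hits)  # → [7]
```

The win is the early return: whole subtrees are skipped whenever their bounds exclude the query, which is exactly the step that fixed-function traversal hardware accelerates.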


#17 GoldenElementXL
Member since 2016 • 4920 Posts

So next gen is only gonna last 3 years?


#18  Edited By ronvalencia
Member since 2008 • 29612 Posts

@NoodleFighter:

The VRAY benchmark shows shader-based ray tracing; NAVI's shaders (via OpenCL) are tuned toward ray tracing and already rival Turing's shader-based (via CUDA) ray tracing.

The RX 5700 XT murders Radeon VII's RT efforts, and PS4 Pro's RT would be slower than the RX 590's results.

NAVI 10 is missing the bounding-box and triangle intersection test hardware accelerator.

AMD agrees with NVIDIA.

AMD has applied "end of life" status to RX Vega 56, Vega 64 and Radeon VII.


#19 Guy_Brohski
Member since 2013 • 2221 Posts

@goldenelementxl: Why would that be when both upcoming next gen consoles support RT?


#20 HalcyonScarlet
Member since 2011 • 9122 Posts

It's possible, but the problem is the variation on PC. For example, even after all these years, DX12 adoption is slow because of all the non-Windows 10 users. People don't rush to get the latest cards.

But why would they push a feature like this as a requirement?


#21 horgen  Moderator
Member since 2006 • 124306 Posts

That's not really hard to believe. I bet even nVidia's XX50 cards from 2023 will run RT better than an RTX 2080 does today.


#22 Yams1980
Member since 2006 • 4103 Posts

This is total BS. Nvidia lackeys expose themselves as frauds every day, it seems. No games will ever require ray tracing.

You can run pretty much all games in basically potato mode. You don't even need ambient occlusion or shadows enabled, and that tech's been around forever; it's just eye-candy lighting, like ray tracing is.

Ray tracing doesn't do anything useful. It adds some realism to lighting, but nobody cares about that when it kills performance so much.

It's not like I won't have a ray tracing card by then either; I've got a 1080 and I'm gonna upgrade within a year or so. But I'll never enable ray tracing. I want performance and minimal aliasing in my games. I don't care about realistic lighting that I can't even notice unless I'm standing still taking comparison screenshots with ray tracing on and off. I'd rather put the extra GPU power into some DSR or AA.


#23 PC_Rocks
Member since 2018 • 4732 Posts

Sounds about right because consoles will probably be dead by then.


#24 GoldenElementXL
Member since 2016 • 4920 Posts

@Guy_Brohski said:

@goldenelementxl: Why would that be when both upcoming next gen consoles support RT?

The RTX 2060 supports RT, but it's hardly ideal. "Supporting" and requiring are very different.


#25 Pedro
Member since 2002 • 45066 Posts

If Ray tracing can be implemented in the same manner as other features without crippling the game, I am all for it. Anything that can remove the need for baking lights.
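For what replacing baked lighting actually means: instead of looking shading up from a precomputed lightmap, the renderer answers "can this point see the light?" with a ray query every frame. A minimal, hypothetical shadow-ray test against a single sphere occluder (a sketch, not any engine's code):

```python
import math

def lit(point, light, sphere_center, sphere_radius):
    """Cast a shadow ray from point toward light; False if the sphere blocks it."""
    d = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(c * c for c in d))
    d = [c / dist for c in d]  # normalized ray direction
    oc = [p - c for p, c in zip(point, sphere_center)]
    # Ray-sphere intersection: solve t^2 + 2*b*t + c = 0 for ray param t.
    b = sum(dc * occ for dc, occ in zip(d, oc))
    c = sum(occ * occ for occ in oc) - sphere_radius ** 2
    disc = b * b - c
    if disc < 0:
        return True  # ray misses the sphere entirely: point is lit
    t = -b - math.sqrt(disc)
    return not (0.0 < t < dist)  # a hit between point and light -> shadowed

# A radius-1 sphere at (0,0,5) sits between the origin and a light at (0,0,10).
print(lit((0, 0, 0), (0, 0, 10), (0, 0, 5), 1.0))  # → False (shadowed)
print(lit((3, 0, 0), (3, 0, 10), (0, 0, 5), 1.0))  # → True (unblocked)
```

A baked pipeline precomputes answers like this offline and stores them in textures; the ray traced version pays per frame for the same query, which is exactly the trade-off this thread is arguing about.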


#26 NoodleFighter  Online
Member since 2011 • 11088 Posts
@goldenelementxl said:
@Guy_Brohski said:

@goldenelementxl: Why would that be when both upcoming next gen consoles support RT?

The RTX 2060 supports RT, but it's hardly ideal. "Supporting" and requiring are very different.

Still better than a non-RT card. It may not be a 2080 Ti, but an RTX 2060 will still give you much better results than a non-RTX card, and it's not like there's only one setting for ray tracing. Just look at how the RTX 2060 gets 20 more fps than the GTX 1080 Ti with ray tracing on, even though the 1080 Ti beats it in non-ray-traced performance.



#27 Gatygun
Member since 2010 • 2161 Posts

2023 = 4 years from now, people.

If ray tracing pushes forward, then yeah, I can see this happening. By then, even the absolute worst card on the market will most likely have better ray tracing performance than a 2080 Ti.


#28  Edited By Jag85
Member since 2005 • 15224 Posts

RTX currently uses TSMC's 12 nm FinFET transistors. Samsung and TSMC plan to manufacture quantum-level 3nm GAAFET nodes around 2021-2022. GPUs may use 3nm nodes by 2023, which would allow high-quality ray tracing on low-end GPUs.


#29 NoodleFighter  Online
Member since 2011 • 11088 Posts

@Yams1980: Going by that logic, graphics would hardly have evolved at all. Ray traced reflections can be genuinely useful. For example, in racing games the mirrors on cars can be ray traced to give more accurate reflections, much closer to real-life cars, which makes for a more immersive racing experience without resorting to a reverse camera that breaks immersion and screws up your driving. Battlefield V's ray traced reflections show this kind of potential. Hell, people in that game practically have an advantage: they can see reflections off even the tiniest of surfaces and use them to pinpoint enemy locations and catch them by surprise.

Metro Exodus is currently the best example of ray tracing. Its ray traced global illumination is very noticeable stuff: it adds more color to everything, areas are more accurately lit instead of being dull or overly dark, character models' skin looks less plastic, and wet dirt and sand stand out more against their dry and non-ray-traced counterparts.

You seem like one of those people who run everything in potato mode to get the most frames possible. If that's the case, then yeah, RT is not for you, but outside of ultra-competitive esports players, people want performance AND nice graphics. You won't see many people play The Witcher 3 in potato mode on a high-end rig just to get 244fps, and companies like Nvidia and AMD market the performance of high-end graphics as the selling point, not low-end. Even developers of low-end games try to make their games look good, at least art-style-wise, and not full-on potato mode.

Ray tracing may take a big performance hit, but you also have to realize this is the first time real-time ray tracing at this scale is actually running in games. The first generation of ray tracing cards isn't even a year old yet, and Nvidia is the only company with them on the market. That's like calling 3D graphics cards a gimmick because the first generation didn't give you Crysis-level graphics with no performance hit. Also, ray tracing can be scaled down; it isn't ultra-high ray counts or nothing. Even lower-end ray tracing settings give a noticeable improvement over their static/baked counterparts.

This is just like when tessellation came out: people called it a gimmick due to its performance hit, even though we weren't even a generation past the first cards with the dedicated hardware, and now tessellation is the norm in games and the performance hit isn't as big anymore.

Ray tracing may not be a mandatory requirement in the next decade, but I still expect it to be an option in many games, just like how you point out that you can completely turn off shadows and ambient occlusion today. As graphics get more high-end, developers will have no choice but to use ray tracing, since it's easier to implement than static, prebaked methods.
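On the "ray tracing can be scaled down" point: per-pixel ray counts are Monte Carlo sample counts, so noise falls off roughly as 1/sqrt(N), and a game can trade rays for frame rate (with a denoiser hiding the difference). A toy illustration with made-up numbers, estimating how much of a pixel an area light covers by firing random sample rays:

```python
import random

def estimate_coverage(n_rays, seed=0):
    """Estimate a quarter-disc light's coverage of a unit-square pixel
    by firing n_rays random sample rays; more rays -> less noise."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_rays):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # this sample ray "hits" the light
            inside += 1
    return inside / n_rays

true_value = 3.141592653589793 / 4  # exact quarter-disc area, pi/4
for n in (8, 64, 4096):
    print(n, "rays -> error", round(abs(estimate_coverage(n) - true_value), 3))
```

The numbers here are invented purely to show the trend; the same statistics are why a "low" RT setting (few rays, aggressive denoise) is still the same algorithm as an "ultra" one.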


#30  Edited By ronvalencia
Member since 2008 • 29612 Posts

@HalcyonScarlet said:

It's possible, but the problem is the variation on the PC. For example, even after all these years DX12 adoption is slow because of all the non Windows 10 users. People don't rush to get the latest cards.

But why would they push a feature like this as a requirement?

The slow DirectX 12 adoption rate was partly MS's own creation, since the DirectX 12 API was only later enabled on Windows 7 for certain games.

MS attempted to shoehorn Windows 10 onto users' PCs.


#31  Edited By burntbyhellfire
Member since 2019 • 789 Posts

It sounds like Nvidia is blowing smoke up people's asses to get them to run out and drop $1,200 on a GPU that won't actually become useful for AAA games for quite a few years, and by the time you need that tech it'll be available at a fraction of the cost.

I've seen games using ray tracing side by side with the same game not using it. There's not a huge difference in visuals but a massive difference in frame rates. The cost (in fps) versus what you get for it just isn't worth it, won't be for a while, and who knows, might never be a valuable tradeoff.


#32 Yams1980
Member since 2006 • 4103 Posts
@NoodleFighter said:

You seem like one of those people who run everything in potato mode to get the most frames possible. If that's the case, then yeah, RT is not for you

I actually don't run games in potato mode if I can avoid it, unless the performance is terrible. But I do turn off effects that are hard to notice unless you're doing comparison screenshots.

An example is ambient occlusion, which does add some nice extra shadowing, but if it's going to drop my fps below a playable level, it's gotta go.

Same with any extra post-processing effects or too much anti-aliasing or supersampling. I'll often turn it off completely and get ReShade/SweetFX running with SMAA at a fraction of the performance hit.

I took a couple of screenshots of Sunset Overdrive back when I was playing it. I ended up turning off SSAO because it cost about 10fps and I really didn't notice it. Even the comparison shots don't look much different. RT reminds me of that: every time I see screens of it, I only notice it in comparison shots. You can fake lighting effects well enough not to need it.



#33 HalcyonScarlet
Member since 2011 • 9122 Posts

@ronvalencia said:
@HalcyonScarlet said:

It's possible, but the problem is the variation on the PC. For example, even after all these years DX12 adoption is slow because of all the non Windows 10 users. People don't rush to get the latest cards.

But why would they push a feature like this as a requirement?

Slow DirectX12 adoption rates was created by MS since DirectX12 API was later enabled for Windows 7 with certain games.

MS attempted to shoehorn "Windows 10" into user's PCs.

Too late. There's no point now; MS is dropping Windows Update support for Windows 7 in January 2020. They should have done that in the first place.


#34  Edited By Ten_Pints
Member since 2014 • 4045 Posts

Bullshit.


#35 xantufrog  Moderator
Member since 2013 • 14463 Posts

I believe it. Of course, people are all over the place on what they consider "AAA" these days, so I suppose that classification in the prediction is a point to debate. But as someone who doesn't own an RTX card, I'm still not threatened by the concept - seems very likely to me that someone is going to make a game requiring ray tracing hardware in 4 years from now.


#36 PC_Rocks
Member since 2018 • 4732 Posts

@Jag85:

3nm is not quantum level; it's almost pushing the limits of current semiconductor dies, but still nowhere near quantum level. You also need to understand that "X nm" is mostly a marketing term that different companies use differently; it has very little to do with actual gate length.


#37  Edited By Jag85
Member since 2005 • 15224 Posts

@pc_rocks: Anything below 7nm will experience quantum tunneling effects, so it is quantum level in that sense. It's possible that the "3nm" nodes may actually end up being 5nm, but that's still below 7nm and would experience quantum tunneling effects.


#38  Edited By PC_Rocks
Member since 2018 • 4732 Posts

@Jag85 said:

@pc_rocks: Anything below 7nm will experience quantum tunneling effects, so it is quantum level in that sense. It's possible that the "3nm" nodes may actually end up being 5nm, but that's still below 7nm and would experience quantum tunneling effects.

Well, quantum tunneling effects started appearing at 14nm/10nm already. So far manufacturers have been able to mitigate these effects at scale via various methods, with acceptable error rates; it just gets worse as nodes keep shrinking, which is why 3nm almost sounds impossible.

Having said all that, quantum tunneling is not the same as the "quantum realm" or quantum computing. For silicon semiconductors you'd have to go below 0.8nm to reach the quantum realm, ideally speaking, not practically. Quantum tunneling is a phenomenon where electrons bypass the barrier and flow across the gate at no/low voltage, effectively keeping the switch always on, because the barrier width is extremely small.

Basically, what I'm trying to say is that we're trying to avoid quantum effects in these nodes rather than embracing them, which is what I mean by "at the quantum realm".


#39 Juub1990  Online
Member since 2013 • 10430 Posts

@burntbyhellfire: That's the thing. Unless you know how light and shadows are supposed to behave and look, non-ray-traced lighting will look just fine despite being extremely inaccurate. Ray tracing brings more realism, but realism doesn't equate to beauty.


#40  Edited By musicalmac  Moderator
Member since 2006 • 25029 Posts

Why will it require ray tracing? Will the game be unplayable without ray tracing? Will progress be impossible? I don't fully understand what this is other than a proprietary roadblock.

Happy to be informed if someone can do so.


#41  Edited By PC_Rocks
Member since 2018 • 4732 Posts

@musicalmac:

RT is not proprietary at any level, be it drivers, hardware or APIs.


#42  Edited By NoodleFighter  Online
Member since 2011 • 11088 Posts
@Yams1980 said:
@NoodleFighter said:

You seem like one of those people who run everything in potato mode to get the most frames possible. If that's the case, then yeah, RT is not for you

I actually don't run games in potato mode if I can avoid it, unless the performance is terrible. But I do turn off effects that are hard to notice unless you're doing comparison screenshots.

An example is ambient occlusion, which does add some nice extra shadowing, but if it's going to drop my fps below a playable level, it's gotta go.

Same with any extra post-processing effects or too much anti-aliasing or supersampling. I'll often turn it off completely and get ReShade/SweetFX running with SMAA at a fraction of the performance hit.

I took a couple of screenshots of Sunset Overdrive back when I was playing it. I ended up turning off SSAO because it cost about 10fps and I really didn't notice it. Even the comparison shots don't look much different. RT reminds me of that: every time I see screens of it, I only notice it in comparison shots. You can fake lighting effects well enough not to need it.

Metro Exodus sold me on ray tracing. Control and Atomic Heart also look fantastic with ray tracing, and Battlefield V's ray traced reflections are pretty good too. After seeing those, it has become really easy for me to spot the difference, and I have yet to see any game fake it well enough. I think once more games are made with ray tracing in mind, the differences will become even easier to spot and the performance hits won't be as bad. Metro Exodus's outdoor lighting is practically night and day, but in the underground levels you can't really tell much of a difference, so a game's design matters to how much difference ray tracing makes.

@pc_rocks said:

@musicalmac:

RT is not proprietary at any level, be it drivers, hardware or APIs.

A lot of people mistake RT for a proprietary thing because of the RTX branding, and because Nvidia is currently the only vendor on the market with it; most don't know that RTX ray tracing is actually Nvidia's implementation of the vendor-neutral DXR API.


#43  Edited By ronvalencia
Member since 2008 • 29612 Posts

@NoodleFighter said:

@pc_rocks said:

@musicalmac:

RT is not proprietary at any level, be it drivers, hardware or APIs.

A lot of people mistake RT for a proprietary thing because of the RTX branding, and because Nvidia is currently the only vendor on the market with it; most don't know that RTX ray tracing is actually Nvidia's implementation of the vendor-neutral DXR API.

RTX hardware accelerates DXR at the tier 1.0 feature level. https://en.wikipedia.org/wiki/Feature_levels_in_Direct3D#Support_matrix