6800XT benchmarks.

#1 tormentos
Member since 2003 • 33784 Posts

As always, you have to be careful with such benchmarks, even if the material I received yesterday seems quite plausible. Two sources, very different approaches and settings, and yet in the end the results overlap to a certain degree; at least a trend is already visible. The most plausible results, in my view, come from an Ultra-HD ("4K") run of the three well-known benchmarks from the 3DMark suite.

What I received is said to be based on a benchmark run by a board partner that manufactures both AMD and NVIDIA graphics cards, which reportedly carried out these three runs with an evaluation sample (EVT phase). A normal Intel platform likely served as the basis for the comparison. Since I was asked to benchmark the other cards myself for comparison and verification and then refer only to the percentage differences, I first summarized the pure performance comparison exactly as it was sent to me:

https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/
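For context on the method: "referring to the percentage differences" just means normalizing every card's score against a baseline. Here's a minimal sketch in Python, with made-up placeholder scores rather than the leaked figures:

```python
# Minimal sketch of a "percentage difference" comparison of 3DMark scores.
# All scores below are made-up placeholders, NOT the leaked numbers.
scores = {
    "RX 6800 XT": 8100,
    "RTX 3080 FE": 8000,
    "RTX 2080 Ti": 6200,
}

baseline = scores["RTX 3080 FE"]  # express everything relative to the 3080 FE
for card, score in scores.items():
    delta = (score / baseline - 1) * 100
    print(f"{card}: {score} ({delta:+.1f}% vs 3080 FE)")
```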

The 6800XT beating the 3080FE?

But losing in RT vs the 3080, although beating the 2080 Ti on AMD's first RT try.

What say you, System Wars?

#2 Zero_epyon
Member since 2004 • 20103 Posts

Impressive if true. Would like to see more though.

#3 Howmakewood
Member since 2015 • 7702 Posts

I'd like to believe but I want to see more.

#4  Edited By The6millionLies
Member since 2020 • 564 Posts

Both cards apparently have the same number of CUs (RX 6800XT, Nvidia 3080).

And Nvidia's RT solution will always be better there.

#5 Juub1990
Member since 2013 • 12620 Posts

@tormentos: That RT performance is very meh though. If the 6800XT can match the 3080 in raster but only narrowly beat a 2080 Ti in RT, it means RDNA2's RT performance is worse than Turing's.

Not that I care about RT anyway. Gotta drop shit down to 30fps to make it visually appealing enough.

#6 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

5700XT in Fire Strike beats a 2080 FE... In real-world gaming it's drastically slower:

So, these results aren't surprising... I expect it to be slower than an RTX 3080 by 10-15%.

#7 tormentos
Member since 2003 • 33784 Posts

@Grey_Eyed_Elf said:

5700XT in Fire Strike beats a 2080 FE... In real-world gaming it's drastically slower:

So, these results aren't surprising... I expect it to be slower than an RTX 3080 by 10-15%.

Yes, and it beat it in Time Spy Extreme as well, which is odd since the 5700XT OC loses to the 2060 and up from Nvidia.

But we have to wait for actual in-game benchmarks.

#8 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@tormentos said:
@Grey_Eyed_Elf said:

5700XT in Fire Strike beats a 2080 FE... In real-world gaming it's drastically slower:

So, these results aren't surprising... I expect it to be slower than an RTX 3080 by 10-15%.

Yes, and it beat it in Time Spy Extreme as well, which is odd since the 5700XT OC loses to the 2060 and up from Nvidia.

But we have to wait for actual in-game benchmarks.

Pinch of salt, my son. Since Polaris, AMD has had rampant fanboys releasing false performance benchmarks, and the cards have never hit those leaked performance targets.

AMD already released benchmarks but didn't specify what card it was, and it was losing to a 3080... Add that to the fact that a 52 CU XSX loses to an RTX 2080, and it's safe to say that an 80 CU chip might be on par with it but has a high chance of being a little slower in the majority of games, as games differ in which hardware architecture they prefer and so on.

#9 Willy105
Member since 2005 • 26093 Posts

I used both an Nvidia GTX 970 and AMD's Radeon RX 570 (competing cards of roughly the same power), and the 970 was far nicer to use.

#10 deactivated-63d2876fd4204
Member since 2016 • 9129 Posts

Actual game benchmarks are gonna be a disaster. And ray tracing on the AMD card is gonna be bad as well. Buckle up, AMD fanboys, it's gonna be a wild week.

#11 appariti0n
Member since 2009 • 5013 Posts

Nice if it turns out to be true!

#12 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@goldenelementxl: The only thing these RDNA2 cards will do for PC gaming is make Nvidia lower prices or get them to bring out RTX Super series GPUs.

DLSS and RT cores have me completely sold... I wouldn't want to get an AMD GPU when the only real big game coming out is Cyberpunk, and it's optimized for Nvidia RTX cards.

I can't wait for all the cards and consoles to come out and the DF videos to start rolling out for Cyberpunk and Ass Creed.

#13 osan0
Member since 2004 • 17810 Posts

Taking this with a grain of salt at the moment, of course, but if true the rasterisation performance looks nice.

The RT... er... not so much. The 2080 Ti was OK at RT, but... well... it was Nvidia's first crack at it.

I was hoping AMD would look to get a better balance of RT to rasterisation performance. If anything this graph would suggest it's worse, which is not good.

That's also the 72 CU model according to the graph. There is a 6900XT too, apparently with 80 CUs. Big question there: will it scale? Will games be able to actually get work to those CUs? That's an element of AMD GPUs that badly needs fixing (RDNA1 didn't give an answer).

But yeah: by no means a bad chip if it's true, but the RT looks like a sore point. Pricing will be key, as always, of course.

#14 Vaasman
Member since 2008 • 15564 Posts

@tormentos said:
@Grey_Eyed_Elf said:

5700XT in Fire Strike beats a 2080 FE... In real-world gaming it's drastically slower:

So, these results aren't surprising... I expect it to be slower than an RTX 3080 by 10-15%.

Yes, and it beat it in Time Spy Extreme as well, which is odd since the 5700XT OC loses to the 2060 and up from Nvidia.

But we have to wait for actual in-game benchmarks.

It makes sense if you realize that AMD has always sucked at the software side of things. Their cards have good price to performance on raw specs, but their drivers are piss-poor and pretty much never get the most out of their products. Not to mention they virtually never push the boundaries of rendering engine technology, meaning they have to develop their own proprietary equivalents of things Nvidia develops, such as SLI, or more recently RTX and DLSS.

#15 Juub1990
Member since 2013 • 12620 Posts

@tormentos: What? The 5700XT is almost on par with the 2070S at 1440p and a bit slower at 4K. It beats the 2060 easily.

Unless you were talking about benchmarks?

#16 tormentos
Member since 2003 • 33784 Posts

@Juub1990:

Yes, in a Time Spy Extreme benchmark the 2060 beats the 5700XT OC.

#17 HalcyonScarlet
Member since 2011 • 13663 Posts

:-O Consoles aren't even out and they're getting pounded by Nvidia and AMD.

#18 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

PC fanboys are a strange lot. I'm sure AMD still won't beat Nvidia, but it is a hell of a lot more competitive, and still you like to take the piss. They will offer great raster and price; big win in my book.

#19 xantufrog  Moderator
Member since 2013 • 17875 Posts

That's pretty impressive if it holds up in the wild.... AMD getting its act together!

#20 Fedor
Member since 2015 • 11612 Posts

Going nowhere if they don't reveal a solid DLSS alternative.

#21 Edited By BassMan
Member since 2002 • 17801 Posts

@Random_Matt said:

PC fanboys are a strange lot. I'm sure AMD still won't beat Nvidia, but it is a hell of a lot more competitive, and still you like to take the piss. They will offer great raster and price; big win in my book.

I will hold my judgement until I see a lot of game benchmarks, but I agree that great raster and price can be a solid combo. Especially since RT cores are still a bottleneck on the RTX 3000 series. Rasterization performance is still the most important.

#22 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@Random_Matt said:

PC fanboys are a strange lot. I'm sure AMD still won't beat Nvidia, but it is a hell of a lot more competitive, and still you like to take the piss. They will offer great raster and price; big win in my book.

Well, that's the thing: AMD since Polaris hasn't really done anything special in terms of price to performance, and even before then it was toe to toe in price and performance anyway.

The trouble with AMD is that its fans seem to think they are going to get a card 20-30% better than Nvidia's for the same price, or the same performance but 30% cheaper... It's not the case and never has been; at best you're going to get a card 4-5% faster for the same price, or 4-5% cheaper for the same performance.

GPUs now are all about features, and AMD so far has no answer to DLSS, and we have no idea how well it will perform in ray tracing or what level of support we are going to get from them.

I am all about performance when it comes to hardware. I have had a Phenom II CPU and an Athlon X2... and from the looks of it my next CPU will be a 5900X. But like all things PC, until it's proven, it's all fanboy d*** waggling.

I will gladly buy a 6900XT if it's better than an RTX 3080, BUT if it doesn't support ray tracing in games I want to play, like Cyberpunk, then no thanks, I will take the Nvidia card.

#23  Edited By Juub1990
Member since 2013 • 12620 Posts

@BassMan: The RT cores aren't the bottleneck; the shader cores are. In pure RT workloads, the 3080 performs much better relative to other cards than in mixed RT/raster or raster-only scenarios.

#24 Edited By BassMan
Member since 2002 • 17801 Posts

@Juub1990 said:

@BassMan: The RT cores aren't the bottleneck; the shader cores are. In pure RT workloads, the 3080 performs much better relative to other cards than in mixed RT/raster or raster-only scenarios.

Regardless, it is still way too much of a performance penalty enabling RT. Those RT cores need to be performing in parallel with the CUDA cores and delivering performance similar to rasterization only. We need to get to a point where "RTX ON or OFF?" is not even a question we ask ourselves.

#25 Juub1990
Member since 2013 • 12620 Posts

@BassMan: Yep. Which is why I don’t care for it. It won’t be viable before Hopper or perhaps even the gen after.

#27 tormentos
Member since 2003 • 33784 Posts

@fedor said:

Going nowhere if they don't reveal a solid DLSS alternative.

I love how PC gamers cheer now for upscaling methods. I guess upscaling is OK as long as it's on PC.

No, I don't care about semantics. All I care about is that for years PC gamers here have slammed checkerboarding and dynamic resolution on both Xbox and PlayStation; now they claim AMD must have the same as Nvidia.

The only reason DLSS exists is to cover the shitty performance those very expensive Nvidia GPUs have had since they came with ray tracing.

#28 tormentos
Member since 2003 • 33784 Posts

Take it with a truckload of salt.

#29 Fedor
Member since 2015 • 11612 Posts

@tormentos said:
@fedor said:

Going nowhere if they don't reveal a solid DLSS alternative.

I love how PC gamers cheer now for upscaling methods. I guess upscaling is OK as long as it's on PC.

No, I don't care about semantics. All I care about is that for years PC gamers here have slammed checkerboarding and dynamic resolution on both Xbox and PlayStation; now they claim AMD must have the same as Nvidia.

The only reason DLSS exists is to cover the shitty performance those very expensive Nvidia GPUs have had since they came with ray tracing.

Do you ever get tired of sounding dumb?

#30 Juub1990
Member since 2013 • 12620 Posts

@tormentos: Checkerboarding never looked as good as native so nah.

#31  Edited By Gatygun
Member since 2010 • 2709 Posts

@tormentos said:

Take it with a truckload of salt.

That's insane. The 3090 will have the same price drop the 2080 Ti had a few months ago, pretty much a month or so after its release, rofl.

#32  Edited By tormentos
Member since 2003 • 33784 Posts

@Juub1990 said:

@tormentos: Checkerboarding never looked as good as native so nah.

Irrelevant to the point: both are upscaling techniques meant to cope with hardware that isn't capable of rendering at the target resolution on pure muscle. The argument here is not about which technique is better.

The point is RT drops your $1,400 GPU to 1080p, and you have to use a scaling method to deal with the lower resolution.

There is no amount of damage control a PC gamer can pull that will justify it. I don't have a problem with DLSS, which is SUPERIOR all the way, by the way; no denial on my part. The problem here is the double standard.

@fedor said:

Do you ever get tired of sounding dumb?

Don't you get tired of deflecting?

A 2080 Ti, which was a $1,400 card until recently, has to drop to 1080p when ray tracing is in play; to get a sharper image, Nvidia came up with DLSS.

So you get 1080p that looks as sharp as 4K while not having true 4K.

In my book that counts as upscaling, which is a big no-no in PC gamers' book and something they have slammed consoles for and still do. 😊

#33 Pedro
Member since 2002 • 69407 Posts

@fedor said:
@tormentos said:

I love how PC gamers cheer now for upscaling methods. I guess upscaling is OK as long as it's on PC.

No, I don't care about semantics. All I care about is that for years PC gamers here have slammed checkerboarding and dynamic resolution on both Xbox and PlayStation; now they claim AMD must have the same as Nvidia.

The only reason DLSS exists is to cover the shitty performance those very expensive Nvidia GPUs have had since they came with ray tracing.

Do you ever get tired of sounding dumb?

#34 Fedor
Member since 2015 • 11612 Posts

@tormentos said:

@fedor said:

Do you ever get tired of sounding dumb?

Don't you get tired of deflecting?

A 2080 Ti, which was a $1,400 card until recently, has to drop to 1080p when ray tracing is in play; to get a sharper image, Nvidia came up with DLSS.

So you get 1080p that looks as sharp as 4K while not having true 4K.

In my book that counts as upscaling, which is a big no-no in PC gamers' book and something they have slammed consoles for and still do. 😊

Good thing your book is worthless.

#35 tormentos
Member since 2003 • 33784 Posts

@fedor said:

Good thing your book is worthless.

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

#36  Edited By Juub1990
Member since 2013 • 12620 Posts

@tormentos: It's relevant to the point, stop being obtuse. Checkerboarding got mocked because Sony hyped up the PS4 Pro as a 4K machine and had to upscale 1440p or 1920x2160 for a result that didn't even look as good as 4K. If the result had been identical, we would have applauded them, but it was worse and made their 4K claims look silly.
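(Quick sanity check on that figure, since the arithmetic is the whole point of checkerboarding: 1920x2160 is exactly half the pixels of a full 3840x2160 frame; shade half, reconstruct the rest.)

```python
# Checkerboard arithmetic: a 1920x2160 render shades half of a 4K frame.
native = 3840 * 2160      # 8,294,400 pixels in a full 4K frame
rendered = 1920 * 2160    # 4,147,200 pixels actually shaded
print(rendered / native)  # 0.5 -> the other half is reconstructed
```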

Don’t you remember when DLSS 1.0 got shat on? Because it was a garbage upscaling technique.

#37 Fedor
Member since 2015 • 11612 Posts

@tormentos said:
@fedor said:

Good thing your book is worthless.

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

I didn't damage control anything... What I said was also 100% true; you continue to make yourself look like the clown you are.

#38  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:
@fedor said:

Good thing your book is worthless.

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

AMD needs something like DLSS to be able to provide a proper 60+ fps with RT at 1440p-4K, to keep up with Nvidia's offerings. A hybrid upscaling approach that eats into shader processor resources will not yield the same gains as DLSS. So say at 4K the RTX 3080 gets 63 fps on average in Control with DLSS+RT; AMD, having weaker RT performance and possibly a worse upscaling feature, gets 50 fps with the 6800 XT. It's not a good option for running games with all the bells and whistles then.
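(Taking those hypothetical Control figures at face value, the gap works out to roughly a quarter; quick check:)

```python
# Illustrative only: uses the hypothetical fps figures from the post above.
fps_3080 = 63    # RTX 3080, Control 4K with DLSS+RT (hypothetical)
fps_6800xt = 50  # RX 6800 XT estimate from the post (hypothetical)

lead = (fps_3080 / fps_6800xt - 1) * 100
print(f"3080 lead: {lead:.0f}%")  # ~26% higher average frame rate
```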

#39 lundy86_4
Member since 2003 • 61477 Posts

This thread went well for OP lol.

#40  Edited By deactivated-5fd4737f5f083
Member since 2018 • 937 Posts

It'll be massively impressive if they pull off a 3080-beating card. I'll feel a little salty I bought a 3080, but it's not like I'll be struggling in games regardless, and I bought it for Cyberpunk with all the bells and whistles turned on, which AMD will not support.

I'm all for RT though, and I feel it falling short there would be disappointing. I hope they can offer some kind of decent DLSS-like tech as well.

#41  Edited By blaznwiipspman1
Member since 2007 • 16539 Posts

AMD always drops these sham benchmarks. They're generally fake news, because gaming benchmarks almost always have Nvidia coming out on top. The good thing is, not by much.

You really have to be an Nvidia fanboy to spend hundreds of dollars more for gimmicky features that AMD also has but that get conveniently ignored because they're not as good. Nobody cares about gimmicky features in the first place, and you'd have to be a moron to let hundreds of dollars of savings go.

You can quote me on this: DLSS 2.0 or whatever will be relegated to the trash heap in 5-10 years, just like PhysX and just like G-Sync. I even called it 10 years ago that gimmicky PhysX was going to end up in the trash heap, but the Nvidia fanboys just couldn't stop drooling over it. Here we are.

I have NEVER bought an Nvidia card since the GTX 460 days because of the obscene prices and horrendous price to performance. Until Nvidia drops their prices, or the Nvidia fanboys stop following like sheep and the price naturally drops, I will continue riding AMD.

Also, some of these Nvidia fanboys always love to quote MSRP prices while ignoring that AMD frequently has promotions and deep price cuts. Nvidia will never do such a thing.

#42 blaznwiipspman1
Member since 2007 • 16539 Posts

@04dcarraher said:
@tormentos said:
@fedor said:

Good thing your book is worthless.

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

AMD needs something like DLSS to be able to provide a proper 60+ fps with RT at 1440p-4K, to keep up with Nvidia's offerings. A hybrid upscaling approach that eats into shader processor resources will not yield the same gains as DLSS. So say at 4K the RTX 3080 gets 63 fps on average in Control with DLSS+RT; AMD, having weaker RT performance and possibly a worse upscaling feature, gets 50 fps with the 6800 XT. It's not a good option for running games with all the bells and whistles then.

No, AMD doesn't need gimmicky features like DLSS or whatever it's called. Just like AMD didn't give a shit about PhysX 10 years ago, or G-Sync more recently. All these gimmicky Nvidia features invariably end up in the trash bin eventually. AMD goes open source and their solutions always end up being the standard.

#43  Edited By Fedor
Member since 2015 • 11612 Posts

@blaznwiipspman1 said:
@04dcarraher said:
@tormentos said:
@fedor said:

Good thing your book is worthless.

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

AMD needs something like DLSS to be able to provide a proper 60+ fps with RT at 1440p-4K, to keep up with Nvidia's offerings. A hybrid upscaling approach that eats into shader processor resources will not yield the same gains as DLSS. So say at 4K the RTX 3080 gets 63 fps on average in Control with DLSS+RT; AMD, having weaker RT performance and possibly a worse upscaling feature, gets 50 fps with the 6800 XT. It's not a good option for running games with all the bells and whistles then.

No, AMD doesn't need gimmicky features like DLSS or whatever it's called. Just like AMD didn't give a shit about PhysX 10 years ago, or G-Sync more recently. All these gimmicky Nvidia features invariably end up in the trash bin eventually. AMD goes open source and their solutions always end up being the standard.

Yes, they do. Also, AMD has FreeSync as their G-Sync alternative.

#44 blaznwiipspman1
Member since 2007 • 16539 Posts

@fedor said:
@blaznwiipspman1 said:
@04dcarraher said:
@tormentos said:
@fedor said:

Good thing your book is worthless.

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

AMD needs something like DLSS to be able to provide a proper 60+ fps with RT at 1440p-4K, to keep up with Nvidia's offerings. A hybrid upscaling approach that eats into shader processor resources will not yield the same gains as DLSS. So say at 4K the RTX 3080 gets 63 fps on average in Control with DLSS+RT; AMD, having weaker RT performance and possibly a worse upscaling feature, gets 50 fps with the 6800 XT. It's not a good option for running games with all the bells and whistles then.

No, AMD doesn't need gimmicky features like DLSS or whatever it's called. Just like AMD didn't give a shit about PhysX 10 years ago, or G-Sync more recently. All these gimmicky Nvidia features invariably end up in the trash bin eventually. AMD goes open source and their solutions always end up being the standard.

Yes they do.

And you're the reason why these crappy Nvidia cards are selling for $2,500 on eBay 😂

#45  Edited By Vaasman
Member since 2008 • 15564 Posts

@tormentos said:
@fedor said:

Going nowhere if they don't reveal a solid DLSS alternative.

I love how PC gamers cheer now for upscaling methods. I guess upscaling is OK as long as it's on PC.

No, I don't care about semantics. All I care about is that for years PC gamers here have slammed checkerboarding and dynamic resolution on both Xbox and PlayStation; now they claim AMD must have the same as Nvidia.

The only reason DLSS exists is to cover the shitty performance those very expensive Nvidia GPUs have had since they came with ray tracing.

AI upscaling is a major departure from previous upscaling techniques. As of version 2.0, DLSS results in little or no loss of quality compared to 4K without DLSS. The difference compared to checkerboarding and other outdated techniques is massive, and because the underlying neural network keeps being trained, it's technology that actually gets better as time goes on even without regular manual updates.

Jelly?

#46 Fedor
Member since 2015 • 11612 Posts

@blaznwiipspman1 said:
@fedor said:
@blaznwiipspman1 said:
@04dcarraher said:
@tormentos said:

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

AMD needs something like DLSS to be able to provide a proper 60+ fps with RT at 1440p-4K, to keep up with Nvidia's offerings. A hybrid upscaling approach that eats into shader processor resources will not yield the same gains as DLSS. So say at 4K the RTX 3080 gets 63 fps on average in Control with DLSS+RT; AMD, having weaker RT performance and possibly a worse upscaling feature, gets 50 fps with the 6800 XT. It's not a good option for running games with all the bells and whistles then.

No, AMD doesn't need gimmicky features like DLSS or whatever it's called. Just like AMD didn't give a shit about PhysX 10 years ago, or G-Sync more recently. All these gimmicky Nvidia features invariably end up in the trash bin eventually. AMD goes open source and their solutions always end up being the standard.

Yes they do.

And you're the reason why these crappy Nvidia cards are selling for $2,500 on eBay 😂

That doesn't even make sense.

#47 deactivated-5fd4737f5f083
Member since 2018 • 937 Posts

Calling ray tracing, DLSS, and adaptive sync technologies "gimmicks" that won't last shows an impressive level of ignorance.

Ray tracing allows for accurate rendering of light, which is pretty significant in the progress of gaming graphics. It will be commonplace as hardware becomes more capable of handling it.

DLSS and other AI scaling tech allow for higher-quality images with less performance impact, and it's at the stage now where it's indistinguishable from native resolution but allows for higher graphical fidelity. It will become commonplace because it's genuinely a step forward.

Adaptive sync is now standard on pretty much all mid- to high-range monitors. The difference it makes while gaming is very noticeable, and it's another great technology without any performance impact. It's very clearly commonplace now.

#48  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@blaznwiipspman1 said:

No, AMD doesn't need gimmicky features like DLSS or whatever it's called. Just like AMD didn't give a shit about PhysX 10 years ago, or G-Sync more recently. All these gimmicky Nvidia features invariably end up in the trash bin eventually. AMD goes open source and their solutions always end up being the standard.

How is it gimmicky when it works? And AMD is implementing their own shader-based upscaling that is not as good as DLSS performance-wise... RT kills performance and 4K is still stressful. That "gimmicky" feature is needed, you goofball, until hardware is able to supply the performance on its own. Oh right, AMD didn't care about Nvidia tech so much that they created alternatives to the same features: TressFX vs Nvidia HairWorks, FreeSync vs G-Sync. And let me fill you in on something: AMD did promote GPU-accelerated physics, called Bullet, based on OpenCL, around 2010 and gave up on it a few years later.

Just because Nvidia creates proprietary software to work with their hardware, that is no different from any other company promoting their products. Hell, AMD has done the same things. Back when AMD's GPU division was ATI, they created "TruForm," their proprietary tessellation hardware on the GPU plus a software suite, and it went nowhere... and Nvidia didn't care, since they moved to a unified shader architecture with the GeForce 8 series. Also, let's not forget AMD's proprietary API called Mantle; that went nowhere for them either, and they ended up handing it to the Khronos Group to give Vulkan a leg up.

You're living in a dream world if you think going open source is something AMD wants. If given the option and a position of power, they would promote and make features proprietary to their products too.

#49 Juub1990
Member since 2013 • 12620 Posts

Last I checked G-Sync is still alive and kicking.

One of my favorite NVIDIA features I gotta say.

#50 04dcarraher
Member since 2004 • 23829 Posts
@blaznwiipspman1 said:
@fedor said:
@blaznwiipspman1 said:
@04dcarraher said:
@tormentos said:

Yep, as much as your sorry-ass damage control, peasant. 😂😂😂

Now please continue your line of hypocrisy by saying how AMD needs DLSS to match Nvidia's upscaling. 😎

AMD needs something like DLSS to be able to provide a proper 60+ fps with RT at 1440p-4K, to keep up with Nvidia's offerings. A hybrid upscaling approach that eats into shader processor resources will not yield the same gains as DLSS. So say at 4K the RTX 3080 gets 63 fps on average in Control with DLSS+RT; AMD, having weaker RT performance and possibly a worse upscaling feature, gets 50 fps with the 6800 XT. It's not a good option for running games with all the bells and whistles then.

No, AMD doesn't need gimmicky features like DLSS or whatever it's called. Just like AMD didn't give a shit about PhysX 10 years ago, or G-Sync more recently. All these gimmicky Nvidia features invariably end up in the trash bin eventually. AMD goes open source and their solutions always end up being the standard.

Yes they do.

And you're the reason why these crappy Nvidia cards are selling for $2,500 on eBay 😂

And you're the reason why R9 290s and Vega series GPUs sold at 2-4x MSRP when they launched...