AMD potentially blocking DLSS on their sponsored games?

#1  Edited By PCLover1980
Member since 2022 • 1244 Posts

While Nvidia are slimy, greedy assholes in their own right, it looks like AMD isn't one to brag about being pro-consumer either.

Most Nvidia-sponsored titles include both DLSS and FSR, but it's the other way around for AMD-sponsored titles. All of this blew up after a Wccftech article surfaced and Bethesda announced an AMD partnership.

And yes, Nvidia has never blocked FSR from its sponsored games; on them, AMD cards can even use ray tracing, albeit at much worse performance.

NVIDIA does not and will not block, restrict, discourage, or hinder developers from implementing competitor technologies in any way. We provide the support and tools for all game developers to easily integrate DLSS if they choose and even created NVIDIA Streamline to make it easier for game developers to add competitive technologies to their games.

Keita Iida, vice president of developer relations, NVIDIA

AMD FidelityFX Super Resolution is an open-source technology that supports a variety of GPU architectures, including consoles and competitive solutions, and we believe an open approach that is broadly supported on multiple hardware platforms is the best approach that benefits developers and gamers. AMD is committed to doing what is best for game developers and gamers, and we give developers the flexibility to implement FSR into whichever games they choose.

AMD Spokesperson to Wccftech

Even their PR answers are really bad.

#2 Postosuchus
Member since 2005 • 907 Posts

I assume DLSS isn't as easy or cheap to implement as flicking a switch. Besides, Nvidia locked it down not only to their own hardware, but to only certain generations of their own hardware. If a game dev has limited time and budget to implement one of these framerate crutches, why not the one that works on all hardware?

#3 PCLover1980
Member since 2022 • 1244 Posts

@Postosuchus: If you watched the vid: at least on Unreal Engine, it's not as easy as FSR, but it's not really hard to implement either.

This is what a Nixxes programmer had to say about this:

#4  Edited By 04dcarraher
Member since 2004 • 23832 Posts

Also, I'd like to suggest that the recent rise in games hitting 8GB VRAM limits has one common thread: these newer AMD sponsorship titles (Star Wars Jedi: Survivor, Resident Evil 4, The Last of Us, etc.). Is AMD purposely influencing developers to inflate VRAM requirements to make their GPUs look better? Is AMD holding back RT effects in sponsored games to cater to its own shortcomings? AMD goaded Nvidia over VRAM ahead of the RTX 4070 launch. All of these things seem a bit fishy, and it's hypocritical to support "open source" while cutting or limiting features.

#5  Edited By PCLover1980
Member since 2022 • 1244 Posts

@04dcarraher said:

Also, I'd like to suggest that the recent rise in games hitting 8GB VRAM limits has one common thread: these newer AMD sponsorship titles (Star Wars Jedi: Survivor, Resident Evil 4, The Last of Us, etc.). Is AMD purposely influencing developers to inflate VRAM requirements to make their GPUs look better? Is AMD holding back RT effects in sponsored games to cater to its own shortcomings? AMD goaded Nvidia over VRAM ahead of the RTX 4070 launch. All of these things seem a bit fishy, and it's hypocritical to support "open source" while cutting or limiting features.

I feel like this is a bit too tinfoil-hatty for me, but it may be the case too. Or devs are really, really just lazy. Looking at TLOU1 on PC, there's a noticeable difference in VRAM usage between launch and the latest patch. Same with Forspoken.
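For a rough sense of why resolution alone moves VRAM numbers in these comparisons, here's a back-of-the-envelope sketch. The buffer set and byte counts below are illustrative assumptions, not any real engine's (or any of these games') actual memory layout:

```python
# Rough VRAM estimate for a renderer's screen-sized buffers.
# Illustrative only: real engines compress, alias, and pool memory.

def buffer_mb(width, height, bytes_per_pixel):
    """Memory in MiB for one set of per-pixel data."""
    return width * height * bytes_per_pixel / (1024 ** 2)

def frame_buffers_mb(width, height):
    # Hypothetical set: 4 G-buffer targets (8 bytes/px each),
    # an HDR color target (8 bytes/px), and depth (4 bytes/px).
    per_pixel = 4 * 8 + 8 + 4
    return buffer_mb(width, height, per_pixel)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{frame_buffers_mb(w, h):.0f} MB in screen-sized targets")
```

The exact numbers don't matter; the point is that screen-sized buffers scale linearly with pixel count, so 4K costs four times what 1080p does before a single texture is even loaded.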

#6 DaVillain  Moderator
Member since 2014 • 56143 Posts

I'm surprised it took someone this long to make a thread about AMD sponsorship games blocking DLSS.

Looking at the Starfield AMD sponsorship exclusive revealed a few days ago, that right there is a red flag, and as someone who's been looking forward to it, I'm now going to wait for the PC reviews before I purchase it. As for AMD's PR team: if the answer was "no," they could easily have confirmed it when asked by both Hardware Unboxed and Gamers Nexus, putting the negative rumors to rest. But nope, they chose not to answer the question and made Starfield look bad in the process, which MS doesn't need right now. No company wants a bad name or negative rumors. It's clear AMD has a bad PR team; that's a given.

It should be noted that DLSS is simply superior to FSR. I do want to see FSR supported in future games, but not by blocking developers from including DLSS, since DLSS could have made games like The Callisto Protocol and Star Wars Jedi: Survivor more playable.

#7 lamprey263
Member since 2006 • 44575 Posts

Guess AMD has to play dirty. It was just a few short years ago that Nvidia was working with devs to create custom APIs that tanked performance on AMD hardware. People were okay with that, though; as long as their games performed better on Nvidia hardware, they were happy to buy Nvidia GPUs. Nvidia used the money to improve its graphics card performance. Now that they've managed to put themselves in an advantageous market position, they want to play nice all of a sudden. If people are going to be as consistent as they were for the last decade or more, they really shouldn't care what AMD resorts to in order to gain an edge, since they seemed fine when Nvidia played nasty.

#8 Pedro
Member since 2002 • 69566 Posts

So they are doing what Nvidia has done. Full circle.🤔

#9  Edited By blaznwiipspman1
Member since 2007 • 16542 Posts

@pclover1980: why would you care about DLSS? It's just a scam technology to skimp on rasterization, something Nvidia has been doing for years now: skimping on rasterization and memory.

FSR is open source and available to all video cards; it's not restricted to one manufacturer. It's the superior option.

#10 mrbojangles25
Member since 2005 • 58345 Posts

@Postosuchus said:

I assume DLSS isn't as easy or cheap to implement as flicking a switch. Besides, Nvidia locked it down not only to their own hardware, but to only certain generations of their own hardware. If a game dev has limited time and budget to implement one of these framerate crutches, why not the one that works on all hardware?

Locking down a tech to specific hardware is not the same as forbidding it simply because you don't own that hardware.

Basically it's the difference between "Oh, you don't have our hardware? Well that's fine you just can't take advantage of _" vs "Oh, you don't have our hardware? Well, we sponsored this game, so we won't let you use ANYTHING, nyah nyah nyah!"

This is anti-consumer and people, no matter what hardware you own, should be against it.

#11 PCLover1980
Member since 2022 • 1244 Posts

@blaznwiipspman1 said:

@pclover1980: why would you care about DLSS? It's just a scam technology to skimp on rasterization, something Nvidia has been doing for years now: skimping on rasterization and memory.

FSR is open source and available to all video cards; it's not restricted to one manufacturer. It's the superior option.

DLSS is as much of a scam as XeSS is, if by "scam" you mean looking better than FSR.

#12  Edited By blaznwiipspman1
Member since 2007 • 16542 Posts

@pclover1980: I love it when Nvidia fangirls get butthurt 😆. Overpriced GPUs, conned with lower rasterization and memory, and gimmicks like DLSS and ray tracing.

Hopefully AMD sponsors more of these games.

#13 adrian1480
Member since 2003 • 15033 Posts

AMD has been pretty sus lately. idk.

#14 Postosuchus
Member since 2005 • 907 Posts

@mrbojangles25 said:
@Postosuchus said:

I assume DLSS isn't as easy or cheap to implement as flicking a switch. Besides, Nvidia locked it down not only to their own hardware, but to only certain generations of their own hardware. If a game dev has limited time and budget to implement one of these framerate crutches, why not the one that works on all hardware?

Locking down a tech to specific hardware is not the same as forbidding it simply because you don't own that hardware.

Basically it's the difference between "Oh, you don't have our hardware? Well that's fine you just can't take advantage of _" vs "Oh, you don't have our hardware? Well, we sponsored this game, so we won't let you use ANYTHING, nyah nyah nyah!"

This is anti-consumer and people, no matter what hardware you own, should be against it.

What are you on about? FSR works on AMD, Nvidia, AND Intel. I've got a GTX 1070, and thus can use AMD's resolution crutch system but not Nvidia's thanks to their proprietary BS. And anyone who dropped money on their overpriced RTX 2000 or 3000 series is now being artificially denied version 3.

#15 PCLover1980
Member since 2022 • 1244 Posts

@blaznwiipspman1: I love it when you resort to ad hominems 'cause you don't know what the hell you're talking about. lmao

Just get off this thread if you're gonna troll.

#16 PCLover1980
Member since 2022 • 1244 Posts
@Postosuchus said:
@mrbojangles25 said:
@Postosuchus said:

I assume DLSS isn't as easy or cheap to implement as flicking a switch. Besides, Nvidia locked it down not only to their own hardware, but to only certain generations of their own hardware. If a game dev has limited time and budget to implement one of these framerate crutches, why not the one that works on all hardware?

Locking down a tech to specific hardware is not the same as forbidding it simply because you don't own that hardware.

Basically it's the difference between "Oh, you don't have our hardware? Well that's fine you just can't take advantage of _" vs "Oh, you don't have our hardware? Well, we sponsored this game, so we won't let you use ANYTHING, nyah nyah nyah!"

This is anti-consumer and people, no matter what hardware you own, should be against it.

What are you on about? FSR works on AMD, Nvidia, AND Intel. I've got a GTX 1070, and thus can use AMD's resolution crutch system but not Nvidia's thanks to their proprietary BS. And anyone who dropped money on their overpriced RTX 2000 or 3000 series is now being artificially denied version 3.

Options are always better, no? Why deny people who can use DLSS even if FSR is a thing?

#17 Bond007uk
Member since 2002 • 1644 Posts

@Pedro said:

So they are doing what Nvidia has done. Full circle.🤔

Well no, because most games that support DLSS also support FSR.

#18 Litchie
Member since 2003 • 34626 Posts

That sucks.

AMD has always been the "cheaper and worse, but still good" option. Now that they aren't cheaper anymore, there's no reason to get an AMD. You're basically getting a worse GPU with less support for the same amount of money as an Nvidia GPU. No idea what AMD's plan is with that.

#19  Edited By adrian1480
Member since 2003 • 15033 Posts
@Litchie said:

That sucks.

AMD has always been the "cheaper and worse, but still good" option. Now that they aren't cheaper anymore, there's no reason to get an AMD. You're basically getting a worse GPU with less support for the same amount of money as an Nvidia GPU. No idea what AMD's plan is with that.

Considering Nvidia GPUs are melting at an unusual clip thanks to their fancy trash connector that nobody asked for, AMD is the safer GPU selection, for less money, with no significant drop in performance unless your application is some sort of 3D design productivity.

Of course, their CPUs have been superior in serious multithreaded performance and efficiency for the last few years.

#20 Litchie
Member since 2003 • 34626 Posts

@adrian1480 said:
@Litchie said:

That sucks.

AMD has always been the "cheaper and worse, but still good" option. Now that they aren't cheaper anymore, there's no reason to get an AMD. You're basically getting a worse GPU with less support for the same amount of money as an Nvidia GPU. No idea what AMD's plan is with that.

Considering Nvidia GPUs are melting at an unusual clip thanks to their fancy trash connector that nobody asked for, AMD is the safer GPU selection, for less money, with no significant drop in performance unless your application is some sort of 3D design productivity.

Of course, their CPUs have been superior in serious multithreaded performance and efficiency for the last few years.

I've never heard about that, and highly doubt that will be any form of problem for me. Getting a worse card with less support for the same amount of money, because of a supposed "trash connector" on Nvidia cards? Hell no.

#21 R4gn4r0k
Member since 2004 • 46348 Posts

They 100% definitely block DLSS on their sponsored games:

  • Halo Infinite
  • Callisto Protocol
  • Dead Island 2
  • Jedi Survivor
  • Starfield

All of these games get some crappy implementation of FSR (or, in Infinite's case, ray tracing) and won't receive much-needed DLSS support.

@Postosuchus said:

I assume DLSS isn't as easy or cheap to implement as flicking a switch

It's been proven to be as easy and cheap to implement as flicking a switch. It's built into UE4, so games like Callisto, Dead Island 2, and Jedi have gone through MORE effort removing it than implementing it.

#22 osan0
Member since 2004 • 17822 Posts

Ah, this whole mess has made me laugh at everyone involved, to be honest. For the record: I do hope Starfield gets DLSS support at launch.

But yeah: chalk it up to RTG's marketing department (Ryzen marketing is at least decent) to take a sponsorship deal with a very big AAA game... and completely screw it up and turn it against themselves. Seriously, how is that department still operating? Even IF there was a stipulation that DLSS couldn't be implemented, at this point it would just be prudent to quietly take it out and let Bethesda announce that the game will have DLSS support. Then this whole mess goes away: AMD can bundle Starfield with various AMD hardware purchases and everyone is happy. Either that, or announce that FSR 3 with frame gen will launch with Starfield and work on AMD, Nvidia, and Intel GPUs... assuming that's the plan (where is that, anyway?). Even then... just announce DLSS for Starfield, for God's sake; it's getting silly.

On the flip side though... I also have to laugh at gamers. Yes, it sucks that a tech may be barred due to a contractual agreement. Yes, DLSS is the best upscaling tech there is. But Jaesus H... it's just upscaling. It's not absolutely critical to the experience. It's not a hard requirement for a game to function. It's a nice-to-have, a tick-box feature. Get some perspective. I do have to doff my cap at Nvidia's marketing department; they have done an amazing job of convincing people that everything they do and every feature they implement is ABSOLUTELY CRITICAL.

Also, everyone is freaking out, but (and someone may need to confirm this) the game is not released yet at the time of writing this post. So far there is nothing to actually freak out about. Nothing has actually happened... yet. It's good to let the industry know you want DLSS; no harm in that. But the complete overreaction is comical. It's real McDonald's sauce tantrum stuff.

As I say: I hope it does get DLSS support and this sorry mess is put to rest. It sucks when tech won't be implemented due to business policies. Sadly, the industry is littered with examples from all players :(.

#23  Edited By 04dcarraher
Member since 2004 • 23832 Posts
@lamprey263 said:

Guess AMD has to play dirty. It was just a few short years ago that Nvidia was working with devs to create custom APIs that tanked performance on AMD hardware. People were okay with that, though; as long as their games performed better on Nvidia hardware, they were happy to buy Nvidia GPUs. Nvidia used the money to improve its graphics card performance. Now that they've managed to put themselves in an advantageous market position, they want to play nice all of a sudden. If people are going to be as consistent as they were for the last decade or more, they really shouldn't care what AMD resorts to in order to gain an edge, since they seemed fine when Nvidia played nasty.

Actually, it was AMD who created a custom graphics API that catered to their products first. It was called Mantle, and the only way Nvidia could have used it was by disclosing their GPU information, which they were not going to do. Mantle was a slick but shady move by AMD, because it shifted optimization and hardware-level coding onto developers, shifting the blame and sparing AMD from spending resources patching drivers for every new title that needed tweaking. At that time, AMD's DirectX 11 optimizations were all over the place and performance always lagged behind Nvidia's.

If you're referring to Nvidia's software APIs, AMD uses their software APIs the same way: look at TressFX, which performed poorly on Nvidia GPUs.

The thing is, those proprietary features, or "gimmicks" as some would call them, push innovation into future standards. You may complain about Nvidia oversaturating games with tessellation and killing AMD performance, but it forced AMD to design GPUs to handle high levels of tessellation, which is now commonplace. Or look at PhysX, which put GPU real-time physics in the forefront and led to more open-source physics engines that could use any GPU, which in turn forced Nvidia to forgo that proprietary feature.

I think the problem is AMD touting being "open source" when the other guy, the one known for "gimmickry," is the one promoting more openness for devs. It's not a great look.

#24 blaznwiipspman1
Member since 2007 • 16542 Posts

@pclover1980: let the butthurt flow 🤣

@Pedro: exactly. Ngreedia has been doing far worse for decades, yet I love how fanboys claim Ngreedia are saints.

#25 Bond007uk
Member since 2002 • 1644 Posts

@osan0 said:

On the flip side though... I also have to laugh at gamers. Yes, it sucks that a tech may be barred due to a contractual agreement. Yes, DLSS is the best upscaling tech there is. But Jaesus H... it's just upscaling. It's not absolutely critical to the experience. It's not a hard requirement for a game to function. It's a nice-to-have, a tick-box feature. Get some perspective. I do have to doff my cap at Nvidia's marketing department; they have done an amazing job of convincing people that everything they do and every feature they implement is ABSOLUTELY CRITICAL.

Yeah, the game doesn't have RT anyway. As long as you're not aiming for 4K, I don't think DLSS is required, not for 1440p, as long as your GPU is at least a decent modern card. I mean, with RT and DLSS turned off, I can run Cyberpunk 2077 at 1440p at over 100fps. Pretty sure I'm not going to have many issues running Starfield. I do have an RTX 3080, though.

Could it be the lack of frame generation with DLSS 3 that is causing the anger? But surely, how many folks actually own a 40-series Nvidia card?

#26 glez13
Member since 2006 • 10310 Posts

According to Moore's Law Is Dead, this is a new low in video game journalism, and there is no evidence that points to this being a real thing. He even contacted some of his developer sources and didn't find anything.


#27  Edited By DaVillain  Moderator
Member since 2014 • 56143 Posts
@Bond007uk said:
@osan0 said:

On the flip side though... I also have to laugh at gamers. Yes, it sucks that a tech may be barred due to a contractual agreement. Yes, DLSS is the best upscaling tech there is. But Jaesus H... it's just upscaling. It's not absolutely critical to the experience. It's not a hard requirement for a game to function. It's a nice-to-have, a tick-box feature. Get some perspective. I do have to doff my cap at Nvidia's marketing department; they have done an amazing job of convincing people that everything they do and every feature they implement is ABSOLUTELY CRITICAL.

Yeah, the game doesn't have RT anyway. As long as you're not aiming for 4K, I don't think DLSS is required, not for 1440p, as long as your GPU is at least a decent modern card. I mean, with RT and DLSS turned off, I can run Cyberpunk 2077 at 1440p at over 100fps. Pretty sure I'm not going to have many issues running Starfield. I do have an RTX 3080, though.

Could it be the lack of frame generation with DLSS 3 that is causing the anger? But surely, how many folks actually own a 40-series Nvidia card?

The problem, from what I'm hearing, is that Starfield is CPU bound. The minimum GPU they're asking for is a GTX 1070 Ti (I keep forgetting a Ti version of the 1070 exists) and the recommended GPU is an RTX 2080. I don't think Starfield would need DLSS on the GPU side, but with it being CPU bound, whether it will need it is still an open question.

As for RT, the specs mention Global Illumination, hence the RT label. I was excited for this game, but now I'll just wait for the PC reviews before I decide to buy it. (And no, I'm not interested in using Game Pass.)

#28 hardwenzen
Member since 2005 • 38943 Posts

I feel ashamed to be an RX 6800 owner, but when you look at the other side, they're just as shitty, if not shittier. You hermits have the worst companies in existence.

#29 NoodleFighter
Member since 2011 • 11798 Posts

@04dcarraher said:
@lamprey263 said:

Guess AMD has to play dirty. It was just a few short years ago that Nvidia was working with devs to create custom APIs that tanked performance on AMD hardware. People were okay with that, though; as long as their games performed better on Nvidia hardware, they were happy to buy Nvidia GPUs. Nvidia used the money to improve its graphics card performance. Now that they've managed to put themselves in an advantageous market position, they want to play nice all of a sudden. If people are going to be as consistent as they were for the last decade or more, they really shouldn't care what AMD resorts to in order to gain an edge, since they seemed fine when Nvidia played nasty.

Actually, it was AMD who created a custom graphics API that catered to their products first. It was called Mantle, and the only way Nvidia could have used it was by disclosing their GPU information, which they were not going to do. Mantle was a slick but shady move by AMD, because it shifted optimization and hardware-level coding onto developers, shifting the blame and sparing AMD from spending resources patching drivers for every new title that needed tweaking. At that time, AMD's DirectX 11 optimizations were all over the place and performance always lagged behind Nvidia's.

If you're referring to Nvidia's software APIs, AMD uses their software APIs the same way: look at TressFX, which performed poorly on Nvidia GPUs.

The thing is, those proprietary features, or "gimmicks" as some would call them, push innovation into future standards. You may complain about Nvidia oversaturating games with tessellation and killing AMD performance, but it forced AMD to design GPUs to handle high levels of tessellation, which is now commonplace. Or look at PhysX, which put GPU real-time physics in the forefront and led to more open-source physics engines that could use any GPU, which in turn forced Nvidia to forgo that proprietary feature.

I think the problem is AMD touting being "open source" when the other guy, the one known for "gimmickry," is the one promoting more openness for devs. It's not a great look.

Open source is worthless if the tech is trash, which is what a lot of AMD's proprietary stuff was. People keep praising AMD for being open source when it's Nvidia that paves the way and brings innovation to the industry, and AMD simply makes a knockoff a year after Nvidia reveals its tech. Although we should give AMD props for Mantle, because in the long run it ended up being the building blocks for the Vulkan API, which is just as good as, if not better than, DirectX 12, depending on the game and hardware.

@Bond007uk said:
@osan0 said:

On the flip side though... I also have to laugh at gamers. Yes, it sucks that a tech may be barred due to a contractual agreement. Yes, DLSS is the best upscaling tech there is. But Jaesus H... it's just upscaling. It's not absolutely critical to the experience. It's not a hard requirement for a game to function. It's a nice-to-have, a tick-box feature. Get some perspective. I do have to doff my cap at Nvidia's marketing department; they have done an amazing job of convincing people that everything they do and every feature they implement is ABSOLUTELY CRITICAL.

Yeah, the game doesn't have RT anyway. As long as you're not aiming for 4K, I don't think DLSS is required, not for 1440p, as long as your GPU is at least a decent modern card. I mean, with RT and DLSS turned off, I can run Cyberpunk 2077 at 1440p at over 100fps. Pretty sure I'm not going to have many issues running Starfield. I do have an RTX 3080, though.

Could it be the lack of frame generation with DLSS 3 that is causing the anger? But surely, how many folks actually own a 40-series Nvidia card?

I'm an RTX 3080 user, and DLSS has been really handy for games that are very graphically demanding. I started A Plague Tale: Requiem the other day, and it's gorgeous but very hard to run at 4K even with RT off. I'm able to get a solid 60fps most of the time on just DLSS Balanced. I switched between native 4K and DLSS Quality, Balanced, and Performance modes and couldn't notice any big difference in visuals, but it ran a lot smoother on Balanced and Performance.

Also, people with weaker RTX cards may want to play at higher graphical settings without sacrificing resolution quality. People already use FSR on the Steam Deck to get better performance without lowering the graphics settings into potato mode. Before my RTX 3080 build, I was using my Alienware RTX 2070 Super laptop and playing games like Metro Exodus Enhanced Edition on ultra settings at 1440p with DLSS Quality mode at a consistent 60fps. DLSS and FSR are definitely going to carry cards like the RTX 2060 well into this gaming generation's lifespan.

If I were an RTX 4000 series owner, I would be pissed at the lack of DLSS 3 support, because the jump in rasterization and RT performance compared to older generations is pathetic on anything lower than the 4080 16GB. Nvidia is clearly using Frame Generation as a crutch, since they use it to market the big performance boost you get over older cards. For example, the RTX 4060 is only 20% faster at best compared to the RTX 3060 without Frame Generation, and it comes with less VRAM. You'd be better off getting the RTX 3060 12GB for cheaper and maybe using DLSS Balanced or Performance mode. Unless I see an RTX 4080 on sale for less than $700, I'm not buying it. I'd rather ride it out with my RTX 3080 and play games in DLSS Performance mode.

#30 mrbojangles25
Member since 2005 • 58345 Posts

@Litchie said:
@adrian1480 said:
@Litchie said:

That sucks.

AMD has always been the "cheaper and worse, but still good" option. Now that they aren't cheaper anymore, there's no reason to get an AMD. You're basically getting a worse GPU with less support for the same amount of money as an Nvidia GPU. No idea what AMD's plan is with that.

Considering Nvidia GPUs are melting at an unusual clip thanks to their fancy trash connector that nobody asked for, AMD is the safer GPU selection, for less money, with no significant drop in performance unless your application is some sort of 3D design productivity.

Of course, their CPUs have been superior in serious multithreaded performance and efficiency for the last few years.

I've never heard about that, and highly doubt that will be any form of problem for me. Getting a worse card with less support for the same amount of money, because of a supposed "trash connector" on Nvidia cards? Hell no.

You haven't heard about it because it's happening to only one very specific card, and not even that often.

But it is a very real defect, so it gets blown out of proportion.

#31 PCLover1980
Member since 2022 • 1244 Posts

@Bond007uk said:
@osan0 said:

On the flip side though... I also have to laugh at gamers. Yes, it sucks that a tech may be barred due to a contractual agreement. Yes, DLSS is the best upscaling tech there is. But Jaesus H... it's just upscaling. It's not absolutely critical to the experience. It's not a hard requirement for a game to function. It's a nice-to-have, a tick-box feature. Get some perspective. I do have to doff my cap at Nvidia's marketing department; they have done an amazing job of convincing people that everything they do and every feature they implement is ABSOLUTELY CRITICAL.

Yeah, the game doesn't have RT anyway. As long as you're not aiming for 4K, I don't think DLSS is required, not for 1440p, as long as your GPU is at least a decent modern card. I mean, with RT and DLSS turned off, I can run Cyberpunk 2077 at 1440p at over 100fps. Pretty sure I'm not going to have many issues running Starfield. I do have an RTX 3080 though.

Could it be the lack of frame generation with DLSS 3 that is causing the anger? But really, how many folks actually own a 4xxx Nvidia card?

DLSS does clean up the image nicely at the quality or balanced settings in some games. Not all games; it depends on the implementation. Devs have really been lazy with their AA implementations, especially with TAA, making the image quality really blurry. FSR doesn't help with that much 'cause it doesn't have AI reconstruction. Add to that, if there's DLSS, there's most likely going to be DLAA, which is much, much superior to TAA. And the additional performance it gives is really nice too.


#32 PCLover1980
Member since 2022 • 1244 Posts

@NoodleFighter: The RTX 4000 series is a whole new thread in and of itself. It's a trash generation by Nvidia, and the only worthwhile card in the series is the 4090. That said, yeah, I agree with how Nvidia is shooting themselves in the foot by using Frame Gen as a crutch to their much worse cards.

If you buy a 4070 or below, especially if you're coming from a 3xxx series card, you're just burning money.

#33 Litchie
Member since 2003 • 34626 Posts

@mrbojangles25 said:
@Litchie said:
@adrian1480 said:
@Litchie said:

That sucks.

AMD has always been the "cheaper and worse, but still good" option. Now that they aren't cheaper anymore, there's no reason to get an AMD. You're basically getting a worse GPU with less support for the same amount of money as an Nvidia GPU. No idea what AMD's plan is with that.

Considering Nvidia GPUs are melting at an unusual clip thanks to its fancy trash connector that nobody asked for, AMD is the safer GPU selection, for less, for not a significant drop in performance if your application isn't some sort of 3D design productivity.

Of course their CPUs have been superior in serious multithreaded performance and efficiency for the last few years.

I've never heard about that, and highly doubt that will be any form of problem for me. Getting a worse card with less support for the same amount of money, because of a supposed "trash connector" on Nvidia cards? Hell no.

You haven't heard about it because it's happening to only one very specific card, and not even that often.

But it is a very real defect, so it gets blown out of proportion.

That's what I thought. Acting like it's such a huge problem that you need to get an AMD instead is some weird behavior. Thanks for clarifying.

#34  Edited By BassMan
Member since 2002 • 17812 Posts

FSR is not as good as DLSS at upscaling. So, you are getting inferior quality, and then there's not even the option for frame generation. These exclusive partner deals are shit for gamers. PC gamers deserve all the options and the freedom to choose what they want to use.

Also, FSR being open or hardware agnostic is mostly irrelevant if it cannot match the quality of DLSS. Don't force inferior tech on people. That is like some console bullshit.

#35 Litchie
Member since 2003 • 34626 Posts

@BassMan: So not only are they trying to sell you worse products with worse features for roughly the same amount of money as the better products, they also try to force said worse features on you.

Didn't need any more reasons to not support AMD GPUs, but there we go.

#36  Edited By BassMan
Member since 2002 • 17812 Posts
@Litchie said:

@BassMan: So not only are they trying to sell you worse products with worse features for roughly the same amount of money as the better products, they also try to force said worse features on you.

Didn't need any more reasons to not support AMD GPUs, but there we go.

I think AMD is very good at basic rasterization, but they are bad at RT and at tech that makes use of AI cores, because they don't have the AI cores. So, they have to realize they are in a position of inferiority and price their products accordingly if they want people to buy them. Overcharging and forcing inferior tech on people with exclusive deals is not the way to win over gamers.

I am glad that Intel has now entered the discrete GPU space, and they seem to be good at RT and have AI cores as well. However, all their current GPUs are low-tier and have been buggy. They need to bring out some higher-tier products and work out the kinks.

#37 Litchie
Member since 2003 • 34626 Posts

@BassMan: Yup, I agree with all of that.

#38 R4gn4r0k
Member since 2004 • 46348 Posts

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

#39 osan0
Member since 2004 • 17822 Posts

@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

Yeah I saw that in the Hardware Unboxed Vid.....that's a really bad look for AMD indeed.

#40 R4gn4r0k
Member since 2004 • 46348 Posts

@osan0 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

Yeah I saw that in the Hardware Unboxed Vid.....that's a really bad look for AMD indeed.

It kinda makes sense, as AMD doesn't want DLSS to outshine their own tech in their sponsored games.

But I still wish the practice would come to a halt, starting with Starfield.

And I wish these AMD sponsorships would go in a different direction.

#41  Edited By osan0
Member since 2004 • 17822 Posts

@R4gn4r0k said:
@osan0 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

Yeah I saw that in the Hardware Unboxed Vid.....that's a really bad look for AMD indeed.

It kinda makes sense as AMD doesn't want DLSS to outshine their proprietary tech in their sponsored games.

But I still wish the practice would cease to a halt, starting with Starfield.

And I wish their AMD sponsorship would go in a different direction.

It kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known; it's well covered and established. There has been nothing to indicate that FSR2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine...though I am probably using it under the best conditions (FSR Quality @ 4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of Starfield actually get AMD except bad press? There is no win here. Developers won't be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything, including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR2, and the reverse is also true.

RTG's marketing department is in a tough spot, in fairness. The reality is that Radeon is still behind on both the software and hardware fronts. They have made good strides on the hardware front since Vega, but there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But surely there has to be a better way of using the deal than this. RTG's marketing department really does need an overhaul.

#42  Edited By mrbojangles25
Member since 2005 • 58345 Posts
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

That almost seems like legit corruption of some kind, you know? But I guess it's just "good business" to corpo scum.

@osan0 said:

It kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known; it's well covered and established. There has been nothing to indicate that FSR2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine...though I am probably using it under the best conditions (FSR Quality @ 4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of Starfield actually get AMD except bad press? There is no win here. Developers won't be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything, including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR2, and the reverse is also true.

RTG's marketing department is in a tough spot, in fairness. The reality is that Radeon is still behind on both the software and hardware fronts. They have made good strides on the hardware front since Vega, but there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But surely there has to be a better way of using the deal than this. RTG's marketing department really does need an overhaul.

It makes sense in the same way that killing a rival makes sense. I mean, if you killed him, there'd be less competition...but, you know, there's the whole "ethical thing" :P

Anyway, the way you phrased it, it's sort of like AMD is saying they want to remove a better option from their sponsored games so it doesn't take away from their inferior option, but all that does is remind us that we're stuck with the inferior option.

It's anti-consumer and anti-competitive. Two great sins in capitalism, in my opinion, but as here in the US we use a bastardized, twisted, mutated strain of capitalism I suppose it kind of makes sense *shrug*

***This has been a Comrade Bojangles-approved post, since I criticized capitalism I figure I have to adopt the socialist persona***

#43 R4gn4r0k
Member since 2004 • 46348 Posts

@mrbojangles25 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

That almost seems like legit corruption of some kind, you know? But I guess it's just "good business" to corpo scum.

Worse still, Sony is straight up moneyhatting developers to not put their games on Xbox, or to delay them on PC (Square Enix titles).

In turn, MS is buying developers like Obsidian and Activision, who can no longer create games for PlayStation.

There are definitely worse examples in this industry, but yeah, I don't like this one bit.

#44 mrbojangles25
Member since 2005 • 58345 Posts

@R4gn4r0k said:
@mrbojangles25 said:
@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

That almost seems like legit corruption of some kind, you know? But I guess it's just "good business" to corpo scum.

Worse still, Sony is straight up moneyhatting developers to not put their games on Xbox, or to delay them on PC (Square Enix titles).

In turn, MS is buying developers like Obsidian and Activision, who can no longer create games for PlayStation.

There are definitely worse examples in this industry, but yeah, I don't like this one bit.

Broken record on my part, but as I say, thank god for independent and small-scale game development. Probably the only reason I still game.

#45 R4gn4r0k
Member since 2004 • 46348 Posts

@mrbojangles25 said:

Broken record on my part, but as I say, thank god for independent and small-scale game development. Probably the only reason I still game.

Little known game called Gylt was finally freed from the shackles of Stadia and released on PC and consoles today.

If you wanna know if it's good? It was the best Stadia exclusive ever made!

Well that still doesn't answer the question, but yes, yes it's good :)

#46  Edited By lamprey263
Member since 2006 • 44575 Posts

@mrbojangles25: "Broken record on my part, but as I say, thank god for independent and small-scale game development. Probably the only reason I still game."

If anything, smaller developers can be much easier to buy out to exclude titles from competing platforms. They're not high-profile enough for larger parties to chase exclusivity deals with, but when their projects do get noticed, I'm sure it doesn't take much incentive for them to get locked into a deal.

#47  Edited By mrbojangles25
Member since 2005 • 58345 Posts
@lamprey263 said:

@mrbojangles25: "Broken record on my part, but as I say, thank god for independent and small-scale game development. Probably the only reason I still game."

If anything, smaller developers can be much easier to buy out to exclude titles from competing platforms. They're not high-profile enough for larger parties to chase exclusivity deals with, but when their projects do get noticed, I'm sure it doesn't take much incentive for them to get locked into a deal.

They might be more vulnerable to complete buyouts/payoffs, but (and I have no proof of this, so humor me) I'd argue they might be more ethical?

No doubt it varies for everyone, and a person's "price" varies as well, but I imagine someone that founded a studio to get away from the corporate world might turn down a large sum of money if they were a.) already making more than enough money, and b.) had strong enough moral convictions.

Call me naive, but if I went from making 50k a year at EA as one of a thousand programmers to running a studio with 10 of my friends and making 200k a year, you'd have to give me A LOT of money to take that step backwards into the corporate world.

Of course, it probably isn't that simple. Many small businesses struggle these days.

#48 R4gn4r0k
Member since 2004 • 46348 Posts

@mrbojangles25 said:
@lamprey263 said:

@mrbojangles25: "Broken record on my part, but as I say, thank god for independent and small-scale game development. Probably the only reason I still game."

If anything, smaller developers can be much easier to buy out to exclude titles from competing platforms. They're not high-profile enough for larger parties to chase exclusivity deals with, but when their projects do get noticed, I'm sure it doesn't take much incentive for them to get locked into a deal.

They might be more vulnerable to complete buyouts/payoffs, but (and I have no proof of this, so humor me) I'd argue they might be more ethical?

No doubt it varies for everyone, and a person's "price" varies as well, but I imagine someone that founded a studio to get away from the corporate world might turn down a large sum of money if they were a.) already making more than enough money, and b.) had strong enough moral convictions.

Call me naive, but if I went from making 50k a year at EA as one of a thousand programmers to running a studio with 10 of my friends and making 200k a year, you'd have to give me A LOT of money to take that step backwards into the corporate world.

Of course, it probably isn't that simple. Many small businesses struggle these days.

They are more ethical: money corrupts.

Also, EA split their EA Sports and EA development branches recently as a way to say "hey there MS/Sony, you can totally buy us now".

#49 PCLover1980
Member since 2022 • 1244 Posts

@R4gn4r0k said:

There was some space game on Steam. I can't remember the name as there are so many space games but I remember it being a first person shooter made in UE4.

And the moment they announced they were partnering with AMD, they made a statement on Steam saying they would no longer be able to bring DLSS to their game.

I think that right there is pretty much confirmation that AMD asks developers it partners with to not support DLSS.

It's Into The Radius VR. But the dev's response was that DLSS couldn't work properly with VR, or something along those lines. Which is kinda BS, because DLSS works much the same way as FSR, just with tensor cores handling the reconstruction.

#50 osan0
Member since 2004 • 17822 Posts

@mrbojangles25 said:

@osan0 said:

I kinda makes sense on the surface but, with a bit more thought, it really doesn't. It's no secret that DLSS is superior to FSR. This is known. It's well covered and established. There has been nothing to indicate that FSR2 has caught up. This is not to say FSR is bad. I use it myself in a couple of games and it's fine...though I am probably using it under the best conditions (FSR quality @4K). But DLSS is better (though by no means perfect).

So what does cutting DLSS out of starfield actually get AMD except bad press? There is no win here. Developers wont be any more or less encouraged to use FSR in their games anyway. It still makes sense to implement FSR since it runs on everything including consoles, so the tech still has value. It's also designed so that any game with DLSS support can easily implement FSR2 and the reverse is also true.

RTGs marketing department are in a tough spot in fairness. The reality is that Radeon is still behind on the software and hardware front. They have made good strides on the hardware front since vega in fairness. But there are still weaknesses. Then there is a lot of work to do on the software side too, especially in the areas that are not gaming.

But there has to be better ways of using the deal than this surely. RTGs marketing department really does need an overhaul.

It makes sense in the same way that killing a rival makes sense. I mean, if you killed him, there'd be less competition...but, you know, there's the whole "ethical thing" :P

Anyway, with the way you phrased it, it's sort of like AMD is essentially saying they want to remove a better option from their supported/sponsored games so it doesn't take away from their inferior option, but all it does is remind us that we are forced to use and stuck with an inferior option.

It's anti-consumer and anti-competitive. Two great sins in capitalism, in my opinion, but as here in the US we use a bastardized, twisted, mutated strain of capitalism I suppose it kind of makes sense *shrug*

***This has been a Comrade Bojangles-approved post, since I criticized capitalism I figure I have to adopt the socialist persona***

AMD aren't really killing anything with this effort, though. DLSS won't just wither away over this...not even close. It's a bit like saying "oh yeah, DLSS is irrelevant and offers no benefit" while 50 vids and 50 more games demonstrate the contrary right behind them. They are fooling no one.

One of the reasons this is such a marketing blunder is that now, when people hear "AMD Sponsored", they think "downgrade". It's the first thing that pops into their head. That's the last thing Radeon needs.

It's been a common theme with RTG's marketing division (and to a lesser extent AMD as a whole...though the CPU side seems to have copped on a bit since "game cache"). AMD do actually do some cool stuff, but their marketing keeps making a mess of it. Most of the time, Radeon seems to do better when they don't say anything at all.

E.g. exposing ReBAR and developing FSR to run on anything is great. The chiplet approach to GPUs could be really interesting IF it pans out (big IF at the moment...a huge technical challenge). But the way these things have been promoted has generally been poor.