Godfall needs 12GB of VRAM for Ultra textures. NVIDIA fans on suicide watch

#1 Juub1990
Member since 2013 • 12620 Posts

You knew this was coming.

Interestingly, Keith Lee revealed that 12GB of VRAM is required to support 4K x 4K UltraHD textures. This means that the Radeon RX 6000 series, which all feature 16GB of GDDR6 memory along with 128MB of Infinity Cache, should have no issue delivering such high-resolution textures. It may also mean that the NVIDIA GeForce RTX 3080, which only has 10GB of VRAM, will not be enough.

Source
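
For a rough sense of scale (my own back-of-the-envelope numbers, not from the article): a single 4096 x 4096 texture is about 64MB uncompressed in RGBA8, or roughly 16MB with typical BC7 block compression, plus about a third more for mipmaps. So a scene keeping a few hundred unique 4K x 4K compressed textures resident, on top of render targets and geometry, can plausibly climb into the 10GB+ range.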

Shots have been fired and the GPU wars have resumed. AMD has partnered with the dev and, in what appears to be a power move, pulled the rug out from under NVIDIA.

Let the games begin!!!

#2 SolidGame_basic
Member since 2003 • 45101 Posts

Well, better wait for the GTX 5090TI

#3 Telekill
Member since 2003 • 12061 Posts

See... even PC players are always at each other's throats.

Master race my ass.

#4 R4gn4r0k
Member since 2004 • 46260 Posts

Well stop being poor and just buy a 1500+ dollar RTX3090.

#5 Juub1990
Member since 2013 • 12620 Posts

@Telekill said:

See... even PC players are always at each other's throats.

Master race my ass.

The **** you think this is? It's war, my friend.

The world will be plunged into a firestorm of death and desolation, and only the strong and worthy will emerge from its ashes to rebuild the glorious PC Gaming World in all of its glory. Free of console peasants, weak hardware and bad GPU vendors.

#6 Pedro
Member since 2002 • 69448 Posts

The dev should manage their memory better. Most of that RAM is just going to be cold data. Time to move away from that model.

#7  Edited By PC_Rocks
Member since 2018 • 8470 Posts

Isn't this an AMD-partnered/marketed title? Then it's pretty standard for them to do that; it's simply a marketing deal to push the hardware. Pretty sure that in the real world, GDDR6 vs. GDDR6X will be a bigger differentiator. Lastly, we still don't know whether that figure is actual memory consumption or just allocated memory.
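
On that last point, for anyone curious how that number even gets measured: on Windows you can ask DXGI how much VRAM your process has allocated versus the budget the OS gives it, which is the "allocated" figure most monitoring tools report. It says nothing about how much of that data the GPU actually touches each frame. A minimal sketch (Windows-only, assumes a DXGI 1.4 capable system; not anything Godfall or reviewers necessarily use):

```cpp
// Query allocated VRAM vs. the OS-provided budget for this process.
// Build: link against dxgi.lib. This reports allocation, not "hot" usage.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;  // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    printf("Allocated by this process: %.2f GB\n", info.CurrentUsage / 1e9);
    printf("OS-provided budget:        %.2f GB\n", info.Budget / 1e9);
    return 0;
}
```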

#8 DaVillain  Moderator
Member since 2014 • 56088 Posts

"But 8GB is enough."

Well, GodFall is one of those so-called "next-gen" games for PS5, even though it doesn't seem like it lol. Which is why it has demanding requirements, and I believe this is how it will be for many games from now on. It probably doesn't use all 12GB of VRAM, but it wouldn't surprise me if it uses 10GB+ with everything on ultra, ray tracing on, etc. Heck, even FFXV can go up to 10GB of VRAM if you let it (highest texture setting), granted that game isn't the best optimized on PC, mind you. One could turn a few settings down and probably be fine tho, per usual.

If anyone can't tell, I'm hyped for GodFall myself, and even though I'm gonna run it on my RTX 2070, AMD ain't got shit on me😉

#9 BassMan  Online
Member since 2002 • 17806 Posts

I call bullshit. This game is not very impressive. No need to be using that much VRAM. Looks like some shady tactics from AMD.

#10 HalcyonScarlet
Member since 2011 • 13664 Posts
@Pedro said:

The dev should manage their memory better. Most of that RAM is just going to be cold data. Time to move away from that model.

Surely Nvidia knew what they were doing; could they really have made such a critical mistake? I'm not going to pretend to understand this, but don't they also compress and decompress data? I'll add the disclaimer that I don't really know what I'm talking about, so I'm happy to leave it to the professionals.

#11  Edited By mrbojangles25
Member since 2005 • 58300 Posts

lol 12 GB, yeah OK whatever.

@Telekill said:

See... even PC players are always at each other's throats.

Master race my ass.

Hey, even the gods up on Olympus quarreled from time to time.

"Zeus was horny, so he screwed some girl, then Hera got pissed"

@BassMan said:

I call bullshit. This game is not very impressive. No need to be using that much VRAM. Looks like some shady tactics from AMD.

Yeah the game looks very current-gen from what I have seen. This reeks of poor optimization, intentional (for marketing) or not.

#12 hardwenzen
Member since 2005 • 38854 Posts

You know who's on suicide watch? The ones who plan on playing this F2P-looking, phone-like game that requires 12GB of VRAM. Those are the ones I'd be worried about.

#14 Vaasman
Member since 2008 • 15569 Posts

It's called shit optimization. If a game was properly utilizing 12gb of overhead it would also run at 5 fps because cards today aren't equipped to run anything that legitimately required that kind of VRAM use, especially with 6x. This is just a lazy tactic to entice people to wait for AMD cards, instead of coming up with any proper kind of software appeal for the cards.

#15  Edited By Zaryia
Member since 2016 • 21607 Posts
@R4gn4r0k said:

Well stop being poor and just buy a 1500+ dollar RTX3090.

#16  Edited By DaVillain  Moderator
Member since 2014 • 56088 Posts

@R4gn4r0k said:

Well stop being poor and just buy a 1500+ dollar RTX3090.

8GB=1440p & 10GB=4K

Look, I get that 8GB of VRAM looks poor, but let's be honest, does anyone expect an 8GB card to be great for 4K? How many of us really game in 4K? I love Nvidia, but I'm not crazy enough to spend over a grand on a GPU I personally don't need. (I really want a 3090, but not at that premium lol)

Since I'll be staying at 1440p for a few more years, the 3070 is my pick for next year's upgrade.

#17  Edited By uninspiredcup  Online
Member since 2013 • 58930 Posts

I feel sorry for any goober upgrading for this ugly ass looking Unreal student project.

#18 R4gn4r0k
Member since 2004 • 46260 Posts

@davillain- said:

8GB=1440p & 10GB=4K

Look, I get that 8GB of VRAM looks poor, but let's be honest, does anyone expect an 8GB card to be great for 4K? How many of us really game in 4K? I love Nvidia, but I'm not crazy enough to spend over a grand on a GPU I personally don't need. (I really want a 3090, but not at that premium lol)

Since I'll be staying at 1440p for a few more years, the 3070 is my pick for next year's upgrade.

Nah, I jest: the 3070 and 3080 are both good cards.

The 3070 is the pick for 1440p and the 3080 for 4K. Lowering textures from ultra to high to fit the 10GB budget won't make much of a difference :)

#19  Edited By tormentos
Member since 2003 • 33784 Posts

Impossible. Several System Wars warriors, lemmings and hermits alike, told me 10GB of VRAM was enough for everything for years to come.

#20  Edited By navyguy21
Member since 2003 • 17425 Posts

@R4gn4r0k:

Done! Lol

#21 Howmakewood
Member since 2015 • 7702 Posts

3090 buy vindicated. Imagine having less than 20GB of GDDR6X, woosh.

#22 firedrakes
Member since 2004 • 4365 Posts

Flight Sim will eat as much VRAM as it can...

#23 deactivated-6092a2d005fba
Member since 2015 • 22663 Posts

Meh, let the master race fight among themselves while the real gaming gods (console players) just laugh at this :)

#24 Xplode_games
Member since 2011 • 2540 Posts

@Pedro said:

The dev should manage their memory better. Most of that RAM is just going to be cold data. Time to move away from that model.

What about this dev?

"My recommendation for 8GB GPUs is to use high textures in general and reduce them down to medium if you start seeing stutter at your chosen resolution should it be higher than 1440p."

[Embedded video]

#25 Xplode_games
Member since 2011 • 2540 Posts

@R4gn4r0k said:
@davillain- said:

8GB=1440p & 10GB=4K

Look, I get that 8GB of VRAM looks poor, but let's be honest, does anyone expect an 8GB card to be great for 4K? How many of us really game in 4K? I love Nvidia, but I'm not crazy enough to spend over a grand on a GPU I personally don't need. (I really want a 3090, but not at that premium lol)

Since I'll be staying at 1440p for a few more years, the 3070 is my pick for next year's upgrade.

Nah, I jest: the 3070 and 3080 are both good cards.

The 3070 is the pick for 1440p and the 3080 for 4K. Lowering textures from ultra to high to fit the 10GB budget won't make much of a difference :)

"My recommendation for 8GB GPUs is to use high textures in general and reduce them down to medium if you start seeing stutter at your chosen resolution should it be higher than 1440p."

[Embedded video]

#26 Xplode_games
Member since 2011 • 2540 Posts

@Vaasman said:

It's called shit optimization. If a game was properly utilizing 12gb of overhead it would also run at 5 fps because cards today aren't equipped to run anything that legitimately required that kind of VRAM use, especially with 6x. This is just a lazy tactic to entice people to wait for AMD cards, instead of coming up with any proper kind of software appeal for the cards.

"My recommendation for 8GB GPUs is to use high textures in general and reduce them down to medium if you start seeing stutter at your chosen resolution should it be higher than 1440p."

[Embedded video]

#27 Xplode_games
Member since 2011 • 2540 Posts

@BassMan said:

I call bullshit. This game is not very impressive. No need to be using that much VRAM. Looks like some shady tactics from AMD.

What about this game? Keep in mind, these are cross-gen games with ultra settings and RT. Imagine what true next-gen games will require in a couple of years. Anyone who bought a next-gen 8GB or 10GB GPU better have fun with massive compromises.

"My recommendation for 8GB GPUs is to use high textures in general and reduce them down to medium if you start seeing stutter at your chosen resolution should it be higher than 1440p."

[Embedded video]

#28  Edited By Vaasman
Member since 2008 • 15569 Posts

@Xplode_games said:
@Vaasman said:

It's called shit optimization. If a game was properly utilizing 12gb of overhead it would also run at 5 fps because cards today aren't equipped to run anything that legitimately required that kind of VRAM use, especially with 6x. This is just a lazy tactic to entice people to wait for AMD cards, instead of coming up with any proper kind of software appeal for the cards.

"My recommendation for 8GB GPUs is to use high textures in general and reduce them down to medium if you start seeing stutter at your chosen resolution should it be higher than 1440p."

A Ubisoft side-game is the rebuttal to the argument that this is a case of shit optimization? Okay, I guess...

By the way, you can quote multiple accounts in one post; there's no need to spam the board with the same video four times.

#29  Edited By speedytimsi
Member since 2003 • 1415 Posts

I know nobody who actually wants to play this game. I'll just watch someone play it.

Besides, do AMD fans think they can get this card at launch? Watch it sell out in like 0.1 seconds.

#30  Edited By BassMan  Online
Member since 2002 • 17806 Posts

@Xplode_games said:
@BassMan said:

I call bullshit. This game is not very impressive. No need to be using that much VRAM. Looks like some shady tactics from AMD.

What about this game? Keep in mind, these are cross gen games with ultra settings and RT. Imagine what true next gen games in a couple of years will require. Anyone who bought an next gen 8GB or 10GB GPU better have fun with massive compromises.

"My recommendation for 8GB GPUs is to use high textures in general and reduce them down to medium if you start seeing stutter at your chosen resolution should it be higher than 1440p."

It is not the 8GB which is in question, but 10GB. People who are buying a 3070 are not targeting 4K. There is nothing going on in Godfall to justify the 12GB VRAM. It is an AMD sponsored game. Just some shady tactics to try to generate interest for their 16GB cards. Like I said all along, my 3080 will do just fine until the 4080 comes out. If I have to reduce a setting or two to accommodate 10GB on the odd game, it is not a big deal. It will still be better than consoles anyway.

#31 Pedro
Member since 2002 • 69448 Posts

@BassMan said:

It is not the 8GB which is in question, but 10GB. People who are buying a 3070 are not targeting 4K. There is nothing going on in Godfall to justify the 12GB VRAM. It is an AMD sponsored game. Just some shady tactics to try to generate interest for their 16GB cards. Like I said all along, my 3080 will do just fine until the 4080 comes out. If I have to reduce a setting or two to accommodate 10GB on the odd game, it is not a big deal. It will still be better than consoles anyway.

Sounds like you are trying to justify not owning the best. 😎

#32 BassMan  Online
Member since 2002 • 17806 Posts

@Pedro said:
@BassMan said:

It is not the 8GB which is in question, but 10GB. People who are buying a 3070 are not targeting 4K. There is nothing going on in Godfall to justify the 12GB VRAM. It is an AMD sponsored game. Just some shady tactics to try to generate interest for their 16GB cards. Like I said all along, my 3080 will do just fine until the 4080 comes out. If I have to reduce a setting or two to accommodate 10GB on the odd game, it is not a big deal. It will still be better than consoles anyway.

Sounds like you are trying to justify not owning the best. 😎

#33 Pedro
Member since 2002 • 69448 Posts

@BassMan: You know it's true. You have become a console gamer. 😂

#34 BassMan  Online
Member since 2002 • 17806 Posts

@Pedro said:

@BassMan: You know it's true. You have become a console gamer. 😂

#35 Pedro
Member since 2002 • 69448 Posts

@BassMan:

#36 mazuiface
Member since 2016 • 1604 Posts

I can't wait for the 6800XT. It will pair nicely with a 5900X

#37  Edited By Gatygun
Member since 2010 • 2709 Posts

8gb and 10gb cards will be outdated fast when next gen arrives. Get used to lower settings just because of v-ram limitations.

16gb should be the minimum for any next gen card.

#38  Edited By Pedro
Member since 2002 • 69448 Posts

@Gatygun said:

8gb and 10gb cards will be outdated fast when next gen arrives. Get used to lower settings just because of v-ram limitations.

16gb should be the minimum for any next gen card.

Or developers can manage their memory better, instead of just dumping everything they can into memory and only actively using a portion of it.

#39  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Gatygun said:

8gb and 10gb cards will be outdated fast when next gen arrives. Get used to lower settings just because of v-ram limitations.

16gb should be the minimum for any next gen card.

But consoles will not have 16GB for VRAM; out of that 16GB, likely 5-6GB minimum goes to the OS, features, and the game-logic "cache". So expect only around 10GB of VRAM usage on the XSX, and the same ballpark for the PS5. And with DirectStorage, a larger VRAM buffer will mean less in the future. You don't need more than 128MB of memory to render a 4K image, hence RDNA 2 having Infinity Cache to bypass the 256-bit memory bus. VRAM and RAM usage is mostly preloading and prediction of what's to come, the "what if" data. With DirectStorage streaming graphical assets on the fly, or loading less of that "preloading and prediction", a lot of the supposed need for "16GB" for "next gen" goes away.
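
Rough numbers on that 128MB point, just to illustrate: a 4K RGBA8 render target is 3840 x 2160 x 4 bytes ≈ 33MB, so even a depth buffer plus a couple of G-buffer targets stays around 100-130MB, which is the kind of working set Infinity Cache is sized for. The multi-gigabyte VRAM figures come from textures, geometry and streaming pools sitting resident in memory, not from the final image itself.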

#40  Edited By R4gn4r0k
Member since 2004 • 46260 Posts

@Xplode_games: Awesome explanation by the DF guys!

However, Watch Dogs Legion is optimized like crap, so it's not a good example. Look at their comparison of CPUs, and look at their comparison of DX11 vs. DX12.

AMD has insane stutters, consoles run sub-30 fps, the game runs way worse in DX12... the list goes on. Watch Dogs Legion just runs really badly.

#42 R4gn4r0k
Member since 2004 • 46260 Posts

@Yams1980 said:

I think the refreshes and the newer lines of the 3000 GPUs are going to have 16GB of RAM. I'm glad these were sold out after all, so I can buy the better versions of them later. Wasn't there going to be a 16GB version of the 3070?

Apparently the 3080 with 20GB and the 3070 with 16GB are cancelled in favor of their current versions with half the RAM.

The 3080 Ti with 20GB is still upcoming, but it will be pricier for sure.

#43 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@Telekill said:

See... even PC players are always at each other's throats.

Master race my ass.

No, PC is survival of the fittest... When the companies fight, we all win, because things get cheaper and better products are made.

You console gamers aren't a survival-of-the-fittest bunch; you're an inclusive society held back by your 30FPS in-bred children.

#44 Gatygun
Member since 2010 • 2709 Posts

@Pedro said:
@Gatygun said:

8gb and 10gb cards will be outdated fast when next gen arrives. Get used to lower settings just because of v-ram limitations.

16gb should be the minimum for any next gen card.

Or developers can manage their memory better, instead of just dumping everything they can into memory and only actively using a portion of it.

It has nothing to do with managing it better and everything to do with consoles being more powerful: the baseline for games will rise as a result, and PC will go up with it. Which means more VRAM is needed, and more of everything else along with it.

@04dcarraher said:
@Gatygun said:

8gb and 10gb cards will be outdated fast when next gen arrives. Get used to lower settings just because of v-ram limitations.

16gb should be the minimum for any next gen card.

But consoles will not have 16GB for VRAM; out of that 16GB, likely 5-6GB minimum goes to the OS, features, and the game-logic "cache". So expect only around 10GB of VRAM usage on the XSX, and the same ballpark for the PS5. And with DirectStorage, a larger VRAM buffer will mean less in the future. You don't need more than 128MB of memory to render a 4K image, hence RDNA 2 having Infinity Cache to bypass the 256-bit memory bus. VRAM and RAM usage is mostly preloading and prediction of what's to come, the "what if" data. With DirectStorage streaming graphical assets on the fly, or loading less of that "preloading and prediction", a lot of the supposed need for "16GB" for "next gen" goes away.

PS4 = 2-3GB of VRAM allocation resulted in games recommending 4GB, with 2-3GB for low and 6GB for ultra settings.

Now replace that 2-3GB with the Xbox's 10GB, which will most likely be the limiting factor, and scale the other numbers accordingly. Yeah, 10GB suddenly doesn't look so good anymore, does it? Indeed.

There's a reason people wanted 24GB for the 3080 Ti (which was renamed the 3090) and 16GB for the cheaper models. Nvidia didn't do it because they're playing the VRAM game again, just like they did last generation with the 700 series of cards.

But with AMD moving to 16GB cards, Nvidia probably releasing higher-VRAM cards sooner rather than later, and next-gen games probably hitting late next year, the whole market will be riddled with those cards. That means 16GB will become the standard and devs will target it too.

The 128MB of cache can't store a screen's worth of texture data, mate. That's why you've got VRAM. It won't have any impact on capacity; it just reduces the stress on the memory bus, which you need if you want to push bigger textures through, since they require a lot more data to be moved. So no clue why you went down that path.

The reduction DirectStorage brings to VRAM demand isn't that interesting, because devs will simply increase scene complexity at the same time, like they always do; they will saturate that 10GB no matter what, because it's available. As a result, 10GB will always be in use on the Xbox Series X, and PC will always offer higher settings than the consoles deliver. And if PC can't keep up with DirectStorage, RAM requirements will just explode even more, because data has to be fed into RAM and VRAM and swapped around a lot more often.

Anyway, 10GB of VRAM simply won't age well.
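
Running that scaling with the rough numbers above (an extrapolation of the ratio, not a measurement): if roughly 2.5-3GB of console VRAM in the PS4 era ended up mapping to about 6GB for ultra settings on PC, that's around a 2x factor; apply the same factor to a ~10GB console graphics pool and you land somewhere around 20GB for equivalent ultra settings on PC.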

#45 NoodleFighter
Member since 2011 • 11793 Posts

Looks like the GPU wars are back baby!

#46  Edited By DaVillain  Moderator
Member since 2014 • 56088 Posts

@R4gn4r0k said:
@Yams1980 said:

i think the refreshes and the newer lines of the 3000 gpus are going to have 16gb of ram. Im glad these were sold out after all and can buy the better versions of them later. Wasn't there going to be a 16 gb version of the 3070?

Apparently the 3080 with 20gb and 3070 with 16gb are cancelled in favor of their current versions with half the ram.

The 3080ti with 20GB is still upcoming, but will be pricier for sure.

So far, I've seen a rumor of the RTX 3080 Ti at $999, and if that's true (if it is true, of course), the RX 6900XT will be out of business in no time lol.

All in all, Nvidia doesn't like being second rate whenever AMD gets an edge on the competition.

#47 R4gn4r0k
Member since 2004 • 46260 Posts

@davillain- said:

So far, I've seen a rumor of the RTX 3080 Ti at $999, and if that's true (if it is true, of course), the RX 6900XT will be out of business in no time lol.

All in all, Nvidia doesn't like being second rate whenever AMD gets an edge on the competition.

I used to spend only around ~400 euro on new graphics cards; my 1080 Ti was more than that at 800 euro.

A 999 euro card is even more than that, the price of a full desktop PC at that point, but I'd actually be willing to spend that much if performance is in line with what I'm expecting, and if Nvidia can magically keep it around the same heat and power draw as the RTX 3080.

#48 Pedro
Member since 2002 • 69448 Posts

@Gatygun: It has everything to do with memory management. You factually don't actively process all, or even 25% (most likely less), of the data sitting in VRAM at any given moment. Most of the data sits there until needed, and depending on what the player does, it may never be used. We have the technology to reduce the memory footprint by using more intelligent memory management algorithms and software. The current model is simply inefficient. VRAM needs to be actively used rather than static.
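
To make that concrete, here is a minimal sketch of the kind of residency management I mean: keep only recently-touched textures resident and evict cold ones to stay under a budget, instead of parking everything in VRAM up front. All the names and numbers here are made up for illustration, not from any actual engine:

```cpp
// Minimal LRU texture-residency cache: textures touched recently stay
// "in VRAM"; cold ones are evicted once the budget would be exceeded.
#include <cstdint>
#include <iostream>
#include <list>
#include <unordered_map>
#include <utility>

using TextureId = std::uint64_t;

class TextureResidencyCache {
public:
    explicit TextureResidencyCache(std::size_t vramBudgetBytes)
        : budget_(vramBudgetBytes) {}

    // Call whenever a texture is needed this frame; evicts the
    // least-recently-used textures if the budget would be exceeded.
    void touch(TextureId id, std::size_t sizeBytes) {
        auto it = index_.find(id);
        if (it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);  // already resident: move to front
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {
            const auto& victim = lru_.back();              // coldest entry
            used_ -= victim.second;
            std::cout << "evict texture " << victim.first << "\n";
            index_.erase(victim.first);
            lru_.pop_back();
        }
        lru_.emplace_front(id, sizeBytes);
        index_[id] = lru_.begin();
        used_ += sizeBytes;
        std::cout << "resident texture " << id
                  << " (total " << used_ / (1 << 20) << " MB)\n";
    }

private:
    using List = std::list<std::pair<TextureId, std::size_t>>;  // front = hottest
    std::size_t budget_;
    std::size_t used_ = 0;
    List lru_;
    std::unordered_map<TextureId, List::iterator> index_;  // id -> position in lru_
};

int main() {
    TextureResidencyCache cache(8ull << 30);  // pretend 8 GB VRAM budget
    cache.touch(1, 3ull << 30);               // 3 GB of textures
    cache.touch(2, 4ull << 30);               // 4 GB more
    cache.touch(3, 2ull << 30);               // pushes over budget -> texture 1 evicted
}
```

A real engine would stream individual mip levels rather than whole textures, but the point stands: the footprint tracks what the camera actually needs, not everything the level might ever show.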

#49 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

If the charts they used for the new Navi GPUs are anywhere in line with the 5000 series CPUs... then prepare for disappointment.

They promised the BEST gaming performance, and yet Intel is still beating them; the 5600X and 5800X get outperformed by a 10600K in gaming.

I told people not to listen to AMD when it comes to performance hype.

Also, the benchmark they used for Doom Eternal had the RTX 3080 scoring 20 FPS lower at 4K than all the reviews.

AMD is doing an AMD again, forcing developers to use more VRAM just for the f*** of it. I hope Nvidia locks AMD out of RTX-developed games.

All they do is hype and lie; no wonder they're in the consoles.

#50 Gatygun
Member since 2010 • 2709 Posts

AMD CPUs always perform like rollercoasters: in some games they're on par, in others they drop off a lot. Same for their GPUs.

However, the 16GB of memory will make the 6800 XT a far better investment than any of the 3080s currently on the market.