So many hermits, so much...
And hardly any hermits will buy them lol.
Most people are still gaming at 1080p. 8K is so useless for PC gaming. I'm glad 4K is now extremely viable.
@rmpumper: Yeah, I've been reading the 3070 is getting 2080 Ti performance. TBH, I don't trust Nvidia's numbers themselves so I may still have to wait for 3rd party benchmarks.
Those should appear soon, but no word on when the embargo will end.
Anytime soon, yeah. September 17 can't come soon enough. Then again, I live in a 3rd world country, so it may take longer lol
The demand for GPUs keeps increasing each year.
Yeah, the more they make, the cheaper the 1-year-"old" versions get lol.
I'd say only 1% of hermits here buy the latest, the rest dream about it.
Have to agree with you on this. I personally own a 1050 Ti 4GB GPU; I can only dream about owning a 3070 atm.
It won't stop the "hardcore" hermits from making dumb threads about consoles being obsolete, *checks the front page of SW* lol.
Lying doesn't help one in the end, except for winning an ego war.
@R4gn4r0k: it's all DLSS, like it used to be
Nope, check the DF video, it's in the OP.
He clearly tests 4K/60 performance, DLSS performance and RTX performance separately.
Huge gains in all of them.
I will still wait till third parties check it.
There is room for both console and PC gaming. I'll be buying both a 3000 series card and a next gen console.
It's mobile trash that needs to go.
That'd be the smart thing to do.
I have no idea when the embargo will lift, but it should be in the next couple of weeks ;)
Also not mentioned in the video were board partner power draws.
Atm I don't need the 3000 series.
I also have to factor in my UPS setups; past a certain wattage, the cost quadruples.
@firedrakes: Yeah, I want to know how noisy the 3080 gets, seeing as it draws a lot more power than the RTX 2000 series.
What do you mean by board partner power draws?
The power draw Nvidia mentioned was for their own cards. We've seen how board manufacturers go with higher power draws on some cards and lower on others. Steve even mentioned this in his most recent video.
And streaming.
@NathanDrakeSwag: Well, it's a good thing I sold my RTX 2080 Ti before the announcement, got $850 for it. Well worth it
PC VR will benefit the most out of this. We can finally get rid of the screen-door effect with a high resolution, and have a wide FOV, etc... but Oculus can go die from now on.
The HP Reverb G2 has landed at a very good time. Perfect headset for sit-down VR, which has recently gained a massive boost thanks to the new flight sim, and will hopefully be compatible soon.
Can't think of a better time to be in the market for a new PC, tbh.
You'll be able to build a complete PC that's quicker than a current 2080 Ti machine, with the benefits of 2nd gen RT & DLSS, for $1100. Not bad.
@netracing: I'd love to play VR, but I'm not wasting over a grand on a few games.
I'll stick to traditional gaming, thanks!
And that's fine, but why wish for others to no longer have the option to enjoy it?
Well, if VR didn't exist, then the newest Medal of Honor or Half-Life could've been a traditional game.
That's all theoretical though, as those games might as well not have been made if VR didn't exist.
I'm fine with people enjoying PC, console or VR gaming.
You mean just like the Xbox One X being used as a "meta" console when it comes to performance, yet MS sold like what, 1 million units?
LMAO
8GB is bad for a $500 card,
10GB is bad for an $800 card.
The 3090 seems like a solid card, but at $1500 it's laughable.
I think a lot of people will go team red next generation, or Nvidia will introduce new cards that drop the prices. 16GB or lower VRAM = laughable at this point for next gen cards.
You do realise that both consoles come with 16GB unified RAM, which uses 4-6GB just for the OS... Those consoles have 9-10GB actual VRAM at their disposal.
Memory bandwidth and actual GPU performance matter FAR more now than ever before. We had that spike in VRAM usage because developers were transitioning to higher resolution textures for 4K; it's going to plateau now, and pretty much has to when you think about it.
The same thing happened when we transitioned to HD textures: we went from 128-512MB VRAM to 1-3GB VRAM... Then for 4K we went from 1-3GB VRAM to 4-8GB VRAM.
Since they are working on DLSS and the RTX 3070 is targeting 1440p-4K, 8GB is more than enough for the next 3 years.
The bandwidth and raw power were what needed the boost for those resolutions, along with RT cores.
Nvidia has hit it out of the park; the 3080 at 10GB with up to 2x the power of a 2080... Yes please.
Navi (RDNA) as an architecture cannot compete with Turing, let alone Ampere, so I wouldn't hold your breath on that statement... They would need a 100 CU GPU to touch a 3080, as the XSX is 52 CUs and hits close to 2080 performance.
100 CUs?... Prepare for an AIO dual-GPU type of monster that will overheat if a warm fart is let out in the same room. AMD is DOA.
Memory bandwidth means nothing when you run out of it. Also, why do you think it's going to stop at 4K textures? We will be sitting at 8K and then 16K, exactly what Nvidia is talking about. 4K isn't a magical number, and the 3070 and 3080 will fall hopelessly behind. But even at 4K, with the ray tracing Nvidia does, which consumes large amounts of VRAM on its own, the cards are nothing but supercharged current generation game cards.
They are very much like the 580: an absolute beast for PS3-era games, couldn't play shit in the PS4 era.
VRAM consumption will go up, and PC settings will go up on top of it. And the 10GB of the 3080 and 8GB of the 3070 will be hopelessly behind.
These cards are budget cards and simply not remotely ready for the future. A VRAM limit = the worst bottleneck a GPU can have.
@R4gn4r0k: It's rather difficult to choose the right one that'll last a few years. I think it's best to wait a year just to see what happens, since PS5 and Xbox Series will make great advancements in graphics, considering the new Unreal Engine 5 is coming out soon. So yeah, I would wait till that comes out; there may be a big step up because that engine is so powerful in terms of graphical advancement. Just my opinion and observations, what do you think?
Well, let's say that UE5 games started development this year.
That means we'll see the first real UE5 game in 2022-2023 at the earliest.
A lot of next gen games will be using the UE4 engine, and they look good. Look at Observer: System Redux for example :)
You're really hurt by this, aren't you?
@R4gn4r0k: I agree, it will take 2-3 years for developers to create a game using this engine. Is it worth going all out on RTX right now? Idk, I suppose it depends on each individual's situation (financial and whatnot), but maybe it's best going for the best value for money that'll be competent enough to give you a good gaming experience for the next 2-3 years, and then splashing out on a top of the line card. As I said, it depends on your financial situation; if you have the money, no one can stop you from getting the best one right now.
Hey @Grey_Eyed_Elf, don't you just love it when someone who clearly never had a high-end rig talks to someone who actually does have one?
e.g. "I don't care that you've had a high-end GPU under daily use for months; all I have is the low-end junk you had 6 years ago, but I'll still tell you how your GPU is supposed to perform anyway, because a random youtuber said so."
Lol, "hey guys, I can run Minecraft in 16K at 250 gigaflops", I'm cool lol.
@elitegaming313: Agreed, I mean does PC have that killer app right now?
Aside from Cyberpunk, there's Watch Dogs or Valhalla you could play at 4K...
But for me personally a killer app would be Battlefield... But that's coming next year.
Any killer app for you?
Game changing. Makes me want to get back into PC gaming. Gonna be either the RTX 3080 or the RTX 3090. $1500 isn't that much for Titan-destroying performance.
If it can run PS5 games in 8K/60fps... holy shit.
Lol, that's not happening. The 3080 is maybe 20% slower than the 3090, and it's already struggling to maintain 60 at 4K max settings in current games.
@R4gn4r0k: check this out, mate, even Cory Barlog (God of War director) agrees that Nvidia is the best
twice the performance for less? dammit, take my money. https://t.co/LyDQRiax1n
— Cory Barlog 🖖 (@corybarlog) September 1, 2020
There's also evidence that the 3070 Ti is on the way: Multiple leaks suggest Nvidia RTX 3070 Ti with 16GB GDDR6 memory incoming. But still, I'll just buy the 3070, as it's a huge upgrade over my 2070, and faster than a 2080 Ti for only $500 is impressive. The Ti can take a hike, cause I don't have the time to wait a year for it.
Think it’s finally time to upgrade from the 980. That 3070 is looking pretty sweet. I’m gonna wait to see how AMD responds first though.
Yes yes, Jensen said it's now safe to upgrade, and going from a 980 to a 3070 is a huge leap in performance for the price it's packing. I'll also be getting the 3070 myself; it's more than I'll need to game at 1440p/144Hz.
I ain't gonna buy it; first I need to get a new monitor, like the new 360Hz ones... oh yeah
https://www.engadget.com/nvidia-predator-x25-360-hz-gaming-monitor-180029258.html
It's good to have a state of the industry where devs are free to speak out instead of being silenced by industry NDAs or something.
If the benchmarks prove that the 3080 is really 180% of a 2080 in most games (not just DLSS/RTX games), then I'm interested in buying one.
If not, then I'll keep waiting for a 3080 Ti or 3080 Super. But so far it looks good.
I'm also wondering how silent the card can be with that much power draw.
@rmpumper: The RTX 2080 can run almost anything in 4K and get around 50 fps on average, sometimes 4K/60, sometimes only 40 fps, so an average of ≈50. An RTX 3080 with 2x the performance would be 4K/100fps on average.
The RTX 3090 is significantly more powerful than that... which means 8K/60 with DLSS should work fine.
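The back-of-the-envelope math above can be sketched in a few lines. Note this assumes performance scales linearly into average fps, which real benchmarks rarely bear out exactly, and the baseline numbers are the commenter's estimates, not measurements:

```python
def scaled_fps(baseline_fps: float, perf_multiplier: float) -> float:
    """Estimate average fps on a faster card by linearly scaling a
    baseline average (a rough assumption, not a benchmark result)."""
    return baseline_fps * perf_multiplier

# RTX 2080 averaging ~50 fps at 4K, with a claimed 2x uplift:
print(scaled_fps(50.0, 2.0))  # 100.0
```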