Benchmarks start at 3:37.
The benchmark for RDR2 should be good.
In case you didn't know, someone already leaked that RDR2 is coming to PC at a later date, after the console version launches, so we'll gladly see those benchmarks indeed. I look forward to our discussion when RDR2 releases on PC, Swag :P
Hermits keep waiting for games; meanwhile, I'll be enjoying two GOTY contenders in back-to-back months on the PS4 Pro.
We do need beta testers after all, so you'll do nicely :)
(Here's my own opinion, so take it with a grain of salt.)
The RTX lineup will only get around 50% better performance in ray-tracing-enabled games... which is whatever in my opinion anyway, because my GTX 1080 Ti should, in theory, be able to run ray tracing at 1280x720 at acceptable framerates, going off the graphs I saw comparing performance at 1080p.
These so-called RTX cards are, put simply, a rip-off! Nvidia has been taking a mile every time the consumer gives them an inch, and after taking several miles over the last few years, they've decided to take the entire metric system this time, and people are going to let them. If they get away with this BS, man, we will TRULY be screwed from here on out. Have some pride as a consumer, that's all I've got to say! If nobody buys these cards, they'll rethink how they price their products, which will benefit not only us but also AMD, giving us a double return on the benefits.
If everybody goes out and buys these cards, don't be surprised when AMD decides to pull the same BS, and then we're forever locked into $1,000+ flagship cards every year from here on out, from BOTH companies. In normal DX12 mode, they're going to get at most around 20% better performance (and that's in Nvidia's own controlled environment), unless they gimp Pascal temporarily for the launch window. They did this to Kepler when Maxwell launched, so it wouldn't surprise me; especially with how controversial these cards are, be on the lookout for fishy business like that.
I'm sure Nvidia does, but I'm not sure I've experienced any gimping. My Aorus CF1 965M SLI still runs just about everything at max settings, 1080p/60fps. It will need to be replaced when the PS5 lands, no doubt.
Still not a flawless 4K 60fps card. Almost every game had some really bad FPS lows. Gonna have to wait for the 2180 Ti, I guess.
The card couldn't even run The Witcher 3 at 4K 60fps; that's pretty laughable for a $1,200 USD card.
Thing is, this is a $1,200 card, and it's failing to hit 60fps in games three to four years old. In any newer game it's going to fall even further short of 4K 60fps. Nvidia has really failed here. At most, the thing is worth $600. Ray tracing is a completely useless tech they've thrown in here; you'll never be able to play a game with it enabled unless you like 30fps gameplay.
And yet the rabid Nvidia fanboys will still buy it, not the most intellectual types.
I see you're still on the PS4 train this week... LOL!
30 FPS 1080p lol
Indeed. Ray tracing reminds me of consoles. Keep pushing for graphics that are beyond the capabilities of the hardware. LOL
Not gonna lie, this is true, and it legitimately made me laugh. I never thought Nvidia would be trying to advocate customers playing at 1080p/30fps in 2018.
I don't actually disagree with you, but the flagship cards, the Titans, have been over $1,000 for several years now. The RTX 2080 Ti is just a different way for Nvidia to brand and sell their cards. Before the 20xx cards, Titans came out (roughly speaking) alongside the xx80 GTX, and the Ti version would be a refresh of the xx80 GTX card the following year or so; they were never released at the same time.
We've also been getting a 20%–25% increase from new cards each generation; that's pretty standard. It's also the reason annual upgrades and single-generation jumps are pointless.
PC is up by over 100 games on PS4 this gen, counting only high-scoring titles.
Those are pretty impressive numbers. I won't be buying a Ti edition, but I'll definitely be purchasing a 2080.
Taking these all with a grain of salt, however, as they are leaked numbers.
I'm not even sure why the initial post was considered offensive in the first place. Because we might not get RDR2?
So?
@GarGx1: Even when viewing the RTX 2080 Ti as a rebranded Titan, it's still strange to have a Titan/Ti card at launch. I think it's an indicator that the 20 series will be short-lived and we'll be seeing new 7nm cards before long. Also, regardless of whether it's a Titan or a Ti, it's still a rip-off.
Need to take these leaked benchmarks with a grain of salt until we get third-party and user benchmarks. Don't trust anything.
I'm waiting for legitimate benchmarks. At the price point these cards are at, they seem like a giant waste of money. I think these new RT cores were added too soon, and the performance isn't sufficient for the cost. I'm still heavily leaning toward the 2080, should it prove decent enough over the 1080 Ti... there's about a $100–150 difference in Canada.
NVIDIA's TU102 has separate Tensor and RT cores alongside the CUDA cores and classic GPU hardware, i.e. it's not 754 mm² worth of GTX 1080 Ti (471 mm², 16 nm) GPU.
GTA V on PC was phenomenal. It would be worth the wait. Meanwhile, my backlog awaits!
@04dcarraher: They come out in 12 days. People can’t wait?
Nope
People have been itching for a card that can do 4K 60fps consistently. The 1080 Ti can manage it in many games, but definitely not all. I think that's why people are anticipating these and snatching up pre-orders so quickly.
What does AMD have in the oven hermits?
AMD has had their ray tracing solution since long before Nvidia had RTX...
Only AMD's uses GPU compute, and it will be more widely adopted than Nvidia's locked implementation.
Minimum frame rates are still well below 60fps, so it can't provide a locked 60fps at 4K.
So any Hermit claiming 4K@60fps is either lying or using console-quality settings...lmao!
I noticed that; they would need 1080 Ti SLI to not be lying.
So much for 4K 60fps at max settings like virtually every hermit claims on this board. Gotta bookmark this for future lying hermits. Even a 1080 Ti runs The Witcher 3 in the 40s at 4K before reducing settings, so imagine running mods, LMAO. Not happening.
A lot of Hermits have been outed: "bu but I play da Witcher 3 ultra settings 4k 60fps with mods n that looks better than any console exclusive!"
Apparently we know that's a lie going by the specs people have posted lately. Any hermits here running a 1080 Ti SLI setup?
Y'all are either not playing in 4K, playing at reduced graphical settings, or playing a slideshow just to take the highest-res pics possible for SW arguments. Pathetic.
I only know of one and that's Juub.
99% of hermits are full of shit... all of them use performance scores from hardware they don't personally own to try and own consoles, because they know the hardware they actually own is shit.
You know all the memes poking fun at PC gamers for being fat, ugly nerds?
The guy in this video is just proving those memes are accurate.
Wow, you guys truly are pathetic. People have way better stuff than you and all you jealous fucks can do is hate while playing on shit consoles. Sad.
Another arrogant Hermit thinking he has better than everyone else and that no console owner can afford what he or others have.
Truly a pathetic mentality.
I never claimed to have better than everybody else, but I do know what I have is good and delivers quality gaming. The same can't be said for consoles.
When someone has something better than you, you just look like an idiot when you try to put them down. You're like a guy in a Toyota Tercel making fun of a Ferrari 458 owner because he doesn't have a LaFerrari. It's ridiculous.
And who have I put down, you clown?
99% of hermits on this board are just flat-out full of shit when it comes to the performance they claim to be getting... One such hermit spends all his time raging about 4K and 60fps on PC, and yet he only has a GTX 970... smh.
Putting them down? Too right; if a noob comes out with bullshit numbers, too right I'll put him down.
And consoles don't deliver quality gaming?
So why have you whined like a little girl, throwing your dummy out of the pram, because you can't get certain console exclusives on PC?
Why would you want something that's not quality on PC?
And your PC will deliver a decent experience, but it won't deliver a locked 60fps at max settings in every game at your native resolution.
Not everyone can afford a gaming PC, but what certain people need to get into their thick skulls is that not everyone wants one even if they could afford it.
Did he say $1,200???
Dude, hermits are boasting about how MHW is best on PC because of 4K and graphics, and now to get a stable 60fps you need a $1,200 GPU alone????
For fuck's sake, how important are graphics to you????
Sorry to disappoint you, but I'm not interested in gaming in 4K anytime soon. While 4K is totally nice and everything, I hate losing performance just for raw graphics. I game at 1440p, and 1440p is still the gold standard in PC gaming.