@scatteh316: It's still a decent GPU today by every single measure. At 1080p it holds up fine in every game that comes out. We're at a period of stagnation since the PS4/Xbox One's GPUs are so weak. Developers aren't really pushing the baseline visual quality higher due to the fact that the consoles simply can't run it.
Wasdie's forum posts
@Wasdie: no, he actually meant that this new technology doesn't improve graphics very much yet is very expensive, and some PC gamers are hyping it despite this.
Well, the new technology does improve graphics quite a bit, if you're dumb enough to build a game specifically for that hardware with Nvidia's proprietary system.
Beyond that it's not on hardware companies to make better graphics, but rather to just build more powerful hardware. Hardware companies don't write the software that actually produces the image.
Hardware companies are in a bind. Yes, there is specific hardware that *can* be used to aid graphics programmers, but it's often built for one narrow task and isn't backwards compatible with older cards, so the vast majority of graphics programmers won't use it; they don't want to ship a renderer that only works on the absolute latest GPUs. Despite this, Nvidia still pushes forward with some of these things, if only to try to steer the future direction of graphics rendering. Realtime ray tracing is better than rasterisation from a pure image quality perspective. That's not really debatable. It's just that its performance is abysmal compared to rasterisation, which can get you about 90% of the way to ray tracing in terms of image quality.
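To make the cost argument concrete, here's a hypothetical minimal ray caster (a sketch I'm adding for illustration, not anything from these posts; the camera setup and numbers are made up). Even this toy version has to do an intersection test per pixel per object, which is why naive ray tracing cost scales with pixels × scene complexity, while rasterisation projects each object once and only fills the pixels it covers:

```python
# Toy ray caster: one sphere, one ray per pixel, no bounces or shading.
# Cost is O(pixels * objects) even in this trivial case; a real scene with
# millions of triangles and multiple bounces per ray is far worse.
import math

def trace(width, height, sphere_center, sphere_radius):
    """Return a 2D grid: 1 where the pixel's ray hits the sphere, else 0."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Camera at the origin looking down -z; map pixel to a ray direction.
            dx = (x + 0.5) / width * 2 - 1
            dy = 1 - (y + 0.5) / height * 2
            d = (dx, dy, -1.0)
            # Ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
            oc = tuple(-c for c in sphere_center)  # origin (0,0,0) minus center
            a = sum(di * di for di in d)
            b = 2 * sum(oci * di for oci, di in zip(oc, d))
            c = sum(oci * oci for oci in oc) - sphere_radius ** 2
            disc = b * b - 4 * a * c
            hit = disc >= 0 and (-b - math.sqrt(disc)) / (2 * a) > 0
            row.append(1 if hit else 0)
        image.append(row)
    return image

# A 9x9 render of a sphere straight ahead of the camera: the center pixel
# hits it, the corner pixels miss.
img = trace(9, 9, sphere_center=(0.0, 0.0, -3.0), sphere_radius=1.0)
```

RTX-style hardware accelerates exactly the inner loop here (ray/scene intersection), which is why it's fast for this one job and useless for anything else.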
The new cards are expensive mainly because they can be, thanks to crypto miners, but they are going to be significantly more powerful than the current run of Nvidia high-end cards. So I still don't get what his point is. Everybody considers these really expensive; nobody is disagreeing with that. How does this "expose" PC gamers?
I'm trying to get your logic here. PC gamers are exposed because graphics cards are expensive? Were you expecting a hardware company to change gaming completely? There's already nothing the consoles do hardware-wise that PCs hadn't been doing for years before the consoles' release. Are you harboring some notion that the next generation of consoles is going to be some massive innovation? Are you really expecting $400-$500 hardware to somehow be leaps and bounds more powerful than the top-of-the-line PC stuff that's impractical to put in consoles?
What's your line of thinking here?
No game is ever going to use Nvidia's proprietary ray tracing unless it's implemented as an alternative rendering API to DX or Vulkan. Even in that case, I would bet it would be sponsored by Nvidia.
Put it this way: until there is a graphics rendering API that isn't tied to one brand of graphics card, until raytracing-specific hardware that said API can access exists across Nvidia, AMD, and Intel GPUs (Intel only makes CPUs now, but is working on a GPU), and until the consoles have those GPUs, we will NOT see a major game with raytracing options. We may see very small indie titles sponsored by Nvidia, or little teams doing it purely as a proof of concept, but raytracing isn't realistic for at least 20 years.
Anybody arguing anything about raytracing needs to come back to reality. PCs aren't going to be using this tech for a decade at minimum, and no developer is going to waste time giving their multiplatform game a renderer that two-thirds of the playerbase can't even use (probably even fewer within the remaining PC gamer pool).
It's super cool tech, but that's about it. It's not practical in the slightest.
They haven't added any bullshit to Windows 10 that prevents gaming, and they've kept DirectX modern. I say they do enough.