Why do consoles and game producers on consoles worry about improving resolution, rather than say graphical fidelity?


#101 EG101
Member since 2007 • 2091 Posts

@jv303 said:

Because they've seen the online gaming community obsess over it for the last 4 years.

Pretty much this^^ except it's been the last 12 years. It pretty much started with MS saying all games would be 720p on the 360, and continued with Sony's PS3 launch hype. All of which, of course, was BS.


#102 Epak_
Member since 2004 • 11911 Posts

@gamingpcgod: 4K just happens to be the thing right now. I never thought I'd get to play anything in 4K this gen; it seemed like a very unrealistic jump for consoles, but here we are. If someone should be blamed, blame the TV manufacturers. Personally I'd like to have a 1440p/60fps option in every Xbox One X game.


#103 Wizard
Member since 2015 • 940 Posts

@gamingpcgod: Or maybe you're just incorrect. DSR is widely available to anybody with an Nvidia card, so we can all witness it ourselves. You're certainly the first person I've ever seen try to equate 4K with supersampling. If you need "good eyes" to see the difference, then it certainly isn't as apparent as 4K, where literally everybody should be able to tell.


#104 GamingPCGod
Member since 2015 • 132 Posts

@wizard: Let me put it this way.

Gaming on a 15.6 inch display at 1080p is the equivalent of gaming on a 31.2 inch at 4K. However, even the smallest 4K TVs are still north of 40 inches. If you have a 50 inch TV at 4K, it would take only a 25 inch 1080p display to match its quality.


#105 Nonstop-Madness
Member since 2008 • 12304 Posts

Marketing/hardware side -> The market is moving towards 4K, so it makes sense for consoles (and games) to support 4K as well.

Software side -> They need to make software that takes advantage of the "4K" hardware.


#106 AdobeArtist  Moderator
Member since 2006 • 25184 Posts

It's easy to market tech products on a single number that promises a simple metric of performance; just as 1080p was once the big label to put on TVs, 4K is the popular buzz term today. You can draw a similar comparison to how PCs are sold: GHz is a simple metric to inform consumers of a computer's performance, along with core count and memory. Higher is better, though GHz by itself can be misleading, as many consumers are unaware of IPC or of other features like L2 and L3 cache.

Although frame rate should also be an easy number to market, resolution is more closely tied to graphical fidelity, which is immediately obvious even at a quick glance, so by perception it's the more desirable trait to advertise than performance. And as developers push as much graphics as possible out of the limited hardware in consoles, 60fps is often unattainable, so there's less reason to market it.

All the other graphics settings, like textures, level of detail, shadows, lighting, ambient occlusion, anti-aliasing, particles, vegetation, water effects, hair effects, screen-space reflections, use basic descriptive terms: Low, Med, High, Ultra (or sometimes as simple as On or Off). While these provide a scale of quality, they're vague to the uninitiated as to how it all rates. They're far more general and less precise about how a game looks, and many people don't even understand what some of the terms, such as occlusion and particles, do. A descriptive scale of Med, High, Ultra doesn't translate intuitively to a numeric scale when there's no definitive criterion for the differences between tiers, which is compounded by there being no universal measure of any of this. It's a sliding standard: each game has its own scale, and it shifts between generations.

Basically, it's all "loose and fuzzy" as a means of defining visual performance, so marketing always goes back to the easy-to-gauge numbers that are easy to sell.
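The GHz-vs-IPC point above can be illustrated with back-of-the-envelope arithmetic. A minimal sketch with made-up chip numbers (the function and both chips are hypothetical):

```python
# Raw clock speed alone is a misleading metric: a rough single-number
# estimate of throughput is cores x clock x instructions-per-clock (IPC).
def relative_perf(cores, ghz, ipc):
    """Crude relative-performance score (billions of instructions/second)."""
    return cores * ghz * ipc

chip_a = relative_perf(cores=4, ghz=3.8, ipc=1.0)  # high clock, low IPC
chip_b = relative_perf(cores=4, ghz=3.0, ipc=1.5)  # lower clock, higher IPC

print(chip_b > chip_a)  # True: the "slower" chip on the box wins
```

The chip with the bigger GHz number on the box loses, which is exactly why clock speed alone makes for easy but misleading marketing.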


#108 cyberpunk_2077
Member since 2015 • 627 Posts

@gamingpcgod said:

Let me put it this way.

Gaming on a 15.6 inch display at 1080p is the equivalent of gaming on a 31.2 inch at 4K. However, even the smallest 4K TVs are still north of 40 inches. If you have a 50 inch TV at 4K, it would take only a 25 inch 1080p display to match its quality.

@gamingpcgod: Could you please elaborate on how you worked this out?


#109 cainetao11
Member since 2006 • 38036 Posts

Because they all react to the gaming community. It was the gaming community, spurred on by the gaming media, that made resolution differences of 900p vs. 1080p the be-all and end-all. And let's not forget Sony themselves claiming native 1080p makes us better gamers.

They can't react instantly. Sony and MS couldn't get a more powerful console out a week after the resolution clickbait started in 2014; it takes time to R&D and build a console, and the same goes for the games being made. Let's not forget it was gamers, and the ongoing "mine is better than yours because (insert latest clickbait reason)", that deserve a large part of the blame.


#110 Wizard
Member since 2015 • 940 Posts

@gamingpcgod: You are so sadly misinformed. Pixel density =/= detail. You are talking exactly like a console gamer.

Resolution, by definition, is a measure of detail: of what can be distinguished. At 4K you are using 4x the number of pixels to represent an image versus 1080p. 1080p does NOT display more detail by shrinking or enlarging the screen; that's asinine. In fact, it's recommended to get larger 4K displays, because on smaller ones the detail is lost to the "resolution" of the human eye, which cannot distinguish the finer detail 4K provides. Your shitty 15.6-inch laptop display does not provide the same visual quality as a 40-inch 4K display, let alone a 75-inch one; that's delusional. Due to the smaller pixel size it can be sharper, but not much else.

@cyberpunk_2077 It's just flat-out misinformation. I didn't do the math myself, but I assume he's dividing the screen area by the horizontal and vertical resolutions to get a superior pixel density and passing that off as superior resolution.


#111 cyberpunk_2077
Member since 2015 • 627 Posts

@wizard: I think I'm being polite in trying to actually apply what he's saying with reference to resolution, supersampling, etc.

Hopefully I'll see what he means.


#112 APiranhaAteMyVa
Member since 2011 • 4160 Posts

That is why Sony's world-class engineers came together and created the vastly superior checkerboard 4K, so that you get all the benefits of 4K without the high resource demand.

Allowing them to sell a console for $100 less than the competition while being 110% as good.
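Sarcasm aside, checkerboard rendering is a real technique: render only half the pixels each frame in a checkerboard pattern and reconstruct the rest. A toy sketch of the spatial half of the idea (real implementations also reuse the previous frame plus motion vectors; `checkerboard_fill` is a hypothetical name):

```python
# Fill the unrendered (None) pixels of a checkerboard frame by averaging
# the rendered horizontal neighbours. Grayscale values, list of rows.
def checkerboard_fill(rendered):
    """rendered[y][x] is None where the pixel was skipped this frame."""
    h, w = len(rendered), len(rendered[0])
    out = [row[:] for row in rendered]
    for y in range(h):
        for x in range(w):
            if out[y][x] is None:
                neighbours = [rendered[y][x + dx]
                              for dx in (-1, 1)
                              if 0 <= x + dx < w and rendered[y][x + dx] is not None]
                out[y][x] = sum(neighbours) / len(neighbours)
    return out

frame = [[100, None, 100, None],
         [None, 100, None, 100]]
print(checkerboard_fill(frame))  # every hole becomes 100.0
```

Only half the pixels are shaded per frame, which is where the "benefits of 4K without the resource demand" pitch comes from; the reconstruction is a guess, which is why it isn't native 4K.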


#113  Edited By Sam3231
Member since 2008 • 2954 Posts

@gamecubepad said:

@gamingpcgod:

Even with GTX 1060 6GB and RX 480 8GB OC'd they will hover around 45-55fps with drops into the low 30s on FM Apex, 4k/60fps, Ultra, 4xAA, wet track, full grid.

I realize you're likely just a fakeboy/troll alt, but I shouldn't have to do all the leg work here. RX 480/GTX 1060 both OC'd and failing miserably at Forza 4k/Ultra, wet track, full grid. X1X was running 60+ while these GPUs drop to the 30s.

FM Apex 4K/60fps, Ultra, 4xAA, wet track, full grid:

RX 480 8GB
RX 480 8GB
GTX 1060 6GB
GTX 1060 6GB

I tested my computer against the game at 4K max settings too. It averaged probably about 25 FPS. I have a stock GTX 970 4GB, which is supposedly fairly close to these cards, though obviously with less VRAM; iirc I wasn't using all of it anyway. I don't have a deep understanding of components, but it seemed like the GPU was the bottleneck: it was running at full load constantly according to EVGA Precision, whereas I was using about 7.2 of 8GB of RAM and my processor, a stock Core i5-2400, was at about 80% load. On Windows 10.


#114 GamingPCGod
Member since 2015 • 132 Posts

@cyberpunk_2077 said:
@gamingpcgod said:

Let me put it this way.

Gaming on a 15.6 inch display at 1080p is the equivalent of gaming on a 31.2 inch at 4K. However, even the smallest 4K TVs are still north of 40 inches. If you have a 50 inch TV at 4K, it would take only a 25 inch 1080p display to match its quality.

@gamingpcgod: Could you please elaborate on how you worked this out?

Basically, the number of pixels in a screen only matters relative to how large the screen is. This is why 480p can look like HD on a tiny 5 inch phone; conversely, it's why 720p can look grainy on a large 70 inch TV.

If you make a TV bigger, you have to scale up the number of pixels to retain the picture quality.

Since 4K is twice as many pixels vertically and horizontally (with 4x as many pixels in total), you have to scale the display's diagonal by 2x to get 4x the surface area. Want to retain 15.6 inch 1080p quality on a 31.2 inch display? You have to make that display 4K. Want to retain 60 inch 4K quality on a 30 inch monitor? You have to make it 1080p.

It's like density. Whether something sinks is not at all dependent on its weight, but on how dense the object is. In the same way, pixel density is what matters most for a display. Again, this is why 480p can look like HD on a phone.
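The equivalence being claimed here can be checked with a quick pixels-per-inch calculation (a sketch; `ppi` is a hypothetical helper, and 31.2 inches is exactly double the 15.6-inch diagonal):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

laptop_1080p = ppi(1920, 1080, 15.6)  # 15.6" 1080p panel
tv_4k        = ppi(3840, 2160, 31.2)  # 4K panel at twice the diagonal

print(round(laptop_1080p, 1), round(tv_4k, 1))  # both ≈ 141.2
```

Doubling the diagonal while quadrupling the pixel count leaves the density unchanged, which is the whole of the claim; whether equal density means equal perceived quality is what the rest of the thread disputes.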


#115  Edited By gamecubepad
Member since 2003 • 7214 Posts

@Sam3231:

Thanks for providing the additional data point. A GTX 970 OC'd is a decent match at lower resolutions for a stock GTX 1060 or RX 480, especially the 3GB and 4GB models. The GTX 970 has 3.5GB of full-speed VRAM plus 512MB of slower VRAM.

In Forza Apex and RoTR, the 3-4GB cards become VRAM-limited. RoTR can consume up to 7GB of VRAM, while Apex uses 5-6GB. The X1X is at least doubling the performance of the GTX 970 4GB and GTX 1060 3GB in these games. The X1X was doing 60+ in Forza at the same settings I used, while I got an average of ~45fps with minimums into the low 30s. This is with an RX 480 8GB OC, so 6.3 TFLOPS and ~268GB/s of memory bandwidth.

The X1X GPU is like a stock 980 Ti/Fury.


#116  Edited By gamecubepad
Member since 2003 • 7214 Posts

@gamingpcgod:

Dude, here I am again doing all the legwork for you. Your claim that supersampling ("...giving the environment a much, MUCH more crisper and detailed look") looks FAR better than simply rendering the game's image at 4K is bullshit:

1. Supersampling and DSR/VSR are basically the same thing; the latter are just the Nvidia/AMD implementations.

2. Your real claim is that devs would be better off running 1080p with 2x2 supersampling on a 4K screen than running native 4K. Which is also bullshit.

...

This image should display at native size since I cropped it to fit GS's forum resizing, so it doesn't get a beneficial supersample effect on the 1080p 2x2 supersample shot. If not, right-click then 'view image'. The full image should be 1141x889 at 100% scaling.

1080p, 2x2 supersample, highest settings (very high textures, everything else high, 16x AF), upscaled to 4K.

vs

Native 4K, normal settings except 'very high' textures and 8x AF. That's 6 settings lowered compared to the 1080p shot... *right-click, view image for full size*...
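For reference, 2x2 supersampling just means rendering at four times the pixel count and averaging each 2x2 block down to one output pixel. A minimal grayscale sketch (`downsample_2x2` is a hypothetical name; real drivers use fancier filters than a box average):

```python
# Average each 2x2 block of a grayscale image (list of rows) down to
# one pixel: the core of 2x2 supersampling / DSR-style downscaling.
def downsample_2x2(img):
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

hi_res = [[0, 255, 0, 255],
          [255, 0, 255, 0],
          [0, 255, 0, 255],
          [255, 0, 255, 0]]
print(downsample_2x2(hi_res))  # [[127.5, 127.5], [127.5, 127.5]]
```

Note what the average does to the alternating pattern: the per-pixel detail collapses into flat gray. That is the anti-aliasing benefit and the information loss in one step, which is why a supersampled 1080p image and a native 4K image are not the same thing.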


#117 deactivated-5c746fddbe486
Member since 2017 • 193 Posts

The PS4 Pro is just upscaled 4K; the X1X is actual 4K, with upgrades such as better visual effects and high-res 4K textures. The X1X does all the things you listed and then some.


#118 PinchySkree
Member since 2012 • 1342 Posts

@pinkanimal said:

it's a marketing ploy to sell to gullible people and fanboys who like counting pixels more than playing games.


#119 CanYouDiglt
Member since 2009 • 8474 Posts

There are already a few games on Xbox One X using PC high settings, such as Gears of War, Fallout 4, Forza 7, and Lara Croft, with others on the way, for example the next Shadow of Mordor. Two of the ones I mentioned are, I know, also running at 60 fps; not sure about the other two. Don't confuse the Xbox One X and the PS4 Pro as the same thing. The PS4 Pro is basically just an Xbox One S; in fact, the X1S has more features.


#120 DrLostRib
Member since 2017 • 5931 Posts

@tibua said:

The PS4 Pro is just upscaled 4K; the X1X is actual 4K, with upgrades such as better visual effects and high-res 4K textures. The X1X does all the things you listed and then some.

depends on the game


#121  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Sam3231 said:
@gamecubepad said:

@gamingpcgod:

Even with GTX 1060 6GB and RX 480 8GB OC'd they will hover around 45-55fps with drops into the low 30s on FM Apex, 4k/60fps, Ultra, 4xAA, wet track, full grid.

I realize you're likely just a fakeboy/troll alt, but I shouldn't have to do all the leg work here. RX 480/GTX 1060 both OC'd and failing miserably at Forza 4k/Ultra, wet track, full grid. X1X was running 60+ while these GPUs drop to the 30s.

FM Apex 4K/60fps, Ultra, 4xAA, wet track, full grid:

RX 480 8GB
RX 480 8GB
GTX 1060 6GB
GTX 1060 6GB

I tested my computer against the game 4K max settings too. It was probably about average 25 FPS. I have a GTX 970 4GB stock which is supposedly kind of close to these cards? But obviously not as much VRAM- but iirc I wasn't using it all anyway. I don't have a deep understanding of components but it seemed like the GPU was the bottleneck. It was running at full load constantly according to EVGA precision. Where as I still using about 7.2/8GB ram and my processor was running about 80% load which is a stock core i5-2400. On Windows 10.

On the RBE front, the 970 only has 1.7MB of L2 cache, which is shared with the TMU read/write units, while the X1X's RBEs have a 2MB render cache and its TMUs have a 2MB L2 cache.

Alpha effects mostly run on the RBE's fixed-function alpha math units, and this RBE path was optimized on the X1X. Forza's wet track hammers alpha effects.

When the X1X's L2 cache (for the TMUs) and render cache (for the RBEs/ROPS) are combined, the total cache size is 4MB, which is similar to Vega 56's 4MB L2 cache shared between the RBEs/ROPS and TMUs. The X1X is like an R9-390X OC with a Vega-sized L2 cache.

Selecting Vega NCUs wouldn't have been an optimization for the ForzaTech and Unreal Engine 4 engines.


#122 Xplode_games
Member since 2011 • 2540 Posts

@pinkanimal said:

It makes little sense but then again it's a marketing ploy to sell to gullible people and fanboys who like counting pixels more than playing games.

You can blame cows for claiming pwnage every time they had a pixel count advantage on any game over the X1.


#123 Xplode_games
Member since 2011 • 2540 Posts

@tibua said:

The PS4 Pro is just upscaled 4K; the X1X is actual 4K, with upgrades such as better visual effects and high-res 4K textures. The X1X does all the things you listed and then some.

This


#124  Edited By Xplode_games
Member since 2011 • 2540 Posts

@CanYouDiglt said:

There are already a few games on Xbox One X using PC high settings, such as Gears of War, Fallout 4, Forza 7, and Lara Croft, with others on the way, for example the next Shadow of Mordor. Two of the ones I mentioned are, I know, also running at 60 fps; not sure about the other two. Don't confuse the Xbox One X and the PS4 Pro as the same thing. The PS4 Pro is basically just an Xbox One S; in fact, the X1S has more features.

Well put.


#125 PinkAnimal
Member since 2017 • 2380 Posts

@Xplode_games: "This"

Is BS. Lemmings pretending the 1X is capable of native 4K across the board aren't fooling anyone anymore. The majority of games on the X1X are also going to be upscaled or dynamic, if they ever reach 4K at all, so stop talking crap.


#126  Edited By EG101
Member since 2007 • 2091 Posts

@pinkanimal said:

@Xplode_games: "This"

Is BS. Lemmings pretending the 1X is capable of native 4K across the board aren't fooling anyone anymore. The majority of games on the X1X are also going to be upscaled or dynamic, if they ever reach 4K at all, so stop talking crap.

Pulled straight out your ass.


#127 PinkAnimal
Member since 2017 • 2380 Posts

@EG101 said:
@pinkanimal said:

@Xplode_games: "This"

Is BS. Lemmings pretending the 1X is capable of native 4K across the board aren't fooling anyone anymore. The majority of games on the X1X are also going to be upscaled or dynamic, if they ever reach 4K at all, so stop talking crap.

Pulled straight out your ass.

It's much more probable than the majority being native 4K, like you lemmings have been pretending all this time.


#128 schu
Member since 2003 • 10191 Posts

I approve of the push for 4K myself. I'm hyped for 4K 120fps.


#129  Edited By cyberpunk_2077
Member since 2015 • 627 Posts
@gamingpcgod said:
@cyberpunk_2077 said:
@gamingpcgod said:

Let me put it this way.

Gaming on a 15.6 inch display at 1080p is the equivalent of gaming on a 31.2 inch at 4K. However, even the smallest 4K TVs are still north of 40 inches. If you have a 50 inch TV at 4K, it would take only a 25 inch 1080p display to match its quality.

@gamingpcgod: Could you please elaborate on how you worked this out?

Basically, the amount of pixels in a screen only matter relative to how large the screen is. This is why 480p can look like HD on a tiny 5 inch phone; conversely, this is why 720p can look grainy on a large 70 inch tv.

If you make a tv bigger, you have to scale up the amount of pixels to retain the picture quality.

Since 4k is twice as many pixels vertically and horizontally (with 4x as many pixels in totality) you will have to scale the display's diagonal length x2 to get 4x the surface area. Want to retain 1080p quality on a 15.6 inch display on a 31.2 inch? Have to make that display 4k. Want to retain 4K-60 inch quality on a 30' monitor? Have to make it 1080p.

It's like density. Weather something sinks through the Earth is not at all dependent on it's weight, but how dense the object is. In the same way, pixel density is most important when it comes to display. Again, this is why 480p can look like it's HD on a phone.

The number of pixels remains the same irrespective of the size of the TV. Isn't it better to say that it's how close to or far from the TV/monitor you view it that affects the apparent quality?

So that 5 inch phone is all good up until you put your nose right up to the screen. Let's say we have a 1080p 50" and a 4K 50".

Now the 4K 50" is the one denser with pixels, making it the better option, with the ability to show more detail per inch?

In terms of scaling the pixels up for larger screens, I could agree to a point that there would be a sweet spot for a 1080p screen at some particular panel size, whichever that may be, but again it depends on where you view the panel from.

So "gaming on a 15.6 inch display at 1080p is the equivalent of gaming on a 31.2 inch at 4K", if it were even possible: you need to factor in that the 31.2" is still going to offer more pixels, showing more detail, more so if you view both from the same distance.

Obviously we'd need an HD source for yours and a 4K source for the 31.2".

To your original question, however: the developers somewhat need to keep up with the market and the hardware, however marginal and minimal the benefit of 4K resolution may be. It is a shame.

But consumers want 4K gaming with their shiny new 4K TVs.
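The viewing-distance point here can be made precise with a little trigonometry: what matters perceptually is the angle one pixel subtends at the eye. A sketch (`pixel_angle_arcmin` is a hypothetical helper; inputs in inches):

```python
import math

def pixel_angle_arcmin(diagonal_in, width_px, height_px, distance_in):
    """Angle subtended by one pixel at the eye, in arcminutes."""
    pitch = diagonal_in / math.hypot(width_px, height_px)  # inches per pixel
    return math.degrees(2 * math.atan(pitch / (2 * distance_in))) * 60

# The same 50" 1080p TV viewed from 5 ft and from 10 ft:
near = pixel_angle_arcmin(50, 1920, 1080, 60)
far  = pixel_angle_arcmin(50, 1920, 1080, 120)

print(round(near, 2), round(far, 2))  # ≈ 1.3 vs 0.65 arcminutes
```

Doubling the distance halves the apparent pixel size, so the same panel can look sharp or grainy purely depending on where you sit; roughly one arcminute is the usual rule of thumb for the limit of normal vision.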


#130 djura
Member since 2016 • 542 Posts

I agree that a focus purely on resolution is a bit of a tough sell and it doesn't make a whole lot of sense to me. Stuff like HDR seems a little bit more impactful, but I don't know, maybe this view will change when enough people have 4K TVs.

Personally, I actually favour Nintendo's approach at this point. They have a major focus on achieving 60fps gameplay and will happily adopt variable resolutions in order to achieve this in some games. For me, that's more of an immediate benefit as a gamer than simply increasing the resolution.


#131 j2zon2591
Member since 2005 • 3571 Posts

@djura:

@djura said:

I agree that a focus purely on resolution is a bit of a tough sell and it doesn't make a whole lot of sense to me. Stuff like HDR seems a little bit more impactful, but I don't know, maybe this view will change when enough people have 4K TVs.

Personally, I actually favour Nintendo's approach at this point. They have a major focus on achieving 60fps gameplay and will happily adopt variable resolutions in order to achieve this in some games. For me, that's more of an immediate benefit as a gamer than simply increasing the resolution.

I agree. I wish there were 1080p true-HDR sets, but hey, at least we've got the TCL P607 4K smart TV at only $599 today, with true HDR and DV. Maybe in 2-3 more years I can have a much nicer set for the same price, and most games will have proper HDR mastering (forgot the proper term).

I hope next gen sticks with 4K for a while and puts hardware resources toward things other than resolution.


#132  Edited By achilles614
Member since 2005 • 5310 Posts

@gamingpcgod: Are you familiar with sampling/information theory?

It doesn't matter what fancy filter (in this case supersampling) you apply to a low-resolution image; it will never match a higher-resolution image. You can't fake unique pixel information by blending/downsampling.

It's fundamental signal/information theory. Can't fool Mother Nature.
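The information-theory point can be shown in a couple of lines: two different high-resolution signals can downsample to the identical low-resolution signal, so no filter applied afterwards can tell which one you started from (a toy 1-D sketch; `downsample` is a hypothetical name):

```python
# Simple 2:1 averaging downsample of a 1-D signal.
def downsample(sig):
    return [(sig[i] + sig[i + 1]) / 2 for i in range(0, len(sig), 2)]

a = [10, 20, 10, 20]  # one high-res signal
b = [20, 10, 20, 10]  # a different one

print(downsample(a) == downsample(b))  # True: indistinguishable after downsampling
```

Since distinct inputs map to the same output, the mapping is not invertible; that is the formal reason upscaling or filtering a low-resolution image cannot reconstruct true high-resolution detail.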


#133 Sam3231
Member since 2008 • 2954 Posts

@ronvalencia said:
@Sam3231 said:
@gamecubepad said:

@gamingpcgod:

Even with GTX 1060 6GB and RX 480 8GB OC'd they will hover around 45-55fps with drops into the low 30s on FM Apex, 4k/60fps, Ultra, 4xAA, wet track, full grid.

I realize you're likely just a fakeboy/troll alt, but I shouldn't have to do all the leg work here. RX 480/GTX 1060 both OC'd and failing miserably at Forza 4k/Ultra, wet track, full grid. X1X was running 60+ while these GPUs drop to the 30s.

FM Apex 4K/60fps, Ultra, 4xAA, wet track, full grid:

RX 480 8GB
RX 480 8GB
GTX 1060 6GB
GTX 1060 6GB

I tested my computer against the game 4K max settings too. It was probably about average 25 FPS. I have a GTX 970 4GB stock which is supposedly kind of close to these cards? But obviously not as much VRAM- but iirc I wasn't using it all anyway. I don't have a deep understanding of components but it seemed like the GPU was the bottleneck. It was running at full load constantly according to EVGA precision. Where as I still using about 7.2/8GB ram and my processor was running about 80% load which is a stock core i5-2400. On Windows 10.

On RBE front, 970 only has 1.7 MB L2 cache which is shared with TMU read/write units while X1X's RBE has 2MB render cache and TMUs has 2MB L2 cache.

Alpha effects mostly runs on RBE's fix function alpha math units and this RBE path was optimized for X1X. Forza's wet track hammers alpha effects.

When X1X's L2 cache (for TMU) and render cache (for RBE/ROPS) are combined, the total cache size is 4MB which is similar VEGA 56's shared 4MB L2 cache for RBE/ROPS and TMUs. X1X is like R9-390X OC with VEGA size L2 cache.

Selecting VEGA NCU wouldn't have optimized for ForzaTech and Unreal Engine 4 engines.

Ah, that clears things up. ;)


#134  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Sam3231 said:
@ronvalencia said:
@Sam3231 said:
@gamecubepad said:

@gamingpcgod:

Even with GTX 1060 6GB and RX 480 8GB OC'd they will hover around 45-55fps with drops into the low 30s on FM Apex, 4k/60fps, Ultra, 4xAA, wet track, full grid.

I realize you're likely just a fakeboy/troll alt, but I shouldn't have to do all the leg work here. RX 480/GTX 1060 both OC'd and failing miserably at Forza 4k/Ultra, wet track, full grid. X1X was running 60+ while these GPUs drop to the 30s.

FM Apex 4K/60fps, Ultra, 4xAA, wet track, full grid:

RX 480 8GB
RX 480 8GB
GTX 1060 6GB
GTX 1060 6GB

I tested my computer against the game 4K max settings too. It was probably about average 25 FPS. I have a GTX 970 4GB stock which is supposedly kind of close to these cards? But obviously not as much VRAM- but iirc I wasn't using it all anyway. I don't have a deep understanding of components but it seemed like the GPU was the bottleneck. It was running at full load constantly according to EVGA precision. Where as I still using about 7.2/8GB ram and my processor was running about 80% load which is a stock core i5-2400. On Windows 10.

On RBE front, 970 only has 1.7 MB L2 cache which is shared with TMU read/write units while X1X's RBE has 2MB render cache and TMUs has 2MB L2 cache.

Alpha effects mostly runs on RBE's fix function alpha math units and this RBE path was optimized for X1X. Forza's wet track hammers alpha effects.

When X1X's L2 cache (for TMU) and render cache (for RBE/ROPS) are combined, the total cache size is 4MB which is similar VEGA 56's shared 4MB L2 cache for RBE/ROPS and TMUs. X1X is like R9-390X OC with VEGA size L2 cache.

Selecting VEGA NCU wouldn't have optimized for ForzaTech and Unreal Engine 4 engines.

Ah, that clears things up. ;)

The problem with the pure TFLOPS-from-CU-count approach is that such arguments don't account for the full GPU, i.e. GPUs are NOT DSPs!

From my observation of the R9-390X versus the GTX 980 Ti, on pure GPGPU workloads AMD TFLOPS and NVIDIA TFLOPS are similar, but AMD's TFLOPS advantage has difficulty translating into graphics-pipeline performance, i.e. there seem to be bottlenecks somewhere in the AMD GPU's graphics pipeline.

I'm not the only person who has identified AMD GPUs' graphics-pipeline bottlenecks.


Published on Jul 31, 2016

This old video shows the NVIDIA Maxwell GPU's rendering advantage from a hyper-fast small cache linked to the RBEs/ROPS.

http://www.anandtech.com/show/10536/nvidia-maxwell-tile-rasterization-analysis

We also know that NVIDIA significantly increased the L2 cache size and did a number of low-level (transistor level) optimizations to the design. But NVIDIA has also held back information – the technical advantages that are their secret sauce – so I’ve never had a complete picture of how Maxwell compares to Kepler.

For a while now, a number of people have suspected that one of the ingredients of that secret sauce was that NVIDIA had applied some mobile power efficiency technologies to Maxwell. It was, after all, their original mobile-first GPU architecture, and now we have some data to back that up. Friend of AnandTech and all around tech guru David Kanter of Real World Tech has gone digging through Maxwell/Pascal, and in an article & video published this morning, he outlines how he has uncovered very convincing evidence that NVIDIA implemented a tile based rendering system with Maxwell.

In short, by playing around with some DirectX code specifically designed to look at triangle rasterization, he has come up with some solid evidence that NVIDIA's handling of triangles has significantly changed since Kepler, and that their current method of triangle handling is consistent with a tile based renderer.

NVIDIA Maxwell's ROPS are linked to a larger L2 cache, while Polaris/Hawaii/Fury GPUs' ROPS have a tiny render cache that runs into memory-controller bottlenecks.

Somebody at Microsoft is monitoring NVIDIA's GPU developments and has converted that observation into action, e.g. the X1X GPU's ROPS have a 2MB render cache.
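For context on why TFLOPS alone is a poor proxy: the headline number is just shader count x clock x 2 (one fused multiply-add, i.e. two float ops, per ALU per cycle), and says nothing about caches, ROPs, or the rest of the pipeline. A sketch using figures mentioned in this thread:

```python
# Headline GPU compute: shaders x clock (GHz) x 2 ops per cycle, in TFLOPS.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000.0

rx480_oc = tflops(2304, 1.37)   # overclocked RX 480, as quoted above
x1x      = tflops(2560, 1.172)  # Xbox One X GPU's published figures

print(round(rx480_oc, 1), round(x1x, 1))  # ≈ 6.3 and 6.0
```

Two GPUs with near-identical numbers from this formula can still perform very differently in games, which is exactly the cache/pipeline argument being made in this post.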


#135 emgesp
Member since 2004 • 7848 Posts

Because it's much easier to just brute-force a higher resolution than to create new assets.