1080 Ti still isn't enough for 4K/60fps/max settings


This topic is locked from further discussion.


#101 ronvalencia
Member since 2008 • 29612 Posts

@jhcho2 said:
@ronvalencia said:
@jhcho2 said:

Even if so, there's so much more to graphics than just resolution and anisotropic filtering: texture quality, draw distance, physics, particle effects, lighting, shadows, etc.

It is all of the above that affects the overall graphical fidelity of a game. The whole problem with 4K at 60fps is that it's not necessarily the best area to put processing power into. Sure, resolution and framerate are important, but any PC gamer can tell you that he'd rather turn up the texture quality or draw distance by a notch if it means a drop in framerate from 60fps to, say, 52fps. Anything above 50fps is still silky smooth. The final 10 fps between 50 and 60 is far less noticeable than the other things you can literally perceive with your eyes, like lighting and draw distances.

I'm just saying that while I can appreciate resolution and framerate, I don't believe in spending money on hardware which doesn't give me the option to decide where I want the processing power to be dedicated.

Anisotropic filtering relates to texture rendering quality.

5 GB to 8 GB increase relates to texture quality.

The above screenshot has XBO graphics settings with the PC's 4K textures. Scorpio has another 51 percent of extra headroom for extras.

Scorpio supports FreeSync and HDMI 2.1 VRR for sub-60 Hz update rates.

Example for wasteful Gameworks effects.

It does, but it's not to be confused with texture quality. Anisotropic filtering just corrects textures that would otherwise blur out at oblique viewing angles.

Texture quality, on the other hand, relates to how much effort the devs put into designing and rendering the surface of, say, a rock. Anisotropic filtering does not help at all in this area.

Texture rendering quality is different from baseline texture quality i.e. artwork quality.
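The 52-vs-60fps point quoted above is easy to put in frame-time terms (a quick sketch; the numbers are just the two rates mentioned):

```python
# Frame-time view of the 60fps vs 52fps trade-off discussed above.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

t60 = frame_time_ms(60)   # ~16.7 ms per frame
t52 = frame_time_ms(52)   # ~19.2 ms per frame
extra = t52 - t60         # ~2.6 ms per frame freed up for other effects

print(f"60fps: {t60:.1f} ms, 52fps: {t52:.1f} ms, budget gained: {extra:.1f} ms")
```

In other words, accepting 52fps buys roughly 2.6 ms of extra per-frame budget to spend on textures or draw distance, which is the trade jhcho2 is describing.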


#102 EG101
Member since 2007 • 2091 Posts

@tormentos said:

@Juub1990:

You are missing the point: Scorpio has some alien tech from beyond our solar system; once its CPU detects a frame drop, boost mode kicks in, which turns on the hidden extra 16 cores and raises the speed to 4GHz.

It's something out of this world, ask Ronvalencia.

Funny thing is No One has said that about Scorpio but Every single Rabid Cow on this Forum stated something along those Lines when mentioning the Cell in the PS3. Every one of you stated the Cell would Magically give the PS3 More Powah. Funny how the Reality is opposite of what You think it is.

The Delusion is so Bad with the Cows that they Twist their own Statements then Pretend the Lemmings said it.


#103  Edited By scatteh316
Member since 2004 • 10273 Posts

@EG101 said:

Funny thing is No One has said that about Scorpio but Every single Rabid Cow on this Forum stated something along those Lines when mentioning the Cell in the PS3. Every one of you stated the Cell would Magically give the PS3 More Powah. Funny how the Reality is opposite of what You think it is.

The Delusion is so Bad with the Cows that they Twist their own Statements then Pretend the Lemmings said it.

Maybe if you read a few developer white papers and educated yourself on what developers used Cell for, you'd see that it did.


#104  Edited By Juub1990
Member since 2013 • 12620 Posts

@AzatiS: I know there are 120/240Hz sets, and even plasmas had 480Hz, but it's not the frame rate, which is what I was referring to. That's why I said there are no 144Hz TVs. The advertised 240Hz on TVs is not what allows higher frame rates, which is what we are discussing. Those TVs won't display 240fps, lol. My KS8000, for instance, cannot go higher than 60fps. The refresh rates TVs advertise are very different from the ones PC monitors advertise, and they often mislead customers.

The highest I've seen were real 120Hz TVs that do give 120fps, but no 144fps ones like PC monitors. Hence the "never heard of 144Hz TVs". Hell, I even bought a 240Hz TV for my brother a few years back and it doesn't go higher than 60fps either, lol. It's just motion interpolation, which we don't care about here.
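The motion interpolation Juub1990 mentions can be sketched as simple frame blending (a toy illustration; real TVs use motion-estimated interpolation, not a plain average):

```python
# Toy frame interpolation: a "240Hz" TV showing 60fps content synthesizes
# in-between frames rather than displaying new input. Simple blending shown;
# real sets do motion estimation, but the principle is the same.

def interpolate(frame_a, frame_b, t):
    """Blend two frames (flat lists of pixel values) at position t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

source = [[0, 0, 0], [12, 12, 12]]            # two real 60fps frames
mid = interpolate(source[0], source[1], 0.5)  # synthesized in-between frame
print(mid)  # [6.0, 6.0, 6.0]
```

The key point for gaming: the synthesized frame samples no new input, so the panel still only accepts 60 real frames per second, which is why these sets don't help frame rate.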


#105 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@scatteh316 said:
@ronvalencia said:
@scatteh316 said:
@oflow said:

I don't think anyone expects Scorpio to run games 4K/60 maxed out. Who's saying this?

What it will do, though, is run games they've optimized for the system at native 4K, and it will have enough extra power to run more games at 60fps even if they aren't 4K, if the devs choose to make them that way. That's easier for devs, since they can make one version of the game.

It really doesn't work that way at all... 60fps is CPU dependent more than anything, and Scorpio's CPU is not that much better than Pro's; in fact, as a whole machine, Scorpio is a lot more CPU limited than Pro is.

And it's been confirmed that games that run at 1080p and 900p on Xbone will be native 4K on Scorpio, but games that are below 900p and run at 720p will not be running at native 4K on Scorpio, as it won't have the power.
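The CPU-bound point can be sketched with a toy pipeline model (illustrative only: it assumes the slower of the CPU and GPU stages sets the frame rate):

```python
# Toy frame-rate model: each frame needs CPU work (simulation, draw calls)
# and GPU work (rendering); the slower stage sets the achievable frame rate.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Same CPU cost, GPU four times faster: frame rate barely moves.
print(fps(cpu_ms=16.0, gpu_ms=20.0))  # GPU-bound: 50.0 fps
print(fps(cpu_ms=16.0, gpu_ms=5.0))   # now CPU-bound: capped at 62.5 fps
```

Under this model, a much stronger GPU paired with a similar CPU raises resolution headroom far more easily than it raises frame rate, which is the argument being made.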

R9-290X (5.6 TFLOPS) runs XBO's Killer Instinct Season 1 (a 720p title on XBO) at 4K with 55 fps.

For this case, Killer Instinct was optimized for AMD GCN, i.e. it closed the gap with the GTX 980 Ti. The programmers could be careful with GCN version 1.1's smaller L2 cache sizes, e.g. R9-290X has 1 MB of L2 cache while GTX 980 Ti has 2 MB.

AMD GPUs can close the gap with NVIDIA Maxwell GPUs when software optimizations involve the Compute Engine to L2 cache path instead of the Pixel Engine to memory controller path.

On AMD Vega and NVIDIA Maxwell/Pascal GPUs, both Pixel and Compute Engines are connected to L2 cache.

I really do not give a shit... People who have had time to play Scorpio and talk to the engineers have already said what I've written.

You have neither played nor seen Scorpio, and yet you still continue to post irrelevant charts of PC hardware as if they apply to consoles.

Your post is shit.

XBO's Killer Instinct Season 1 has 1280x720 resolution. Season 2 has 1600x900.

Scorpio's Killer Instinct Season 1 to 3 has 4K resolution.

R9-290X can play Killer Instinct Season 1 at 4K at 55 fps. It's 4K on my R9-390X, you stupid clown.

Fact: Scorpio has proven to be faster than my old R9-390X. I played Forza M6 and H3 on my R9-390X and it doesn't match Scorpio's results.

R9-390X doesn't have Polaris DCC while Scorpio has this feature.

R9-390X doesn't have Polaris 10's 2MB L2 cache while Scorpio has this feature.

Polaris' increased IPC was gimped by crappy memory bandwidth.

R9-290's GCN 1.1 is the same generation as XBO's.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

I don't need to own a Titan XP 2.0 to know that GPU is faster than the R9-390X.

I know some of the R9-390X's bottlenecks and what needs to be fixed.

And here we go, using baseless PC hardware charts on a console... The chart troll strikes again.

People who've played Scorpio and spoken to the engineers >>>> Your charts


#106  Edited By AzatiS
Member since 2004 • 14969 Posts

@Juub1990 said:

@AzatiS: I know there are 120/240Hz sets, and even plasmas had 480Hz, but it's not the frame rate, which is what I was referring to. That's why I said there are no 144Hz TVs. The advertised 240Hz on TVs is not what allows higher frame rates, which is what we are discussing. Those TVs won't display 240fps, lol. My KS8000, for instance, cannot go higher than 60fps. The refresh rates TVs advertise are very different from the ones PC monitors advertise, and they often mislead customers.

The highest I've seen were real 120Hz TVs that do give 120fps, but no 144fps ones like PC monitors. Hence the "never heard of 144Hz TVs". Hell, I even bought a 240Hz TV for my brother a few years back and it doesn't go higher than 60fps either, lol. It's just motion interpolation, which we don't care about here.

So you're saying you haven't seen any difference in motion between a 60Hz TV and a 240Hz TV or more? I told you, fake or not, it doesn't matter; you'll notice a difference, and this difference (even if it might be pseudo) will give you a more pleasant experience, in my opinion.

Because, you know, you're essentially talking about pixel lag here. Post-processing or not, the outcome is a smoother experience, even if a monitor's Hz is what we could call "real time", which is the real deal and not the fake one.

The difference you can see between a 200Hz TV and a 60Hz TV (when in reality even the 200Hz TV might be 60Hz), as I said before, even if those numbers are fake and post-processing is involved, is a totally different experience.

But anyway, I get your point, and you're right if you meant it that way, but personally I wouldn't sacrifice playing on my 144Hz monitor at 1080p/max settings for 60Hz/4K at whatever settings.


#107  Edited By Juub1990
Member since 2013 • 12620 Posts

@AzatiS: Oh, I definitely see a difference in everything except games that are capped at 60fps with no motion interpolation.

TV series, for instance, look a lot smoother, which was weird at first lol, but now I've got used to it.


#108  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

And here we go using baseless hardware charts for PC to use on console....... The chart troll strikes again.

People who've played Scorpio and spoken to the engineers >>>> Your charts

Your post is a troll

@scatteh316 said:

.People who've played Scorpio and spoken to the engineers >>>> Your charts

Show that link.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Deal with it.

XBO's Killer Instinct Season 1 has 1280x720 resolution. Season 2 has 1600x900.

Scorpio's Killer Instinct Season 1 to 3 has 4K resolution.

R9-290X can play Killer Instinct Season 1 at 4K at 55 fps. It's 4K on my R9-390X, you stupid clown.


#109 AzatiS
Member since 2004 • 14969 Posts

@Juub1990 said:

@AzatiS: Oh, I definitely see a difference in everything except games that are capped at 60fps with no motion interpolation.

TV series, for instance, look a lot smoother, which was weird at first lol, but now I've got used to it.

hehe ok, you're right


#110 scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:

Your post is a troll

@scatteh316 said:

.People who've played Scorpio and spoken to the engineers >>>> Your charts

Show that link.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Deal with it.

XBO's Killer Instinct Season 1 has 1280x720 resolution. Season 2 has 1600x900.

Scorpio's Killer Instinct Season 1 to 3 has 4K resolution.

R9-290X can play Killer Instinct Season 1 at 4K at 55 fps. It's 4K on my R9-390X, you stupid clown.

And here we go yet again with the stupid PC hardware comparisons... CONSOLES are not PCs, you dumb ass... their software and APIs do not function like a PC's, so simply saying PC GPU = SAME IN A CONSOLE is fucking retarded.

But but... PC optimizationssssss make my charts accurate... About as accurate and useful as a fucking chocolate teapot.

"As we landed on 4K, Andrew [Goossen] and team did a pretty deep analysis," Gammill continues. "We have this developer tool called PIX [Performance Investigator for Xbox]. It lets us do some GPU trace capture. He and his team did a really deep analysis across a breadth of titles with the goal that any 900p or better title would be able to easily run at frame-rate at 4K on Scorpio. That was our big stake in the ground, and so with that we began our work speccing out what the Scorpio Engine is. It's not a process of calling up AMD and saying I'll take this part, this part and this part. A lot of really specific custom work went into this"

But the proof of the pudding is in the eating. Specs are one thing, but Microsoft is promising that both 900p and 1080p Xbox One games should be able to run at native 4K on Project Scorpio.

The Forza engine runs at native 4K/60fps on Scorpio with 60-70% GPU utilization. Forza is NATIVE 1080p on Xbone and only has 30-40% left in the tank... Now imagine if Forza were 720p on Xbone instead... That spare 30-40% is nowhere near enough to cover jumping from 720p to native 4K.

It is known by now that Microsoft has in NO WAY stated that games below 900p will be native 4K; read ANY interview about Scorpio and they all say games that are 900p+ should run at native 4K.

It was said in the DF Scorpio unveil video that 720p to native 4K is too much of a jump for Scorpio... and Microsoft themselves are only claiming games at 900p and above will be native 4K... Coincidence?
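The pixel arithmetic behind the 900p cut-off is straightforward (a sketch that counts raw pixels only and ignores settings or engine differences):

```python
# Raw pixel-count ratios behind the 900p/1080p-to-4K argument above.
RES = {"720p": (1280, 720), "900p": (1600, 900),
       "1080p": (1920, 1080), "4K": (3840, 2160)}

def pixels(name):
    w, h = RES[name]
    return w * h

for base in ("720p", "900p", "1080p"):
    ratio = pixels("4K") / pixels(base)
    print(f"{base} -> 4K: {ratio:.2f}x the pixels")
# 1080p -> 4K is a 4.0x jump and 900p -> 4K is 5.76x, but 720p -> 4K is a
# full 9.0x jump, which is why the 720p titles are argued to be out of reach.
```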

Understand?? Or do you want me to draw you a picture in crayon?

Or are you going to tell me that you know better than Microsoft's own engineers, you complete tool...

Now you continue to post your bullshit PC charts that prove you're nothing more than a noob who reads Wiki for 10 minutes and then thinks he knows it all.


#111  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

And here we go yet again with the stupid PC hardware comparisons... CONSOLES are not PCs, you dumb ass... their software and APIs do not function like a PC's, so simply saying PC GPU = SAME IN A CONSOLE is fucking retarded.

But but... PC optimizationssssss make my charts accurate... About as accurate and useful as a fucking chocolate teapot.

"As we landed on 4K, Andrew [Goossen] and team did a pretty deep analysis," Gammill continues. "We have this developer tool called PIX [Performance Investigator for Xbox]. It lets us do some GPU trace capture. He and his team did a really deep analysis across a breadth of titles with the goal that any 900p or better title would be able to easily run at frame-rate at 4K on Scorpio. That was our big stake in the ground, and so with that we began our work speccing out what the Scorpio Engine is. It's not a process of calling up AMD and saying I'll take this part, this part and this part. A lot of really specific custom work went into this"

But the proof of the pudding is in the eating. Specs are one thing, but Microsoft is promising that both 900p and 1080p Xbox One games should be able to run at native 4K on Project Scorpio.

The Forza engine runs at native 4K/60fps on Scorpio with 60-70% GPU utilization. Forza is NATIVE 1080p on Xbone and only has 30-40% left in the tank... Now imagine if Forza were 720p on Xbone instead... That spare 30-40% is nowhere near enough to cover jumping from 720p to native 4K.

It is known by now that Microsoft has in NO WAY stated that games below 900p will be native 4K; read ANY interview about Scorpio and they all say games that are 900p+ should run at native 4K.

It was said in the DF Scorpio unveil video that 720p to native 4K is too much of a jump for Scorpio... and Microsoft themselves are only claiming games at 900p and above will be native 4K... Coincidence?

Understand?? Or do you want me to draw you a picture in crayon?

Or are you going to tell me that you know better than Microsoft's own engineers, you complete tool...

Now you continue to post your bullshit PC charts that prove you're nothing more than a noob who reads Wiki for 10 minutes and then thinks he knows it all.

API overhead occurs on the CPU side, dumb ass. When a PC gamer throws a 4.5 GHz i7-7700K at the problem, the bottleneck is usually the GPU.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.


#112  Edited By scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:

API overhead occurs on the CPU side, dumb ass. When a PC gamer throws a 4.5 GHz i7-7700K at the problem, the bottleneck is usually the GPU.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.

Ahahahahahaha... You've proven NOTHING... Let's compare resolutions of different games in the same series and use that as proof... Lmao... You really are a special one...

I stated originally that any game below 900p on Xbone is not going to be native 4K on Scorpio, and I was called out on this by YOU... I have proven this with actual quotes from Scorpio engineers and from people who have seen/used the system... You have just posted a bunch of slides from PC and tried to say they prove I'm wrong... Lmao...

So what is there to deal with?? The only thing you've done is prove once again that you're nothing but a copy-and-paste chart troll.

Please continue with your bullshittery.......


#113  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:

Ahahahahahaha... You've proven NOTHING... Let's compare resolutions of different games in the same series and use that as proof... Lmao... You really are a special one...

I stated originally that any game below 900p on Xbone is not going to be native 4K on Scorpio, and I was called out on this by YOU... I have proven this with actual quotes from Scorpio engineers and from people who have seen/used the system... You have just posted a bunch of slides from PC and tried to say they prove I'm wrong... Lmao...

So what is there to deal with?? The only thing you've done is prove once again that you're nothing but a copy-and-paste chart troll.

Please continue with your bullshittery.......

Battlefront already runs at 4K ~60fps on a Fury X/980 Ti class GPU.

According to DF, Scorpio's GPU ~= GTX 1070/Fury X class.

Battlefront 2 follows this trend.

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.

Please continue with your bullshittery.......


#114 HalcyonScarlet
Member since 2011 • 13668 Posts

@Juub1990 said:
@KBFloYd said:

I'm able to get Rise of the Tomb Raider to 4K 60

just by turning off AA, setting Pure Hair to On and dynamic foliage to medium.....everything else cranked.

that's fine imo..

It is but still not max settings.

Do you even need AA at 4K?


#115 Juub1990
Member since 2013 • 12620 Posts

@HalcyonScarlet: Nope and I turn it off too. Still no 60fps.


#116  Edited By scatteh316
Member since 2004 • 10273 Posts

@ronvalencia said:
@scatteh316 said:
@ronvalencia said:
@scatteh316 said:

And here we go yet again with the stupid PC hardware comparisons...... CONSOLES are not PCs, you dumb ass..... their software and APIs do not function like a PC's, so simply saying PC GPU = SAME IN A CONSOLE is fucking retarded.

But but.... PC otimaztttioonnnnsssssss make my charts accurate......... About as accurate and useful as a fucking chocolate tea pot.

"As we landed on 4K, Andrew [Goossen] and team did a pretty deep analysis," Gammill continues. "We have this developer tool called PIX [Performance Investigator for Xbox]. It lets us do some GPU trace capture. He and his team did a really deep analysis across a breadth of titles with the goal that any 900p or better title would be able to easily run at frame-rate at 4K on Scorpio. That was our big stake in the ground, and so with that we began our work speccing out what the Scorpio Engine is. It's not a process of calling up AMD and saying I'll take this part, this part and this part. A lot of really specific custom work went into this"

But the proof of the pudding is in the eating. Specs are one thing, but Microsoft is promising that both 900p and 1080p Xbox One games should be able to run at native 4K on Project Scorpio.

The Forza engine runs at native 4K/60fps on Scorpio with 60-70% GPU utilization, and Forza is NATIVE 1080p on Xbone, so there's only 30-40% left in the tank....... Now imagine if Forza was 720p on Xbone instead........ That spare 30-40% is nowhere near enough to cover jumping from 720p to native 4K

It is well known by now that Microsoft has in NO WAY stated that games below 900p will be native 4K; read ANY interview about Scorpio and they all say games that are 900p+ should run at native 4K.

It was said in the DF Scorpio unveil video that 720p to native 4K is too much of a jump for Scorpio... and Microsoft themselves are only claiming games at 900p and above will be native 4K.... Coincidence?

Understand?? Or do you want me to draw you a picture in crayon?

Or are you going to tell me that you know better than Microsoft's own engineers, you complete tool.....

Now you continue to post your bullshit PC charts that prove you're nothing more than a noob who reads Wiki for 10 minutes and then thinks he knows it all.

API overheads occur on the CPU side, dumb ass. When a PC gamer throws a 4.5GHz i7-7700K at the problem, the bottleneck is usually the GPU.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p,

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.

Ahahahahahaha....... You've proven NOTHING......... Let's compare resolutions of different games in the same series and use that as proof...... Lmao....... You really are a special one.....

I stated originally that any game below 900p on Xbone is not going to be native 4K on Scorpio. I was called out on this by YOU........ I have proven this with actual quotes from Scorpio engineers and from people who have seen/used the system........ You have just posted a bunch of slides from PC and tried to say they prove I'm wrong............. Lmao.......

So what is there to deal with?? The only thing you've done is prove once again you're nothing but a copy-and-paste chart troll.

Please continue with your bullshittery.......

Battlefront already runs at 4K ~60fps on a Fury X/980 Ti class GPU.

According to DF, Scorpio's GPU ~= GTX 1070/Fury X class.

Battlefront 2 follows this trend.

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.

Please continue with your bullshittery.......

That's the best you have? Scorpio will perform nowhere near as well as the GPUs in that chart..... A 1070 has more bandwidth and a much better CPU pushing it.

As I asked you a few pages back, did you factor in the bandwidth the CPU would steal from Scorpio's GPU? Because looking at your charts you didn't..... pretty noob mistake considering you 'know' it all.

But then again I've told you multiple times and it fails to sink in.... maybe there's an underlying learning disability there... who knows.


#117  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@scatteh316 said:

That's the best you have? Scorpio will perform nowhere near as well as the GPUs in that chart..... A 1070 has more bandwidth and a much better CPU pushing it.

As I asked you a few pages back, did you factor in the bandwidth the CPU would steal from Scorpio's GPU? Because looking at your charts you didn't..... pretty noob mistake considering you 'know' it all.

But then again I've told you multiple times and it fails to sink in.... maybe there's an underlying learning disability there... who knows.

The CPU will use some of the bandwidth from the GDDR5, so the GPU should have around 300GB/s. Also, only 8GB of the 12GB is usable for games, which means the GPU will have 6GB or less for VRAM. So the average usage will most likely be a 4/4, 2/6 or 3/5 type of split. Scorpio will most likely not perform better than an RX 480 OC/580 paired with an i7 or Ryzen-based CPU, since it still has to contend with a 2.3GHz "upgraded" Jaguar or Puma-like CPU, along with not having a full 8GB VRAM buffer.
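The memory-budget arithmetic in the post above can be sketched out. The 26GB/s CPU share and the 3GB/5GB split below are assumptions for illustration, not published specs:

```python
# Rough Scorpio memory-budget sketch based on the figures in the thread
total_bw_gbs = 326.0                      # GDDR5 peak bandwidth
cpu_bw_gbs   = 26.0                       # assumed CPU share of the bus
gpu_bw_gbs   = total_bw_gbs - cpu_bw_gbs  # ~300 GB/s left for the GPU

total_ram_gb = 12
game_ram_gb  = 8                          # the OS reserves the other 4GB
cpu_side_gb  = 3                          # hypothetical CPU-side allocation
vram_like_gb = game_ram_gb - cpu_side_gb  # 5GB effectively acting as VRAM
print(gpu_bw_gbs, vram_like_gb)  # 300.0 5
```

Under those assumptions the GPU is left with roughly RX 480-class bandwidth and a mid-range VRAM pool, which is the comparison being drawn.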


#118 HalcyonScarlet
Member since 2011 • 13668 Posts

@Juub1990 said:

@HalcyonScarlet: Nope and I turn it off too. Still no 60fps.

Does it give you back significant resources, turning it off?


#119 Juub1990
Member since 2013 • 12620 Posts

@HalcyonScarlet: FXAA is uglier but basically free performance. Turning it off might give 1 fps if that. SMAA is alright but takes a bit more performance. SSAA just kills performance.
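The relative costs described above come down to pixel counts: FXAA and SMAA are cheap post-process passes, while SSAA actually renders more pixels. A quick sketch of the SSAA case (the 4x factor is the standard supersampling multiplier; FXAA/SMAA costs are not pixel-count based, so they are not modelled):

```python
# Why SSAA "just kills performance": it shades a multiple of the output pixels
out_pixels  = 3840 * 2160          # pixels presented on a 4K screen
ssaa_factor = 4                    # 4x SSAA renders twice the width and height
ssaa_pixels = out_pixels * ssaa_factor
print(out_pixels, ssaa_pixels)     # 8294400 33177600
```

Shading 33 million pixels per frame instead of 8 million is why supersampling at 4K is rarely viable.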


#120  Edited By HalcyonScarlet
Member since 2011 • 13668 Posts

@Juub1990 said:

@HalcyonScarlet: FXAA is uglier but basically free performance. Turning it off might give 1 fps if that. SMAA is alright but takes a bit more performance. SSAA just kills performance.

FXAA is great for old games that don't have any AA settings. I put it on Jedi Outcast a while back, it really cleaned up there. But I heard at 4K you shouldn't need it at all, and processing AA at that resolution can be a bit of a thing.


#121  Edited By APiranhaAteMyVa
Member since 2011 • 4160 Posts

Scorpio was never going to have max settings; like the Pro, it was always just going to be Xbox One games w/ enhancements. No way are they going to have Ultra/4K/60fps.

Xbox One level games with maybe a couple of shadow/texture improvements and AF should be no problem for Scorpio at 4k/60fps.


#122 Juub1990
Member since 2013 • 12620 Posts

@HalcyonScarlet: You don't need AA in 4K. I personally never use it. I mean it still looks nicer but the performance hit is just too much. Image can look cleaner and crisper but 4K looks gorgeous as is so it's really not necessary.


#123  Edited By 04dcarraher
Member since 2004 • 23832 Posts
@Juub1990 said:

@HalcyonScarlet: You don't need AA in 4K. I personally never use it. I mean it still looks nicer but the performance hit is just too much. Image can look cleaner and crisper but 4K looks gorgeous as is so it's really not necessary.

IMO, at 4K AA is only needed if you're using a large screen, where the pixel density is below 100 PPI.
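The 100 PPI rule of thumb above is simple to compute for any screen size:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of given resolution and diagonal size."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

# 4K on a 27" monitor vs a 50" TV
print(round(ppi(3840, 2160, 27), 1))  # ~163.2, well above the 100 PPI line
print(round(ppi(3840, 2160, 50), 1))  # ~88.1, below 100 PPI, so AA still helps
```

By this yardstick, 4K desktop monitors rarely need AA, while large living-room TVs viewed up close still can.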


#124  Edited By RyviusARC
Member since 2011 • 5708 Posts

@AzatiS said:
@RyviusARC said:
@AzatiS said:
@Juub1990 said:

Oh definitely. I even use 3rd party tools to lock the frame rates. Never owned a Gsync monitor but I might try one. Apparently even uneven frame rates are much better with those.

BRB, will try to see if I can get 2000fps. Don't have minesweeper though.

You won't, but if you get anything above 60fps your TV will show it, and believe me, the more the better.

People keep saying over 60Hz or whatever isn't any different ... and I'm saying 60Hz to over 120Hz is a big difference. HUGE, to be precise, as long as you can maintain your fps.

On 4K, that's a no-no atm, so gaming at 4K just for the resolution while I have to sacrifice settings along with FPS ... no freaking way I'll do that. FPS+SETTINGS >> RESOLUTION to me, as long as the resolution is 1080p (except in some really competitive games, where I drop my resolution even further to maintain above 144fps at all times etc.)

You said "a 144Hz TV is something I've never heard of." when your TV is able to do over 2000Hz ... beeeh

I would take 4k 60fps over 1440p 120/144hz any day of the week.

While I notice the difference I'd rather have the extra clarity of 4k. I would only use 120/144hz monitors for competitive games like Counterstrike which I don't really play. For everything else 60fps is perfectly fine. Hell I am fine with lower than 60fps if it's not an fps/hack and slash/fighting or racing game. 45fps was still smooth enough for me in Mass Effect.

After years of gaming at 144Hz (or close to it, because not all games can sustain 144fps easily even at 1080p), there's no way I'll go back to 60fps, or even worse 45fps, just to play at 4K.

144Hz monitors are not just for competitive games; that's marketing bullshit the gaming monitor companies came up with. What matters most for competitive play is input lag and that alone. Everything else is a bonus.

Well everyone has their opinion.

I believe higher than 60fps is not needed unless it's a twitch based shooter. Even then you can do well at 60fps against people who are playing at 144fps.

The feeling you have while comparing 60fps to 144fps will be forgotten if you went back to 60fps exclusively for a few weeks. Everyone I know including myself knows the feeling and it just takes time to adjust back.

To me it's much better to have crisp, beautiful graphics and a smooth-enough frame rate of 60fps than 144fps and ugly graphics that look like they came from a Picasso painting.

I've tried both for many months and only care about higher-than-60fps frame rates in competitive games; otherwise I don't gain any more enjoyment from it. Meanwhile, better graphics that get rid of things like pop-in and let me see further help me much more in gameplay.


#125 tormentos
Member since 2003 • 33784 Posts

@EG101 said:

Funny thing is No One has said that about Scorpio but Every single Rabid Cow on this Forum stated something along those Lines when mentioning the Cell in the PS3. Every one of you stated the Cell would Magically give the PS3 More Powah. Funny how the Reality is opposite of what You think it is.

The Delusion is so Bad with the Cows that they Twist their own Statements then Pretend the Lemmings said it.

But there is a difference: Cell was a hybrid CPU/GPU processor. Crunching numbers, Cell was a beast thanks to its parallelism, which is why Cell was so good at physics, like the Ageia PPU, which you now know as Nvidia PhysX.

Cell was capable of doing GPU tasks; it could even do ray tracing. The Jaguar inside the Pro and Scorpio is not what Cell used to be in 2006.

And in fact, the only reason the PS3 could match or even exceed the 360 graphically was Cell, which could offload GPU tasks from the RSX, allowing it to compensate for its weaker hardware. That kind of hand-shake work didn't happen on the 360, where Xenos practically did everything and Xenon was completely weak sauce.

See how Cell here beat the crap out of the 360's Xenon? And that was with 5 SPEs; a 6th one would have also beaten the Xbox One, and with 5 it beat the Jaguar in the PS4.

So yeah, Cell actually delivered to the point of helping the PS3 beat the 360 graphically when the 360 had the more capable GPU.

By the way, it wasn't cows who believed in magical APIs that mysteriously double power, or in the unlimited power of the cloud.


#126  Edited By AzatiS
Member since 2004 • 14969 Posts

@RyviusARC said:
@AzatiS said:
@RyviusARC said:
@AzatiS said:

you wont but if you get anything above 60fps your TV will show that and beleive me , the more the better.

People keep saying over 60hz or whatever isnt any different ... and im saying 60hz to over 120hz is a big difference. HUGE to be precise as long as you can maintain your fps.

On 4K , thats a nono atm , so to me gaming on 4K just for the resolution while i have to sacrifice settings along with FPS ... no freaking way ill do that. FPS+SETTINGS >> RESOLUTION to me as long as resolution is 1080p ( except some really competitive games i drop my resolution even further to maintain above 144fps at all times etc )

You said " a 144Hz TV is something I've never heard of. " when your TV is able to do over 2000hz ... beeeh

I would take 4k 60fps over 1440p 120/144hz any day of the week.

While I notice the difference I'd rather have the extra clarity of 4k. I would only use 120/144hz monitors for competitive games like Counterstrike which I don't really play. For everything else 60fps is perfectly fine. Hell I am fine with lower than 60fps if it's not an fps/hack and slash/fighting or racing game. 45fps was still smooth enough for me in Mass Effect.

After years of gaming at 144hz ( or close to that because not all games can sustain 144fps easily even at 1080p ) , theres no way ill go back to 60fps or even worse 45fps just to play on 4K.

144hz monitors is not for competitive games , thats a marketing bullshit gaming monitor companies came up with. What matters most for competitive play is input lag and that alone. Everything else is a bonus

Well everyone has their opinion.

I believe higher than 60fps is not needed unless it's a twitch based shooter. Even then you can do well at 60fps against people who are playing at 144fps.

The feeling you have while comparing 60fps to 144fps will be forgotten if you went back to 60fps exclusively for a few weeks. Everyone I know including myself knows the feeling and it just takes time to adjust back.

To me it's much better to have crisp beautiful graphics and smooth enough frame rate of 60fps compared to 144fps and ugly graphics that look like they came from a Picasso painting.

I've tried both for many months and only care about the higher than 60fps frame rate in competitive games otherwise I don't gain any more enjoyment from it. While better graphics that help get rid of things like pop in and allow me to see further help me much more in gameplay.

Sorry man, but how is 144fps at 1080p/max settings ugly compared to 60fps at 4K/medium settings? (Otherwise you really need a $2000-plus rig to play the newest games at 4K/max settings at a stable 60fps, and even that is questionable.)

I understand that you like 4K; it's impressive, I admit. But not being able to sustain 60fps at max settings at all times for the extra clarity 4K has, all while spending a small fortune? I guess this is not for me ... yet.


#127  Edited By ronvalencia
Member since 2008 • 29612 Posts

@scatteh316 said:
@ronvalencia said:
@scatteh316 said:
@ronvalencia said:

API overheads occurs on the CPU side dumb ass. When a PC gamer throws 4.5 Ghz i7-7700K at the problem, the bottleneck is usually the GPU.

http://www.eurogamer.net/articles/digitalfoundry-2015-star-wars-battlefront-face-off

XBO's Battlefront has 720p,

https://www.windowscentral.com/forza-motorsport-7-4k-project-scorpio-xbox

Scorpio's Battlefront 2 has 4K

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.

Ahahahahahaha....... You've proven NOTHING......... Let's compare resolutions of different games in the same series and use that as proof...... Lmao....... You really are a special one.....

I stated originally that any game below 900p on Xbone is not going to be native 4K on Scorpio. I was called out on this by YOU........ I have proven this with actual quotes from Scorpio engineers and from people who have seen/used the system........ You have just posted a bunch of slides from PC and tried to say they prove I'm wrong............. Lmao.......

So what is there to deal with?? The only thing you've done is prove once again you're nothing but a copy-and-paste chart troll.

Please continue with your bullshittery.......

Battlefront already runs at 4K ~60fps on a Fury X/980 Ti class GPU.

According to DF, Scorpio's GPU ~= GTX 1070/Fury X class.

Battlefront 2 follows this trend.

http://www.gamezone.com/news/first-party-microsoft-games-will-run-at-native-4k-on-project-scorpio-3443615

First-party Microsoft games will run at native 4K on Project Scorpio

Killer Instinct Season 1 XBO has 1280x720p.

Killer Instinct Season 2 XBO has 1600x900p

Deal with it.

Please continue with your bullshittery.......

That's the best you have? Scorpio will perform nowhere near as well as the GPUs in that chart..... A 1070 has more bandwidth and a much better CPU pushing it.

1. As I asked you a few pages back, did you factor in the bandwidth the CPU would steal from Scorpio's GPU? Because looking at your charts you didn't..... pretty noob mistake considering you 'know' it all.

But then again I've told you multiple times and it fails to sink in.... maybe there's an underlying learning disability there... who knows.

1. You failed to grasp the concept of the producer (CPU) and consumer (GPU) processing model. As long as physics, AI and command-list building calculations are not completed, the GPU render will be in a wait state. The GPU can't render without the final object position results. The GPU sitting idle during CPU physics and AI calculations is what leads to the need for Async Compute.

The above slide was from Sony's 1st-party CPU optimization guidelines, which are the same for XBO. This optimization guideline reduces hit rates to main memory.

http://gamingbolt.com/project-cars-dev-ps4-single-core-speed-slower-than-high-end-pc-splitting-renderer-across-threads-challenging

On being asked if there was a challenge in development due to different CPU threads and GPU compute units in the PS4, Tudor stated that, “It’s been challenging splitting the renderer further across threads in an even more fine-grained manner – even splitting already-small tasks into 2-3ms chunks. The single-core speed is quite slow compared to a high-end PC though so splitting across cores is essential.

The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel. There are still some bottlenecks to work out with memory flushing to garlic, even after changing to LCUE, the memory copying is still significant.”

For a 16 ms frame-time target (60 fps), the CPU's command-list building consumed 2-3 ms of the total 16 ms. My graphs already have CPU frame-time consumption built into them.
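The frame-time budget implied by the SMS quote works out as follows (using the 3 ms upper bound from the quote):

```python
# Frame-time budget at a 60 fps target
frame_budget_ms = 1000.0 / 60.0              # ~16.67 ms per frame
cmd_list_ms     = 3.0                        # CPU command-list building, upper bound
gpu_window_ms   = frame_budget_ms - cmd_list_ms
print(round(frame_budget_ms, 2), round(gpu_window_ms, 2))  # 16.67 13.67
```

In a naive serial pipeline that 2-3 ms comes straight out of the GPU's window each frame, which is the motivation for overlapping CPU and GPU work.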

SMS devs were working to reduce "memory flushing to garlic".

Development build with the "memory flushing to garlic" path: CPU ---> GDDR5 ---Garlic bus---> GPU

Final build goal path: CPU ---> Onion/Fusion bus ---> GPU.

You are forgetting that PS4 and XBO have direct CPU-GPU FUSION links, which don't exist on a gaming PC with a discrete GPU card.

http://www.eurogamer.net/articles/digitalfoundry-2017-the-scorpio-engine-in-depth

"For 4K assets, textures get larger and render targets get larger as well. This means a couple of things - you need more space, you need more bandwidth," explains Nick Baker. "The question though was how much? We'd hate to build this GPU and then end up having to be memory-starved. All the analysis that Andrew was talking about, we were able to look at the effect of different memory bandwidths, and it quickly led us to needing more than 300GB/s memory bandwidth. In the end we ended up choosing 326GB/s. On Scorpio we are using a 384-bit GDDR5 interface - that is 12 channels. Each channel is 32 bits, and then 6.8GHz on the signalling so you multiply those up and you get the 326GB/s."

Scorpio's GPU has more than 300 GB/s of memory bandwidth allocated to it. Polaris DCC recovers memory bandwidth lost to memory-subsystem inefficiencies.
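The 326GB/s figure in Baker's quote falls straight out of the bus arithmetic he gives:

```python
# 384-bit GDDR5 interface = 12 channels x 32 bits, signalling at 6.8 GT/s
channels       = 12
bits_per_chan  = 32
rate_gtps      = 6.8

bus_width_bits = channels * bits_per_chan              # 384
bandwidth_gbs  = round(bus_width_bits / 8 * rate_gtps, 1)  # bytes/s, in GB/s
print(bus_width_bits, bandwidth_gbs)  # 384 326.4
```

Multiply the bus width in bytes (48) by the effective transfer rate and you land on the quoted number.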

Plot complication: MS's hardware customization.

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

"On the GPU we added some compressed render target formats like our 6e4 [6 bit mantissa and 4 bits exponent per component] and 7e3 HDR float formats [where the 6e4 formats] that were very, very popular on Xbox 360, which instead of doing a 16-bit float per component 64bpp render target, you can do the equivalent with us using 32 bits - so we did a lot of focus on really maximising efficiency and utilisation of that ESRAM."

XBO's GPU has additional features over other GCNs, e.g. 64-bit render targets (FP16 per component) compressed into 32 bits.
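The bit budget behind those compressed render-target formats is simple to lay out (the component layout below is illustrative; the actual hardware packing may differ):

```python
# Bit budget of the render-target formats described in the quote
fp16_target = 4 * 16          # RGBA16F: four FP16 components = 64 bits per pixel
small_float = 7 + 3           # "7e3": 7-bit mantissa + 3-bit exponent = 10 bits
packed_rgb  = 3 * small_float # three colour components fit in 30 bits
print(fp16_target, packed_rgb)  # 64 30 -> fits a 32-bit target
```

Halving bits per pixel halves the render-target bandwidth, which is the whole point of the customization.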

When XBO's compressed data format customization is combined with 6 TFLOPS of brute force (overcoming XBO's ALU-bound issues), it would beat a PC R9-390X.

The GTX 1070 effectively runs into a memory bandwidth wall when Forza 6's wet track is used, and Scorpio has changed the rules with customized compressed data formats. To test this idea, run a GTX 1080 Ti at 6.5 TFLOPS while keeping the memory configuration as-is, which still yields a 4K/60fps Forza 6 wet-track result.

From EA DICE, moving more CPU rendering logic to GPU!

Both SMS and EA DICE are aware of the CPU's memory-transfer overheads and CPU memory bandwidth consumption under a unified memory architecture.


#128 ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@EG101 said:

Funny thing is No One has said that about Scorpio but Every single Rabid Cow on this Forum stated something along those Lines when mentioning the Cell in the PS3. Every one of you stated the Cell would Magically give the PS3 More Powah. Funny how the Reality is opposite of what You think it is.

The Delusion is so Bad with the Cows that they Twist their own Statements then Pretend the Lemmings said it.

But there is a difference: Cell was a hybrid CPU/GPU processor. Crunching numbers, Cell was a beast thanks to its parallelism, which is why Cell was so good at physics, like the Ageia PPU, which you now know as Nvidia PhysX.

Cell was capable of doing GPU tasks; it could even do ray tracing. The Jaguar inside the Pro and Scorpio is not what Cell used to be in 2006.

And in fact, the only reason the PS3 could match or even exceed the 360 graphically was Cell, which could offload GPU tasks from the RSX, allowing it to compensate for its weaker hardware. That kind of hand-shake work didn't happen on the 360, where Xenos practically did everything and Xenon was completely weak sauce.

See how Cell here beat the crap out of the 360's Xenon? And that was with 5 SPEs; a 6th one would have also beaten the Xbox One, and with 5 it beat the Jaguar in the PS4.

So yeah, Cell actually delivered to the point of helping the PS3 beat the 360 graphically when the 360 had the more capable GPU.

By the way, it wasn't cows who believed in magical APIs that mysteriously double power, or in the unlimited power of the cloud.


Intel Pentium(R) Dual-Core CPU

E5300 @ 2.60 GHz

2.00 GB RAM

Intel Core 2 Duo based CPU running pure software Direct3D9c rendering via a Just-In-Time LLVM recompiler method, i.e. the same recompiler method built into PC GPU drivers.

An Intel Core 2 Duo based CPU can run Direct3D9c GPU workloads when programmed correctly to respect CPU cache boundaries (via config settings).


#129  Edited By tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Intel Pentium(R) Dual-Core CPU

E5300 @ 2.60 GHZ

2.00 GB RAM

Intel Core 2 Duo based CPU running pure software Direct3D9c rendering via Just-In-Time LLVM re-compiler method i.e the same re-compiler method built into PC GPU drivers.

Intel Core 2 Duo based CPU can run Direct3D9C GPU workloads when programmed correctly which respects CPU cache boundaries (via config settings).

Every time you post something completely irrelevant to what I was saying I will post this screen; you really have a problem following arguments...


#130  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@ronvalencia said:

Intel Pentium(R) Dual-Core CPU

E5300 @ 2.60 GHZ

2.00 GB RAM

Intel Core 2 Duo based CPU running pure software Direct3D9c rendering via Just-In-Time LLVM re-compiler method i.e the same re-compiler method built into PC GPU drivers.

Intel Core 2 Duo based CPU can run Direct3D9C GPU workloads when programmed correctly which respects CPU cache boundaries (via config settings).

Every time you post something completely irrelevant to what i was saying i will post this screen,you really have a problem fallowing arguments...

CELL doesn't have a monopoly on pure software graphics rendering. Try again, Sony ass kisser.

I used a software-render example that respects CPU cache boundaries. Hint: program it like an SPE. Normal PC apps just spill over the cache boundaries, while an SPE version hits a crash wall.


#131 blangenakker
Member since 2006 • 3240 Posts

The technology will get there eventually, it always does.


#132 ronvalencia
Member since 2008 • 29612 Posts

http://gamingbolt.com/xbox-scorpio-has-enough-power-in-principle-to-deliver-4k60fps-madness-engine-is-highly-scaleable-project-cars-2-dev

Xbox Scorpio Has Enough Power In Principle To Deliver 4K/60fps, Madness Engine Is Highly Scaleable – Project CARS 2 Dev


#133  Edited By Legend002
Member since 2007 • 13405 Posts

Only titles that are built with 4K in mind actually matter, and there's not a whole lot right now. That said, Gears 4 and RotTR look INCREDIBLE in 4K.


#134 ronvalencia
Member since 2008 • 29612 Posts

@Legend002 said:

Only titles that are built with 4K in mind actually matter and there's not a whole lot right now. That said, Gears 4 and RotTR looks INCREDIBLE in 4K.

Scorpio's 4K 60 fps list

Forza Motorsport 7

SW Battlefront 2 (high probability of 60 fps)

Project Cars 2 (provisional)

Gears of War 4 (4K remaster)

Halo 5 (4K remaster)

Killer Instinct Season 1 to 3 (4K remaster)

---------

Scorpio's 4K list

ReCore (4K remaster)

Forza Horizon 3 (4K remaster)

Crackdown 3

Sea of Thieves

World of Tanks

Red Dead Redemption 2

Call of Duty WW2 (high probability with 60 fps) http://www.express.co.uk/entertainment/gaming/787026/Xbox-One-news-Call-of-Duty-WW2-Project-Scorpio-4K-Backwards-Compatibility-new-games


#135 ShepardCommandr
Member since 2013 • 4939 Posts

No it's not.

But Vega is coming and Volta is nearing.


#136 Howmakewood
Member since 2015 • 7712 Posts

@ShepardCommandr: do you expect Vega to top the 1080 Ti? I think it has to. AMD has taken way too long with top-end cards, and even now the workstation Vegas, which are coming before the gaming ones, don't have a launch date, meaning Volta is right around the corner while AMD is still trying to catch the previous gen.


#137 EG101
Member since 2007 • 2091 Posts

I know all about the Cell, it's capabilities are well documented.

The Cell didn't Magically Stop 95% of all games from running and looking better on 360.

It took Developers 2X the amount of work on PS3 to get slightly worse performance than 360 Versions.

My original point was that Rabid Cows made those claims not Lemmings.

NO one expects Scorpio's CPU to perform like an i7. What I've stated is that MS has improved Jaguar by addressing its bottlenecks. That is what MS stated, and it makes sense to do so when Jaguar is a very weak CPU. MS is offloading draw calls to a different chip (XB1 has this capability too when using DX12), plus enhancements to CPU latency and memory latency. How those enhancements will affect games is TBD.


#138 whalefish82
Member since 2013 • 511 Posts

@scatteh316 said:
@whalefish82 said:

I just have a standard 1080 and, for me, I actually prefer to play at 2K, 60FPS, cranking everything to max with AA. It results in a softer look but I prefer that to 4K and no AA.

I always preferred 60fps at 1440p with maximum settings and downsampling for amazing image quality; no use having the best textures, shaders and lighting if they're riddled with jaggies and shimmering.

Completely agree. Take Witcher 3. Below is a shot with the settings I use: 2K res, ultra settings including HairWorks, and a 60FPS lock. It doesn't drop a frame, looks amazing (even better in motion), with zero noticeable aliasing. I could turn down a few settings and crank it up to 4K, but it just doesn't look as polished. If I wanted more sharpness I could adjust it in-game or use an injector.

An extreme example of this is RoTTR on the Pro - the 4k mode looks terrible compared to PC, completely lacking in textures and detail. I'd rather halve that resolution and keep the bells & whistles.


#139  Edited By ronvalencia
Member since 2008 • 29612 Posts

@howmakewood said:

@ShepardCommandr: do you expect vega to top 1080ti? I think it has to, Amd has taken way too long with top end cards and even now the workstation vegas dont have a launch date, which are coming before gaming ones, meaning Volta is right around the corner while Amd is still trying to reach the previous gen..

The latest leak for Vega 10 has a 1600 MHz clock speed (13.1 TFLOPS), 480 GB/s physical memory bandwidth and a high-bandwidth cache below the L2 cache.

My biggest concern with Vega 10 is effective memory bandwidth. Vega has a double-rate FP16 feature which the current Pascal GP102 lacks, hence the need for its Volta successor.

Vega's high-bandwidth cache below the L2 cache could be a substitute for AMD's inferior DCC.
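That 13.1 TFLOPS figure follows from the clock if you assume the 4096 stream processors (64 CUs) reported in the same round of leaks; the shader count isn't stated above, so treat it as an assumption. A quick sanity check, with the standard peak-FP32 formula (shaders x 2 FMA ops x clock):

```python
def peak_tflops(shaders: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Peak throughput in TFLOPS: shaders x ops/clock x clock (GHz) / 1000."""
    return shaders * ops_per_clock * clock_ghz / 1000

# Assumed leak figures: 4096 shaders at 1600 MHz
print(round(peak_tflops(4096, 1.6), 1))                   # 13.1 (FP32)
print(round(peak_tflops(4096, 1.6, ops_per_clock=4), 1))  # 26.2 (double-rate packed FP16)
```

The second line shows why double-rate FP16 matters: packed math doubles ops per clock on the same silicon.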


#140 tormentos
Member since 2003 • 33784 Posts

@whalefish82 said:

Completely agree. Take Witcher 3. Below is a shot with the settings I use - 2k res, ultra settings, including Haiworks and 60FPS lock. It doesn't drop a frame, looks amazing and even better in motion, with zero noticeable aliasing. I could turn down a few settings and crank it up to 4k, but it just doesn't look as polished. If I wanted more sharpness I could adjust it in game or use an injector.

An extreme example of this is RoTTR on the Pro - the 4k mode looks terrible compared to PC, completely lacking in textures and detail. I'd rather halve that resolution and keep the bells & whistles.

Anyone who says RoTTR looks terrible compared to PC is a BLIND fanboy. I can agree without problems that the PC version is sharper and has higher detail, but from there to claiming it looks terrible is a gross exaggeration and a total lie.

[Embedded video]

As DF says, the PC version is superior, but they know and admit the game is quite impressive for the PS4 Pro. By no means should the Pro look better; the PS4 Pro doesn't have a GTX 1070, let alone something stronger.

And some of the missing effects in this ^^ video were introduced back.

Adjusting the game's sharpness will not produce or come close to the level of sharpness you get going from 2K to 4K.

The Pro should look cleaner at 1800p than it would on your PC if you run it at 1080p.

Again, the PC version looks better and is sharper, but from there to claiming the PS4 Pro version looks terrible is a joke and clearly an exaggeration.

@EG101 said:

I know all about the Cell; its capabilities are well documented.

The Cell didn't magically stop 95% of all games from running and looking better on 360.

It took developers 2X the amount of work on PS3 to get slightly worse performance than the 360 versions.

My original point was that rabid Cows made those claims, not Lemmings.

No one expects Scorpio's CPU to perform like an i7. What I've stated is that MS has improved Jaguar by addressing its bottlenecks. That is what MS stated, and it makes sense to do so when Jaguar is a very weak CPU. MS is offloading draw calls to a different chip (XB1 has this capability too when using DX12), plus enhancements to CPU latency and memory latency. How those enhancements will affect games is TBD.

Apparently not, else you would not have claimed it did not increase power. It did.

NO, what stopped Cell from delivering was the ease of development on Xbox 360. By the time developers got a hold of Cell it was too late, which is why GTA 4 and RDR were 720p on Xbox 360 but not on PS3; yet by the time GTA 5 came out both were on par, and the PS3 version was actually said to be superior.

https://www.extremetech.com/gaming/166731-gta-5-ps3-vs-xbox-360-gameplay-and-graphics-comparison

So yeah, Cell helped the PS3 beat the 360 in the end, but most games ran and looked better on 360 because of Cell itself and how hard it was to make it shine. It was a double-edged sword that played in the 360's favor. It also helped a lot that MS's tools were easy to use, a mistake Sony corrected this gen.

Yes, Ronvalencia does argue blindly, which is why I have such arguments with him. Scorpio, like the Pro, has a shitty CPU; both use the same tech, and yes it is a Puma (or Jaguar, as you like to call it), which is bottom of the barrel performance-wise and will not equal a newly released i3, let alone an i7 or Ryzen. So yes, Scorpio will run many games at 60FPS, but not all games are created equal and many will be bottlenecked by the CPU.

The improvements MS made, I already proved they were on Xbox One as well; they are nothing new, from reducing latency to the command processor modification. What MS is doing here is trying to save face for using a Jaguar, and considering they lied about the cloud and DX12, trying to make it seem that the Xbox One would improve greatly (which it didn't), I would not take anything from them as more than damage control for using a weak CPU.

They claimed before the Xbox One there was no other CPU with 2 clusters, which is a lie; the PS4 came first and has the same 2 clusters, and a bigger GPU too.

So you see, this is MS, and their job is to give the impression that nothing will go wrong even when everything is going wrong (Sony would probably do the same). I will remind you that this is the same MS that in 2013 would not even admit that Sony had a 40% GPU advantage, and that tried to claim games would be equal; they are on record doing so. So anything they say about the CPU is damage control. We all know it is a Jaguar/Puma CPU, the same as the PS4's, and that even with modifications it wasn't enough when pushed on Xbox One, as Project Cars itself proved: with many cars on screen at once the Xbox One choked more, even though the PS4 was already using its 40% extra GPU to render 1080p, 44% more pixels than the Xbox One's 900p version.
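The resolution percentages thrown around in this thread are just pixel-count ratios; a couple of lines check them (3840x2160 for 4K and 3200x1800 for "1800p" are my assumed figures for the Pro discussion):

```python
def pixels(width: int, height: int) -> int:
    """Total pixels at a given resolution."""
    return width * height

# 1080p vs 900p: the "44% more pixels" figure from the Project Cars comparison
print(round(pixels(1920, 1080) / pixels(1600, 900), 2))   # 1.44

# 1800p vs native 4K, relevant to the RoTTR Pro-vs-PC discussion
print(round(pixels(3200, 1800) / pixels(3840, 2160), 2))  # 0.69
```

So 1080p really is 44% more pixels than 900p, and 1800p renders roughly 69% of a native 4K frame.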


#142 funsohng
Member since 2005 • 29976 Posts

@RyviusARC: MEA is more demanding than TW3?

Hmm... I can barely get 60fps at 1080p with a 1070 in TW3...


#143 PimpHand_Gamer
Member since 2014 • 3048 Posts

PC max settings are not console max settings, and vice versa. Even if it achieves 6 TFLOPS and is within $400, that's a pretty good achievement considering it also includes a gamepad, warranty and support, and a preinstalled OS. If it's 8 TFLOPS and within $500, I'd say that's amazing.


#144  Edited By Juub1990
Member since 2013 • 12620 Posts
@RyviusARC said:

My 1080 Ti core clock fluctuates between 2050MHz and 2025MHz. Memory clock is 12GHz.

Aftermarket model or FE on water?


#145  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

PCs with NVIDIA Maxwell GM204/GM200 and Pascal GPUs have DSR, i.e. users are not limited to 1080p and 1440p resolutions.

[Embedded video]

@pimphand_gamer said:

PC max settings are not console max settings, and vice versa. Even if it achieves 6 TFLOPS and is within $400, that's a pretty good achievement considering it also includes a gamepad, warranty and support, and a preinstalled OS. If it's 8 TFLOPS and within $500, I'd say that's amazing.

It's not amazing, since AMD is still in the red despite two game console design wins. AMD's profit margin is razor thin, while NVIDIA's GP104 profit margins are pretty fat, i.e. the GTX 1070's BOM cost is similar to the RX 480 8 GB's, with a higher asking retail price.


#146  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
[...]

1. Yes, Ronvalencia does argue blindly, which is why I have such arguments with him. Scorpio, like the Pro, has a shitty CPU; both use the same tech, and yes it is a Puma (or Jaguar, as you like to call it), which is bottom of the barrel performance-wise and will not equal a newly released i3, let alone an i7 or Ryzen. So yes, Scorpio will run many games at 60FPS, but not all games are created equal and many will be bottlenecked by the CPU.

2. The improvements MS made, I already proved they were on Xbox One as well; they are nothing new, from reducing latency to the command processor modification. What MS is doing here is trying to save face for using a Jaguar, and considering they lied about the cloud and DX12, trying to make it seem that the Xbox One would improve greatly (which it didn't), I would not take anything from them as more than damage control for using a weak CPU.

They claimed before the Xbox One there was no other CPU with 2 clusters, which is a lie; the PS4 came first and has the same 2 clusters, and a bigger GPU too.

1.

http://www.anandtech.com/bench/product/1223?vs=1367

http://www.anandtech.com/bench/product/1223?vs=1682

In the 3D Particle Movement multi-threaded benchmark:

K16 (Athlon 5350) at 2.05 GHz = 174, quad core/four threads <------ below i3-4130T

i3-4130T at 2.9 GHz = 274, dual core/four threads

i3-6100 at 3.7 GHz = 366, dual core/four threads

K16 with 7 cores at 2.05 GHz, 7-thread scaling estimate: 304.5

K16 with 7 cores at 2.1 GHz, 7-thread scaling estimate: 312

K16 with 7 cores at 2.3 GHz, 7-thread scaling estimate: 342

---------

K16 with 6 cores at 2.1 GHz, 6-thread scaling estimate: 267 <------ below i3-4130T

K16 with 6 cores at 2.3 GHz, 6-thread scaling estimate: 293 <------ above i3-4130T
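Those estimates are plain linear extrapolation from the quad-core score, scaled by thread count and clock; a few lines of Python (my own illustration, assuming perfectly ideal scaling, which real workloads won't quite hit) reproduce them:

```python
# Quad-core Athlon 5350 (K16) at 2.05 GHz score from the AnandTech table above
BASELINE_SCORE = 174.0
BASELINE_THREADS = 4
BASELINE_CLOCK_GHZ = 2.05

def k16_estimate(threads: int, clock_ghz: float) -> float:
    """Ideal linear scaling from the 4-thread, 2.05 GHz baseline."""
    return BASELINE_SCORE * (threads / BASELINE_THREADS) * (clock_ghz / BASELINE_CLOCK_GHZ)

for threads, clock in [(7, 2.05), (7, 2.1), (7, 2.3), (6, 2.1), (6, 2.3)]:
    print(f"{threads} cores @ {clock} GHz -> {k16_estimate(threads, clock):.1f}")
```

The printed values match the figures above to rounding (304.5, 312, 342, 267, 293), which is all these console estimates are: the game-available core count times the clock ratio.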

To drive the GPU faster, besides a higher clock speed, lower write-cache latency is important, since results need to be written into cache before flushing to the Onion bus.

Please provide a Sony claim for a lower-latency K16 CPU.

2. MS's custom command processor argument doesn't fix XBO's shader-bound issues. Your argument is a red herring.


#147 Yams1980
Member since 2006 • 2862 Posts

You should be able to max most games with a 1080 Ti, especially games these days; they haven't changed much over the last couple of years. I'm still using my 970 and easily hitting 60fps in most of the games I play at 2560x1920. Things may change, though, if Xbox's new console starts pushing graphics to a better level.

Just disable anti-aliasing, since it's not essential at 4K, and/or use something less taxing like SMAA with a graphics injector. Put shadows and ambient occlusion to normal, etc.; you can't notice these little things unless you are studying side-by-side comparisons.


#148 GiantAssPanda
Member since 2011 • 1885 Posts

It's pretty much spot on for 3440x1440 and 60+ FPS gaming.


#149 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

Please provide a Sony claim for a lower-latency K16 CPU.

2. MS's custom command processor argument doesn't fix XBO's shader-bound issues. Your argument is a red herring.

1. It will not equal a newly released i3.

No, YOU provide a quote where Sony claimed they did nothing for latency on the CPU side. Considering we even have developers talking about how the PS4 CPU was more efficient than the Xbox One's, I find it funny that you still try to ride the command processor modification.

This is the JIT compression or ESRAM of 2017 in your pathetic arguments. So how did ESRAM, JIT compression, DX12 and the cloud help the Xbox One beat the PS4?

That is a ball of bullshit, since you claim the PS4 loses in Hitman when there are many people on screen at once. That implies the CPU is enough to overcome the PS4's stronger GPU; since Hitman is 1080p on both, the PS4's advantage is not there and the CPU somehow makes up for the GPU disparity, so you do claim it fixes the Xbox One's shader-bound scenario. More people on screen is a load not only on the CPU but also on the GPU doing the rendering.

So please stop your excuses.

It has a shitty Jaguar. Deal with it.


#150 Pedro
Member since 2002 • 69716 Posts

“One thing that I will say that is very interesting is that our game is limited by CPU for rendering right now. A lot of people know that the Xbox has a stronger CPU than the PS4. That's not to say that it runs better on the Xbox or PlayStation, but the PlayStation having a little bit more powerful GPU is offset by the fact that our game requires a lot of CPU to run well. It's kind of a balance between the two right now for different reasons.”