MS: Please support the ESRAM even though we are abandoning it. . . (Scorpio white papers reveal the sad truth)

#502 Zero_epyon
Member since 2004 • 20131 Posts

@FLOPPAGE_50 said:
@Zero_epyon said:
@spitfire-six said:
@GunSmith1_basic said:

I know MS said the Scorpio would not replace the Xbone, but c'mon. I think we'd be lucky to have 1 year of Xbone support before it's dropped in favour of exclusive Scorpio software. At least with Sony the difference between the PS4 and the Pro is not that much. Having them share software isn't that much of a stretch.

Just because you don't understand how something will work does not mean it is not possible. Most of you are still trapped in the past, where a console platform meant hardware + software. These days, the software platform is becoming a standalone thing, which allows software to be developed regardless of hardware changes.

They're not going to get it until they see it with their own eyes.

Just like you won't accept that VR flopped.

silly cow.

Haha I must have hurt you bad. I'm actually here defending MS because I understand their universal platform from a software developer perspective. But you're so mad you got owned that you think I'm a cow no matter what. So sad...

#503 FLOPPAGE_50
Member since 2004 • 4500 Posts

@Zero_epyon said:
@FLOPPAGE_50 said:
@Zero_epyon said:
@spitfire-six said:
@GunSmith1_basic said:

I know MS said the Scorpio would not replace the Xbone, but c'mon. I think we'd be lucky to have 1 year of Xbone support before it's dropped in favour of exclusive Scorpio software. At least with Sony the difference between the PS4 and the Pro is not that much. Having them share software isn't that much of a stretch.

Just because you don't understand how something will work does not mean it is not possible. Most of you are still trapped in the past, where a console platform meant hardware + software. These days, the software platform is becoming a standalone thing, which allows software to be developed regardless of hardware changes.

They're not going to get it until they see it with their own eyes.

Just like you won't accept that VR flopped.

silly cow.

Haha I must have hurt you bad. I'm actually here defending MS because I understand their universal platform from a software developer perspective. But you're so mad you got owned that you think I'm a cow no matter what. So sad...

Hurt and owned? lmao you're the one defending VR.

I don't care about its success, as I have clearly shown.

#504  Edited By Zero_epyon
Member since 2004 • 20131 Posts

@FLOPPAGE_50 said:
@Zero_epyon said:
@FLOPPAGE_50 said:
@Zero_epyon said:

They're not going to get it until they see it with their own eyes.

Just like you won't accept that VR flopped.

silly cow.

Haha I must have hurt you bad. I'm actually here defending MS because I understand their universal platform from a software developer perspective. But you're so mad you got owned that you think I'm a cow no matter what. So sad...

Hurt and owned? lmao you're the one defending VR.

I don't care about its success, as I have clearly shown.

You're talking about PSVR in an Xbox thread, to a guy defending Xbox. I'd call that hurt and owned...

This is getting sadder. You can have the last word.

#505 FLOPPAGE_50
Member since 2004 • 4500 Posts

@Zero_epyon said:
@FLOPPAGE_50 said:
@Zero_epyon said:
@FLOPPAGE_50 said:
@Zero_epyon said:

They're not going to get it until they see it with their own eyes.

Just like you won't accept that VR flopped.

silly cow.

Haha I must have hurt you bad. I'm actually here defending MS because I understand their universal platform from a software developer perspective. But you're so mad you got owned that you think I'm a cow no matter what. So sad...

Hurt and owned? lmao you're the one defending VR.

I don't care about its success, as I have clearly shown.

You're talking about PSVR in an Xbox thread, to a guy defending Xbox. I'd call that hurt and owned...

none of the above, If anything I'm just following you around like a creep, buddy.

#506  Edited By ronvalencia
Member since 2008 • 29612 Posts

@GunSmith1_basic said:
@dynamitecop said:
@spitfire-six said:
@GunSmith1_basic said:

I know MS said the Scorpio would not replace the Xbone, but c'mon. I think we'd be lucky to have 1 year of Xbone support before it's dropped in favour of exclusive Scorpio software. At least with Sony the difference between the PS4 and the Pro is not that much. Having them share software isn't that much of a stretch.

Just because you don't understand how something will work does not mean it is not possible. Most of you are still trapped in the past, where a console platform meant hardware + software. These days, the software platform is becoming a standalone thing, which allows software to be developed regardless of hardware changes.

What they're saying is as dumb as saying PC games will only work with flagship hardware and low end functional specs should be abandoned as if they have some type of impact...

Scalability... They don't seem to grasp it...

The whole crux of this thread was MS abandoning ESRAM. That means there is not only a massive power difference between the Scorpio and the Xbone, but now a major architectural difference as well. You can't just say it's as easy as scaling. You probably know the ins and outs of this tech better than I do (or maybe you don't), but I do notice how MS's narrative has been changing. At first, the statement was that the Xbone would get ALL the games. Now, MS says they are "asking" devs to support the original Xbone. The abandonment has already started. You bring up PC architecture like it's some kind of defense. In PC gaming you need a high-end rig to be able to play all the games. There is such a thing as minimum system requirements.

So sure, the Xbone will always get games, but there is a lot of uncertainty about the kinds of games it will get, and whether devs will think the effort is worth it. Anyone who thinks that the Scorpio will only ever be a high-end Xbone, and that the Xbone will get all the same games, is putting their head in the sand.

PC gaming doesn't require a high-end rig to be able to play all the games.

Again, Scorpio's estimate is based on 5.9 TFLOPS Radeon R9-390X results.

The graphics detail settings in those benchmarks are higher than the XBO's.

The difference between the XBO and Scorpio is like the difference between an R7-360 at 853 MHz and an R9-390X OC (6 TFLOPS, 1070 MHz, 44 CU).

In terms of memory architecture, Scorpio is like the XBO's 256-bit DDR3-2133 scaled up to 384-bit GDDR5-7000, but without ESRAM. The XBO without ESRAM usage has 7770-like results, which may create a market need for a hardware upgrade.
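For what it's worth, the raw bus figures in that comparison are easy to sanity-check. A quick sketch of the peak-bandwidth arithmetic (theoretical peak only; it ignores ESRAM and real-world efficiency):

```python
# Peak bandwidth = bus width (bytes) x effective transfer rate (MT/s).
# Illustrative only: real effective bandwidth is lower than peak.

def peak_bandwidth_gbs(bus_bits: int, mt_per_s: int) -> float:
    """Theoretical peak throughput of a memory bus in GB/s."""
    return (bus_bits / 8) * mt_per_s / 1000

print(peak_bandwidth_gbs(256, 2133))  # XBO 256-bit DDR3-2133: ~68.3 GB/s
print(peak_bandwidth_gbs(384, 7000))  # 384-bit GDDR5-7000: 336.0 GB/s
```

The two outputs line up with the commonly cited 68 GB/s figure for the XBO's DDR3 pool and the ~336 GB/s ballpark implied for a 384-bit GDDR5-7000 setup.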

#507 ronvalencia
Member since 2008 • 29612 Posts

@Shewgenja:

http://fraghero.com/head-xbox-phil-spencer-reveals-first-play-xbox-scorpio/

“People who have spent thousands of dollars on a high-end PC are getting that experience. How do we bring that to scale in a console in your living room? That’s a big part of what Project Scorpio is about,” said Aaron Greenberg, Microsoft's head of marketing.

#508 PinchySkree
Member since 2012 • 1342 Posts

@ up to 30 with cut assets, filters and other garbage.

#509 Howmakewood
Member since 2015 • 7712 Posts

@PinchySkree said:

@ up to 30 with cut assets, filters and other garbage.

*4K Gaming

#510 tormentos
Member since 2003 • 33784 Posts

@FastRobby said:

Well, the GPU of the PS4 Pro is 4.2 TF and the one in Scorpio is 6 TF => a 42.86% difference. So you would actually get 8,228,763 pixels, which is practically 4K... But you're the only one who would be nitpicking this hard when the difference is 0.8%.

Yes, but the reality is that it can barely do it, which means that, unlike what dynamitecop and other lemmings claim, Scorpio will not be this huge beast on another level... hahaha

Not only that: like I already stated, 42% is not enough to go from 1800p to 4K at ultra or close to it, so if the Pro were at Xbox One level, Scorpio would be a tad over that.

I told you people so; the evidence was there that 42% more GPU power will not make a huge difference. In fact, you people have spent 3+ years downplaying the PS4's 40% advantage as nothing.
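The pixel arithmetic being argued over here does check out as a back-of-the-envelope. A quick sketch, assuming 1800p means 3200x1800 and that pixel throughput scales linearly with TFLOPS (a simplification):

```python
# Scale the PS4 Pro's 1800p pixel count by the Scorpio/Pro TFLOPS ratio.
pro_tflops, scorpio_tflops = 4.2, 6.0
pixels_1800p = 3200 * 1800    # 5,760,000 px
pixels_4k = 3840 * 2160       # 8,294,400 px (native 4K)

scale = scorpio_tflops / pro_tflops        # ~1.4286, i.e. +42.86%
scaled_pixels = pixels_1800p * scale       # ~8,228,571 px

shortfall = 1 - scaled_pixels / pixels_4k  # ~0.8% short of native 4K
print(f"{scaled_pixels:,.0f} px ({shortfall:.1%} short of 4K)")
```

That reproduces both numbers quoted in the thread: roughly 8.23 million pixels, about 0.8% short of native 4K.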

@ronvalencia said:

For the PC edition of Rise of the Tomb Raider, an overclocked RX-480's 4K/30 fps target doesn't require the PS4 Pro's checkerboard rendering. The PS4 Pro version of Rise of the Tomb Raider didn't match the PC's Ultra settings.

HIS Radeon RX-480 Roaring Turbo at 1338 Mhz has ~6.1 TFLOPS.

For future workloads, double rate FP16 can add extra TFLOPS resource.

An R9-390X 8 GB with greater than 310 GB/s effective memory bandwidth destroys an RX-480 8 GB with 264 GB/s effective memory bandwidth.

R9-390X has 5.9 TFLOPS.

RX-480 has 5.8 TFLOPS.

RX-470 (4.9 TFLOPS) will require checkerboard interpolation for 4K/30 fps target.

If a PC-ported game was designed for Vega 10 with 12.5 TFLOPS, targeting 4K/60 fps, then Scorpio/Vega 11 at 6 TFLOPS will need checkerboard interpolation for a 4K/60 fps target.

PS4 Pro's GPU is similar to RX-470D SKU which is less than the normal RX-470.

RX-470D has 4.5 TFLOPS and 218 GB/s physical memory bandwidth which is very close to PS4 Pro's basic parameters.

PS4 Pro is a half-assed upgrade, e.g. the effective memory bandwidth upgrade is just ~1.6X better than the PS4's. Sony added at least another 64-bit memory bus for DDR3, where they could instead have added more GDDR5-7000 memory bandwidth.

PS4 Pro already has a 320-bit memory bus PCB, i.e. 256-bit GDDR5-7000 + at least 64-bit DDR3-2133. Scorpio has a 384-bit GDDR5-7000 memory bus PCB.

I personally wouldn't buy RX-470D, RX-470, RX-480, Vega 11 (if AMD equipped Vega 11 with shit memory bandwidth).

Vega 10 should be able to nearly double the R9-390X's result for RE7.

Vega 11 and Scorpio are about half of Vega 10.

The Pro doesn't have 6 TF, so it doesn't need 320 GB/s; power is relative to bandwidth. Even better, it has some Vega features which save bandwidth as well.

I love how you say the 390X destroys the RX-480 when it does 8 frames more in Tomb Raider, but when the PS4 pulls close to 20 FPS more at 1080p in Doom vs. the XBO version, which when it drops to the mid-low 40s also drops in resolution to close to 720p, that's not destroying, right?

Did you even look at that benchmark properly? You claim bandwidth is what's carrying the 390X; I say it's probably driver issues. I see you completely missed this on purpose, as always.

Yeah, bandwidth is the issue, right? Oh wait, the Fury X has 8.6 TF and 512 GB/s; it destroys the R9-390X power-wise and has higher bandwidth too...

I'll wait for your usual spin that this game needs a driver fix, and seeing that this is from Capcom, yeah, I'm sure the problem isn't bandwidth.

Half-assed is the Xbox One with DDR3 and ESRAM, which has translated into the Xbox One being beaten in many games by a 72 GB/s 7770... lol

#511 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

PC gaming doesn't require a high-end rig to be able to play all the games.

Again, Scorpio's estimate is based on 5.9 TFLOPS Radeon R9-390X results.

The graphics detail settings in those benchmarks are higher than the XBO's.

The difference between the XBO and Scorpio is like the difference between an R7-360 at 853 MHz and an R9-390X OC (6 TFLOPS, 1070 MHz, 44 CU).

In terms of memory architecture, Scorpio is like the XBO's 256-bit DDR3-2133 scaled up to 384-bit GDDR5-7000, but without ESRAM. The XBO without ESRAM usage has 7770-like results, which may create a market need for a hardware upgrade.

Hahaha, that will be true when you show me Scorpio has a top-of-the-line i7 like in those benchmarks.

The DF article claims that Ryzen will probably not be in Scorpio, based on MS's own document warning about CPU usage. An 8-core Ryzen would need no such warnings, as 8 dual-threaded cores is 16 threads, enough to drive that 6 TF GPU. So from what I see, they are either using Jaguar again or something like it.

Ultra requires a good CPU because not only does it push higher quality, it also draws more things, which is a strain on the CPU.

So comparing against a 390X PC using an i7 will not yield the same results as Scorpio just because it has 6 TF; I think that is why MS made the warnings about CPU usage.


Hey Ron, if the 390X destroys the RX-480 because it does 8 more frames at 4K, what does the PS4 do to the Xbox One when it pumps out up to 13 FPS more and never drops from 60 FPS?

I'll wait again while you bring some charts that favor the Xbox One and ignore this one...

#512  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@tormentos:

It's not totally a bandwidth issue with the 390X vs. Fury; it's a VRAM thing. The 390X has 512-bit, 384 GB/s GDDR5 while Fury uses HBM at 512 GB/s.

AMD literally has to patch every new game that wants to allocate 4 GB+ and get the driver to stream data to the Fury at the right rate, to prevent stuttering from having to dump old data for new data on that 4 GB of HBM. Having to swap data constantly from DDR3/DDR4 to Fury's HBM causes the GPU to have hiccups (frame-pacing issues), lowering performance.

Polaris and Vega have more bandwidth-saving features than earlier GCN. But you also have to take into account the memory type and how many modules there are, which determine the bus width and bandwidth. It's stupid to cheap out on anything under 300 GB/s with a 6 TFLOP GPU and a CPU sharing a unified memory pool. The PS4 Pro having DDR3 allows them to allocate most of the GDDR5's 218 GB/s to the GPU, along with its delta color compression to save even more bandwidth.

They should be using a modified Ryzen CPU for the new Scorpio. There is no reason to pair another Jaguar, or any other older AMD-based CPU, with the console; it would bottleneck the system in more ways than one. Also, the console isn't in production and won't be for a while yet, so changes could happen; AMD has been working on Zen for a while, i.e. at least since 2014, and we know they had working prototypes last year. They don't need to have dual threads per core. All we know is that MS confirmed 8 cores, plus frame-rate upscaling: running graphics at 60 Hz but the CPU at 30 Hz and interpolating animation. If they are using Jaguar again, it had better be above 3 GHz. And it would be stupid if they are reusing Jaguar.

#513  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

@tormentos:

It's not totally a bandwidth issue with the 390X vs. Fury; it's a VRAM thing. The 390X has 512-bit, 384 GB/s GDDR5 while Fury uses HBM at 512 GB/s.

AMD literally has to patch every new game that wants to allocate 4 GB+ and get the driver to stream data to the Fury at the right rate, to prevent stuttering from having to dump old data for new data on that 4 GB of HBM. Having to swap data constantly from DDR3/DDR4 to Fury's HBM causes the GPU to have hiccups (frame-pacing issues), lowering performance.

Polaris and Vega have more bandwidth-saving features than earlier GCN. But you also have to take into account the memory type and how many modules there are, which determine the bus width and bandwidth. It's stupid to cheap out on anything under 300 GB/s with a 6 TFLOP GPU and a CPU sharing a unified memory pool. The PS4 Pro having DDR3 allows them to allocate most of the GDDR5's 218 GB/s to the GPU, along with its delta color compression to save even more bandwidth.

They should be using a modified Ryzen CPU for the new Scorpio. There is no reason to pair another Jaguar, or any other older AMD-based CPU, with the console; it would bottleneck the system in more ways than one. Also, the console isn't in production and won't be for a while yet, so changes could happen; AMD has been working on Zen for a while, i.e. at least since 2014, and we know they had working prototypes last year. They don't need to have dual threads per core. All we know is that MS confirmed 8 cores, plus frame-rate upscaling: running graphics at 60 Hz but the CPU at 30 Hz and interpolating animation. If they are using Jaguar again, it had better be above 3 GHz. And it would be stupid if they are reusing Jaguar.

But that is the problem: the DF article talks about the CPU, and it doesn't appear to be Ryzen. An 8-core Ryzen is 95 watts, which is far too much for an APU with a 6 TF GPU, and we know all the problems AMD had with Polaris: they claimed it had a certain TDP, but then the GPU was found to be pulling power from the PCIe bus.

"In the wake of CES, there is renewed speculation that Scorpio may feature more advanced Zen CPU cores. However, a throwaway comment within the Microsoft whitepaper on how developers may wish to use Scorpio's capabilities again makes this seem unlikely.

'Another option developers might consider is frame-rate upscaling - running graphics at 60Hz but the CPU at 30Hz and interpolating animation.'

PlayStation 4 Pro has offered several titles with high performance modes running on unlocked frame-rates - and notably none of them has managed to double performance consistently from 30fps to 60fps. Microsoft makes no claim that Scorpio is able to do so either and instead suggests a compromise - running GPU elements at twice the speed, while CPU-bound elements are interpolated. A move to far more powerful Zen cores would almost certainly make such advice redundant. We're not aware of any console titles that use this frame-rate upscaling technique - though there was a really impressive Force Unleashed 2 tech demo back in the day - but at the very least, the comment reaffirms our belief that Scorpio's CPU technology has not moved on in step with its GPU."

Ryzen is probably out of the question; an 8-core Ryzen is 95 watts and $249 at the cheapest.

So maybe a watered-down version, or an equivalent of Jaguar built on Ryzen tech.

#514  Edited By Howmakewood
Member since 2015 • 7712 Posts

@tormentos: It could still be Ryzen with low clocks and no SMT; AMD is bringing out 4-core non-SMT Ryzens as well.

So it would most likely be 2 of those, if the 8 cores holds true.

#515  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said:
@04dcarraher said:

@tormentos:

It's not totally a bandwidth issue with the 390X vs. Fury; it's a VRAM thing. The 390X has 512-bit, 384 GB/s GDDR5 while Fury uses HBM at 512 GB/s.

AMD literally has to patch every new game that wants to allocate 4 GB+ and get the driver to stream data to the Fury at the right rate, to prevent stuttering from having to dump old data for new data on that 4 GB of HBM. Having to swap data constantly from DDR3/DDR4 to Fury's HBM causes the GPU to have hiccups (frame-pacing issues), lowering performance.

Polaris and Vega have more bandwidth-saving features than earlier GCN. But you also have to take into account the memory type and how many modules there are, which determine the bus width and bandwidth. It's stupid to cheap out on anything under 300 GB/s with a 6 TFLOP GPU and a CPU sharing a unified memory pool. The PS4 Pro having DDR3 allows them to allocate most of the GDDR5's 218 GB/s to the GPU, along with its delta color compression to save even more bandwidth.

They should be using a modified Ryzen CPU for the new Scorpio. There is no reason to pair another Jaguar, or any other older AMD-based CPU, with the console; it would bottleneck the system in more ways than one. Also, the console isn't in production and won't be for a while yet, so changes could happen; AMD has been working on Zen for a while, i.e. at least since 2014, and we know they had working prototypes last year. They don't need to have dual threads per core. All we know is that MS confirmed 8 cores, plus frame-rate upscaling: running graphics at 60 Hz but the CPU at 30 Hz and interpolating animation. If they are using Jaguar again, it had better be above 3 GHz. And it would be stupid if they are reusing Jaguar.

But that is the problem: the DF article talks about the CPU, and it doesn't appear to be Ryzen. An 8-core Ryzen is 95 watts, which is far too much for an APU with a 6 TF GPU, and we know all the problems AMD had with Polaris: they claimed it had a certain TDP, but then the GPU was found to be pulling power from the PCIe bus.

"In the wake of CES, there is renewed speculation that Scorpio may feature more advanced Zen CPU cores. However, a throwaway comment within the Microsoft whitepaper on how developers may wish to use Scorpio's capabilities again makes this seem unlikely.

'Another option developers might consider is frame-rate upscaling - running graphics at 60Hz but the CPU at 30Hz and interpolating animation.'

PlayStation 4 Pro has offered several titles with high performance modes running on unlocked frame-rates - and notably none of them has managed to double performance consistently from 30fps to 60fps. Microsoft makes no claim that Scorpio is able to do so either and instead suggests a compromise - running GPU elements at twice the speed, while CPU-bound elements are interpolated. A move to far more powerful Zen cores would almost certainly make such advice redundant. We're not aware of any console titles that use this frame-rate upscaling technique - though there was a really impressive Force Unleashed 2 tech demo back in the day - but at the very least, the comment reaffirms our belief that Scorpio's CPU technology has not moved on in step with its GPU."

Ryzen is probably out of the question; an 8-core Ryzen is 95 watts and $249 at the cheapest.

So maybe a watered-down version, or an equivalent of Jaguar built on Ryzen tech.

An 8-core Ryzen at >3 GHz consumes 95 watts. A 2.4 GHz version could consume 45 watts; a 2.0 GHz version could consume 35 watts.
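A faster-than-linear power drop is at least plausible, because dynamic CPU power scales roughly with frequency times voltage squared, and lower clocks allow lower voltage. A toy sketch; the voltage figures are invented for illustration, not actual Ryzen numbers:

```python
# Toy dynamic-power model: P ~ f * V^2, with switching capacitance folded
# into the baseline figure. Downclocking lets voltage drop too, so power
# falls faster than linearly with frequency.

def scaled_power(p_base: float, f_base: float, v_base: float,
                 f_new: float, v_new: float) -> float:
    """Scale a baseline power figure to a new frequency/voltage point."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# 95 W at 3.5 GHz / 1.35 V, downclocked to 2.4 GHz / 1.00 V (hypothetical)
print(round(scaled_power(95.0, 3.5, 1.35, 2.4, 1.00)))  # ~36 W
```

Under these assumed voltages the model lands in the same ballpark as the ~45 W estimate above; the exact figure depends entirely on the chip's real voltage/frequency curve.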

The alternative

The AMD Athlon X4 845 posts Excavator V1's 103 fps result, which is better than the 30 fps XBO/PS4 Jaguar result.

The AMD Athlon X4 845 may need software CPU tricks to hit a 60 fps target in GTA5.

An R9-290X with a high-end Intel CPU yields 82 fps, which is useless for 60-fps-limited HDTVs.

At 4K resolution, R9-390X is mostly GPU bound not CPU bound.

#516  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

Read http://gamingbolt.com/ps4-pro-bandwidth-is-potential-bottleneck-for-4k-but-a-thought-through-tradeoff-little-nightmares-dev

At 4K, the PS4 Pro has a memory bandwidth bottleneck.

Scorpio's first-party games were stated to render at 4K, just like my R9-390X results.

The graphics detail settings in those benchmarks are higher than the XBO's.

#517 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts
@tormentos said:
@FastRobby said:

Well, the GPU of the PS4 Pro is 4.2 TF and the one in Scorpio is 6 TF => a 42.86% difference. So you would actually get 8,228,763 pixels, which is practically 4K... But you're the only one who would be nitpicking this hard when the difference is 0.8%.

Yes, but the reality is that it can barely do it, which means that, unlike what dynamitecop and other lemmings claim, Scorpio will not be this huge beast on another level... hahaha

Not only that: like I already stated, 42% is not enough to go from 1800p to 4K at ultra or close to it, so if the Pro were at Xbox One level, Scorpio would be a tad over that.

Sorry, but you don't get to say this AT ALL. You were the one who kept posting those Digital Foundry topics comparing a game on both platforms just to show that the PS4 came out on top. Now that the tables have turned, it suddenly doesn't matter anymore? Hypocrite.

Scorpio is much stronger than the PS4 Pro, and you had better learn to deal with that. You will see the difference in graphics quality, pixel counts, and fps. Scorpio is console king.

#518 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

An 8-core Ryzen at >3 GHz consumes 95 watts. A 2.4 GHz version could consume 45 watts; a 2.0 GHz version could consume 35 watts.

Wait, so just by dropping 600 MHz, Ryzen will somehow lose 50 watts?

More than half its TDP?

That seems completely off: so at 2.4 GHz Ryzen consumes 45 watts, but raising it 600 MHz more will draw 50 more watts?

You and your deluded arguments to help MS...

By the way, a 2.4 GHz Ryzen still costs the same to produce as a 3.0 GHz one, since it is the same CPU but held back.

@howmakewood said:

@tormentos: It could still be Ryzen with low clocks and no SMT, AMD is bringing 4 core non SMT Ryzen's out as well

So it would most likely be 2 of those, if the 8 cores holds true.

Then it is not Ryzen as found on PC, like I already claimed. Ryzen is dual-threaded, which means if you kill the second thread, the performance you get from it will not be the same at all; worse if you downgrade the speed to 2.4 GHz or lower.

And the 4-core one is out of the question because MS claimed 8 cores, not 4, so even if you have 4 cores and 8 threads, that still doesn't match MS's claims.

#519  Edited By dynamitecop
Member since 2004 • 6395 Posts

@tormentos said:
@ronvalencia said:

PC gaming doesn't require a high-end rig to be able to play all the games.

Again, Scorpio's estimate is based on 5.9 TFLOPS Radeon R9-390X results.

The graphics detail settings in those benchmarks are higher than the XBO's.

The difference between the XBO and Scorpio is like the difference between an R7-360 at 853 MHz and an R9-390X OC (6 TFLOPS, 1070 MHz, 44 CU).

In terms of memory architecture, Scorpio is like the XBO's 256-bit DDR3-2133 scaled up to 384-bit GDDR5-7000, but without ESRAM. The XBO without ESRAM usage has 7770-like results, which may create a market need for a hardware upgrade.

Hahaha, that will be true when you show me Scorpio has a top-of-the-line i7 like in those benchmarks.

The DF article claims that Ryzen will probably not be in Scorpio, based on MS's own document warning about CPU usage. An 8-core Ryzen would need no such warnings, as 8 dual-threaded cores is 16 threads, enough to drive that 6 TF GPU. So from what I see, they are either using Jaguar again or something like it.

Ultra requires a good CPU because not only does it push higher quality, it also draws more things, which is a strain on the CPU.

So comparing against a 390X PC using an i7 will not yield the same results as Scorpio just because it has 6 TF; I think that is why MS made the warnings about CPU usage.

Is that a fact? You know shit all about CPU load or the effect GRAPHICAL (GPU-bound) settings have on CPU load...

Protip: they fucking don't.

Low to Ultra brings absolutely no CPU load changes, none, zero; it's all changed state on the GPU, which only shows up as a framerate difference and VRAM load differences. Nothing is reflected on the CPU.

P.S. Digital Foundry's Jaguar CPU claim is based on nothing more than 30 Hz-to-60 Hz interpolated animation. That's it, that's their entire argument, and it's weak as all hell. It's something that could equally be done on a more powerful CPU to save resources and hand them to other, much more important tasks. If you're rendering a game at 60 FPS and you want animations to bog down the CPU less, you can introduce interpolation to save valuable resources regardless of how weak or powerful the processor is. It will look perceptibly identical to 60 Hz animation without the cost of doubling the calculation on the processor...
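The interpolation technique being described can be sketched in a few lines: the simulation produces animation poses at 30 Hz, and each 60 Hz rendered frame blends between the two most recent poses. The names and the one-dimensional "pose" here are illustrative, not from any real engine:

```python
# Frame-rate upscaling sketch: CPU-side animation ticks at 30 Hz, the
# renderer runs at 60 Hz and lerps between the two most recent poses.

SIM_DT = 1 / 30      # seconds per CPU animation tick
RENDER_DT = 1 / 60   # seconds per rendered frame

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def interpolated_pose(prev_pose: float, curr_pose: float,
                      time_since_tick: float) -> float:
    # Blend factor in [0, 1): progress between two 30 Hz ticks.
    t = time_since_tick / SIM_DT
    return lerp(prev_pose, curr_pose, t)

# Two consecutive 30 Hz poses; the extra 60 Hz frame falls halfway between.
print(interpolated_pose(0.0, 1.0, 0.0))        # 0.0 (on the sim tick)
print(interpolated_pose(0.0, 1.0, RENDER_DT))  # 0.5 (in-between frame)
```

The point of contention in the thread is only where this runs, not how: interpolating a pose is far cheaper than recomputing the full animation update, whatever the CPU.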

#520  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos: The 95 watts was for ~3.5 GHz, i.e. more than your 600 MHz assertion; it's a ~1000 MHz reduction.

My mobile Ivy Bridge Core i7 is 45 watts at 2.4 GHz (all 4 cores active), and that includes the HD 4000 IGP.

Ryzen is similar to the late-model Intel Core i series, which scales from laptops to high-end desktops.

#522  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@dynamitecop:

Your explanation of the CPU not affecting GPU performance, based on usage at low-to-ultra settings, is off.

The larger the workload for the GPU (higher resolution and bigger data chunks), the longer it takes to process the image, which lets the CPU keep up better in feeding the GPU work. The smaller the workload, the faster the GPU gets the work done, and the faster the CPU has to send the GPU data. If the CPU is bogged down with other tasks, it can't keep up, causing what's called a CPU bottleneck.

Your TW3 example is a poor way to show CPU usage increasing with GPU load. You're running an i7 at 4.4 GHz, which is able to keep up with the GPU in really any combination you're testing with. Switching between low and ultra at the same high resolution in a single spot isn't doing much more for CPU usage, since your CPU can keep up with the GPU.

What you should do is tone down your OC back to stock clocks and watch your CPU usage climb in the same tests. And if you take an AMD FX-8350, you would see the CPU usage skyrocket to an average of around 70%+, vs. a stock i7 2600K that averages around 60% in a city like Novigrad, both using the same Titan X.

This is why you don't pair a weak CPU with a strong GPU: the CPU can't feed the GPU what it needs as fast as it needs it. It's why Intel's CPUs kill AMD's using the same GPUs; Intel CPUs can process a lot more per cycle than AMD's offerings.
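The bottleneck argument above boils down to a simple model: per-frame time is roughly bounded by whichever of the CPU or GPU takes longer, so shrinking GPU work (e.g. dropping resolution or settings) eventually exposes the CPU. A toy sketch with invented millisecond figures:

```python
# Toy frame-time model: the slower of CPU and GPU sets the frame rate.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when frame time = max(CPU time, GPU time)."""
    return 1000 / max(cpu_ms, gpu_ms)

CPU_MS = 16.0                      # fixed per-frame CPU cost (weak CPU)
print(fps(CPU_MS, gpu_ms=33.0))    # ~30 fps at "4K": GPU-bound, CPU hidden
print(fps(CPU_MS, gpu_ms=8.0))     # 62.5 fps at "1080p": now CPU-bound
```

This also illustrates the console-design point made in the thread: pushing resolution up hides a slow CPU behind a long GPU frame, while lowering it makes the same CPU the limiting factor.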

#523  Edited By dynamitecop
Member since 2004 • 6395 Posts
@04dcarraher said:

@dynamitecop:

Your explanation of the CPU not affecting GPU performance, based on usage at low-to-ultra settings, is off.

The larger the workload for the GPU (higher resolution and bigger data chunks), the longer it takes to process the image, which lets the CPU keep up better in feeding the GPU work. The smaller the workload, the faster the GPU gets the work done, and the faster the CPU has to send the GPU data. If the CPU is bogged down with other tasks, it can't keep up, causing what's called a CPU bottleneck.

Your TW3 example is a poor way to show CPU usage increasing with GPU load. You're running an i7 at 4.4 GHz, which is able to keep up with the GPU in really any combination you're testing with. Switching between low and ultra at the same resolution, in a spot where you're not doing anything and with no NPCs around, isn't going to drive up CPU usage with that i7 2600K at 4.4 GHz.

What you should do is tone down your OC back to stock clocks and watch your CPU usage climb in the same tests. And if you take an AMD FX-8350, you would see the CPU usage skyrocket to an average of around 70%+, vs. a stock i7 2600K that averages around 60% in a city like Novigrad.

This is why you don't pair a weak CPU with a strong GPU: the CPU can't feed the GPU what it needs as fast as it needs it. It's why Intel's i5s and i7s kill AMD's CPUs using the same GPUs; Intel CPUs can process a lot more per cycle than AMD.

Incorrect: even cutting my CPU frequency in half yields almost the same results. Yes, that's from 4.4GHz down to 2.2GHz... a 50% reduction in CPU capability, and the numbers are almost the same. Think about that: the processor was 100% more powerful before, yet the results are essentially the same, within margins of error and differences explainable by AI spawns, angles, different geometry, etc. Games operating at this resolution and these settings become almost entirely GPU bound; the CPU is an afterthought, and changes to it make little to no difference whatsoever, even when it's cut in half. It all scales exactly the same regardless.

Avatar image for 04dcarraher
04dcarraher

23832

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#524  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@dynamitecop:

You really need to go into a city and test it, because you're idling in the middle of nowhere. Even TechSpot's test, using an i7 4790K and an FX-9590 at 4.5GHz and 2.5GHz, saw virtually the same fps at the different clock rates. Something isn't kosher.

Look at Doom as an example: we can see average CPU usage is higher on the AMD FX 8-core while the i7 2600K's usage is lower in one test, and then clock rates really affected GPU performance in another test. Saying that GPU load does not affect CPU usage is wrong, because if you lower the resolution you let the GPU get its work done faster, forcing the CPU to feed it data at a faster rate.

Upping the resolution to the point where the GPU caps out and becomes the main bottleneck slows the rate at which the CPU has to send the GPU work. This is the method console devs use to offset the weak CPUs, and when something that isn't the norm gets tacked onto the CPU, you get even larger fps fluctuations. You should try a city test at 4.4GHz and 2.2GHz and see what happens.

Avatar image for dynamitecop
dynamitecop

6395

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#525 dynamitecop
Member since 2004 • 6395 Posts

@04dcarraher said:

@dynamitecop:

You really need to go into a city and test it, because you're idling in the middle of nowhere. Even TechSpot's test, using an i7 4790K and an FX-9590 at 4.5GHz and 2.5GHz, saw virtually the same fps at the different clock rates. Something isn't kosher.

Look at Doom as an example: we can see average CPU usage is higher on the AMD FX 8-core while the i7 2600K's usage is lower in one test, and then clock rates really affected GPU performance in another test. Saying that GPU load does not affect CPU usage is wrong, because if you lower the resolution you let the GPU get its work done faster, forcing the CPU to feed it data at a faster rate.

Upping the resolution to the point where the GPU caps out and becomes the main bottleneck slows the rate at which the CPU has to send the GPU work. This is the method console devs use to offset the weak CPUs, and when something that isn't the norm gets tacked onto the CPU, you get even larger fps fluctuations. You should try a city test at 4.4GHz and 2.2GHz and see what happens.

As I've emphatically stated again and again, graphical settings do not impact the CPU; the game is entirely GPU bound at this resolution, as are 99 out of 100 modern game releases.

Do you see how CPU usage increased at 2.2GHz in the city? Do you know why? Increased structural geometry and a higher AI load on the processor: since the CPU is weaker, usage sits higher. Again, this has nothing to do with the GPU or graphical settings whatsoever...

It's geometry and AI calculations, more people, more complex structures, both of which are handled by the CPU... Notice the framerates are essentially unchanged regardless of whether the processor is at 4.4GHz or 2.2GHz?

Take the L and move on, dude. You're arguing with reality that I can push and prove again and again, regardless of scenario.

----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Avatar image for 04dcarraher
04dcarraher

23832

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#526  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@dynamitecop:

You really need to look at your results again..... you proved the point I was making: CPU power affects GPU performance once you start taxing the CPU.

2.2GHz vs 4.4GHz at ultra, uncapped, saw more than double the CPU usage (48% vs 22%), and yet the performance is virtually the same. So ask yourself: what happens when you have a CPU that is 50% slower clock for clock and is clocked at 2GHz, using the same GPU? You're going to be taxing that CPU to where it can't keep up, even with a high resolution that makes the game more GPU limited.

Again, this is why AMD FX 8/9 CPUs can't keep up with the 4c/8t i7s from 2nd gen onward. CPU processing power affects GPU performance, even when you make the game more GPU bound by increasing resolution.

Avatar image for dynamitecop
dynamitecop

6395

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#527  Edited By dynamitecop
Member since 2004 • 6395 Posts

@04dcarraher said:

@dynamitecop:

You really need to look at your results again.....

2.2GHz vs 4.4GHz at ultra, uncapped, saw double the CPU usage (48% vs 22%), and yet the performance is virtually the same. So ask yourself: what happens when you have a CPU that is 50% slower clock for clock and is clocked at 2GHz, using the same GPU? You're going to be taxing that CPU to where it can't keep up, even with a resolution that makes the game GPU limited.

First off, who is to say the CPU is going to be slower? This is a six-year-old 2600K running at half speed, only 2.2GHz, and at that frequency only 49% of it is being used at peak, on a PC, a multi-function device running a bloated OS with countless background tasks and only general optimization available...

Even if the CPU were exactly 50% slower than this 2600K at 2.2GHz (which is ridiculous), game performance would still be identical to what you're seeing here, since the game and everything else the OS is running are only using 49% of its available compute capability. With a less bloated OS, fixed-hardware optimization, and compute almost entirely committed to the game itself, you'd probably see somewhere in the realm of only 70-75% usage.

Avatar image for 04dcarraher
04dcarraher

23832

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#528  Edited By 04dcarraher
Member since 2004 • 23832 Posts
@dynamitecop said:

First off, who is to say the CPU is going to be slower? This is a six-year-old 2600K running at half speed, only 2.2GHz, and at that frequency only 49% of it is being used at peak, on a PC, a multi-function device running a bloated OS with countless background tasks and only general optimization available...

Even if the CPU were exactly 50% slower than this 2600K at 2.2GHz (which is ridiculous), game performance would still be identical to what you're seeing here, since the game and everything else the OS is running are only using 49% of its available compute capability. With a less bloated OS, fixed-hardware optimization, and compute almost entirely committed to the game itself, you'd probably see somewhere in the realm of only 70-75% usage.

You proved the point I was making: CPU power affects GPU performance once you start taxing the CPU. Going from 4.4GHz to 2.2GHz showed exactly that; usage increased because the CPU can't process as much per cycle.

If you used an AMD FX CPU and did the same thing you did with your i7, the results would be much different, and worse.

Concluding that performance would be the same even with a weaker CPU is false. The CPU load from the game and the GPU workload all add up.

TechSpot took an old Q6600 and paired it with a GTX 1060 and a GTX 1070. Even at 3.1GHz, the Q6600 gets virtually the same performance with the 1070 as with the 1060, and their tests with other games show the same results with that CPU: a CPU bottleneck. The point is that they will have to use a stronger CPU than Jaguar at 2.4GHz to fully utilize the 6 TFLOP GPU even if they make the game GPU bound, and that's before considering games that put more stress on the CPU because they're more complex on the CPU side. That Jaguar at 2.4GHz would be like running your i7 at 1.4GHz, which would make TW3's usage skyrocket and bottleneck that 290X even at that high resolution.
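The "Jaguar at 2.4GHz would be like an i7 at 1.4GHz" comparison above amounts to a per-clock throughput estimate. As a rough sketch (the ~60% relative-IPC figure is an assumption for illustration, not a measured value):

```python
# Crude per-core throughput estimate: relative IPC x clock (GHz).
# The IPC ratio below is an illustrative guess, not a benchmark result.

I7_IPC = 1.0      # Sandy Bridge i7 taken as the baseline
JAGUAR_IPC = 0.6  # assumed: Jaguar does ~60% of the work per cycle

jaguar_throughput = JAGUAR_IPC * 2.4           # Jaguar core at 2.4 GHz
equivalent_i7_ghz = jaguar_throughput / I7_IPC

print(equivalent_i7_ghz)  # roughly 1.44, i.e. "an i7 at ~1.4 GHz"
```

Under that assumption, a 2.4GHz Jaguar core lands well below even a deliberately downclocked 2.2GHz i7, which is the whole point of the bottleneck argument.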

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#529 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

@tormentos: 95 watts was for ~3.5 Ghz. i.e. it's more than your 600 Mhz assertion e.g. ~1000 Mhz reduction.

My Core i7 Ivybridge mobile has 45 watts at 2.4 Ghz (all 4 cores active) which includes HD 4000 IGP.

RYZEN is similar to late model Intel Core I series which scales from laptops to high end desktops.

You were the one who claimed 95 watts at 3.0GHz, not me, so I challenged your claim.

Your Ivy Bridge means shit to us since AMD doesn't make it, and Ryzen isn't made by Intel, which has had better performance and TDP than AMD for years.

@dynamitecop said:

Incorrect: even cutting my CPU frequency in half yields almost the same results. Yes, that's from 4.4GHz down to 2.2GHz... a 50% reduction in CPU capability, and the numbers are almost the same. Think about that: the processor was 100% more powerful before, yet the results are essentially the same, within margins of error and differences explainable by AI spawns, angles, different geometry, etc. Games operating at this resolution and these settings become almost entirely GPU bound; the CPU is an afterthought, and changes to it make little to no difference whatsoever, even when it's cut in half. It all scales exactly the same regardless.

That is because your CPU is a damn i7, buffoon.

Ultra settings do tax the CPU: you are rendering more things, and drawing more stuff taxes the CPU because issuing draws is a CPU process. The more you draw, the more performance drops.

That's one of the things DX12, Vulkan and Mantle do to help lower-end CPUs.

So take an 8-core AMD CPU, take away half the speed, and you will have a performance dip.

Case in point: look at the i7 with the 2GHz drop, and nothing happens, not a single frame.

Now look at the FX-9590 with the same 2GHz drop: 10 fps dropped on the spot on average, and the minimum sinks 13 fps lower.

So yeah, your CPU isn't showing the drop because it is a damn i7; an AMD CPU would surely show a dip.

Avatar image for dynamitecop
dynamitecop

6395

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#530  Edited By dynamitecop
Member since 2004 • 6395 Posts
@04dcarraher said:

@dynamitecop:

@04dcarraher said:
@dynamitecop said:

First off, who is to say the CPU is going to be slower? This is a six-year-old 2600K running at half speed, only 2.2GHz, and at that frequency only 49% of it is being used at peak, on a PC, a multi-function device running a bloated OS with countless background tasks and only general optimization available...

Even if the CPU were exactly 50% slower than this 2600K at 2.2GHz (which is ridiculous), game performance would still be identical to what you're seeing here, since the game and everything else the OS is running are only using 49% of its available compute capability. With a less bloated OS, fixed-hardware optimization, and compute almost entirely committed to the game itself, you'd probably see somewhere in the realm of only 70-75% usage.

You proved the point I was making: CPU power affects GPU performance once you start taxing the CPU. Going from 4.4GHz to 2.2GHz showed exactly that; usage increased because the CPU can't process as much per cycle.

If you used an AMD FX CPU and did the same thing you did with your i7, the results would be much different, and worse.

Concluding that performance would be the same even with a weaker CPU is false. The CPU load from the game and the GPU workload all add up.

TechSpot took an old Q6600 and paired it with a GTX 1060 and a GTX 1070. Even at 3.1GHz, the Q6600 gets virtually the same performance with the 1070 as with the 1060, and their tests with other games show the same results with that CPU: a CPU bottleneck. The point is that they will have to use a stronger CPU than Jaguar at 2.4GHz to fully utilize the 6 TFLOP GPU even if they make the game GPU bound, and that's before considering games that put more stress on the CPU because they're more complex on the CPU side. That Jaguar at 2.4GHz would be like running your i7 at 1.4GHz, which would make TW3's usage skyrocket and bottleneck that 290X even at that high resolution.

@tormentos said:
@ronvalencia said:

@tormentos: 95 watts was for ~3.5 Ghz. i.e. it's more than your 600 Mhz assertion e.g. ~1000 Mhz reduction.

My Core i7 Ivybridge mobile has 45 watts at 2.4 Ghz (all 4 cores active) which includes HD 4000 IGP.

RYZEN is similar to late model Intel Core I series which scales from laptops to high end desktops.

You were the one who claimed 95 watts at 3.0GHz, not me, so I challenged your claim.

Your Ivy Bridge means shit to us since AMD doesn't make it, and Ryzen isn't made by Intel, which has had better performance and TDP than AMD for years.

@dynamitecop said:

Incorrect: even cutting my CPU frequency in half yields almost the same results. Yes, that's from 4.4GHz down to 2.2GHz... a 50% reduction in CPU capability, and the numbers are almost the same. Think about that: the processor was 100% more powerful before, yet the results are essentially the same, within margins of error and differences explainable by AI spawns, angles, different geometry, etc. Games operating at this resolution and these settings become almost entirely GPU bound; the CPU is an afterthought, and changes to it make little to no difference whatsoever, even when it's cut in half. It all scales exactly the same regardless.

That is because your CPU is a damn i7, buffoon.

Ultra settings do tax the CPU: you are rendering more things, and drawing more stuff taxes the CPU because issuing draws is a CPU process. The more you draw, the more performance drops.

That's one of the things DX12, Vulkan and Mantle do to help lower-end CPUs.

So take an 8-core AMD CPU, take away half the speed, and you will have a performance dip.

Case in point: look at the i7 with the 2GHz drop, and nothing happens, not a single frame.

Now look at the FX-9590 with the same 2GHz drop: 10 fps dropped on the spot on average, and the minimum sinks 13 fps lower.

So yeah, your CPU isn't showing the drop because it is a damn i7; an AMD CPU would surely show a dip.

Jesus, you people are incredibly ignorant and make the dumbest posts... Do you know absolutely nothing about CPU allocation and core performance as they relate to resolution?

All of these tests and results you're posting are from 1080p...

"1920x1080" "@1080p"

You're posting CPU results from a resolution where CPU performance still matters a lot, so obviously a more powerful per-core Intel CPU is going to hold its advantage at a lower resolution... We're discussing 3K and 4K resolutions here, three to four times as many pixels, where almost any game becomes completely GPU bound, and you're using CPU-sensitive 1920x1080 results as some kind of damning comparison. It's stupid beyond belief; I can't even believe you have the nerve to post this crap...

Just stupid... As resolution decreases, a game engine becomes more and more CPU bound; the more you increase the resolution, the more GPU bound it becomes and the less dependent on CPU performance...

Rise of the Tomb Raider @ Very High settings @ 1920x1080 2.2Ghz 2600k...

62% CPU Usage...

Rise of the Tomb Raider @ Very High settings @ 3200x1800 2.2Ghz 2600k...

14% CPU Usage...

Please for the love of all that is holy, take a hike, you guys have no idea what you're talking about...

Want some more reality? I can do this all night.
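The 62% vs 14% readings above fall straight out of a simple picture: the CPU's per-frame cost stays roughly fixed while the GPU's cost scales with pixel count, so reported CPU usage is just the share of each frame the CPU spends busy. A sketch with illustrative numbers chosen to mimic those readings:

```python
# Reported CPU usage in a GPU-bound game is (roughly) the fraction of each
# frame the CPU spends working before it has to wait on the GPU. The frame
# costs below are illustrative, picked to mirror the 62%/14% readings above.

def cpu_usage_pct(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)  # GPU-bound whenever gpu_ms > cpu_ms
    return 100.0 * cpu_ms / frame_ms

print(cpu_usage_pct(10, 16))  # 1080p-ish GPU load: 62.5% busy
print(cpu_usage_pct(10, 70))  # 1800p-ish GPU load: ~14% busy, same CPU work
```

Raising resolution never reduces the CPU's actual work per frame; it just stretches the frame so the same work is a smaller fraction of it.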

Avatar image for me2002
me2002

3058

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#531 me2002
Member since 2002 • 3058 Posts

But I thought power didn't matter to lemmings.....no casual is gonna be counting those pixels right?

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#532  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos: I claimed ">3" Ghz which is short for "greater than 3 Ghz". ">" is a basic math symbol for "greater than".

@tormentos:

Your ivybridge mean shit to us since AMD doesn't make it,and Ryzen isn't make by intel which has better performance and TDP than AMD for years.

The mobile Raven Ridge APU's 35W TDP target is similar to Intel's mobile Core i7 series at 35-to-45 watts.

Notice

1. Stoney Ridge's two-core Excavator V2 replaces Carrizo-L's four Puma CPU cores.

2. AMD is implying Stoney Ridge's two Excavator V2 cores ~= Carrizo-L's four Puma CPU cores.

Excavator V1/V2 has better perf/watt than the older Steamroller/Piledriver/Bulldozer.

From 2016 H2 to 2017 H1, Excavator V2 displaced the Puma CPU IP.

Alien Isolation's 30 fps result on PS4 is linked to Jaguar at 1.6GHz with very little multithreading usage.

For Alien Isolation on PS4 to target 60 fps with that little multithreading, Jaguar would need a 3.2GHz clock speed, at which point they might as well use the Excavator CPU IP instead of Puma.

@tormentos:

By the way a 2.4GHz Ryzen still costs the same to produce as a 3.0GHz one, since it is the same CPU but held back.

By the way, an SR3 with 4 active Ryzen cores and 4 disabled cores still costs the same to produce as the version with 8 active Ryzen cores.

An FX-4xxx with two active modules (4 threads) and two disabled modules still costs the same to produce as an FX-8xxx with four active modules (8 threads).

You continually missed my Excavator CPU gaming benchmarks.

Your post means shit.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#533  Edited By ronvalencia
Member since 2008 • 29612 Posts

@04dcarraher said:

@dynamitecop:

You really need to go into a city and test it because when your idling in a middle of nowhere even Techpot's test using i7 4790k and fX 9590 at4.5ghz and 2.5ghz, seen virtually same fps at the different clockrates. Something isnt kosher.

Look at Doom as an example, we can see cpu average usage is higher on AMD FX 8 while the i7 2600k usage is lower in one test, and then clockrates really affected gpu performance on another test. Saying that gpu load does not affect cpu usage is wrong, because if your lowered the resolution you allow the gpu to get work done faster causing the cpu to feed gpu data at a faster rate.

Upping resolution to the point where the gpu caps at is the main bottleneck slows the rate the cpu has to send gpu work. This method console devs use to off set the weak cpus, and when something gets tacked on the cpu that isnt the norm you get even larger fps fluctuations. Should try a city test at 4.4ghz and 2.2ghz and see the what happens.

Nearly pointless benchmarks since 60 Hz is a typical limit for HDTVs.

Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#534 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@me2002 said:

But I thought power didn't matter to lemmings.....no casual is gonna be counting those pixels right?

I thought power was super important to cows, suddenly it isn't anymore? How weird...

Avatar image for no-scope-AK47
no-scope-AK47

3755

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#535 no-scope-AK47
Member since 2012 • 3755 Posts

The problem with these "new" consoles is they must play like the old ones. So if the XB1/PS4 run at 30fps, the new ones must use 30fps online too.

Avatar image for loe12k
loe12k

3465

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#536  Edited By loe12k
Member since 2013 • 3465 Posts

The reality is, if DF is right, the Scorpio will not have many 4K games. The CPU has to be new or it will be just like the Pro: we'll see nicer visuals than the Pro, but the frame rate will be all over the place with Jaguar cores.

Avatar image for me2002
me2002

3058

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#538 me2002
Member since 2002 • 3058 Posts

@FastRobby said:
@me2002 said:

But I thought power didn't matter to lemmings.....no casual is gonna be counting those pixels right?

I thought power was super important to cows, suddenly it isn't anymore? How weird...

how was power super important? It's all about exclusives like it always has been!

Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#539 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@me2002 said:
@FastRobby said:
@me2002 said:

But I thought power didn't matter to lemmings.....no casual is gonna be counting those pixels right?

I thought power was super important to cows, suddenly it isn't anymore? How weird...

how was power super important? It's all about exclusives like it always has been!

Sure, and all the topics about the Digital Foundry comparisons were all a dream?

Avatar image for dakur
Dakur

3275

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#540 Dakur
Member since 2014 • 3275 Posts

@FastRobby said:
@me2002 said:
@FastRobby said:
@me2002 said:

But I thought power didn't matter to lemmings.....no casual is gonna be counting those pixels right?

I thought power was super important to cows, suddenly it isn't anymore? How weird...

how was power super important? It's all about exclusives like it always has been!

Sure, and all the topics about the Digital Foundry comparisons were all a dream?

Didn't QuadKnight make those? And he never posted his opinions, just the results from the analysis and the ones I remember were relatively close.

Avatar image for me2002
me2002

3058

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#541 me2002
Member since 2002 • 3058 Posts

@FastRobby said:
@me2002 said:
@FastRobby said:
@me2002 said:

But I thought power didn't matter to lemmings.....no casual is gonna be counting those pixels right?

I thought power was super important to cows, suddenly it isn't anymore? How weird...

how was power super important? It's all about exclusives like it always has been!

Sure, and all the topics about the Digital Foundry comparisons were all a dream?

And who were the ones downplaying all the DF comparisons? The lemmings, of course! Saying power didn't matter, until Scorpio is announced and suddenly power is the most important thing! Lems now tell me how the PS4 is only winning because it's more powerful and how Scorpio is gonna change all that.... when power didn't matter before. Hahahaha

Avatar image for deactivated-5a30e101a977c
deactivated-5a30e101a977c

5970

Forum Posts

0

Wiki Points

0

Followers

Reviews: 5

User Lists: 0

#542 deactivated-5a30e101a977c
Member since 2006 • 5970 Posts

@dakur said:
@FastRobby said:
@me2002 said:
@FastRobby said:

I thought power was super important to cows, suddenly it isn't anymore? How weird...

how was power super important? It's all about exclusives like it always has been!

Sure, and all the topics about the Digital Foundry comparisons were all a dream?

Didn't QuadKnight make those? And he never posted his opinions, just the results from the analysis and the ones I remember were relatively close.

tormentos made a lot of them, and I think even Shewgenja sometimes, or a lem for the 1-2 times Xbox One suddenly came out on top.

@me2002 said:

And who were the ones downplaying all the DF comparisons. Of course the lemmings! Saying power didn't matter, until the Scorpio is announced then suddenly power is the most important thing! Lems now tell me how PS4 is only winning because it's more powerful and how Scorpio is gonna change all that....when power didn't matter before. Hahahaha

Obviously that's the point, right... Everyone is a hypocrite. Power doesn't matter when you're losing, but it matters when you're winning. Clearly, based on sales, power matters, and so does hype.

Avatar image for me2002
me2002

3058

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#543 me2002
Member since 2002 • 3058 Posts

@FastRobby: haha I agree, everyone's a hypocrite, I'm just playing my part :P

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#544  Edited By ronvalencia
Member since 2008 • 29612 Posts

@loe12k said:

The reality is if DF is right the Scorpio will not have many 4K games. CPU has to be new or it be just like the PRO. We see nicer visuals than the PRO, but the frame rate will be all over the place with jaguar cores.

http://www.gamasutra.com/view/news/281742/Scorpio_will_render_Microsofts_firstparty_titles_in_native_4K.php

During an interview with USA Today, Microsoft Studios Publishing general manager Shannon Loftis explained all first-party games that launch during the Scorpio's lifecycle will be able to natively render at 4K on the high-end machine.

"Any games we're making that we're launching in the Scorpio time frame, we're making sure they can natively render at 4K," said Loftis, adding that Microsoft is looking at ways to bring VR experiences to the system.

Scorpio's first-party games were stated to render at 4K, just like the R9 390X's results.

The graphics details from the above benchmarks are higher than XBO's graphics detail settings.

Three first-party games establish a reference GPU, i.e. Scorpio being an R9 390X-like solution.

--------------

Scorpio's estimate comes from R9 390X results.

The graphics details from the above benchmarks are higher than XBO's graphics detail settings.

With a 4K/60 fps target, the R9 390X doesn't have the compute and memory bandwidth for the 4K/sub-50 fps games mentioned in my post; Vega 10 with 12.5 TFLOPS has superior 4K/60 fps handling. In terms of basic parameters, the R9 390X's 5.9 TFLOPS is slightly less than half of Vega 10's 12.5 TFLOPS.

Avatar image for 04dcarraher
04dcarraher

23832

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#545  Edited By 04dcarraher
Member since 2004 • 23832 Posts

All I have to say is that an i7 at 2.2GHz is still leagues better than a Jaguar at 2.1GHz..... you will not see the same results if you run an AMD FX CPU or an i5....

I tested TW3 with my i5 4690K at 2GHz with my GTX 970 in Oxenfurt at the docks.

2GHz:

4K low: 33 fps, CPU ~85% avg

4K ultra: 25 fps, CPU ~80% avg

4GHz:

4K low: 33 fps, CPU ~60%

4K ultra: 26 fps, CPU ~50%

So, like I was saying, if the CPU being used is 50% or more weaker than Intel's per-clock performance, you're going to bottleneck that class of GPU even if you make the game GPU bound. This is why we see these console games unable to keep a steady 30 fps when a lot starts happening.

"This means developers will have to compromise specific features while providing upgrades like upscaled 4K 30FPS and native 1080p 60FPS in games, with certain effects like lighting, shadows, environmental features and more being scaled and dialed back. In fact, Digital Foundry notes that Rise of the Tomb Raider wasn't able to hit a consistent native 1080p 60FPS on the PS4 Pro due to CPU bottlenecks. This sets a disappointing precedent for other PS4 Pro upgrades, hinting we may not finally get native 1080p 60FPS in games after all."
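The 2GHz/4GHz numbers above fit the simple GPU-bound picture: halving the clock roughly doubles the CPU's per-frame cost, so usage climbs while fps barely moves, until the CPU cost overtakes the GPU's and the bottleneck flips. A sketch with hypothetical frame costs:

```python
# Toy model of the clock experiment: halving the CPU clock roughly doubles
# cpu_ms per frame. All frame costs here are hypothetical.

def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)
    fps = 1000.0 / frame_ms
    cpu_usage = 100.0 * cpu_ms / frame_ms
    return fps, cpu_usage

print(frame_stats(10, 40))  # "4 GHz", GPU-bound: 25 fps, 25% CPU
print(frame_stats(20, 40))  # "2 GHz", still GPU-bound: 25 fps, 50% CPU
print(frame_stats(48, 40))  # much weaker CPU: ~21 fps, 100% CPU, GPU waits
```

The first two lines mirror the flat-fps, doubled-usage pattern in the 2GHz/4GHz test; the third shows what happens once the CPU term grows past the GPU's, which is the Jaguar worry.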

Avatar image for oflow
oflow

5185

Forum Posts

0

Wiki Points

0

Followers

Reviews: 40

User Lists: 0

#546 oflow
Member since 2003 • 5185 Posts

All you need to know is cows will be like this when Scorpio comes out.

Avatar image for Pedro
Pedro

69824

Forum Posts

0

Wiki Points

0

Followers

Reviews: 72

User Lists: 0

#547 Pedro
Member since 2002 • 69824 Posts

@oflow said:

All you need to know is cows will be like this when Scorpio comes out.

Avatar image for kvally
kvally

8445

Forum Posts

0

Wiki Points

0

Followers

Reviews: 7

User Lists: 9

#548 kvally
Member since 2014 • 8445 Posts
@GunSmith1_basic said:
@dynamitecop said:
@spitfire-six said:
@GunSmith1_basic said:

I know MS said the Scorpio would not replace the Xbone, but c'mon. I think we'd be lucky to have 1 year of Xbone support before it's dropped in favour of exclusive Scorpio software. At least with Sony the difference between the PS4 and the Pro is not that much. Having them share software isn't that much of a stretch.

Just because you dont understand how something will work does not mean it is not possible. Most of you are still trapped in the past where console platform meant hardware + software. In this day and age software platform is getting to a point where it is a standalone thing and it allows for the software to be developed regardless of hardware changes.

What they're saying is as dumb as saying PC games will only work with flagship hardware and low end functional specs should be abandoned as if they have some type of impact...

Scalability... They don't seem to grasp it...

The whole crux of this thread was around MS abandoning ESRAM. That means that not only is there a massive power difference between the Scorpio and Xbone, but now a major architectural difference as well. You can't just say it's as easy as scaling. You probably know more about the ins and outs of this tech better than I do (or maybe you don't), but I do notice how MS's narrative has been changing. At first, the statement was that Xbone would get ALL the games. Now, MS says they are "asking" devs to support the original Xbone. The abandonment has already started. You bring up PC architecture like it's some kind of defense. In PC gaming you need to have a high end rig to be able to play all the games. There is such a thing as minimum system requirements.

So sure, Xbone will always get games, but there is a lot of uncertainty about the kinds of games it will get, and whether devs will think the effort is worth it. Anyone who thinks that all the Scorpio will ever be is a high-end Xbone, and that the Xbone will get all the same games, is putting their head in the sand.

http://gametransfers.com/xbox-chief-reiterates-that-project-scorpio-will-not-have-exclusive-games/

Xbox Chief reiterates that Project Scorpio will not have exclusive games

Avatar image for Shewgenja
Shewgenja

21456

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#549 Shewgenja
Member since 2009 • 21456 Posts

@Pedro said:
@oflow said:

All you need to know is cows will be like this when Scorpio comes out.

@kvally said:

http://gametransfers.com/xbox-chief-reiterates-that-project-scorpio-will-not-have-exclusive-games/

Xbox Chief reiterates that Project Scorpio will not have exclusive games

We all know ya'll will be hyping 360 games being added to BC after this thing comes out. Ya'll thought you were getting something. It's actually kinda funny.

Avatar image for FLOPPAGE_50
FLOPPAGE_50

4500

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#550 FLOPPAGE_50
Member since 2004 • 4500 Posts

@Shewgenja said:
@Pedro said:
@oflow said:

All you need to know is cows will be like this when Scorpio comes out.

@kvally said:

http://gametransfers.com/xbox-chief-reiterates-that-project-scorpio-will-not-have-exclusive-games/

Xbox Chief reiterates that Project Scorpio will not have exclusive games

We all know ya'll will be hyping 360 games being added to BC after this thing comes out. Ya'll thought you were getting something. It's actually kinda funny.

I mean, the PS4 Pro is in the exact same boat.

There are no Pro exclusives, and there will be no Scorpio exclusives, since they're supposed to work with the OG consoles too.

And BC for 360 games is a damn good deal; you call yourself a gamer but seem to bitch about fantastic gaming features.