Xbox One’s Driver Update To Bring 10% Performance Boost

This topic is locked from further discussion.


#151  Edited By ronvalencia
Member since 2008 • 29612 Posts

@MlauTheDaft said:

@ronvalencia said:
@tormentos said:


5-Totally irrelevant again, the PS4 CPU is proving to have an edge on the Xbox One CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single core CPU test... LOL... Have you realized that PS4's CPU-to-main-memory connection is around a peak of 20 GB/s, while X1's is about 55 GB/s (from a peak of 68 GB/s)?

Even if you combine the peak 20 GB/s (main memory write) + 10 GB/s (Onion/Onion+ link, write direction to the GPU), it is still inferior to X1's CPU memory writes.

On multi-CPU-core texture generation, X1 would be faster. Your cited benchmark can backfire on you.

On gaming PCs and PS4, texture generation should be done on the GPGPU, where it's linked to higher memory bandwidth and a higher CU count.

It's a little vague, isn't it?

It says 1 CPU, not 1 Core.

"One CPU" is not vague.

From http://en.wikipedia.org/wiki/Central_processing_unit

"A computer can have more than one CPU; this is called multiprocessing. All modern CPUs are microprocessors, meaning contained on a single chip. Some integrated circuits (ICs) can contain multiple CPUs on a single chip; those ICs are called multi-core processors"


#152  Edited By ronvalencia
Member since 2008 • 29612 Posts
@stereointegrity said:

@ronvalencia said:
@tormentos said:


5-Totally irrelevant again, the PS4 CPU is proving to have an edge on the Xbox One CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single core CPU test... LOL... Have you realized that PS4's CPU-to-main-memory connection is around a peak of 20 GB/s, while X1's is about 55 GB/s (from a peak of 68 GB/s)?

Even if you combine the peak 20 GB/s (main memory write) + 10 GB/s (Onion/Onion+ link, write direction to the GPU), it is still inferior to X1's CPU memory writes.

On multi-CPU-core texture generation, X1 would be faster. Your cited benchmark can backfire on you.

On gaming PCs and PS4, texture generation should be done on the GPGPU, where it's linked to higher memory bandwidth and a higher CU count.

x1 is 30 not 55... and it only reads into RAM. MS wanted to say hUMA and HSA, but per their own layouts it's not there, and AMD even said it wasn't.

The CPU's cache-coherent memory access has two blue-colored pipes, i.e.

1. a connection to the GPU's "Host Guest GPU MMU". MS didn't disclose the bandwidth for this connection, i.e. perhaps another 30 GB/s pipe? This seems to be an Onion+-like link.

2. a connection to DDR3 memory. You could saturate 30 GB/s with 2.14 CPU cores.

----------

I combined PS4's onion/onion+ CPU-to-GPU links and main memory connections i.e.

1. One CPU writes to GPU

2. One CPU writes to main memory.

From http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

"That's real code running. That's not some diagnostic or some simulation case or something like that. That is real code that is running at that bandwidth. You can add that to the external memory and say that that probably achieves in similar conditions 50-55GB/s and add those two together you're getting in the order of 200GB/s across the main memory and internally."

http://www.edge-online.com/news/power-struggle-the-real-differences-between-ps4-and-xbox-one-performance/

"Xbox One does, however, boast superior performance to PS4 in other ways. Let's say you are using procedural generation or raytracing via parametric surfaces, that is, using a lot of memory writes and not much texturing or ALU; Xbox One will likely be faster," said one developer.
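(For what it's worth, here is a minimal sketch that just adds up the peak bandwidth figures being quoted in this thread; every number below is a poster's claim or a quoted peak, not a measurement.)

```python
# Rough back-of-the-envelope comparison of the peak CPU write-bandwidth figures
# quoted in this thread. All numbers are the posters' claims, not measurements.

ps4_cpu_to_mem_gbs  = 20   # claimed peak CPU -> GDDR5 main memory write
ps4_onion_write_gbs = 10   # claimed Onion/Onion+ coherent link, CPU -> GPU write direction
x1_cpu_to_ddr3_gbs  = 55   # claimed achievable CPU <-> DDR3 (from a 68 GB/s peak)
x1_coherent_gbs     = 30   # claimed cache-coherent path on the X1 block diagram

ps4_combined = ps4_cpu_to_mem_gbs + ps4_onion_write_gbs
print(f"PS4 combined CPU write paths : {ps4_combined} GB/s")
print(f"X1 CPU <-> DDR3              : {x1_cpu_to_ddr3_gbs} GB/s")
print(f"X1 coherent path             : {x1_coherent_gbs} GB/s")
```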

As for AMD's hUMA,

From http://www.tomshardware.com/news/AMD-HSA-hUMA-APU,22324.html

The hardware coherency provided by hUMA brings three key features to the table.

  • Coherent Memory: Ensures that the CPU and GPU caches both see an up-to-date view of the data
  • Page-able Memory that allows the GPU to seamlessly access virtual memory addresses that are not (yet) present in physical memory
  • Entire Memory Space: Both CPU and GPU can access and allocate any location in the system's virtual memory space.

Does Xbox One's blue colored pipe connection between "Cache Coherent memory access" and "Host Guest GPU MMU" deliver the above 3 points?

From http://www.theregister.co.uk/2013/10/22/amnd_heterogenous_queueing/

With the new driver that supports ACE units, does Xbox One have HSA-style queuing?


#153  Edited By MlauTheDaft
Member since 2011 • 5189 Posts

@ronvalencia said:

@MlauTheDaft said:

@ronvalencia said:
@tormentos said:


5-Totally irrelevant again, the PS4 CPU is proving to have an edge on the Xbox One CPU, even with the speed bump.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively


A single core CPU test... LOL... Have you realized that PS4's CPU-to-main-memory connection is around a peak of 20 GB/s, while X1's is about 55 GB/s (from a peak of 68 GB/s)?

Even if you combine the peak 20 GB/s (main memory write) + 10 GB/s (Onion/Onion+ link, write direction to the GPU), it is still inferior to X1's CPU memory writes.

On multi-CPU-core texture generation, X1 would be faster. Your cited benchmark can backfire on you.

On gaming PCs and PS4, texture generation should be done on the GPGPU, where it's linked to higher memory bandwidth and a higher CU count.

It's a little vague, isn't it?

It says 1 CPU, not 1 Core.

"One CPU" is not vague.

From http://en.wikipedia.org/wiki/Central_processing_unit

"A computer can have more than one CPU; this is called multiprocessing. All modern CPUs are microprocessors, meaning contained on a single chip. Some integrated circuits (ICs) can contain multiple CPUs on a single chip; those ICs are called multi-core processors"

It is vague....

I actually consider Wikipedia rather legit most of the time, but it's not a reliable source, and DXT compression is multi-threaded.

Edit:

Regardless of Wikipedia, the author just mentions "one CPU", which sounds intentionally unclear to me, and it's rather common to use the term "core" for the sake of clarity.

EditEdit:

In regards to multiprocessing:

"Multiprocessing is the use of two or more central processing units (CPUs) within a single computer system. The term also refers to the ability of a system to support more than one processor and/or the ability to allocate tasks between them.[1]There are many variations on this basic theme, and the definition of multiprocessing can vary with context, mostly as a function of how CPUs are defined (multiple cores on one die, multiple dies in one package, multiple packages in one system unit, etc.)."

You sort of ignored the bolded and underscored part.


#154 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

"One CPU" is not vague.

From http://en.wikipedia.org/wiki/Central_processing_unit

"A computer can have more than one CPU; this is called multiprocessing. All modern CPUs are microprocessors, meaning contained on a single chip. Some integrated circuits (ICs) can contain multiple CPUs on a single chip; those ICs are called multi-core processors"

OK, what the fu** is the point of this stupid post?

One CPU or one core, the PS4 was faster, period. No matter what: if the PS4 uses 3 cores and the Xbox One uses 3 cores, the result is the freaking same, the PS4 performs better, period. You are grasping, lemming, and it is sad. There is no secret sauce; move on. 1080p vs 720p, the time for excuses is over.


#155  Edited By tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

@tormentos:

PS4 titles such as Killzone: Shadow Fall are using 800MB for render targets alone.

At Microsoft’s BUILD event this year, the team showed the hardware based tiled resources support added in DX11.2.

3GBs of textures were able to be stored in 16MB of RAM.

Hardware tiled resources offers a bunch of improvements over the shader based method available in Granite 1.8. Especially when using HQ filtering modes such as full anisotropic trilinear filtering with a high degree of anisotropy there are clear advantages.

Firstly,the shader can be simplified, reducing the instruction count from around 30 to 10-15 shader instructions in the case of anisotropic filtering.

Secondly, since no overlap needs to be used on tile borders cache use and compression can be improved.

Finally, streaming throughput is improved by 33% as no mipmaps have to be generated for uploaded tiles.

The eSRAM is the dedicated hardware for tiled resources and DirectX 11.2 contains the APIs to take advantage of it. AS I’VE SAID BEFORE. -_-

Microsoft has implemented some APIs in DirectX11.2 so that developers don't have to utilize their own implementation from scratch.

The more exciting implications are the possibility of the combination of tiled textures and the cloud. Developers could go crazy since they wouldn't have to store these massive textures on a disc. They could offer them another way: I would imagine the possibility of actually streaming the tiles straight from the cloud in real time, thanks to the LZ encode/decode capabilities of the data move engines, straight into the eSRAM to be fed to the GPU. Using the cloud to process your procedural textures for free rather than depending on your CPU: this idea is amazing.

The eSRAM and data move engines are not simply a work around for bandwidth and to label it as such(which I have seen numerous times) is disingenuous to the X1's design. They have specifically equipped it with this for specific applications beyond mitigating bandwidth limitations. Tiled textures, shadow mapping, cloud offloading, cloud streaming and of course a very fast scratch pad for the GPU. Simply put, eSRAM is superior to both GDDR5 and DDR3 for certain applications. That's why it's there. Not just to boost bandwidth.
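(A minimal sketch of the arithmetic behind the "3 GB of textures in 16 MB" demo mentioned above; the 64 KB tile size is an assumption based on the usual D3D tiled-resources tile size, while the 3 GB and 16 MB figures are the ones claimed in the post.)

```python
# Back-of-the-envelope look at the "3 GB of textures in 16 MB" tiled-resources demo.
# Assumes 64 KB tiles (the usual D3D tiled-resources tile size); the 3 GB / 16 MB
# figures are the claims quoted above.

TILE_BYTES = 64 * 1024

virtual_texture_bytes = 3 * 1024**3      # texture data the application can address
resident_pool_bytes   = 16 * 1024**2     # physical memory actually backing it

virtual_tiles  = virtual_texture_bytes // TILE_BYTES
resident_tiles = resident_pool_bytes // TILE_BYTES

print(f"virtual tiles  : {virtual_tiles}")    # 49152 tiles addressable
print(f"resident tiles : {resident_tiles}")   # 256 tiles actually in memory
print(f"resident share : {resident_tiles / virtual_tiles:.2%}")  # roughly 0.5%
```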

PS4 supports the hardware implementation of PRT and the difference would be that it would require part of its RAM and RAM/GPU bandwidth to emulate it while dealing with the GPU/RAM latency.

DirectX11.2 introduces new technology called Tiled Resources - essentially, what Tiled Resources does is it increases and decreases graphics fidelity based on location and what the player is actively viewing.

To simplify - imagine your house rendered as a video game. The room that you are in and rooms that are visible to you are rendered as usual, but as you approach something, it maintains the same quality because its render quality has been increased, whereas the objects you are moving away from, and the room you no longer occupy, have had their render quality decreased.

It's like an automatic light dimmer, for video games.

OpenGL has a very similar (yet not complete) plugin.

The PS4 loads the CPU and GPU, causing optimization troubles that will only get worse as complexity tries to increase.

Xbox One shines here, as CPU cache/register/eSRAM snoops allow compute to be offloaded to and retrieved from specialized CUs, seamlessly alleviating CPU/GPU load.

The 1.3 TFLOPs do the converting of 6 GB to 32 MB, and on call!

This is much better than 1.8 TFLOPs rendering full texture loads!

50% more TFlops does not necessarily mean that the PS4 will have 50% more graphics ability.

PS4 is stronger raw, hands down. This won't make any Xbox One fans happy, but that's the truth of it; it's not really important, though. Now let's look at the diminishing returns of raw GPU performance. Once the PS4's raw compute limit is reached, games will cease to increase in detail, complexity and performance. 1.8 TFLOPs today is nothing great, especially for an off-the-shelf card. That's why games are buggy atm. That's why Crytek say that CryEngine could already max out the consoles.

So why does a dev with a Gfx engine that can already max out next gen console go exclusive with a supposedly less powerful console.

No not because MS paid them. Cause’ that would be FANBOY LOGIC .

And the answer is because the PS4 can't render it at a stable FPS. Once the PS4 hits its performance cap there is no workaround; you just can't do anything about it. Xbox One's performance cap is reduced by factors of (I DON'T EVEN FREAKING KNOW) by Tiled Resources. We don't even know by how much yet, as it's just now being used by big devs. Xbox One has a low-level API that runs seamlessly with the high-level DX11.2 and Tiled Resources. Game changer.

PS4 does not have access, and they are using OpenGL to try and duplicate it, but nowhere close to what Xbox One does with it.

32 MB = 6+ GB of texture on call. Think about the implications of this number for a minute when it starts getting used widespread and efficiently.

When MS said they will let the games do the talking they meant it because once more games launch using this resource there will be no denying the proof. Ryse is just the first wave.

Now what happens when a 1.8 TFLOP machine runs out of headroom to render and a 1.3 TFLOP (raw) machine can convert GBs of data on the fly using only MBs of memory address space?

MS didn’t care about GDDR5 as its cost put too much loss on Xbox One sales when Tiled Resources(referred to as TR from this point on) significantly changes the game.

host/guest GPUs: It is R280X based, but with DDR3 memory it won't hit the TFLOP number. They just wanted the processing power of the GPU and the latency of DDR3. Used in conjunction with TR it's crazy.

An 8 GB texture render takes 100% GPU load on PS4, while TR on XB1 can do that with 48 MB when devs utilize it. Ryse is just the beginning!

http://www.youtube.com/watch?feature=player_embedded&v=EswYdzsHKMc Watch this please. It’s pretty cool.

When devs start utilizing TR to its full potential, when that 1.8 TF PS4 has reached its computational limit because of memory space and compute, the Xbox One can render exponentially more data and exponentially more fidelity, because it can take far more data and render it using only a fraction of the memory. 32 MB --> rendering 6 GB and more!

Microsoft created a hardware accelerated version of an existing technique, though it offers a lot of improvements because now DirectX takes care of a lot of the problems that programmers used to have to deal with in order to implement Tiled Resources (mainly blending issues).

How many Tflops does it take to call on 32mb? So not only is gpu load significantly less but that means that it can call on significantly more data using far less gpu resources.

http://www.youtube.com/watch?v=QB0VKmk5bmI&feature=player_detailpage. Watch this please. Yet again. COOL !

So that is one of the many technologies inside that will separate Xbox One from PS4. Developers still do not know how powerful this tech is, because it cannot be calculated with raw TF numbers. The hardware argument used around most of the net is completely moot, as it only considers the raw GPU specs, and not the APU as a whole, and both systems are running custom APUs with key differences.

The important thing is both will have the same type of technique, but one will have a more refined version.

But the part that is interesting is the Cloud. If this could happen then hardware would become less and less relevant. See this as something good.

Not because you want to glorify your console.

Comparing a Multigenerational game like BF4 or COD: Ghosts is irrelevant because it is most likely not using that technique due to the games also running on PS3 and Xbox 360. Those consoles are not using this. It’s poorly optimized on all platforms not just Xbox ONE. Exclusives on PS4 and Xbox One would rely more on the technique.

If you are going on about 1080p then you are easily IMPRESSED. I was expecting 4K.

But it isn’t important to me.

Anyway both consoles are good.

So why are you downplaying something like this, when it’s good news.

OK, first: no fu**ing GCN GPU has ESRAM, yet they all support PRT; to use PRT YOU DON'T NEED ESRAM, PERIOD........

Now that we got this stupid theory of yours out of the way: PRT will work on any GCN GPU and will work on PS4. PRT is a freaking way to have more diverse and bigger textures in a smaller space; it is not the fu**ing be-all end-all of gaming. Rage proved that already with MegaTextures, which is basically the same thing.

Second, not all game assets are textures; there are many more things, several of which are not suitable to be put in ESRAM either.

Quote me saying that 50% more power = 50% better graphics. See, this is one of the giveaways that you are copy-pasting arguments from other people; I have read that one before, along with much of the crap you put there. Not even on PC does 50% more power = 50% more visuals; 50% more power on PC means faster frames, better AA, better effects, or higher resolution. The Xbox One can look just as good as the PS4 version of any game, but the resolution and some other things will suffer for the lack of power, period. This is an inescapable fact, period.

And lastly, that bold part is hands down one of the most stupid sh** garbage crap I have ever read.

Once the PS4's raw compute limit is reached, games will cease to increase in detail, complexity and performance.

1.8 TFLOPs today is nothing great.

That can freaking be said about every GPU out there; they all have a damn limit, man, what the fu**. Then how do you go and complain that 1.8 TF is nothing great while you're freaking cheerleading for the Xbox One, which at this moment has 1.18 TF usable for games, 660 GFLOPS less.

MS has integrated some API in DX11.2? DX11.2 is the damn API dude...

The PS4 loads the CPU and GPU, causing optimization troubles that will only get worse as complexity tries to increase.

Xbox One shines here, as CPU cache/register/eSRAM snoops allow compute to be offloaded to and retrieved from specialized CUs, seamlessly alleviating CPU/GPU load.

WTF does this crap even mean? No, really, man? The CUs on the Xbox One are not specialized in any freaking way; they are the same ones found across all GCN GPUs. You are inventing sh** as well.

OK, I will stop quoting your stupidity; it shows that you are a blind, biased lemming who pretends to know something about hardware but repeats like a parrot whatever crap anyone says. By the way, love your whole diminishing-returns argument for the PS4, hahaha, which is pulled from MS's fu**ing ass, from when they tried to claim that using more than 14 CUs gets diminishing returns; but the problem is they only use 12, not 14... lol. Also, I would love to have the diminishing returns of the 7970, which has more than 14 CUs.
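(For reference, here is where the 1.31/1.84 TFLOPS and the "1.18 TF usable / 660 GFLOPS less" figures in this exchange come from: a minimal sketch assuming the commonly reported CU counts, clocks and the 10% X1 GPU reservation.)

```python
# Where the 1.31/1.84 TFLOPS and "1.18 TF usable / ~660 GFLOPS less" figures come from.
# GCN: each CU has 64 shader lanes, each doing a fused multiply-add (2 FLOPs) per clock.
# The clock speeds and the 10% GPU reservation are the commonly reported figures (assumptions).

def gcn_tflops(cus, clock_ghz, lanes=64, flops_per_lane=2):
    return cus * lanes * flops_per_lane * clock_ghz / 1000.0

x1_raw  = gcn_tflops(cus=12, clock_ghz=0.853)   # ~1.31 TFLOPS
ps4_raw = gcn_tflops(cus=18, clock_ghz=0.800)   # ~1.84 TFLOPS

x1_usable = x1_raw * 0.90                        # assuming the reported 10% OS reservation
print(f"X1 raw   : {x1_raw:.2f} TFLOPS")
print(f"PS4 raw  : {ps4_raw:.2f} TFLOPS")
print(f"X1 usable: {x1_usable:.2f} TFLOPS")      # ~1.18
print(f"Gap      : {(ps4_raw - x1_usable) * 1000:.0f} GFLOPS")  # ~660
```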


#156 FoxbatAlpha
Member since 2009 • 10669 Posts

@tormentos said: *full quote of post #155 above snipped*

Oh Christ. You should be banned from talking about anything technical.


#157 tormentos
Member since 2003 • 33784 Posts

@FoxbatAlpha said:

Oh Christ. You should be banned from talking about anything technical.

You should be banned... oh wait, you have been, in what, like 10 accounts?

lol...


#158  Edited By deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: huhuhu lol .

Laugh at your own jokes.

At least they are funny to you.


#159 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos:

"MS has integrated some API in DX11.2? DX11.2 is the damn API dude"

LOL............

really are you serious ?

You did not just say that ?


#160 FoxbatAlpha
Member since 2009 • 10669 Posts

@tormentos said:

@FoxbatAlpha said:

Oh Christ. You should be banned from talking about anything technical.

You should be banned... oh wait, you have been, in what, like 10 accounts?

lol...

There is only ONE FoxBatAlpha. He plays on THE ONE. The greatest, most sophisticated gaming console known to man.


#161 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

@tormentos:

"MS has integrated some API in DX11.2? DX11.2 is the damn API dude"

LOL............

really are you serious ?

You did not just say that ?

No DirectX11.2 is not an API..


#162 tormentos
Member since 2003 • 33784 Posts
@FoxbatAlpha said:

There is only ONE FoxBatAlpha. He plays on THE ONE. The greatest, most sophisticated gaming console known to man.


Oh yeah and the other account had other names...lol


#163 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: DIRECT 3D IS THE API.

GET YOUR FACTS STRAIGHT !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!


#164  Edited By ronvalencia
Member since 2008 • 29612 Posts

@ACP_45

Note that PS4's graphics API is similar to AMD's Mantle API. Attacking PS4's API for being crap is also attacking AMD's Mantle API, since they are both similar.

The Mantle API includes "the right amount" of high and low abstraction layers.

As stated in an APU13 lecture, Battlefield 4's Mantle rendering engine is closer to PS4's rendering engine.


@tormentos said: *quote of post #155 above snipped*

One shouldn't equate RAGE with PRT i.e. there are differences on the technical level.

Xbox One has "DirectX 11.X" which is a superset of PC's DirectX 11.2.


#165  Edited By ronvalencia
Member since 2008 • 29612 Posts

@MlauTheDaft said:

*earlier quote chain snipped*

It is vague....

I actually consider Wikipedia rather legit most of the time, but it's not a reliable source, and DXT compression is multi-threaded.

Edit:

Regardless of Wikipedia, the author just mentions "one CPU", which sounds intentionally unclear to me, and it's rather common to use the term "core" for the sake of clarity.

EditEdit:

In regards to multiprocessing:

"Multiprocessing is the use of two or more central processing units (CPUs) within a single computer system. The term also refers to the ability of a system to support more than one processor and/or the ability to allocate tasks between them.[1]There are many variations on this basic theme, and the definition of multiprocessing can vary with context, mostly as a function of how CPUs are defined (multiple cores on one die, multiple dies in one package, multiple packages in one system unit, etc.)."

You sort of ignored the bolded and underscored part.

No, refer to "Multiprocessing is the use of two or more central processing units (CPUs) within a single computer system."

Your bold text points to the "definition of multiprocessing" and "CPUs" has the letter "s" in it i.e. more than one CPU.

Like the "GPU" was defined by NVIDIA (via GeForce 256's release), Intel has defined the "CPU",

What I learnt from Uni's CS, a CPU's definition encapsulates from the word's first monolithic CPU i.e. Intel 4004.

Texture generation benchmark didn't talk about socket counts.


#166  Edited By MlauTheDaft
Member since 2011 • 5189 Posts

@ronvalencia said:

*earlier quote chain snipped*

No, refer to "Multiprocessing is the use of two or more central processing units (CPUs) within a single computer system."

Your bold text points to the "definition of multiprocessing" and "CPUs" has the letter "s" in it i.e. more than one CPU.

Like the "GPU" was defined by NVIDIA (via GeForce 256's release), Intel has defined the "CPU",

What I learnt from Uni's CS, a CPU's definition encapsulates from the word's first monolithic CPU i.e. Intel 4004.

Texture generation benchmark didn't talk about socket counts.

DXT compression is still multithreaded and you know damn well that it's common to distinguish between CPU and Core.

You assume a single core, I assume all available threads. It's vague.


#167 sukraj
Member since 2008 • 27859 Posts

@Suppaman100 said:

Are lemmings still in denial that the Xbone 720p is a weak POS?

Let me freshen up your minds lems:

Let the war begin. I'm looking forward to seeing what the cloud can achieve on the Xbox One.


#168 LJS9502_basic
Member since 2003 • 178867 Posts

@acp_45 said:

@tormentos: LOL dude if you are going to compare 19 games vs 29 games.

What if Microsoft created an extra ten games, all of them rated at least 70 on Metacritic and maybe one 50; that would boost it to surpass the PS4.

The last thing I heard is most Ponies say reviews aren't important when Knack got destroyed, and when Ryse came you all went ballistic. Shows you.

You are a fool and you are stubborn.

Both consoles are good but you make yours look worse because you are highly repulsive.

What if.......means nothing really.


#169 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

No, refer to "Multiprocessing is the use of two or more central processing units (CPUs) within a single computer system."

Your bold text points to the "definition of multiprocessing" and "CPUs" has the letter "s" in it i.e. more than one CPU.

Like the "GPU" was defined by NVIDIA (via GeForce 256's release), Intel has defined the "CPU",

What I learnt from Uni's CS, a CPU's definition encapsulates from the word's first monolithic CPU i.e. Intel 4004.

Texture generation benchmark didn't talk about socket counts.

1 CPU or 1 core or all cores, it makes no difference: the PS4 CPU performed better, the test didn't lie, period. You're wasting your time.


#170  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos said: *quote of post #169 above snipped*

LOL, a 1 CPU test doesn't reflect multi-threading games, period.


#171  Edited By ronvalencia
Member since 2008 • 29612 Posts
@MlauTheDaft said:

*earlier quote chain snipped*

DXT compression is still multithreaded and you know damn well that it's common to distinguish between CPU and Core.

You assume a single core, I assume all available threads. It's vague.

"DXT compression" can be multi-threaded or single threaded and "DXT compression" by itself doesn't say much on the thread count. LOL.

The benchmark itself is being done on Allegorithmic’s Substance Engine.

"Core" by itself is nothing without it's context i.e. "CPU core" or GPU core.

Where's your proof "one CPU" = multi-threaded?

From http://www.nvidia.com/object/real-time-ycocg-dxt-compression.html

A single core Intel Core 2 Extreme at 2.9 Ghz can do 200 megapixels for DXT1 compression and 109 megapixels for DXT5 compression.

200,000,000 pixel/s x 4 bytes = 800,000,000 byte/s or 762 MB/s

200,000,000 pixel/s x 2 bytes = 400,000,000 byte/s or 381 MB/s
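(A minimal sketch of the throughput implied by those cited figures, assuming 4 bytes per pixel of uncompressed RGBA input and, for the multi-core lines, ideal scaling; both are assumptions, not benchmark results.)

```python
# Throughput implied by the single-core figures cited from the NVIDIA page above
# (~200 Mpix/s for DXT1, ~109 Mpix/s for DXT5 on one Core 2 Extreme core at 2.9 GHz).
# Assumes 4 bytes per pixel of uncompressed RGBA input; scaling to more cores assumes
# the work parallelises cleanly, which is an idealisation.

BYTES_PER_PIXEL = 4
MIB = 1024**2

for name, mpix_per_s in (("DXT1", 200e6), ("DXT5", 109e6)):
    input_rate = mpix_per_s * BYTES_PER_PIXEL          # bytes of source data consumed per second
    print(f"{name}: {input_rate / MIB:.0f} MiB/s of RGBA input per core")
    for cores in (1, 4, 6):
        print(f"  x{cores} cores (ideal scaling): {cores * input_rate / MIB:.0f} MiB/s")
```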


#172 Tighaman
Member since 2006 • 1038 Posts

TORM, TORM, TORM, you poor soul. Yes, the PS4 has a GPU that supports PRT, BUT THAT'S THE ONLY THING IN THE SYSTEM THAT DOES; you need coherent memory and fast cache to run PRT efficiently. And can you stop with the 1.3 TF? That was only one processor, dummy; there were THREE UP THERE, with 3 DIFFERENT PROCESSES, versus the PS4's one BIG PROCESSOR DOING WHAT THOSE 3 different ones are doing in the X1. So to get the total FLOPS of the X1 you would have to add all 3 different processors together; come back to me when you get that answer. Every, and I mean every, GPU has a certain percent reserved inside of it, so keep thinking you have that whole 1.8 TF to play with, because you don't. I promise you that the PS4 has at least 22% system reservation for the cam and OS.


#173 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

LOL, a 1 CPU test doesn't reflect multi-threading games, period.

That test is for a game engine, you blind, biased lemming; so yeah, the PS4 is running that engine better than the Xbox One despite the CPU speed difference.

@Tighaman said:

TORM, TORM, TORM, you poor soul. Yes, the PS4 has a GPU that supports PRT, BUT THAT'S THE ONLY THING IN THE SYSTEM THAT DOES; you need coherent memory and fast cache to run PRT efficiently. And can you stop with the 1.3 TF? That was only one processor, dummy; there were THREE UP THERE, with 3 DIFFERENT PROCESSES, versus the PS4's one BIG PROCESSOR DOING WHAT THOSE 3 different ones are doing in the X1. So to get the total FLOPS of the X1 you would have to add all 3 different processors together; come back to me when you get that answer. Every, and I mean every, GPU has a certain percent reserved inside of it, so keep thinking you have that whole 1.8 TF to play with, because you don't. I promise you that the PS4 has at least 22% system reservation for the cam and OS.

PRT doesn't require ESRAM, GTFO with that crappy theory, period; it will work just as well on PS4, and who knows, maybe better.

The PS4 has 22% reservation for a camera that is not mandatory, and an OS that is light, without snap features... hahahahaaaaaaaaaaaaaaaaaaaa

Oh my god, so you just confirmed it: you are pulling sh** from deep, deep down your ass. Please, I need a link to where the PS4 reserves 22% of its GPU for the OS, system and camera... The ball is on your side.


#174 Tighaman
Member since 2006 • 1038 Posts

It doesn't have to be mandatory; you need reservation for all things that get connected to a system. Hell, my mouse needs 2% reservation on the system on my PC! I know you can't be that stupid; I know you've got a PC. I'm telling you, that's not the exact number, but it's no less than 18% reservation. You have to have a reservation; if not, whenever you connect anything extra to the system it would stutter and not work properly. That's common sense, TORM. And you do need coherent bandwidth and fast cache to make PRT work properly; that's why RAGE didn't work properly and you had to install it on the hard drive to take the place of fast cache, but hard drive speeds still weren't fast enough. Please don't discuss anything about tech, you will hurt yourself, TORM lol


#175  Edited By ronvalencia
Member since 2008 • 29612 Posts
@tormentos said: *quote of post #173 above snipped*

PRT maximises the use of two memory pools with different speeds, i.e. to-be-displayed textures are streamed from the slower, larger memory pool into the faster, smaller memory pool.

With a single memory pool, texture streaming would be nearly pointless, since PS4 has a single-speed memory pool. Streaming textures from SysMem1 to SysMem1 is LOL, i.e. just point to the texture's location and the TMUs can load/store it.

You're not understanding why PRT is required for gaming PCs and X1, i.e. both have two memory pools with different speeds.

Doing it via separate hardware reduces the CPU/GPU (stream processor) usage.

PRT enables AMD GCN to fake a fast, large memory pool by using just-in-time methods to stream textures into the fast, smaller memory pool when they're required to be displayed.

AMD PRT wouldn't solve X1's lesser CU count.
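(A toy sketch of the "partially resident" idea being argued here: a large virtual texture whose tiles are only paged into a small fast pool when they are touched. The tile and pool sizes below are illustration values, not console specs.)

```python
# Toy model of partially resident textures: a big virtual tile set backed by a small
# fast pool; tiles are paged in on first use and evicted least-recently-used when the
# pool is full. Sizes are arbitrary illustration values, not console specs.
from collections import OrderedDict

TILE_KB, POOL_MB, VIRTUAL_MB = 64, 32, 3 * 1024       # e.g. 32 MB fast pool, 3 GB virtual texture
POOL_TILES = POOL_MB * 1024 // TILE_KB                # tiles the fast pool can hold (512)
VIRTUAL_TILES = VIRTUAL_MB * 1024 // TILE_KB          # tiles the virtual texture contains (49152)

resident = OrderedDict()                              # tile_id -> True, ordered by last use

def sample(tile_id):
    """Touch a tile; stream it into the fast pool if it is not already resident."""
    if tile_id in resident:
        resident.move_to_end(tile_id)                 # already resident: cheap fast-pool hit
        return "hit"
    if len(resident) >= POOL_TILES:
        resident.popitem(last=False)                  # evict the least recently used tile
    resident[tile_id] = True                          # "stream" the tile in from the slow pool
    return "miss (streamed in)"

print(f"{VIRTUAL_TILES} virtual tiles, {POOL_TILES} resident at most")
# A camera that keeps looking at the same neighbourhood mostly hits the fast pool.
for tile in [0, 1, 2, 0, 1, 2, 3, 0]:
    print(f"tile {tile}: {sample(tile)}")
```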


#176 Mr_KMG
Member since 2003 • 2971 Posts

@acp_45 said:

@tormentos: LOL,

*insert massive post of misinformation and bias

I normally stay out of participating in System Wars discussions, but I've never seen such a great attempt at (falsely) claiming a system is better....

Jeez...


#177 stuff238
Member since 2012 • 3284 Posts

LMAO at 3-4 times more power with cloud gaming. So Microsoft is going to turn on Skynet in 2015? GTFO. LOL.


#178 Tighaman
Member since 2006 • 1038 Posts

@ronvalencia: the X1 doesn't have the same CUs inside of it, so you can't measure it the same way you need to; the 12 compute cores in your 290X are not the same as the CUs in the PS4.


#179 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said: *quote of post #175 above snipped*

And here we go again with the whole PRT crap, which is also supported on PS4.

The PS4 is faster, period; the test shows it. I have a link with a test and you have nothing but your biased-ass opinion. Better luck next time.. hehehehe

@Mr_KMG said:

I normally stay out of participating in System Wars discussions, but I've never seen such a great attempt at (falsely) claiming a system is better....

Jeez...

Probably with good reason, since all you did there was claim that I was falsely claiming things, but without anything to prove it.

So your argument basically is "you're wrong because I say so"..

@Tighaman said:

It doesn't have to be mandatory; you need reservation for all things that get connected to a system. Hell, my mouse needs 2% reservation on the system on my PC! I know you can't be that stupid; I know you've got a PC. I'm telling you, that's not the exact number, but it's no less than 18% reservation. You have to have a reservation; if not, whenever you connect anything extra to the system it would stutter and not work properly. That's common sense, TORM. And you do need coherent bandwidth and fast cache to make PRT work properly; that's why RAGE didn't work properly and you had to install it on the hard drive to take the place of fast cache, but hard drive speeds still weren't fast enough. Please don't discuss anything about tech, you will hurt yourself, TORM lol

LINKKKKKKKK.....

Since Kinect and Snap are the 2 biggest reasons why the Xbox One has a 10% reservation, I say you're completely full of sh**. The PS4 doesn't have a mandatory camera, nor a Snap feature with a TV guide and a Metro-like accelerated UI.

Claiming that the PS4 has an 18% GPU reservation without any proof basically proves that you are a MisterX Media type of poster, who invents crap left and right to try to downplay the system he hates. There is not a single reason why the PS4 should reserve such an absurd percentage of GPU resources when it doesn't have a mandatory Kinect, nor HDMI-in with a TV guide and a Snap feature.

Funny how MS, with more demanding features, has only a 10% GPU reservation, but the PS4 without them supposedly has 18 to 22%. Dude, stop posting; you don't know sh** about what you're saying, and by now everyone basically knows your angle.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#180 ronvalencia
Member since 2008 • 29612 Posts

@Tighaman said:

@ronvalencia: the X1 doesn't have the same CUs inside it, so you can't measure it the same way; the 12 compute cores in your 290X are not the same as the CUs in the PS4.

AMD and MS have stated that the X1's GCN is based on "Sea Islands".

I have multiple AMD GCN products, e.g. an R9-290 (non-X), a 7950 at 900 MHz (factory OC), a non-reference HIS IceQ 7970 (1 GHz OC), an 8870M (slim gaming laptop) and an 8570M (ultrabook).

The 8870M has 10 CUs.

Avatar image for ronvalencia
ronvalencia

29612

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#181  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

Again, you're not understanding PRT's main functions in relation to memory access patterns. Think harder.

The AMD Kaveri APU's GCN IGP also has PRT hardware (as part of GCN's MMU/virtual memory/logical address space functions), but it doesn't have HyperMemory/TurboCache-type VRAM.

With the PS4's single large, fast memory pool, all the textures are resident instead of "partially resident". If the PS4's viewport requires a more detailed texture, it will "blend in" textures that are already in the single fast memory pool, i.e. it doesn't need to stream from a slower, larger memory pool.

You're not thinking hard enough.
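Rough sketch of the difference (the mip numbers and fallback rule below are invented for illustration, not GCN's actual PRT behaviour): with everything resident, a lookup is just an index; with partial residency, the sampler falls back to a coarser resident mip until the detailed tile has been streamed in.

```python
def sample_fully_resident(texture, mip):
    """Single fast pool: every mip level is already in memory, so just index it."""
    return texture[mip]

def sample_partially_resident(resident_mips, texture, mip):
    """PRT-style pool: if the detailed mip isn't resident, fall back to the
    nearest coarser resident mip and ask for the detailed one to be streamed."""
    if mip in resident_mips:
        return texture[mip], None                 # hit: nothing needs streaming
    coarser = [m for m in resident_mips if m > mip]
    fallback = min(coarser) if coarser else max(resident_mips)
    return texture[fallback], mip                 # miss: use fallback, request `mip`

# Mip 0 is the most detailed level; higher numbers are coarser.
texture = {0: "4K tile", 1: "2K tile", 2: "1K tile", 3: "512px tile"}

print(sample_fully_resident(texture, 0))              # '4K tile' immediately
print(sample_partially_resident({2, 3}, texture, 0))  # ('1K tile', 0): mip 0 must be streamed in
```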

Avatar image for KungfuKitten
KungfuKitten

27389

Forum Posts

0

Wiki Points

0

Followers

Reviews: 42

User Lists: 0

#182  Edited By KungfuKitten
Member since 2006 • 27389 Posts

Well maybe I'll upgrade my rig. Oh wait it's not even close.
Everyone in the industry is waiting for grandma and grandpa PS4 and X1 to get a MOVE ON.

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#183 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@ronvalencia: Leave Him Be.

He is not going to understand.

Then he starts saying that you are inventing stuff and he will call you a lem.

He is just a total D!ck.

The amount of childishness in this guy is crazy.

Downplaying everything that has to do with XB1 even if it’s good.

Don’t try to explain anything to tormentos.

Unless he finally admits that XB1 isn’t a piece of scrap.

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#184  Edited By deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: So are you telling me that even though the PS4 cam isn’t included with the PS4 and isn’t mandatory, Sony wouldn't reserve a % of the system for it?

:/

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#185 tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

@tormentos:

Again, you're not understanding PRT's main functions in relation to memory access patterns. Think harder.

The AMD Kaveri APU's GCN IGP also has PRT hardware (as part of GCN's MMU/virtual memory/logical address space functions), but it doesn't have HyperMemory/TurboCache-type VRAM.

With the PS4's single large, fast memory pool, all the textures are resident instead of "partially resident". If the PS4's viewport requires a more detailed texture, it will "blend in" textures that are already in the single fast memory pool, i.e. it doesn't need to stream from a slower, larger memory pool.

You're not thinking hard enough.

PRT will work just as well on the PS4 as it will on the Xbox One; you people should drop it already. Do you have a test proving how the PS4 underperforms vs the Xbox One on PRT? Because you've spent almost a year claiming it will perform better on the Xbox One without a single proof. Like I already told you, no GCN card has ESRAM, and the RAM pool on the Xbox One is also unified, which doesn't work exactly like a PC where you have a pool of DDR3 + a pool of GDDR5; ESRAM is not there to do all the work the main RAM does, it's there to speed things up.

The point is, the test I posted is for textures, and unless PRT isn't being used in that test, which you don't know, the fact still stands: the PS4 CPU is performing faster, period.

@acp_45 said:

@tormentos: So are you telling me that even though the PS4 cam isn’t included with the PS4 and isn’t mandatory, Sony wouldn't reserve a % of the system for it?

:/

There is no need to reserve resources for something that is not there.

Kinect is mandatory and comes with the Xbox One, and the Xbox One was built around it; its features require it. The PS4 wasn't built around the PS Eye, and that is the marked difference.

Not only that, Snap is a feature that also requires GPU time, and Snap is not on the PS4. So you see, even if the PS4 reserves something, I don't think it's more than 2%. The Xbox One and PS4 are not the same when it comes to OS and system features; the Xbox One was built to be an all-in-one multimedia box, the PS4 wasn't.

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#186 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: So what happens if someone actually buys the PS cam?

Does the system then use some percentage of the system that would have been used for something else but isn’t needed at that moment?

Not sure how this works.

How will the cam function?

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#187 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@ronvalencia: The PS4 will have PRT using an AMD extension.

Can’t remember what the extension is called, though.

People are saying it is better through the Xbox One’s DX11.2 superset.

Others are saying the PS4’s tiled resources through the extension are better.

Is the PS4’s PRT hardware-based or software-based? Using an extension through the GPU sounds pretty hardware-based to me.

I’ve also read on OpenGL.org that their tiled resources are sitting on a fix list with all the other buggy stuff OpenGL has. But apparently it’s not used on consoles, usually just in tech demos. Does the PS4 use OpenGL, or a hybrid of it?

Other than that, the PS4 does have PRT.

But how can you people say MS’s PRT is better than Sony’s PRT?

You seem to know about these things.

Explain.
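My rough understanding, sketched below, is that whichever wrapper sits on top (DX11.2 tiled resources or a sparse-texture style extension), the hardware feature both sides are arguing about is the same bookkeeping: a page table that records which tiles of a huge virtual texture are actually backed by physical memory. The tile size and method names in this sketch are invented for illustration and don't correspond to any real API.

```python
class SparseTexture:
    """Toy page table for a partially resident ('sparse'/'tiled') texture."""

    TILE = 64  # tile edge length in texels (illustrative; real hardware uses fixed tile byte sizes)

    def __init__(self, width, height):
        self.tiles_x = (width + self.TILE - 1) // self.TILE
        self.tiles_y = (height + self.TILE - 1) // self.TILE
        self.committed = {}                 # (tx, ty) -> physical backing (here, just a label)

    def commit(self, tx, ty, backing):
        """Map one virtual tile to physical memory."""
        self.committed[(tx, ty)] = backing

    def evict(self, tx, ty):
        """Unmap a tile; the virtual address range stays valid but has no backing."""
        self.committed.pop((tx, ty), None)

    def sample(self, x, y):
        """Return the backing for the tile covering (x, y), or None if it's not resident."""
        key = (x // self.TILE, y // self.TILE)
        return self.committed.get(key)

tex = SparseTexture(16384, 16384)           # huge virtual texture, tiny physical footprint
tex.commit(0, 0, "mip0 tile near the camera")
print(tex.sample(10, 20))                   # backed tile -> its data
print(tex.sample(9000, 9000))               # unbacked tile -> None (shader would fall back)
```

Committing and evicting tiles is what the API call ultimately exposes (UpdateTileMappings on the DX side, page commitment on the GL extension side), as far as I understand it.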

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#188 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

@tormentos: So what happens if someone actually buys the PS cam?

Does the system then use some percentage of the system that would have been used for something else but isn’t needed at that moment?

Not sure how this works.

How will the cam function?

Nothing.

If you want to run a game that supports the camera, the resources come from the GPU like in any game; if you are not gaming, the PS4 is not using its power, so it can use any resources for the camera.

On the Xbox One it is not like that: you are gaming and you call up Kinect or the cable guide, and to do that mid-game you need resources. Not only will the Xbox One let you call a feature over Kinect, it will run it alongside the game.

That is where the penalty is. It's the same reason I don't believe MS when they claimed they would give back the 10% reservation to games, because those resources are tied up. Now, let's say it doesn't use the full 10% and only uses 5 or 6%; well, yeah, you can get the other 5% back, but giving back the complete 10% is a joke. Do that and you will not be able to use Kinect to call anything, much less run different apps at the same time like Snap.

And I think the reason for the so-called 10% boost from this driver is to silence people about the 10% reservation, which is very real. Drivers improve performance; that also applies to the PS4 and PC. There is no evidence that this driver is the final one; drivers improve all the time on PC, and these are PC-derived GPUs, so both consoles will see improvements there.

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#189 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

So the PS cam won’t use anything in the system.

Am I right?

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#190 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

So the PS cam won’t use anything in the system.

Am I right?

The PS4 already uses the CPU for the system and OS.

If there is a reservation, it's lower than 2%, since the PS4 doesn't have features like Snap, nor a camera that doubles as a remote for a cable-box guide. The Xbox One's reservation is so high that most of SHAPE, the sound block, is also for Kinect, and little is left for developers to use; this was stated on Beyond3D by someone who worked on that sound block on the Xbox One.

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#191 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

Oh Okay........

But what happens if a game is heavily CPU-dependent and you are multitasking on the PS4 and using the PS cam?

How will this play out?

The sound block on the Xbox One has a lot more channels, though, some of them obviously used by Kinect.

511 channels in total, if I’m correct.

But games these days only use somewhere around 100 channels.

The PS4 has 200-and-something.

So there can’t really be a comparison.

Since Kinect is mandatory, the Xbox One has those extra 200 channels for it.

All in all, 200 channels are already enough to provide audio for a game.

Avatar image for blackace
blackace

23576

Forum Posts

0

Wiki Points

0

Followers

Reviews: 4

User Lists: 0

#192  Edited By blackace
Member since 2002 • 23576 Posts

@Mr_KMG said:

@acp_45 said:

@tormentos: LOL,

*insert massive post of misinformation and bias

I normally stay out of System Wars discussions, but I've never seen such a great attempt at (falsely) claiming a system is better...

Jeez...

Tormentos is clueless. That's why most intelligent gamers on here just ignore him. I don't read any of his garbage anymore. He thinks he knows everything, but he doesn't know shit!!

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#193 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

Oh Okay........

But what happens if a game is heavily CPU-dependent and you are multitasking on the PS4 and using the PS cam?

How will this play out?

The sound block on the Xbox One has a lot more channels, though, some of them obviously used by Kinect.

511 channels in total, if I’m correct.

But games these days only use somewhere around 100 channels.

The PS4 has 200-and-something.

So there can’t really be a comparison.

Since Kinect is mandatory, the Xbox One has those extra 200 channels for it.

All in all, 200 channels are already enough to provide audio for a game.

Is that scenario even possible?

How much do you think the PS4 can multitask, plus run the camera, while a heavily CPU-bound game is running?

If the camera is being used, it's because it's for a game and the resources were allocated for that game. You will not use the PS camera to call up features while you're gaming on the PS4; in fact, on the PS4 you don't even need a camera to use voice commands.

You are trying to create a scenario that doesn't exist. There is no reservation for the camera on the PS4; if a game supports it, it's because the resources are being given by the developer of that game, unlike MS, which has 10% reserved for that purpose or others like Snap. The whole point of the reservation is that everything runs smoothly without having to tax game resources.

Kinect takes almost all the resources from SHAPE, as stated by Bikillian, who worked for MS on that sound block. It doesn't matter if it has 1000 channels; most of the resources are there for Kinect, and what's left for games is very little.

The PS4 has TrueAudio from AMD, but unlike the Xbox One's block, it isn't almost all for the camera.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#194 tormentos
Member since 2003 • 33784 Posts
@blackace said:

@Mr_KMG said:

@acp_45 said:

@tormentos: LOL,

*insert massive post of misinformation and bias

I normally stay out of System Wars discussions, but I've never seen such a great attempt at (falsely) claiming a system is better...

Jeez...

Tormentos is clueless. That's why most intelligent gamers on here just ignore him. I don't read any of his garbage anymore. He thinks he knows everything, but he doesn't know shit!!

Basically, you are now the running joke of GameSpot. You are a die-hard lemming who claims to be a manticore, but what you do all day is kiss MS's ass while flaming Sony. Your last few days here have been marked by epic meltdowns and getting owned left and right all the time...

hahaha

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#195 deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@tormentos: You need very little for games. As I said.

There hasn’t been ONE game that has ever surpassed the 200-channel mark.

I don’t want to go into Audio tech because I’m not savvy when it comes to that.

But there are some things we don’t know.

But I do believe that the PS4 wouldn’t need any resources from the GPU when it comes to sound processing.

Unless, I’m a total idiot.

Avatar image for Tighaman
Tighaman

1038

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#196  Edited By Tighaman
Member since 2006 • 1038 Posts

@acp_45: TORM is downright retarded if he thinks power is not reserved for other things; no way you can be tech-savvy and come up with that conclusion. If you have a system but you KNOW you're going to have other things connected, you must reserve some of the system's resources. That cam for the PS4 already has a UI and a GAME when you hook the cam up, so you don't think there's some RESERVED power to run that UI and GAME, TORM? Come on, dude, you can't be that silly lol

Avatar image for Tighaman
Tighaman

1038

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#197  Edited By Tighaman
Member since 2006 • 1038 Posts

@tormentos: I got the info on the reservation from the guy on Beyond3D, and he said between 18 and 22%. Believe me, I never make up numbers on my own, because numbers never lie. Kinect's 10% has nothing to do with the reservation of power on the PS4. Do you actually think it takes more power to use Kinect for voice commands than to go from your controller to the PS4 system to the TV? The cam has its own UI and GAME already embedded in the system. You scare me, the way you think, but you can keep that thought. I see you can't change your mind; you're riding that Sony wave and you don't want to give me your hand before you crash into a bigger wave lol

Avatar image for deactivated-5a44ec138c1e6
deactivated-5a44ec138c1e6

2638

Forum Posts

0

Wiki Points

0

Followers

Reviews: 6

User Lists: 5

#198  Edited By deactivated-5a44ec138c1e6
Member since 2013 • 2638 Posts

@Tighaman: LMAO xD

"you riding that SONY wave and you don't want to give me your hand before you crash into the a bigger wave"


That’s the best thing anyone has said in this forum.

:P

It makes totally sense what you have said.

"If you have a system but you KNOW you gonna have other things connected you must reserve some of the resources of the system that cam for the ps4 already has an UI and a GAME when you hook the cam up"

Just because it’s not mandatory doesn’t mean there is no space for it in the system.

How will the PS cam function with no % of the system ready for it’s use ?

Then that PS cam will be bought for nothing really, unless if it’s only there to make your setup look pretty.

Tormentos, c’mon.

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#199 tormentos
Member since 2003 • 33784 Posts

@acp_45 said:

@tormentos: You need very little for games. As I said.

There hasn’t been ONE game that has ever surpassed the 200-channel mark.

I don’t want to go into Audio tech because I’m not savvy when it comes to that.

But there are some things we don’t know.

But I do believe that the PS4 wouldn’t need any resources from the GPU when it comes to sound processing.

Unless, I’m a total idiot.

Well, that is the reason why it's mostly for Kinect.

Also, I brought up SHAPE so you could see why the Xbox One has such a GPU demand.

@Tighaman said:

@acp_45: TORM is downright retarded if he thinks power is not reserved for other things; no way you can be tech-savvy and come up with that conclusion. If you have a system but you KNOW you're going to have other things connected, you must reserve some of the system's resources. That cam for the PS4 already has a UI and a GAME when you hook the cam up, so you don't think there's some RESERVED power to run that UI and GAME, TORM? Come on, dude, you can't be that silly lol

Dude, the PS4 has 2 CPU cores reserved for the system as of now; it doesn't need huge GPU resources because it is not doing heavy compute on its UI like the Xbox One does. 10% of the Xbox One GPU is more than 100 GFLOPS; from the 1310 GFLOPS the Xbox One has, it's down to roughly 1180, and that ~130 GFLOPS is more than the total power of SHAPE, the audio block.

When you hook up the camera, idiot, and play that game, the system assigns resources to the camera like it does with any game that doesn't use the camera. The PS4 camera is not like Kinect, and the Xbox One UI is not like the PS4's either. Snap fu**ing requires GPU time, period: you are basically doing picture-in-picture while multitasking, with the game in a suspended state or still running. That is why the Xbox One needs a reservation; on the PS4 it is not like that, period.
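Do the rough math yourself; here's a quick sketch (the shader counts and clocks are the commonly cited figures for the two GPUs, and the reservation percentages are just the numbers being claimed in this thread, not anything official):

```python
# Back-of-the-envelope FLOPS math for the reservation claims in this thread.
# Peak single-precision GFLOPS = shader count * 2 ops/cycle (FMA) * clock in GHz.

x1_gflops  = 768  * 2 * 0.853   # ~1310 GFLOPS (Xbox One GPU, commonly cited figures)
ps4_gflops = 1152 * 2 * 0.800   # ~1843 GFLOPS (PS4 GPU, commonly cited figures)

claims = [
    ("X1 @ 10% reservation",     x1_gflops,  0.10),  # Microsoft's stated reservation
    ("PS4 @ 18% (claimed here)", ps4_gflops, 0.18),  # the 18-22% claim from this thread
    ("PS4 @ 2% (claimed here)",  ps4_gflops, 0.02),  # the counter-claim from this thread
]

for name, total, fraction in claims:
    reserved = total * fraction
    print(f"{name}: {reserved:.0f} GFLOPS reserved, {total - reserved:.0f} GFLOPS left for games")
```

That matches the 1310 → ~1180 GFLOPS figure above.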

@Tighaman said:

@tormentos: I got the info on the reservation from the guy on Beyond3D, and he said between 18 and 22%. Believe me, I never make up numbers on my own, because numbers never lie. Kinect's 10% has nothing to do with the reservation of power on the PS4. Do you actually think it takes more power to use Kinect for voice commands than to go from your controller to the PS4 system to the TV? The cam has its own UI and GAME already embedded in the system. You scare me, the way you think, but you can keep that thought. I see you can't change your mind; you're riding that Sony wave and you don't want to give me your hand before you crash into a bigger wave lol

Once again, link...

You are an IDIOT, with capital letters. How the fu** will the PS4 require 18 to 22% when it doesn't have Snap or a mandatory camera? How the fu**?

Yeah, in your crazy little world. You know what, I will play by your rules then.

I read on Beyond3D that the Xbox One's true reservation is 30 to 35%; now, I don't have a link, you must take my word for it...

This is basically what you are doing.

A game coming built into the drive, moron, doesn't mean system resources are reserved. What the fu**, man, where in hell do you pull these sad theories from...

Avatar image for tormentos
tormentos

33784

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#200 tormentos
Member since 2003 • 33784 Posts
@acp_45 said:

@Tighaman: LMAO xD

"you riding that SONY wave and you don't want to give me your hand before you crash into the a bigger wave"

That’s the best thing anyone has said in this forum.

:P

It makes totally sense what you have said.

"If you have a system but you KNOW you gonna have other things connected you must reserve some of the resources of the system that cam for the ps4 already has an UI and a GAME when you hook the cam up"

Just because it’s not mandatory doesn’t mean there is no space for it in the system.

How will the PS cam function with no % of the system ready for it’s use ?

Then that PS cam will be bought for nothing really, unless if it’s only there to make your setup look pretty.

Tormentos, c’mon.

You don't get it do you.?

You don't need to reserve resources on PS4 to do those things,the PS4 already has 2 CPU cores for system and OS,that includes the UI as well,the only reservation the PS4 could have is for display,not for the camera.

It doesn't mater that the PS4 has support for the cameras and games build into the system,it could have Killzone SF build in on the HDD that means nothing that is just another app installed on the HDD,that one is free so if some one want the camera they have something to play with,once you hook the camera and play the game the system moves all its resources but the CPU cores reserve for OS and system,and anything the machine is using for display.

On the xbox one has a reservation because MS build the xbox one to be a cable box remote,with snap features to run multiple apps at once.