Yea, it's happening: DirectX 12 on XB1 -Fable Legends/DX12-


#151 slimdogmilionar
Member since 2014 • 1343 Posts

@tormentos:

http://www.ign.com/articles/2010/08/26/xbox-360-vs-playstation-3-the-hardware-throwdown

Yeah, you see it says 256GB/s via eDRAM. Granted, we all know the 360 wasn't getting that speed in every scenario, but it could still hit those numbers; the Xbox 360 required eDRAM and main RAM to work concurrently. I already said PRT doesn't need ESRAM to work, but Intel and Nvidia both have chips coming out with embedded RAM, so I'm guessing the top guys in the industry are onto something new. Thank you for proving my point: PRT doesn't need ESRAM to work, but for what Microsoft envisioned for tiled resources they obviously do. Whether it's the same or not, M$'s vision was to use ESRAM. Once again you're trying to prove you know everything, and innovation just flies right over your head. I could go into more depth, but I just got done writing a huge paper and don't feel like going back and forth with you right now. Just know this: Nvidia has a GPU that supports DX12 at a hardware level, so why would M$ not make their own console support it at the hardware level? Even though older cards are DX12-ready, Nvidia has the only card right now that supports DX12 at a hardware level.

http://www.extremetech.com/computing/190581-nvidias-ace-in-the-hole-against-amd-maxwell-is-the-first-gpu-with-full-directx-12-support

Now, again I ask you: why would M$ not build their own console to support DirectX features at the hardware level?
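An editorial aside: the mechanism the two posters are arguing about is concrete enough to sketch. PRT (AMD's hardware term) and Tiled Resources (the Direct3D term) both describe a texture whose fixed-size tiles are only partially resident in memory, tracked by a page table, and nothing in that model dictates where the backing pages live: ESRAM, eDRAM, or plain main RAM all work. A toy Python sketch of the residency model (the class and method names are illustrative, not any real API):

```python
TILE_SIZE = 64 * 1024  # D3D tiled resources use 64 KiB tiles

class TiledTexture:
    """Toy model of a partially resident texture: only tiles that have
    been explicitly mapped to physical memory can be sampled."""

    def __init__(self, tiles_wide, tiles_high):
        self.shape = (tiles_wide, tiles_high)
        self.page_table = {}  # (tx, ty) -> physical page id

    def map_tile(self, tx, ty, page_id):
        # Rough analogue of D3D's UpdateTileMappings: commit one tile.
        self.page_table[(tx, ty)] = page_id

    def sample(self, tx, ty):
        # A resident tile returns its page; an unmapped one returns None,
        # the way hardware PRT flags an unmapped access to the shader.
        return self.page_table.get((tx, ty))

# An 8x8-tile texture (4 MiB virtual) with only 3 tiles resident:
tex = TiledTexture(8, 8)
for i, tile in enumerate([(0, 0), (3, 2), (7, 7)]):
    tex.map_tile(*tile, page_id=i)

resident_bytes = len(tex.page_table) * TILE_SIZE  # 192 KiB committed
```

Whether those committed pages sit in ESRAM or DDR3 is a placement decision layered on top; the tiling mechanism itself is memory-agnostic, which is the point both sides keep circling.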


#152 tormentos
Member since 2003 • 33784 Posts

@slimdogmilionar said:

@tormentos:

http://www.ign.com/articles/2010/08/26/xbox-360-vs-playstation-3-the-hardware-throwdown

Yeah, you see it says 256GB/s via eDRAM. Granted, we all know the 360 wasn't getting that speed in every scenario, but it could still hit those numbers; the Xbox 360 required eDRAM and main RAM to work concurrently. I already said PRT doesn't need ESRAM to work, but Intel and Nvidia both have chips coming out with embedded RAM, so I'm guessing the top guys in the industry are onto something new.

Thank you for proving my point: PRT doesn't need ESRAM to work, but for what Microsoft envisioned for tiled resources they obviously do. Whether it's the same or not, M$'s vision was to use ESRAM. Once again you're trying to prove you know everything, and innovation just flies right over your head. I could go into more depth, but I just got done writing a huge paper and don't feel like going back and forth with you right now. Just know this: Nvidia has a GPU that supports DX12 at a hardware level, so why would M$ not make their own console support it at the hardware level? Even though older cards are DX12-ready, Nvidia has the only card right now that supports DX12 at a hardware level.

http://www.extremetech.com/computing/190581-nvidias-ace-in-the-hole-against-amd-maxwell-is-the-first-gpu-with-full-directx-12-support

Now, again I ask you: why would M$ not build their own console to support DirectX features at the hardware level?

No it could not...

The link you quoted is from 2010; the one I posted about Major Nelson lying about eDRAM was from 2005, when the PS3 wasn't even out and MS was damage-controlling it..

The 278GB/s comes from adding the eDRAM bandwidth to the main memory speed, and it didn't work that way; you can't add bandwidth like that. In fact, on the Xbox 360 there was nothing that could feed the eDRAM 256GB/s.
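The arithmetic behind that objection is easy to check. The headline 360 number simply added two buses together, but traffic originating on the GPU could only reach the eDRAM die over a much narrower link, so the big internal figure was never something the rest of the system could feed. A quick sketch with the commonly cited (approximate) figures:

```python
# Commonly cited Xbox 360 bandwidth figures, GB/s (approximate).
edram_internal = 256.0  # between the eDRAM and its on-die ROP logic
gpu_to_edram = 32.0     # link from the Xenos GPU into the eDRAM daughter die
main_memory = 22.4      # 128-bit GDDR3 system memory

# The marketing total adds independent buses together...
marketing_total = edram_internal + main_memory  # the "278 GB/s" headline

# ...but GPU-originated traffic can enter the eDRAM no faster than the
# narrowest link on its path.
feedable = min(gpu_to_edram, edram_internal)  # 32 GB/s
```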

No, you sad monkey: PRT = Tiled Resources..

Reply to me when you accept that PRT and Tiled Resources are the same sh**...

Nvidia’s ace in the hole against AMD: Maxwell is the first GPU with full DirectX 12 support

From your own link: that means not the Xbox One. "First" means FIRST, which means the Xbox One doesn't have it, since it is GCN.

Hahahaaaaaaaaaaaaaaaaaa


#154 Tighaman
Member since 2006 • 1038 Posts

@tormentos: DX12 is Xbox; where do you think they're testing and incubating it? The first full DX12 game is gonna be on the Xbox One. You're still dreaming and hoping that it's a 7790 in the Xbox One when it has 1.3TF just for GPGPU. Just look at the games: show me a busy game on the PS4. Show me a locked-framerate game that's not a cross-gen or indie game. Show me something with more than 7 people on the screen. Show me interaction with the world. Show me the power lol


#155 Tighaman
Member since 2006 • 1038 Posts

@magicalclick: No, DX11 is a high-level API with too much legacy overhead; that's why they're making the DX12 API from scratch. That's what "to the metal" means.

PS4 has its own high- and low-level APIs, GNM and GNMX or something like that; it only has OpenGL and DX wrappers.
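The high-level vs. low-level distinction being gestured at here is mostly about per-draw-call CPU overhead: a high-level API like DX11 re-validates state on every call, while a low-level API lets the application record validated command buffers once and replay them cheaply. A toy model of just that difference (the costs are made-up units, not measurements of any real driver):

```python
VALIDATION_COST = 50  # abstract units of CPU work to validate pipeline state
SUBMIT_COST = 1       # units to submit one already-validated command

def high_level_draws(n):
    # High-level model: every draw call pays state validation again.
    return n * (VALIDATION_COST + SUBMIT_COST)

def low_level_draws(n):
    # Low-level model: validate once while recording a command list,
    # then each draw replays a cheap pre-baked command.
    return VALIDATION_COST + n * SUBMIT_COST

# 1000 draws: 51000 units of CPU work vs 1050 in this toy model.
```

The GPU does the same rendering either way; what a lower-level API buys you is CPU headroom for more draw calls, which is why the gains show up most on draw-call-bound scenes.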


#156  Edited By tymeservesfate
Member since 2003 • 2230 Posts

@cainetao11 said:

@lostrib said:

Didn't some dev already say that DX12 won't do much for X1?

I think Spencer himself said it won't. This only matters to insecure people.

I'm insecure, you're saying...?


#157  Edited By Krelian-co
Member since 2006 • 13274 Posts

for all you xbros out there


#158 cainetao11
Member since 2006 • 38036 Posts

@tymeservesfate said:

@cainetao11 said:

@lostrib said:

Didn't some dev already say that DX12 won't do much for X1?

I think Spencer himself said it won't. This only matters to insecure people.

I'm insecure, you're saying...?

Well, what does it really matter, unless you care about others' (cows') insults to the X1? And why would one care? The console is stronger than last gen's, and it's going to share most of the same games with the PS4. PS4 will have an edge in graphics. My PC shits on both, big deal. There will be games on the X1 that I want to play, so I own it and will play them. Why do you need to be right about DX12?


#159 Gue1
Member since 2004 • 12171 Posts

Lems expecting a miracle out of DX12 on the Xbone will be in for a disappointment. Consoles have low-level API support from day one, and whatever new stuff DX12 allows, the Xbone won't even be able to render it anyway.

DX12 might be big for PC, but for the Xbone it will be nothing. An API can't make weak hardware more powerful.


#160 tymeservesfate
Member since 2003 • 2230 Posts

@cainetao11 said:

@tymeservesfate said:

@cainetao11 said:

@lostrib said:

Didn't some dev already say that DX12 won't do much for X1?

I think Spencer himself said it won't. This only matters to insecure people.

I'm insecure, you're saying...?

Well, what does it really matter, unless you care about others' (cows') insults to the X1? And why would one care? The console is stronger than last gen's, and it's going to share most of the same games with the PS4. PS4 will have an edge in graphics. My PC shits on both, big deal. There will be games on the X1 that I want to play, so I own it and will play them. Why do you need to be right about DX12?

That's true, but like I said earlier in the thread, I didn't realize this was such a sensitive subject on here. I just saw a story about DX12 being implemented on the XB1 through the Fable and The Division games. The thought that the XB1 is finding ways/techniques to enhance its graphics capabilities over time is good and interesting to me, so I made the thread to discuss it. The thread wasn't really made to jab at the cows on here, though. It's not about me being right or wrong; this is a 'look what seems to be happening' thread. Yeah, it turned into the usual System Wars BS, but that's how it is on here.

If you read through the thread, I'm not instigating shii... I'm mostly replying to BS comments directed at me just for posting info, while adding more info I came across that reinforces the fact that something positive is going on between DX12 and the XB1; that's it. No one is saying what you seem to think I'm saying. Saying I'm insecure because I actually responded to certain people in my own thread, on a forum that discusses things, is ridiculous.

What, anyone pro-Xbox, or who posts good news about the console, gets attacked n slandered on here now lol? How dare I post positive news about the Xbox One?? I must be insecure...-_- smh. -sarcasm- No offense, but some of you self-hating Xbox fans need to stop being so insecure about liking the system yourselves. Nothing about this thread or the posters posting in it screams insecure, besides a few cows.


#161 cainetao11
Member since 2006 • 38036 Posts

@tymeservesfate said:

@cainetao11 said:

@tymeservesfate said:

@cainetao11 said:

@lostrib said:

Didn't some dev already say that DX12 won't do much for X1?

I think Spencer himself said it won't. This only matters to insecure people.

I'm insecure, you're saying...?

Well, what does it really matter, unless you care about others' (cows') insults to the X1? And why would one care? The console is stronger than last gen's, and it's going to share most of the same games with the PS4. PS4 will have an edge in graphics. My PC shits on both, big deal. There will be games on the X1 that I want to play, so I own it and will play them. Why do you need to be right about DX12?

That's true, but like I said earlier in the thread, I didn't realize this was such a sensitive subject on here. I just saw a story about DX12 being implemented on the XB1 through the Fable and The Division games. The thought that the XB1 is finding ways/techniques to enhance its graphics capabilities over time is good and interesting to me, so I made the thread to discuss it. The thread wasn't really made to jab at the cows on here, though. It's not about me being right or wrong; this is a 'look what seems to be happening' thread. Yeah, it turned into the usual System Wars BS, but that's how it is on here.

If you read through the thread, I'm not instigating shii... I'm mostly replying to BS comments directed at me just for posting info, while adding more info I came across that reinforces the fact that something positive is going on between DX12 and the XB1; that's it. No one is saying what you seem to think I'm saying. Saying I'm insecure because I actually responded to certain people in my own thread, on a forum that discusses things, is ridiculous.

What, anyone pro-Xbox, or who posts good news about the console, gets attacked n slandered on here now lol? How dare I post positive news about the Xbox One?? I must be insecure...-_- smh. -sarcasm- No offense, but some of you self-hating Xbox fans need to stop being so insecure about liking the system yourselves. Nothing about this thread or the posters posting in it screams insecure, besides a few cows.

Ok, but I never really said YOU are insecure. If you took it to mean you, one could say that is proof, couldn't they? I have read the thread, and I see a lot of back n forth about this being like the 4xMSAA thing or whatever. Dude, the console is awesome; the only people who won't accept that it will have games worth playing are fanboys, plus a small percentage of actual grounded people who will make that decision AFTER all the games are out, so at the end of the gen.


#162 scatteh316
Member since 2004 • 10273 Posts

Damn, @tormentos on a roll owning noobs in here...


#163 GamersJustGame
Member since 2014 • 323 Posts

Can we please get this bet official?


#164 GrenadeLauncher
Member since 2004 • 6843 Posts

@tymeservesfate said:

lack of advancement and game production.

Can easily say the same of the Xbone tbh.


#165 slimdogmilionar
Member since 2014 • 1343 Posts

@tormentos said:

Nvidia’s ace in the hole against AMD: Maxwell is the first GPU with full DirectX 12 support

From your own link: that means not the Xbox One. "First" means FIRST, which means the Xbox One doesn't have it, since it is GCN.

Hahahaaaaaaaaaaaaaaaaaa

You are an idiot if you truly believe that once M$ got the GPU from AMD they did not make their own adjustments and customize it to fit their vision; evidence of that is the fact that they stripped it down to add ESRAM. Xbox is a heavily customized architecture, customized by M$, not AMD, idiot. During the whole interview with Eurogamer, all Goossen and Baker did was talk about how they customized the Xbox One architecture, adding shortcuts into the hardware.

"Andrew Goossen: I'll jump in on that one. Like Nick said there's a bunch of engineering that had to be done around the hardware but the software has also been a key aspect in the virtualisation. We had a number of requirements on the software side which go back to the hardware."

The way the Xbox 360 was designed required both pools to work concurrently. Not only is this old news, but Goossen once again confirmed it in his interview with Eurogamer:

"Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM."


#166 tymeservesfate
Member since 2003 • 2230 Posts

@cainetao11 said:

@tymeservesfate said:

@cainetao11 said:

@tymeservesfate said:

@cainetao11 said:

@lostrib said:

Didn't some dev already say that DX12 won't do much for X1?

I think Spencer himself said it won't. This only matters to insecure people.

I'm insecure, you're saying...?

Well, what does it really matter, unless you care about others' (cows') insults to the X1? And why would one care? The console is stronger than last gen's, and it's going to share most of the same games with the PS4. PS4 will have an edge in graphics. My PC shits on both, big deal. There will be games on the X1 that I want to play, so I own it and will play them. Why do you need to be right about DX12?

That's true, but like I said earlier in the thread, I didn't realize this was such a sensitive subject on here. I just saw a story about DX12 being implemented on the XB1 through the Fable and The Division games. The thought that the XB1 is finding ways/techniques to enhance its graphics capabilities over time is good and interesting to me, so I made the thread to discuss it. The thread wasn't really made to jab at the cows on here, though. It's not about me being right or wrong; this is a 'look what seems to be happening' thread. Yeah, it turned into the usual System Wars BS, but that's how it is on here.

If you read through the thread, I'm not instigating shii... I'm mostly replying to BS comments directed at me just for posting info, while adding more info I came across that reinforces the fact that something positive is going on between DX12 and the XB1; that's it. No one is saying what you seem to think I'm saying. Saying I'm insecure because I actually responded to certain people in my own thread, on a forum that discusses things, is ridiculous.

What, anyone pro-Xbox, or who posts good news about the console, gets attacked n slandered on here now lol? How dare I post positive news about the Xbox One?? I must be insecure...-_- smh. -sarcasm- No offense, but some of you self-hating Xbox fans need to stop being so insecure about liking the system yourselves. Nothing about this thread or the posters posting in it screams insecure, besides a few cows.

Ok, but I never really said YOU are insecure. If you took it to mean you, one could say that is proof, couldn't they? I have read the thread, and I see a lot of back n forth about this being like the 4xMSAA thing or whatever. Dude, the console is awesome; the only people who won't accept that it will have games worth playing are fanboys, plus a small percentage of actual grounded people who will make that decision AFTER all the games are out, so at the end of the gen.

Fair enough, sorry for the rant. But again, I didn't make the thread with the mindset of "this will show cows Xbox/MS is awesome" lol. But I am going to answer anyone on here who seems like they need an answer, save a few people.


#168 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@gamersjustgame said:

Can we please get this bet official?

@tormentos @GrenadeLauncher Let's make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One, let's bet your accounts on it. Put up or shut up time, gentlemen.


#169  Edited By tormentos
Member since 2003 • 33784 Posts

@ttboy said:

@gamersjustgame said:

I hate to be that guy.....

But is the bet on or off?

I'm down to do it but none of the usual Sony guys want to. I wonder why.

You have an alter account with 200+ posts; this account is as irrelevant as your tdkmilksy one, also from 2003. Get blackace to bet with me and you have a deal, sucker; your account is worthless to me..hahaha

@Tighaman said:

DX11: high-level API

DX12: low-level API

It's not hard to understand: the Xbox One has not used a low-level API, it's just been using DX11. A bunch of jealous fanboys with nothing else to do. Torm is a crybaby, bringing up cross-gen ports and pasting things from the beginning of the gen trying to claim ownage, and no one's listening to him; he's System Wars' MrXmedia. What's the PS4's take now on SVGI? Because it ran terribly in the last demo.

The PS4 GPU can use PRT, but there are tiers to that, and the PS4 is on tier one.

Nvidia already said VTR (volume tiled resources) is the next big thing; now they're lying?

The DDR3 in the Xbox is not the same as what you buy from the store; it's a lot denser.

The PS4's two move engines, compared to the four in the Xbox One, don't have LZ color compression.

The 32MB is not for storing, stupid, it's for moving fast.

There is not one PS4 game right now that's 4xMSAA, 1080p, locked at 30fps, and open-world.

All them things that made the PS4 more powerful are null and void.

The PS4's fan is not running loud because of the GPU; the CPU is getting killed. That's why you never have and never will see a PS4 game with a lot of stuff going on on the screen at once. Not to mention it's bandwidth-bound, with 130GB/s of real bandwidth for the whole system, while the Xbox One has 150+ for the GPU alone.

DX11.X on Xbox One is already low-level, buffoon; another example of how you know sh** about what you're talking about and just invent crap.

Lords of the Fallen is not cross-gen, idiot; it's on XBO, PC, and PS4. What is your excuse for that?

Hahaha that bold part is total gibberish...hahahaha

Tiled Resources and PRT are the same sh**; they work the same way, and neither needs ESRAM.

PRT is hardware support for virtual texturing, or megatextures; it is not some magical thing that only MS has and Sony doesn't. You people keep using the same sh** excuse..

By the way, that end of the bold part, yeah, that is what Ronvalencia used to say..hahahaha

It's the same DDR3, dude, stop your bullshi**; the only difference is that it's soldered onto the board, unlike on PC where you buy it on a small card, and it has the same bandwidth, up to 68GB/s.

So what's next? The Blu-ray drive isn't the same as a PC's either; it's some custom-made Blu-ray drive that, in conjunction with ESRAM, increases the power of the XBO?

And the only game that does it on Xbox One looks like shit and is a freaking racer..hahahaa

DC doesn't use 4xMSAA yet freaking pisses all over FH2 graphically hahahaa, and does everything dynamic. Now you can change the argument to which will be better, to hide that you lost..lol

That second bold part... yeah...


#170  Edited By tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@tormentos: DX12 is Xbox; where do you think they're testing and incubating it? The first full DX12 game is gonna be on the Xbox One. You're still dreaming and hoping that it's a 7790 in the Xbox One when it has 1.3TF just for GPGPU. Just look at the games: show me a busy game on the PS4. Show me a locked-framerate game that's not a cross-gen or indie game. Show me something with more than 7 people on the screen. Show me interaction with the world. Show me the power lol

For once we agree on something: DX12 is on Xbox One, and it has been since day one. DX12 is basically MS's console tooling, so when DX12 hits you will see some gains on PC, which didn't have to-the-metal support; on XBO you will see nothing, because it's already been there since day one.

Watch Dogs is super busy and is superior on PS4; now you can hide behind cross-gen..hahahaha

No matter what, the Xbox One's best-looking game is enclosed crap filled with narrow pathways; it's basically a corridor shooter, but outside and without the shooting..hahah

Infamous has better frames than Ryse, and is 1080p open-world..hahahaaaaaaaaaaaaaaaaaa

Show me a next-gen game on Xbox One that looks next-gen and is not enclosed crap like Ryse..hahaha

@Tighaman said:

@magicalclick: No, DX11 is a high-level API with too much legacy overhead; that's why they're making the DX12 API from scratch. That's what "to the metal" means.

PS4 has its own high- and low-level APIs, GNM and GNMX or something like that; it only has OpenGL and DX wrappers.

DX11 is not on Xbox One..ahahhaaaaaaaaaaaaaaaaaaaaaaaaaa

It's DX11.X, and it's lower-level compared to PC; some developers have talked about it already..haha

But it's funny: in your other post you claim DX12 is on Xbox One already, yet in this one it isn't and they're building it from scratch..hahaha

@tymeservesfate said:

That's true, but like I said earlier in the thread, I didn't realize this was such a sensitive subject on here. I just saw a story about DX12 being implemented on the XB1 through the Fable and The Division games. The thought that the XB1 is finding ways/techniques to enhance its graphics capabilities over time is good and interesting to me, so I made the thread to discuss it. The thread wasn't really made to jab at the cows on here, though. It's not about me being right or wrong; this is a 'look what seems to be happening' thread. Yeah, it turned into the usual System Wars BS, but that's how it is on here.

If you read through the thread, I'm not instigating shii... I'm mostly replying to BS comments directed at me just for posting info, while adding more info I came across that reinforces the fact that something positive is going on between DX12 and the XB1; that's it. No one is saying what you seem to think I'm saying. Saying I'm insecure because I actually responded to certain people in my own thread, on a forum that discusses things, is ridiculous.

What, anyone pro-Xbox, or who posts good news about the console, gets attacked n slandered on here now lol? How dare I post positive news about the Xbox One?? I must be insecure...-_- smh. -sarcasm- No offense, but some of you self-hating Xbox fans need to stop being so insecure about liking the system yourselves. Nothing about this thread or the posters posting in it screams insecure, besides a few cows.

No, you saw a demo of Fable on PC running on a shiny new Maxwell GPU from Nvidia, and you falsely claim the gains will be on Xbox One..hahahaaaaaaaaaaa

Get this into your head: whatever the XBO can do in software, the PS4 can do as well, and probably faster; they have the same GPU and CPU, period, and that is where their capabilities come from.

The PS4 simply has more resources, period. MS has basically toned down their claims because they know it; xbox fans can't do the same.

@slimdogmilionar said:

You are an idiot if you truly believe that once M$ got the GPU from AMD they did not make their own adjustments and customize it to fit their vision; evidence of that is the fact that they stripped it down to add ESRAM. Xbox is a heavily customized architecture, customized by M$, not AMD, idiot. During the whole interview with Eurogamer, all Goossen and Baker did was talk about how they customized the Xbox One architecture, adding shortcuts into the hardware.

"Andrew Goossen: I'll jump in on that one. Like Nick said there's a bunch of engineering that had to be done around the hardware but the software has also been a key aspect in the virtualisation. We had a number of requirements on the software side which go back to the hardware."

The way the Xbox 360 was designed required both pools to work concurrently. Not only is this old news, but Goossen once again confirmed it in his interview with Eurogamer:

"Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM."

You are the idiot; "hardware" is everything inside the Xbox One, he didn't say GPU. The Xbox One has a few more things, two extra DMA engines and ESRAM, and that is about it; the rest of that shit is also on PS4, like the sound hardware, encoders and decoders, and all that crap, which was already disproved.

They didn't strip it down, idiot; they took a 7790 and fit it in. The reason they chose the 7790 was that Bonaire is smaller than the Pitcairn inside the PS4; since they needed ESRAM and all that crap inside the APU, something had to give, and it was the GPU. With ESRAM, a PS4-like GPU would not have fit.

It's the same GCN Sony has, but weaker; the biggest change they made was disabling 2 CUs for yields..hahaha

Oh, and it is confirmed: it's a 7790 with 2 CUs disabled and a lower clock speed, which is why the Xbox One has problems even keeping up with a 7770..hahahaha

Once again your link owns you...

Digital Foundry: When you look at the specs of the GPU, it looks very much like Microsoft chose the AMD Bonaire design and Sony chose Pitcairn - and obviously one has got many more compute units than the other. Let's talk a little bit about the GPU - what AMD family is it based on: Southern Islands, Sea Islands, Volcanic Islands?

Andrew Goossen: Just like our friends we're based on the Sea Islands family. We've made quite a number of changes in different parts of the areas. The biggest thing in terms of the number of compute units, that's been something that's been very easy to focus on. It's like, hey, let's count up the number of CUs, count up the gigaflops and declare the winner based on that. My take on it is that when you buy a graphics card, do you go by the specs or do you actually run some benchmarks? Firstly though, we don't have any games out. You can't see the games. When you see the games you'll be saying, "What is the performance difference between them?" The games are the benchmarks. We've had the opportunity with the Xbox One to go and check a lot of our balance. The balance is really key to making good performance on a games console. You don't want one of your bottlenecks being the main bottleneck that slows you down.

Balance is so key to real effective performance. It's been really nice on Xbox One with Nick and his team and the system design folks have built a system where we've had the opportunity to check our balances on the system and make tweaks accordingly. Did we do a good job when we did all of our analysis a couple of years ago and simulations and guessing where games would be in terms of utilisation? Did we make the right balance decisions back then? And so raising the GPU clock is the result of going in and tweaking our balance. Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing. But we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did.

Now everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that [can] cause you not to get the performance you want [if your design is out of balance].
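The trade-off Goossen describes checks out on paper: 14 CUs over 12 is about 16.7 per cent more peak ALU throughput, while the 800MHz-to-853MHz clock bump is about 6.6 per cent but accelerates the whole GPU (front end, ROPs, caches), not just shader math. Using the standard GCN peak-FLOPs formula (64 lanes per CU, 2 FLOPs per lane per clock):

```python
def gcn_tflops(cus, clock_ghz, lanes=64, flops_per_lane=2):
    # Peak single-precision throughput of a GCN-style GPU, in TFLOPS.
    return cus * lanes * flops_per_lane * clock_ghz / 1000.0

base = gcn_tflops(12, 0.800)      # pre-upclock Xbox One GPU: ~1.23 TF
shipped = gcn_tflops(12, 0.853)   # shipped clocks:           ~1.31 TF
fourteen = gcn_tflops(14, 0.800)  # hypothetical 14-CU part:  ~1.43 TF

cu_gain = fourteen / base - 1     # ~16.7%, but peak ALU only
clock_gain = shipped / base - 1   # ~6.6%, applied GPU-wide
```

Peak FLOPs is only the shader-math ceiling; which option wins in shipped games depends on where each title's bottleneck sits, which is exactly the balance argument Goossen is making.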

Hahahaaaaaaaaaaaaaaaaaaaaaa...

No DX12 full support..hahaha

Digital Foundry: And you have CPU read access to the ESRAM, right? This wasn't available on Xbox 360 eDRAM.

Nick Baker: We do but it's very slow.

This ^^ is a lie; MS openly lied to DF about this ^^. I remember there was a huge buzz over the Xbox One not being true HSA, because for HSA the CPU has to be able to see the data at all times; this eliminates the copying back and forth that goes on all the time between GPU and CPU on PC.

MS, in a desperate attempt at Hot Chips, showed a diagram with a black line connecting the ESRAM to the CPU, so people began to claim it was true HSA.

The article you're quoting is from October 5, 2013; on April 4 MS was parading around about how great ESRAM was, and let slip that the CPU has no access to ESRAM, which means that while the data is in ESRAM the CPU can't see it, so no true HSA for MS.

http://www.dualshockers.com/2014/04/04/microsoft-explains-why-the-xbox-ones-esram-is-a-huge-win-and-helps-reaching-1080p60-fps/

The eDRAM had nothing that could feed it 256GB/s, which was the problem; and again, having 278GB/s paired with a crappy GPU like Xenos was a joke. Far more advanced GPUs today don't use that; hell, the 7950 doesn't, and it's generations ahead.

The eDRAM couldn't read and write simultaneously, which basically kills the whole bandwidth-adding crap. The PS3 had five times less bandwidth and produced better-looking games; what does that tell you?

@gamersjustgame said:

Can we please get this bet official?

Is this your new account, ttboy?

Get blackace to bet with me and you have a deal..hahaha

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos @GrenadeLauncher Let's make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One, let's bet your accounts on it. Put up or shut up time, gentlemen.

Get blackace, no one would bet against your alt account dude, 200 posts..lol

Hell, get Mems hahahah....

Your account is already worthless, but I can bet my Eltormo account with you if you want? hahaha

#172 Heil68
Member since 2004 • 60718 Posts

tormentos giving out his daily dose of knowledge AND ownage. It's breathtaking.

#173  Edited By tonitorsi
Member since 2006 • 8692 Posts

@tormentos said:

@Tighaman said:

@tormentos: DX12 is Xbox, where do you think they're testing and incubating? The first full DX12 game is gonna be on the Xbox One. You're still dreaming and hoping that it's a 7790 in the Xbox One when there's 1.3TF just for GPGPU. Just look at the games: show me a busy game on the PS4? Show me a locked-framerate game that's not a cross-gen or indie game. Show me something with more than 7 people on the screen. Show me interaction with the world. Show me the power lol

For once we agree on something: DX12 is on Xbox One, it has been since 2005. DX12 is basically MS's console tooling, so when DX12 hits you will see some gains on PC, which didn't have to-the-metal support; on XBO you will see nothing, it is already there since day 1.

Watch Dogs is super busy and is superior on PS4, now you can hide behind cross-gen..hahahaha

No matter what, the Xbox One's best looking game is enclosed crap filled with narrow pathways; it's basically a corridor shooter but outside and without the shooting..hahah

Infamous has better graphics than Ryse, and is 1080p open world..hahahaaaaaaaaaaaaaaaaaa

Show me a next-gen game on Xbox One that looks next gen and is not enclosed crap like Ryse..hahaha

@Tighaman said:

@magicalclick: No, DX11 is a high-level API, too much legacy overhead; that's why they are making the DX12 API from scratch. That's what to the metal means.

PS4 has its own high and low level APIs, GNM & GNMX or something like that; it only has OpenGL and DX wrappers

DX11 is not on Xbox One..ahahhaaaaaaaaaaaaaaaaaaaaaaaaaa

It's DX11.X and it's lower level compared to PC, some developers have talked about it already..haha

But it's funny, in your other post you claim that DX12 is on Xbox One already, but in this one it is not and they are building it from scratch..hahaha

@tymeservesfate said:

that's true, but like i said earlier in the thread...i didn't realize this was such a sensitive subject on here. i just saw a story about DX12 being implemented on the XB1 through the Fable and The Division games. the thought that the XB1 is finding ways/techniques to enhance its graphics capabilities over time is good and interesting to me, so i made the thread to discuss it. the thread wasn't made to really jab at the cows on here though. it's not about me being right or wrong. this thread is a 'look what seems to be happening' thread. yea, it turned into the usual system wars BS but that's how it is on here.

if you read through the thread i'm not instigating shii....i'm mostly replying to BS comments directed towards me just for posting info, while adding more info i came across that reinforces the fact that something positive is going on between DX12 and the XB1, that's it. no one is saying what you seem to think i'm saying. saying i'm insecure because i actually responded to certain people in my own thread on a forum that discusses things is ridiculous.

what, anyone pro-Xbox, or who posts good news about the console, gets attacked n slandered on here now lol? how dare i post positive news about the Xbox One?? i must be insecure...-_- smh. -sarcasm- no offense, but some of you self-hating Xbox fans need to stop being so insecure about liking the system yourselves. nothing about this thread or the posters posting in it screams insecure besides a few cows.

No, you saw a demo of Fable on PC running on a shiny new Maxwell GPU from Nvidia and you falsely claim the gain would be on Xbox One..hahahaaaaaaaaaaa

Get this in your head: whatever the XBO can do in software the PS4 can do as well, and probably faster; they have the same GPU and CPU, period, that is where their capabilities come from.

The PS4 simply has more resources, period. MS has basically toned down their claims because they know it, and xbox fans can't do the same.

@slimdogmilionar said:

You are an idiot if you truly believe that once M$ got the GPU from AMD they did not make their own adjustments and customize the GPU to fit their vision; evidence of that is the fact that they stripped it down to add ESRAM. Xbox is a heavily customized architecture, customized by M$ not AMD, idiot. During the whole interview with Eurogamer, all Goossen and Baker did was talk about how they customized the Xbox One architecture, adding shortcuts into the hardware.

"Andrew Goossen: I'll jump in on that one. Like Nick said there's a bunch of engineering that had to be done around the hardware but the software has also been a key aspect in the virtualisation. We had a number of requirements on the software side which go back to the hardware."

The way the Xbox 360 was designed required you to use eDRAM and main RAM concurrently. Not only is this old news, but Goossen once again confirmed this in his interview with Eurogamer:

"Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM."

You are the idiot, hardware is everything inside the Xbox One, he didn't say GPU; and the Xbox One has a few more things, 2 extra DMA engines and ESRAM, that is about it. The rest of the stuff is also on PS4, like sound hardware, encoders and decoders and all that crap, which was already disproved.

They didn't strip it down, idiot, they took a 7790 and fit it in; the reason they chose a 7790 was because Bonaire is smaller than the Pitcairn inside the PS4. Since they needed ESRAM and all that crap inside the APU, something had to give, and it was the GPU; with ESRAM a PS4-like GPU would not have fit.

It's the same GCN Sony has but weaker; the biggest change they made was disabling 2 CUs for yields..hahaha

Oh, it is confirmed: it's a 7790 with 2 CUs disabled and a lower clock speed, which is why the Xbox One has problems even keeping up with a 7770..hahahaha

Once again your link owns you...

Digital Foundry: When you look at the specs of the GPU, it looks very much like Microsoft chose the AMD Bonaire design and Sony chose Pitcairn - and obviously one has got many more compute units than the other. Let's talk a little bit about the GPU - what AMD family is it based on: Southern Islands, Sea Islands, Volcanic Islands?

Andrew Goossen: Just like our friends we're based on the Sea Islands family. We've made quite a number of changes in different parts of the areas. The biggest thing in terms of the number of compute units, that's been something that's been very easy to focus on. It's like, hey, let's count up the number of CUs, count up the gigaflops and declare the winner based on that. My take on it is that when you buy a graphics card, do you go by the specs or do you actually run some benchmarks? Firstly though, we don't have any games out. You can't see the games. When you see the games you'll be saying, "What is the performance difference between them?" The games are the benchmarks. We've had the opportunity with the Xbox One to go and check a lot of our balance. The balance is really key to making good performance on a games console. You don't want one of your bottlenecks being the main bottleneck that slows you down.

Balance is so key to real effective performance. It's been really nice on Xbox One with Nick and his team and the system design folks have built a system where we've had the opportunity to check our balances on the system and make tweaks accordingly. Did we do a good job when we did all of our analysis a couple of years ago and simulations and guessing where games would be in terms of utilisation? Did we make the right balance decisions back then? And so raising the GPU clock is the result of going in and tweaking our balance. Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing. But we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did.

Now everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that [can] cause you not to get the performance you want [if your design is out of balance].

Hahahaaaaaaaaaaaaaaaaaaaaaa...

No DX12 full support..hahaha

Digital Foundry: And you have CPU read access to the ESRAM, right? This wasn't available on Xbox 360 eDRAM.

Nick Baker: We do but it's very slow.

This ^^ is a lie, MS openly lied to DF about this ^^ I remember there was a huge buzz over the Xbox One not being true HSA, because for HSA the CPU has to be able to see the data at all times; this eliminates the copying back and forth that goes on all the time between GPU and CPU on PC.

MS, in a desperate attempt at Hot Chips, showed a diagram with a black line connecting the ESRAM to the CPU, so people began to claim it was true HSA.

This article you're quoting is from October 5 2013; on April 4 MS was making a parade about how great ESRAM was, and let slip that the CPU has no access to ESRAM, which means while the data is in ESRAM the CPU can't see it, so no true HSA for MS.

http://www.dualshockers.com/2014/04/04/microsoft-explains-why-the-xbox-ones-esram-is-a-huge-win-and-helps-reaching-1080p60-fps/

The eDRAM had nothing to feed its 256GB/s, which was the problem, and again having 256GB/s with a crappy GPU like Xenos was a joke; much more advanced GPUs of today don't use that, hell the 7950 doesn't and it's generations ahead.

The eDRAM could not read and write at the same time, which basically kills the whole bandwidth-adding crap; the PS3 had 5 times less bandwidth and produced better looking games, what does that tell you?

@gamersjustgame said:

Can we please get this bet official?

Is this your new account, ttboy?

Get blackace to bet with me and you have a deal..hahaha

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

Get blackace, no one would bet against your alt account dude, 200 posts..lol

Hell, get Mems hahahah....

Your account is already worthless, but I can bet my Eltormo account with you if you want? hahaha

Tomato Goya going ham

#174  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

Don't bet your account Tormentos! Where will we get meltdowns like this if you lose?



#175 GrenadeLauncher
Member since 2004 • 6843 Posts

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

And how do you intend to objectively measure the difference in graphical output between exclusives released at launch and by the end of 2016?

What's the cut off for DX12 justifying itself as the sekrit sauce that saves the Xbone?

#176 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@GrenadeLauncher said:

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

And how do you intend to objectively measure the difference in graphical output between exclusives released at launch and by the end of 2016?

What's the cut off for DX12 justifying itself as the sekrit sauce that saves the Xbone?

I propose the following:

  1. A developer quote that says, implicitly or explicitly, that DX12 has increased the fps, the amount of objects on screen, or the resolution.

If we use screenshots then people will not be objective.

Time duration: no more than 1 year after the DX12 SDK has been in developers' hands.

#177 GrenadeLauncher
Member since 2004 • 6843 Posts

@ttboy said:

@GrenadeLauncher said:

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

And how do you intend to objectively measure the difference in graphical output between exclusives released at launch and by the end of 2016?

What's the cut off for DX12 justifying itself as the sekrit sauce that saves the Xbone?

I propose the following:

  1. A Developer quote that says implicitly or explicitly that DX12 has increased the fps/ Amount of objects on screen or resolution.

If we use screen shots then people will not be objective.

Time duration: No more than 1 year after DX 12 sdk has been in Developer's hands.

Any developer? What if that developer is working on a game that has a marketing deal with MS like The Division? What if the source is a click bait article on a website no one's heard of?

#178  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

And how do you intend to objectively measure the difference in graphical output between exclusives released at launch and by the end of 2016?

What's the cut off for DX12 justifying itself as the sekrit sauce that saves the Xbone?

I propose the following:

  1. A Developer quote that says implicitly or explicitly that DX12 has increased the fps/ Amount of objects on screen or resolution.

If we use screen shots then people will not be objective.

Time duration: No more than 1 year after DX 12 sdk has been in Developer's hands.

Any developer? What if that developer is working on a game that has a marketing deal with MS like The Division? What if the source is a click bait article on a website no one's heard of?

Any developer who legitimately has access to the SDK. There are already 2 devs who have stated that it's a big deal for the Xbox One (hence my confidence). Let's say it has to be a dev, not an anonymous source :).

#179 GrenadeLauncher
Member since 2004 • 6843 Posts

@ttboy said:

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

And how do you intend to objectively measure the difference in graphical output between exclusives released at launch and by the end of 2016?

What's the cut off for DX12 justifying itself as the sekrit sauce that saves the Xbone?

I propose the following:

  1. A Developer quote that says implicitly or explicitly that DX12 has increased the fps/ Amount of objects on screen or resolution.

If we use screen shots then people will not be objective.

Time duration: No more than 1 year after DX 12 sdk has been in Developer's hands.

Any developer? What if that developer is working on a game that has a marketing deal with MS like The Division? What if the source is a click bait article on a website no one's heard of?

Any Developer who legitimately has access to the SDK. There are already 2 devs who have stated that its a big deal for the Xbox One (hence my confidence). Lets say it has to be a Dev not an anonymous source :).

How will we know if they legit have access to the SDK? Their word on it?

Out of interest, which devs are they so far?

#180 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@ttboy said:

@gamersjustgame said:

Can we please get this bet official?

@tormentos@GrenadeLauncher Lets make the bet. Since both of you are so confident that DX12 will do nothing for the Xbox One lets bet your accounts on it. Put up or shut up time gentlemen.

And how do you intend to objectively measure the difference in graphical output between exclusives released at launch and by the end of 2016?

What's the cut off for DX12 justifying itself as the sekrit sauce that saves the Xbone?

I propose the following:

  1. A Developer quote that says implicitly or explicitly that DX12 has increased the fps/ Amount of objects on screen or resolution.

If we use screen shots then people will not be objective.

Time duration: No more than 1 year after DX 12 sdk has been in Developer's hands.

Any developer? What if that developer is working on a game that has a marketing deal with MS like The Division? What if the source is a click bait article on a website no one's heard of?

Any Developer who legitimately has access to the SDK. There are already 2 devs who have stated that its a big deal for the Xbox One (hence my confidence). Lets say it has to be a Dev not an anonymous source :).

How will we know if they legit have access to the SDK? Their word on it?

Out of interest, which devs are they so far?

Can you think of any dev who has said that they have access to an SDK and was lying? It's rare for a dev to lie directly, because it really destroys their reputation and is easily found out. The Killzone 1080p thing could be argued as a lie, since they're being sued over it, however that's rare.

Hehe, I've posted one dev many times but some do not want to believe him (Brad Wardell). He has been questioned many times on his statements and has defended them. You can research his background and who he currently works with. The other dev group was asked at E3 directly by a fan who took pics with them. They stated that it was a big deal. I'm not posting links since Google is your friend. If you don't believe me then take the bet :)....

#182 Boddicker
Member since 2012 • 4458 Posts

I'm still waiting on proof that software like DX12 can overcome the hardware blunders that were made on the X1.

I don't think it's going to happen.

#183 GrenadeLauncher
Member since 2004 • 6843 Posts

@ttboy said:

Can you think of any dev who has said that they have access to an SDK and was lying? It's rare for a dev to lie directly, because it really destroys their reputation and is easily found out. The Killzone 1080p thing could be argued as a lie, since they're being sued over it, however that's rare.

Hehe, I've posted one dev many times but some do not want to believe him (Brad Wardell). He has been questioned many times on his statements and has defended them. You can research his background and who he currently works with. The other dev group was asked at E3 directly by a fan who took pics with them. They stated that it was a big deal. I'm not posting links since Google is your friend. If you don't believe me then take the bet :)....

Oh yeah, Wardell. I forgot about him. I think I remember that second one, Forza Horizon devs, wasn't it? Taken with a grain of salt.

@magicalclick said:

Guys, you know tomato will win because he can spin it. There is no way you can prove DX12 can help on Xbox One, because no dev would be stupid enough to make two versions of the same game on Xbox One to prove it is faster on DX12. And on PC it won't matter for Xbox One, as DX11 on PC and Xbox One are different; MS already said that at the DX12 announcement.

The only thing you can get from devs is them saying they don't have to do extra work when developing both PC and Xbox One versions. And you can't guarantee such a statement translates to better graphics.

The only thing that may help is that the CPU is better at multi-threading now. But that is such a minor improvement when most of the rendering is on the GPU. Hell, Forward+ does GPGPU instead of CPU.

And most of all, it will take at least a year after DX12's release to see how the development process gets improved. Most devs will just use the old ways and slowly transition to the new way. It takes a much longer time to see it in effect.

This guy gets it. That's why I won't be taking you up on your generous bet offer, ttboy. Nothing personal. :)

#184 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@magicalclick said:

Guys, you know tomato will win because he can spin it. There is no way you can prove DX12 can help on Xbox One, because no dev would be stupid enough to make two versions of the same game on Xbox One to prove it is faster on DX12. And on PC it won't matter for Xbox One, as DX11 on PC and Xbox One are different; MS already said that at the DX12 announcement.

The only thing you can get from devs is them saying they don't have to do extra work when developing both PC and Xbox One versions. And you can't guarantee such a statement translates to better graphics.

The only thing that may help is that the CPU is better at multi-threading now. But that is such a minor improvement when most of the rendering is on the GPU. Hell, Forward+ does GPGPU instead of CPU.

And most of all, it will take at least a year after DX12's release to see how the development process gets improved. Most devs will just use the old ways and slowly transition to the new way. It takes a much longer time to see it in effect.

Anyway, I would hope Fable Legends looks as great as this experiment. As long as I get great games like Horizon 2, I am happy.

DX12 is in certain devs' hands right now. When they announce it they will want software to display. I don't think it will take more than a year before we see some tangible evidence. If I'm wrong, I'll leave this place and you will have Tomato to yourselves :) ..

#185  Edited By tormentos
Member since 2003 • 33784 Posts

@GoldenElementXL said:

Don't bet your account Tormentos! Where will we get meltdowns like this if you lose?

Come on dude, look at the bad things and tell me it deserved a 20-point deduction for that? Hell, Fifa just got an 8 here and it's the same sh** every year..hahahaa

#187 inb4uall
Member since 2012 • 6564 Posts

@clyde46 said:

So Mantle is not a thing anymore?

Mantle is no longer AMD exclusive, if what Linus has been saying is correct.

#188  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@magicalclick said:

@ttboy:

Problem is, even after they use DX12 it will just be like Horizon 2: it looks nice and plays well, but just like Horizon 2 they won't bring any proof that the new SDK helped them achieve things more easily.

If they don't say it then I lose the bet. It's fairly simple I think.

#189 slimdogmilionar
Member since 2014 • 1343 Posts

@tormentos said:

@slimdogmilionar said:

You are an idiot if you truly believe that once M$ got the GPU from AMD they did not make their own adjustments and customize the GPU to fit their vision; evidence of that is the fact that they stripped it down to add ESRAM. Xbox is a heavily customized architecture, customized by M$ not AMD, idiot. During the whole interview with Eurogamer, all Goossen and Baker did was talk about how they customized the Xbox One architecture, adding shortcuts into the hardware.

"Andrew Goossen: I'll jump in on that one. Like Nick said there's a bunch of engineering that had to be done around the hardware but the software has also been a key aspect in the virtualisation. We had a number of requirements on the software side which go back to the hardware."

The way the Xbox 360 was designed required you to use eDRAM and main RAM concurrently. Not only is this old news, but Goossen once again confirmed this in his interview with Eurogamer:

"Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM."

You are the idiot, hardware is everything inside the Xbox One, he didn't say GPU; and the Xbox One has a few more things, 2 extra DMA engines and ESRAM, that is about it. The rest of the stuff is also on PS4, like sound hardware, encoders and decoders and all that crap, which was already disproved.

They didn't strip it down, idiot, they took a 7790 and fit it in; the reason they chose a 7790 was because Bonaire is smaller than the Pitcairn inside the PS4. Since they needed ESRAM and all that crap inside the APU, something had to give, and it was the GPU; with ESRAM a PS4-like GPU would not have fit.

It's the same GCN Sony has but weaker; the biggest change they made was disabling 2 CUs for yields..hahaha

Oh, it is confirmed: it's a 7790 with 2 CUs disabled and a lower clock speed, which is why the Xbox One has problems even keeping up with a 7770..hahahaha

Once again your link owns you...

Digital Foundry: When you look at the specs of the GPU, it looks very much like Microsoft chose the AMD Bonaire design and Sony chose Pitcairn - and obviously one has got many more compute units than the other. Let's talk a little bit about the GPU - what AMD family is it based on: Southern Islands, Sea Islands, Volcanic Islands?

Andrew Goossen: Just like our friends we're based on the Sea Islands family. We've made quite a number of changes in different parts of the areas. The biggest thing in terms of the number of compute units, that's been something that's been very easy to focus on. It's like, hey, let's count up the number of CUs, count up the gigaflops and declare the winner based on that. My take on it is that when you buy a graphics card, do you go by the specs or do you actually run some benchmarks? Firstly though, we don't have any games out. You can't see the games. When you see the games you'll be saying, "What is the performance difference between them?" The games are the benchmarks. We've had the opportunity with the Xbox One to go and check a lot of our balance. The balance is really key to making good performance on a games console. You don't want one of your bottlenecks being the main bottleneck that slows you down.

Balance is so key to real effective performance. It's been really nice on Xbox One with Nick and his team and the system design folks have built a system where we've had the opportunity to check our balances on the system and make tweaks accordingly. Did we do a good job when we did all of our analysis a couple of years ago and simulations and guessing where games would be in terms of utilisation? Did we make the right balance decisions back then? And so raising the GPU clock is the result of going in and tweaking our balance. Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing. But we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did.

Now everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that [can] cause you not to get the performance you want [if your design is out of balance].

Hahahaaaaaaaaaaaaaaaaaaaaaa...

No DX12 full support..hahaha

Digital Foundry: And you have CPU read access to the ESRAM, right? This wasn't available on Xbox 360 eDRAM.

Nick Baker: We do but it's very slow.

This ^^ is a lie, MS openly lied to DF about this ^^ I remember there was a huge buzz over the Xbox One not being true HSA, because for HSA the CPU has to be able to see the data at all times; this eliminates the copying back and forth that goes on all the time between GPU and CPU on PC.

MS, in a desperate attempt at Hot Chips, showed a diagram with a black line connecting the ESRAM to the CPU, so people began to claim it was true HSA.

This article you're quoting is from October 5 2013; on April 4 MS was making a parade about how great ESRAM was, and let slip that the CPU has no access to ESRAM, which means while the data is in ESRAM the CPU can't see it, so no true HSA for MS.

Try to stay on topic. Bro you still don't get it nothing you posted proved the xbox one does have directx hardware support. Saying it doesn't is like saying AMD cards are built with no Mantle support!?

Would you yourself build a machine that's based on your own API without built in support. Think about tiled resources the fact that maxwell cards are built with tiled resources in mind and M$ has been working with them for who knows how long, do you really honestly think that they would not build the xbox one to support their own API. Honestly you can't be that biased.

Secondly, it's not the same. The Xbox One audio block is capable of 512 audio streams at once; the PS4's is only capable of 200 and will require help from the CPU.

Xbox One's SHAPE: Microsoft is very proud of their audio block, as it was completely made from the ground up by their Xbox team. It is capable of 512 audio channels (~300 more than PS4's block) and is used to decode and encode audio data. It will be used to do high-quality sound for games, and also help with Kinect's voice commands. They have also spoken about how they allowed it to freely exchange data with the CPU, in case a title decides to push some highly complex audio and requires CPU help to process. And yet you claim M$ did not customize their shit!?

When did I ever say anything about the CPU reading from ESRAM? That's why the Xbox has DMA engines. Stop trying to spin stuff to make yourself feel like you proved something. Besides, moving things from ESRAM to main memory can be done without any help from the CPU or GPU, thanks to those DMA engines.

So we have ESRAM, two extra DMA engines, and an optimised audio block, and you're telling me that M$ did not customize the Xbox One architecture?

How about this: you find me a link that proves the Xbox One has not been customized by M$, and stop trying to run away from the topic at hand.

#190 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@GrenadeLauncher said:

@ttboy said:

Can you think of any Dev who has said that they have access to a SDK and were lying? Its rare for a Dev to try and lie directly because it really destroys their reputation and is easily found out. The Killzone 1080p thing could be argued as a lie since they're being sued over it however thats rare.

Hehe, I've posted one dev many times, but some do not want to believe him (Brad Wardell). He has been questioned many times on his statements and has defended them. You can research his background and who he currently works with. The other dev group was asked at E3 directly by a fan who took pics with them. They stated that it was a big deal. I'm not posting links since Google is your friend. If you don't believe me then take the bet :)....

Oh yeah, Wardell. I forgot about him. I think I remember that second one, Forza Horizon devs, wasn't it? Taken with a grain of salt.

@magicalclick said:

Guys, you know tomato will win because he can spin it. There is no way you can prove DX12 can help on XboxOne. Because no devs would be stupid enough to make two versions of the same game on XboxOne to prove it is faster on DX12. And on PC, it wont matter with XboxOne as the DX11 on PC and XboxOne are different, MS already said that on DX12 announcement.

The only thing you can get from devs are just saying they don't have to work extra when developing both PC and XboxOne versions. And you can't guarantee such statement can translate to better graphics.

The only thing that may help is saying the CPU is better at multi-threading now. But, that is such a minor improvement when most of the renderings are on GPU. Hell, Forward+ does GPGPU instead of CPU.

And most of all, it will take at least a year after DX12 released to see how development process gets improved. Most of them will just use the old ways and slowly transition to the new way. It takes a much longer time to see it in effect.

This guy gets it. That's why I won't be taking you up on your generous bet offer, ttboy. Nothing personal. :)

Come on Bro we can be a wolf pack. You'll have one less lem on here.

#191 tormentos
Member since 2003 • 33784 Posts
@ttboy said:

I propose the following:

  1. A Developer quote that says implicitly or explicitly that DX12 has increased the fps/ Amount of objects on screen or resolution.

If we use screen shots then people will not be objective.

Time duration: No more than 1 year after DX 12 sdk has been in Developer's hands.

What developers say means sh**. Didn't Rebellion state that the Xbox One was running Sniper Elite 3 comfortably at 1080p? Didn't they call the game 60FPS when it isn't?

No. Parity means equality. That means if all multiplatform games after 2016 run at the same resolution and frame rate (2 to 4 frames difference at the most), then parity was achieved and DX12 really delivered. If any game after 2016 comes in at 912p, 900p, 792p or 720p vs 1080p, or 50 FPS on PS4 vs 30 or even 40 FPS on Xbox One, you are gone..hahaha
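For scale, the resolution gaps being argued about translate into per-frame pixel counts like this (a quick sketch assuming the common 16:9 framebuffer widths; 912p is left out because its exact width varied by game):

```python
# Pixel counts for the resolutions named above, assuming the
# common 16:9 framebuffer widths for each (912p omitted: its
# width varied from game to game).
resolutions = {
    "1080p": (1920, 1080),
    "900p": (1600, 900),
    "792p": (1408, 792),
    "720p": (1280, 720),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    ratio = pixels["1080p"] / count
    print(f"{name}: {count:,} pixels; 1080p draws {ratio:.2f}x as many")
```

So a 900p-vs-1080p gap is a 44% difference in pixels shaded per frame, and 720p-vs-1080p is 2.25x, which is why these gaps are so visible in face-offs.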

What developers say means crap if the end result is that the PS4 version is faster or runs at a higher resolution.

@ttboy said:

DX12 is in certain Devs hands right now. When they announce it they will want software to display. I don't think it will take more than a year before we see some tangible evidence. If so then I leave this place you will have Tomato to yourselves :) ..

DX12 hits in late 2015; it is not finished yet.

The Xbox One doesn't need DX12, because the gains DX12 brings to PC are gains the Xbox already has. You don't get it because you are too blind and biased..

GET IT, PEOPLE: DX12 IS MS'S CONSOLE TOOLING DONE ON PC.

There is no performance gain other than something minuscule, and it will certainly not double frames. The faster you get this, the faster you can get on with your life..

@magicalclick said:

Guys, you know tomato will win because he can spin it. There is no way you can prove DX12 can help on XboxOne. Because no devs would be stupid enough to make two versions of the same game on XboxOne to prove it is faster on DX12. And on PC, it wont matter with XboxOne as the DX11 on PC and XboxOne are different, MS already said that on DX12 announcement.

The only thing you can get from devs are just saying they don't have to work extra when developing both PC and XboxOne versions. And you can't guarantee such statement can translate to better graphics.

The only thing that may help is saying the CPU is better at multi-threading now. But, that is such a minor improvement when most of the renderings are on GPU. Hell, Forward+ does GPGPU instead of CPU.

And most of all, it will take at least a year after DX12 released to see how development process gets improved. Most of them will just use the old ways and slowly transition to the new way. It takes a much longer time to see it in effect.

Anyway, would hope Fable Legend looks as great as this experiment. As long as I get great games like Horizon 2, I am happy.

There is no spin. 100 developers can say DX12 improved their games, but if what is shown in those games is again 900p on XBO vs 1080p on PS4, the whole argument is null.

The problem with DX12 is that people don't understand that the gains DX12 brings to PC are already on consoles, and have been for endless years: higher draw calls and the to-the-metal push are old news there. Even MS admitted it was on the 360, stated it is on the Xbox One, and said they would port it to PC.
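The draw-call side of that argument can be sketched with a toy model (every number here is an illustrative assumption, not a benchmark of any real API): if each draw call costs the CPU a fixed overhead, the API's per-call cost caps how many draws fit in a frame, and a thin console-style API simply sets that cap higher.

```python
# Toy model: per-draw-call CPU overhead caps how many draws fit
# in a frame. All numbers are illustrative, not real measurements.
FRAME_BUDGET_US = 16_700  # one frame at ~60 FPS, in microseconds

def max_draw_calls(per_call_overhead_us: int) -> int:
    """Draw calls that fit in one frame if the CPU did nothing else."""
    return FRAME_BUDGET_US // per_call_overhead_us

thick_api = max_draw_calls(10)  # e.g. a thick-driver PC API path
thin_api = max_draw_calls(1)    # e.g. a console-style low-overhead path

print(thick_api, thin_api)  # 1670 16700: the thin path fits 10x more draws
```

Consoles already sat on the thin side of this model, which is the poster's point: lowering per-call overhead is exactly the headline gain DX12 brings to PC.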

I made a thread with irrefutable data, period. It came from MS itself.

I would not bet on that; the demo was made on a Maxwell GPU on PC, which isn't even close to Xbox One hardware.

#192 blackace
Member since 2002 • 23576 Posts
@gamersjustgame said:

DX12 will no doubt help. But as i have said. Both systems are comparable. I absolutely love the PS4 and X1. I am dying to play Bloodbourne, Halo and many others.

Yeah, I can't wait for both games. True gamers always win.

#194 Jankarcop
Member since 2011 • 11058 Posts

lol fighting for 2nd place gfx.

#195 lostrib
Member since 2009 • 49999 Posts

@Jankarcop said:

lol fighting for 2nd place gfx.

you get a gaming PC yet?

#197 lostrib
Member since 2009 • 49999 Posts

@Jankarcop said:

@lostrib said:

@Jankarcop said:

lol fighting for 2nd place gfx.

you get a gaming PC yet?

**** YOU

Do you want me to pass out first?

#198 Jankarcop
Member since 2011 • 11058 Posts

@lostrib said:

@Jankarcop said:

@lostrib said:

@Jankarcop said:

lol fighting for 2nd place gfx.

you get a gaming PC yet?

**** YOU

Do you want me to pass out first?

Sorry I don't like white meat.

#199 Edited By lostrib
Member since 2009 • 49999 Posts

@Jankarcop said:

@lostrib said:

@Jankarcop said:

@lostrib said:

you get a gaming PC yet?

**** YOU

Do you want me to pass out first?

Sorry I don't like white meat.

Well I'm not white, but it's probably for the best since I'm not into dudes

But it's totally cool if you are

#200 Edited By wis3boi
Member since 2005 • 32507 Posts

@lostrib said:

@Jankarcop said:

lol fighting for 2nd place gfx.

you get a gaming PC yet?