Devs React to DX 12 Doubling Xbox One GPU Speed


#201  Edited By darkangel115
Member since 2013 • 4562 Posts

@StormyJoe said:

@tormentos, you can't even answer all my posts, guy - you have to cherry pick parts of some of them to try and make a point, and you even go as far as to counter claims I didn't even make. Weak. It just so fun proving that you are full of crap that I just can't seem to bring myself to ignore you like so many other people do.

It's really easy. Read his post, laugh, don't reply. It's what I do.


#202 highking_kallor
Member since 2014 • 594 Posts

@mrxboxone said:

Xbox One is the most technically advanced console ever imagined!

More powerful then PS4!

Imagine it actually coming true. I'm a dreamer, a clueless dreamer. La de da de la.


#203  Edited By highking_kallor
Member since 2014 • 594 Posts

@darkangel115 said:

@StormyJoe said:

@tormentos, you can't even answer all my posts, guy - you have to cherry pick parts of some of them to try and make a point, and you even go as far as to counter claims I didn't even make. Weak. It just so fun proving that you are full of crap that I just can't seem to bring myself to ignore you like so many other people do.

Its really easy. Read his post. Laugh, don't reply. Its what I do.

Some people lack self-control.


#204  Edited By tdkmillsy
Member since 2003 • 5869 Posts

Cloud is a joke - Tell that to Windows Azure, Amazon and Google. Just a few years ago, server-based services were running locally in businesses. Now massive amounts of data are held in the cloud, and that is only increasing. Give it a couple more years and broadband will be as much of a utility as electricity and gas in the major countries Microsoft is interested in. The cloud will differentiate the Xbox, and Sony has reacted with its purchase of Gaikai. The 24-hour online check-in was for DRM; the cloud can be used without it.

In fact the PS4 API LibGNM is ahead of DX12 and is now available to PS4 coders, so when DX12 arrives in holiday 2015 the PS4 will be welcoming DX12 to 2012...lol - This explains why the difference between current games is what it is: the PS4 has Mantle/DirectX 12-style features and the Xbox One doesn't. If it did, the release of DirectX 12 wouldn't have an impact, and Microsoft would be hiding the fact rather than shouting about it. So when DirectX 12 is released, the difference will be much smaller. All good news for Microsoft.

Always be a difference - Of course there will be, but it won't be what it is now. The PS4 doesn't have the power to go above 1080p, but the Xbox One has the power to get to 900p (1080p in certain scenarios). 900p versus 1080p is close enough that nobody will care.


#205 tormentos
Member since 2003 • 33784 Posts
@StormyJoe said:

@tormentos, you can't even answer all my posts, guy - you have to cherry pick parts of some of them to try and make a point, and you even go as far as to counter claims I didn't even make. Weak. It just so fun proving that you are full of crap that I just can't seem to bring myself to ignore you like so many other people do.

Oh dude, STFU. You are the dumbest IT guy I have seen in my entire life. If you can't get the difference between GPUs on PC, and can't even see how big the gap is when it is presented to you with facts, then you are either dumb or a blind, biased fanboy.

I have benchmarks on my side proving all my points. What do you have?

Oh, your years in IT and your argument about .NET, lol..

The difference in games is all I care about, and what has been shown so far is not promising for the Xbox One. In March they were outdone again by MGS5, which is 720p on Xbox One and 1080p with more effects on PS4. If you consider that small, have it your way; I know what it takes to double another card's resolution while keeping frame-rate parity.

@darkangel115 said:

@Wasdie said:

@darkangel115: The PS3 didn't have a huge RAM advantage, it had a huge RAM disadvantage. 256 mb split pools of two separate types of memory were horrible, especially considering devs had to use the Cell to compensate for the lacking GPU. The faster ram was given to the cell while the slower ram was given to the GPU. It was a terrible design choice.

I know that, but in the link provided, which was an article from 2010 so 3+ years into the gen. IGN said the PS3 had better RAM, which is beyond a joke, and goes to show how little these gaming sites really know about hardware. They do for clicks because clicks = money, and people will keep quoting their articles to try and prove a they are correct to other people who wouldn't change their mind regardless.

And what is your point, that IGN sucks? Because there is no debate here about that...lol


#206  Edited By ZoomZoom2490
Member since 2008 • 3943 Posts

DX12 has nothing to do with higher resolution.

Games will still be 720p, but without dropping below 30fps like they do now.

DX12 will allow less CPU overhead, meaning better/steadier fps - that's what a low-level API buys you.
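To put some (made-up) numbers on what "less CPU overhead" means, here is a toy model; the per-draw costs are assumptions, not measurements:

```cpp
// Toy model only: the per-draw costs below are assumptions, not measurements.
// It just shows why cutting CPU overhead per draw call steadies the frame rate.
#include <cstdio>

int main() {
    const int    draws_per_frame     = 2000;  // hypothetical scene
    const double old_api_us_per_draw = 10.0;  // assumed cost of a "thick" API draw call
    const double new_api_us_per_draw = 1.0;   // assumed cost with a thin, low-level API

    const double old_ms = draws_per_frame * old_api_us_per_draw / 1000.0; // 20 ms
    const double new_ms = draws_per_frame * new_api_us_per_draw / 1000.0; //  2 ms

    // A 30 fps frame is ~33 ms; a 60 fps frame is ~16.6 ms.
    std::printf("CPU time just submitting draws: %.1f ms vs %.1f ms per frame\n", old_ms, new_ms);
    // Less CPU time per frame means fewer missed frames (steadier fps), but it does
    // nothing for GPU-bound work, which is why the resolution itself doesn't change.
    return 0;
}
```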

The bad news is that the PS4 has already been using this technique since before the system launched, lol.

No matter what MS tries to do, they will always be behind, and once we see a PS4 game from Naughty Dog or Santa Monica it will be the final nail in the coffin.


#207 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

Cloud is a joke - Tell that to Windows Azure, Amazon and Google. Just a few years ago server based services where running locally in businesses. Now massive amounts of data are being held in the cloud and its just increasing. Give it a couple more years and broadband will be as much a utility as electric and gas in the major countries Microsoft are interested in. Cloud will differentiate the Xbox and Sony have reacted with there purchase of Gaikai. The 24 hour online access was for DRM and the cloud can be used without it.

In fact The PS4 APi LibGNM is ahead of DX12 and is now available to PS4 coders so when DX12 arrive on holiday 2015 the PS4 will be welcome DX12 to 2012...lol - This explains why the difference between current games is what it is, PS4 has Mantle/DirectX 12 features, Xbox One doesn't, if it did the release of DirectX 12 wouldn't have an impact and Microsoft would be hiding the fact not shouting about it. So when DirectX 12 is released the difference will be much less. All good news for Microsoft.

Always be a difference - Of course there will be but it wont be what it is now. The PS4 doesn't have the power to go above 1080P but the Xbox One has the power to get to 900p (1080p in certain scenarios). 900p or 1080p is close enough for nobody to care.

Those clouds are not the same thing MS was talking about. You can use the cloud for many things, but claiming it increases the Xbox One's power from 10 Xbox 360s to 40 Xbox 360s is a joke.

The cloud is not fast enough to deliver graphics over the internet, period. So on one side you fools are debating whether ESRAM's 140 to 150 GB/s is enough to keep the Xbox One GPU from starving, and on the other side you completely ignore what that number means for graphics: nothing complex can be delivered online. Not only do networks lack the speed, latency will kill anything; graphics need results within the same frame, and the data a GPU consumes every second is vastly bigger than what you can push through an internet connection.
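Rough numbers, just to show the scale of the problem (the broadband speed is an assumption; the 150 GB/s is the ESRAM figure above):

```cpp
// Back-of-the-envelope only; the broadband figure is an assumption.
#include <cstdio>

int main() {
    const double local_bw_gb_s  = 150.0;                          // ESRAM figure quoted above
    const double broadband_mbps = 100.0;                          // assumed fast home connection
    const double broadband_gb_s = broadband_mbps / 8.0 / 1000.0;  // = 0.0125 GB/s

    std::printf("Local bandwidth is ~%.0fx what the connection can carry.\n",
                local_bw_gb_s / broadband_gb_s);                  // ~12000x
    // Latency is the other wall: a 30 fps frame is ~33 ms end to end, while a
    // typical internet round trip already costs tens of milliseconds.
    return 0;
}
```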

Oh, and the Xbox One API is not DX11.2; it is DX11.X, already modified. That is the reason I have been telling you not to expect huge changes with DX12: the Xbox One API is already more efficient than DX11.2 on PC, just like the 360's version was.

Aside from some improvements, don't expect much. Nevertheless the PS4 will always be ahead; it has more power.

Actually, the PS4 does have the power to go beyond 1080p (even the 7770 can), but what TVs support those resolutions? Most TVs out there are 1080p, not higher.


#208  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

From Turn 10

Some features are already on XB1:

1) Bundles, which are part of "CPU Overhead: Redundant Render Commands"

2) Nearly zero D3D resource overhead, which should be part of "Direct3D 12 – Command Creation Parallelism"

----------------------------------------------------------------------------------------------------------------------------------------------------------

Features that aren't on XB1:

1) Pipeline State Objects (PSOs).

2) Resource Binding.

These features will be available on XB1 later. Also, "Descriptor Heaps & Tables", which is a sort of bindless rendering (page 19 under "CPU Overhead: Redundant Resource Binding"), would be possible only on GPUs that are fully DX11.2 capable (tier 2) and beyond. Considering that both DX11.2 and DX12 were announced for XB1 and the DX team is prototyping DX12 on XB1 hardware right now, it's likely that Descriptor Heaps & Tables will be available on XB1, too.
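For anyone curious what bundles actually look like in code, here is a rough sketch based on the public D3D12 preview material; it assumes the app has already created the device, root signature, pipeline state and a long-lived bundle allocator, and error handling is omitted:

```cpp
// Sketch only: a "bundle" is a small command list recorded once (state validated
// up front) and replayed cheaply every frame.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12GraphicsCommandList> RecordBundle(ID3D12Device* device,
                                               ID3D12CommandAllocator* bundleAlloc, // type BUNDLE; must outlive the bundle
                                               ID3D12RootSignature* rootSig,
                                               ID3D12PipelineState* pso)
{
    ComPtr<ID3D12GraphicsCommandList> bundle;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_BUNDLE,
                              bundleAlloc, pso, IID_PPV_ARGS(&bundle));

    bundle->SetGraphicsRootSignature(rootSig);
    bundle->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    bundle->DrawInstanced(3, 1, 0, 0);   // whatever this bundle should redraw each frame
    bundle->Close();                     // the expensive validation happens once, here
    return bundle;
}

// Per frame, replaying it is a single cheap call on the frame's direct command list:
void ReplayBundle(ID3D12GraphicsCommandList* frameList, ID3D12GraphicsCommandList* bundle)
{
    frameList->ExecuteBundle(bundle);
}
```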


#209 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

From that, it's safe to say that the Xbox One is in for a good bump. We'll find out at E3!! It also means the modified GPU is at least a tier 2 DirectX 11.2 card.


#210 GravityX
Member since 2013 • 865 Posts

DirectX 12 won't increase GPU performance. It will increase the CPU performance.

Both the Xbox One and PS4 have 6 CPU cores available for games; however, with DirectX 12, the Xbox One will have the equivalent of 12 cores.

============

Xbox One will have twice the CPU performance, like the PS3 last gen.

Xbox One has split memory, like the PS3 last gen.

Xbox One has a weaker GPU, like the PS3 last gen.

Xbox One is harder to develop for, like the PS3 last gen.

Conclusion: Xbox One will have the best-looking games at the end of this new generation, like the PS3.

=============

Xbox One is perfectly balanced.

12 CPU cores + 12 GPU shader cores + DirectX 12 = Win


#211  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@GravityX said:

Both for Xbox One and PS4 have 6 CPU cores for games, however with DirectX 12, Xbox One will have the equivalent of 12 cores.

============

Xbox One will have twice the CPU performance, like the PS3 last gen.

Xbox One has split memory, like the PS3 last gen.

Xbox One has a weaker GPU, like the PS3 last gen.

Xbox One is harder to developer for, like the PS3 last gen.

Conclusion Xbox One will have the best looking games at the end of this new generation, like the PS3.

=============

Xbox One is perfectly balanced.

12 CPU cores + 12 GPU shader cores + DirectX 12 = Win

Not sure if serious.... but that is a load of crap.


#212 gta50419
Member since 2010 • 350 Posts

People who believe something can't be improved, or that performance can't get better without new hardware, are senseless. It's 2014, not 2005, people.....


#213 gta50419
Member since 2010 • 350 Posts

Looks like trading my PS4 for an XB1 will pay off HUGE after all.


#214  Edited By Spitfire-Six
Member since 2014 • 1378 Posts

@ttboy: Thanks for that diagram, it cleared a lot of things up.


#216  Edited By xxgunslingerxx
Member since 2005 • 4275 Posts

@StormyJoe said:

@scatteh316 said:

@StormyJoe said:

@lunar1122 said:

watch as the xbox 1 is still 720p'ish after dx 12 releases.. Come on people. Direct X isnt the reason the xbone sucks.. Its the amount of ROPS and the terrible RAM configuration in the bone. Nothing they can do will help or solve that situation until some new hardware is put out there.

I love it when people who obviously don't know d!ck about hardware and software try and talk like they do using buzz words...

He's correct though, 16 ROP's is really piss poor for a next gen console aiming for 1080p.

Look, I have been writing software for... well, more years than I care to admit to. And, I will tell you first hand that optimized APIs can significantly improve an applications performance, just like poorly written APIs can cripple it. So without having any benchmark comparisons between DX 11's API performance and DX 12's API performance, making a blanket comment like "Its all because of hardware piece X" is nothing short of pure ignorance.

The difference between the PS4 and XB1 is, in the large scale of things, relatively minor. If you apply Moore's Law, the difference doesn't even amount to 1 CPU generation (2 years). So, is a 1-2 year old high end PC that much slower than a brand new high end PC?

I'm a developer as well, and yes, optimized APIs will improve performance. But like you said, how poorly written were the previous APIs? Most likely they were in fact fairly well written (so the gains will be minimal).

Secondly, I haven't done any research into this, but how compatible is the 7750-class card in the Xbone with DX12?

Thirdly, a two-year-old high-end PC will run Crysis 3 on high, whereas a new high-end PC will run Crysis 3 on ultra at the same fps. That is a fair difference, especially when the new high-end PC costs 100 dollars less.

Finally, unless you are a video game developer and not just an application dev (like I am), you don't really know what you're talking about... both of us are just making very, very educated guesses.

Also, I don't know why lems keep forgetting Sony can improve their APIs as well.


#218 jwsoul
Member since 2005 • 5467 Posts

If this is true and DX12 really is this big a leap forward, then excellent for PC gaming. Just to add: if these gains are made in software only, then Sony will not sit back and allow MS to dominate; they will find a way to make the same gains with the PS4 CPU through OpenGL software updates.


#219  Edited By delta3074
Member since 2007 • 20003 Posts

@tormentos said:


The xbox 360 didn't run DX 10,it could handle some things but DX 10 not it can't

I never said it did. Re-read my post; I said it was capable of certain DX10 subroutines.

I'm beginning to wonder whether you actually read my posts properly; it seems to me you read what you want to see when you read them.

Reading comprehension FTW


#220 FoxbatAlpha
Member since 2009 • 10669 Posts

Devs reacting who have never used DX12 and never will. Devs am jelly.


#221  Edited By misterpmedia
Member since 2013 • 6209 Posts

@MBirdy88 said:

Did people actually believe this tripe?

How the hell can an API double a GPU speed? it doesn't even make sense.....

DX12 will probably be REALISTICALLY 5-10% of a performance boost on X1 and maybe abit more on PC for SOME setups..... if that even.

the f*ck is wrong with lemmings?

They need secret sauce to hold on to! What I like to do is picture a merry-go-round labelled 'Secret Sauce' with all the lemmings holding onto the rails. The faster it spins, the more people let go, but no matter how hard the G-force hits, there will always be those diehards who can take the spin and hold on for eternity.


#222  Edited By handssss
Member since 2013 • 1907 Posts

@mrxboxone said:

Xbox One is the most technically advanced console ever imagined!

More powerful then PS4!

Sing with me, sing it for the year

Sing for the laughter and sing for the tear

Sing with me, if it's just for today

Maybe tomorrow the good Lord will take you away


#223 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:
@StormyJoe said:

@tormentos, you can't even answer all my posts, guy - you have to cherry pick parts of some of them to try and make a point, and you even go as far as to counter claims I didn't even make. Weak. It just so fun proving that you are full of crap that I just can't seem to bring myself to ignore you like so many other people do.

Oh dude STFU you are the dumbest IT i have see in my entire life,if you can't fu**ing get the difference between GPU that are on PC and if can't even see how big a gap is when is presented to you with facts,they you are either dumb or a blin d biased fanboy.

I have benchmarks on my side proving all my points what do you have.?

Oh your years as IT and you argument about .net lol..

The difference in game is all i care what has been show now is not promising for the xbox one,in march they were outdone again by MGS5 which is 720p on xbox one and 1080p with more effects on PS4,now if you consider that small have it your way,i know what you need to double another card resolution wise while keeping parity in frames.

@darkangel115 said:

@Wasdie said:

@darkangel115: The PS3 didn't have a huge RAM advantage, it had a huge RAM disadvantage. 256 mb split pools of two separate types of memory were horrible, especially considering devs had to use the Cell to compensate for the lacking GPU. The faster ram was given to the cell while the slower ram was given to the GPU. It was a terrible design choice.

I know that, but in the link provided, which was an article from 2010 so 3+ years into the gen. IGN said the PS3 had better RAM, which is beyond a joke, and goes to show how little these gaming sites really know about hardware. They do for clicks because clicks = money, and people will keep quoting their articles to try and prove a they are correct to other people who wouldn't change their mind regardless.

And what is your point that IGN suck.? Because there is no debate here about that...lol

See what I mean? Did you just skim over the part where I said the PS4 has a more powerful GPU?

Just bow out of this thread gracefully. You know you lost this one, why keep shaming yourself?


#224 StormyJoe
Member since 2011 • 7806 Posts

@xxgunslingerxx said:

@StormyJoe said:

@scatteh316 said:

@StormyJoe said:

@lunar1122 said:

watch as the xbox 1 is still 720p'ish after dx 12 releases.. Come on people. Direct X isnt the reason the xbone sucks.. Its the amount of ROPS and the terrible RAM configuration in the bone. Nothing they can do will help or solve that situation until some new hardware is put out there.

I love it when people who obviously don't know d!ck about hardware and software try and talk like they do using buzz words...

He's correct though, 16 ROP's is really piss poor for a next gen console aiming for 1080p.

Look, I have been writing software for... well, more years than I care to admit to. And, I will tell you first hand that optimized APIs can significantly improve an applications performance, just like poorly written APIs can cripple it. So without having any benchmark comparisons between DX 11's API performance and DX 12's API performance, making a blanket comment like "Its all because of hardware piece X" is nothing short of pure ignorance.

The difference between the PS4 and XB1 is, in the large scale of things, relatively minor. If you apply Moore's Law, the difference doesn't even amount to 1 CPU generation (2 years). So, is a 1-2 year old high end PC that much slower than a brand new high end PC?

Im a developer as well and yes optimized APIs will improve performance but like you said before how poorly written where the previous APIs? most likely they were in fact fairly well written (minimal performace gains)

secondly i didnt do any research into this i but how compatible it the 7750 card in the xbone with dx 12?

thirdly a 2 year old high end pc will run crysis 3 on high where as a new high end pc would run crysis 3 on ultra at the same fps

that is a fair diffrence expecially when the new high end pc is 100 dollars less

finally unless you are a video game developer and not just a application dev(like I am) you dont really know what your talking about ... both of us are just making very very very educated guesses

also i dont know why lems keep forgetting sony can improve their apis as well

In interviews with the devs prior to launch, they said the APIs sucked and they didn't even have final builds until a week or two before launch. @tormentos even went as far as to bash MS for it - something I am sure he wishes I'd forgotten about.

Apparently, the version of DX12 the XB1 is getting is optimized for the XB1.

And yes, these are educated guesses, but considering a lot of devs complained about the XB1's APIs, and no one has complained about Sony's, I would speculate that Sony's APIs were in better shape a while ago - there is a finite amount of optimization you can do on code.


#225 tormentos
Member since 2003 • 33784 Posts

@ttboy said:

From Turn 10

Some features are already on XB1:

1) Bundles which is part of "CPU Overhead: Redundant Render Commands"

2) Nearly zero D3D resource overhead which should be part of "Direct3D 12 – Command Creation Parallelism"

----------------------------------------------------------------------------------------------------------------------------------------------------------

Features that aren't on XB1:

1) Pipeline State Objects (PSOs).

2) Resource Binding.

These features will be available on XB1 later. Also "Descriptor Heaps & Tables" which is a sort of bindles rendering (page 19 under the "CPU Overhead: Redundant Resource Binding") would be possible only on GPUs that are fully DX11.2 capable (tier 2) and beyond. Considering that both DX11.2 and DX12 were announced for XB1 and DX team is prototyping DX12 on XB1 HW right now, it's likely that Descriptor Heaps & Tables will be available on XB1, too.

Thank you, another one for Tormentos...hahaha. How many times in this thread have I told you people that some of these features are already on the Xbox One? That the so-called gains you see on PC will not translate to console, because consoles are more efficient than PCs from the get-go?

@ttboy said:

From that its safe to say that the Xbox One is in for a good bump. We'll find out at E3!! This means that the modified GPU is at least a tier 2 DirectX 11 card.

DX12 will work on all GCN cards; you know what that means, right? Any feature it gets through the API, the PS4 can also get through LibGNM.

Hell, DX12 will work on Nvidia GPUs from the GTX 400 series and up; no special hardware is needed.

And as I already said, the jump in performance will not be much, since the Xbox One has had some of the features of DX12 since before launch...lol

@04dcarraher said:

@GravityX said:


Xbox One is perfectly balanced.

12 CPU cores + 12 GPU shader cores + DirectX 12 = Win

Not sure if serious.... but that is a load of crap.

Hahahahaaaaaaaaaaaaaaaaaa.... Agreeeeeeee.....

@FastRobby said:

You are one of the most ignorant people I have ever met. On the one hand you know a bit about your tech, but then you claim it's the cloud's fault that the AI is dumb, apparently it's not up to the developers no more... If you're going to say shit like that, you better say nothing at all

Let me say it again: Titanfall runs on the cloud and uses the cloud for AI, and that AI is as smart as a chair. As already shown in Angry Joe's review, he stood in front of 3 soldiers about 4 feet from them, all he did was strafe from left to right, and the AI could not hit him....Hahaha

Then he stopped and killed all 3..lol

So yeah, the cloud is sh**. It doesn't improve graphics like MS claims, and the AI done on the cloud has so far proven to be bad. The game runs a little higher than 720p with drops into the 30s, in a game where the developer claimed that FPS was king; they wanted to prove CBOAT wrong so badly that they moved the game from 720p to a little higher, and the game suffered for it. COD Ghosts doesn't have the drops Titanfall has at 720p; Titanfall should have stayed that way.

@FastRobby said:

The Xbox One API was really bad in the beginnen, a lot of developers have complained about it, but they got an updated one in February/March, and that one is much better. A lot of developers said they have no worries about the Xbox One anymore, and that the gap is getting much smaller with Sony.

Developers didn't really complain like they did when the PS3 came out; no matter how bad the Xbox One API could be, I am sure it was better than the Cell by a truckload. What developers complained about was the performance they got running those APIs and the reservation for Kinect, which Activision confirmed they asked MS about dropping. If the problem is the hardware being too weak, there is no fix for that.

Also, the Xbox One at launch effectively had less power than a 7770: 1.18 TF, because of the 10% reservation, versus 1.28 TF for the 7770. Now, without the reservation, it is roughly on par with the 7770. You can't really expect performance like that to top a much stronger 1.84 TF GPU in the PS4.
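For reference, here is where those TFLOP figures come from (peak = 2 ops per clock × shader count × clock, using the commonly published specs):

```cpp
// Where the TFLOP figures come from: peak = 2 ops (FMA) x shader count x clock.
// Shader counts and clocks are the commonly published specs; treat as approximate.
#include <cstdio>

int main() {
    auto tflops = [](double shaders, double ghz) { return 2.0 * shaders * ghz / 1000.0; };

    const double xb1    = tflops(768,  0.853); // Xbox One: 12 CUs x 64 ALUs @ 853 MHz -> ~1.31
    const double ps4    = tflops(1152, 0.800); // PS4:      18 CUs x 64 ALUs @ 800 MHz -> ~1.84
    const double hd7770 = tflops(640,  1.000); // HD 7770:  10 CUs x 64 ALUs @ 1 GHz   -> ~1.28

    std::printf("XB1 %.2f TF (%.2f TF with the 10%% reservation), PS4 %.2f TF, HD 7770 %.2f TF\n",
                xb1, xb1 * 0.9, ps4, hd7770);
    return 0;
}
```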


#226  Edited By bforrester420
Member since 2014 • 3480 Posts

@rocky_denace said:

Intel has come out recently also saying that DX12 is the biggest leap in a long long time it's looking like DX12 is the real deal and it will make GPU's do things and add performance gains never before seem until now.

I think if you factor in the latest Cloud demo of MS showing off real time demos of game using the cloud and you factor in DX12 and also Tiled Resources it's starting to look obvious that the X1 is going to be a powerhouse in a few years and do 60fps 1080p easily and most like have the graphics king games. Also AMD has hinted that the GPU in the X1 is not as close to a 7790 as people have claimed and it's more of an exotic design heavily modified and it's starting to look like now that this GPU has been specifically designed with forward thinking and DX12 in mind.

MS isn't dumb they aren't the biggest computer company in the world for nothing. Lets give MS some praise and be happy that the X1 has so much room to stretch it's legs in the coming years. Lets stop bickering PS4 is a great console with powerful hardware right out the gate but lets start to give MS due props they deserve with what is more and more seemingly a system designed with heavily forward thinking and can and will coming into it's own graphics in the coming years and start to open up lots of more power with superb software and API designs by MS some of the best engineers in the world.

Um, no. Both Apple and Google are considerably larger than Microsoft in Market Capitalization.


#228  Edited By Shewgenja
Member since 2009 • 21456 Posts

@FastRobby said:

Also PS4 never reaches 1.84TF because of the CPU, but keep saying it does, lol.

This is getting a bit sad. Why do you even pretend?


#229 GrenadeLauncher
Member since 2004 • 6843 Posts

@FastRobby said:

Also PS4 never reaches 1.84TF because of the CPU, but keep saying it does, lol.

Hahaha oh wow


#230 bforrester420
Member since 2014 • 3480 Posts

@FastRobby: You should just go back to not posting.


#231 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

In interviews with the devs prior to launch, they said the APIs sucked and they didn't even have final builds until a week or two before launch. @tormentos Even went as far as to bash MS for it - something I am sure he wishes I'd forgotten about.

Apparently, the version of DX12 the XB1 is getting is optimized for the XB1.

And yes, these are educated guesses, but considering a lot of devs complained about the XB1's APIs, and no one has complained about Sony's, I would speculate that Sony's APIs were in better shape a while ago - there is a finite amount of optimization you can do on code.

No, I don't wish that. But please, do talk about what developers were really complaining about. Hard to code? No, it wasn't that; it was performance and ESRAM complexity.

That bolded part is wrong. The Xbox One already has DX12 features; it has had them for a while, and they were used in Forza.

This is why I have been telling you: the Xbox One API has been more streamlined than the PC version from the get-go, and DX11.X on Xbox One already has some of these features, so don't expect the same gains you saw on PC; the Xbox One was already enjoying them.

Two features are already in and two more are to come; don't expect much.

The fact that Sony's APIs were in a better spot than MS's doesn't mean Sony's won't improve, for the 100th time...

Finally wrote that ASM I was looking forward to. Early results: PS4 surface tiling/detiling on the CPU is ~10-100x faster now. SIMDlicious!

https://twitter.com/postgoodism/status/439568280232013824

PS4 surface tiling is now, from early tests, 10 to 100 times faster on the CPU. 10 to 100 times, man. As you can see, just because no one complained about the PS4 doesn't mean there aren't things that can improve greatly; I just proved that with a link from a Sony ICE team member..

Mostly an sign of how wretched the original code was, but still. Thanks for reminding me to write full VRAM cache lines per iter!

Just a few tweets down, look at how he says how wretched the original code was. Hell, I think most developers complained about the Xbox One because of the gap they were getting versus the PS4 more than anything; no matter what, DX is easy to use on Xbox One. The problems come when trying to bring performance up, because it has some pitfalls like ESRAM, the 10% reservation and a not completely cooked API.

As you can see, there were suboptimal things on the PS4 side as well, which are being improved; no launch game took advantage of this new code, and neither did Infamous, which looks incredible even so.
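For anyone wondering what "surface tiling/detiling" even means, here is a simplified block-tiling example; it is not the PS4's actual tile format, just the general idea of the loop the ICE team rewrote in SIMD/assembly:

```cpp
// Simplified illustration: GPUs store textures in small tiles so that nearby pixels
// sit in nearby memory. This is NOT the PS4's real tile format, just a generic 8x8
// block layout to show what "tiling/detiling" a surface means. Assumes width and
// height are multiples of 8.
#include <cstdint>
#include <vector>

constexpr int TILE = 8;

std::vector<uint32_t> TileSurface(const std::vector<uint32_t>& linear, int width, int height)
{
    std::vector<uint32_t> tiled(linear.size());
    const int tilesPerRow = width / TILE;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const int tileIndex  = (y / TILE) * tilesPerRow + (x / TILE);
            const int insideTile = (y % TILE) * TILE + (x % TILE);
            tiled[tileIndex * TILE * TILE + insideTile] = linear[y * width + x];
        }
    }
    return tiled;
}
// A scalar loop like this is exactly the kind of code the tweet above says was
// rewritten in SIMD/assembly; moving whole cache lines at a time is where a
// 10-100x speed-up can come from.
```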


#232 tormentos
Member since 2003 • 33784 Posts

@FastRobby said:

Nope that's wrong, they complained about the tools also. Also PS4 never reaches 1.84TF because of the CPU, but keep saying it does, lol.

Really...

They didn't say the tools were hard to use or the console hard to code; they complained that they were getting low performance versus the PS4, and that ESRAM added complexity. And to think DX on Xbox One already had some DX12 features; imagine how much sadder it would have been without them.

Really? Please link me to where it says that the PS4 can't reach 1.84 TF because of the CPU..lol


#233 misterpmedia
Member since 2013 • 6209 Posts

@Shewgenja said:

@FastRobby said:

Also PS4 never reaches 1.84TF because of the CPU, but keep saying it does, lol.

This is getting a bit sad. Why do you even pretend?

Wouldn't that mean the Xbone doesn't reach its peak of 1.1/1.3 TF either? lol. Surely that's a lot worse for the Bone if he thinks that...


#234 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:

@StormyJoe said:

In interviews with the devs prior to launch, they said the APIs sucked and they didn't even have final builds until a week or two before launch. @tormentos Even went as far as to bash MS for it - something I am sure he wishes I'd forgotten about.

Apparently, the version of DX12 the XB1 is getting is optimized for the XB1.

And yes, these are educated guesses, but considering a lot of devs complained about the XB1's APIs, and no one has complained about Sony's, I would speculate that Sony's APIs were in better shape a while ago - there is a finite amount of optimization you can do on code.

No i i don't wish that,but please do talk about what developers were really complaining about,hard to code.? No it wasn't that it was performance and ESRAM complexity.

That bold part is wrong,the xbox one already has DX12 features it has has them for a while and they were use in Forza.

This is why i have been telling you,the xbox one API since the get go is more streamline than the PC version and already DX11X xbox one has some of this features,so don't expect the same gains you saw on PC the xbox one was already enjoying them.

2 features and already in 2 more to come don't expect much.

The fact that sony API were in a better spot than MS ones doesn't mean sony one will not improve for the 100 time...

Finally wrote that ASM I was looking forward to. Early results: PS4 surface tiling/detiling on the CPU is ~10-100x faster now. SIMDlicious!

https://twitter.com/postgoodism/status/439568280232013824

The PS4 surface tilling is now from early test 10 to 100 times faster on CPU,10 to 100 times man as you can see just because no one complained about the PS4 doesn't mean there aren't things that can't improve greatly,i just proved that with link from a sony ICE team member..

Mostly an sign of how wretched the original code was, but still. Thanks for reminding me to write full VRAM cache lines per iter!

Just some twits lower look at how he say how wretched the original code was,hell i think most developer complained about the xbox one because the gap they were getting with the PS4 more than anything no matter what DX is easy to use on xbox one,the problems comes trying to bring performance up because it has some pitfalls like ESRAM 10% reservation and not completely cook API.

As you can see there were wrong thing about the PS4 as well which are getting improve no launch game took advantage of this new code neither did Infamous which look incredible even so.

Again, I never said that the Sony GPU wasn't more powerful, or that Sony's APIs wouldn't improve, or that improving APIs on MS's side would make the consoles "even" with each other as far as hardware performance goes. I said optimized APIs would shorten the gap, and that it appears MS had far more room for improvement than Sony.

MS was shipping out new versions of their APIs within weeks of the console's launch. Devs (and your mecca NeoGAF) have said that the "resolution-gate" issue would disappear over time. You seem to keep omitting that in your replies... it must negate your "moon is made of cheese" mindset.


#236  Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@FastRobby:

@FastRobby said:

@tormentos said:

@FastRobby said:

Nope that's wrong, they complained about the tools also. Also PS4 never reaches 1.84TF because of the CPU, but keep saying it does, lol.

Really...

They didn't say the tool were hard to use or the console hard to code,they complain that they where getting low performance vs the PS4,ESRAM added complexity as well and to think DX on xbox one already had some DX 12 features,imagine how much sadder it would have been without them.

Really please link me to where it say that the PS4 can't reach 1.84TF because of the CPU..lol

I'm searching for the link, it's a slide from a presentation given by Sony.

The developers said that the tools sucked, that's why when they had an updated version, they said it was waaaaaaay better and that they don't have any worries anymore.

I believe you mean the Sniper elite comment by Rebellion. Source

Just because the Xbox One may have two features of DirectX 12, it does not mean they were optimized implementations. They could have been a hack of DirectX 11 in order to get them in (hence the DirectX 11.X reference).

The likely situation is that DirectX 12 was not ready to include in the Xbox release, so they hacked the existing DirectX 11 to extend the two classes mentioned above. They may have since been completely rewritten, since you don't optimize until later in the cycle. I would not be so quick to say it will not give the Xbox a big jump. There is one dev who is ecstatic about it, and that's telling. E3 is going to be World War III and I am excited to see how much crow will be eaten. More and more, that messed-up architecture is making sense.

Also, I believe the last two features of DirectX 12 require a tier 2 card, so not all features will go to all cards.


#237 deactivated-5cf3bfcedc29b
Member since 2014 • 776 Posts

@GravityX said:

DirectX 12 won't increase GPU performance. It will increase the CPU performance.

Both for Xbox One and PS4 have 6 CPU cores for games, however with DirectX 12, Xbox One will have the equivalent of 12 cores.

============

Xbox One will have twice the CPU performance, like the PS3 last gen.

Xbox One has split memory, like the PS3 last gen.

Xbox One has a weaker GPU, like the PS3 last gen.

Xbox One is harder to developer for, like the PS3 last gen.

Conclusion Xbox One will have the best looking games at the end of this new generation, like the PS3.

=============

Xbox One is perfectly balanced.

12 CPU cores + 12 GPU shader cores + DirectX 12 = Win

Only in your dreams will this happen.


#238 tormentos
Member since 2003 • 33784 Posts

@ttboy said:

@FastRobby:

I believe you mean the Sniper elite comment by Rebellion. Source

Just because Xbox One may have 2 features of Direct X12 it does not mean that they were optimized implementations. They could've been a hack of Direct X 11 in order to get them in (hence Direct X 11.x reference).

The likely situation is that Direct X 12 was not ready to include in the Xbox release, therefore they hacked the existing Direct X 11 to extend the 2 classes mentioned above. They may have been completely re-written since you don't optimize until later in the cycle. I would not be so quick to say that it will not give Xbox a big jump. There is one Dev who is ecstatic about it and thats telling. E3 is going to be world war III and I am excited to see how much crow will be eaten. More and more that messed up architecture is making sense.

Also I believe the last 2 features of Direct X 12 requires a tier 2 card so not all features will go to all cards.

Yes, and look: Watch Dogs is unconfirmed to be 1080p on Xbox One. What happened, did the tiling fail, or the tricks?

Dude, this is MS; stop inventing crap. DX11.X for the Xbox One was said to be a custom version from the get-go. Also, DX12 is a response to Mantle, and Mantle is a response to CONSOLE APIs. Yeah, even the Xbox 360 API, which is DX9-based, is more streamlined than DX11 on PC, because consoles target one single piece of hardware. So yes, the Xbox One will already have had DX12-style features by the time the PC ones arrive in holiday 2015.

They didn't hack anything; MS makes custom versions of its API all the time, dude. Stop inventing things.

http://auth.gamespot.com/forums/system-wars-314159282/cod-on-xbox-one-will-run-at-1080p-60fps-30875337/

Yeah, just like we ate crow from that thread ^^ you made...lol

There is no GCN2 hardware in the Xbox One; stop inventing crap.


#239 Shewgenja
Member since 2009 • 21456 Posts

If you want to double the power of the XBone, just move your couch and chairs twice as far from your TV.


#240 blackace
Member since 2002 • 23576 Posts

@misterpmedia said:

Ignore the PS4 dev, seems like a pointless comment. More quotes at the source.

PS4 ICE Team programmer Cort Stratton added that, “New SDKs can significantly improve performance on the same hardware, yes. Dunno about DX12/X1 specifically, of course; not my dept.

He also said that people have a right to be skeptical about performance gains. “Good; always be suspicious of ANY perf. improvement claims. e.g. what *exactly* got 50-100% faster? Faster than what? Details!”

Treyarch software engineer Dan Olson had a less amused take. “Here’s an article… no idea why people go on record for stuff like this.”

Programmer Dean Ashton found it downright hilarious. Either that or life-threatening judging by his response. “2x perf on Xbox One when using DX12? That article nearly made me choke on my cup of tea.”

Read more at http://gamingbolt.com/devs-react-to-...AoDlY3FJqQ5.99

Seems it's not just the fanboys that are skeptical. Discuss.

Are these the developers who are making XB1 games? I think I would listen to the guys below more than any developer when it comes to DX12. So all these guys are lying, I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."


#241 tormentos
Member since 2003 • 33784 Posts

@blackace said:

Are these developers who are making XB1 games? I think I would listen to these guys more than any developer, when it comes to DX12. So all these guys are lying I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

Yeah, AMD also claimed that all Xbox 360 games would be 720p minimum with 4xAA... which was infinitely more achievable than making hardware jump 4 generations ahead because of a damn API that helps with CPU overhead, and it still didn't happen..lol

The crow you will be eating next year will make you open 10 troll accounts and never post as blackace again..hahahaha


#242  Edited By Shewgenja
Member since 2009 • 21456 Posts

@tormentos said:

@blackace said:

Are these developers who are making XB1 games? I think I would listen to these guys more than any developer, when it comes to DX12. So all these guys are lying I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

Yeah AMD also claim that all xbox 360 games would be 720p minimum with 4XAA... Which is infinitely more achievable and failed, than making hardware jump 4 generations ahead because of a damn APi that will help with CPU over head..lol

The crow you will be eating next year will make you open 10 troll accounts and never post under balckace again..hahahaha

Why kid yourself? If TVBox720 becoming a reality didn't chase the astroturfers away, why would DX12 sauce flopping do it?


#243 misterpmedia
Member since 2013 • 6209 Posts

@blackace said:

@misterpmedia said:

Ignore the PS4 dev, seems like a pointless comment. More quotes at the source.

PS4 ICE Team programmer Cort Stratton added that, “New SDKs can significantly improve performance on the same hardware, yes. Dunno about DX12/X1 specifically, of course; not my dept.

He also said that people have a right to be skeptical about performance gains. “Good; always be suspicious of ANY perf. improvement claims. e.g. what *exactly* got 50-100% faster? Faster than what? Details!”

Treyarch software engineer Dan Olson had a less amused take. “Here’s an article… no idea why people go on record for stuff like this.”

Programmer Dean Ashton found it downright hilarious. Either that or life-threatening judging by his response. “2x perf on Xbox One when using DX12? That article nearly made me choke on my cup of tea.”

Read more at http://gamingbolt.com/devs-react-to-...AoDlY3FJqQ5.99

Seems it's not just the fanboys that are skeptical. Discuss.

Are these developers who are making XB1 games? I think I would listen to these guys more than any developer, when it comes to DX12. So all these guys are lying I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

We'll have to wait for E3 2016 for confirmation, as I'm surprised you haven't said yet ;). At the minute it's fairy dust and hope.


#244  Edited By blackace
Member since 2002 • 23576 Posts

@tormentos said:

@blackace said:

Are these developers who are making XB1 games? I think I would listen to these guys more than any developer, when it comes to DX12. So all these guys are lying I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

Yeah AMD also claim that all xbox 360 games would be 720p minimum with 4XAA... Which is infinitely more achievable and failed, than making hardware jump 4 generations ahead because of a damn APi that will help with CPU over head..lol

The crow you will be eating next year will make you open 10 troll accounts and never post under balckace again..hahahaha

I doubt it. lol!! I was on the money with PS4 demand; I'm pretty sure I'll be right about this as well. We'll see how many 720p games we see on XB1 after March 2015. LOL!! I would believe AMD, Intel and Nvidia over anyone on these boards, that's for sure.


#245  Edited By blackace
Member since 2002 • 23576 Posts

@misterpmedia said:

@blackace said:

@misterpmedia said:

Ignore the PS4 dev, seems like a pointless comment. More quotes at the source.

PS4 ICE Team programmer Cort Stratton added that, “New SDKs can significantly improve performance on the same hardware, yes. Dunno about DX12/X1 specifically, of course; not my dept.

He also said that people have a right to be skeptical about performance gains. “Good; always be suspicious of ANY perf. improvement claims. e.g. what *exactly* got 50-100% faster? Faster than what? Details!”

Treyarch software engineer Dan Olson had a less amused take. “Here’s an article… no idea why people go on record for stuff like this.”

Programmer Dean Ashton found it downright hilarious. Either that or life-threatening judging by his response. “2x perf on Xbox One when using DX12? That article nearly made me choke on my cup of tea.”

Read more at http://gamingbolt.com/devs-react-to-...AoDlY3FJqQ5.99

Seems it's not just the fanboys that are skeptical. Discuss.

Are these developers who are making XB1 games? I think I would listen to these guys more than any developer, when it comes to DX12. So all these guys are lying I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

We'll have to see at E3 2016 for confirmation like I'm surprised you haven't said yet ;). At the minute it's fairy dust & hope.

Never said anything about E3 2016. Stop pulling more BS out of your ass. Your credibility on here is continuing to sink into the abyss of a black hole. I said "this time next year". Let's see how many 720p/900p games are being announced at that time.

I also see you're not discrediting anything AMD, Intel & Nvidia are saying. lol!! Covering your ass, I'm sure.


#246 deactivated-58270bc086e0d
Member since 2006 • 2317 Posts

@misterpmedia: I doubt it will double shit. But I'm not a software developer, so I don't know. It sounds off to me. 1) Would it even be possible to double that GPU's power AT ALL? As in, if you were to overclock it right up to the brink of meltdown and then make the software on the thing as perfect for that GPU as it can possibly be, would it be double the overall power? Something tells me no, from what I've seen of PC overclocking.

2) How much more can developers and things like DX12 get out of the hardware? Programming in its current form has been around for about 30 years and has come along in leaps and bounds, but to say Microsoft has suddenly found a holy grail that will let something like a console GPU double its capacity using software alone sounds very unlikely to me.

I am going to be getting an Xbox One. I haven't liked PlayStation since the PS2, and the PS4 hasn't changed that so far; it just feels uninspiring. Whether the GPU in the Xbox One gets doubled in power or not, I'll still be buying it. But you know, every little bit helps.


#247  Edited By tormentos
Member since 2003 • 33784 Posts

@blackace said:

I doubt it. lol!! I've was on the money with PS4 demand, I'm pretty sure I'll be right about this as well. We'll see how many 720P games we see on XB1 after March 2015. LOL!! I would believe AMD, Intel and Nvidia over anyone on these boards, that's for sure.

Sure you were: $449 at Walmart yesterday, and yet the PS4 is the one sold out..hahaha

Yeah the demand is surely dying..lol

You were wrong and the PS4 continues to have high demand..

Oh please, you will believe any moron who posts pro-Xbox One...lol


#248 misterpmedia
Member since 2013 • 6209 Posts

@blackace said:

@misterpmedia said:

@blackace said:

@misterpmedia said:

Ignore the PS4 dev, seems like a pointless comment. More quotes at the source.

PS4 ICE Team programmer Cort Stratton added that, “New SDKs can significantly improve performance on the same hardware, yes. Dunno about DX12/X1 specifically, of course; not my dept.

He also said that people have a right to be skeptical about performance gains. “Good; always be suspicious of ANY perf. improvement claims. e.g. what *exactly* got 50-100% faster? Faster than what? Details!”

Treyarch software engineer Dan Olson had a less amused take. “Here’s an article… no idea why people go on record for stuff like this.”

Programmer Dean Ashton found it downright hilarious. Either that or life-threatening judging by his response. “2x perf on Xbox One when using DX12? That article nearly made me choke on my cup of tea.”

Read more at http://gamingbolt.com/devs-react-to-...AoDlY3FJqQ5.99

Seems it's not just the fanboys that are skeptical. Discuss.

Are these developers who are making XB1 games? I think I would listen to these guys more than any developer, when it comes to DX12. So all these guys are lying I presume?

AMD's Raja Khodury: "And it's not a small benefit. It's… like getting four generations of hardware ahead with this API."

Intel's VP of Engineering, Eric Mentezer: "This is absolutely, I think, the most significant jump in technology in a long, long time."

Nvidia's VP of Content and Technology, Tony Tamasi: "existing cards will see orders of magnitude improvements from DirectX 12's release, going from hundreds of thousands to millions and maybe tens of millions of system draws in a second."

We'll have to see at E3 2016 for confirmation like I'm surprised you haven't said yet ;). At the minute it's fairy dust & hope.

Never said anything about E3 2016. Stop pulling more BS out of your ass. Your crediability on here is continuing to sink into the abyss of a blackhole. I said, "this time next year". Let's see how many 720P/900P games are being announced at that time.

I also see you're not discrediting anything AMD, Intel & Nvidia are saying. lol!! Covering your ass, I'm sure.

lmfao, I almost spat out my tea. But yes, you're right, I got ahead of the game there, anticipating when you'll eventually push it back to E3 2016. That comment was aimed at your faux-insider comments dotted around old topics about it all 'happening' this year, and then suddenly it's all down to next year's E3. Interdasting. Also, addressing credibility: I'm sorry, but none of us have credibility on here. What kind of special club do you think this is, blackace? We're forum posters. I never had any credibility to 'sink into the abyss of a black hole' (bit vicious, damn :P!) in the first place. Sounds like I'm rustling your jimmies a bit. Also, side note: flattered that you thought I had credibility in the first place. Danke.

I didn't discredit them, but I will remain skeptical, like the other devs, especially for the Xbone; however, it does sound like it's going to be quite beneficial... for PC. The "doubling of GPU" sounds like PR hyperbole and a desperate attempt to reassure Xbone diehards that their console won't stay gimped for the rest of the gen. Once I see a FULL DX12 game running on actual Xbone hardware, I will reassess the situation.


#249 misterpmedia
Member since 2013 • 6209 Posts

@Dannystaples14 said:

@misterpmedia: I doubt it will double shit. But I'm not a software developer. I don't know, it sounds to me to be, 1 would that GPU even be possible to double its power AT ALL? As in if you were to overclock it right up to the brink of meltdown and then make the software on the thing as perfect for that GPU as it can possibly be, would it be double the overall power? Something tells me not from what I've seen of PC overclocking.

2. How much more can developers and stuff like DX12 get out of hardware? To me it seems like programming in its current form has been around for about 30 years, it has been coming along in leaps and bounds, but to say that Microsoft have suddenly found this holy grail that will take something like a console GPU to double its capacity using software alone sounds very unlikely to me.

I am going to be getting an Xbox One. I haven't liked Playstation since 2 and PS4 hasn't changed that so far. Just feel uninspiring. Whether the GPU on the Xbox One gets doubled in power or not I'll still be buying it. But you know every little helps.

Didn't they already upclock the GPU anyway? I share similar views. Nice post, and for the record, I've never said it wouldn't improve anything; I don't think anyone has said that lol.


#250 tdkmillsy
Member since 2003 • 5869 Posts

Anyone calling the cloud rubbish because the AI in Titanfall is bad is just an idiot.

It wouldn't be that hard to make the AI tougher if they wanted to. Even I could program them to have more hit points or armour. They could easily make them move around better, or shoot you more accurately or faster.

They make them the way they do by choice. Titanfall's servers are pretty rock solid, and 100,000+ virtual servers spinning up as required is pretty impressive.