DX12 Will not change X1's graphics capabilities

#101  Edited By tormentos
Member since 2003 • 33784 Posts

@slimdogmilionar said:

First things first you are a MORON with capital letter the name was pull from DX the console wasn't built around DX,in fact the original xbox use a freaking Celeron and a Nvidia Gforce GPU,with a PC HDD,a PC DVD it was basically a PC with a closed OS,reason why it was so easy to mod i know i use to repair them and i mod them to.

Why the heck would they name the box after DirectX if they had no intention of incorporating DirectX into the system? That's like saying Apple would build an iPhone that's not built around iOS. It didn't matter what hardware the box used, you idiot, it was still built to run DirectX. Here's a link that talks directly about how the DirectX team birthed the Xbox.

“And then, one day, a couple of guys from the DirectX team dropped by my office, and they said they had this idea. It was called the DirectX Box.

With the DirectX team having won over Bill Gates with their idea, a new problem presented itself. They had to change it. FRIES: "Once we got going, the biggest step we had to take was removing the Windows OS from the Xbox. A key reason why Bill [Gates] chose our team instead of the Dreamcast guys was because our idea for the Xbox kept Windows' OS inside, theirs didn't."

So now what do you have to say? Every Xbox since the OG Xbox has been built by the DirectX team; the fact that you don't know this shows just how uneducated you really are when it comes to this stuff. How many times have people said PS4 is just off-the-shelf parts and Xbox One is a customized architecture?

Wtf are you talking about? The 360 was less powerful than the PS3 on paper, but it also had an architecture built on DirectX; the Xbox 360 was made in 2005 with a built-in tessellation unit, and DirectX didn't support tessellation until DirectX 11. The fact that Xbox was built on DirectX last gen was the main reason developers chose it: DirectX made it easier to port games to the 360, e.g. The Witcher 2. The Xbox One may have weaker hardware, but like I keep trying to tell you it has a customized architecture; if it did not have a customized architecture it would have the same setup as the PS4. M$ didn't hide the specs on anything, they kept quiet the same as Sony. When Sony announced the PS4, M$ made their announcement a month later, not in October or November.

Why would anybody need to talk about a secret Xbox One API? Everyone with any common sense already knows it's going to be using DirectX. Why beat a dead horse? Everyone knows that Xbox will be built using DirectX 12 because it's a Microsoft product invented by the DirectX team. You just want to downplay this because the Xbox has something that PS4 doesn't and you feel threatened by it. Regarding the cloud, Spencer also just stated that more games will be using the cloud next year; we have companies like Cloudgine taking advantage of this, not to mention the other big-name companies I've already showed you plenty of times before.

Yet again you say textures can't sit in ESRAM, but then turn around and say that ESRAM is a frame buffer for the Xbox One GPU. Listen, you idiot, that's all GDDR5 is on any graphics card: a frame buffer. It uses the data sent from the CPU to process and hold the images that need to be shown on screen. You know not a damn thing about computer systems, yet you sit here and argue as if you do; you obviously have no clue how a CPU and GPU work to create the images we see on screen.
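For reference, the frame-buffer sizes being argued about here are easy to check. A minimal sketch, assuming plain 32-bit color and depth targets (simplified formats, not official Microsoft figures):

```python
# Back-of-the-envelope check: does a basic 1080p color+depth pair
# fit in the Xbox One's 32 MB of ESRAM?

WIDTH, HEIGHT = 1920, 1080
BYTES_RGBA8 = 4          # 32-bit color target
BYTES_DEPTH = 4          # 32-bit depth/stencil

color_bytes = WIDTH * HEIGHT * BYTES_RGBA8
depth_bytes = WIDTH * HEIGHT * BYTES_DEPTH
total_mb = (color_bytes + depth_bytes) / (1024 ** 2)

ESRAM_MB = 32
print(f"1080p color+depth: {total_mb:.1f} MB of {ESRAM_MB} MB ESRAM")
```

A single color+depth pair fits comfortably, but a multi-target deferred-rendering G-buffer at 1080p can exceed 32 MB, which is the tension behind these posts.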

Dude, stop your spinning. No developer has ever said Xbox One is easier to develop for; now you are lying just to try and make yourself look good. How could the Xbox be easier to develop for when the PS4 is built on the same idea as the 360, minus the eDRAM? The fact that developers have been working with a unified system for years on the 360 gives them a head start, not to mention not having eDRAM to worry about.

LOL, it's funny that you say I don't know what I'm talking about and then I proceed to give you a breakdown of how computer systems work. Everything in bold is fact; if you question that, then you are in fact denying how computers work… you're not that dumb, are you?

The PS4 can’t work better than a PC GPU and CPU; if it could, then all games would run at 4K 60 fps. The fact that the PS4 has all GDDR5 in fact proves that. Sony’s own first-party studio already admitted that the CPU and GPU get hung up on each other. All APUs come with the GPU and CPU on die, but you don’t see people putting GDDR5 in their PCs; 1600–2100 DDR3 is what most people use, with 1600 and 1866 being the norm. Also, I’d like to hear your point of view on why the second bold part is lol-worthy……
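As a neutral aside on the DDR3-vs-GDDR5 point: peak bandwidth is just transfer rate times bus width. A quick sketch with illustrative configurations (not the consoles' exact board specs):

```python
# Peak-bandwidth sketch: why GPUs shipped with GDDR5 while desktop CPUs
# ran DDR3. Figures are illustrative example configurations.

def peak_gb_s(mt_per_s, bus_bits):
    """Megatransfers/s times bytes per transfer, in GB/s."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

ddr3_1600_dual = peak_gb_s(1600, 128)   # dual-channel 64-bit DDR3-1600
gddr5_256bit   = peak_gb_s(5500, 256)   # 256-bit GDDR5 at 5.5 GT/s
print(f"DDR3-1600 dual channel: {ddr3_1600_dual:.1f} GB/s")
print(f"256-bit GDDR5:          {gddr5_256bit:.1f} GB/s")
```

The wide-bus GDDR5 setup delivers several times the raw bandwidth, which is the whole reason it ends up on graphics cards despite its higher latency and cost.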

Bro, now you're trying to argue things that I’ve already pointed out to you. I told you already PRT/tiled resources was a way to save memory; what makes it different on Xbox is that M$ put in ESRAM for that sole purpose instead of spending more money on GDDR5. Duh, if you are building your API to support tiled textures, then why not.
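For what it's worth, the memory-saving idea behind PRT/tiled resources can be sketched quickly. The 64 KB tile size matches D3D tiled resources; the texture size and resident-tile count below are invented purely for illustration:

```python
# Tiled resources: commit memory only for the tiles the camera can see,
# not the whole texture. Example numbers are made up.

TILE_BYTES = 64 * 1024        # 64 KB tiles, as in D3D tiled resources
virtual_tiles = 16_384        # a 1 GB virtual texture
resident_tiles = 600          # tiles actually visible this frame

full_mb = virtual_tiles * TILE_BYTES / (1024 ** 2)
resident_mb = resident_tiles * TILE_BYTES / (1024 ** 2)
print(f"fully resident: {full_mb:.0f} MB; visible tiles only: {resident_mb:.1f} MB")
```

The point both sides keep circling: with tiling, the working set shrinks to a few tens of MB, which is also why a small fast pool like ESRAM becomes interesting at all.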

Great way to pick two completely bugged-out games that the PC community has been complaining about since launch. Both of your examples are a glitchy mess on PC; Battlefield was just a poorly coded game on all systems.

Trying to imply that because you have DDR3 + GDDR5 you get better performance is a joke, you would not, and the Xbox One has no GDDR5, which would also destroy your argument either way.

This statement just catapulted you to the top of the idiot ladder. CPUs do better with DDR3, that’s a fact you can’t dispute. GPUs do better with GDDR5, yet another fact. Xbox does not have GDDR5, it has ESRAM, you idiot. How did I destroy my argument? What would have happened if Sony had built the PS4 using only DDR3?

My explanation was not stupid, just too smart for you to understand. How can you say M$ used DDR3 because it was cheap? When you build a balanced system, you build a system where your CPU and GPU work the most efficiently. If the GPU needs bandwidth but the CPU needs to move data fast, you can’t accomplish that by being biased towards one or the other. You have to find a way to make both work at peak performance without any compromise.

Man, there is just too much stupidity in your post for me to address it all; I'm done lecturing you on how computers work. From the sounds of it, you're just some hotshot kid who found out how to mod a system or two using automated scripts, and now thinks he knows everything about technology and can gain some internet cred by using ctrl+c/ctrl+v to make himself feel like he knows something, throwing around a few technical terms he has no idea about. That frame buffer comment is to die for and lets me know you have no idea what you are talking about.

This is the only thing I will say to you, since it is clear that you love to spin sh** and invent crap: unless you prove to me that MS's DX team actually invented Intel CPUs and Nvidia GPUs, you have shit. The Xbox was a PC with a huge case and a closed OS, nothing more, nothing less; even the DVD drive was pulled from a PC.

But hey what did i say to you.?

""First things first you are a MORON with capital letter the name was pull from DX the console wasn't built around DX,in fact the original xbox use a freaking Celeron and a Nvidia Gforce GPU,with a PC HDD,a PC DVD it was basically a PC with a closed OS""

What your link say.....

“It was basically a PC running Windows, but they wanted to hide the windows-ness of it. So the Windows OS was going to be hidden, and it was going to be packaged and sold as a game console."

""To image what the early prototype was like, think of a PC with a hidden OS, and putting a PC game in there and it going in auto-install and auto run. So, a little bit like a console.""

Basically nowhere in that fu**ing link you just posted does it state that the Xbox was built around DX; in fact, it was built by the DX group, which was competing with another team, and they even state that they gave in to the other team's ideas, the design being one of them.

But then again, like I already told you: when you prove to me that the DX team is responsible for creating the CPU or GPU inside the Xbox One, you will have a point. Part of the reason MS lost 4 billion on the Xbox was because they bought off-the-shelf parts; they even got into a fight with Nvidia over pricing.

The Xbox was so PC-like that modders actually replaced the CPU with a 1.4GHz Pentium 3...lol. I bet you didn't know that..lol

And you would not know this because you started gaming yesterday, apparently. I didn't, and I used to install Evox on consoles; mine was modded..

My Xbox was modded and ran what was, at the time, a 120GB HDD.

That bold part there again proves you know shit about what you're talking about. The Xbox 360 and Xbox One are basically the same shit: the Xbox 360 had eDRAM, the Xbox One has ESRAM, but the main unified memory pool is still DDR3. So if either of the two consoles is Xbox 360-like, it's the Xbox One; the PS4 has no memory middleman like the Xbox 360 and XBO. But hey, don't take it from me...

"The Xbox One is pretty easy to understand because not just the hardware is similar to the PC, but everything like the SDK, the API is really similar to what you would find on a PC. On PS4 this is a little bit more complicated, but I personally worked on PS3 before."

The Witcher 3 developer..

http://www.eurogamer.net/articles/2013-11-19-the-witcher-3-what-is-a-next-gen-rpg

Now, what were you saying? But but but no developer has said the Xbox One is easier..hahaha

Balance meaning cheap-ass: DDR3 + ESRAM = the cheap way, and if the Xbox One had GDDR5, ESRAM would not exist on the Xbox One.

But since MS wanted to kinect you that is what you got..lol

Enjoy the ownage...hahha

@tymeservesfate said:

@tormentos

lol smh.

aye...Tormentos

Since you love his crap so much enjoy his ownage to..hahahaaa

@spitfire-six said:

Yes it will. Hardware is just that: a collection of transistors. Without the proper device drivers, you will get very low usage of your hardware; it will not be effective. It is true D3D12 will not change the number of transistors in the Xbox One, but the capabilities of the system are tied to how the different layers of software communicate and how much abstraction is present. Thus far, console fans have basically been arguing over who has more transistors: "My system is more powerful because it has 1000 transistors vs your 950." These are systems; it's not about a single aspect but the sum of the collection of components.

Total bullshit. Implying that the Xbox One will close that gap is a joke; that implies the Xbox One will improve over time to close a 100% gap in ROPs and a 50% gap in CUs, which no matter what can't be done. Power on these GPUs comes from CUs and their stream processors; the more you have, the better it performs, and the PS4 simply has more.

The funny thing is that it also implies the PS4 will remain static and never improve an inch, which is an even bigger joke.
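The CU argument above can be put into numbers. A rough sketch of theoretical peak shader throughput, using the widely reported launch CU counts and clocks (real-game performance depends on far more than this):

```python
# Theoretical shader peak: CUs * 64 ALUs per CU * 2 ops per cycle (FMA)
# * clock. Counts and clocks are the commonly reported launch figures.

def peak_gflops(cus: int, mhz: int) -> float:
    return cus * 64 * 2 * mhz / 1000

ps4_gflops = peak_gflops(18, 800)   # PS4: 18 CUs at 800 MHz
xbo_gflops = peak_gflops(12, 853)   # Xbox One: 12 CUs at 853 MHz
print(f"PS4 ~{ps4_gflops:.0f} GFLOPS, XBO ~{xbo_gflops:.0f} GFLOPS, "
      f"ratio {ps4_gflops / xbo_gflops:.2f}x")
```

That ratio is where the oft-quoted ~40% figure comes from; note it says nothing about the separate ROP or bandwidth differences, and no API change alters these numbers.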

#102  Edited By Spitfire-Six
Member since 2014 • 1378 Posts

Correction :

To believe that DX12 will not improve the capabilities of the Xbox One is ignorant and childish. You cannot separate "improves graphics" from "improves CPU"; the system as a whole will see improvement in its usage of the hardware. I didn't say anything about the PS4, that's your own injection. I'm looking for Sony to implement something to change the priority of tasks to fix the disproportional memory usage of the CPU vs GPU.

#103  Edited By tymeservesfate
Member since 2003 • 2230 Posts

@tormentos said:
@tymeservesfate said:

@tormentos

lol smh.

aye...Tormentos

Since you love his crap so much enjoy his ownage to..hahahaaa

take the bet.

#104 tdkmillsy
Member since 2003 • 5882 Posts

Funny how Phil says DX12 makes things easier but isn't the dramatic leap, and people claim "I told you so." Yet he talks about the cloud being important and offering a lot to games moving forward, and the same people claim he's wrong about the cloud.

go figure.

#105 GrenadeLauncher
Member since 2004 • 6843 Posts

@ttboy said:

@GrenadeLauncher said:

@general_solo76 said:

Yes.....but with the combined power of DX12, Azure cloud supercomputing, the secretly hidden nano superchips, and undeterred daily prayer, everything will fall into place and the world will be stunned by the power of the Xbox One! You just wait and see :D

Preach it, brother! AMEN!

Phil said ~35% of 2015 games will be cloud powered so both of you should accept the DX12 bet as well. But of course you will find some excuse not to.

lmaooooooooooooooooooooooooooooooooooooo

More of that AI guff probably. Cite your source, lem.

#106 GreySeal9
Member since 2010 • 28247 Posts

Lems have handled having the weaker console in a pretty embarrassing fashion.

#107 Cloud_imperium
Member since 2013 • 15146 Posts

@mikhail said:

There's always The Cloud.

Hehe

#108  Edited By slimdogmilionar
Member since 2014 • 1343 Posts

@tormentos said:

(full quote of post #101 snipped; see above)

Man, how stupid are you? Why would the DirectX team build an Nvidia GPU and CPU? They are a software team at M$; it clearly says in that link that it was built by the DirectX team. The other team they were going against was the Dreamcast team. They wanted to build a console box that ran DirectX: DirectX box, Direct X box, Xbox. Does it make sense to you now where the X in Xbox comes from? They didn't need to build the hardware, but they did add their own tweaks. The OG Xbox used a custom CPU; I can't remember if the GPU was custom or not.

Here's another link: In 1998, four engineers from Microsoft's DirectX team (Kevin Bachus, Seamus Blackley, Ted Hase and DirectX team leader Otto Berkes) disassembled some Dell laptop computers to construct a prototype Microsoft Windows-based video game console. The team hoped to create a console to compete with Sony's upcoming PlayStation 2, which was luring game developers away from the Windows platform. The team approached Ed Fries, the leader of Microsoft's game publishing business at the time, and pitched their "DirectX Box" console based on the DirectX graphics technology developed by Berkes' team. Fries decided to support the team's idea of creating a Windows DirectX based console.[8][9]

Bro, you are an idiot. You tried to get me to prove that the DirectX team made the hardware to try and prove something; stop tryna spin the argument. Everyone knows the Xbox was PC-like, everyone knew how to mod them. At the time, though, I did not like Xbox (only Halo); I had my PS2 and loved it, and I could not stand the OG Xbox controller. Stop acting like it was so hard to mod the Xbox, it was child's play; everyone I knew who had an Xbox modded theirs, so that proves nothing to me.

Man, you are stupid. How can you say that the Xbox One uses ESRAM as a frame buffer and then say the Xbox 360 is more like the Xbox One? On the Xbox 360, everything had to be sent back to GDDR3 for it to be read out; on PS4, everything has to be taken from GDDR5, so on both systems the CPU has access to the same data as the GPU. On Xbox One, you don't have to send things from ESRAM to main memory unless you want to; ESRAM acts as a frame buffer, aka the same reason dGPUs have their own RAM.

Yeah, you found the one dev that said PS4 was harder to develop for, but it would make sense for CDPR to favor the Xbox considering how they worked with M$ to port The Witcher 2 to 360. But I can pull plenty of devs saying Xbox is harder to develop for: "It means you have to do it in chunks or using tricks, tiling it and so on. It's a bit like the reverse of the PS3. PS3 was harder to program for than the Xbox 360. Now it seems like everything has reversed, but it doesn't mean it's far less powerful--it's just a pain in the ass to start with. We are on fine ground now but the first few months were hell."

No, balance meaning CPU and GPU working as fast as possible. How about going and buying some 2100 DDR3 RAM and telling me how cheap it is? They didn't want GDDR5 in the XB1; Goossen said that plain and clear, they said it "took them into an uncomfortable spot". Which makes complete sense.

I am enjoying you owning yourself.

#109 Tighaman
Member since 2006 • 1038 Posts

@daious: I didn't get my info from there, I got that diagram from there. I was going by old VGLeaks and Microsoft's own PDF that showed two GPUs; I was never talking about MrX or his insiders.

Everything I say I get from Sony or MS papers, and everything y'all say you get from PR; I would believe them faster than any blog or post from you.

Sounds like you're really mad. Come up with a technical reason why that 140GB/s is better than 200+GB/s of BW.

the only console going SSAA

the first console doing Clustered Forward+ rendering

only console doing hundreds of NPCs on screen

ALL THEM THINGS THAT SUPPOSEDLY SHOW YOUR POWER: 1080p, 4xMSAA, locked 30fps, open world, real-time reflections, weather, NOW IN ONE GAME lol. No PS4 game is even close, and of the ones doing those things I mentioned above, none are even 1080p lol. I will take the weaker console.
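For context on the bandwidth numbers being thrown around, here is the arithmetic behind the "200+" claim, with the usual caveat attached. This is a sketch using commonly reported press figures, not a benchmark:

```python
# Why simply adding the two bandwidth figures is misleading: the high
# ESRAM number only applies to whatever fits in its 32 MB. Figures are
# the commonly reported press numbers, used here only for illustration.

ESRAM_GB_S = 204    # Microsoft's revised ESRAM peak (mixed read/write)
DDR3_GB_S  = 68     # Xbox One main DDR3 memory
GDDR5_GB_S = 176    # PS4's single unified GDDR5 pool

combined = ESRAM_GB_S + DDR3_GB_S   # the "200+" style headline figure
print(f"headline combined: {combined} GB/s, but only 32 MB ever sees {ESRAM_GB_S}")
print(f"PS4: the whole 8 GB pool runs at {GDDR5_GB_S} GB/s")
```

The headline sum is real arithmetic, but the two pools serve different amounts of data, which is why adding them tells you little on its own.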

#110  Edited By StormyJoe
Member since 2011 • 7806 Posts

@tormentos: Umm... if fewer resources are needed for some tasks, those freed resources can be allocated to things like rendering. You didn't disprove anything. In fact, you actually strengthened my stance. Thank you?

#111 Heil68
Member since 2004 • 60714 Posts

@GreySeal9 said:

Lems have handled having the weaker console in a pretty embarrassing fashion.

True that.

#112  Edited By Animal-Mother
Member since 2003 • 27362 Posts

@kuu2 said:

@GrenadeLauncher: No Ignorance Launcher, a memory is what has happened to Sony's 2014 games lineup. Come the holidays The One proves which system is an actual console with games.

After 12 months of literally almost nothing?

#113  Edited By tymeservesfate
Member since 2003 • 2230 Posts

@Heil68 said:

@GreySeal9 said:

Lems have handled having the weaker console in a pretty embarrassing fashion.

True that.

SSSSHHHHHHHHHHH

#114 Heil68
Member since 2004 • 60714 Posts

@tymeservesfate said:

@Heil68 said:

@GreySeal9 said:

Lems have handled having the weaker console in a pretty embarrassing fashion.

True that.

SSSSHHHHHHHHHHH

I even have an Xbone!!!111

#115  Edited By tymeservesfate
Member since 2003 • 2230 Posts

@Heil68 said:

@tymeservesfate said:

@Heil68 said:

@GreySeal9 said:

Lems have handled having the weaker console in a pretty embarrassing fashion.

True that.

SSSSHHHHHHHHHHH

I even have an Xbone!!!111

SSHHHHH...lol

#116 Heil68
Member since 2004 • 60714 Posts

@tymeservesfate said:

@Heil68 said:

@tymeservesfate said:

@Heil68 said:

@GreySeal9 said:

Lems have handled having the weaker console in a pretty embarrassing fashion.

True that.

SSSSHHHHHHHHHHH

I even have an Xbone!!!111

SSHHHHH...lol

Wont you be me Neighbor and add BroknOath to Live so we can play on Heil Boulevard? AWWWW YEAAAHHH

#117 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@GrenadeLauncher said:

@ttboy said:

@GrenadeLauncher said:

@general_solo76 said:

Yes.....but with the combined power of DX12, Azure cloud supercomputing, the secretly hidden nano superchips, and undeterred daily prayer, everything will fall into place and the world will be stunned by the power of the Xbox One! You just wait and see :D

Preach it, brother! AMEN!

Phil said ~35% of 2015 games will be cloud powered so both of you should accept the DX12 bet as well. But of course you will find some excuse not to.

lmaooooooooooooooooooooooooooooooooooooo

More of that AI guff probably. Cite your source, lem.

Google is your friend.


#118 Daious
Member since 2013 • 2315 Posts

@Tighaman said:

@daious: I didnt get my info from there I got that diagram from there. I was going by old vg leaks and microsoft own pdf. that showed two gpu I was never talking about mrX or his insiders.

Everything I say I get from sony or MS papers and everything yall say you get from PR I would believe them faster than any blog or post from you.

Sound like you really mad come up with something technical reason why that 140gbs is better 200+gbs of BW.

the only console going SSAA

the first console doing Clustered Forward+ rendering

only console doing hundreds of NPCs on screen

ALL THEM THINGS THAT SO CALL SHOW YOUR POWER 1080p, 4xMSAA, LOCKED 30fps OPEN WORLD REAL TIME REFLECTIONS Weather NOW IN ONE GAME lol no ps4 game is even close and the one that are doing those things I mention above none of them are even 1080p lol I will take the weaker console.

If only MS invested in decent hardware instead of cheaping out like they did with the xboxone.If you really care about dx12 features, get a computer.

Xboxone and PS4 are made with tech that was dated on release. Ps4 just happened to be stronger. You can't accept that because you are fanboy.


#119  Edited By Tighaman
Member since 2006 • 1038 Posts

@daious: please, on a technical level, how is it not decent? Give me a technical reason. You're still talking about raw numbers, not how they pertain to each system. More numbers don't always mean better. If you put a B+ GPU with a B- CPU and B memory you still end up with a B average, but you can have a B GPU, B CPU and B+ memory and end up with a better average.

PS4 CPU 1.6GHz, bandwidth shared with the GPU

Xbox One CPU 1.75GHz, has its own bandwidth

PS4 GPU 800MHz, 1.84 TFLOPS, ~140GB/s bandwidth shared with the CPU, game audio and the HDD, plus compressing and decompressing data

32 ROPs

18 CUs

Xbox One GPU 853MHz, 1.31 TFLOPS, 150GB/s+ bandwidth

SHAPE has its own bandwidth and ~400 GFLOPS (in-game audio)

Has its own high-quality compression and decompression with its own bandwidth

16 ROPs

12 CUs

So break that down for me: how is the PS4 better?
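For reference, the TFLOPS figures thrown around in this thread fall out of a simple formula for GCN-family GPUs: each compute unit has 64 shader ALUs, each capable of one fused multiply-add (2 floating-point ops) per clock. A quick sketch, using the commonly cited CU counts and clocks:

```python
# Theoretical GCN throughput: shaders = CUs * 64, and each shader does
# one fused multiply-add (2 FLOPs) per clock cycle.
def gcn_tflops(cus: int, clock_mhz: float) -> float:
    shaders = cus * 64
    return shaders * 2 * clock_mhz * 1e6 / 1e12

ps4 = gcn_tflops(18, 800)   # 18 CUs at 800 MHz -> ~1.84 TFLOPS
xb1 = gcn_tflops(12, 853)   # 12 CUs at 853 MHz -> ~1.31 TFLOPS
print(round(ps4, 2), round(xb1, 2), round(ps4 / xb1, 2))  # 1.84 1.31 1.41
```

Note the clock difference barely matters next to the CU count: the ~40% gap comes almost entirely from 18 vs 12 CUs.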


#120 Heil68
Member since 2004 • 60714 Posts

@Tighaman said:

@daious: please, on a technical level, how is it not decent? Give me a technical reason. You're still talking about raw numbers, not how they pertain to each system. More numbers don't always mean better. If you put a B+ GPU with a B- CPU and B memory you still end up with a B average, but you can have a B GPU, B CPU and B+ memory and end up with a better average.

PS4 CPU 1.6GHz, bandwidth shared with the GPU

Xbox One CPU 1.75GHz, has its own bandwidth

PS4 GPU 800MHz, 1.84 TFLOPS, ~140GB/s bandwidth shared with the CPU, game audio and the HDD, plus compressing and decompressing data

32 ROPs

18 CUs

Xbox One GPU 853MHz, 1.31 TFLOPS, 150GB/s+ bandwidth

SHAPE has its own bandwidth and ~400 GFLOPS (in-game audio)

Has its own high-quality compression and decompression with its own bandwidth

16 ROPs

12 CUs

So break that down for me: how is the PS4 better?

Simple. PS4 is the world's most powerful video game console to have ever been created in the history of video games and by golly we're lucky that SONY has given it to us.


#121 Tighaman
Member since 2006 • 1038 Posts

@Heil68: I already know you and you are cool with me with your trolling ass lol


#122  Edited By Daious
Member since 2013 • 2315 Posts

@Tighaman said:

@daious: please, on a technical level, how is it not decent? Give me a technical reason. You're still talking about raw numbers, not how they pertain to each system. More numbers don't always mean better. If you put a B+ GPU with a B- CPU and B memory you still end up with a B average, but you can have a B GPU, B CPU and B+ memory and end up with a better average.

PS4 CPU 1.6GHz, bandwidth shared with the GPU

Xbox One CPU 1.75GHz, has its own bandwidth

PS4 GPU 800MHz, 1.84 TFLOPS, ~140GB/s bandwidth shared with the CPU, game audio and the HDD, plus compressing and decompressing data

32 ROPs

18 CUs

Xbox One GPU 853MHz, 1.31 TFLOPS, 150GB/s+ bandwidth

SHAPE has its own bandwidth and ~400 GFLOPS (in-game audio)

Has its own high-quality compression and decompression with its own bandwidth

16 ROPs

12 CUs

So break that down for me: how is the PS4 better?

PS4 is better because every single tech review says so. Every single one. The GPU is stronger. Benchmarks on the CPUs seem to show that, despite being different, performance is comparable (although some state the PS4's is better).

You are living with fanboy goggles on. The majority of multiplatform games are technically superior on PS4. There are way more 1080p games on the PS4 than on the Xbox One. If the Xbox One were stronger, we would see it in the real world. We don't. DX12 won't change that. Nothing will, because it's a weaker system. The sad part is that they are both pretty subpar consoles.

I mean, give it up, man.

Are you seriously comparing CPU/GPU clock speeds across two different components? Is that a joke? Clock speeds are only comparable if they are the same chip. They aren't. The fact that the Xbox One runs at a higher clock and is still only equal to a lower-clocked chip shows that it's weaker.

What is up for debate is RAM... woo... huge game changer...

There is absolutely no magical hardware from the future in the Xbox One. It's a dated console, just like the PS4.

Take off the fanboy goggles and have fun with your system instead of trying to prove hardware superiority that doesn't exist.

You and your buddy torm can throw the same numbers back and forth all you want.


#123 Tighaman
Member since 2006 • 1038 Posts

@daious: you can say the PS4 is better when games use old or deferred rendering engines, but you can also say there's not one game on the PS4 that isn't on the Xbox showing that power, and you still couldn't give me a technical reason. Last gen, when you cowboys were getting killed by the 360 multiplats, it was "but the exclusives show the power"; now that your exclusives are sucking, the power is in the multiplats, lol. Xbox is not built for deferred rendering unless it's tiled deferred, and no multiplat studio is going to change their code to fit one console. That's what I accept and that's cool with me. I've got a Game Gear, SNES, PS2, PS4, Xbox, 360, One, a PC and a couple of arcade machines; calling me a fanboy of one company is special.


#124 tormentos
Member since 2003 • 33784 Posts

@tdkmillsy said:

Funny how Phil says DX12 makes things easier but isn't the dramatic leap and people claim I told you so. But yet he talks about the cloud being important and offering a lot to games moving forward and the same people claim he's wrong about cloud.

go figure.

You know what the problem is, TTboy?

Ever since they started trying to bullshit people with the whole "40 times the power of the Xbox 360" thing, no one takes them seriously now.

They lied; the Xbox One can offload only some simple processes, mainly because:

1. Online speeds are not fast enough to do anything significant; even CPU memory bandwidth, which is slower than the GPU's, is still orders of magnitude faster than the fastest online connections out there.

So you see, when you start out trying to trick people with crap, don't get mad if they don't believe you later.
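The orders-of-magnitude gap is easy to check with back-of-envelope arithmetic: even a fast 100 Mbit/s connection moves about 12.5 MB/s, while the console's main memory moves tens of GB/s. A rough comparison (illustrative figures):

```python
# Rough comparison of data rates (illustrative, not exact figures).
NET_MBIT_S = 100                    # a fast home connection, in megabits/s
net_bytes_s = NET_MBIT_S * 1e6 / 8  # -> 12.5 MB/s

DDR3_GB_S = 68                      # Xbox One main memory, ~68 GB/s peak
mem_bytes_s = DDR3_GB_S * 1e9

print(f"memory is ~{mem_bytes_s / net_bytes_s:,.0f}x faster")  # ~5,440x
```

Which is why offloading anything bandwidth-sensitive to "the cloud" only works for slow, latency-tolerant jobs like AI decisions, not rendering.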

@slimdogmilionar said:

Man, how stupid are you. Why would the DirectX team build an Nvidia GPU and CPU? They are a software team, M$; it clearly says in that link that it was built by the DirectX team. The other team they were up against was the Dreamcast team. They wanted to build a console box that ran DirectX: the DirectX Box, direct X box, Xbox. Does it make sense to you now where the X in Xbox comes from? They didn't need to build the hardware, but they did add their own tweaks. The OG Xbox used a custom CPU; I can't remember if the GPU was custom or not.

Here's another link: In 1998, four engineers from Microsoft's DirectX team, Kevin Bachus, Seamus Blackley, Ted Hase and DirectX team leader Otto Berkes, disassembled some Dell laptop computers to construct a prototype Microsoft Windows-based video game console. The team hoped to create a console to compete with Sony's upcoming PlayStation 2, which was luring game developers away from the Windows platform. The team approached Ed Fries, the leader of Microsoft's game publishing business at the time, and pitched their "DirectX Box" console based on the DirectX graphics technology developed by Berkes' team. Fries decided to support the team's idea of creating a Windows DirectX based console.[8][9]

Bro, you are an idiot; you tried to get me to prove that the DirectX team made the hardware to try and prove something. Stop trying to spin the argument. Everyone knows the Xbox was PC-like; everyone knew how to mod them. At the time, though, I did not like Xbox (only Halo); I had my PS2 and loved it, and I could not stand the OG Xbox controller. Stop acting like it was so hard to mod the Xbox; it was child's play, everyone I knew who had an Xbox modded theirs. That proves nothing to me.

Man, you are stupid. How can you say that the Xbox One uses ESRAM as a framebuffer and then say the Xbox 360 is more like the Xbox One? On the Xbox 360 everything had to be sent back to GDDR3 for it to be read out; on PS4 everything has to come from GDDR5, so on both systems the CPU has access to the same data as the GPU. On Xbox One you don't have to send things from ESRAM to main memory unless you want to; the ESRAM acts as a framebuffer, the same reason discrete GPUs have their own RAM.

Yeah, you found the one dev that said the PS4 was harder to develop for, but it would make sense for CDPR to favor the Xbox, considering how they worked with M$ to port The Witcher 2 to the 360. But I can pull plenty of devs saying the Xbox is harder to develop for: "It means you have to do it in chunks or using tricks, tiling it and so on. It's a bit like the reverse of the PS3. PS3 was harder to program for than the Xbox 360. Now it seems like everything has reversed but it doesn't mean it's far less powerful--it's just a pain in the ass to start with. We are on fine ground now but the first few months were hell."

No, balance means the CPU and GPU working as fast as possible. How about going and buying some DDR3-2133 RAM and telling me how cheap it is. They didn't want GDDR5 on the X1; Goossen said that plain and clear, he said it "took them into an uncomfortable spot". Which makes complete sense.

I am enjoying you owning yourself.

That is talking about software, you idiot, not hardware. The Xbox ran an Intel CPU with an Nvidia GPU; neither was modified. In fact, the Xbox CPU could even be swapped for a faster PC one.

And lol, Wikipedia, which even you can edit..hahaha

So no, the Xbox wasn't built around DX; in its first stage it was a PC running Windows, which has DX on it. The hardware it used was already made and on the market..hahahaha

You are the moron saying the Xbox was built around DX. If you say the Xbox was built around DX, that means its components were designed with DX as the base, which isn't true. Stop owning yourself, because this whole argument comes from you wanting to claim the Xbox One is built around DX12 when it's not..hahaha The Xbox wasn't, and the 360 wasn't either; none of the hardware was specially made for DX, so no.

Everything on Xbox One also has to be placed in DDR3; why the fu** do you think it has 8GB of memory, for show?

See, this is the problem with your sad argument: you quote things out of context. What Rebellion was talking about in the part you quoted was ESRAM, not the Xbox One's tools and software.

"I think eSRAM is easy to use. The only problem is…Part of the problem is that it's just a little bit too small to output 1080p within that size," he said. "It's such a small size within there that we can't do everything in 1080p with that little buffer of super-fast RAM."

From your own link ^^..ahahah ESRAM is easy to use; the problem is that it is too SMALL, which the Metro developer also claims.

So yeah, the Xbox One is easier than the PS4 because of DX, everyone knows this; the thing is that the PS4 is not as hard as the PS2 and PS3 were, it's more like the PS1, which was also very easy to code for.

If the CPU and GPU were working as fast as possible, the XBO GPU would not be clocked at 853MHz, since the 7790 is clocked at 1GHz..lol

Dude, MS doesn't pay for DDR3 even half of what you pay; they buy in bulk and get mass-market pricing, which is far away from what you pay. But hey, have you tried to buy GDDR5?

Everyone knows DDR3 is cheap as hell compared to GDDR5, which is more expensive; it's not even debatable, period.

Yeah, it took them into an uncomfortable spot profit-wise; in fact they were flamed for trying to downplay GDDR5. DDR3 was dropped from GPUs because it is slower and doesn't have the speed of GDDR5. See, you apparently started gaming yesterday: DDR3 was part of GPUs for years, there were GPUs with DDR3 before the Xbox 360 even launched, and GPU makers dropped it because GDDR5 was superior.

No, it doesn't make any sense; you are a complete moron and a maximum-level MS suck-up, and you lack the intelligence to debate this with me.hahaha

lol, now DDR3 is better than GDDR5 because MS says so..hahaha

@Tighaman said:

@daious: I didnt get my info from there I got that diagram from there. I was going by old vg leaks and microsoft own pdf. that showed two gpu I was never talking about mrX or his insiders.

Everything I say I get from sony or MS papers and everything yall say you get from PR I would believe them faster than any blog or post from you.

Sound like you really mad come up with something technical reason why that 140gbs is better 200+gbs of BW.

the only console going SSAA

the first console doing Clustered Forward+ rendering

only console doing hundreds of NPCs on screen

ALL THEM THINGS THAT SO CALL SHOW YOUR POWER 1080p, 4xMSAA, LOCKED 30fps OPEN WORLD REAL TIME REFLECTIONS Weather NOW IN ONE GAME lol no ps4 game is even close and the one that are doing those things I mention above none of them are even 1080p lol I will take the weaker console.

Ask Nvidia how their 660 Ti with 144GB/s of bandwidth actually beats the 7870, which has 153GB/s. But wait, why does it also beat the 7950 in many tests, when the 7950 has 240GB/s, almost 100GB/s more than the 660 Ti?

Bandwidth without power means total shit.

Ask yourself why the Xbox has been getting the short end of the stick since launch when it supposedly has 200GB/s...lol

The only console doing SSAA? Do you even know what SSAA is?

That is supersampling: basically you take a game, render it at a higher resolution than your target resolution, and then downsample it to your target resolution.

And the PS4 already does that..

""Despite the change in art style, the basic rendering set-up remains unchanged from Lego The Movie. The PS4 game offers up a 1920x1280 image vertically super-sampled down to 1080p, providing extra anti-aliasing in the process, while Xbox One operates natively in 1080p and PC can do that and more.""

http://www.eurogamer.net/articles/digitalfoundry-2014-lego-the-hobbit-face-off

Both Lego The Hobbit and Lego The Movie are supersampled on PS4 from 1920x1280... The Xbox One versions are plain 1080p, not supersampled..lol

But but but the Xbox One is the only console going SSAA.

Bag that one alongside the one about forward rendering I also proved wrong..hahaha
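Supersampling as described above (render big, average down) is simple enough to sketch in a few lines. This toy example does uniform 2x2 SSAA on a made-up grayscale framebuffer, rather than the fractional vertical scaling the Lego titles use, but the principle is the same: each output pixel averages several rendered samples, which softens aliased edges.

```python
# 2x2 supersampling sketch: render at twice the target resolution,
# then average each 2x2 block down to one output pixel.
def downsample_2x(img):
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w // 2)]
            for y in range(h // 2)]

# A hard black/white edge rendered at 2x resolution...
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
# ...becomes a softened (anti-aliased) edge at the target resolution.
print(downsample_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 value is the anti-aliasing: a pixel half-covered by the edge comes out gray instead of snapping to black or white.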

The only console? Last time I checked, Ubisoft had to pull back the PS4 version so as not to shame the Xbox One again..haha The Xbox One has the same CPU as the PS4, so if the PS4 can't run something CPU-wise, the Xbox One can't either, and the only CPU test we have gives the edge to the PS4. lol

It's a freaking racing game; the same game on PS4 would run faster. FH2 doesn't look even close to DC. DC may suck gameplay-wise, but graphics-wise it mops the floor with FH2.

@StormyJoe said:

@tormentos: Umm... if less resources are needed to do some tasks, those freed resources can be allocated to things like rendering. You didn't disprove anything. In fact, you actually strengthened my stance. Thank you?

Really? What stance is that?

Because I stand by my argument: DX12 will do shit for the Xbox One, because those gains are already there.


#125  Edited By Daious
Member since 2013 • 2315 Posts

@Tighaman said:

@daious: you can say the PS4 is better when games use old or deferred rendering engines, but you can also say there's not one game on the PS4 that isn't on the Xbox showing that power, and you still couldn't give me a technical reason. Last gen, when you cowboys were getting killed by the 360 multiplats, it was "but the exclusives show the power"; now that your exclusives are sucking, the power is in the multiplats, lol. Xbox is not built for deferred rendering unless it's tiled deferred, and no multiplat studio is going to change their code to fit one console. That's what I accept and that's cool with me. I've got a Game Gear, SNES, PS2, PS4, Xbox, 360, One, a PC and a couple of arcade machines; calling me a fanboy of one company is special.

You comparing this to last gen proves that you have no idea what you are talking about.

Last gen, the systems were night-and-day different.

The PS4 and Xbox One are the closest things to PCs we've seen since the original Xbox, and the closest to each other in architecture as well. It is a completely different story than last gen.

You have only proved that you know next to nothing when it comes to comparing hardware. You don't understand that you can't compare clock rates across different chips. You don't understand that bandwidth means nothing without power. You don't understand that GPUs with lower bandwidth have outperformed ones with higher. You claim ownage with AA techniques you don't understand.

Everything you clearly want in a console is in a DX12-capable PC, but you can't see that because of your fanboy goggles. If you really care about SSAA, or 4K for that matter, go PC. Stop talking up the weak and dated hardware found in the Xbox One and PS4.


#126 04dcarraher
Member since 2004 • 23829 Posts
@daious said:

@Tighaman said:

@daious: please, on a technical level, how is it not decent? Give me a technical reason. You're still talking about raw numbers, not how they pertain to each system. More numbers don't always mean better. If you put a B+ GPU with a B- CPU and B memory you still end up with a B average, but you can have a B GPU, B CPU and B+ memory and end up with a better average.

PS4 CPU 1.6GHz, bandwidth shared with the GPU

Xbox One CPU 1.75GHz, has its own bandwidth

PS4 GPU 800MHz, 1.84 TFLOPS, ~140GB/s bandwidth shared with the CPU, game audio and the HDD, plus compressing and decompressing data

32 ROPs

18 CUs

Xbox One GPU 853MHz, 1.31 TFLOPS, 150GB/s+ bandwidth

SHAPE has its own bandwidth and ~400 GFLOPS (in-game audio)

Has its own high-quality compression and decompression with its own bandwidth

16 ROPs

12 CUs

So break that down for me: how is the PS4 better?

PS4 is better because every single tech review says so. Every single one. The GPU is stronger. Benchmarks on the CPUs seem to show that, despite being different, performance is comparable (although some state the PS4's is better).

You are living with fanboy goggles on. The majority of multiplatform games are technically superior on PS4. There are way more 1080p games on the PS4 than on the Xbox One. If the Xbox One were stronger, we would see it in the real world. We don't. DX12 won't change that. Nothing will, because it's a weaker system.

I mean, give it up, man.

Are you seriously comparing CPU/GPU clock speeds across two different components? Is that a joke? Clock speeds are only comparable if they are the same chip. They aren't. The fact that the Xbox One runs at a higher clock and is still only equal to a lower-clocked chip shows that it's weaker.

What is up for debate is RAM... woo... huge game changer...

Both consoles have similar CPUs based on the same Jaguar architecture, which means their processing abilities are virtually the same. 1.75GHz vs 1.6GHz just does not mean a whole lot given how weak the architecture is to begin with, and both only have 6 cores available for games, again putting them in the same ballpark.

The PS4's GPU is roughly 30% faster overall, and that gives it the edge over the X1 in graphics performance. The bandwidth-saving methods and tricks used to speed things up on the X1 will help, but the sheer difference in the GPUs' processing abilities will not allow it to catch up. DirectX 12, again, will not be a game changer; its primary goal is to lower the CPU cost of the API compared to before. The freed CPU resources can then go to other tasks, which can mean anything from a slightly smoother experience, to a few fps gained, to more CPU-intensive additions to games.

Memory is a mixed bag, since we have GDDR5 vs DDR3 + ESRAM. The PS4 has 256-bit GDDR5, which helps the GPU's buffers but does nothing for the CPU side and can add latency on certain tasks. The 256-bit DDR3 limits the X1 GPU's buffers while keeping up with the CPU with no ill effects. The 32MB of ESRAM is a memory-bus patch for the GPU: it can read/write data up to 4x faster than the 256-bit GDDR5, quickly moving data on and off, which makes it a good back buffer for high-bandwidth work like shadow and light maps. As an actual VRAM buffer, though, it can't serve, because of the low amount of memory. So for developers, the PS4's straightforward, direct access to fast memory will be less time-consuming and easier to use.
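The bus figures above turn into peak bandwidth with the usual arithmetic: bus width in bytes times the effective transfer rate. A quick sketch with the launch-era numbers (the ESRAM figure is noted as a comment because its effective rate depends on the read/write mix):

```python
# Peak theoretical bandwidth = (bus width in bytes) * (transfers per second).
def peak_gb_s(bus_bits: int, mt_per_s: float) -> float:
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

ps4_gddr5 = peak_gb_s(256, 5500)  # 256-bit GDDR5 at 5.5 GT/s -> 176 GB/s
xb1_ddr3  = peak_gb_s(256, 2133)  # 256-bit DDR3-2133      -> ~68 GB/s
print(round(ps4_gddr5), round(xb1_ddr3))  # 176 68
# The X1's 32 MB ESRAM adds a separate ~109-204 GB/s pool, but only
# for whatever render targets actually fit in 32 MB.
```

That is why the argument always comes back to what fits in the 32 MB: the main DDR3 pool alone is well under half the PS4's peak.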


#127  Edited By Tighaman
Member since 2006 • 1038 Posts

@tormentos: Power without bandwidth don't mean shit, so what are you saying? And again, the GPUs you mention are not even close to the same clock, and they don't share memory or bandwidth with the CPU the way the PS4 does, so that comparison doesn't reflect how the whole console works.

It's getting the short end because the PS4's setup is better suited for deferred rendering and older engines; that's the only flaw the Xbox has. Ryse was already doing SSAA, so what's your point? None of those things I mentioned above apply to any PS4 exclusive; are you telling me that 3rd-party devs are more competent than the 1st party and CERNY? lol

Killzone had to cut AA and AI and drop resolution to achieve 60fps in multiplayer, and it was more like 45fps most of the time.

Forza 5 used the cloud for AI, didn't need to drop resolution, and still got 60fps LOCKED.

If Unity were exclusive to Xbox, the CPU would get help from the cloud with AI and NPCs. How would the PS4's CPU get help with AI and NPCs? Ooohhh, that's right, it would have to drop resolution. Let me know when you see an open-world 1080p 4xMSAA locked-framerate game on your system.

The only heavy-compute game coming is The Order, and you see its resolution is 800p (no CPU or BW, so use black bars lol), and I'm certain it will be an unlocked framerate.


#128 StormyJoe
Member since 2011 • 7806 Posts

@tormentos: Didn't the article you linked say that resources for certain tasks would be freed up with DX12? If that's the case, those resources could be passed to rendering. Unless I misread something in your post?


#129  Edited By tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@tormentos: Power without Bandwidth dont mean shit so what is you saying and again with them GPUs you mention are not even close to the same clock or sharing the same memory or Bandwidth with the CPU as the ps4 does so that comparison does not pertain how the whole console works.

Its getting the short end because ps4 set up is better suited for deffered rendering and older engines thats the only flaw the xbox has. Already was doing SSAA in Ryse so whats your point. None of those things I mention above apply to any ps4 exclusive are you telling me that 3rd party devs are more competent than the 1st party and the CERNY lol

Killzone had to cut out AA, AI, and dropped resolution to achieve 60fps in multiplayer and it was more like 45fps most of the time

forza5 used the cloud for AI didnt need to drop resolution and still got 60fps LOCKED

If unity was exclusive for xbox the CPU would be helped by the cloud with AI and NPCs how would the ps4 CPU help with AI and NPCs ooohhhh thats right it would have to drop resolution. Let me know when you see an Open World 1080p 4xmsaa locked framerate game on your system.

the only heavy compute game coming is the Order and you see its resolution is 800p (no CPU or BW so use blackbars lol) and Im for certain it will be unlocked framerate.

No, it means you have a 5-lane race track but your car is a Geo Metro and the rest of the cars around you are faster.

No, it's getting the short end of the stick because it has 768 stream processors vs the PS4's 1152, as simple as that. Just like the R7 265 beats the R7 250X, the same happens with the PS4.

There is no secret sauce; the Xbox One is weak sauce vs the PS4, and no matter whether the game is forward or deferred, it will perform better on PS4.

Wait, you claim the XBO is the only console going SSAA, I prove you wrong, and you ask me what my point is?

Ryse doing SSAA? lol, no it doesn't; it does MSAA tx1 with MLAA, not SSAA that I know of..

http://www.eurogamer.net/articles/digitalfoundry-vs-ryse-son-of-rome

You are an idiot; the fact that the PS4 is doing SSAA owned you hard. Exclusive or 3rd party, the PS4 does it. What first-party game does it on Xbox One? Ryse doesn't do it, and it's not a first-party game either.

Yeah, just like Titanfall had to drop to 792p and still had drops into the 30s, even though it doesn't look anywhere close to Killzone SF.

NO, Forza doesn't use the cloud for AI; the game works offline, you don't need to be online to play Forza 5, which means no cloud AI..hahaha

Resolution has shit to do with the CPU, ass, and "if my grandmother hadn't died she would still be alive." There are plenty of games made for the Xbox that don't use the cloud; the only game really using it was Titanfall, and it was 792p and looked like crap.

Let me know when the Xbox One does a shooter with 4xMSAA and open world; a fu**ing racing game is a joke..hahaha

The fact that you want to cling to this stupid, pathetic argument that the Xbox One is the only console doing 1080p 4xMSAA is a joke and shows how desperate you are. 4xMSAA may be the only impressive thing about FH2, which looks last-gen compared to Driveclub..lol

See, you are too dumb to understand that the CPU has shit to do with GPU resolution. If you are CPU-bottlenecked at 1080p, dropping the resolution to 900p will still leave you bottlenecked, because to fix the bottleneck you need either a stronger CPU or lower CPU usage, and you don't get that by lowering resolution, ass, because resolution is handled by the GPU.
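That bottleneck argument can be modeled crudely: frame time is roughly the max of the CPU's per-frame work (which doesn't depend on resolution) and the GPU's work (which scales with pixel count), so dropping resolution only helps when the GPU is the limiter. A toy model with made-up numbers:

```python
# Toy frame-time model: CPU work is resolution-independent,
# GPU work scales with the number of pixels shaded.
def frame_ms(cpu_ms: float, gpu_ms_per_mpix: float, width: int, height: int) -> float:
    gpu_ms = gpu_ms_per_mpix * (width * height / 1e6)
    return max(cpu_ms, gpu_ms)

# GPU-bound: dropping 1080p -> 900p helps.
print(frame_ms(20, 18, 1920, 1080))  # ~37.3 ms, GPU-limited
print(frame_ms(20, 18, 1600, 900))   # ~25.9 ms
# CPU-bound: the same resolution drop changes nothing.
print(frame_ms(40, 18, 1920, 1080))  # 40 ms
print(frame_ms(40, 18, 1600, 900))   # still 40 ms
```

In the CPU-bound case the only fixes are a faster CPU or less CPU work per frame (fewer NPCs, simpler AI, cheaper draw submission), which is exactly the dispute here.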

@04dcarraher said:

Both consoles have similar cpu's, based on the same jaguar architecture which means that their processing abilities are virtually the same. 1.7 ghz vs 1.6ghz just does not mean a whole lot with how weak the architecture is to begin with. and both only have 6 cores for gaming again making them both playing in the same ballpark.

The PS4's gpu is roughly around 30% faster overall. The PS4's gpu gives it the edge over the X1 in graphics performance. The methods in saving bandwidth, and tricks used to speed things up for X1 will help but the the sheer difference in the gpu's processing abilities will not allow it to catch up. Direct x 12 again will not be a game changer, its primary goal is to lower the use of cpu resources then before. Which this will allow the freed cpu resources to be put into the other tasks which will allow anything from slightly smoother experience, to a few fps gain in a game then before, to adding more cpu intensive additions to games.

The memory usage is a mixed bag since we have GDDR5 vs DDR3 +esram. We know that the PS4 has 256bit GDDR5 which helps gpu buffer however does not do anything for the cpu side and can add latency with certain tasks. Now with 256bit DDR3, It will limit the gpu's buffer While able to keep up with the cpu without no ill effects. Now with the 32mb esram its a memory bus patch for the gpu, able to read/write data upto 4x faster then 256bit GDDR5 it being able the quickly put data on and off making it a good back buffer to do the high bandwidth things like shadow and light maps. However as an actual vram buffer it cant do because of low amount of memory. So for developers the PS4 straight up and direct access to fast memory will be less time consuming and easier to use.

DDR3 has higher latency than DDR2; the speed benefit outweighs the con.

But it was posted here by user BTK2 several times that Hynix GDDR5 has more or less the same latency as DDR3, though DDR3 still lacks the speed. It doesn't help the Xbox One's case either that it has a cumbersome memory system.

The only test we have so far gives the edge to the PS4 even with the Xbox One's clock-speed advantage, which I think is due to the Xbox One's memory system and how the CPU can't see the data while it is in ESRAM, which means it isn't HSA in the same way the PS4 is.

@StormyJoe said:

@tormentos: Didn't the article you linked say that resources for certain tasks would be freed up with DX12? If that's the case, those resources could be passed to rendering. Unless, I misread something in your post?

Oh, that is what is claimed, but it was also claimed that performance would double, and a demo was shown on PC doing so, then piggy-backed onto the Xbox when it wasn't true.

You people would know DX12 if you actually understood what it is: it is MS's late entry, once more, into what other companies have already been doing.

PRT? Done.

Bundles? Done.

Lower CPU overhead? Done (even on Xbox 360).

This is why DX12 will do nothing. What I am sure it will do is ease development of cross-platform games between PC and Xbox One; I don't know if that will be enough this time to help the Xbox One's case. Last gen it was, big time.
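The "lower CPU overhead" point is about per-draw-call driver cost. A toy model (all numbers made up for illustration) shows why a thinner API only buys frame time back when draw submission, not the GPU, is the limit:

```python
# Toy model: CPU submission time = draw calls * per-call driver overhead.
# Cutting the per-call cost (the DX12 pitch) shrinks this number,
# but the GPU's own render time is untouched.
def cpu_submit_ms(draw_calls: int, overhead_us: float) -> float:
    return draw_calls * overhead_us / 1000

old_api = cpu_submit_ms(5000, 6)  # 30 ms just issuing draws
new_api = cpu_submit_ms(5000, 2)  # 10 ms with a thinner driver layer
print(old_api, new_api)  # 30.0 10.0
```

If a console's close-to-metal API already keeps per-call overhead low, as tormentos argues, the savings shown above have largely been banked already.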


#130 Tighaman
Member since 2006 • 1038 Posts

@tormentos: Again, you're so stupid: you're still trying to compare PC graphics cards that don't share memory and bandwidth the way the PS4 does, so it's not the same, dummy. lol

The PDF on Ryse says otherwise: SSAA in the pre-rendered and real-time cutscenes.

Your CPU has plenty to do with resolution; stop listening to GAF and sideline devs, any real dev would tell you different. If your CPU lacks the cycles, it can affect your resolution.

Drivatars are AI and they're the cloud; you're mad because your console is too weak to have a NOC.

Titanfall is one of the best games this year, no matter the resolution.

DC is garbage: weak IQ, poor AA, linear tracks, rubberbanding, lacking content, poor online servers, pretty RAIN CLOUDS with no rain, poor AF, an incomplete game. Matter of fact, you keep comparing those games when really there's no comparison.

You can't say 1080p, 4xMSAA, locked 30, CLUSTERED FORWARD+ is a joke until your 40% more powerful console does it.

#131 Edited By deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

I don't know why this is a debate. Clearly there will be some gains and this is confirmed by an actual Dev who has access to DX12. This is not some dev speculating. Now why won't someone accept my bet....

DirectX 12 could double perf for XBox One games says Dev Using DX12

@XX__MX it's not literally (it's software, not hardware) but yes, dx12 games will likely be more than 2x as fast.

Now I would love to hear why he is wrong and you guys are right.

#132 Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:


DDR3 has higher latency than DDR2; the speed benefit outweighs the con.

But it was posted here by user btk2k2 several times that GDDR5 from Hynix has more or less the same latency as DDR3. DDR3 still lacks the speed, and it doesn't help the Xbox One's case that it has a cumbersome memory system.

The only tests we have so far give the edge to the PS4 even with the clock-speed advantage on the Xbox One, which I think is due to the Xbox One's memory system and how the CPU can't see data while it sits in ESRAM, which means it isn't HSA in the same way the PS4 is.


And DDR1 has lower latency than DDR2; that's a poor attempt at making an argument.

GDDR5 latency is higher than DDR3's. No ifs, ands, or buts about it. The GDDR5 memory controllers cause the latency. And like I said, it depends on the tasks the CPU is doing. Latency isn't much of an issue for GPUs, since their parallel nature allows them to move on to other work when latency causes a pause in the workload. CPUs handle things in a linear fashion, making GDDR5 a slight problem depending on the task at hand. The CPU cache does alleviate some of the GDDR5 latency issues, but it is overall still worse for CPU tasks than DDR3. What is it with you and your fetish for the PS4 and HSA? Because the only thing you're correct about is that the CPU does not have access to the ESRAM, which is a whole 32MB; the X1 has a different implementation of HSA, and the CPU and GPU still see the rest of the memory pool.
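The parallel-versus-linear point above can be put in numbers: a GPU keeps many independent requests in flight and so mostly hides extra latency, while a serial CPU pays each stall in full. A toy model with invented latencies (not real DDR3/GDDR5 figures):

```python
# Toy model: total time for N dependent memory accesses (CPU-like,
# serial) vs N independent accesses that can overlap (GPU-like).
# Latency values are illustrative only.

def serial_time_ns(accesses: int, latency_ns: float) -> float:
    """Each access waits for the previous one: latency adds up in full."""
    return accesses * latency_ns

def overlapped_time_ns(accesses: int, latency_ns: float,
                       in_flight: int) -> float:
    """Up to `in_flight` accesses overlap, hiding most of the latency."""
    waves = -(-accesses // in_flight)  # ceiling division
    return waves * latency_ns

# A serial workload feels every extra nanosecond of latency...
print(serial_time_ns(1000, 10), serial_time_ns(1000, 12))
# ...while a highly parallel one barely notices the difference.
print(overlapped_time_ns(1000, 10, 64), overlapped_time_ns(1000, 12, 64))
```

With 64 requests in flight, a 20% latency increase costs the "GPU-like" workload only a few cycles in total, which is why the same GDDR5 latency matters much more to the CPU side of the debate.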

#133 slimdogmilionar
Member since 2014 • 1343 Posts

@tormentos said:

That is talking about software, you idiot, not hardware. The Xbox ran an Intel CPU with an Nvidia GPU; neither was modified. In fact the Xbox CPU could even be swapped for a faster PC one.

And lol, Wikipedia, which even you can edit..hahaha

So no, the Xbox wasn't built around DX; in its first stage it was a PC with Windows, which has DX on it. The hardware it used was already made and on the market..hahahaha

You are the moron saying the Xbox was built around DX. If you say the Xbox was built around DX, that means its components were made with DX as the base, which isn't true; stop owning yourself. The whole argument comes up because you want to claim the Xbox One is built around DX12 when it's not..hahaha The Xbox wasn't, and the 360 wasn't either; none of the hardware was specially made for DX, so no.

Everything on Xbox One also has to be placed in DDR3; why the fu** do you think it has 8GB of memory, for show?

See, this is the problem with your sad argument: you quote things out of context. What Rebellion was talking about in the part you quoted was ESRAM, not the Xbox One's tools and software.

"I think eSRAM is easy to use. The only problem is…Part of the problem is that it's just a little bit too small to output 1080p within that size," he said. "It's such a small size within there that we can't do everything in 1080p with that little buffer of super-fast RAM."

From your own link ^^..ahahah ESRAM is easy to use; the problem is that it is too SMALL, which the Metro developer also claims.

So yeah, the Xbox One is easier than the PS4 because of DX, everyone knows this; the thing is that the PS4 is not as hard as the PS3 and PS2 were, it is more like the PS1, which was also very easy to code for.

If the CPU and GPU were working as fast as possible, the XBO GPU would not be clocked at 853MHz, since the 7790 is clocked at 1,027MHz..lol

Dude, MS doesn't pay even half of what you pay for DDR3; they buy in bulk and get mass-market pricing that is far below retail. But hey, have you tried to buy GDDR5?

Everyone knows DDR3 is dirt cheap compared to GDDR5, which is more expensive; it's not even debatable, period.

Yeah, it took them into an uncomfortable spot profit-wise; in fact they were flamed for trying to downplay GDDR5. DDR3 was dropped from GPUs because it is slower and doesn't have GDDR5's speed. See, you apparently started gaming yesterday: DDR3 was part of GPUs for years, there were GPUs with DDR3 before the Xbox 360 even launched, and GPU makers dropped it because GDDR5 was superior.

No, it doesn't make any sense; you are a complete moron and a maximum-level MS suck-up, and you lack the intelligence to debate this with me.hahaha

lol, now DDR3 is better than GDDR5 because MS says so..hahaha

@Tighaman said:

@daious: I didn't get my info from there, I only got that diagram from there. I was going by old VGLeaks and Microsoft's own PDF that showed two GPUs; I was never talking about MrX or his insiders.

Everything I say I get from Sony or MS papers, and everything y'all say you get from PR; I would believe them faster than any blog or post from you.

Sounds like you're really mad; come up with a technical reason why 140GB/s is better than 200+GB/s of BW.

The only console doing SSAA.

The first console doing Clustered Forward+ rendering.

The only console doing hundreds of NPCs on screen.

ALL THOSE THINGS THAT SUPPOSEDLY SHOW YOUR POWER, 1080p, 4xMSAA, LOCKED 30fps, OPEN WORLD, REAL-TIME REFLECTIONS, weather, NOW IN ONE GAME. lol, no PS4 game is even close, and of the games doing the things I mentioned above, none are even 1080p. lol, I will take the weaker console.

Bro, you can't prove anything you are arguing; do I have to go and pull links like I did last time to get you to shut up? Why can't you just be happy knowing the PS4 has more powerful hardware, instead of making yourself look stupid trying to nitpick things you have no knowledge of?

We already know that DX 12 will have a fairly strong impact on PC gaming when it launches late next year, but will it have the same technical impact on the Xbox One? Interestingly, Engel revealed that, “The Xbox One already has an API which is similar to DirectX 12. So Microsoft implemented a driver that is similar to DirectX 12 already on the Xbox One.” Is it a straight yes/no answer? No. But it does back up what I, and many others, have stated multiple times in the past – that Microsoft built their console around an API that they weren’t ready to release. Was that their best decision?

Built their console around, Built their CONSOLE AROUND, BUILT THEIR CONSOLE AROUND, BUILT THEIR CONSOLE AROUND

ESRAM may be easy for some devs to use if they try to use it the same way as eDRAM on the 360, but in the article he did say that it's like a reverse of last gen: the PS was harder to dev for then, now the Xbox is the easy one to dev for. You claim DX will do nothing for the Xbox, but then say it's easier to develop for the Xbox because of DirectX.

Bro, are you still going with "everything on Xbox has to be placed in DDR3"? Only things that don't favor bandwidth will be placed in DDR3. Remember how I proved to you before that devs can get data out of ESRAM if they want to, without going through DDR3?

Bro, yet another flawed argument from you: neither the Xbox nor the PS4 GPU is clocked as high as its PC equivalent. Power consumption, you idiot. "As fast as possible" means just that; in this case 853MHz is as fast as possible.

I would not buy GDDR5 for my PC; that would be a dumb-ass move. I'll stick with 1866 DDR3 and the GDDR5 on my GPU. I don't think you will find one hermit on these threads who would buy GDDR5 as system RAM.

No, you idiot, it took them into an uncomfortable spot based on price and power consumption. DDR3 was on board GPUs, I never said it wasn't; then came GDDR5, which was better, and now we have stacked memory, which is better than GDDR5. As time goes by, technology advances, unlike your thinking patterns.

No man, DDR3 is better for the CPU; it's common sense.

At this point you are just fishing for a topic to be right on. If you want to be right, then prove me wrong. Give me a link saying the Xbox wasn't built around DX12. Give me a link saying the Xbox will see no improvements from DX12, even though you already posted a chart showing otherwise. Give me a link saying the Xbox does not have a custom architecture. Prove to me GDDR5 is better for any CPU. And while you're at it, look up the definition of a frame buffer.

Stop being butthurt that the Xbox has something the PS4 might not; the PS4 still has the more powerful hardware, so why are you so insecure about DX? Why are you so insecure about the Xbox, period? Do you want DirectX 12 on PS4? Do you want Forza, DR3, Titanfall, Sunset Overdrive, Halo on PS4? Do you want XBL on PS4? Maybe PS cloud would make you happy?

At any rate, until you provide the links I asked for above I won't be going back and forth with you anymore. Like I said before, you like to post links to prove your point, so stop talking out your ass and find some.

#134 tormentos
Member since 2003 • 33784 Posts

@Tighaman said:

@tormentos: Again, you're still trying to compare PC graphics cards, which don't share memory and bandwidth the way the PS4 does, so it's not the same, dummy. lol

The PDF on Ryse says otherwise: SSAA in the pre-rendered and real-time cutscenes.

Your CPU has plenty to do with resolution; stop listening to GAF and sideline devs, any real dev would tell you different. If your CPU lacks the cycles, it can affect your resolution.

Drivatars are AI, and it's the cloud. You're mad because your console is too weak to have a NOC.

Titanfall is one of the best games this year, no matter the resolution.

DC is garbage: weak IQ, poor AA, linear tracks, rubberbanding, lacking content, poor online servers, pretty RAIN CLOUDS with no rain, poor AF, and an incomplete game. Matter of fact, you keep comparing those games when really there's no comparison.

You can't say 1080p, 4xMSAA, locked 30fps CLUSTERED FORWARD+ is a joke until your 40% more powerful console does it.

No I am not, and the 7870 doesn't have 176GB/s, you bombastic moron; it has 153GB/s. That means 23GB/s can still be used for the CPU without actually hurting what a 7870 would have on its own. Then comes the super Onion bus, which is another 20GB/s (10/10).

Then add to this that the PS4 APU is more efficient at handling data than your average GPU and CPU on PC:

One barrier to this in a traditional PC hardware environment, he said, is communication between the CPU, GPU, and RAM. The PS4 architecture is designed to address that problem.

"A typical PC GPU has two buses," said Cerny. "There’s a bus the GPU uses to access VRAM, and there is a second bus that goes over the PCI Express that the GPU uses to access system memory. But whichever bus is used, the internal caches of the GPU become a significant barrier to CPU/GPU communication -- any time the GPU wants to read information the CPU wrote, or the GPU wants to write information so that the CPU can see it, time-consuming flushes of the GPU internal caches are required."

The three "major modifications" Sony did to the architecture to support this vision are as follows, in Cerny's words:

  • "First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches. As a result, if the data that's being passed back and forth between CPU and GPU is small, you don't have issues with synchronization between them anymore. And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today’s terms -- it’s larger than the PCIe on most PCs!
  • "Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines, we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses. This innovation allows compute to use the GPU L2 cache and perform the required operations without significantly impacting the graphics operations going on at the same time -- in other words, it radically reduces the overhead of running compute and graphics together on the GPU."
  • Thirdly, said Cerny, "The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands -- the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system."
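The "volatile bit" Cerny describes amounts to selective invalidation: only cache lines tagged volatile are dropped at a compute sync point, leaving graphics data cached. A highly simplified sketch of that idea (the class and names are illustrative, not Sony's actual hardware):

```python
# Simplified model of a cache whose lines carry a 'volatile' tag.
# At a compute sync point, only volatile lines are invalidated,
# so graphics data cached alongside them survives untouched.

class TaggedCache:
    def __init__(self):
        self.lines = {}  # address -> (data, is_volatile)

    def write(self, addr, data, volatile=False):
        self.lines[addr] = (data, volatile)

    def invalidate_volatile(self):
        """Drop only lines marked volatile; keep everything else."""
        self.lines = {a: v for a, v in self.lines.items() if not v[1]}

cache = TaggedCache()
cache.write(0x100, "graphics texel")                 # graphics line
cache.write(0x200, "compute result", volatile=True)  # async compute line
cache.invalidate_volatile()                          # compute sync point
print(sorted(cache.lines))                           # only the graphics line remains
```

Without the tag, the whole cache would have to be flushed at every sync, which is exactly the overhead the quote says this modification avoids.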

The PS4 is more efficient, period, and doesn't have to flush or copy redundant data back and forth; I am sure this is one of the reasons the PS4 CPU actually outperformed the Xbox One in tests.

SSAA in cut scenes..hahahaaaaaaaaaaaaaaaa

No, the PS4 did it across the whole game, not just in things you have no control whatsoever over..hahahaha..

No, you freaking idiot, it doesn't. Run a video encoding program on your PC at 1080p, then change the resolution to 720p; the CPU usage will still be the same, because resolution has sh** to do with your CPU, it is your GPU that is in charge of that.

No, what the CPU will affect is the frame rate, not the resolution, you blind, biased fanboy.

Drivatar is not the game AI; that is a copy of your driving skills being emulated by the cloud...

Forza 5 doesn't require online to work, so yeah, it doesn't use the cloud for AI; otherwise the game, like Titanfall, would not work offline.

Oh please, it has an 86 on Meta; that is what Resistance 1 has, and it was tagged as garbage here..hahaha

Oh, did I tell you that you don't need an Xbox One to play it?

DC can be the worst game ever made, but it's >>>>>>>>>>>>>>>>>>>>>>>> FH2 graphically, and since the argument here is graphics, not quality, you lose again...

Yeah, that is like saying the PS4 is the only console with an SSAA game at 1920x1280; when DX12, the cloud, ESRAM and Tiled Resources get the Xbox One there, then you can talk...

Meanwhile the Xbox One is still weak sauce..lol

@ttboy said:

I don't know why this is a debate. Clearly there will be some gains and this is confirmed by an actual Dev who has access to DX12. This is not some dev speculating. Now why won't someone accept my bet....

DirectX 12 could double perf for XBox One games says Dev Using DX12

@XX__MX it's not literally (it's software, not hardware) but yes, dx12 games will likely be more than 2x as fast.

Now I would love to hear why he is wrong and you guys are right.

That is not a developer, that is a moron..

Brad Wardell @draginol · Apr 6

@misterxmedia @XX__MX nothing stopping sony from adopting mantle and then they'd get a huge perf boost.

Hahahaaaaaaaaaaaaaaaaaaaaaaaa................

Brad Wardell @draginol · Apr 6

@joe_GeoThor @misterxmedia @XX__MX basically the issue is the software was single core. Dx12 and mantle are multicore.

Now I know where you morons get those fu**ed up theories..hahaha

DX has been multithreaded for years....

Thanks to the proliferation of multi-core processors, DirectX 11 can now make true use of multi-threaded instructions. Instead of burying all of the DirectX calls in a single thread and thus hogging a single core, DirectX 11 can properly split work between the cores of the processor itself. Modern games have been surprisingly CPU-dependent, so this ability to take fuller advantage of modern processors can in turn help the graphics hardware perform.

http://www.notebookreview.com/news/what-directx-11-and-windows-7-means-for-gamers/
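The quoted DX11 behavior, several cores recording work that one thread then submits, can be mimicked structurally with plain threads (this is a shape-of-the-idea sketch, not real Direct3D deferred-context code):

```python
# Structural sketch of deferred command recording: worker threads each
# record their own command list in parallel; the main thread then
# executes the lists in order, like DX11 deferred contexts feeding
# the single immediate context.
import threading

def record_commands(chunk, out, index):
    # Each worker builds its command list independently (the parallel part).
    out[index] = [f"draw({obj})" for obj in chunk]

objects = [f"mesh{i}" for i in range(8)]
chunks = [objects[:4], objects[4:]]
lists = [None] * len(chunks)

threads = [threading.Thread(target=record_commands, args=(c, lists, i))
           for i, c in enumerate(chunks)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Submission stays single-threaded and ordered (the serial part).
submitted = [cmd for cl in lists for cmd in cl]
print(len(submitted))  # 8
```

The DX12/Mantle refinement is that the recording step carries far less per-call driver overhead, not that multithreading itself is new.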

Hahahaaaaaaaaaaaaaaa... Wrong: multicore CPUs have been supported for years. DX11 came out in 2009 with support for multicore CPUs; saying the software was single-core is a joke, it wasn't.

By the way, the PS4 doesn't need Mantle; Mantle emulates what the PS3 and Xbox 360 have had for years, lower CPU overhead, so yeah, the PS4 API works like Mantle.

So yeah, tdkmillsy, you are wrong again, and that isn't a developer, it's a blind moron; if he is a developer, he is the dumbest ever, or he was simply making PR for MS.

@04dcarraher said:

And DDR 1 has lower latency then DDR2 your poor attempt in trying to make an argument

GDDR5 latency is higher then DDR3. No if ands or butts about it. The GDDR5 memory controllers cause the latency..And like I said it depends on the tasks the cpu is doing. Latency isn't much of an issue with GPU's since their parallel nature allows them to move to other things when latency causes a pause in the workload it was doing. Cpu's handle things in a linear fashion making GDDR5 a slight problem depending on the tasks at hand. The CPU cache does levitate some of the GDDR5 latency issues but is overall still worse for cpu tasks over DDR3. What is you and your fetish with PS4 and HSA? because the only thing your correct aboutis that the cpu does not have access to the esram which is a whole 32mb, the X1 has a different implementation of HSA and the cpu and gpu still see the rest of the memory pool.

Seriously, you didn't catch my point? Really?

Because it was rather simple: DDR3 having higher latency than DDR2 didn't stop PC makers from moving to DDR3, because the speed benefit was actually worth it; if they had stayed with DDR2 they would have been bottlenecked.

No it doesn't; it has been posted here with links.

@btk2k2 said:

1) 2133 Mhz DDR3 is 1.5 - 1.65v or there abouts, different modules have different specs but if you look at newegg (or ebuyer if you are UK based) you can see the voltages there.

GDDR5 is about the same in terms of voltage but because it runs at a higher clock speed it does draw more power than DDR3 of the same capacity and voltage.

2) DDR3 is 64 bit/channel and GDDR5 is 32bit/channel. The memory controller on a CPU will obviously be configured for DDR3 but there is no reason that a CPU cannot use a GDDR5 configured memory controller, there is very little difference to be honest.

3) This is true but usually the limiting factor for bus width is not how small they can make the memory controller it is how small they can make the physical connections to the socket the GPU/CPU/APU sits in.

4) Not really. GDDR5 is not used in PC systems as standard memory because it is a) expensive and b) overkill. There are very few CPU bound processes that you use on a standard PC that would take advantage of GDDR5.

5) Timings and clock speed are related, hence my car analogy. If you have loose timings but a high clock speed the net effect is the same as tight timings and a slow clock speed.

6) True

7) According to Hynix from their specification documents the following are the timings.

Action   DDR3 2133N (timings in ns)   5.5Gbit GDDR5 (timings in ns)
tAA      13.09                        -
tRCD     13.09                        Multiple entries, 10 - 12
tRP      13.09                        12
tRAS     33                           28
tRC      46.09                        40

DDR3 document Here (download) pg 31

GDDR5 document Here (opens in browser) pg 131 - 135

As you can see in this case the GDDR5 actually has a lower latency than the DDR3. I do not know the exact timings used in the Xbox One or the PS4 but the latency is going to be pretty much the same in both consoles so neither one has an advantage when it comes to latency.
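The nanosecond figures in that table are just cycle counts divided by clock speed, which is why a faster clock with looser (bigger) timings can land at the same absolute latency. A quick check (CL14 is a common DDR3-2133 rating, assumed here for illustration):

```python
# Convert a DRAM timing from clock cycles to nanoseconds:
# ns = cycles / clock_MHz * 1000. A higher clock with a larger cycle
# count can give the same absolute latency as a slower, "tighter" part.

def timing_ns(cycles: int, clock_mhz: float) -> float:
    return cycles / clock_mhz * 1000

# DDR3-2133 runs its I/O clock at ~1066.67 MHz; CL14 is a common rating.
print(round(timing_ns(14, 1066.67), 2))  # ≈ 13.12 ns, close to the tAA above
```

This is the "car analogy" from point 5 in numeric form: loose timings at a high clock net out the same as tight timings at a low clock.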

Enjoy and you were on that thread...lol

#136 tormentos
Member since 2003 • 33784 Posts

@slimdogmilionar said:

Bro, you can't prove anything you are arguing; do I have to go and pull links like I did last time to get you to shut up? Why can't you just be happy knowing the PS4 has more powerful hardware, instead of making yourself look stupid trying to nitpick things you have no knowledge of?

We already know that DX 12 will have a fairly strong impact on PC gaming when it launches late next year, but will it have the same technical impact on the Xbox One? Interestingly, Engel revealed that, “The Xbox One already has an API which is similar to DirectX 12. So Microsoft implemented a driver that is similar to DirectX 12 already on the Xbox One.” Is it a straight yes/no answer? No. But it does back up what I, and many others, have stated multiple times in the past – that Microsoft built their console around an API that they weren’t ready to release. Was that their best decision?

Stop. You went from trying to quote MS to quoting a moron on a site called Xbox Daily; basically you owned yourself there, since for all intents and purposes that could be you.

You lost it, dude. Xbox Daily..hahahaaaaaaaaaaaaaaaaaaa

#137 Edited By Tighaman
Member since 2006 • 1038 Posts

@04dcarraher: That guy is dumb; he doesn't know anything about the AXI bridges and memory management units on the PS4. He's still talking about PC GPUs, which don't share the CPU's memory or bandwidth; most of the time it's DDR3 for the CPU and GDDR5 for the GPU.

#138 Edited By kuu2
Member since 2005 • 12063 Posts

@Animal-Mother said:

@kuu2 said:

@GrenadeLauncher: No Ignorance Launcher, a memory is what has happened to Sony's 2014 games lineup. Come the holidays The One proves which system is an actual console with games.

After 12 months of literally almost nothing?

"literally almost" ?????????

Come on....................

I understand you are not a Cow but man you are starting to sound like Tomato with posts like this.

Titanfall, FH2, SO, KIS2, Project Spark, and MCC are all on or coming out on The One. You have one so you can enjoy greatness and don't have to wait like the Cows you whisper to.

#139 Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Tighaman said:

@04dcarraher: That guy is dumb; he doesn't know anything about the AXI bridges and memory management units on the PS4. He's still talking about PC GPUs, which don't share the CPU's memory or bandwidth; most of the time it's DDR3 for the CPU and GDDR5 for the GPU.

And he posts irrelevant data on DDR3 and GDDR5 modules.... The memory controllers determine the final latency between the CPU and the memory. DDR3 uses a single 64-bit memory controller per channel, or 128-bit in dual channel, while GDDR5 uses multiple 32-bit controllers, each split into dual 16-bit halves for input and output, depending on the bus used. GDDR5 controllers nearly double the latency compared with a DDR3 controller, and GDDR5 on a 256-bit bus uses 8 memory controllers. Hell, Intel's Xeon Phi x86 coprocessor needs 30MB of L2 cache to overcome GDDR5's latency, and it uses a ring interconnect to keep a large buffer of data full at all times.
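Those controller widths are also where the headline bandwidth numbers in this thread come from: peak bandwidth is bus width in bytes times transfer rate, so a 256-bit bus at GDDR5's 5.5 GT/s gives the PS4's 176 GB/s, and the same width at DDR3-2133 rates gives roughly the Xbox One's 68 GB/s:

```python
# Peak memory bandwidth: (bus width in bytes) x (transfers per second).

def peak_bandwidth_gbs(bus_bits: int, transfer_rate_gts: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_bits / 8 * transfer_rate_gts

print(peak_bandwidth_gbs(256, 5.5))    # PS4 GDDR5: 176.0 GB/s
print(peak_bandwidth_gbs(256, 2.133))  # Xbox One DDR3: ~68.3 GB/s
```

The gap between those two numbers is exactly what the ESRAM's extra bandwidth is meant to paper over.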

#140 deactivated-62825bb2ccdb4
Member since 2003 • 666 Posts

@tormentos : So yeah, tdkmillsy, you are wrong again, and that isn't a developer, it's a blind moron; if he is a developer, he is the dumbest ever, or he was simply making PR for MS.

Bradley R. Wardell (born 24 June 1971) is an American businessman, programmer and author residing in Michigan. He is the founder, President, and CEO of Stardock, a software development and computer games company.

Wardell's specialty is the design and programming of artificial intelligence and game mechanics for turn-based strategy games.

When the OS/2 market collapsed, he guided Stardock to Windows, heading development of the PC game Entrepreneur (now The Corporate Machine) while coordinating the creation of WindowBlinds and other Object Desktop components.

Wardell designed Galactic Civilizations for Windows and its sequel, which became GameSpy's Game of the Year. He subsequently designed The Political Machine and Elemental, as well as two expansions to Galactic Civilizations II. In 2012 he was the producer of Elemental: Fallen Enchantress.

Wardell is credited with many different game projects during his career, either as a game designer or as an executive producer.

As designer:

  • Galactic Civilizations for OS/2 (1994)
  • Entrepreneur (1997)
  • The Corporate Machine (2001)
  • Galactic Civilizations (2003)
  • The Political Machine (2004)
  • Galactic Civilizations II (2006)
  • Elemental: War of Magic (2010)
#141 Edited By Animal-Mother
Member since 2003 • 27362 Posts

@kuu2 said:

@Animal-Mother said:

@kuu2 said:

@GrenadeLauncher: No Ignorance Launcher, a memory is what has happened to Sony's 2014 games lineup. Come the holidays The One proves which system is an actual console with games.

After 12 months of literally almost nothing?

"literally almost" ?????????

Come on....................

I understand you are not a Cow but man you are starting to sound like Tomato with posts like this.

Titanfall, FH2, SO, KIS2, Project Spark, and MCC are all on or coming out on The One. You have one so you can enjoy greatness and don't have to wait like the Cows you whisper to.

Yeah, Titanfall and FH2 are the only games out on that list. SSOD can still technically flop (though it most likely won't), and KIS2 and PS don't interest me one bit.

And I bought my One after Titanfall, so I got that on PC.

And I played all the Halos years ago.

But otherwise I've only bought one game for XB1 in 6 months, and that's D4...... I'm really lucky if I feel the need to turn it on once a month, in all seriousness.

For me it literally has been almost nothing.

#142 hehe101
Member since 2011 • 734 Posts

And yet the Xbox One's launch title Ryse still looks better than any console game, excluding PC. Shame, that.

#143 Animal-Mother
Member since 2003 • 27362 Posts

@hehe101 said:

And yet the Xbox One's launch title Ryse still looks better than any console game, excluding PC. Shame, that.

You can have the best looking game in the world... If it's not a fun game what's the point?

#144 hehe101
Member since 2011 • 734 Posts

@Animal-Mother said:

@hehe101 said:

And yet the Xbox One's launch title Ryse still looks better than any console game, excluding PC. Shame, that.

You can have the best looking game in the world... If it's not a fun game what's the point?

Exactly, I never said I liked Ryse; I'm just stating that DX12 isn't a big issue, as a launch title still looks better than its competitors.

#145 Animal-Mother
Member since 2003 • 27362 Posts

@hehe101 said:

@Animal-Mother said:

@hehe101 said:

And yet the Xbox One's launch title Ryse still looks better than any console game, excluding PC. Shame, that.

You can have the best looking game in the world... If it's not a fun game what's the point?

Exactly, I never said I liked Ryse; I'm just stating that DX12 isn't a big issue, as a launch title still looks better than its competitors.

Bingo

#146 Edited By tymeservesfate
Member since 2003 • 2230 Posts

@Heil68 said:

@tymeservesfate said:

@Heil68 said:

@tymeservesfate said:

@Heil68 said:

@GreySeal9 said:

Lems have handled having the weaker console in a pretty embarrassing fashion.

True that.

SSSSHHHHHHHHHHH

I even have an Xbone!!!111

SSHHHHH...lol

Won't you be my Neighbor and add BroknOath on Live so we can play on Heil Boulevard? AWWWW YEAAAHHH

#147 tdkmillsy
Member since 2003 • 5882 Posts

@tormentos : For the record, I don't have and never will have duplicate accounts. Seriously, what's the point? Is it too hard to accept that there are several people who simply don't agree with you, or who think you are a simple fanboy who prefers to write in these forums rather than actually play games?

#148 GrenadeLauncher
Member since 2004 • 6843 Posts

@ttboy said:

Google is your friend.

lmao a podcast designed to keep lemming peckers up.

Da cloud indeed.

#149 FoxbatAlpha
Member since 2009 • 10669 Posts

@GrenadeLauncher said:

@ttboy said:

Google is your friend.

lmao a podcast designed to keep lemming peckers up.

Da cloud indeed.

It was great of Phil to do that podcast. He even gave those guys some exclusive news! Just saying, it is great to have Phil on our side. Maybe he will come on P2M????? @Heil68, @TheEroica, @Animal-Mother.

#150 Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

@Tighaman said:

@04dcarraher: That guy is dumb; he doesn't know anything about the AXI bridges and memory management units on the PS4. He's still talking about PC GPUs, which don't share the CPU's memory or bandwidth; most of the time it's DDR3 for the CPU and GDDR5 for the GPU.

And he posts irrelevant data on DDR3 and GDDR5 modules.... The memory controllers determine the final latency between the CPU and the memory. DDR3 uses a single 64-bit memory controller per channel, or 128-bit in dual channel, while GDDR5 uses multiple 32-bit controllers, each split into dual 16-bit halves for input and output, depending on the bus used. GDDR5 controllers nearly double the latency compared with a DDR3 controller, and GDDR5 on a 256-bit bus uses 8 memory controllers. Hell, Intel's Xeon Phi x86 coprocessor needs 30MB of L2 cache to overcome GDDR5's latency, and it uses a ring interconnect to keep a large buffer of data full at all times.

Can you prove that the memory controllers on PS4 are the same on PC.? Because the memory on PS4 working in a completely different way to how it work on PC,for started it is just 1 fast one,not 2.

But hey do you have actual proof of programs running in GDDR5 to know latency is an issue and that the speed benefit doesn't counter the higher latency.?

Do you have them.? So you are going by what other people say it has higher latency period.

Do you know that agreeing with that moron you just piggy back riding make you a even bigger moron.?

From 10 things that dude post he is wrong in 11.

That bold part is stupid,almost all CPU have L2 cache Intel's Xeon isn't any different and the whole 30 MB come from the fact that each core has 512kb of cache which can't be share between cores,which mean is not 30 MB of cache is 512kb per core so 60 cores = 30MB ass.

The same kind of cache can be found on any CPU regardless of whether it uses DDR3 as memory; cache is basically a middleman that speeds up data access, and it does the same thing with DDR3 as it does with GDDR5.
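The arithmetic being argued above checks out either way; here is a minimal sketch (the 60-core and 512 KB-per-core figures come from the posts, and the 8x32-bit controller layout is the usual 256-bit GDDR5 configuration mentioned above, not a vendor spec I am citing):

```python
# Worked check of the numbers argued in the posts above.

KB = 1024
MB = 1024 * KB

# Xeon Phi: 512 KB of private L2 per core, not one shared 30 MB pool.
cores = 60
l2_per_core = 512 * KB
total_l2 = cores * l2_per_core
assert total_l2 == 30 * MB  # 60 * 512 KB = 30 MB in aggregate

# GDDR5 on a 256-bit bus: eight 32-bit memory controllers.
controllers = 8
width_per_controller = 32  # bits
assert controllers * width_per_controller == 256  # total bus width in bits
```

So both sides' numbers are internally consistent; the disagreement is only over whether the per-core cache should be described as one 30 MB pool.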

@ttboy said:

@tormentos : So yeah tdkmilksy, you are wrong again, and that isn't a developer, it's a blind moron; if he is a developer, he is the dumbest ever or was simply making PR for MS.

Bradley R. Wardell (born 24 June 1971) is an American businessman, programmer and author residing in Michigan. He is the founder, President, and CEO of Stardock, a software development and computer games company.

Wardell's specialty is the design and programming of artificial intelligence and game mechanics for turn-based strategy games.

When the OS/2 market collapsed, he guided Stardock to Windows, heading development of PC game Entrepreneur (now The Corporate Machine) while coordinating the creation of WindowBlinds and other Object Desktop components.

Wardell designed Galactic Civilizations for Windows and its sequel, which became GameSpy's Game of the Year. He subsequently designed The Political Machine and Elemental, as well as two expansions to Galactic Civilizations II. In 2012 he was the producer of Elemental: Fallen Enchantress.

Wardell is credited with many different game projects during his career, either as a game designer or as an executive producer.

As designer

  • Galactic Civilizations for OS/2 (1994)
  • Entrepreneur (1997)
  • The Corporate Machine (2001)
  • Galactic Civilizations (2003)
  • The Political Machine (2004)
  • Galactic Civilizations II (2006)
  • Elemental: War of Magic (2010)

Then he didn't know what he was talking about.

Sony doesn't need mantle.

Both the Xbox One and PlayStation 4 have APIs for accessing the GCN architecture directly, so Mantle in itself isn’t needed for that. The main advantage Mantle gives us is the ability to have console-like performance, particularly in batch performance, on the PC. At AMD’s recent developer summit, Oxide demonstrated a PC running at over 100,000 batches per frame. Before now, this type of performance on a PC was unheard of.

http://gamingbolt.com/ps4-xbox-one-dont-need-mantle-to-boost-visuals-api-access-to-gcn-architecture-already-available#3BYFRU3Kw80v9L97.99

The PS4 and the Xbox One don't need Mantle, and now DX12 is Mantle's competition, which isn't needed on the Xbox One either, because DX12 = the Xbox One and Xbox 360 API brought to PC. Those gains are there; I have been telling you people this for months.

And like I showed with my link, he is wrong about DX11 using a single core; that is a joke, DX has been multithreaded for years. So yeah, he was lying or making PR.
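To put the Oxide quote above in perspective, a quick back-of-the-envelope sketch shows the per-batch CPU budget that 100,000 batches per frame implies (the 60 fps target is my assumption; only the batch count comes from the quote):

```python
# Rough per-batch CPU submission budget implied by the Oxide demo figure.
# 60 fps is an assumed target; the batch count comes from the quote above.

batches_per_frame = 100_000
fps = 60

frame_time_us = 1_000_000 / fps                    # ~16667 microseconds per frame
budget_per_batch_us = frame_time_us / batches_per_frame

print(f"{frame_time_us:.0f} us per frame, "
      f"{budget_per_batch_us:.3f} us per batch")
# Roughly 0.167 microseconds of CPU time per batch, which is exactly
# why low-overhead APIs like Mantle and DX12 matter for batch counts.
```

That fraction-of-a-microsecond budget is the reason older high-overhead submission paths capped PCs at far lower batch counts.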

@Tighaman said:

@04dcarraher: That guy is dumb; he doesn't know anything about AXI bridges and memory management units, because on the PS4 he is still talking about a PC GPU that doesn't share the CPU's memory or bandwidth. Most of the time on PC it's DDR3 for the CPU and GDDR5 for the GPU.

You are the dumb one... but but but the Xbox One is the only console doing SSAA... hahaha.

The Xbox One can barely do 1080p on its own; it will do 1280p to supersample... hahaha. Oh yeah, I forgot, Ryse did it in cutscenes... hahaha.

@kuu2 said:

"literally almost" ?????????

Come on....................

I understand you are not a Cow but man you are starting to sound like Tomato with posts like this.

Titanfall, FH2, SO, KIS2, Project Spark, and MCC are all on or coming out on The One. You have one so you can enjoy greatness and don't have to wait like the Cows you whisper to.

Is there an exclusive on that list?

Oh yeah, I see it now; SO has been buried in that clusterfu** of multiplatforms that was hiding it.

Titanfall, FH2, Project Spark and MCC are all multiplatform, and KIS2 is a joke.

@hehe101 said:

And yet Xbox One's launch title Ryse still looks better than any console game, excluding PC. Shame, that.