DX12 will not solve the XBone's 1080p problems

#301 Edited by slimdogmilionar (472 posts) -

Aaaaaaaahahahahahaahahahaha, lemmings still clinging to that canard.

http://www.cloudgine.com/technology.html

Not just lemmings now; you can add Epic, Havok, and Nvidia to the list of people clinging to the cloud. But I guess these guys must just be dumb, and also using the cloud as a gimmick, according to you and @tormentos. Starting to look like you guys are gonna end up eating all that pooh you said about the cloud being fake.

@softwaregeek: I was referring to the quotes you posted earlier where a software engineer had said he expects the PS4 to plateau before the Xbox, I was asking for a link to the source of those comments.

As MS develops new development tools, it may not always be the case that the PS4 is easier to develop for. The PS4 has slightly better hardware, but that doesn't necessarily mean it is the strongest technologically; it means that as a standalone unit it is stronger. I can see where both consoles could plateau. However, the XB1 has an advantage here because incredibly complex calculations can be offloaded to the cloud, calculations that could easily overwhelm either console. After taking both consoles' strengths and weaknesses into consideration, I find it more likely that the PS4 will plateau before the XB1. Buuuut....that is speculation. It's speculation, however, that's based on my 15+ years of experience as a software engineer. I could be wrong. Probably not, though. At the end of the day, it doesn't matter much because both systems will have killer games. Halo. The Last of Us. Etc.

#302 Edited by tormentos (17069 posts) -

Oh wait... I have to look at DF to be able to see a difference?

LOL

Last gen.

This gen..lol

I would call you a spin doctor, but that would be a compliment, and you aren't successful at it, incapable of it even, mostly because you talk about things you don't know anything about. But that has been proven by a lot of members already. You don't even know basic AI, lol

Keep spinning bro..

There is no parity unless a developer holds the PS4 back on purpose, period.

The PS3 has more tflops than the PS4. Those numbers were confirmed by various scientists. You haven't given me a single source that says otherwise. That's why I said that tflops don't always give the right picture. You're comparing the PS3's RSX with a desktop GPU, but that is completely wrong because the PS3 is basically a CPU mixed with a GPU. These are not completely separate parts like on a PC.

So while tflops might be very good for measuring desktop graphics cards, for consoles they don't always give the right picture, because consoles are different chips working together. A desktop has this too, but everything is a lot more separate. Still, your comparison of the 7790 and 7850 proves my point: the tflops on the 7790 are higher than on the 7850, but it's clear as day that the 7850 is the stronger card.
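The paper-FLOPS point is easy to make concrete. A minimal sketch in Python; the shader counts and stock clocks below are the commonly listed desktop specs for these cards (assumptions, not figures from this thread):

```python
# Peak single-precision throughput: shaders x 2 ops (fused multiply-add) x clock.
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

hd7790 = peak_tflops(896, 1000)   # ~1.79 TF at the commonly listed 1000 MHz
hd7850 = peak_tflops(1024, 860)   # ~1.76 TF at the commonly listed 860 MHz

# The 7790 "wins" on paper, yet the wider 7850 wins in benchmarks.
print(f"HD 7790: {hd7790:.2f} TF, HD 7850: {hd7850:.2f} TF")
```

Peak FLOPS ignores ROPs, memory bandwidth, and cache behavior, which is exactly why the bigger number can lose.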

Your comparison between the 7950 and 660 Ti is wrong because those benchies were done with the HD 7950 at stock clock, and that was basically an underclocked card. It was also before the Never Settle driver updates. A 7950 easily matches a 660 Ti in performance. The 7950 will also be a lot better at higher resolutions because of its higher bandwidth and memory, but this is beside the point, because this card is simply not in the same league as the consoles; it is much better.

Here's a link to an HD 7770 which is overclocked 31 percent on the memory and 6 percent on the core clock. The overclock showed a 10 percent increase in performance. The memory bandwidth on this card is already higher at stock clocks than the Xbox One GPU's: it has 73 GB/s and the X1 GPU has 68 GB/s. The overclocked card has a memory bandwidth of 92 GB/s.

So the card gets a 10 percent increase in performance mainly from memory overclocking, but the Xbox One's GPU would not see any benefit from that? Even though the X1's GPU is a bit stronger than this card. The Xbox One also has shared memory, so memory bandwidth in this case would show more performance in games than the same memory bandwidth on a desktop GPU, since desktop memory isn't shared.

Another site also reports that the HD 7770's biggest handicap is its memory bandwidth.
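The bandwidth figures being traded here follow directly from bus width and memory data rate. A back-of-envelope sketch; the 4.5 Gbps GDDR5 and DDR3-2133 rates are the usual stock specs (assumed here), and board vendors vary them slightly:

```python
# Peak memory bandwidth = (bus width in bytes) x effective per-pin data rate.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

hd7770 = bandwidth_gbs(128, 4.5)      # 72.0 GB/s: 128-bit GDDR5 at 4.5 Gbps effective
xb1_ddr3 = bandwidth_gbs(256, 2.133)  # ~68.3 GB/s: 256-bit DDR3-2133
print(hd7770, round(xb1_ddr3, 1))
```

This is why a narrow-bus card leans on fast GDDR5 while the Xbox One pairs a wide bus with slower DDR3 plus ESRAM.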

And again, the 256-bit bus is needed to communicate with the ESRAM, since those are 256-bit chips as well. I never said that the PS4 wasn't stronger than the Xbox One, but you make it out to be such a big difference, when if the ESRAM is used that difference becomes very small, as games like Wolfenstein prove.

If the devs don't use that ESRAM, then that's sad for X1 users, but it's still just a difference of 900p vs 1080p, because the 720p vs 1080p of launch games was due to the reservation for the Kinect. And I really doubt the devs won't use that ESRAM in the future, because DX12 will become very common and the ESRAM is perfect for DX12 tools.

No it doesn't and only a true moron would fall for it..

You are an idiot, and it's clear you don't know sh** about the PS3 or this topic. The PS3 has a PC GPU; it's the CPU which is not from a PC, and it's a hybrid that can do GPU tasks, which is the sole reason the PS3 was able to keep up with and pass the 360.

The RSX is a PC GPU.

http://en.wikipedia.org/wiki/RSX_%27Reality_Synthesizer%27

192 Gflops, like I told you...lol

Memory: 256 MB
Memory Interface: 128-bit
Memory Bandwidth (GB/sec): 22.4
Fill Rate (Billion pixels/sec): 6.7
Vertices/Second (Millions): 700
Pixels per clock (peak): 12
RAMDACs (MHz): 400

The 7600 GT has a higher fill rate, the exact same bandwidth as the RSX, 8 ROPs, and even the same memory size, 256 MB.

The RSX was basically a 7600 GT; it was 192 Gflops, not 1.8 TF..lol
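The 192 Gflops figure for RSX can be reconstructed under commonly cited (and debatable) assumptions about its shader layout; the pipe and ALU counts below are those assumptions, not official numbers:

```python
# One reconstruction of the ~192 GFLOPS RSX estimate:
# 24 pixel pipelines x 2 vec4 ALUs x 4 lanes, MAD counted as 2 ops, at 500 MHz.
pixel_pipes = 24
alus_per_pipe = 2
lanes = 4
ops_per_mad = 2
clock_hz = 500e6

rsx_gflops = pixel_pipes * alus_per_pipe * lanes * ops_per_mad * clock_hz / 1e9
print(rsx_gflops)  # 192.0
```

The 1.8 TF number Sony quoted counted "shader operations" and CPU flops together, which is why the two figures differ by an order of magnitude.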

No, it doesn't prove your point, quite the contrary. Why do you think the 7850 performs better?

It has 2 more CUs, 128 more SPs. To get to the level of the 7850, the 7790 has to raise its clock to 1027 MHz, and it still falls behind, because extra CUs work better than fewer CUs at a higher clock.

And to make things worse, the only GCN part that performs worse while having higher flops is the one inside the Xbox One. If anything this works against your argument even more, since the Xbox One has the only GPU in this line with higher flops but less performance; the Xbox One is the weakest link in GCN..lol
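The clock-for-CUs tradeoff can be made concrete. Assuming the stock 7850 (1024 shaders at 860 MHz, the commonly listed spec), the clock a 896-shader part would need just to match its peak FLOPS is:

```python
# Clock (MHz) needed for a narrower GPU to match a wider one's peak FLOPS.
target_tf = 1024 * 2 * 860e6 / 1e12       # 7850 peak, ~1.76 TF
required_mhz = target_tf * 1e12 / (896 * 2) / 1e6
print(round(required_mhz))                # ~983 MHz just to match on paper
```

Even at flops parity the narrower chip is still short on ROP throughput and bandwidth, which is the point about extra CUs beating clock.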

No, the 7950 wasn't underclocked; that was the first model, just like with the 660 Ti. Fair is fair: the later model was overclocked, which is the Boost edition.

The 7950 has a 384-bit bus and 240 GB/s of bandwidth, bro. By your logic of a bigger bus changing everything, it should destroy the 660 Ti. It doesn't, because regardless of how much bus it has, it lacks the power to do so.

I know what the 7770 has for bandwidth, and while it has more bandwidth than the DDR3 memory inside the Xbox One, the Xbox One has ESRAM to compensate. This is what I have been telling people all along: the Xbox One's ESRAM is a patch for bandwidth. What else is new?

No it didn't, you butthurt blind fanboy. Read your own damn article..

Pros

  • 20% performance increase over last generation
  • Excellent energy efficiency
  • Overclocked out of the box
  • Extremely quiet
  • No power consumption increase despite overclock
  • Excellent GPU overclocking potential

Cons

  • Price too high to be competitive
  • Lower memory OC potential than other cards
  • Memory overclock is small

Basically most of the boost came from the actual GPU overclock and not from the memory one.

Memory wasn't overclocked on the Xbox One, since it would gain little from overclocking DDR3. The ESRAM got a clock boost because the GPU clock was also raised, and that is accounted for in the specs already, and it didn't change the bus. So yeah, have a nice time proving the 7770 improved with a 256-bit bus, which it doesn't have.

It's a tad stronger because it has 2 more CUs (the 7770 has 10), but it has the same TF count, which means more or less the same performance, especially when the 7770 has no cumbersome memory setup.

No, MS put that bus there probably to play it safe. Was it needed? No, probably not.

Wolfenstein on Xbox One drops to 960x1080 on the fly...

960x1080 = 1,036,800 pixels. It almost drops to 720p on the fly..hahahahahaha

1280x720 = 921,600, almost 720p. That's how far the Xbox One version of Wolfenstein drops, and you think they are on par..

hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaa

The PS4 rarely drops from 1080p, and when it does it stays above 900p..hahahaha
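The pixel arithmetic in this exchange checks out and is easy to verify:

```python
def pixels(width: int, height: int) -> int:
    return width * height

full_1080p = pixels(1920, 1080)      # 2,073,600
xb1_dynamic_min = pixels(960, 1080)  # 1,036,800: exactly half of full 1080p
p720 = pixels(1280, 720)             # 921,600

print(xb1_dynamic_min / p720)        # 1.125: the dynamic minimum is only 12.5% above 720p
```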

http://www.eurogamer.net/articles/digitalfoundry-2014-wolfenstein-the-new-order-face-off

All Xbox One games use the ESRAM; without ESRAM, games would be less than 720p, all of them. 68 GB/s is not enough to feed the Xbox One's CPU and GPU at the same time and push graphics.

Class dismissed..

#303 Posted by StormyJoe (4877 posts) -

@tormentos: Those gifs are so inane they are barely worth mentioning. In fact, they are so stupid, they can't even get their point across correctly:

The game was "Red Dead Redemption", not "Red Dead Redention".

LOL!

#304 Edited by StormyJoe (4877 posts) -

@tormentos said:

[...]

And you can see the difference?

960x1080 = 1,036,800 pixels. It almost drops to 720p on the fly..hahahahahaha

1280x720 = 921,600, almost 720p. That's how far the Xbox One version of Wolfenstein drops, and you think they are on par..

Oh, and btw: 921,600 + (921,600 × 0.1) = 1,013,760 (10% of resources freed in June)
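That 10% calculation is worth redoing in exact integer arithmetic (the 10% refers to the GPU reservation Microsoft freed up in the June SDK update):

```python
base_720p = 1280 * 720                 # 921,600 pixels
plus_ten_pct = base_720p * 110 // 100  # exact: 1,013,760
xb1_dynamic_min = 960 * 1080           # 1,036,800

# 720p plus the freed 10% still lands below the Xbox One's dynamic minimum.
print(plus_ten_pct, xb1_dynamic_min)
```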

teh 50%$ more powah...LOLZ!

#305 Posted by FastRobby (1246 posts) -

There is no parity unless a developer holds the PS4 back on purpose, period.

I've never heard an actual developer say that, only you. Are you a developer? What do you even do that makes you think you know how everything works? If you're just a teenager, you can tell us.

#306 Edited by tormentos (17069 posts) -

"Outlast is better on Xbone, hahaha cows!"

"No it isn't, here's indisputable evidence of that"

"N-Nobody cares about it"

Simon does it again!

Hahahahaaaaaaaaaaaaaaaa exactly..hahahahaa

@tormentos:

So that Outlast video is your proof that the PS4 is superior? Pretty much looks like a toss-up to me. You need to do much better to prove your point. Too close to call. When X1 starts offloading computations to the cloud, their frame rate will be consistently higher. The PS4 right now doesn't have that ability. Follow the link, grasshopper. You can start about 1:00 into the video.

https://www.youtube.com/watch?v=QxHdUDhOMyw

No it doesn't; when the gameplay starts, the game chokes a lot more on Xbox One.

The fun part is how you say sh** to the person who actually claimed the Xbox One version was superior, based on nothing, since the gameplay shows otherwise.

Oh please, you don't even know under what freaking parameters MS ran that video. This is MS; they have been faking crap since they entered the market.

Titanfall already uses the cloud and looks average at best.

I think the one who needs to do much better is MS. Oh, didn't they just claim a few weeks ago that the cloud's biggest benefit would be dedicated servers? Did you read Eurogamer's last article on it, and how MS totally avoided talking about improving graphics over the network? Oh yeah, they already confirmed that the cloud can't make up for graphical bottlenecks..lol

@GrenadeLauncher:

You can deny it all you want, but that ability gives the X1 an advantage. It's the next evolution for gaming. That's okay, though, because you still have a decent system with the PS4.

Hahahaaaaaaaaaaaaaaaaaaaaaaaa...

@Tighaman said:

ESRAM is not the problem per se, but it's a problem for deferred rendering and time-restricted devs, because most devs don't tightly pack buffers and render targets, and they also don't like to split g-buffers because it's time- and cost-consuming. With the X1 you need to split buffers and move the stuff that needs to be fast into the ESRAM and the rest into DRAM. GDDR5 is dying; ESRAM or HBM or HBC on the GPU is going to be the future. That's why the guy from Valve says Tiled Resources and megatextures or MESH (what Lionheart is designing) will win in the long run.

Valve speaking highly of something they strongly support; how did Rage work out with megatextures?

Also, the PS4 has that feature too; it's just called PRT rather than Tiled Resources..lol
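The g-buffer-splitting point above is easy to quantify. A sketch with assumed, illustrative render-target counts and formats (real engines vary both):

```python
# Does a deferred g-buffer fit in the Xbox One's 32 MB of ESRAM at 1080p?
width, height = 1920, 1080
bytes_per_texel = 4     # assumed 32-bit targets (e.g. RGBA8)
render_targets = 4      # assumed: albedo, normals, material params, depth

gbuffer_mb = width * height * bytes_per_texel * render_targets / (1024 * 1024)
print(f"{gbuffer_mb:.2f} MB of 32 MB")   # ~31.64 MB: it barely fits, with nothing left over
```

With the g-buffer alone consuming nearly all 32 MB, anything else that needs to be fast (shadow maps, intermediate buffers) forces exactly the ESRAM/DRAM split described in the quote.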

Oh God, the marketers are here.

Did they ever give the specs of the local hardware da clawd was operating against in that tech demo?

Hahahahaa, no, and it wasn't the Xbox One..hahahah

You know MS; they will lie until caught. Right now they themselves have debunked two of the biggest points I have challenged from the start: cloud and DX12..lol

Neither of the two will be a big deal. MS already changed the cloud message at their last conference to "dedicated servers is the benefit", so something must have gone wrong, and Phil Spencer himself debunked that DX12 will do anything great for the Xbox One, which is what I have been saying for months now..

#307 Edited by SoftwareGeek (206 posts) -

@slimdogmilionar:

"I was referring to the quotes you posted earlier where a software engineer had said he expects the PS4 to plateau before the Xbox, I was asking for a link to the source of those comments."

I am the software engineer, and my opinions are just that: opinions. But they are opinions formulated from my experience in the software industry (over 15 years). I write cloud-based web services in Java. I know other languages including but not limited to C, C#, PHP, COBOL, SQL, Groovy and Grails, JavaScript, jQuery, HTML and XML, plus surrounding technologies for Java such as Spring, Struts, Maven, Ant, JAXB, XMLBeans, JSP and Servlets, Hibernate, and iBATIS. As far as database experience, I am most familiar with Oracle databases.

#308 Posted by SoftwareGeek (206 posts) -

@tormentos

"The fun part is how you say sh** to the person who actually claimed the xbox one version was superior,based on nothing since the gameplay show other wise."

I'll say sh** to anybody that runs their mouth and shows their ignorance like you do. All systems have advantages and disadvantages. End of story. Go to school and get a job, will ya?

#309 Posted by Tighaman (743 posts) -

@tormentos: The PS4 doesn't have PRT; it has a GPU that can use PRT. There's a difference.

#311 Posted by tormentos (17069 posts) -

And you can see the difference?

960x1080 = 1,036,800 pixels. It almost drops to 720p on the fly..hahahahahaha

1280x720 = 921,600, almost 720p. That's how far the Xbox One version of Wolfenstein drops, and you think they are on par..

Oh, and btw: 921,600 + (921,600 × 0.1) = 1,013,760 (10% of resources freed in June)

teh 50%$ more powah...LOLZ!

Sure, and DF says it can be seen: the Xbox One version becomes fuzzier, with jaggies. Or can't you see jaggies this gen either?..hahaaaaaa

while on the Xbox One this can drop to an extreme of 960x1080 in some scenes. This is usually identifiable by an increase in the amount of jaggies on screen, along with a slightly fuzzier appearance to the already gritty aesthetic that Machine Games employs throughout the game.

Hahahaaaaaaaaaaa...

The funny thing is the PS4 version doesn't drop below 1760x1080 in the instances where it drops.

1760x1080 = 1,900,800, almost double the Xbox One's minimum pixel count; that is close to a 100% gap in pixels, basically 720p vs 1080p territory, in the PS4's favor...

But but but parity..
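The exact ratio between the two dynamic minimums is straightforward to compute; it comes out to about 83% more pixels, a bit short of a literal 100% gap:

```python
ps4_dynamic_min = 1760 * 1080   # 1,900,800 pixels
xb1_dynamic_min = 960 * 1080    # 1,036,800 pixels

ratio = ps4_dynamic_min / xb1_dynamic_min
print(round(ratio, 2))          # 1.83: the PS4 worst case pushes ~83% more pixels
```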

@slimdogmilionar said:

[...]

Dude, the cloud can't freaking make games look better. They could offload the complete Xbox One CPU and it would mean sh**.

Internet connections lack the speed for anything significant from the cloud. Titanfall already uses the cloud; it's 720p and looks average. That is because what you can offload are a few processes that are done by the CPU; you can't offload AA or complex per-frame processes, period. Ask MS if they can offload MSAA to the cloud; I would love to read their reply...lol
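The latency argument here can be put in numbers. A back-of-envelope sketch; the 60 ms round trip is an assumed typical home-broadband figure, not a measurement:

```python
# Why per-frame work (like AA) cannot be offloaded over the internet:
frame_budget_ms = 1000 / 60     # ~16.7 ms per frame at 60 fps
assumed_rtt_ms = 60             # assumed round trip to a data center

frames_late = assumed_rtt_ms / frame_budget_ms
print(round(frames_late, 1))    # 3.6: a per-frame result arrives ~3-4 frames too late
```

This is why cloud offloading targets latency-tolerant work (ahead-of-time physics, AI, dedicated servers) rather than per-frame rendering.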

You don't know this because you are a sad lemming that believes in the sh** cloud. You people are so fixated on power that you really are suffering from the Xbox One being underpowered, and you have to eat whatever crap MS throws at you.

The Xbox One doesn't need 1.84 TF to make great games.

The Xbox One doesn't need to be a top-of-the-line PC to have great games.

Once you admit this and move on, I am sure you will not care about graphics any more, and will easily accept that the PS4 is more powerful.

I've never heard an actual developer say that, only you. Are you a developer? What do you even do that makes you think you know how everything works? If you're just a teenager, you can tell us.

And you never will, that is totally apparent, because unlike what you think, there is a difference between the PS4 and Xbox One hardware. You may not want to admit it; that doesn't mean it's not there.

And like I have told you 100 times, a 7770 will never beat a 7850, period, no matter how many drivers come out. The PS4 will improve with time as well, and the gap will remain, period.

No, I am not a developer. I just know that a gap like this..

http://www.anandtech.com/bench/product/1127?vs=1079

Can't be overcome by software, because that would imply that console A, which is weaker, can improve, while console B, which uses basically the same GPU but a stronger one, somehow can't also improve.

You would see this if you didn't have the MS logo imprinted on your butt..

Yeah, I am a teenager, lol. My join date here was 11 years ago, so maybe I was 3 when I opened my account..lol

#312 Edited by B4X (3564 posts) -

@tormentos:

I have never seen a guy white-knight a console that will be beaten by tablets in 3-4 years. :P

Go post your insight on Beyond3D. Please do. Use this GT so I can follow your success.

Go tell them how the PS4 is 50% more powerful.

Spread your wealth of knowledge. Why keep it bottled up in here?

Spread your wings. Please keep us updated with your findings.

#313 Edited by evildead6789 (7435 posts) -

@tormentos said:


[...]

I think you're the idiot here; you can't even do basic mathematics when it comes to bits and bytes.

But we're getting somewhere: the real-world performance in tflops is of course less on the PS3, but when they gave out that number, Sony counted every single flop on every single chip. Since the RSX was basically fueled by SPEs in the CPU, they counted those as well. If you could actually interpret data instead of copying and pasting it here, I wouldn't be having this discussion with you (which is getting sillier by the minute lol).
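The "count every flop on every chip" accounting can be reproduced. A sketch using commonly cited Cell figures (3.2 GHz, 4-wide single-precision FMA per SPE, six SPEs available to games; all assumptions, since official breakdowns vary):

```python
# Summing "system FLOPS" across the Cell's cores, the way marketing totals were built.
clock_ghz = 3.2
flops_per_cycle = 8             # 4-wide single-precision fused multiply-add = 8 ops
spe_gflops = clock_ghz * flops_per_cycle   # 25.6 GFLOPS per SPE
ppe_gflops = clock_ghz * flops_per_cycle   # PPE's VMX unit, same rough count

cell_gflops = ppe_gflops + 6 * spe_gflops  # ~179.2 GFLOPS
print(f"{cell_gflops:.1f}")
```

Even this generous CPU total plus RSX's ~192 GFLOPS lands near 370 GFLOPS, nowhere near the 1.8-2 TF marketing figure, which also counted non-programmable "operations".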

That's why tflops are not a good measurement. They are for a single chip, but not for a system where different chips work together. Like with the Xbox One, for instance: if you looked at tflops only, the PS4 would seem a lot stronger than it shows in current games. Of course that number is correct in games where the ESRAM isn't used, because otherwise they're very similar systems (apart from the difference in memory speeds and shader count).

The 7950 was like an underclocked model, if you look at what speeds the chip can handle. That is also the very first comment on review sites for the first cards released on AMD's PCB.

Apparently you can't read either, since a 10 percent increase in performance from a 31 percent memory overclock vs a 6 percent core overclock is suddenly due to the GPU core clock, lol. But it doesn't matter; your stupidity reaches hilarious heights. They were of course speaking in comparison with other cards when they said the memory overclock is small. Well, I just started taking screenshots then, to prove how stupid you really are.

So yes, the 128-bit bus on an HD 7770 is a bottleneck, and it will certainly bottleneck the X1 GPU, since that is a bit stronger than the HD 7770 and has slower memory speeds.

So a 256-bit bus will increase it severely, Mr. "it's only a highway"; you must be the biggest moron on this site lol.

The fact that this memory is shared on the X1 will show more real-world performance in comparison with desktop cards. So the PS4's bandwidth is actually massive overkill and a PR stunt, and the X1's memory bandwidth will be enough for what it has, especially combined with the ESRAM.

Wolfenstein may switch to a lower resolution, but you cannot notice it. I'm not going to bother posting the reviews, because that's exactly what they said: Wolfenstein looks the same on both consoles. It doesn't matter that certain tricks are there; you can't see them, because the ESRAM is so fast. And not all games use the ESRAM; this was clearly mentioned for launch games (where 10 percent was even reserved for the Kinect as well).

So yeah , your ps4 is stronger but it's not a deal breaker if you prefer the controller on the xone, the community or the exclusives. Who cares when the difference in quality is hardly noticable.

Now I'm awaiting your next post filled with bs.

#314 Edited by FastRobby (1246 posts) -


@FastRobby said:

I've never heard an actual developer say that, only you. Are you a developer? What do you even do that makes you think you know how everything works? If you're just a teenager, you can tell us.

Yeah, I'm a teenager, lol. My join date here was 11 years ago, so maybe I was 3 when I opened my account... lol

Haha, you're an adult and you act like this? Hahaha, my god, your life must suck.

#315 Edited by SoftwareGeek (206 posts) -

@tormentos

"Dude, the cloud can't freaking make games look better; they could offload the complete Xbox One CPU and it would mean sh**."

Go watch the demo, you ignorant fool. Clearly it can. It can help the framerate; the demo proves that, and by helping the frame rate I'd say you are helping the game look better. If you offload computations to the cloud, it frees up system resources, which can aid visuals. So you can keep saying that it doesn't, but the demo proves that it does. It can't make the graphics card perform beyond its capabilities, but it can free it from some of the heavy lifting. How long are you going to live in a state of denial?

And folks, who are you going to believe? Tormentos, who is not a software developer, or SoftwareGeek, who is? Hmmmm... who has credibility? What are you going to say next, that offloading work to the cloud is cheating? What other ridiculous and ignorant statements are you going to make? By offloading computations to the cloud, you are letting another CPU do the work. Then you have multiple CPUs doing the work. MULTIPLE CPUs DOING THE WORK. What part of this can't you get into that thick skull of yours?

The hardware in the PS4 is a little better, but the hardware in the cloud is far better than either console. It's like the difference between your gaming PC and a banking server; there's a world of difference in power. Tormentos, you dumb a**, repeat after me: "A banking server is far more powerful than a PC." Go ahead, say it over and over until it sinks into that hard head of yours. You have no credibility whatsoever, because you are just a fanboy who foolishly runs his mouth without ANY knowledge to back it up. You have no knowledge, tormentos. None. You are untrained and uneducated. End of story. I'm sure you will keep concocting uninformed criticisms.

#316 Posted by evildead6789 (7435 posts) -


Dude, the cloud can't freaking make games look better; they could offload the complete Xbox One CPU and it would mean sh**.

Internet connections lack the speed for anything significant from the cloud. Titanfall already uses the cloud, is 720p, and looks average. That is because what you can offload are a few processes done by the CPU; you can't offload AA or complex processes, period. Ask MS if they can offload MSAA to the cloud; I would love to read their reply... lol

You don't know this because you are a sad lemming who believes in the sh** cloud. You people are so fixated on power that you are really suffering from the Xbox One being underpowered, and you have to eat whatever crap MS throws at you.

The Xbox One doesn't need 1.84 TF to make great games.

The Xbox One doesn't need to be a top-of-the-line PC to have great games.

Once you admit this and move on, I am sure you will not care about graphics anymore and will easily accept that the PS4 is more powerful.

Standalone, maybe, yeah, but it's not what you make it out to be, lol.

And the cloud does work; Sony is actually proving it with PlayStation Now.

#317 Edited by FastRobby (1246 posts) -

@evildead6789 said:

And the cloud does work; Sony is actually proving it with PlayStation Now.

That's streaming, not computing. Big difference. PS Now only has to return audio/video, like Netflix would, while accepting user input through the controller. Cloud computing does calculations in the cloud and sends back the results, but those aren't video/audio; they could be, say, the results of AI calculations. A lot of people who don't know anything about tech or AI say: the AI of Titanfall is shit, ergo Microsoft's cloud is shit. But that isn't the correct way of looking at it. If the AI of Titanfall is shit, it just means Respawn is bad at writing AI; it has nothing to do with Microsoft. Their cloud can easily do much more advanced calculations in basically the same time. Although you don't really measure these things in "time" so much as in computational steps, but that's just algorithms 101.
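The streaming-versus-computing distinction can be sketched roughly in code. This is a hypothetical toy example (nothing to do with Microsoft's actual API): the cloud returns a *result*, here a made-up AI path, the reply takes several simulated frames to arrive, and the game keeps running on stale data in the meantime instead of blocking. The function name, frame counts, and path contents are all invented for illustration.

```python
from collections import deque

def cloud_pathfind(request_frame, goal):
    # Stand-in for a remote call: the reply "arrives" several frames later.
    # A real round trip would depend on network latency, not a fixed count.
    return {"ready_at": request_frame + 6, "path": ["A", "B", goal]}

pending = deque()
current_path = ["A"]            # stale path used while waiting for the cloud
pending.append(cloud_pathfind(request_frame=0, goal="C"))

for frame in range(10):
    # Local work (rendering, input handling) never blocks on the cloud reply;
    # the result is only adopted on the frame it lands.
    if pending and pending[0]["ready_at"] <= frame:
        current_path = pending.popleft()["path"]

print(current_path)   # ['A', 'B', 'C'] once the reply has landed
```

The point of the sketch is the one FastRobby makes: this only works for latency-tolerant results like AI decisions, not for per-frame audio/video the way PS Now streams it.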

#318 Edited by tormentos (17069 posts) -

I think you're the idiot here; you can't even do basic mathematics when it comes to bits and bytes.

But we're getting somewhere: the real-world performance in tflops is of course lower on the PS3, but when they gave out that number, Sony counted every single flop on every single chip. Since the RSX was basically fueled by the SPEs in the CPU, they counted those as well. If you could actually interpret data instead of copying and pasting it here, I wouldn't be having this discussion with you (which is getting sillier by the minute, lol).

That's why tflops are not a good measurement. They work for a single chip, but not for a system where different chips work together. Take the Xbox One, for instance: if you looked at tflops alone, the PS4 should be a lot stronger than it appears in current games. Of course that number is only correct in games where the esram isn't used, because otherwise they're very similar systems (apart from the differences in memory speed and shader count).

The 7950 was essentially an underclocked model, if you look at what speeds the chip can handle. That's also the very first comment on review sites about the first cards released on AMD's PCB.

Apparently you can't read either, since a 10 percent performance increase from a 31 percent memory overclock versus a 6 percent core overclock is suddenly due to the GPU core clock, lol. But it doesn't matter; your stupidity reaches hilarious heights. When they called the memory overclock small, they were of course comparing it with other cards. Well, I'll just start taking screenshots then, to prove how stupid you really are.

So yes, the 128-bit bus on an HD 7770 is a bottleneck, and it will certainly bottleneck the X1 GPU, since that GPU is a bit stronger than the HD 7770 and has slower memory.

So a 256-bit bus would increase performance severely, mr. 'it's only a highway'. You must be the biggest moron on this site, lol.

The fact that this memory is shared on the X1 means it will show more real-world performance compared with desktop cards. So the PS4's bandwidth is actually massive overkill and a PR stunt, and the X1's memory bandwidth will be enough for what it has, especially combined with the esram.

Wolfenstein may have switches to lower the resolution, but you cannot notice it. I'm not going to bother posting the reviews, because that's exactly what they said: Wolfenstein looks the same on both consoles. It doesn't matter that certain tricks are there; you can't see them, because the esram is so fast.

Now I'm awaiting your next post filled with bs.

192 GFLOPS from the RSX, 210 GFLOPS from the Cell. Period. That is the PS3, and a far cry from 2 TF.

The PS4 has more flops on the same architecture, so it is stronger, period, and that has shown in each and every game on both platforms. Hell, Tomb Raider is a DX game and it still runs better on PS4. That's how sad the Xbox One is.

There isn't a single game that doesn't use ESRAM, period. You may argue that they were not using it properly or some sh** like that, but all games used it; without it, Xbox One games would not even be 720p.

All those GPUs can handle speeds above what they ship with. The 7850 is 860 MHz, but there is an OC model as well, and that can be overclocked even more. The point is that the 7950 was stock, just like the 660 Ti, and the results are history. I proved my point: bigger bus with no power = screwed.

That is funny, because the chart maker complained about the low headroom for a memory overclock, and worse, the GPU also got an upclock, something the Xbox One didn't get in the first place; the 7790 is 1027 MHz, not 853.

Post the link to that page. The 5770 also has more bandwidth than the 7770, yet the 7770 beats it, moron.

http://www.anandtech.com/bench/product/538?vs=536

Nice way to make an ass of yourself: the 5770 has 77 GB/s of bandwidth, the 7770 has 73 GB/s, yet the 7770 beats it with less bandwidth. Argument defeated again... lol

http://www.anandtech.com/bench/product/776?vs=780

But wait, what's that? The 7790, with 102 GB/s of bandwidth and a 128-bit bus, beating the 6870, which has 134 GB/s and a 256-bit bus?

Oh brother, that theory is really crumbling...

It would increase sh**, basically a frame or two at most, and the 7790 beating the 6870 is the biggest example of that: 134 GB/s on a 256-bit bus vs 102 GB/s on a 128-bit bus, and no matter what, the 7790 is able to use more of its power relative to its bandwidth than the 6870 does.

And before you say any sh**, the 6870 has 1120 stream processors vs the 7790's 896 stream processors. That card wasn't a wimp; in fact, it was more mid-range when it landed than the 7790 ever was when it landed.

Idiot, the memory on the PS4 is also shared; see, you know sh**... hahahaha. It's far from a PR stunt. It's GDDR5, which is the standard for freaking graphics, and it's 176 GB/s, faster than what the 7850 has, for a reason: it is shared with the CPU.

You can't notice it? That is because you are a blind moron.

Metrics in the area of 1760x1080 are found on PS4, while on the Xbox One this can drop to an extreme of 960x1080 in some scenes. This is usually identifiable by an increase in the amount of jaggies on screen, along with a slightly fuzzier appearance to the already gritty aesthetic that Machine Games employs throughout the game.

Unless you can no longer see jaggies or a fuzzier image, it shows. Digital Foundry said so... hahaha

That last bold sentence there, yeah, that last one is total and utter gibberish. You can see the difference. The ESRAM can be as fast as you want to pretend it is; ESRAM doesn't produce flops, it's memory. The performance comes from the stream processors inside the CUs, and sadly the Xbox One has fewer of them, which is why it drops almost to 720p to keep up frame-wise with the PS4... hahahaha
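For what it's worth, the flops numbers being thrown around in this thread come straight from shader (stream processor) count × 2 operations per clock (fused multiply-add) × core clock. A quick sketch using the commonly published console specs, purely for illustration:

```python
def gflops(shaders, clock_ghz):
    """Peak single-precision GFLOPS: shaders * 2 FMA ops per clock * clock."""
    return shaders * 2 * clock_ghz

# Commonly cited specs: PS4 has 1152 shaders at 800 MHz,
# Xbox One has 768 shaders at 853 MHz.
print(gflops(1152, 0.800))   # ~1843 GFLOPS, i.e. the PS4's 1.84 TF
print(gflops(768, 0.853))    # ~1310 GFLOPS, i.e. the Xbox One's 1.31 TF
```

Which is the arithmetic behind the shader-count argument: the gap comes from the compute units, not from the memory.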

#319 Posted by slimdogmilionar (472 posts) -

@slimdogmilionar:

"I was referring to the quotes you posted earlier where a software engineer had said he expects the PS4 to plateau before the Xbox, I was asking for a link to the source of those comments."

I am the software engineer and my opinions are just that. Opinions. But they are opinions that are formulated on my experience in the software industry (over 15 years). I write cloud based web services in Java. I know other languages including but not limited to C, C#, PHP, Cobol, SQL, Groovy on Grails, JavaScript, JQuery, HTML and XML. Also surrounding technologies for Java such as Spring, Struts, Maven, Ant, JaxB, XmlBeans, JSP and Servlets, Hibernate, IBatis. As far as database experience I am most familiar with Oracle databases.

Pretty cool. I'm going for web development now and will be graduating in a few months. Once I'm done I plan to go back for my bachelor's in software engineering. I don't really like web development as much, but I also don't want to quit this far in.

Dude, the cloud can't freaking make games look better; they could offload the complete Xbox One CPU and it would mean sh**.

Internet connections lack the speed for anything significant from the cloud. Titanfall already uses the cloud, is 720p, and looks average. That is because what you can offload are a few processes done by the CPU; you can't offload AA or complex processes, period. Ask MS if they can offload MSAA to the cloud; I would love to read their reply... lol

You don't know this because you are a sad lemming who believes in the sh** cloud. You people are so fixated on power that you are really suffering from the Xbox One being underpowered, and you have to eat whatever crap MS throws at you.

The Xbox One doesn't need 1.84 TF to make great games.

The Xbox One doesn't need to be a top-of-the-line PC to have great games.

Once you admit this and move on, I am sure you will not care about graphics anymore and will easily accept that the PS4 is more powerful.

I'm a sad lemming? Or are you just a dumb cow troll? Remember when you said that the PS4's GPGPU was for physics, and that for the same thing to happen on Xbox it would take GPU resources away? Mmmmm, that's not what these guys are saying.

As for internet connections: at the beginning of last gen, 1-3 Mbps was considered the norm; now 25-60 is the average. Google Fiber now offers 1 Gbps for $70, and Verizon Quantum is like 500 Mbps.

Yeah, you keep living in the past, dude; that's why you don't believe this. The Xbox was built with cloud tech in mind, so I'm sure they thought of all the problems and limitations when building it. I even admitted that I didn't think the cloud was what most people thought it was, but when I see companies like Epic, Havok, and Nvidia on board, you have to realize there must be something there, or else you would not have these companies on board in less than a year. We went from the cloud being a myth to three big-name game companies collaborating with a company built on the belief of cloud processing.

Oh, that's right: because you think you have a PhD in software engineering, work for a prestigious development company, and know that internet speeds will stay at 20-60 Mbps for the next 10 years, we should believe you when you say the cloud won't happen. Just like people didn't believe the Wright brothers when they said they were going to fly?

Everyone who bought an Xbox knows it has weaker hardware, but it's people like you who try to downplay every aspect of the box, because you are insecure about the future even though we live in a time when anything is possible. When you do a 180 and say things like "The Xbox One doesn't need 1.84 TF to make great games" and "The Xbox One doesn't need to be a top-of-the-line PC to have great games," it only proves that you have no idea what you are talking about and are starting to second-guess all that "the cloud is a gimmick" stuff you were saying earlier. How many more companies and devs need to jump on board before you realize that you don't know everything?

"We're delivering rendering and processing power from the cloud, allowing game creators to define new ground-breaking online gaming mechanics." -cloudgine.com

One last thing for you to think about; then you can continue letting @evildead6789 make you look stupid.

"Learning and innovation go hand in hand. The arrogance of success is to think that what you did yesterday will be sufficient for tomorrow."

William Pollard

#320 Edited by SoftwareGeek (206 posts) -

@evildead6789

"So yeah, your PS4 is stronger, but it's not a deal breaker if you prefer the controller on the Xone, the community, or the exclusives. Who cares, when the difference in quality is hardly noticeable? Now I'm awaiting your next post filled with bs."

I'd plus one that a million times if I could.

"Standalone, maybe, yeah, but it's not what you make it out to be, lol. And the cloud does work; Sony is actually proving it with PlayStation Now."

I'd also plus-one that a million times. Knowing that Sony will follow MS's lead on utilizing the cloud, I bet tormentos changes his tune and says something like "it only works for the PS4". lol. Geeeze...

#321 Posted by SoftwareGeek (206 posts) -

@slimdogmilionar

"Pretty cool. I'm going for web development now and will be graduating in a few months. Once I'm done I plan to go back for my bachelor's in software engineering. I don't really like web development as much, but I also don't want to quit this far in."

I hear you. That's why about 5 years ago I got into web services. Although I was just loaned out to the Grails team for a few months, which included some front-end development.

#322 Edited by evildead6789 (7435 posts) -

@tormentos said:

192 GFLOPS from the RSX, 210 GFLOPS from the Cell. Period. That is the PS3, and a far cry from 2 TF.

The PS4 has more flops on the same architecture, so it is stronger, period, and that has shown in each and every game on both platforms. Hell, Tomb Raider is a DX game and it still runs better on PS4. That's how sad the Xbox One is.

There isn't a single game that doesn't use ESRAM, period. You may argue that they were not using it properly or some sh** like that, but all games used it; without it, Xbox One games would not even be 720p.

All those GPUs can handle speeds above what they ship with. The 7850 is 860 MHz, but there is an OC model as well, and that can be overclocked even more. The point is that the 7950 was stock, just like the 660 Ti, and the results are history. I proved my point: bigger bus with no power = screwed.

That is funny, because the chart maker complained about the low headroom for a memory overclock, and worse, the GPU also got an upclock, something the Xbox One didn't get in the first place; the 7790 is 1027 MHz, not 853.

Post the link to that page. The 5770 also has more bandwidth than the 7770, yet the 7770 beats it, moron.

http://www.anandtech.com/bench/product/538?vs=536

Nice way to make an ass of yourself: the 5770 has 77 GB/s of bandwidth, the 7770 has 73 GB/s, yet the 7770 beats it with less bandwidth. Argument defeated again... lol

http://www.anandtech.com/bench/product/776?vs=780

But wait, what's that? The 7790, with 102 GB/s of bandwidth and a 128-bit bus, beating the 6870, which has 134 GB/s and a 256-bit bus?

Oh brother, that theory is really crumbling...

It would increase sh**, basically a frame or two at most, and the 7790 beating the 6870 is the biggest example of that: 134 GB/s on a 256-bit bus vs 102 GB/s on a 128-bit bus, and no matter what, the 7790 is able to use more of its power relative to its bandwidth than the 6870 does.

And before you say any sh**, the 6870 has 1120 stream processors vs the 7790's 896 stream processors. That card wasn't a wimp; in fact, it was more mid-range when it landed than the 7790 ever was when it landed.

Idiot, the memory on the PS4 is also shared; see, you know sh**... hahahaha. It's far from a PR stunt. It's GDDR5, which is the standard for freaking graphics, and it's 176 GB/s, faster than what the 7850 has, for a reason: it is shared with the CPU.

You can't notice it? That is because you are a blind moron.

Metrics in the area of 1760x1080 are found on PS4, while on the Xbox One this can drop to an extreme of 960x1080 in some scenes. This is usually identifiable by an increase in the amount of jaggies on screen, along with a slightly fuzzier appearance to the already gritty aesthetic that Machine Games employs throughout the game.

Unless you can no longer see jaggies or a fuzzier image, it shows. Digital Foundry said so... hahaha

That last bold sentence there, yeah, that last one is total and utter gibberish. You can see the difference. The ESRAM can be as fast as you want to pretend it is; ESRAM doesn't produce flops, it's memory. The performance comes from the stream processors inside the CUs, and sadly the Xbox One has fewer of them, which is why it drops almost to 720p to keep up frame-wise with the PS4... hahahaha

Yeah, of course it's about using the esram properly; when you don't make use of its superior speed, there's hardly any point in using it.

Yeah, so what if the 5770 has more bandwidth than the HD 7770? That has nothing to do with it, lol; you still don't understand it. I'm also not going to explain it anymore; I already did that. And when the chart maker called it a small memory overclock, he was comparing it to other HD 7770 cards, not the HD 7770 in general. Please, you can't even understand context. A 31 percent overclock is not a small overclock.

You're comparing a 256-bit card with a 128-bit card while the latter has much higher memory speed; the two are multiplied together. It's not my fault you don't know how to multiply. The HD 7790 also has a higher clock speed, is on 28nm (compared to 40nm), and supports DX 11.1.
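The multiplication in question: peak memory bandwidth is bus width (in bytes) times per-pin data rate, so a wider bus and faster memory multiply together. A quick sketch using commonly published specs (illustrative figures, not exact board measurements):

```python
def bandwidth_gbs(bus_bits, data_rate_gbps):
    """Peak bandwidth in GB/s = bus width in bytes * per-pin data rate."""
    return bus_bits / 8 * data_rate_gbps

# Commonly published configurations:
print(bandwidth_gbs(128, 4.5))   # HD 7770: 128-bit, 4.5 Gbps GDDR5 -> 72 GB/s
print(bandwidth_gbs(256, 4.2))   # HD 6870: 256-bit, 4.2 Gbps      -> ~134 GB/s
print(bandwidth_gbs(256, 5.5))   # PS4:     256-bit, 5.5 Gbps      -> 176 GB/s
```

So a narrow bus can match a wide one if the memory is clocked high enough, which is exactly the point being argued here.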

Anyway, a review of Wolfenstein from complex.com:

"While the gaming world seemed shocked that Wolfenstein: The New Order turned out to be a great game last week the debate on which console system, the Xbox One or PlayStation 4, has the best graphics keeps waging on. Digital Foundry, the authority in nerd graphic breakdowns, did a side-by-side comparison of the new Nazi shooter running on both systems and found few differences.

To boil down all the tech talk into the simplest terms: Digital Foundry says that the PlayStation 4 has the upper hand in dealing the best graphic experience, while the Xbox One can lose resolution sometimes. Strangely, this seems to happen when nothing is really going on screen and probably has more to do with the console up-scaling to the desired screen size.

However, a better test might be just to watch the mesmerizing, slow-motion video above on the highest quality and see if you can pick out the differences. While smoke and some effects look slightly different, the consoles are very close, so maybe we can put this discussion to bed."

Oh dear lord, what a difference. Let's get a PS4 and play with morons like tormentos.

#323 Edited by tormentos (17069 posts) -
@b4x said:

@tormentos:

I have never seen a guy white-knight a console that will be beaten by tablets in 3-4 years. :P

Go post your insight on Beyond3D. Please do. Use this GT so I can follow your success.

Go tell them how the PS4 is 50% more powerful.

Spread your wealth of knowledge. Why keep it bottled up in here?

Spread your wings. Please keep us updated with your findings.

That's funny; that is what people say here about cell phones and last-gen consoles, yet I haven't seen a single game on iOS or Android that stands even close to Uncharted 3 or TLOU graphically. Not one.

But even if it happens, it's worse for the console you so ardently defend, because if the PS4 will be beaten by tablets in 3 to 4 years, the Xbox One will be beaten in 2 or 3, and you paid $100 more for that crap than I did for my PS4... lol

Why don't you go there and talk to them about the magic cloud that never stops giving, and the magic drivers?

Haha, you're an adult and you act like this? Hahaha, my god, your life must suck.

Not as much as yours, which has completely steered away from the argument and been reduced to trying to insult my intelligence. hahahahaa


Go watch the demo, you ignorant fool. Clearly it can. It can help the framerate; the demo proves that, and by helping the frame rate I'd say you are helping the game look better. If you offload computations to the cloud, it frees up system resources, which can aid visuals. So you can keep saying that it doesn't, but the demo proves that it does. It can't make the graphics card perform beyond its capabilities, but it can free it from some of the heavy lifting. How long are you going to live in a state of denial?

And folks, who are you going to believe? Tormentos, who is not a software developer, or SoftwareGeek, who is? Hmmmm... who has credibility? What are you going to say next, that offloading work to the cloud is cheating? What other ridiculous and ignorant statements are you going to make? By offloading computations to the cloud, you are letting another CPU do the work. Then you have multiple CPUs doing the work. MULTIPLE CPUs DOING THE WORK. What part of this can't you get into that thick skull of yours?

The hardware in the PS4 is a little better, but the hardware in the cloud is far better than either console. It's like the difference between your gaming PC and a banking server; there's a world of difference in power. Tormentos, you dumb a**, repeat after me: "A banking server is far more powerful than a PC." Go ahead, say it over and over until it sinks into that hard head of yours. You have no credibility whatsoever, because you are just a fanboy who foolishly runs his mouth without ANY knowledge to back it up. You have no knowledge, tormentos. None. You are untrained and uneducated. End of story. I'm sure you will keep concocting uninformed criticisms.

I watched the demo, and you don't know under what parameters the demo was made. Hell, it wasn't even done on an Xbox One, man. WTF... lol

Yeah, I guess that is why Titanfall is 792p with drops into the mid-30s after 4 updates... hahaha

When you prove to me that the video wasn't as fake as that so-called live presentation, you will have a freaking point. Otherwise, until MS delivers, they are lying. No matter what, offloading some processes from the CPU will not make the GPU double its framerate. What the fu**, man.

Yeah, the problem is that every PS4 owner gets a better machine than the Xbox One, while not every Xbox One owner gets access to a freaking cloud, which needs a fast online connection to use.

GPUs work with bandwidth in GB/s, not Mb/s, which is why improving graphics over a network doesn't work.

Man, you have like 4 accounts here, right? Because this wall-of-text posting style is used by like 4 or 5 lemmings, including slimdogmilionar and several others. Man, just posting the same sh** argument from 4 different accounts instead of 1 doesn't make you right.

I know what the cloud can do and which processes can be offloaded; all are CPU-based: baked lighting (which isn't demanding in the first place), AI (which isn't that demanding either), physics (as long as the results aren't needed in the same frame), and things like that.

You can't do rendering or offload processes like MSAA or other complex effects done by the GPU, because those processes require extremely fast bandwidth, and even the fastest internet connection doesn't have that.

@Tighaman said:

@tormentos: the PS4 doesn't have PRT; it has a GPU that can use PRT. There's a difference.

Where the fu** do you think Tiled Resources come from on the Xbox One? Tiled Resources is PRT under another name, man. It is a GPU-based solution, not a software solution. My god, how misinformed can you people be?

http://www.gamespot.com/forums/system-wars-314159282/prt-already-on-ps4-tool-set-like-meow-31226490/?page=3

How fast you forget how I simply destroyed all your arguments. Please don't forget to visit the first page as well.

PRT = Tiled Resources; both are hardware-supported in GCN, so yeah, exactly the same on Xbox One and PS4.

PRT = the name in OpenGL

Tiled Resources = the name in DX

Megatextures = the name for the software-based solution

The ownage I delivered in that thread was epic. For months I was telling you buffoons that PRT = Tiled Resources... and I was right.

Standalone, maybe, yeah, but it's not what you make it out to be, lol.

And the cloud does work; Sony is actually proving it with PlayStation Now.

The fu** hahahahaaaaaaaaaaaaaaaaaaaaaaaaa....

Ultra-reconfirmed: you know sh** about this. Leave, please.

The cloud Sony uses works like OnLive used to work: they stream a video of the game you want to play and you control it, which is totally different from the cloud MS talked about.

I'm a sad lemming? Or are you just a dumb cow troll? Remember when you said that the PS4's GPGPU was for physics, and that for the same thing to happen on Xbox it would take GPU resources away? Mmmmm, that's not what these guys are saying.

As for internet connections: at the beginning of last gen, 1-3 Mbps was considered the norm; now 25-60 is the average. Google Fiber now offers 1 Gbps for $70, and Verizon Quantum is like 500 Mbps.

Yeah, you keep living in the past, dude; that's why you don't believe this. The Xbox was built with cloud tech in mind, so I'm sure they thought of all the problems and limitations when building it. I even admitted that I didn't think the cloud was what most people thought it was, but when I see companies like Epic, Havok, and Nvidia on board, you have to realize there must be something there, or else you would not have these companies on board in less than a year. We went from the cloud being a myth to three big-name game companies collaborating with a company built on the belief of cloud processing.

Oh, that's right: because you think you have a PhD in software engineering, work for a prestigious development company, and know that internet speeds will stay at 20-60 Mbps for the next 10 years, we should believe you when you say the cloud won't happen. Just like people didn't believe the Wright brothers when they said they were going to fly?

Everyone who bought an Xbox knows it has weaker hardware, but it's people like you who try to downplay every aspect of the box, because you are insecure about the future even though we live in a time when anything is possible. When you do a 180 and say things like "The Xbox One doesn't need 1.84 TF to make great games" and "The Xbox One doesn't need to be a top-of-the-line PC to have great games," it only proves that you have no idea what you are talking about and are starting to second-guess all that "the cloud is a gimmick" stuff you were saying earlier. How many more companies and devs need to jump on board before you realize that you don't know everything?

"We're delivering rendering and processing power from the cloud, allowing game creators to define new ground-breaking online gaming mechanics." -cloudgine.com

One last thing for you to think about; then you can continue letting @evildead6789 make you look stupid.

"Learning and innovation go hand in hand. The arrogance of success is to think that what you did yesterday will be sufficient for tomorrow."

William Pollard

The PS4 can run physics on its CPU or its GPU, sad troll.

CPU Load

  • 60 AI characters
  • 940 entities, 300 active
  • 8200 physics objects (1500 key-framed, 6700 static)
  • 500 particle systems
  • 120 sound voices
  • 110 ray casts
  • 1000 jobs per frame

http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall

Killzone SF post-mortem: the physics being done on the CPU... lol

The average U.S. broadband connection speed was nearly 10 Mbps (9.8 Mbps) in the third quarter of 2013, according to the latest Akamai State of the Internet Report. That’s an increase of 13 percent over the previous quarter when the average connection speed was 8.6 Mbps and a 31 percent increase over the same period a year earlier.

http://www.telecompetitor.com/akamai-average-u-s-broadband-speed-climbs-9-8-mbps/

This news from early January says the average was 9.8 Mbps, far from the 25, let alone the 60 Mbps, you claim.

Nice. Now tell me when they hit 150 GB/s, which is what the Xbox One needs to move complex processes on its GPU. Online connections do not average 60 Mbps, and even if they did, it would mean nothing: GPUs work in gigabytes per second, not megabits per second. Even a 1 Gbps line would still not be enough for complex processes. Even DDR3 runs at 68 GB/s, hundreds of times faster than the fastest online connection out there, and DDR3 is way slower than GDDR5.

Hell, the PS4's bandwidth gets saturated by MSAA x4, and that's 176 GB per freaking second. This is the problem with your argument, dude: you don't understand this, period. GPUs require fast bandwidth, which internet connections lack.
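The unit gap being argued here is easy to check: internet connections are quoted in megabits per second, GPU memory in gigabytes per second. A quick sketch (the 176 GB/s figure is the PS4's published peak; the 1 Gbps line is a hypothetical best case):

```python
def mbps_to_gbs(mbps):
    """Convert megabits per second to gigabytes per second."""
    return mbps / 8 / 1000   # 8 bits per byte, 1000 MB per GB

ps4_mem_gbs = 176.0              # PS4 GDDR5 peak bandwidth in GB/s
link_gbs = mbps_to_gbs(1000)     # a 1 Gbps fiber connection

print(link_gbs)                  # 0.125 GB/s
print(ps4_mem_gbs / link_gbs)    # 1408.0 — over three orders of magnitude apart
```

So even a top-tier fiber link is roughly a thousand times short of local memory bandwidth, which is why latency-sensitive, bandwidth-heavy GPU work can't move over the wire.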

Of course you will see those companies involved; they have something to win there. Hell, who do you think makes the GPUs used for compute in those servers MS has?

hahahaha

The cloud serves for a lot of things; rendering graphics over the internet is not one of them, sadly. It can be used for AI, some physics (as long as the results aren't needed in the same frame), baked lighting, and stuff like that which doesn't need constant refreshing.

The cloud isn't a myth; how MS sold it is what was a myth. Cloud is Siri, cloud is Office 365, cloud is storage, cloud is PS Now. Cloud is many things; online rendering is not one of them.

I don't claim any of those titles; I just know for a FACT that GPUs require ultra-fast bandwidth to process graphics, and I also know that MS and even Sony will lie to hell and back just to make you buy their product. Was the Xbox 360 1 TF like MS claimed in 2005? Was it? Answer me that, because they claimed the Xbox 360 was 1 teraflop in 2005, and then Sony came along and claimed the PS3 was 2 teraflops.

When it sound to good to be true it probably is,have you see MS talking about improving graphics lately over the cloud.?

Microsoft's confusing Xbox One cloud message shifts to dedicated servers

"Xbox Live is the service. Dedicated servers is the benefit."

Microsoft's E3 2014 press conference was notable for its focus on games, but one Xbox One feature, touted when the console was revealed last year as one of its unique selling points, was conspicuous by its absence.

There was not one mention of "the cloud" during Microsoft's 90 minute presentation, either from Xbox boss Phil Spencer or the many developers that took to the stage to talk about their games.

Instead, we heard the term "dedicated servers" over and over again. This after Microsoft spent a great deal of time and effort insisting the power of the cloud would revolutionise gaming as we know it.

http://www.eurogamer.net/articles/2014-06-16-microsofts-confusing-xbox-one-cloud-message-shifts-to-dedicated-servers

Just freaking read... You are stuck with inferior hardware, and you will be unless you go PS4, PC, or wait for the next Xbox, if there is one.

#324 Posted by B4X (3564 posts) -

@tormentos

@b4x said:

@tormentos:

I have never seen a guy white knight a console that will be beaten by tablets in 3-4 years. :P

Go post your insight on Beyond3D. Please do. Use this GT so I can follow your success.

Go tell them how the PS4 is 50% more powerful.

Spread your wealth of knowledge. Why keep it bottled up in here?

Spread your wings. Please keep us updated with your findings.

That's funny; that is what people say here about cell phones and last-gen consoles, yet I haven't seen a single game on iOS or Android that stands even close to Uncharted 3 or TLOU graphically. Not one.

But even if it happens, it's worse for the console you so fiercely defend, because if the PS4 will be beaten by tablets in 3 to 4 years, the Xbox One will be beaten in 2 or 3, and you paid $100 more for that crap than I did for my PS4... lol

Why don't you go there and talk to them about the magic cloud that never stops giving and the magic drivers?

I don't make these claims. You do. Happy posting.

Things I don't talk about. The Cloud, magic drivers, stacked dGpu, stacked ram or the untapped powar!

Tell us all about how the cloud is a mythical creature from Mayan civilization. How can you factually dismiss it as a myth?

Enlighten us with your copy and paste. :p

#325 Edited by evildead6789 (7435 posts) -


@evildead6789 said:

Standalone maybe, yeah, but not what you make it out to be lol.

And the cloud does work, sony is actually proving it with playstation now.

The fu** hahahahaaaaaaaaaaaaaaaaaaaaaaaaa....

Ultra-reconfirmed that you know sh** about this, leave please..

The cloud Sony uses works like OnLive used to: they stream a video of the game you want to play and you control it, which is totally different from the cloud MS talked about.

So what, it's not like they can't import video into a game.

Or whatever. The cloud will do something; enough people say so.

Like your word is good for anything. You don't know how to count or multiply.

You don't even understand what context means.

#326 Posted by Shewgenja (8453 posts) -

This thread is The Giving Tree of butthurt.

#327 Posted by evildead6789 (7435 posts) -

This thread is The Giving Tree of butthurt.

Yeah, those Sony fanboys really can't get over it; their console is (again) only better on paper lol

#328 Edited by Shewgenja (8453 posts) -

@Shewgenja said:

This thread is The Giving Tree of butthurt.

Yeah, those Sony fanboys really can't get over it; their console is (again) only better on paper lol

Whatever helps you sleep at night.

#329 Posted by slimdogmilionar (472 posts) -

The PS4 can run physics on its CPU or GPU, sad troll.

CPU Load

  • 60 AI characters
  • 940 entities, 300 active
  • 8200 physics objects (1500 key-framed, 6700 static)
  • 500 particle systems
  • 120 sound voices
  • 110 ray casts
  • 1000 jobs per frame

http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall

Killzone SF post-mortem: the physics being done on the CPU... lol

The average U.S. broadband connection speed was nearly 10 Mbps (9.8 Mbps) in the third quarter of 2013, according to the latest Akamai State of the Internet Report. That’s an increase of 13 percent over the previous quarter when the average connection speed was 8.6 Mbps and a 31 percent increase over the same period a year earlier.

http://www.telecompetitor.com/akamai-average-u-s-broadband-speed-climbs-9-8-mbps/

This news from early January says the average was 9.8 Mbps, far from the 26 Mbps, let alone the 60 Mbps, you claim.

Nice, now tell me when they hit the 150 GB/s the Xbox One needs to move complex processing on its GPU. Online connections don't average 60 Mbps, and even if they did it would mean nothing: GPUs work in gigabytes per second, not megabits per second, so even a gigabit line would still not be enough for complex processing. Even DDR3 runs at 68 GB/s, more than 500 times faster than a gigabit line, and that is DDR3, which is far slower than GDDR5.

Hell, the PS4's bandwidth gets saturated by 4x MSAA, and that is 176 GB per freaking second. This is the problem with your argument, dude. You don't understand this, period: GPUs require fast bandwidth, which internet connections lack.

Of course you will see those companies involved; they have money riding on it. Who in the hell do you think makes the GPUs for compute use on those servers MS has?

hahahaha

The cloud serves for a lot of things; sadly, rendering graphics over the internet is not one of them. It can be used for AI, for some physics as long as the results are not needed in the same frame, for baked lighting, and for stuff like that which doesn't need constant refreshing.

The cloud isn't a myth; the way MS sold it is what was a myth. Cloud is Siri, cloud is Office 365, cloud is storage, cloud is PS Now. Cloud is many things; online rendering is not one of them.

I don't have any of those titles. I just know for a FACT that GPUs require ultra-fast bandwidth to process graphics, and I also know that MS and even Sony will lie to hell and beyond just to make you buy their product. Was the Xbox 360 1 TF like MS claimed in 2005? Was it? Answer me that, because they claimed the Xbox 360 was 1 teraflop in 2005, and then Sony came along and claimed the PS3 was 2 teraflops.

When it sounds too good to be true, it probably is. Have you seen MS talking about improving graphics over the cloud lately?

Microsoft's confusing Xbox One cloud message shifts to dedicated servers

"Xbox Live is the service. Dedicated servers is the benefit."

Microsoft's E3 2014 press conference was notable for its focus on games, but one Xbox One feature, touted when the console was revealed last year as one of its unique selling points, was conspicuous by its absence.

There was not one mention of "the cloud" during Microsoft's 90 minute presentation, either from Xbox boss Phil Spencer or the many developers that took to the stage to talk about their games.

Instead, we heard the term "dedicated servers" over and over again. This after Microsoft spent a great deal of time and effort insisting the power of the cloud would revolutionise gaming as we know it.

http://www.eurogamer.net/articles/2014-06-16-microsofts-confusing-xbox-one-cloud-message-shifts-to-dedicated-servers

Just freaking read... You are stuck with inferior hardware, and you will be unless you go PS4, PC, or wait for the next Xbox, if there is one.

OK, and the Xbox can't run physics on its CPU? The only difference is that the Xbox is better equipped to communicate with the CPU and GPU at the same time. Remember, Sony's own first party said that the PS4's CPU can't keep up with its GPU, which is my guess as to why they are pushing GPGPU so much. Nobody said heavy lifting on the cloud except Cloudgine; the demo M$ showed, the one you accuse softwaregeek of faking, is on their site. M$ was using their technology to demonstrate what Azure could do with Cloudgine's middleware (have you even browsed the site?). I'm still anxious to see how it works, because I was convinced it would do nothing but help with processes that were not latency-bound. But that doesn't mean someone smarter than me can't figure out how to make it work.

Hell, the PS4's bandwidth gets saturated by 4x MSAA, and that is 176 GB per freaking second. This is the problem with your argument, dude. You don't understand this, period: GPUs require fast bandwidth, which internet connections lack.

I understand that a GPU requires fast bandwidth, but I also understand I'm not the smartest person in the world. Also, claiming a console is 1 TF or 2 TF is not the same thing, because hardware is fixed; developers live to make the impossible possible, so what's not possible today could be possible tomorrow if a dev can write the code. You did see where Cloudgine (not M$) claims to be able to make rendering from the cloud possible, right? You are right that when it sounds too good to be true it usually is, but nothing about the Xbox sounds too good to be true; remember, they've been having a tough time ever since last year's E3. PS4 is the good guy who can do no wrong.

"Of course you will see those companies involve they have winning on those,hell who in hell do you think make the GPU for compute use on those servers MS has.?" WTF does this even mean? First of all, are you implying that Havok, Epic, and Nvidia built the Azure servers? Secondly, they would only collaborate with Cloudgine if they thought it was going to be relevant middleware in the future.

I read the article about M$'s message shifting to dedicated servers, and that's the truth. But like I said, it's not M$ themselves claiming to make rendering from the cloud possible; they only provide the servers. It is up to the devs at Cloudgine and their middleware to make it possible. Having access to unlimited cloud servers is what they need to make it happen, and it just so happens that M$ has those. We will just have to wait and see.

#330 Posted by evildead6789 (7435 posts) -

@slimdogmilionar said:

I read the article about M$'s message shifting to dedicated servers, and that's the truth. But like I said, it's not M$ themselves claiming to make rendering from the cloud possible; they only provide the servers. It is up to the devs at Cloudgine and their middleware to make it possible. Having access to unlimited cloud servers is what they need to make it happen, and it just so happens that M$ has those. We will just have to wait and see.

Sony just sucks at making consoles. Last gen the CPU was complete overkill; this gen the memory bandwidth is overkill.

They just got lucky that Microsoft reserved 10 percent of the available power for Kinect and that the ESRAM tools weren't being used properly.

A game like Wolfenstein shows how it is: virtually no difference. And when the cloud starts, who knows what the X1 will be capable of.

#331 Edited by FastRobby (1246 posts) -
@tormentos said:

When it sounds too good to be true, it probably is. Have you seen MS talking about improving graphics over the cloud lately?

Microsoft's confusing Xbox One cloud message shifts to dedicated servers

"Xbox Live is the service. Dedicated servers is the benefit."

Microsoft's E3 2014 press conference was notable for its focus on games, but one Xbox One feature, touted when the console was revealed last year as one of its unique selling points, was conspicuous by its absence.

There was not one mention of "the cloud" during Microsoft's 90 minute presentation, either from Xbox boss Phil Spencer or the many developers that took to the stage to talk about their games.

Instead, we heard the term "dedicated servers" over and over again. This after Microsoft spent a great deal of time and effort insisting the power of the cloud would revolutionise gaming as we know it.

http://www.eurogamer.net/articles/2014-06-16-microsofts-confusing-xbox-one-cloud-message-shifts-to-dedicated-servers

Just freaking read... You are stuck with inferior hardware, and you will be unless you go PS4, PC, or wait for the next Xbox, if there is one.

Why do you leave out all the rest of the article? You just cut out what you want to hear, which is childish and a way of lying. That same article clearly states:

Crackdown indicates that while Microsoft's messaging behind the cloud has shifted in the short term, it's sticking to its long term vision for how it could benefit games.

Digital Foundry's Richard Leadbetter welcomed Microsoft's more approachable messaging, but said important questions remain.

"The name might have changed, but this is still very much the Azure 'Thunderhead' service, just with a name-change to make it more appealing to the core gamer," Leadbetter said. "While the cloud obviously offers dedicated server functionality, Azure offers the potential for much more and it doesn't look as though Microsoft has changed its plans there.

"We've only seen a hint of what's possible so far beyond multiplayer gaming with Drivatars and Titanfall's grunt AI. The Crackdown prototype is a great example of what the cloud should excel at - offloading complex calculations away from the host console, where the additional 100ms or so latency to and from the datacentre won't unduly impact gameplay.

"The cloud doesn't address graphics bottlenecks, but here it demonstrates how much of a strain simulating destruction of a complex scene can have on the CPU - an area where both PS4 and Xbox One lag behind even mid-range PC processors.

"The question is really how much CPU time Microsoft is willing to dedicate to each game instance. I suspect that the Crackdown prototype uses an order of magnitude more CPU power than, say, the grunt AI in Titanfall. It'll be an interesting stress test of the Azure infrastructure to see if it can hold its own in a game likely to break the million-sales barrier in short order."
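Leadbetter's ~100 ms round-trip figure is easy to put in frame terms. This is simple illustrative arithmetic, not from the article:

```python
import math

# How many rendered frames pass while waiting on a round trip
# to the datacentre?

def frames_of_latency(round_trip_ms: float, fps: int) -> int:
    """Frames elapsed (rounded up) during one round trip at a given fps."""
    return math.ceil(round_trip_ms / (1000 / fps))

print(frames_of_latency(100, 30))  # 3 frames at 30 fps
print(frames_of_latency(100, 60))  # 6 frames at 60 fps
```

Which is why the examples that work (Drivatars, grunt AI, background destruction) are all things that can tolerate arriving a few frames late.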

Just freaking read...

Also don't talk about what the cloud is able to do:

The cloud serves for a lot of things; sadly, rendering graphics over the internet is not one of them. It can be used for AI, for some physics as long as the results are not needed in the same frame, for baked lighting, and for stuff like that which doesn't need constant refreshing.

Completely untrue. Not even in the slightest. Do you even know basic AI? You have never given any credentials; a lot of other people have. You've got software engineers telling you you are wrong.

#332 Posted by GrenadeLauncher (4060 posts) -

You are missing the bigger picture - having to use DF in order to spot the difference means that the differences are minor. If you tune your car's engine and then have to hook it up to a dynamometer to see if anything changed, you didn't do much.

Another poor analogy from Joey.

@ccagracing:

"software geek took my comments out of context. I don't think that if a game uses the cloud it will suddenly gain 10fps or anything like it"

Did you watch the video? It clearly shows a 28+ fps boost. I'm merely pointing out that the cloud is real, it's there, it's already working. Those who think it's a marketing gimmick are free to believe that, but their beliefs are misplaced. Crackdown will make extensive use of it, and if you've watched the trailer you know that is one sweet-looking game.

Capcom is remaking the original Resident Evil 1. That's one of my all-time faves.

Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff.

Capcom isn't remaking RE1. It's a port.

#333 Posted by StormyJoe (4877 posts) -

@StormyJoe said:

You are missing the bigger picture - having to use DF in order to spot the difference means that the differences are minor. If you tune your car's engine and then have to hook it up to a dynamometer to see if anything changed, you didn't do much.

Another poor analogy from Joey.

@softwaregeek said:

@ccagracing:

"software geek took my comments out of context. I don't think that if a game uses the cloud it will suddenly gain 10fps or anything like it"

Did you watch the video? It clearly shows a 28+ fps boost. I'm merely pointing out that the cloud is real, it's there, it's already working. Those who think it's a marketing gimmick are free to believe that, but their beliefs are misplaced. Crackdown will make extensive use of it, and if you've watched the trailer you know that is one sweet-looking game.

Capcom is remaking the original Resident Evil 1. That's one of my all-time faves.

Can't wait to laugh at the lag outside of that controlled environment and the fact any console or modestly-priced PC will be able to render the same stuff.

Capcom isn't remaking RE1. It's a port.

Nope. A perfect one. You are saying that because you have no response - we both know I am right.

#334 Posted by Heil68 (43436 posts) -

SONY crushes the Xbone and is declared the world's most powerful video game console. A truly modern engineering marvel.

#335 Posted by GrenadeLauncher (4060 posts) -

Nope. A perfect one. You are saying that because you have no response - we both know I am right.

You remind me of those louts who insist you can't tell the difference between 1080p and 720p.

Crawl back to your gutter, Joey. Maybe next gen you'll have a better go of it, if MS don't make another shit console.

#336 Posted by SecretPolice (21467 posts) -

DX12 + Supah SecretSqirrel eSram GPU pipelines / move engines + Cloud = Most sophisticated powahfull gaming tech that'll just keep growing in server cloud tech powah over the lifetime of the system which should be like 10 - 20 years. :o Just sayin. :P

#337 Posted by clyde46 (44851 posts) -

@StormyJoe said:

Nope. A perfect one. You are saying that because you have no response - we both know I am right.

You remind me of those louts who insist you can't tell the difference between 1080p and 720p.

Crawl back to your gutter, Joey. Best next gen you'll have a better go of it if MS don't make another shit console.

Just like the consolites last gen who insisted there was no difference between 720p and 1080p?

#338 Posted by GrenadeLauncher (4060 posts) -

@clyde46 said:

Just like the consolites last gen who insisted there was no difference between 720p and 1080p?

Considering most games last gen were barely 720p I don't know how you came to that conclusion.

#339 Posted by clyde46 (44851 posts) -

@clyde46 said:

Just like the consolites last gen who insisted there was no difference between 720p and 1080p?

Considering most games last gen were barely 720p I don't know how you came to that conclusion.

Surely you remember ShadowMoses right?

#340 Posted by StormyJoe (4877 posts) -

@clyde46 said:

@GrenadeLauncher said:

@StormyJoe said:

Nope. A perfect one. You are saying that because you have no response - we both know I am right.

You remind me of those louts who insist you can't tell the difference between 1080p and 720p.

Crawl back to your gutter, Joey. Best next gen you'll have a better go of it if MS don't make another shit console.

Just like the consolites last gen who insisted there was no difference between 720p and 1080p?

I really couldn't tell last gen, either. I guess if I paused the game and stared at it for a while, with a second TV hooked up showing the 1080p version, after a little while I could tell. But really, is it worth it?

CoD:Ghosts plays just as well on XB1 as PS4. I don't see this being a big issue.

@GrenadeLauncher and the rest of the cows want it to be a big issue, because they are moo-cow fanboys. But wanting something to be an issue, and it actually being one, are often not the same - like in this case.

#341 Posted by GrenadeLauncher (4060 posts) -

@clyde46 said:

Surely you remember ShadowMoses right?

MGS4? That was 768p. And it still looks better than most Xbone games. Powwa of da cell etc etc.

I really couldn't tell last gen, either. I guess if I paused the game and stared at it for a while, with a second TV hooked up showing the 1080p version, after a little while I could tell. But really, is it worth it?

CoD:Ghosts plays just as well on XB1 as PS4. I don't see this being a big issue.

@GrenadeLauncher and the rest of the cows want it to be a big issue, because they are moo-cow fanboys. But wanting something to be an issue, and it actually being one, are often not the same - like in this case.

I should hope you couldn't; the only 1080p games last gen were about 12 digital titles.

Joey living in denial about the problems the Xbone's gimped hardware is giving it. sogood.gif

#342 Posted by clone01 (24500 posts) -

@clyde46 said:

@GrenadeLauncher said:

@clyde46 said:

Just like the consolites last gen who insisted there was no difference between 720p and 1080p?

Considering most games last gen were barely 720p I don't know how you came to that conclusion.

Surely you remember ShadowMoses right?

Oh, man, that guy was a headcase.

#343 Posted by clyde46 (44851 posts) -

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

#344 Posted by GrenadeLauncher (4060 posts) -

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

Good luck with that, I can barely remember what I did last week. I'll look him up.

#345 Posted by clyde46 (44851 posts) -

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

Good luck with that, I can barely remember what I did last week. I'll look him up.

Look for his FC3 thread.

#346 Edited by clone01 (24500 posts) -

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

I'll leave this for the lulz

#347 Posted by clyde46 (44851 posts) -

@clone01 said:

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

I'll leave this for the lulz

He was so mad. Do you remember when he got found out using the same FC3 screenshot for all 3 platforms?

#348 Posted by clone01 (24500 posts) -

@clyde46 said:

@clone01 said:

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

I'll leave this for the lulz

He was so mad. Do you remember when he got found out using the same FC3 screenshot for all 3 platforms?

Lol! No, but I imagine the meltdown was fantastic. Honestly, I know most of the people here are just trolling, but he's one of the few that I think really believed the stupid sh*t he said.

#349 Posted by clyde46 (44851 posts) -

@clone01 said:

@clyde46 said:

@clone01 said:

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

I'll leave this for the lulz

He was so mad. Do you remember when he got found out using the same FC3 screenshot for all 3 platforms?

Lol! No, but I imagine the meltdown was fantastic. Honestly, I know most of the people here are just trolling, but he's one of the few that I think really believed the stupid sh*t he said.

Many a time we and others argued with him. I think some user clocked that the URLs for the images were the same. His meltdowns make El Tomato's look tame.

#350 Posted by Shewgenja (8453 posts) -

@clone01 said:

@clyde46 said:

@GrenadeLauncher: Not the level.... There was a user on here last year or so that created some ruckus... Surely you remember him?

I'll leave this for the lulz

Wow.. That's seriously unglued.