Watch_Dogs | 900p on PS4, 792p on One; 30fps on Both

#701 Edited by -Unreal- (24544 posts) -

Last year, just before the new consoles came out, I had a choice for my "next gen" gaming.

Buy an Xbox One or a PlayStation 4.

Or buy a graphics card for my work PC.

I bought the graphics card. It cost me less money than either console, is more powerful, and came with games. Now if I had gone with the PS4, for example, I'd have to suffer Watch Dogs at a laughable 900p and 30 FPS at low-medium settings, and it would have cost me more to do it. On top of all that, I'd get no mods, no keyboard and mouse option, none of the advanced graphics, no community-made shader/post-process/texture mods, online play would cost me even more money, and the game is more expensive to buy.

#702 Posted by MK-Professor (3827 posts) -

@tormentos said:

Dude, it's not $250, and most sites are not selling it for that price.

I was just illustrating what the difference amounts to on PC.

So the difference between the 7850 and the 7870 is 10 FPS, and the GPU that you claim has 2X performance does 17 FPS more than the 7850... The PS4 is a little stronger than the 7850.

So much money thrown away just to get 17 frames more... lol

Hell, at 2560x1600 the difference is even less, 10 FPS... lol

It is $250 and sometimes much less. Again, why are you comparing the HD7850 vs the HD7870? I thought we were talking about the R9 280 (7950).

Also, I said the HD7950 (OC'd to 1150-1200MHz) performs 2X faster than the HD7850, and the benchmark you posted shows exactly that at both resolutions. Note that the HD7950 (OC'd to 1150-1200MHz) performs slightly above the HD7970 GHz.

So it looks like this:

1920x1200:

  • HD7950 (at 1200MHz) should get around 80fps (slightly above the HD7970 GHz)
  • HD7850 (stock): 42fps

2560x1600:

  • HD7950 (at 1200MHz) should get around 50fps
  • HD7850 (stock): 26fps

2X the performance.

#703 Edited by MK-Professor (3827 posts) -

@scottpsfan14 said:

Whatever, man. Listen, I am not a console fanboy; I have just heard it from developers in real life. I have heard it from programming lecturers at college, and from developers on the internet. My dad also used to program and is very tech savvy, and he confirms this. I have also picked up a few tricks. I just want to know where you get your statement that DirectX gets the same GPU performance as consoles no matter how low-level the code is.

You say that custom access to the GPU does not matter. Then why does AMD state that Mantle (a low-level API) "Enables higher graphics performance with direct access to all GPU features"? Are they lying? Are countless developers, including John Carmack and Oles Shishkovstov (developer of Metro 2033/Last Light), wrong when they say low-level APIs create massive performance increases? Also, you should note that most multiplat developers don't spend their coding budget on one platform, which is why a lot of their games perform similarly on PS4 to an equivalent PC GPU.

http://www.eurogamer.net/articles/digitalfoundry-inside-metro-last-light

Digital Foundry: "Do you think that the relatively low-power CPUs in the next-gen consoles (compared to PC, at least) will see a more concerted push to getting more out of GPU Compute?"

Oles Shishkovstov: "You just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware."

Also, here is another example of the benefits of lower-level APIs, in this case the PS Vita compared to smartphones that use OpenGL.

http://techcrunch.com/2011/01/28/john-carmack-thanks-to-low-level-apis-sony-ngp-should-be-a-%E2%80%98generation%E2%80%99-ahead-of-smartphones-at-launch/

“Low level APIs will allow the Sony NGP to perform about a generation beyond smart phones with comparable specs.” - John Carmack.

If you look at most smartphone games even now, there isn't one game as good-looking as Uncharted: Golden Abyss, even though phones now have 2-3 times the GPU power. This is because developers were able to focus on the Vita alone and use low-level instructions to get the most out of the hardware it had. Smartphone games run on OpenGL ES, which is basically a compact OpenGL for portable devices. This has about as much hardware utilization as DirectX.

You say low-level coding is a waste of time. Is it really? Care to explain why? Properly this time?

Mantle does not perform better than DirectX across the board; benchmarks show exactly that. What Mantle does better than DirectX is reduce the time draw calls take by a significant amount compared to DX11. As a result, in a CPU-bottlenecked scenario Mantle performs better; when a CPU bottleneck is not present (the majority of the time, assuming you have a decent CPU and not a crappy low-end one), Mantle does not perform even 1% faster.
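
To picture why this only shows up when the CPU is the limit, here is a toy model (the numbers are completely made up for illustration; it is not a benchmark of anything):

    # Toy model: a frame is done when both the CPU submission work and the GPU
    # rendering work are done, so frame time is roughly max(cpu_time, gpu_time).
    def frame_ms(draw_calls, cpu_us_per_call, gpu_ms):
        cpu_ms = draw_calls * cpu_us_per_call / 1000.0
        return max(cpu_ms, gpu_ms)

    draw_calls = 5000
    gpu_ms = 16.0  # pretend the GPU needs 16 ms per frame in this scenario

    # "DX11-like" vs "Mantle-like" per-draw-call CPU cost (illustrative numbers only)
    for label, cpu_us in [("high overhead", 5.0), ("low overhead", 1.0)]:
        fast_cpu = frame_ms(draw_calls, cpu_us, gpu_ms)      # decent CPU
        slow_cpu = frame_ms(draw_calls, cpu_us * 4, gpu_ms)  # weak CPU (4x slower)
        print(f"{label}: decent CPU {fast_cpu:.1f} ms, weak CPU {slow_cpu:.1f} ms")

    # high overhead: decent CPU 25.0 ms, weak CPU 100.0 ms
    # low overhead:  decent CPU 16.0 ms, weak CPU 20.0 ms

Once the GPU side is the limit (16 ms in this made-up case), cutting the per-draw-call cost further buys you nothing, which is exactly the point about decent CPUs above.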

#704 Posted by GTSaiyanjin2 (6011 posts) -

I very much doubt this will look as good as the E3 2012 demo, even in its stock version on the PC. Looking forward to seeing how it looks on all systems.

#705 Edited by The_Stand_In (395 posts) -

@kipsta77 said:

Those are clearly different iterations of the game. The environment is different (look at the buildings, signs, overhangs, tables, umbrellas, etc.) and the UI is very different. So I don't really see the point in comparing the two either way. Besides, both look like they are shots from a YouTube video, just with months, if not a year or more, difference in development time.

#706 Edited by The_Stand_In (395 posts) -

Double post. Sorry.

#707 Edited by Sollet (7394 posts) -

@silversix_ said:

Imagine what will happen to The Division compared to its 2013 gameplay video, lol. My god, the difference there will be. o.O

lol any console fanboy who thinks The Division will look like that is in for a rude awakening.

#708 Posted by tdkmillsy (1399 posts) -

@tormentos:

Silly fanboy; that's funny coming from you.

Nobody needs to justify their purchase; everyone has the right to buy what they want. But let's just look at the scenario for a moment. What could you get on the PS4 at launch that 4-10 year olds could play? Now look at what you could get on the Xbox One at launch. Look at what my kids play now: Just Dance, Zoo Tycoon, Golf, Lego games, Forza, Kinect Sports and that other one I can't remember. My nephew (4 years old) can turn on the Xbox One and it shows his stuff on screen, and more importantly he doesn't see zombies and come running to me scared; he can say "Xbox, play something" or click on an icon and bang, he's in the game. The kids are up on their feet and not sat down with a controller. I bought it because it can do more for the family than a PS4 or PC, but that will go straight over your head.

I have a laptop that can play most games well enough. I will have a better laptop or PC later this year to cover the new stuff.

You're losing the whole argument of PC vs PS4, so you have to bring this into it. You talk about MS and empty promises, yet Sony does it just as badly. I have always said that if I could get Kinect, Kinect games and the overall experience on PC, I'd have got a PC instead of an Xbox One.

I wonder, if you ever get a PS4, will you actually play it or build a shrine so you can pray to your god?

#709 Posted by tormentos (18325 posts) -

@MK-Professor said:

It is $250 and sometimes much less. Again, why are you comparing the HD7850 vs the HD7870? I thought we were talking about the R9 280 (7950).

Also, I said the HD7950 (OC'd to 1150-1200MHz) performs 2X faster than the HD7850, and the benchmark you posted shows exactly that at both resolutions. Note that the HD7950 (OC'd to 1150-1200MHz) performs slightly above the HD7970 GHz.

So it looks like this:

1920x1200:

  • HD7950 (at 1200MHz) should get around 80fps (slightly above the HD7970 GHz)
  • HD7850 (stock): 42fps

2560x1600:

  • HD7950 (at 1200MHz) should get around 50fps
  • HD7850 (stock): 26fps

2X the performance.

Not even quoting the screen, you admit it. Look at the 7970: it gets 70FPS, the 7850 gets 42. Even compared to the 7970 it only does 28FPS more, not 2X, dude... 2X of 42FPS is 84FPS...

@tdkmillsy said:

@tormentos:

Silly fanboy; that's funny coming from you.

Nobody needs to justify their purchase; everyone has the right to buy what they want. But let's just look at the scenario for a moment. What could you get on the PS4 at launch that 4-10 year olds could play? Now look at what you could get on the Xbox One at launch. Look at what my kids play now: Just Dance, Zoo Tycoon, Golf, Lego games, Forza, Kinect Sports and that other one I can't remember. My nephew (4 years old) can turn on the Xbox One and it shows his stuff on screen, and more importantly he doesn't see zombies and come running to me scared; he can say "Xbox, play something" or click on an icon and bang, he's in the game. The kids are up on their feet and not sat down with a controller. I bought it because it can do more for the family than a PS4 or PC, but that will go straight over your head.

I have a laptop that can play most games well enough. I will have a better laptop or PC later this year to cover the new stuff.

You're losing the whole argument of PC vs PS4, so you have to bring this into it. You talk about MS and empty promises, yet Sony does it just as badly. I have always said that if I could get Kinect, Kinect games and the overall experience on PC, I'd have got a PC instead of an Xbox One.

I wonder, if you ever get a PS4, will you actually play it or build a shrine so you can pray to your god?

Exactly, so you trying to use the PC against the PS4 is a stupid argument, even more so when for decades it has been proven that both are different markets that coexist without one killing the other.

Knack, and believe me, most 4 to 10 year olds would not care for sh** about scores.

Please, dude, stop. You bought an Xbox One because that is what you like, and that's OK, but flaming the PS4 for not having games when the Xbox One has fewer is a joke.

So, no specs? Hahaha..............

How am I losing my argument? I already showed that you can't build a PC for the same price as the PS4 or cheaper that performs as well or better, I already showed how on consoles you can save too, and I also showed your double standard when you talk about the PS4 but ignore that your Xbox costs more, is weaker, has fewer games and has lower-scored ones. Hell, if it wasn't for Titanfall, what would you be playing now?

#710 Posted by scottpsfan14 (5569 posts) -
@MK-Professor said:

Mantle does not perform better than DirectX across the board; benchmarks show exactly that. What Mantle does better than DirectX is reduce the time draw calls take by a significant amount compared to DX11. As a result, in a CPU-bottlenecked scenario Mantle performs better; when a CPU bottleneck is not present (the majority of the time, assuming you have a decent CPU and not a crappy low-end one), Mantle does not perform even 1% faster.

So are there no benefits to custom low-level access to hardware? Then why do these features even exist? CPU overhead is just one problem of DirectX. It also gives you nowhere near the hardware access of low-level APIs, including Mantle. So you are not even excited about the possibilities of DirectX 12 and Mantle? You are expecting the same performance as DX11, given that people use high-end CPUs?

#711 Edited by MK-Professor (3827 posts) -

@tormentos said:

@MK-Professor said:

It is $250 and sometimes much less. Again, why are you comparing the HD7850 vs the HD7870? I thought we were talking about the R9 280 (7950).

Also, I said the HD7950 (OC'd to 1150-1200MHz) performs 2X faster than the HD7850, and the benchmark you posted shows exactly that at both resolutions. Note that the HD7950 (OC'd to 1150-1200MHz) performs slightly above the HD7970 GHz.

So it looks like this:

1920x1200:

  • HD7950 (at 1200MHz) should get around 80fps (slightly above the HD7970 GHz)
  • HD7850 (stock): 42fps

2560x1600:

  • HD7950 (at 1200MHz) should get around 50fps
  • HD7850 (stock): 26fps

2X the performance.

Not even quoting the screen, you admit it. Look at the 7970: it gets 70FPS, the 7850 gets 42. Even compared to the 7970 it only does 28FPS more, not 2X, dude... 2X of 42FPS is 84FPS...

WTF, man, did I say 7970 or 7970GHz?

From the benchmark you posted earlier:

  • 1920x1200 - 42fps vs 78fps, HD7850 vs HD7970GHz
  • 2560x1600 - 26fps vs 49fps, HD7850 vs HD7970GHz

Since the HD7950 (at 1200MHz) is slightly faster than the HD7970GHz, it looks like this:

  • 1920x1200 - 42fps vs 80fps, HD7850 vs HD7950 (at 1200MHz)
  • 2560x1600 - 26fps vs 50fps, HD7850 vs HD7950 (at 1200MHz)

Double the performance.

#712 Posted by MK-Professor (3827 posts) -

@scottpsfan14 said:
@MK-Professor said:

Mantle does not perform better than DirectX across the board; benchmarks show exactly that. What Mantle does better than DirectX is reduce the time draw calls take by a significant amount compared to DX11. As a result, in a CPU-bottlenecked scenario Mantle performs better; when a CPU bottleneck is not present (the majority of the time, assuming you have a decent CPU and not a crappy low-end one), Mantle does not perform even 1% faster.

So are there no benefits to custom low-level access to hardware? Then why do these features even exist? CPU overhead is just one problem of DirectX. It also gives you nowhere near the hardware access of low-level APIs, including Mantle. So you are not even excited about the possibilities of DirectX 12 and Mantle? You are expecting the same performance as DX11, given that people use high-end CPUs?

Aside from CPU performance, which will improve, DirectX 12 and Mantle will not bring any other performance advantage over DX11. Sure, DX12 will bring a few features here and there, but nothing important.

#713 Edited by MiiiiV (501 posts) -

@tormentos said:
@MK-Professor said:

@tormentos said:

@MK-Professor said:

http://www.anandtech.com/show/8019/amd-cuts-radeon-r9-280-to-249

For example, a week ago Newegg was selling R9 280s for $199.

Also, do you realize that this GPU is more than 2 times more powerful than the PS4 (4.3 TFLOPS if you OC it properly)? Of course a complete PC with that GPU will be more expensive than a PS4, but only slightly, not to mention that in the long run the PS4 will be way more expensive.

http://www.anandtech.com/show/7981/best-video-cards-april-2014

The SRP was $299 on April 30 according to AnandTech as well...

Once again, a freaking one-time deal doesn't equal the normal price...

I can buy a PS4 at any time for $399; that card you may find for $250 or even lower with a deal, and in fact those deals are always based around rebates, which have a time limit.

Once again, I know what the whole TF sh** amounts to: the 7870 is 1TF over the 7850, yet it only does 10 to 13 FPS more at the same quality settings, sometimes even less.

All I said is that the official price for the R9 280 is $250, not $300 or $400 as you said, and if you want you can find it at a lower price than $250. What's up with the 7870 vs 7850? I thought we were talking about the R9 280 (7950).

Also, take this as advice: when you are comparing performance, you never say "it performs X fps more", because that doesn't make sense (10fps more on top of 5fps or 10fps more on top of 100fps, see?); you always say "it performs X% more", or something like that.

Dude, it's not $250, and most sites are not selling it for that price.

I was just illustrating what the difference amounts to on PC.

So the difference between the 7850 and the 7870 is 10 FPS, and the GPU that you claim has 2X performance does 17 FPS more than the 7850... The PS4 is a little stronger than the 7850.

So much money thrown away just to get 17 frames more... lol

Hell, at 2560x1600 the difference is even less, 10 FPS... lol

In GPU-demanding situations, the HD 7970 GHz Edition often performs 80-100% better than the HD 7850, like when you look at minimum frame rates at high resolutions with high settings in demanding games. And I think MK-Professor is talking about an OC'd R9 280 (HD 7950), which can easily perform as well as an HD 7970 GHz Edition, so I definitely wouldn't call it "money thrown away".

#714 Posted by donalbane (16362 posts) -

@TLHBO said:

rofl. Can't believe people actually paid £429 for a 720-792P console. It's ridiculous.

When you want to play everything, you get all the consoles. If graphics fidelity were the only barometer for game quality, there would be no reason to play anything other than PC games. Never underestimate the power of exclusives.

#715 Posted by blackace (20874 posts) -

@tormentos said:

@GoldenElementXL said:

@tormentos: when did I lose an argument with you? When has anyone?

Yeah you never do....lol

What will happen when I get a PS4? Will my arguments be valid?

I think you actually need a brain to win an argument.

http://74.220.219.105/~johnrei1/wp-content/uploads/2014/01/scarecrow.jpg?w=600

#716 Posted by fierro316 (1670 posts) -

Let's hope there's a resolution patch in the works.

#717 Posted by zeeshanhaider (2603 posts) -

@scottpsfan14 said:
@MK-Professor said:

Developers always lie like that to cheer up console fanboys: "GNM will transform the PS4 into a supercomputer" and things like that. The draw call performance, as I said, is the only overhead, and it only affects the CPU, and DX11 was a big improvement in that area over DX9, so it is not the issue now that it was before. What do you mean, show some examples of multiplats? All multiplats this gen and the previous gen perform the same on PC and on a console with a similar GPU. The PS4's GPU can't perform any better than an equivalent PC GPU; that is a fact. It is simple: games (this gen and previous gen) speak for themselves, no need to say anything else.

Whatever, man. Listen, I am not a console fanboy; I have just heard it from developers in real life. I have heard it from programming lecturers at college, and from developers on the internet. My dad also used to program and is very tech savvy, and he confirms this. I have also picked up a few tricks. I just want to know where you get your statement that DirectX gets the same GPU performance as consoles no matter how low-level the code is.

You say that custom access to the GPU does not matter. Then why does AMD state that Mantle (a low-level API) "Enables higher graphics performance with direct access to all GPU features"? Are they lying? Are countless developers, including John Carmack and Oles Shishkovstov (developer of Metro 2033/Last Light), wrong when they say low-level APIs create massive performance increases? Also, you should note that most multiplat developers don't spend their coding budget on one platform, which is why a lot of their games perform similarly on PS4 to an equivalent PC GPU.

http://www.eurogamer.net/articles/digitalfoundry-inside-metro-last-light

Digital Foundry: "Do you think that the relatively low-power CPUs in the next-gen consoles (compared to PC, at least) will see a more concerted push to getting more out of GPU Compute?"

Oles Shishkovstov: "You just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware."

Also, here is another example of the benefits of lower-level APIs, in this case the PS Vita compared to smartphones that use OpenGL.

http://techcrunch.com/2011/01/28/john-carmack-thanks-to-low-level-apis-sony-ngp-should-be-a-%E2%80%98generation%E2%80%99-ahead-of-smartphones-at-launch/

“Low level APIs will allow the Sony NGP to perform about a generation beyond smart phones with comparable specs.” - John Carmack.

If you look at most smartphone games even now, there isn't one game as good-looking as Uncharted: Golden Abyss, even though phones now have 2-3 times the GPU power. This is because developers were able to focus on the Vita alone and use low-level instructions to get the most out of the hardware it had. Smartphone games run on OpenGL ES, which is basically a compact OpenGL for portable devices. This has about as much hardware utilization as DirectX.

You say low-level coding is a waste of time. Is it really? Care to explain why? Properly this time?

Yup, that whole interview again confirmed that the API + driver overhead is always on the CPU side. Any modern CPU will be many times more powerful than the tablet Jaguar CPU in either console, so there goes your 2X argument out of the window, leaving the GPU alone. No matter what developers do, they won't make the PS4 perform beyond what a 570 could do on PC. Even Capcom considered the 570 to be in the same ballpark as the PS4, which is why they used it as the target spec until they got a PS4 dev kit to design Deep Down.

Ask people with AMD GPUs: having Mantle improves performance if you are using a low-end CPU, but it won't make much difference with a high-end CPU.

#718 Edited by Evo_nine (1741 posts) -

@tormentos said:

@MK-Professor said:

http://www.anandtech.com/show/8019/amd-cuts-radeon-r9-280-to-249

For example, a week ago Newegg was selling R9 280s for $199.

Also, do you realize that this GPU is more than 2 times more powerful than the PS4 (4.3 TFLOPS if you OC it properly)? Of course a complete PC with that GPU will be more expensive than a PS4, but only slightly, not to mention that in the long run the PS4 will be way more expensive.

http://www.anandtech.com/show/7981/best-video-cards-april-2014

The SRP was $299 on April 30 according to AnandTech as well...

Once again, a freaking one-time deal doesn't equal the normal price...

I can buy a PS4 at any time for $399; that card you may find for $250 or even lower with a deal, and in fact those deals are always based around rebates, which have a time limit.

Once again, I know what the whole TF sh** amounts to: the 7870 is 1TF over the 7850, yet it only does 10 to 13 FPS more at the same quality settings, sometimes even less.

The beauty of building a PC is that you don't have to limit yourself to $400.

Yeah, the PS4 is a good deal for people who are poor, but it's still weak as piss and you are stuck at 30fps... and now 900p... for an entire generation.

#719 Posted by lhughey (4240 posts) -

I'm too lazy to read all of this thread, but are we really comparing the strength of gaming PCs to consoles AGAIN? It seems like this argument would be stale by now.

#720 Edited by tormentos (18325 posts) -

@MK-Professor said:

WTF, man, did I say 7970 or 7970GHz?

From the benchmark you posted earlier:

  • 1920x1200 - 42fps vs 78fps, HD7850 vs HD7970GHz
  • 2560x1600 - 26fps vs 49fps, HD7850 vs HD7970GHz

Since the HD7950 (at 1200MHz) is slightly faster than the HD7970GHz, it looks like this:

  • 1920x1200 - 42fps vs 80fps, HD7850 vs HD7950 (at 1200MHz)
  • 2560x1600 - 26fps vs 50fps, HD7850 vs HD7950 (at 1200MHz)

Double the performance.

What's more demanding than Crysis 3?

Hey, look: 19 FPS vs 31 FPS. How is 31FPS 100% more than, or double the performance of, 19?

22FPS vs 13 FPS; again, double 13 is 26, not 22...

So more than double the price, but not more than double the performance?

But hey, look: the 7950 Boost, not the original one, does just 6 frames more, a little less than 50% more frames.

Which is funny, because these are averages, so even the 7970GHz will dip into unplayable territory from time to time in this game, and the 7950 and 7850 will be unplayable.

@blackace said:

I think you actually need a brain to win an argument.

Probably the reason why you never win anything here, butbutbut you don't own any next-gen console... hahahaahaa

You don't even know the consoles you supposedly own... hahaha. Go ahead and buy FF14 and pay for PSN+, because you need it to play MMOs on PS4, right? Hahahaaaaaaaaaaaaaaaaaa

@zeeshanhaider said:

Yup, that whole interview again confirmed that the API + driver overhead is always on the CPU side. Any modern CPU will be many times more powerful than the tablet Jaguar CPU in either console, so there goes your 2X argument out of the window, leaving the GPU alone. No matter what developers do, they won't make the PS4 perform beyond what a 570 could do on PC. Even Capcom considered the 570 to be in the same ballpark as the PS4, which is why they used it as the target spec until they got a PS4 dev kit to design Deep Down.

Ask people with AMD GPUs: having Mantle improves performance if you are using a low-end CPU, but it won't make much difference with a high-end CPU.

lol, still holding on to your stupid tablet argument even after being proven wrong 10 times... hahaha...

Look how a Core i7 can't even double the PS4's CPU, yet the PS4's CPU has more than 4 times the performance of a Tegra 4, and even more over an iPhone 5, which uses the same CPU as the iPad 3...

So basically the gap between the PS4's CPU and tablets is actually bigger than the one between the PS4 and a Core i7. lol

You are one of those morons who think that because you see a 2GHz quad core in a tablet and the CPU in the PS4 is 1.6GHz, somehow that is a win for the tablet. My god... lol

@Evo_nine said:

The beauty of building a PC is that you don't have to limit yourself to $400.

Yeah, the PS4 is a good deal for people who are poor, but it's still weak as piss and you are stuck at 30fps... and now 900p... for an entire generation.

The problem with your argument is that most PC gamers don't actually go for the super ultra powerful stuff; in fact, many go for a 560 Ti or 7790, things like that. Hell, according to Steam the most used GPU is from Intel and is integrated... lol

Like 11 million PC gamers have stronger hardware than the PS4; the huge majority are below it, so yeah, buying a PS4 for $399 is a great deal compared to what that money will get you on PC, which doesn't even amount to 7770 performance.

Wow, so now the PS4 is for poor people? Hahaha. Hell, you should not even be here; you are a die-hard lemming who defends a $500 overpriced weak-sauce console. With $500 I can actually buy a PC stronger than the Xbox One... lol

If 720p doesn't bother you, why should 900p bother me... lol

#721 Edited by Kinthalis (5322 posts) -

@donalbane said:

@TLHBO said:

rofl. Can't believe people actually paid £429 for a 720-792P console. It's ridiculous.

When you want to play everything, you get all the consoles. If graphics fidelity were the only barometer for game quality, there would be no reason to play anything other than PC games. Never underestimate the power of exclusives.

You're going to pay for all those consoles, plus pay multiple multiplayer fees, plus pay extra for games, for what? The 2 or 3 exclusive titles that come out per year?

LOL!

PC's exclusives dwarf any single console's.

#722 Posted by zeeshanhaider (2603 posts) -

@tormentos said:

@zeeshanhaider said:

Yup, that whole interview again confirmed that the API + driver overhead is always on the CPU side. Any modern CPU will be many times more powerful than the tablet Jaguar CPU in either console, so there goes your 2X argument out of the window, leaving the GPU alone. No matter what developers do, they won't make the PS4 perform beyond what a 570 could do on PC. Even Capcom considered the 570 to be in the same ballpark as the PS4, which is why they used it as the target spec until they got a PS4 dev kit to design Deep Down.

Ask people with AMD GPUs: having Mantle improves performance if you are using a low-end CPU, but it won't make much difference with a high-end CPU.

lol, still holding on to your stupid tablet argument even after being proven wrong 10 times... hahaha...

Look how a Core i7 can't even double the PS4's CPU, yet the PS4's CPU has more than 4 times the performance of a Tegra 4, and even more over an iPhone 5, which uses the same CPU as the iPad 3...

So basically the gap between the PS4's CPU and tablets is actually bigger than the one between the PS4 and a Core i7. lol

You are one of those morons who think that because you see a 2GHz quad core in a tablet and the CPU in the PS4 is 1.6GHz, somehow that is a win for the tablet. My god... lol

I would have probably used some nice F-word to address you, but since you are 40 and much older than me... sigh... Respected sir, as far as I remember I have debunked your chart numerous times, so many times that you have now stopped citing its source. Pretty sure it's from an engine nobody has heard of and no game has ever used. Or do you want me to make a chart in Excel where I put even the Tegra ahead of the 900pStation/720pBox? Would you deem it credible then? I guess I know the answer.........HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

I'm truly sorry, sir, that Sony and MS both fucked their fan bases real hard this time, with Sony choosing a cheap-ass tablet CPU (which even DF mentioned in the article, do you want me to quote it, good sir?) and a GPU barely matching a GTX 570 from 2010, turning the PS4 into a 900pStation for now, with a very good chance of it degrading into a 720pStation in the future. I'm afraid we would be lucky to see any game on the 900pStation even matching Crysis 2 from 2011, considering its pathetic specs...........HAHAHAHAHAHAHAHAHAHAHAHAHAHA

#723 Edited by donalbane (16362 posts) -

@Kinthalis said:

@donalbane said:

@TLHBO said:

rofl. Can't believe people actually paid £429 for a 720-792P console. It's ridiculous.

When you want to play everything, you get all the consoles. If graphics fidelity were the only barometer for game quality, there would be no reason to play anything other than PC games. Never underestimate the power of exclusives.

You're going to pay for all those consoles, plus pay multiple multiplayer fees, plus pay extra for games, for what? The 2 or 3 exclusive titles that come out per year?

LOL!

PC's exclusives dwarf any single console's.

Good thing I have a GTX 780. My point stands... having the ability to play anything will always be the best option. The console fees are simply the price of admission for this hobby.

#724 Edited by 04dcarraher (19666 posts) -

@tormentos said:


@zeeshanhaider said:

Yup, that whole interview again confirmed that the API + driver overhead is always on the CPU side. Any modern CPU will be many times more powerful than the tablet Jaguar CPU in either console, so there goes your 2X argument out of the window, leaving the GPU alone. No matter what developers do, they won't make the PS4 perform beyond what a 570 could do on PC. Even Capcom considered the 570 to be in the same ballpark as the PS4, which is why they used it as the target spec until they got a PS4 dev kit to design Deep Down.

Ask people with AMD GPUs: having Mantle improves performance if you are using a low-end CPU, but it won't make much difference with a high-end CPU.

lol, still holding on to your stupid tablet argument even after being proven wrong 10 times... hahaha...


You are one of those morons who think that because you see a 2GHz quad core in a tablet and the CPU in the PS4 is 1.6GHz, somehow that is a win for the tablet. My god... lol


That is your proof..... You're grasping at straws.....

That texture test result is influenced by system memory bandwidth. The question is: is it simply flushing the compressed texture data after the calculation, or storing it into RAM? Is it pulling assets from RAM to compress? We simply don't have that information, which means it isn't right to judge CPUs on this test alone. And the fact that Jaguar, clock for clock, is only about 15% faster on average than the Bobcat series, which is 50% slower than the Athlon IIs from 2009 clock for clock.... paints a different story.
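
Taking the percentages in that last sentence at face value, the clock-for-clock chain works out like this (back-of-the-envelope arithmetic on the stated numbers, not a new measurement):

    # Chain the clock-for-clock claims above: Bobcat ~50% slower than Athlon II,
    # Jaguar ~15% faster than Bobcat.
    bobcat_vs_athlon2 = 0.50
    jaguar_vs_bobcat = 1.15
    jaguar_vs_athlon2 = jaguar_vs_bobcat * bobcat_vs_athlon2
    print(f"Jaguar vs Athlon II per clock: ~{jaguar_vs_athlon2:.2f}x")  # roughly 0.57x

In other words, on those figures a Jaguar core has a bit over half the per-clock performance of a 2009 Athlon II core.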

#725 Edited by tormentos (18325 posts) -

@zeeshanhaider said:

I would have probably used some nice F-word to address you, but since you are 40 and much older than me... sigh... Respected sir, as far as I remember I have debunked your chart numerous times, so many times that you have now stopped citing its source. Pretty sure it's from an engine nobody has heard of and no game has ever used. Or do you want me to make a chart in Excel where I put even the Tegra ahead of the 900pStation/720pBox? Would you deem it credible then? I guess I know the answer.........HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

I'm truly sorry, sir, that Sony and MS both fucked their fan bases real hard this time, with Sony choosing a cheap-ass tablet CPU (which even DF mentioned in the article, do you want me to quote it, good sir?) and a GPU barely matching a GTX 570 from 2010, turning the PS4 into a 900pStation for now, with a very good chance of it degrading into a 720pStation in the future. I'm afraid we would be lucky to see any game on the 900pStation even matching Crysis 2 from 2011, considering its pathetic specs...........HAHAHAHAHAHAHAHAHAHAHAHAHAHA

You haven't debunked sh**. You are a butthurt fanboy, not a developer, and that chart there is for a freaking game engine; it's irrefutable.

How does being an engine no one uses or has heard of change the outcome of the result? If you are going to imply that it's badly optimized, well, that applies to the engine across all CPUs, not just one.

The Tegra over the PS4 and Xbox One CPUs..hahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

No no, wait.....Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

@04dcarraher said:

That is your proof..... You're grasping at straws.....

That texture test result is influenced by system memory bandwidth. The question is: is it simply flushing the compressed texture data after the calculation, or storing it into RAM? Is it pulling assets from RAM to compress? We simply don't have that information, which means it isn't right to judge CPUs on this test alone. And the fact that Jaguar, clock for clock, is only about 15% faster on average than the Bobcat series, which is 50% slower than the Athlon IIs from 2009 clock for clock.... paints a different story.

That is funny, because the PS4 has 20GB/s of bandwidth for its CPU, and on PC you have 68GB/s, yet you don't see the 4X jump in performance there that you see vs tablets.

That is funny; how much slower are the Tegra and other ARM CPUs compared to those same Athlon IIs from 2009?

Please get back to me on that as soon as you calculate how slow they are per clock vs the Athlon II.

#726 Posted by MrXboxOne (743 posts) -

Trying to watch this guy stream Watch Dogs on PS4..... That PS4 streaming app is so pixelated it hurts your eyes!

#727 Edited by lostrib (37592 posts) -

@mrxboxone said:

Trying to watch this guy stream Watch Dogs on PS4..... That PS4 streaming app is so pixelated it hurts your eyes!

cool story bro

#728 Posted by 04dcarraher (19666 posts) -

@tormentos said:


@04dcarraher said:

That is your proof..... You're grasping at straws.....

That texture test result is influenced by system memory bandwidth. The question is: is it simply flushing the compressed texture data after the calculation, or storing it into RAM? Is it pulling assets from RAM to compress? We simply don't have that information, which means it isn't right to judge CPUs on this test alone. And the fact that Jaguar, clock for clock, is only about 15% faster on average than the Bobcat series, which is 50% slower than the Athlon IIs from 2009 clock for clock.... paints a different story.

That is funny, because the PS4 has 20GB/s of bandwidth for its CPU, and on PC you have 68GB/s, yet you don't see the 4X jump in performance there that you see vs tablets.

That is funny; how much slower are the Tegra and other ARM CPUs compared to those same Athlon IIs from 2009?

Please get back to me on that as soon as you calculate how slow they are per clock vs the Athlon II.

Why do you copy and paste without actually understanding or reading.... lol. Where in the hell did you come up with 68GB/s for DDR3 on PC.... Again you're proving that you don't know what you're posting. DDR3 1600MHz in 128-bit dual channel only has a bandwidth of about 24GB/s at its best. The 68GB/s you're thinking of is the X1's 256-bit 2133MHz RAM; 2133MHz on PC maxes out around 34GB/s. And again, we have no idea what configuration they used. Using more refined terms would answer that, as would actually releasing the detailed report.

The Tegra 4 uses a quad-core ARM Cortex-A15; Bobcat is roughly 32% faster than the Cortex-A15, and Jaguar is about 50% faster clock for clock. But again, we have no idea which Tegra they are using: 2, 3 or 4.
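
For anyone who wants to sanity-check those bandwidth figures, the usual rule of thumb is peak bandwidth = transfer rate x bus width (these are theoretical peaks; real-world memory tests land a bit below them, which is where a figure like 24GB/s for DDR3-1600 comes from):

    # Theoretical peak bandwidth = transfer rate (MT/s) x bus width in bytes.
    def peak_gb_per_s(mt_per_s, bus_bits):
        return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

    print(peak_gb_per_s(1600, 128))  # DDR3-1600, dual channel (128-bit): 25.6 GB/s
    print(peak_gb_per_s(2133, 128))  # DDR3-2133, dual channel (128-bit): ~34.1 GB/s
    print(peak_gb_per_s(2133, 256))  # Xbox One's 256-bit DDR3-2133: ~68.3 GB/s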

#729 Posted by MK-Professor (3827 posts) -

@tormentos said:

@MK-Professor said:

WTF, man, did I say 7970 or 7970GHz?

From the benchmark you posted earlier:

  • 1920x1200 - 42fps vs 78fps, HD7850 vs HD7970GHz
  • 2560x1600 - 26fps vs 49fps, HD7850 vs HD7970GHz

Since the HD7950 (at 1200MHz) is slightly faster than the HD7970GHz, it looks like this:

  • 1920x1200 - 42fps vs 80fps, HD7850 vs HD7950 (at 1200MHz)
  • 2560x1600 - 26fps vs 50fps, HD7850 vs HD7950 (at 1200MHz)

Double the performance.

What's more demanding than Crysis 3?

Hey, look: 19 FPS vs 31 FPS. How is 31FPS 100% more than, or double the performance of, 19?

22FPS vs 13 FPS; again, double 13 is 26, not 22...

So more than double the price, but not more than double the performance?

But hey, look: the 7950 Boost, not the original one, does just 6 frames more, a little less than 50% more frames.

Which is funny, because these are averages, so even the 7970GHz will dip into unplayable territory from time to time in this game, and the 7950 and 7850 will be unplayable.

I proved to you that the HD7950 (at 1200MHz) has double the performance in games, and now you are trying hard to find a benchmark that shows something less than double the performance.

http://www.guru3d.com/articles_pages/gtx_780_ti_sli_geforce_review,19.html Here is a recent benchmark (with more recent drivers and game updates) from the exact same game and settings you posted earlier, see? Basically double the performance again... (HD7970GHz 30fps vs HD7850 16fps; keep in mind that the HD7950 at 1200MHz is slightly faster than the HD7970GHz).

Also, what is this about double the price? I am pretty sure the R9 280 (or HD7950) does not cost double the price of the HD7850.

#730 Edited by zeeshanhaider (2603 posts) -

@tormentos said:

@zeeshanhaider said:

I would have probably used some nice F-word to address you, but since you are 40 and much older than me... sigh... Respected sir, as far as I remember I have debunked your chart numerous times, so many times that you have now stopped citing its source. Pretty sure it's from an engine nobody has heard of and no game has ever used. Or do you want me to make a chart in Excel where I put even the Tegra ahead of the 900pStation/720pBox? Would you deem it credible then? I guess I know the answer.........HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

I'm truly sorry, sir, that Sony and MS both fucked their fan bases real hard this time, with Sony choosing a cheap-ass tablet CPU (which even DF mentioned in the article, do you want me to quote it, good sir?) and a GPU barely matching a GTX 570 from 2010, turning the PS4 into a 900pStation for now, with a very good chance of it degrading into a 720pStation in the future. I'm afraid we would be lucky to see any game on the 900pStation even matching Crysis 2 from 2011, considering its pathetic specs...........HAHAHAHAHAHAHAHAHAHAHAHAHAHA

You haven't debunked sh**. You are a butthurt fanboy, not a developer, and that chart there is for a freaking game engine; it's irrefutable.

How does being an engine no one uses or has heard of change the outcome of the result? If you are going to imply that it's badly optimized, well, that applies to the engine across all CPUs, not just one.

The Tegra over the PS4 and Xbox One CPUs..hahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

No no, wait.....Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

@04dcarraher said:

That is your proof..... You're grasping at straws.....

That texture test result is influenced by system memory bandwidth. The question is: is it simply flushing the compressed texture data after the calculation, or storing it into RAM? Is it pulling assets from RAM to compress? We simply don't have that information, which means it isn't right to judge CPUs on this test alone. And the fact that Jaguar, clock for clock, is only about 15% faster on average than the Bobcat series, which is 50% slower than the Athlon IIs from 2009 clock for clock.... paints a different story.

That is funny, because the PS4 has 20GB/s of bandwidth for its CPU, and on PC you have 68GB/s, yet you don't see the 4X jump in performance there that you see vs tablets.

That is funny; how much slower are the Tegra and other ARM CPUs compared to those same Athlon IIs from 2009?

Please get back to me on that as soon as you calculate how slow they are per clock vs the Athlon II.

The irony.....the good sir, calling me the fanboy. :D

Anyways, so let me try your way then.

OWNED!!! :D

And here's something from the DF about the pathetic tablet CPU:

This is actually a real issue for the next-gen consoles, as the six CPU cores available to developers aren't exactly powerhouses. AMD's Jaguar architecture was designed with low-end laptops and tablets in mind - it's just that the ratio between power consumption and silicon die-space vs performance made it a good fit for the next-gen consoles. The typical CPU that finds its way into a gaming PC is much more capable in raw processing terms, so it's understandable why the trade between pixels vs frames works more frequently there - there's a lot of untapped resource CPU-side.

Source

#731 Posted by BlbecekBobecek (2689 posts) -

inFamous Second Son looks way better than any version of this game.

#732 Posted by BlbecekBobecek (2689 posts) -

@-Unreal- said:

Last year, just before the new consoles came out, I had a choice for my "next gen" gaming.

Buy an Xbox One or a PlayStation 4.

Or buy a graphics card for my work PC.

I bought the graphics card. It cost me less money than either console, is more powerful, and came with games. Now if I had gone with the PS4, for example, I'd have to suffer Watch Dogs at a laughable 900p and 30 FPS at low-medium settings, and it would have cost me more to do it. On top of all that, I'd get no mods, no keyboard and mouse option, none of the advanced graphics, no community-made shader/post-process/texture mods, online play would cost me even more money, and the game is more expensive to buy.

Cool story, bro. I bought both a high-end gaming PC and a PS4 two months ago, and I still do most of my gaming on my good old PS3. There are still plenty of amazing games to play through.

#733 Edited by -Rhett81- (3569 posts) -

Misprint?

#734 Edited by uninspiredcup (8922 posts) -

The young people seem to be mistaking "photo-realism" and "art style" for "good graphics".

This character model is not bad graphics, it's stylized, like practically every Ubisoft game.

The environments are far more detailed in Watch Dogs, well beyond 99% of other "open world" games, including Second Son.

#735 Edited by tormentos (18325 posts) -

@zeeshanhaider said:

The irony.....the good sir, calling me the fanboy. :D

Anyways, so let me try your way then.

OWNED!!! :D

And here's something from the DF about the pathetic tablet CPU:

This is actually a real issue for the next-gen consoles, as the six CPU cores available to developers aren't exactly powerhouses. AMD's Jaguar architecture was designed with low-end laptops and tablets in mind - it's just that the ratio between power consumption and silicon die-space vs performance made it a good fit for the next-gen consoles. The typical CPU that finds its way into a gaming PC is much more capable in raw processing terms, so it's understandable why the trade between pixels vs frames works more frequently there - there's a lot of untapped resource CPU-side.

Source

Nice made-up chart. And the PS4's chip is the laptop one; the tablet one is a different part. There are two Jaguar lines: Temash and Kabini.

The one in the PS4 is Kabini, which is quad-core at 1.5GHz and up; Temash, the tablet version, only goes up to 1.4GHz; the PS4's runs at 1.6GHz.

This should end your crappy argument... lol

butbut tablet CPU..lol

#736 Posted by zeeshanhaider (2603 posts) -

@tormentos said:

@zeeshanhaider said:

The irony.....the good sir, calling me the fanboy. :D

Anyways, so let me try your way then.

OWNED!!! :D

And here's something from the DF about the pathetic tablet CPU:

This is actually a real issue for the next-gen consoles, as the six CPU cores available to developers aren't exactly powerhouses. AMD's Jaguar architecture was designed with low-end laptops and tablets in mind - it's just that the ratio between power consumption and silicon die-space vs performance made it a good fit for the next-gen consoles. The typical CPU that finds its way into a gaming PC is much more capable in raw processing terms, so it's understandable why the trade between pixels vs frames works more frequently there - there's a lot of untapped resource CPU-side.

Source

Nice made-up chart. And the PS4's chip is the laptop one; the tablet one is a different part. There are two Jaguar lines: Temash and Kabini.

The one in the PS4 is Kabini, which is quad-core at 1.5GHz and up; Temash, the tablet version, only goes up to 1.4GHz; the PS4's runs at 1.6GHz.

This should end your crappy argument... lol

butbut tablet CPU..lol

My chart is as real as yours. And the CPU in the 900pStation is a tablet one. DF described it as a tablet CPU in many of its articles, not just this one. No matter how hard you try, you won't change that. Jaguar was designed for tablets and entry-level laptops, period. In other words, tablets. Live with that.

#737 Posted by kipsta77 (1014 posts) -

@TLHBO said:

rofl. Can't believe people actually paid £429 for a 720-792P console. It's ridiculous.

And the best bit... it can't even keep a steady 30fps. What a joke!

#738 Edited by rogelio22 (2381 posts) -

Well, I got this installed and running yesterday on my PC, and I gotta honestly say it runs and looks like sh**!!! Got everything on ultra, even set deferred FX to PC in the ini. Also got the latest drivers. I had to cap the frames at 30 since there was too much fluctuation... but seriously, I was more impressed with GTA5 when I played it on PS3 last year! Well, gotta get past the disappointing gfx and see if the actual game is any good today...

Oh, and no matter what settings you use, low or ultra, you get frame drops!

#739 Posted by tormentos (18325 posts) -

@zeeshanhaider said:

My chart is as real as yours. And the CPU in the 900pStation is a tablet one. DF described it as a tablet CPU in many of its articles, not just this one. No matter how hard you try, you won't change that. Jaguar was designed for tablets and entry-level laptops, period. In other words, tablets. Live with that.

No, your chart was made by you; mine was made by a freaking developer, for an engine... lol

Again, call me when half of Steam has moved on from the dual-core CPUs they have, which are even worse than the PS4's Jaguar... lol

That is because there is a Jaguar for tablets and one for laptops.

http://www.extremetech.com/computing/156552-amds-last-and-only-hope-low-power-kabini-temash-are-ready-for-action

Again, the tablet one is Temash, which goes up to 1.4GHz; the one in the PS4 is 1.6GHz, and the line goes up to 2.0GHz, which is clearly Kabini... lol

Butbutbut tablet CPU...lol

Again, when half of Steam moves away from their dual-core CPUs you will have a point; right now half of Steam can't even play Watch Dogs, since the game requires a quad core minimum... lol

#740 Edited by zeeshanhaider (2603 posts) -

@tormentos said:

@zeeshanhaider said:

My chart is as real as yours. And the CPU in the 900pStation is a tablet one. DF described it as a tablet CPU in many of its articles, not just this one. No matter how hard you try, you won't change that. Jaguar was designed for tablets and entry-level laptops, period. In other words, tablets. Live with that.

No, your chart was made by you; mine was made by a freaking developer, for an engine... lol

Again, call me when half of Steam has moved on from the dual-core CPUs they have, which are even worse than the PS4's Jaguar... lol

That is because there is a Jaguar for tablets and one for laptops.

http://www.extremetech.com/computing/156552-amds-last-and-only-hope-low-power-kabini-temash-are-ready-for-action

Again, the tablet one is Temash, which goes up to 1.4GHz; the one in the PS4 is 1.6GHz, and the line goes up to 2.0GHz, which is clearly Kabini... lol

Butbutbut tablet CPU...lol

Again, when half of Steam moves away from their dual-core CPUs you will have a point; right now half of Steam can't even play Watch Dogs, since the game requires a quad core minimum... lol

The chart is based on the results from the engine I'm developing right now, called the CowSlaughter Engine, and yes, I'm a developer myself, employed as one, and I'm pretty sure I have a degree for that too.

You have failed to provide any evidence for your false claims, and therefore I win, because I have already provided proof in the past of how the Jaguar architecture is shit and made for cheap-ass tablet CPUs.

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

#741 Edited by tormentos (18325 posts) -

@zeeshanhaider said:

The chart is based on the results from the engine I'm developing right now, called the CowSlaughter Engine, and yes, I'm a developer myself, employed as one, and I'm pretty sure I have a degree for that too.

You have failed to provide any evidence for your false claims, and therefore I win, because I have already provided proof in the past of how the Jaguar architecture is shit and made for cheap-ass tablet CPUs.

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

Exactly, pulled from deep, deep down your ass..hahahahaaaaaaaaaaaaaaaaa

Yeah, I am a developer too, woooo hooooo. Yeah, welcome to the internet, where stupid people pretend to be whatever sh** they like... lol

I even posted a link....hahaaaaaaaaaaaaaaaaa

Butbutbut tablet CPU... hahahahaa. According to Steam, 24% of PCs run at 768p; oh, that is a way bigger percentage than the share of PS4 games that run at 900p, so I guess from now on the PC should be called the 768pC.....

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

#742 Edited by m3dude1 (1362 posts) -

lol @zees claiming to be a developer. Man, the PC fanboys really are the worst.

#743 Edited by zeeshanhaider (2603 posts) -

@tormentos said:

@zeeshanhaider said:

The chart is based on the results from the engine I'm developing right now, called the CowSlaughter Engine, and yes, I'm a developer myself, employed as one, and I'm pretty sure I have a degree for that too.

You have failed to provide any evidence for your false claims, and therefore I win, because I have already provided proof in the past of how the Jaguar architecture is shit and made for cheap-ass tablet CPUs.

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

Exactly, pulled from deep, deep down your ass..hahahahaaaaaaaaaaaaaaaaa

Yeah, I am a developer too, woooo hooooo. Yeah, welcome to the internet, where stupid people pretend to be whatever sh** they like... lol

I even posted a link....hahaaaaaaaaaaaaaaaaa

Butbutbut tablet CPU... hahahahaa. According to Steam, 24% of PCs run at 768p; oh, that is a way bigger percentage than the share of PS4 games that run at 900p, so I guess from now on the PC should be called the 768pC.....

Hahahahaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa

Well, you provided a link to a person claiming to be a developer with a chart; I did the same. Why is yours a credible source and mine is not? Care to explain?

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

You got owned. The results are from my engine, and these are facts you can't refute by throwing insults. And yes, I'm a Computer Science graduate and have been employed as a Senior Software Engineer/Senior BPM Developer for more than 4 years. 8)

Edit: Oh, and by the way, I forgot to list the specialty of my engine: it makes cows butthurt and slaughters them a million per second. Specially tuned for that, though it works really well on all sorts of consolites.

#744 Posted by lostrib (37592 posts) -

@m3dude1 said:

lol @zees claiming to be a developer. Man, the PC fanboys really are the worst.

you're the worst