The Order 1886 runs at 30fps - that next gen power

This topic is locked from further discussion.

#251 Posted by seanmcloughlin (38113 posts) -

@seanmcloughlin said:

Except RE4's godliness has nothing to do with its visuals. It was all about atmosphere and pacing. What the Order is like remains to be seen... literally, we haven't even seen it in motion yet.

Actually it had a lot to do with its visuals. These days, it has a lot to do with nostalgia.

At the time, yeah I guess so. But at the time I didn't expect it to still be one of my favourite games ever, years after.

#252 Posted by ronvalencia (15103 posts) -

@tormentos said:

@zeeshanhaider said:

The Order is 800p and 30fps with even fewer polygons, and will be exactly as on rails and linear. How about that?

Oh, and DF is credible when they declare the PS4 the winner and talk crap about the X1, but not the other way around, gotcha. You can't go around moving goalposts when it suits you. You were destroyed in the Crysis 3 vs Crysis 2 debate, and you lost when the industry (note: the industry, not just DF; many other sites said the same thing) declared Ryse the better-looking game. There's no way for you to escape this time. Talk about 1080p? Well, The Order isn't 1080p either. Talk about other graphics features? Ryse has them all, with high-quality assets. Wanna discuss FPS? The Order too is 30fps. Talking about achieving more with less, well, Crytek is simply the winner in this case. There's no two ways about it. Company with the better technology: Crytek, hands down. Specs? I don't need to mention the tablet CPU and the 570-class GPU from 2010.

So, let's see you flip-flop again. No matter what you choose, you will lose, because you have exhausted all your options flip-flopping. You may need to do a little more work thinking of something new this time. ;)

Ryse is 1600x900..

The Order is 1920x800, and that is a design choice, not a hardware limitation.

Also, you don't even know what the fu** you're saying. Please link me to where it says The Order is on rails. Do you even fu**ing know what on rails means? It means you can't move your character; basically the game does it for you, and you only have a set number of gameplay actions you can execute. Think Time Crisis: you can't move when you want, where you want; you have to wait until the game moves you. That is on rails, where the control you have over the character's actions is severely limited.

There are no videos of The Order's gameplay out there, so how the fu** do you even know how the game is?

Oh, they don't have to declare the PS4 the winner. Tomb Raider's performance difference was leaked before DF did any articles, because the difference was notable: the PS4 version hit 60 FPS while the Xbox One was locked at 30 FPS, and the PS4 version also had better effects and textures. The same happened with BF4 and Ghosts: before DF confirmed a difference, it had already leaked that performance was worse on Xbox One.

I don't need DF to know what the PS4 can do and roughly what the advantage will be; only morons do. The PS4 is basically an overclocked 7850, the Xbox One a 7770. Hell, the update to unlock the extra 8% isn't even out yet, so it's even lower than 7770 performance.

Fact is, you butthurt moron, Ryse on PS4 would look even better than on Xbox One and would be 1080p at 30 FPS or more; games have confirmed this already. Pray that Ryse is not ported to the PS4, or you would look mighty sad.

Time to let that Killzone hate go... lol

Tell me something I didn't know. Did you even read my previous post? I know The Order is 1920 x 800, and that means it has only 6.7% more pixels than Ryse, dumb dumb. You don't know about the gameplay of The Order? Let me tell you: it's on rails, just like every PS exclusive in existence except Demon's Souls. And if there isn't any gameplay, why all the fuss about The Order having the best graphics? Cows setting themselves up for ownage again, like Crapzone: Shadow Fail? And yeah, design choice; I heard Cevat Yerli saying the exact same thing.
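Side note: the 6.7% figure is easy to verify, assuming Ryse's reported 1600x900 and The Order's 1920x800:

```python
# Pixel-count comparison between the two games' reported render resolutions.
order_pixels = 1920 * 800  # The Order: 1886
ryse_pixels = 1600 * 900   # Ryse: Son of Rome

# Relative difference: how many more pixels The Order pushes per frame.
extra = (order_pixels - ryse_pixels) / ryse_pixels
print(f"{order_pixels:,} vs {ryse_pixels:,}: {extra:.1%} more")
# 1,536,000 vs 1,440,000: 6.7% more
```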

Never said Ryse won't look better on PS4. But the fact is, according to the industry, Ryse > Crapzone: Shadow Fail, no matter how you try to spin this. And I give two craps about Ryse going over to PS4; it will be right at home with the other movies. Even if Ryse goes to PS4 and looks better, that will not change the fact that Crysis 2 >>>>>>>>>>>>>> any version of Ryse > Crapzone: Shadow Fail. Because the most powerful console is nothing but a cheap ass tablet CPU and a GPU barely matching a 570 from 2010.

And yeah, I remember suggesting you start hyping UC4; you didn't do that yet, waiting for the first bullshots to arrive? Damn, it sucks to be stuck with outdated garbage, coupled with all of the big 3 going down. Dark times indeed for consolites.

As AMD Mantle shows, a "cheap ass tablet CPU" doesn't have to worry about DirectX CPU overheads.

#253 Edited by MlauTheDaft (2963 posts) -

@seanmcloughlin said:

At the time, yeah I guess so. But at the time I didn't expect it to still be one of my favourite games ever, years after.

I didn't mean to insinuate that the game relied solely on its visuals.

It's a classic, of course, but it's still dated, especially mechanically.

Edit:

That's coming from someone who still likes Shin Megami Tensei: Nocturne, so I should shut my mouth about datedness ;)

#254 Posted by IMAHAPYHIPPO (2497 posts) -

@CrownKingArthur: Not knowing everything about the video game industry is hardly a means for pity, my friend. Where is it stated Microsoft interferes with game development of other studios?

#255 Posted by CrownKingArthur (3494 posts) -

@CrownKingArthur: Not knowing everything about the video game industry is hardly a means for pity, my friend. Where is it stated Microsoft interferes with game development of other studios?

i can't believe you need me to be so blunt about this.

ghostwarrior's original post, the one that other guy responded to and you agreed with: that post is satire.

bobbety bob quoted you guys and stated verbatim that it was a 'joke'.

i joined in on ghostwarrior's satire, for comedic lols. i actually thought my post would help everyone understand that of course it's a joke. it's nonsense, isn't it? it's absurd! sabotage? come on!

i guess sometimes over the internet these things aren't so obvious. so, if you want a serious answer to your question - my serious answer is:

1. the satires were never meant to be taken seriously

2. of course i have no proof that the things described in the satire are happening

look, i'll give you an example.

Q: what is the fastest way to lose weight?

A: amputation

now, is that answer wrong? nah, probably not. but c'mon, surely nobody would seriously consider amputation for weight loss?

yo man, peace out. i hope we can call this a stalemate at the end of the day.

#256 Posted by harry_james_pot (10061 posts) -

lol, that "cinematic experience" nonsense again.

#257 Edited by IMAHAPYHIPPO (2497 posts) -

@CrownKingArthur: Haha, I gotcha. I must have missed a few posts -- reading entire 250-post threads isn't exactly at the top of my list. You win.

#258 Posted by zeeshanhaider (2191 posts) -

@ronvalencia said:

As AMD Mantle shows, a "cheap ass tablet CPU" doesn't have to worry about DirectX CPU overheads.

Does that make it a better CPU than desktop offerings? No, it is still a cheap ass tablet CPU, and a CPU does more than just GPU calls.

#259 Posted by tormentos (16275 posts) -

With 4x MSAA at 1920x800, it's a hardware limitation. Both the 7970 and the PS4's GCN have 32 ROPs, and the 7970 has superior memory bandwidth for MSAA.

Of the AA options, 4x MSAA has a 24 FPS hit in BF3 on the 7850; on the 7870 it's a 19 FPS hit. So yeah, if they want to use 4x MSAA, the resolution will be 1920x800.

But that was the target in the first place. They have the game running at 1080p now and could leave it that way, but without applying 4x MSAA, which many PC owners don't even use because of the nasty hit it has on performance.

Does Ryse have 4x MSAA?

No, it uses temporal antialiasing, SMAA 1TX.

So if Ryse used 4x MSAA it would run at 8 frames per second or less. Hell, even without MSAA the game mostly runs between 26 and 28 FPS and drops as low as 17 or 18; it is not a solid 30. And it uses heavy motion blur, a la Gears of War 3, to deal with imperfections, in this case judder from the very variable frame rate..

The Order was said to be 800p from the start; that was the goal, and they achieved it with 4x MSAA. Now they haven't decided if they want to run the game at 800p or go for 1080p without 4x MSAA. All hardware has limitations, including top-of-the-line cards.
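To put the MSAA cost in perspective: an uncompressed 4x MSAA color-plus-depth target stores four samples per pixel. A back-of-the-envelope sketch (the byte sizes are simplifying assumptions; real GCN hardware compresses color and Z, so actual bandwidth costs are lower):

```python
def target_size_mb(width, height, samples, bytes_per_sample=8):
    """Rough size of an uncompressed color (4 B) + depth (4 B) render
    target with MSAA. An upper bound, not an actual measurement."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

wide_4x = target_size_mb(1920, 800, 4)   # letterboxed frame, 4x MSAA
full_1x = target_size_mb(1920, 1080, 1)  # full 1080p, no MSAA
print(f"{wide_4x:.1f} MB vs {full_1x:.1f} MB")
# 46.9 MB vs 15.8 MB
```

Even letterboxed, the 4x MSAA target is roughly three times the size of a plain 1080p one, which is why the resolution trade-off comes up at all.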

There is no emotional reaction here, my friend, other than humor at the stupidity of others. But if you are going to play the troll game, don't complain when the favor is returned. The fact is that cows laughed and said 30 fps was "unplayable" and "last gen", and are now upset that lems are bashing your game for being 30fps. Well, wtf did they expect?

lol @ me being a lem. Funny: when I tell the truth about the PS4 I get called a lem, and when I tell the truth about the Xbone I get called a cow. Seems fanboys just don't like the truth.

Well, some cows are more stupid than others, much like how for you resolution matters only when it's PC vs consoles, but when it's PS4 vs Xbox One it doesn't..

Have you forgotten my quote on that matter?

30 FPS is acceptable, 60 FPS is better; in fact, many games on PS4 since launch are 30 FPS. And it was lemmings who rode on Forza's 60 FPS, who for months before launch claimed the PS4 was unable to run games at 60 FPS, all based on Forza. But little did they know that it would be the Xbox One that would struggle even with 30 FPS..

Several games on Xbox One can't keep even 30 FPS, as Ryse and DR3 show.

#260 Posted by Vecna (3305 posts) -

@draign said:

Who cares about the FPS on a third-person console action game? I want to know what's up with the black borders people have mentioned.

They are there for DLC. You can fill them in when the first expansion hits.

#261 Edited by tormentos (16275 posts) -

@zeeshanhaider said:

Tell me something I didn't know. Did you even read my previous post? I know The Order is 1920 x 800 [...]

Not only does it have a higher pixel count than Ryse, you baboon, it has 4x MSAA, unlike Ryse's temporal antialiasing, SMAA 1TX. If Ryse had 4x MSAA it would be 720p, and who knows if less; 4x MSAA has a nasty hit on performance. In fact, on the 7850 it has a more-than-20 FPS hit, so imagine it on the much weaker Xbox One GPU, which is now basically below a 7770 in performance.

The fact that you call The Order, and every PS game, on rails basically confirms that you are a **ing dumbass moron. I already explained to you what on rails means, dude. Maybe the word you are looking for is linear, which is what people use to describe freedom of choice or freedom of movement in games; that is still a line, but less so.

Did I even say it has the best graphics?

What the fu** are we seeing here? What? And why is this ^^ so different from...

From this. Notice the horrible abuse of motion blur and depth of field going on in this game. I remember when Gears of War 3 was heavily criticized for using heavy motion blur to cover jaggies; in Ryse it is used to cover the judder from the horrible frame rate, which dips as low as 17 or 18 and runs between 26 and 28 most of the time.

This game is not only very limited and constricted when it comes to movement, it is 900p and can't even hold a freaking stable 30 FPS. This game is a failure performance-wise. Killzone is more open, is 1080p, runs higher than 30 FPS, and never goes under..

Now that is how a game should perform, at least. This was supposed to be next gen, not last gen. If you are going to be 30 FPS, which is more than acceptable, at least make it steady.

Funny how everything about Ryse revolves around close-up screens of basically nothing great happening on screen, in small spaces...

By the way, my blind friend, here is a chart of the Substance engine running on everything from the iPad 2 to a Core i7.

Look at how the i7 has an 8-point lead on that chart, and how the PS4 has a 10.5-point lead over the Tegra 4, which is still relatively new. There is a bigger gap between that Jaguar and a tablet CPU than there is between that Jaguar and an i7 CPU..

Oh, and it's funny to see you downgrade the PS4 when the Xbox One, which you defend here, is even weaker.. lol

Find me a tablet with an 8-core Jaguar...

#262 Posted by AM-Gamer (3277 posts) -

People bash its performance yet have no clue what the game will look like? From several behind-the-scenes viewings, most previews have said the game looks amazing, citing things such as outstanding hair and cloth physics along with what appear to be fully destructible environments.

#263 Edited by GrenadeLauncher (3080 posts) -

All this lemming rage over The Order is quite enjoyable.

Maybe one day they'll get why the world mocks the Xbone; until then they can keep whining about "double standards" that don't exist.

#264 Posted by clone01 (24085 posts) -

I personally don't care that much. Yes, I know 60 FPS is going to look the best, but frankly, I equate gaming with how much I enjoy the game. Dark Souls had abhorrent framerates in some areas, and it never really lessened my enjoyment of the title.

#265 Posted by zeeshanhaider (2191 posts) -

@tormentos said:

[...]

Find me a tablet with an 8-core Jaguar...

Four things.

1) The Order is not confirmed to have 4x MSAA.

2) PS exclusives are on rails. Yeah, I know what on rails means, and PS exclusives are exactly press-'X'-to-win on-rails movies.

3) Jaguar is a tablet CPU.

4) Now why in the hell are textures generated on the CPU, I wonder! Should I post other benchmarks where even a Core 2 Duo blows your tablet CPU out of the water?

And the most important thing of all: Ryse > Crapzone: Shadow Fail => it would be an insult to Crysis 2 from 2011 to compare it to Crapzone: Shadow Fail.

#266 Posted by Puckhog04 (22577 posts) -

As long as it's locked at 30fps, it's fine. If it dips at all then it's pretty sad.

#267 Edited by TruthBToldShow (328 posts) -

#268 Posted by ryangcnx-2 (1170 posts) -

@ryangcnx-2 said:

@darkangel115 said:

@GiantAssPanda said:

If the frame rate is consistent 1080p@30fps is just fine.

It's not 1080p though. Also, they confirmed no multiplayer. I went from excited to lukewarm on this game today, and not because of the FPS or resolution, but because of the lack of multiplayer.

Yes and no; it's not scaled in any way, shape, or form. While yes, it's not 1920 x 1080 but 1920 x 800, all they want is a wider aspect ratio. The image is still a 1-to-1 pixel ratio on a 1080p TV. By your logic, hardly any movies on Blu-ray are 1080p, because most have that same exact aspect ratio and run at 1920 x 800. I'll admit I would prefer the game at 1920 x 1080 so the image uses the whole screen, and I do prefer movies presented in 16:9 format so they also fill up the whole screen.

But it won't be blurry or scaled in any way, shape, or form. It also gets the benefit of not having to render the extra pixels due to the wider aspect ratio, which they said lets them re-allocate some resources and run it with 4x MSAA.

All I'll say is, it sounds like the way they originally envisioned the game is with the wider aspect ratio. This is purely a design choice of the developers.
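The arithmetic behind the 1:1 mapping is simple: 1920x800 is a 2.4:1 frame, and centering it on a 1080p panel leaves 140-pixel bars top and bottom with no scaling at all:

```python
# 1920x800 mapped 1:1 onto a 1920x1080 panel: no scaling, just bars.
render_w, render_h = 1920, 800
panel_h = 1080

aspect = render_w / render_h     # 2.4:1, a common cinema ratio
bar = (panel_h - render_h) // 2  # black bar height, top and bottom
print(f"{aspect}:1 image, {bar} px bars top and bottom")
# 2.4:1 image, 140 px bars top and bottom
```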

Yes, I understand what the resolution is; I was just making the point that it's not 1080p. And personally I couldn't care less (even though I'm not a fan of black bars, for games or movies for that matter). The reason I'm losing interest is the lack of co-op or PvP in a shooter.

Your earlier posts keep comparing it to Ryse's resolution of 1600 x 900 as if they're barely different. What I was pointing out is that Ryse has to scale its resolution, while 1920 x 800 does not. The Order at 1920 x 800 is going to appear much sharper in image quality than Ryse being scaled.

Also, this game is much more interesting to me because it's NOT multiplayer. Too many games have stupid, useless MP attached to them. BioShock 2, Dead Space 2, Tomb Raider and such have crappy MP, and I'm not a fan of AC's MP either. Secondly, I would have hated co-op in Mass Effect, because so much of the game is story and character development; then you get those people who constantly speed through everything and skip cutscenes while others are trying to enjoy them.

MP is not for SP games. Gears and Call of Duty were mainly MP games with SP added to them. Since the devs are mainly focusing on a single-player experience, I want them to devote 100% of their funding to the actual game and not waste it on tacked-on MP.

#269 Edited by ronvalencia (15103 posts) -

@zeeshanhaider said:


Does that make it a better CPU than desktop offerings? No, it is still a cheap ass tablet CPU, and a CPU does more than just GPU calls.

No, it enables AMD's CPU solution to sufficiently drive the corresponding GPU solution. Also, there's very little point in using an i7-3770K-class CPU with 12 CU or 18 CU level GCNs.

Also, the AMD Jaguar in the consoles is significantly faster than the ARM Cortex-A9 (e.g. Apple iPad 2) and A15 (e.g. NVIDIA Tegra 4).

The alternative, an NVIDIA GeForce GTX 750 Ti "Maxwell" (1), would be inferior to the PS4's AMD Liverpool APU/SoC solution with its AMD Pitcairn-class GPU.

1. The GeForce GTX 750 Ti "Maxwell" is NVIDIA's first large-scale APU with ARM CPU and GPU fusion. This is the best alternative from the green camp.

A GeForce GTX 750 Ti "Maxwell" alternative would deliver X1-level GPU performance with inferior CPU performance.

MS and Sony are not stupid in building their best PC box at the targeted price. It looks like AMD's 2013 CPU/GPU solutions in the two consoles are still competitive in 2014.

#270 Edited by ronvalencia (15103 posts) -
@zeeshanhaider said:

3) Jaguar is a tablet CPU.

3. AMD Jaguar parts beyond 5 watts are not tablet chips.

@tormentos

Of the AA processes, 4x MSAA has a 24 FPS hit on BF3 on the 7850, and a 19 FPS hit on the 7870, so yeah, if they want to use 4x MSAA the resolution will be 1920x800.

But that was the target in the first place. They have the game running at 1080p now and could leave it that way, but without applying 4x MSAA, which many PC owners don't even use because of the nasty hit it has on performance.

Does Ryse have 4x MSAA?

No, it uses temporal antialiasing, SMAA 1TX.

So if Ryse used 4x MSAA it would run at 8 frames per second or less. Hell, even without MSAA the game runs mostly between 26 and 28 FPS and drops as low as 17 or 18 FPS. It is not a solid 30, and it uses heavy motion blur, à la Gears of War 3, to deal with imperfections, in this case judder from the very variable frame rate.

The Order was said to be 800p from the start. That was the goal and they achieved it with 4x MSAA. Now they haven't decided if they want to run the game at 800p or go for a higher 1080p without 4x MSAA. All hardware has limitations, including top-of-the-line cards.
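The resolution claims being thrown around in this exchange are easy to sanity-check. A quick sketch of the pixel arithmetic (assuming Ryse's commonly reported 1600x900; the 6.7% figure quoted later in the thread matches these numbers):

```python
# Pixel-count arithmetic behind the 900p vs 800p argument (illustrative only).
def pixels(width, height):
    return width * height

ryse = pixels(1600, 900)        # Ryse: 1,440,000 pixels
order = pixels(1920, 800)       # The Order: 1,536,000 pixels
full_hd = pixels(1920, 1080)    # 1080p: 2,073,600 pixels

# The Order renders ~6.7% more pixels per frame than Ryse...
extra = (order - ryse) / ryse   # ~0.067

# ...but 4x MSAA stores 4 coverage samples per pixel, so the sample count
# the ROPs touch is closer to 4x the base resolution.
order_msaa_samples = order * 4  # 6,144,000 samples
```

So "800p with 4x MSAA" and "1080p without MSAA" really are different trade-offs of the same budget, which is the crux of the argument above.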

I have already posted Ryse is using SMAA (Subpixel Morphological Antialiasing) instead of MSAA in another topic.

-----------------

http://vimeo.com/31247769#at=0 for SMAA vs other AA methods

SMAA is a new image-based, post-processing antialiasing technique, that offers practical solutions to all the common problems of existing filter-based antialiasing algorithms. It yields better pattern detection to handle sharp geometric features and diagonal shapes. Our edge detection scheme exploits local contrast features, along with accelerated and more precise distance searches, which allows to better recognize the patterns to antialias. Our method is capable of reconstructing subpixel features, comparable to 4x multisampling, and is fully customizable, so that every feature can be turned on or off, adjusting to particular needs. We propose four different presets, from the basic level to adding spatial multisampling and temporal supersampling. Even this full-fledged version achieves performances that are on-par with the fastest approaches available, while yielding superior quality.

The Order could have 1920x1080p with SMAA T2x (with quality comparable to 4x MSAA). 4x MSAA is a brute-force method; it's not a smart way of solving this particular problem. For PCs with lesser GPUs, or for going ultra-wide resolutions, there are SMAA DLL injectors for games that don't support SMAA natively.

You can download SMAA's DLL injector from http://mrhaandi.blogspot.com.au/p/injectsmaa.html
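For the curious, the first pass the quoted abstract describes, edge detection from local luma contrast, can be sketched in a few lines. This is an illustrative simplification only; real SMAA adds pattern classification, precomputed area textures, and blending passes on the GPU:

```python
# Minimal sketch of the luma edge-detection pass that SMAA-style filters
# start with (simplified; real SMAA runs as a GPU shader).
def luma(r, g, b):
    # Standard Rec. 601 luma weights.
    return 0.299 * r + 0.587 * g + 0.114 * b

def edge_mask(img, threshold=0.1):
    """img: 2D list of (r, g, b) floats in [0, 1]. Returns a 2D list of
    booleans marking pixels whose luma contrast against the left/top
    neighbour exceeds the threshold -- the candidates for antialiasing."""
    h, w = len(img), len(img[0])
    lum = [[luma(*img[y][x]) for x in range(w)] for y in range(h)]
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            left = abs(lum[y][x] - lum[y][x - 1]) if x > 0 else 0.0
            top = abs(lum[y][x] - lum[y - 1][x]) if y > 0 else 0.0
            mask[y][x] = max(left, top) > threshold
    return mask
```

Only the pixels flagged by this mask get the (more expensive) pattern matching and blending, which is why SMAA is so much cheaper than brute-force multisampling of every pixel.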

#271 Edited by tormentos (16275 posts) -

Four things.

1) The Order is not confirmed to have 4xMSAA

2) PS Exclusives are on rails. Yeah I know what on rails means and PS exclusives are exactly Press 'X' to win on rails movies.

3) Jaguar is a tablet CPU

4) Now why in the hell Textures are generated on CPU, I wonder! Should I post other benchmarks where even Core 2 Duo smokes your tablet CPU out of the water?

And the most important things of all Ryse > Crapzone: Shadow Fail => It would be an insult to Crysis 2 from 2011 if we compare it to Crapzone: Shadow Fail

1-If the game doesn't have 4x MSAA it will be 1080p.

2-You don't know what the fu** on rails means, and I already proved that to hell and beyond.

http://en.wikipedia.org/wiki/Category:Rail_shooters

You are an idiot. Tell me in which game on PS3 you need to just press X to win, you sad butthurt troll. In fact Ryse, the game you so fiercely defend, was accused of being a just-press-A game: when it was demoed, press A and the execution kill happens, press the wrong button and the kill still executes, press no button at all and the kill still executes itself..

So tell me which PS game only requires you to press X alone to win. Hell, most PS3 games make use of all 8 buttons, and some even use the d-pad as 4 more additional inputs..

I have played on-rails shooters since the early 80's, dude, and you don't know what on rails means.

3-Find me a tablet with an 8-core Jaguar. You have been owned hard; I even posted a benchmark and you are just arguing with your opinion alone. You know, you remind me a lot of Kingtito, who does exactly what you do: ignore everything that is presented to him and argue using only his opinion.

Is that you, Kingtito?

4-The textures are being compressed by the CPU, you idiot. You don't know what the fu** you are talking about..lol

Once again, Ryse is a failure of a game. It's very enclosed, 900p, and can't even hold 30 FPS. Crytek had to give up resolution, frames and open areas to get that game running on Xbox One. They went from making Crysis, an open FPS, to Ryse, an enclosed, very limited third-person hack and slash that can't even hit the industry's bare minimum, 30 FPS, at 900p..lol

Killzone SF on Xbox One would run at 8 FPS if not less. Oh wait, the Xbox One can't do 1080p on FPS games, just 720p...lol

#272 Edited by ninjapirate2000 (2877 posts) -

30fps/No multiplayer? This game will be a rental.

#273 Edited by zeeshanhaider (2191 posts) -

@zeeshanhaider said:

Four things.

1) The Order is not confirmed to have 4xMSAA

2) PS Exclusives are on rails. Yeah I know what on rails means and PS exclusives are exactly Press 'X' to win on rails movies.

3) Jaguar is a tablet CPU

4) Now why in the hell Textures are generated on CPU, I wonder! Should I post other benchmarks where even Core 2 Duo smokes your tablet CPU out of the water?

And the most important things of all Ryse > Crapzone: Shadow Fail => It would be an insult to Crysis 2 from 2011 if we compare it to Crapzone: Shadow Fail

1-If the game doesn't have 4x MSAA it will be 1080p.

2-You don't know what the fu** on rails means, and I already proved that to hell and beyond.

http://en.wikipedia.org/wiki/Category:Rail_shooters

You are an idiot. Tell me in which game on PS3 you need to just press X to win, you sad butthurt troll. In fact Ryse, the game you so fiercely defend, was accused of being a just-press-A game: when it was demoed, press A and the execution kill happens, press the wrong button and the kill still executes, press no button at all and the kill still executes itself..

So tell me which PS game only requires you to press X alone to win. Hell, most PS3 games make use of all 8 buttons, and some even use the d-pad as 4 more additional inputs..

I have played on-rails shooters since the early 80's, dude, and you don't know what on rails means.

3-Find me a tablet with an 8-core Jaguar. You have been owned hard; I even posted a benchmark and you are just arguing with your opinion alone. You know, you remind me a lot of Kingtito, who does exactly what you do: ignore everything that is presented to him and argue using only his opinion.

Is that you, Kingtito?

4-The textures are being compressed by the CPU, you idiot. You don't know what the fu** you are talking about..lol

Once again, Ryse is a failure of a game. It's very enclosed, 900p, and can't even hold 30 FPS. Crytek had to give up resolution, frames and open areas to get that game running on Xbox One. They went from making Crysis, an open FPS, to Ryse, an enclosed, very limited third-person hack and slash that can't even hit the industry's bare minimum, 30 FPS, at 900p..lol

Killzone SF on Xbox One would run at 8 FPS if not less. Oh wait, the Xbox One can't do 1080p on FPS games, just 720p...lol

Four things

1) That isn't confirmed either

2) PS exclusives are QTE fest on rails Press 'X' to win movies for brain dead dummies

3) Jaguar is a tablet CPU and that is what it's intended for. Google something before farting anything

4) Textures are very efficiently decompressed on the GPU and are the preferred method. Hell there are compressed GPU texture formats like 3Dc. So, any other tests? I know cows always talk out of their asses so you can thank me later for teaching a thing or two to you.

And the most important thing as DF and the rest of the industry agree upon undisputed: Ryse > Crapzone:Shadow Fail => Crysis 2 from 2011 shits on, fucks and bitchslaps Crapzone: Shadow Fail 24/7. And Crysis 2 from 2011 will forever be a benchmark for the outdated next-gen consoles to be compared against.

#274 Posted by zeeshanhaider (2191 posts) -

@zeeshanhaider said:

@ronvalencia said:

@zeeshanhaider said:

@tormentos said:

@zeeshanhaider said:

The Order is 800p and 30fps with even less polygons and will exactly be on rails and linear. How about that?

Oh and DF is credible when they declare PS4 the winner and talk crap about X1, but not the other way around, gotcha. You can't go around changing goal posts when it suits you. You were destroyed in the Crysis 3 / Crysis 2 debate, and you lost when the industry (note: the industry, not just DF; many other sites said the same thing) declared Ryse the better-looking game. There's no way for you to escape this time. Talk about 1080p... well, The Order isn't 1080p either. Talk about other graphics features: Ryse has them all, and with high-quality assets. Wanna discuss FPS? The Order too is 30fps. Talking about achieving more with less, well, Crytek is simply the winner in this case. There's no two ways about it. Company with the better technology: Crytek, hands down. Specs? I don't need to mention the tablet CPU and a 570-class GPU from 2010.

So, let's see you flip flop again. No matter what you choose, you will lose, because you have exhausted all your options flip flopping. You may need to do a little more work thinking of something new this time. ;)

Ryse is 1600x900p..

The Order is 1920x800p and is a design choice not a hardware limitation.

Also, you don't even know what the fu** you're saying. Please link me to where it says that The Order is on rails. Do you even fu**ing know what on rails means? It means that you can't move your character; basically the AI does it for you and you only have a set number of gameplay actions you can execute. Think Time Crisis: you can't move when you want, where you want; you have to wait until the game moves you. That is on rails, where the control you have over the character's actions is severely limited.

There aren't videos of The Order's gameplay out there, so how the fu** do you even know what the game is like?

Oh, they don't have to declare the PS4 the winner. Tomb Raider's performance difference was leaked before DF did any articles, because the difference was notable: the PS4 version hit 60 FPS while the Xbox One locked at 30 FPS, and the PS4 version also had better effects and textures. The same happened with BF4 and Ghosts; before DF confirmed a difference, it was already leaked that performance was worse on Xbox One.

I don't need DF to know what the PS4 can do and roughly what the advantage will be; only morons do. The PS4 is basically an overclocked 7850, the Xbox One a 7770. Hell, the update isn't out yet to unlock the 8%, so it's even lower than 7770 performance.

Fact is, you butthurt moron, Ryse on PS4 would look even better than on Xbox One and would be 1080p at 30 FPS or more. This is a fact that games have already confirmed. Pray that Ryse is not ported to the PS4; you would look mighty sad.

Time to let that Killzone hate go...lol

Tell me something I didn't know. Did you even read my previous post? I know The Order is 1920 x 800, and that means it has only 6.7% more pixels than Ryse, dumb dumb. You don't know about the gameplay of The Order? Let me tell you: it's on rails, just like every PS exclusive in existence except Demon's Souls. And if there isn't any gameplay, why all the fuss about The Order having teh best graphics? Cows setting themselves up for ownage again, like Crapzone: Shadow Fail? And yeah, "design choice"; I heard Cevat Yerli saying the exact same thing.

Never said Ryse won't look better on PS4. But the fact is, according to the industry, Ryse > Crapzone: Shadow Fail, no matter how you try to spin this. And I give two craps about Ryse going over to PS4. It will be right at home with the other movies. Even if Ryse goes to PS4 and looks better, that will not change the fact that Crysis 2 >>>>>>>>>>>>>> any version of Ryse > Crapzone: Shadow Fail. Because the most powerful console is nothing but a cheap ass tablet CPU and a GPU barely matching a 570 from 2010.

And yeah, I remember suggesting you start hyping UC4. You didn't do that yet; waiting for the first bullshots to arrive? Damn, it sucks to be stuck with outdated garbage, couple that with all of the big 3 going down. Dark times indeed for consololites.

As AMD Mantle shows, "cheap ass tablet CPU" doesn't have to worry about DirectX CPU overheads.

Does that make it a better CPU than desktop offerings? No, it is still a cheap ass tablet CPU, and a CPU does more than just GPU calls.

No, it enables AMD's CPU solution to sufficiently drive the corresponding GPU solution. Also, there's very little point in pairing an i7-3770K class CPU with 12 CU or 18 CU level GCNs.

Also, the AMD Jaguar in the consoles is significantly faster than ARM Cortex A9 (e.g. Apple iPad 2) and A15 (e.g. NVIDIA Tegra 4).

The alternative, NVIDIA Geforce GTX 750 Ti "Maxwell" (1) would be inferior to PS4's AMD Liverpool APU/SoC solution with AMD Pitcairn class GPU.

1. GeForce GTX 750 Ti "Maxwell" is NVIDIA's first large-scale APU with ARM CPU and GPU fusion. This is the best alternative from the green camp.

NVIDIA Geforce GTX 750 Ti "Maxwell" alternative would deliver X1 level GPU with inferior CPU performance.

MS and Sony are not stupid in building their best PC-like box at the targeted price. It looks like AMD's 2013 CPU/GPU solutions in the two consoles are still competitive in 2014.

Does that make Jaguar anything more than a cheap ass tablet CPU? No.

#275 Posted by tormentos (16275 posts) -

Four things

1) That isn't confirmed either

2) PS exclusives are QTE fest on rails Press 'X' to win movies for brain dead dummies

3) Jaguar is a tablet CPU and that is what it's intended for. Google something before farting anything

4) Textures are very efficiently decompressed on the GPU and are the preferred method. Hell there are compressed GPU texture formats like 3Dc. So, any other tests? I know cows always talk out of their asses so you can thank me later for teaching a thing or two to you.

And the most important thing as DF and the rest of the industry agree upon undisputed: Ryse > Crapzone:Shadow Fail => Crysis 2 from 2011 shits on, fucks and bitchslaps Crapzone: Shadow Fail 24/7. And Crysis 2 from 2011 will forever be a benchmark for the outdated next-gen consoles to be compared against.

1-It's either 800p with 4x MSAA or 1080p without MSAA.

2-Having QTEs doesn't mean a game is on rails. The only games which feature some of those are GOW, Uncharted and some others, and the QTEs in some games are not even that abundant; in fact in GOW you can even skip them by just not doing the kill that way. Just because some games have some QTEs doesn't mean the games are on rails. Please look again at the link I posted, because you don't know what the fu** on rails means.

3-LINK to the tablets having an 8-core Jaguar CPU. Link, or you're full of sh**.

Just because some versions of Jaguar can be in tablets doesn't mean all are. Oh, and Jaguars are mostly used in servers..

4-

Real-time DXT compression algorithms can be implemented on both the CPU (Waveren, 2006) and the GPU (Castano, 2007). While GPU implementations are usually faster, CPU versions are sufficiently fast for real-time applications and result in less bus traffic. Which implementation should be chosen is workload dependent and should be based on where the application is bottlenecked.

http://software.intel.com/en-us/articles/fast-cpu-dxt-compression
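The CPU path the quoted article describes is viable because DXT-style codecs work on independent 4x4 blocks, which makes them cheap and trivially parallel. A toy greyscale sketch of the idea (illustrative only; real DXT1 uses RGB565 endpoints and a smarter endpoint fit):

```python
# Each 4x4 block is reduced to two endpoint values plus sixteen 2-bit
# palette indices -- in real DXT1 that's 8 bytes total, an 8:1 ratio
# against 32-bit RGBA (64 bytes per block).
def compress_block(block):
    """block: 16 greyscale values (one 4x4 tile). Returns (lo, hi, indices)."""
    lo, hi = min(block), max(block)
    if hi == lo:
        return lo, hi, [0] * 16
    # Quantise each pixel to one of 4 palette entries between lo and hi.
    indices = [min(3, round(3 * (p - lo) / (hi - lo))) for p in block]
    return lo, hi, indices

def decompress_block(lo, hi, indices):
    # Rebuild the 4-entry palette and expand the indices.
    palette = [lo + (hi - lo) * i / 3 for i in range(4)]
    return [palette[i] for i in indices]
```

Because the transform is this simple per block, a CPU core can keep up with streaming workloads, which is exactly the trade-off (CPU cycles vs. bus traffic) the Intel article weighs.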

I hope this ends your whole stupid argument about CPU compression and decompression. Why waste GPU resources when you can use the CPU for it and save bus traffic, which is so important on consoles that have a more limited or smaller bus than PCs?

The fact that you use such stupid words to talk about Killzone shows how butthurt you are about it, Kingtito...lol

#276 Posted by tormentos (16275 posts) -
#277 Posted by clyde46 (43257 posts) -

I see the AMD shills are out in force.

#278 Posted by Shielder7 (4432 posts) -

@zeeshanhaider said:

Does that make Jaguar anything more than a cheap ass tablet CPU? No.

http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively

Maybe if I pasted the image several times you would understand it: the PS4 CPU is faster than the Tegra 4, which is a tablet CPU. Whether or not you want to admit this is up to you...

The PS4 doesn't need an i7; it is not a PC.
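Taking the rates quoted in the linked gamingbolt article at face value (14 MB/s on PS4's CPU vs 12 MB/s on Xbox One's), the gap is easy to put in perspective; the 2048x2048 texture here is a hypothetical example, not from the article, and MB is treated as MiB for simplicity:

```python
# Time to generate one texture at Substance Engine's quoted CPU rates.
def seconds_to_generate(texture_bytes, rate_mb_per_s):
    return texture_bytes / (rate_mb_per_s * 1024 * 1024)

# A 2048x2048 RGBA texture is 16 MiB uncompressed.
tex = 2048 * 2048 * 4
ps4 = seconds_to_generate(tex, 14)   # ~1.14 s
x1 = seconds_to_generate(tex, 12)    # ~1.33 s
```

So the quoted numbers amount to roughly a 17% throughput edge, not an order-of-magnitude one, which is worth keeping in mind for both sides of this argument.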

Dude trying to talk sense into hermits or hardcore Lemmings is like trying to teach a duck Calculus. You're wasting your time.

#279 Edited by darkangel115 (1333 posts) -

@ryangcnx-2: that's not how ME3 multiplayer works. Also, I should say I feel bad for people who think like you. It's like the guy eating by himself at a diner because going with other people interrupts his chewing lol

#280 Edited by zeeshanhaider (2191 posts) -

@zeeshanhaider said:

Four things

1) That isn't confirmed either

2) PS exclusives are QTE fest on rails Press 'X' to win movies for brain dead dummies

3) Jaguar is a tablet CPU and that is what it's intended for. Google something before farting anything

4) Textures are very efficiently decompressed on the GPU and are the preferred method. Hell there are compressed GPU texture formats like 3Dc. So, any other tests? I know cows always talk out of their asses so you can thank me later for teaching a thing or two to you.

And the most important thing as DF and the rest of the industry agree upon undisputed: Ryse > Crapzone:Shadow Fail => Crysis 2 from 2011 shits on, fucks and bitchslaps Crapzone: Shadow Fail 24/7. And Crysis 2 from 2011 will forever be a benchmark for the outdated next-gen consoles to be compared against.

1-It's either 800p with 4x MSAA or 1080p without MSAA.

2-Having QTEs doesn't mean a game is on rails. The only games which feature some of those are GOW, Uncharted and some others, and the QTEs in some games are not even that abundant; in fact in GOW you can even skip them by just not doing the kill that way. Just because some games have some QTEs doesn't mean the games are on rails. Please look again at the link I posted, because you don't know what the fu** on rails means.

3-LINK to the tablets having an 8-core Jaguar CPU. Link, or you're full of sh**.

Just because some versions of Jaguar can be in tablets doesn't mean all are. Oh, and Jaguars are mostly used in servers..

4-

Real-time DXT compression algorithms can be implemented on both the CPU (Waveren, 2006) and the GPU (Castano, 2007). While GPU implementations are usually faster, CPU versions are sufficiently fast for real-time applications and result in less bus traffic. Which implementation should be chosen is workload dependent and should be based on where the application is bottlenecked.

http://software.intel.com/en-us/articles/fast-cpu-dxt-compression

I hope this ends your whole stupid argument about CPU compression and decompression. Why waste GPU resources when you can use the CPU for it and save bus traffic, which is so important on consoles that have a more limited or smaller bus than PCs?

The fact that you use such stupid words to talk about Killzone shows how butthurt you are about it, Kingtito...lol

1) Neither is set in stone.

2) PS exclusives are nothing but on-rails, linear QTE-fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have the 8-core version doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 have a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Real-time compression on the GPU may be less useful for transcoding, because of increased bandwidth requirements for uploading uncompressed texture data and because the GPU may already be tasked with expensive rendering work. However, real-time compression on the GPU is very useful for compressed render targets. The compression on the GPU can be used to save memory when rendering to a texture. Furthermore, such compressed render targets can improve the performance if the data from the render target is used for further rendering. The render target is compressed once, while the resulting data may be accessed many times during rendering. The compressed data results in reduced bandwidth requirements during rasterization and can, as such, significantly improve performance. (The things relevant to gaming)

Link

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh, and what the fuck is Substance Engine? Never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Calling me an alt won't change that.

And the most important thing: since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

#281 Posted by ronvalencia (15103 posts) -

@tormentos:

1) Neither is set in stone.

2) PS exclusives are nothing but on-rails, linear QTE-fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have the 8-core version doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 have a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh, and what the fuck is Substance Engine? Never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Calling me an alt won't change that.

And the most important thing: since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

AMD's 3DC+ was included in DX10 as BC4. There are other new native texture formats in DX11.

AMD Jaguar was designed to almost rival ARM Cortex A15's cost, scaling from tablets to micro-server arrays.

AMD Jaguar's physical-address improvements (i.e. from Opteron) were designed for servers. AMD Jaguar's memory controllers have support for servers' ECC memory.

AMD Jaguar is not specifically designed just for tablets; i.e. AMD Mullins SoC's 2.5 watts *is* designed for tablets.

#282 Edited by ronvalencia (15103 posts) -

@clyde46:

@clyde46 said:

I see the AMD shills are out in force.

What's the matter? Nvidia still has $hit CPUs i.e. for both x86 and ARM.

#283 Posted by tormentos (16275 posts) -

1) Neither is set in stone.

2) PS exclusives are nothing but on-rails, linear QTE-fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have the 8-core version doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 have a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Real-time compression on the GPU may be less useful for transcoding, because of increased bandwidth requirements for uploading uncompressed texture data and because the GPU may already be tasked with expensive rendering work. However, real-time compression on the GPU is very useful for compressed render targets. The compression on the GPU can be used to save memory when rendering to a texture. Furthermore, such compressed render targets can improve the performance if the data from the render target is used for further rendering. The render target is compressed once, while the resulting data may be accessed many times during rendering. The compressed data results in reduced bandwidth requirements during rasterization and can, as such, significantly improve performance. (The things relevant to gaming)

Link

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh, and what the fuck is Substance Engine? Never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Calling me an alt won't change that.

And the most important thing: since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

1-One way or the other, it will be one of the two..

2-Considering that you don't know what the fu** on rails means, your whole argument is stupid. Having QTEs, which Ryse has as well, doesn't make a game on rails. Even though Ryse is the closest thing, it is not on rails, because you can control your character's movements. Once again, the word you're looking for is linear, Kingtito..

3-No, you blind moron, it is not just for that..

http://en.wikipedia.org/wiki/Jaguar_%28microarchitecture%29#Server

Notebooks, mini PCs, servers and tablets. In fact, how many tablets even use a Jaguar? Most tablets use the ARM architecture. Oh, and there are different models; the top Jaguar CPUs kick the crap out of the tablet versions.

Fast CPU DXT Compression

http://software.intel.com/en-us/articles/fast-cpu-dxt-compression

The article was posted on April 26 of 2012, moron. The 2006 and 2007 dates refer to the implementations of the technology, you idiot, not to the time the article was created..lol OWNED..

I don't even know why you pulled those tables. First, they are from 2006, the year the 8800 GTX came along. Not only that, they are comparing a single core of a dual-core CPU at a time when multicore was in its infancy on PC. Second, they are using the whole GPU, which is the reason for the huge gap, but you can't use all of your GPU for compression, because then with what the fu** will you run the game, effects, textures and everything else?

Also, in 2006 the bandwidth required by a GPU was way different than now. In fact the 8800 GTX has 88GB/s, which is almost half of what the 7850 has; GPUs today demand more bandwidth. The article was written when the 7850 was already out, so yeah, you can save bus traffic by just using the CPU instead of the GPU, especially on PS4, and even more in this case since 4x MSAA requires a ton of bandwidth.

5-You are a butthurt moron who loves to defend Xbox One games with crappy limitations. It is a fact, and you know sh** about what you're talking about, proven once again.

But but the article is from 2007..lol

#284 Posted by zeeshanhaider (2191 posts) -

@tormentos:

@zeeshanhaider said:

1) Neither is set in stone.

2) PS exclusives are nothing but on-rails, linear QTE-fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have the 8-core version doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 have a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh, and what the fuck is Substance Engine? Never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Calling me an alt won't change that.

And the most important thing: since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

AMD's 3DC+ was included in DX10 as BC4. There are other new native texture formats in DX11.

AMD Jaguar was designed to almost rival ARM Cortex A15's cost, scaling from tablets to micro-server arrays.

AMD Jaguar's physical-address improvements (i.e. from Opteron) were designed for servers. AMD Jaguar's memory controllers have support for servers' ECC memory.

AMD Jaguar is not specifically designed just for tablets; i.e. AMD Mullins SoC's 2.5 watts *is* designed for tablets.

Jaguar architecture (Kabini and Temash)

Main article: Jaguar (microarchitecture)

In January 2013 the Jaguar-based Kabini and Temash APUs were unveiled as the successors of the Bobcat-based Ontario, Zacate and Hondo APUs.[44][45][46] The Kabini APU is aimed at the low-power, subnotebook, netbook, ultra-thin and small form factor markets, the Temash APU is aimed at the tablet, ultra-low power and small form factor markets.[46] The two to four Jaguar cores of the Kabini and Temash APUs feature numerous architectural improvements regarding power requirement and performance, such as support for newer x86-instructions, a higher IPC count, a CC6 power state mode and clock gating.[47][48][49] Kabini and Temash are AMD's first, and also the first ever quad-core x86 based SoCs.[50] The integrated Fusion Controller Hubs (FCH) for Kabini and Temash are codenamed "Yangtze" and "Salton" respectively.[51] The Yangtze FCH features support for two USB 3.0 ports, two SATA 6 Gbit/s ports, as well as the xHCI 1.0 and SD/SDIO 3.0 protocols for SD-card support.[51] Both chips feature DirectX 11.1-compliant GCN-based graphics as well as numerous heterogeneous system architecture (HSA) improvements.[44][45] They were fabricated at a 28 nm process in an FT2 BGA package by TSMC, and were released on May 23, 2013.[47][52][53]

The PlayStation 4 and Xbox One, were revealed to both be powered by 8-core semi-custom Jaguar-derived APUs.

Straight from Wikipedia.

Yup, X1 and PS4 are made up of an extended version of Jaguar, i.e. they have cheap ass tablet CPUs.

#285 Edited by ronvalencia (15103 posts) -

@zeeshanhaider said:

@ronvalencia said:

@tormentos:

@zeeshanhaider said:

1) Neither is set in stone.

2) PS exclusives are nothing but on-rails, linear QTE-fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have the 8-core version doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 have a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh, and what the fuck is Substance Engine? Never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Calling me an alt won't change that.

And the most important thing: since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

AMD's 3DC+ was included in DX10 as BC4. There are other new native texture formats in DX11.

AMD Jaguar was designed to almost rival ARM Cortex A15's cost, scaling from tablets to micro-server arrays.

AMD Jaguar's physical-address improvements (i.e. from Opteron) were designed for servers. AMD Jaguar's memory controllers have support for servers' ECC memory.

AMD Jaguar is not specifically designed just for tablets; i.e. AMD Mullins SoC's 2.5 watts *is* designed for tablets.

Jaguar architecture (Kabini and Temash)

Main article: Jaguar (microarchitecture)

In January 2013 the Jaguar-based Kabini and Temash APUs were unveiled as the successors of the Bobcat-based Ontario, Zacate and Hondo APUs.[44][45][46] The Kabini APU is aimed at the low-power, subnotebook, netbook, ultra-thin and small form factor markets, the Temash APU is aimed at the tablet, ultra-low power and small form factor markets.[46] The two to four Jaguar cores of the Kabini and Temash APUs feature numerous architectural improvements regarding power requirement and performance, such as support for newer x86-instructions, a higher IPC count, a CC6 power state mode and clock gating.[47][48][49] Kabini and Temash are AMD's first, and also the first ever quad-core x86 based SoCs.[50] The integrated Fusion Controller Hubs (FCH) for Kabini and Temash are codenamed "Yangtze" and "Salton" respectively.[51] The Yangtze FCH features support for two USB 3.0 ports, two SATA 6 Gbit/s ports, as well as the xHCI 1.0 and SD/SDIO 3.0 protocols for SD-card support.[51] Both chips feature DirectX 11.1-compliant GCN-based graphics as well as numerous heterogeneous system architecture (HSA) improvements.[44][45] They were fabricated at a 28 nm process in an FT2 BGA package by TSMC, and were released on May 23, 2013.[47][52][53]

The PlayStation 4 and Xbox One, were revealed to both be powered by 8-core semi-custom Jaguar-derived APUs.

Right from the Wikipedia.

Yup, X1 and PS4 is made up of an extended version of jaguar i.e. They have cheap ass tablet CPUs.

Physical Address improvements to 40 bits.

From http://www.amd.com/us/press-releases/Pages/Press_Release_69663.aspx

For server usage, Opteron has a 40-bit physical address.

AMD Jaguar's memory controllers also support the server ECC standard.

Notice "Improved virtualization": tablets rarely use VM hardware features.

AMD Jaguar's improved virtualization hardware targets x86 servers (micro-server arrays) and Xbox One usage.

http://www.xbitlabs.com/news/cpu/display/20130528200818_AMD_Launches_AMD_Opteron_X_Series_Processors_for_Micro_Servers.html

"The AMD Opteron X-Series processors are now the world’s premier small-core x86 APUs and CPUs, ideal for next-generation scale-out web and cloud applications ranging from big data analytics to image processing, multimedia content delivery, and hosting.

One of the first systems to utilize AMD Opteron “Kyoto” chips will be HP’s Moonshot micro-servers. The HP Moonshot system is particularly well suited for “Kyoto” since it is architected for maximum efficiency."

Wiki?? LOL. instant fail if you use Wiki as source in Uni.

-----

http://www.xbitlabs.com/news/cpu/display/20140122231052_AMD_Mullins_and_Beema_Will_Beat_Intel_Atom_Bay_Trail.html

AMD Beema's 2 watts rivals Intel Bay Trail-T's ~2.5 watts. For tablets and laptops, AMD Beema/Mullins also supports Microsoft's InstantGo feature.

The AMD Jaguar revision inside the X1 and PS4 consoles would run too hot for tablets, and neither console uses tablet-class DDR3L memory.
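On the BC4/3Dc+ point above, the footprint win is easy to show. This is my own back-of-envelope sketch, not from any poster; it assumes the standard BC4 layout of 8 bytes per 4x4 texel block:

```python
# Footprint of a single-channel 2048x2048 texture: uncompressed R8 vs BC4.
# BC4 (AMD 3Dc+, one channel) stores each 4x4 texel block in 8 bytes,
# i.e. 0.5 bytes/texel, versus 1 byte/texel for uncompressed R8.
W = H = 2048
r8 = W * H * 1                    # 1 byte per texel
bc4 = (W // 4) * (H // 4) * 8     # 8 bytes per 4x4 block
print(r8 // 2**20, "MiB vs", bc4 // 2**20, "MiB")  # 4 MiB vs 2 MiB -> 2:1
```

The same block-based scheme is why BC formats also cut sampling bandwidth, not just storage.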

#286 Posted by zeeshanhaider (2191 posts) -

@ronvalencia said:

Wiki?? LOL. instant fail if you use Wiki as source in Uni.

I know, but since an idiot posted from Wiki, I did the same. Anyways, it still doesn't change the fact that the X1/PS4 CPUs are cheap ass tablet CPUs with 64-bit address space. Of course it will have more bits for physical address since they are targeting 8GB.

Now can I have some tests of the PS4/X1 CPU vs an i7? Yeah....I know the tablet CPU just can't keep up.

#287 Edited by ronvalencia (15103 posts) -
@zeeshanhaider said:


I know, but since an idiot posted from Wiki, I did the same. Anyways, it still doesn't change the fact that the X1/PS4 CPUs are cheap ass tablet CPUs with 64-bit address space. Of course it will have more bits for physical address since they are targeting 8GB.

Now can I have some tests of the PS4/X1 CPU vs an i7? Yeah....I know the tablet CPU just can't keep up.

On physical address capability: a 40-bit physical address spans 2^40 bytes = 1 TB per node, LOL. This is a server feature. Another person with math problems.

For comparison, 8 GB needs only a 33-bit physical address (2^33 bytes = 8 GB).

AMD Jaguar-based SoCs have 40-bit physical address capability and ECC support.

AMD Jaguar is also designed for server arrays, i.e. very high density server markets, similar to the large-scale IBM PowerPC 4xx arrays in their supercomputers.
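A quick byte-addressed sanity check of these address widths (a sketch I'm adding; the capacities follow directly from 2^bits bytes):

```python
# Addressable memory for a given physical address width (byte-addressed x86).
def addressable_bytes(bits):
    return 2 ** bits

GIB = 2 ** 30
print(addressable_bytes(33) // GIB)   # 8    -> 8 GiB fits in 33 bits
print(addressable_bytes(36) // GIB)   # 64   -> 36-bit PAE reaches 64 GiB
print(addressable_bytes(40) // GIB)   # 1024 -> 40-bit Jaguar/Opteron: 1 TiB
```

So 40 bits is far beyond anything a tablet or a console needs on its own; it is sized for server nodes.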

---------

Intel Core i7-3770 is just a faster CPU, which is almost pointless for an R9-290X + Mantle setup.

PS: I use an i7-2600 (4.2 GHz) and an i5-2500K (3.6 GHz).

#288 Edited by ronvalencia (15103 posts) -
@tormentos said:

@zeeshanhaider said:

1) None is set in the stone.

2) PS exclusives are nothing but onrails, Linear QTE fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 has a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you since you are using an ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Real-time compression on the GPU may be less useful for transcoding, because of increased bandwidth requirements for uploading uncompressed texture data and because the GPU may already be tasked with expensive rendering work. However, real-time compression on the GPU is very useful for compressed render targets. The compression on the GPU can be used to save memory when rendering to a texture. Furthermore, such compressed render targets can improve the performance if the data from the render target is used for further rendering. The render target is compressed once, while the resulting data may be accessed many times during rendering. The compressed data results in reduced bandwidth requirements during rasterization and can, as such, significantly improve performance. (The things relevant to gaming)

Link

Couple that with we have now native GPU texture formats like the 3Dc from AMD. Oh and what the fuck is surface Engine, never heard about it. Any other tests for your beloved cheap ass tablet CPU? I guess, no.

5) You are dumb and butthurt. Start calling me an alt won't change that.

And the most important thing, since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circle around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next gen consoles(PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you make me lose brain cells.

1-One way or the other 1 of the 2 will be..

2-Considering that you don't know what the fu** on-rails means, your whole argument is stupid. Having QTEs, which Ryse has as well, doesn't make a game on-rails; even though Ryse is the closest thing, it is not on-rails, because you can control your character's movements. Once again, the word you're looking for is linear, Kingtito..

3-No, you blind moron, it is not just for that..

http://en.wikipedia.org/wiki/Jaguar_%28microarchitecture%29#Server

Notebooks, mini PCs, servers and tablets; in fact, how many tablets even use a Jaguar? Most tablets use the ARM architecture, and they are different models; the top Jaguar CPU kicks the crap out of the tablet version.

Fast CPU DXT Compression

Submitted by on Thu, 04/26/2012 - 09:56

Real-time DXT compression algorithms can be implemented on both the CPU (Waveren, 2006) and the GPU (Castano, 2007). While GPU implementations are usually faster, CPU versions are sufficiently fast for real-time applications and result in less bus traffic. Which implementation should be chosen is workload dependent and should be based on where the application is bottlenecked.

http://software.intel.com/en-us/articles/fast-cpu-dxt-compression

The article was posted on April 26 of 2012, moron; the 2006 and 2007 dates refer to the implementations of the technology, you idiot, not to the time the article was created..lol OWNED..

I don't even know why you pull those tables. First, they are from 2006, the year the 8800GTX came along; not only that, they are comparing a single core of a dual-core CPU at a time when multicore was in its infancy on PC. Second, they are using the whole GPU, which is the reason for the huge gap, but you can't use all of your GPU for compression, because with what the fu** will you run the game, effects, textures and everything else?

Also, in 2006 the bandwidth required by a GPU was way different than now; in fact the 8800GTX has 88GB/s, almost half of what the 7850 has. GPUs today demand more bandwidth, and the article was written when the 7850 was already out, so yeah, you can save bus traffic by just using the CPU instead of the GPU, especially on the PS4, and even more in this case since 4XMSAA requires a ton of bandwidth.

5-You are a butthurt moron who loves to defend xbox one games with crappy limitations. It is a fact, and you know sh** about what you're talking about, proven once again.

But but the article is from 2007..lol

Both consoles use a unified memory architecture, and it doesn't matter which processor (CPU or GPU) generates the procedural workloads i.e. why restrict developers?

Crytek's Xbox One/PS4 4th-gen CryEngine demo includes procedural GPGPU weather effects.

X1 can use both CPUs (->main memory) and GPU (->32MB eSRAM or main memory) for procedural workloads.

PS4 can also use both CPUs (->main memory) and GPU (->main memory) for procedural workloads.

Both can transfer data from the CPU to the GPU.
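For scale on the bandwidth argument above, here is a rough per-frame sketch. The numbers are my own illustration (a 1080p render target at 30fps, DXT1's 0.5 bytes/texel vs RGBA8's 4), not figures from either poster:

```python
# Back-of-envelope: bandwidth to read a 1920x1080 render target once per
# frame at 30fps, uncompressed RGBA8 (4 bytes/px) vs DXT1 (0.5 bytes/px).
W, H, FPS = 1920, 1080, 30
rgba8 = W * H * 4 * FPS      # bytes/second, uncompressed
dxt1 = W * H * 0.5 * FPS     # bytes/second, DXT1-compressed (8:1 vs RGBA8)
print(rgba8 / 1e6, "MB/s vs", dxt1 / 1e6, "MB/s")  # ~248.8 vs ~31.1
```

Against an 88 GB/s or 153 GB/s memory bus either figure is small for a single pass, which is why the compression argument only matters when the target is re-read many times during rendering.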

#289 Posted by zeeshanhaider (2191 posts) -

@ronvalencia said:

On physical address capability: a 40-bit physical address spans 2^40 bytes = 1 TB per node, LOL. This is a server feature. Another person with math problems.

For comparison, 8 GB needs only a 33-bit physical address (2^33 bytes = 8 GB).

AMD Jaguar based SoCs has 40bits physical address capability and ECC support.

AMD Jaguar is also designed for server array i.e very high density server markets e.g. similar to large scale IBM PowerPC 4xx array in their super computers.

It's not that I don't understand how address space works. I was referring to the two standards: since they are x86, they would go for 64 bits to access more than 4GB of memory, which is restricted under 32 bits. If you are talking about other address spaces, then Intel also had PAE (Physical Address Extension), which was 36 bits. So, no sir, I don't have any math problems.

I don't care about the server array. The point is the X1/PS4 CPUs are tablet CPUs with some tweaks here and there.

And stop being an AMD (Corporation) fanboy. Leave this sickness to the consolites. PC gaming doesn't need loyalty to corporations.

#290 Edited by ronvalencia (15103 posts) -

@zeeshanhaider said:


It's not that I don't understand how address space works. I was referring to the two standards: since they are x86, they would go for 64 bits to access more than 4GB of memory, which is restricted under 32 bits. If you are talking about other address spaces, then Intel also had PAE (Physical Address Extension), which was 36 bits. So, no sir, I don't have any math problems.

I don't care about the server array. The point is the X1/PS4 CPUs are tablet CPUs with some tweaks here and there.

And stop being an AMD (Corporation) fanboy. Leave this sickness to the consolites. PC gaming doesn't need loyalty to corporations.

LOL. AMD Temash is a failure in the tablet market and it has a higher success rate in the console market i.e. >3 million for X1 and >4 million for PS4. For real tablets, it needs to be around 2 watts.

36-bit PAE still maps 32-bit virtual addresses, just through an extra page-table level to reach more physical memory; each process still sees at most 4 GB, i.e. X86 address kludges at their best.

AMD Beema SoC's 2 watts is AMD's real effort for real tablets.

#291 Posted by tormentos (16275 posts) -

@zeeshanhaider said:

The Kabini APU is aimed at the low-power, subnotebook, netbook, ultra-thin and small form factor markets, the Temash APU is aimed at the tablet, ultra-low power and small form factor markets.[46]

The PlayStation 4 and Xbox One, were revealed to both be powered by 8-core semi-custom Jaguar-derived APUs.

Right from the Wikipedia.

Yup, X1 and PS4 is made up of an extended version of jaguar i.e. They have cheap ass tablet CPUs.

I call that being owned by your own link.

Temash is the tablet version; the PS4 version is Kabini, you idiot..

http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs

So yeah, you're still owned..

It's not that I don't understand how address space works. I was referring to the two standards: since they are x86, they would go for 64 bits to access more than 4GB of memory, which is restricted under 32 bits. If you are talking about other address spaces, then Intel also had PAE (Physical Address Extension), which was 36 bits. So, no sir, I don't have any math problems.

I don't care about the server array. The point is the X1/PS4 CPUs are tablet CPUs with some tweaks here and there.

And stop being an AMD (Corporation) fanboy. Leave this sickness to the consolites. PC gaming doesn't need loyalty to corporations.

The PS4 doesn't need an i7; it is not running Windows.

You are a sad troll who defends xbox one games that perform like sh**; basically you are a lemming pretending to be a hermit..lol

#292 Posted by zeeshanhaider (2191 posts) -

@zeeshanhaider said:

@ronvalencia said:
@zeeshanhaider said:

@ronvalencia said:

@zeeshanhaider said:

@ronvalencia said:

@tormentos:

@zeeshanhaider said:

1) None is set in the stone.

2) PS exclusives are nothing but onrails, Linear QTE fest, press 'X' to win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 has a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh and what the fuck is surface Engine, never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Start calling me an alt won't change that.

And the most important thing, since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next-gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

AMD's 3DC+ was included in DX10 as BC4. There are other new native texture formats in DX11.
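Since BC4 comes up here, a quick sanity check on why it matters: BC4 stores one channel at a fixed 8 bytes per 4x4 texel block, i.e. half a byte per texel versus one byte per texel for raw 8-bit data. A minimal size-math sketch (the function name is just for illustration):

```python
# BC4 (a.k.a. ATI1 / 3Dc+) block compression: 8 bytes per 4x4 block
# of single-channel texels, so 0.5 bytes per texel.
def bc4_size_bytes(width: int, height: int) -> int:
    blocks_x = (width + 3) // 4   # round partial blocks up to whole blocks
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * 8

# A 1024x1024 single-channel map: 1 MiB raw (8-bit), 512 KiB as BC4.
print(bc4_size_bytes(1024, 1024))  # 524288
```

That 2:1 saving over raw 8-bit data is the whole point of having it as a native GPU format.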

AMD Jaguar was designed to almost rival the ARM Cortex A15's cost, scaling from tablets to micro-server arrays.

AMD Jaguar's physical address improvements (i.e. from Opteron) were designed for servers. AMD Jaguar's memory controllers have support for servers' ECC memory.

AMD Jaguar is not specifically designed just for tablets i.e. AMD Mullins SoC's 2.5 watts *is* designed for tablets.

Jaguar architecture (Kabini and Temash)

Main article: Jaguar (microarchitecture)

In January 2013 the Jaguar-based Kabini and Temash APUs were unveiled as the successors of the Bobcat-based Ontario, Zacate and Hondo APUs.[44][45][46] The Kabini APU is aimed at the low-power, subnotebook, netbook, ultra-thin and small form factor markets, the Temash APU is aimed at the tablet, ultra-low power and small form factor markets.[46] The two to four Jaguar cores of the Kabini and Temash APUs feature numerous architectural improvements regarding power requirement and performance, such as support for newer x86-instructions, a higher IPC count, a CC6 power state mode and clock gating.[47][48][49] Kabini and Temash are AMD's first, and also the first ever quad-core x86 based SoCs.[50] The integrated Fusion Controller Hubs (FCH) for Kabini and Temash are codenamed "Yangtze" and "Salton" respectively.[51] The Yangtze FCH features support for two USB 3.0 ports, two SATA 6 Gbit/s ports, as well as the xHCI 1.0 and SD/SDIO 3.0 protocols for SD-card support.[51] Both chips feature DirectX 11.1-compliant GCN-based graphics as well as numerous heterogeneous system architecture (HSA) improvements.[44][45] They were fabricated at a 28 nm process in an FT2 BGA package by TSMC, and were released on May 23, 2013.[47][52][53]

The PlayStation 4 and Xbox One, were revealed to both be powered by 8-core semi-custom Jaguar-derived APUs.

Right from the Wikipedia.

Yup, X1 and PS4 is made up of an extended version of jaguar i.e. They have cheap ass tablet CPUs.

Physical Address improvements to 40 bits.

From http://www.amd.com/us/press-releases/Pages/Press_Release_69663.aspx

For server usage, Opteron has a 40-bit physical address.

AMD Jaguar's memory controllers also include the server ECC standard.

Notice "Improved virtualization", i.e. tablets rarely use VM hardware features.

AMD Jaguar's improved virtualization hardware targets X86 Servers (micro-servers array) and Xbox One usage.

http://www.xbitlabs.com/news/cpu/display/20130528200818_AMD_Launches_AMD_Opteron_X_Series_Processors_for_Micro_Servers.html

"The AMD Opteron X-Series processors are now the world’s premier small-core x86 APUs and CPUs, ideal for next-generation scale-out web and cloud applications ranging from big data analytics to image processing, multimedia content delivery, and hosting.

One of the first systems to utilize AMD Opteron “Kyoto” chips will be HP’s Moonshot micro-servers. The HP Moonshot system is particularly well suited for “Kyoto” since it is architected for maximum efficiency."

Wiki?? LOL. Instant fail if you use Wiki as a source in uni.

I know, but since an idiot posted from Wiki, I did the same. Anyways, it still doesn't change the fact that X1/PS4 CPUs are cheap ass tablet CPUs with 64-bit address space. Of course it will have more bits for physical address since they are targeting 8GB.

Now can I have some tests of PS4/X1 CPU vs i7? Yeah....I know the tablet CPU just can't keep up.

On physical address capability, 40 bits = 2^40 bytes = ~1 Terabyte per node LOL. This is a server feature. Another person with math problems.

8 GB only needs a 33-bit physical address (2^33 bytes = 8 GB).

AMD Jaguar-based SoCs have 40-bit physical address capability and ECC support.

AMD Jaguar is also designed for server arrays, i.e. very high-density server markets, e.g. similar to the large-scale IBM PowerPC 4xx arrays in their supercomputers.
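The address-width figures being traded here are easy to check directly, bearing in mind that physical address bits index bytes, so an n-bit address reaches 2^n bytes. A minimal sketch (names are illustrative):

```python
# An n-bit physical address selects one of 2**n byte locations.
def addressable_bytes(address_bits: int) -> int:
    return 2 ** address_bits

GIB = 2 ** 30  # one gibibyte

print(addressable_bytes(32) // GIB)  # 4    -> plain 32-bit: 4 GiB
print(addressable_bytes(36) // GIB)  # 64   -> Intel PAE (36-bit): 64 GiB
print(addressable_bytes(40) // GIB)  # 1024 -> 40-bit Opteron/Jaguar: 1 TiB
print((8 * GIB).bit_length() - 1)    # 33   -> bits needed to cover 8 GiB
```

So a 40-bit physical address is indeed far more than the consoles' 8 GB requires; it is sized for server nodes.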

It's not that I don't understand how address space works. I was referring to the two standards. Since they are x86, they would go for 64 bits to access more than 4GB of memory, which is restricted in 32 bits. If you are talking about other address spaces, then Intel also had PAE (Physical Address Extension), which was 36 bits. So, no sir, I don't have any math problems.

I don't care about the server array. The point is X1/PS4 CPUs are tablet CPUs with some tweaks here and there.

And stop being an AMD (Corporation) fanboy. Leave this sickness to the consolites. PC gaming doesn't need loyalty to corporations.

LOL. AMD Temash is a failure in the tablet market and it has a higher success rate in the console market i.e. >3 million for X1 and >4 million for PS4. For real tablets, it needs to be around 2 watts.

36-bit PAE maps 32-bit virtual address spaces onto more physical memory through extended page tables, i.e. x86 paging hacks at their best.

AMD Beema SoC's 2 watts is AMD's real effort for real tablets.

That's what I was saying all along. Next-gen CPUs are cheap ass tablet versions.

#293 Edited by clyde46 (43257 posts) -

@clyde46:

@clyde46 said:

I see the AMD shills are out in force.

What's the matter? Nvidia still has $hit CPUs i.e. for both x86 and ARM.

Lol wut? Nvidia has the best single GPU card on the planet.

#294 Posted by ronvalencia (15103 posts) -

@clyde46 said:

@ronvalencia said:

@clyde46:

@clyde46 said:

I see the AMD shills are out in force.

What's the matter? Nvidia still has $hit CPUs i.e. for both x86 and ARM.

Lol wut? Nvidia has the best single GPU card on the planet.

One should expect more from a 551 mm^2 GPU die.

#295 Posted by ronvalencia (15103 posts) -

@ronvalencia said:

@zeeshanhaider said:

@ronvalencia said:
@zeeshanhaider said:

@ronvalencia said:

@zeeshanhaider said:

@ronvalencia said:

@tormentos:

@zeeshanhaider said:

1) Nothing is set in stone.

2) PS exclusives are nothing but on-rails, linear QTE fests, press-'X'-to-win movies for brain dead dummies.

3) Jaguar is a tablet CPU. Designed for it. Just because tablets don't have it doesn't make it worthwhile. It only shows that it has no worth. X1 and PS4 have a cheap ass tablet CPU.

4) Thank you for posting an excerpt from an article written in 2007 while we are in 2014. Suits you, since you are using ancient hardware. The graphics scenario for games is radically different after the introduction of DirectX 10.

Couple that with the fact that we now have native GPU texture formats like 3Dc from AMD. Oh and what the fuck is surface Engine, never heard of it. Any other tests for your beloved cheap ass tablet CPU? I guess not.

5) You are dumb and butthurt. Start calling me an alt won't change that.

And the most important thing, since Ryse > Crapzone: Shadow Fail => Crysis 2 from 2011 runs circles around Crapzone: Shadow Fail and will forever be a benchmark for these outdated next-gen consoles (PS4/X1) to reach. Remain butthurt. And this is my last post to you. Talking to you makes me lose brain cells.

AMD's 3DC+ was included in DX10 as BC4. There are other new native texture formats in DX11.

AMD Jaguar was designed to almost rival the ARM Cortex A15's cost, scaling from tablets to micro-server arrays.

AMD Jaguar's physical address improvements (i.e. from Opteron) were designed for servers. AMD Jaguar's memory controllers have support for servers' ECC memory.

AMD Jaguar is not specifically designed just for tablets i.e. AMD Mullins SoC's 2.5 watts *is* designed for tablets.

Jaguar architecture (Kabini and Temash)

Main article: Jaguar (microarchitecture)

In January 2013 the Jaguar-based Kabini and Temash APUs were unveiled as the successors of the Bobcat-based Ontario, Zacate and Hondo APUs.[44][45][46] The Kabini APU is aimed at the low-power, subnotebook, netbook, ultra-thin and small form factor markets, the Temash APU is aimed at the tablet, ultra-low power and small form factor markets.[46] The two to four Jaguar cores of the Kabini and Temash APUs feature numerous architectural improvements regarding power requirement and performance, such as support for newer x86-instructions, a higher IPC count, a CC6 power state mode and clock gating.[47][48][49] Kabini and Temash are AMD's first, and also the first ever quad-core x86 based SoCs.[50] The integrated Fusion Controller Hubs (FCH) for Kabini and Temash are codenamed "Yangtze" and "Salton" respectively.[51] The Yangtze FCH features support for two USB 3.0 ports, two SATA 6 Gbit/s ports, as well as the xHCI 1.0 and SD/SDIO 3.0 protocols for SD-card support.[51] Both chips feature DirectX 11.1-compliant GCN-based graphics as well as numerous heterogeneous system architecture (HSA) improvements.[44][45] They were fabricated at a 28 nm process in an FT2 BGA package by TSMC, and were released on May 23, 2013.[47][52][53]

The PlayStation 4 and Xbox One were both revealed to be powered by 8-core semi-custom Jaguar-derived APUs.

Right from Wikipedia.

Yup, X1 and PS4 are made up of an extended version of Jaguar, i.e. they have cheap ass tablet CPUs.

Physical Address improvements to 40 bits.

From http://www.amd.com/us/press-releases/Pages/Press_Release_69663.aspx

For server usage, Opteron has a 40-bit physical address.

AMD Jaguar's memory controllers also include the server ECC standard.

Notice "Improved virtualization", i.e. tablets rarely use VM hardware features.

AMD Jaguar's improved virtualization hardware targets X86 Servers (micro-servers array) and Xbox One usage.

http://www.xbitlabs.com/news/cpu/display/20130528200818_AMD_Launches_AMD_Opteron_X_Series_Processors_for_Micro_Servers.html

"The AMD Opteron X-Series processors are now the world’s premier small-core x86 APUs and CPUs, ideal for next-generation scale-out web and cloud applications ranging from big data analytics to image processing, multimedia content delivery, and hosting.

One of the first systems to utilize AMD Opteron “Kyoto” chips will be HP’s Moonshot micro-servers. The HP Moonshot system is particularly well suited for “Kyoto” since it is architected for maximum efficiency."

Wiki?? LOL. Instant fail if you use Wiki as a source in uni.

I know, but since an idiot posted from Wiki, I did the same. Anyways, it still doesn't change the fact that X1/PS4 CPUs are cheap ass tablet CPUs with 64-bit address space. Of course it will have more bits for physical address since they are targeting 8GB.

Now can I have some tests of PS4/X1 CPU vs i7? Yeah....I know the tablet CPU just can't keep up.

On physical address capability, 40 bits = 2^40 bytes = ~1 Terabyte per node LOL. This is a server feature. Another person with math problems.

8 GB only needs a 33-bit physical address (2^33 bytes = 8 GB).

AMD Jaguar-based SoCs have 40-bit physical address capability and ECC support.

AMD Jaguar is also designed for server arrays, i.e. very high-density server markets, e.g. similar to the large-scale IBM PowerPC 4xx arrays in their supercomputers.

It's not that I don't understand how address space works. I was referring to the two standards. Since they are x86, they would go for 64 bits to access more than 4GB of memory, which is restricted in 32 bits. If you are talking about other address spaces, then Intel also had PAE (Physical Address Extension), which was 36 bits. So, no sir, I don't have any math problems.

I don't care about the server array. The point is X1/PS4 CPUs are tablet CPUs with some tweaks here and there.

And stop being an AMD (Corporation) fanboy. Leave this sickness to the consolites. PC gaming doesn't need loyalty to corporations.

LOL. AMD Temash is a failure in the tablet market and it has a higher success rate in the console market i.e. >3 million for X1 and >4 million for PS4. For real tablets, it needs to be around 2 watts.

36-bit PAE maps 32-bit virtual address spaces onto more physical memory through extended page tables, i.e. x86 paging hacks at their best.

AMD Beema SoC's 2 watts is AMD's real effort for real tablets.

That's what I was saying all along. Next-gen CPUs are cheap ass tablet versions.

Funny that Intel Core i3/i5 ULVs have more tablet design wins than AMD Temash. LOL.

#296 Posted by clyde46 (43257 posts) -

@clyde46 said:

@ronvalencia said:

@clyde46:

@clyde46 said:

I see the AMD shills are out in force.

What's the matter? Nvidia still has $hit CPUs i.e. for both x86 and ARM.

Lol wut? Nvidia has the best single GPU card on the planet.

One should expect more from a 551 mm^2 GPU die.

Best single GPU on the planet.....

#297 Edited by ronvalencia (15103 posts) -

@clyde46 said:

@ronvalencia said:

@clyde46 said:

@ronvalencia said:

@clyde46:

@clyde46 said:

I see the AMD shills are out in force.

What's the matter? Nvidia still has $hit CPUs i.e. for both x86 and ARM.

Lol wut? Nvidia has the best single GPU card on the planet.

One should expect more from a 551 mm^2 GPU die.

Best single GPU on the planet.....

Depends on the workload type, e.g. the 780 Ti wouldn't win at Litecoin mining.

From http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming/

"For the most part, performance was similar between the two video cards. There were a couple cases where the 780 Ti was small percentages faster, and a couple where the R9 290X were small percentages faster. On the whole, both cards are so similar to each other at Ultra HD 4K gaming that it would be impossible to discern between these while gaming on a 4K display"

#298 Edited by CrownKingArthur (3494 posts) -
@ronvalencia said:

@clyde46 said:

@ronvalencia said:

@clyde46 said:

@ronvalencia said:

@clyde46:

@clyde46 said:

I see the AMD shills are out in force.

What's the matter? Nvidia still has $hit CPUs i.e. for both x86 and ARM.

Lol wut? Nvidia has the best single GPU card on the planet.

One should expect more from a 551 mm^2 GPU die.

Best single GPU on the planet.....

Depends on the workload type, e.g. the 780 Ti wouldn't win at Litecoin mining.

From http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming/

"For the most part, performance was similar between the two video cards. There were a couple cases where the 780 Ti was small percentages faster, and a couple where the R9 290X were small percentages faster. On the whole, both cards are so similar to each other at Ultra HD 4K gaming that it would be impossible to discern between these while gaming on a 4K display"

absolutely the workload type will affect relative performance

http://www.anandtech.com/bench/product/1056?vs=1072

clyde has a point though. faster is faster. regardless of die size, cost - whatever.

http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-19.html

#299 Edited by ronvalencia (15103 posts) -