Worst gaming engine for XB1 gets boost thanks to DX12


#101  Edited By tormentos
Member since 2003 • 33784 Posts

@delta3074 said:

'Second is not a fu**ing SDK i don't care what they say is fu**ing PR let see if Starwars battlefront is 1080p because of the latest SDK then i say yes i loss.'

So Konami are Lying?

You know what, tormentos, I am done with you. Even when people post evidence to debunk your claims, you just call the evidence lies or PR speak.

I'm getting fed up of you calling credible game developers, who have more knowledge of coding games in their little finger than you have in your whole body, liars or wrong, when you have probably never even coded a rubbish 8-bit game.

What I cannot stand is people who don't know how to do the job claiming they know better than people who DO know how to do the job.

You are NOT an expert when it comes to hardware. You probably have zero coding qualifications; you unload boats, for crying out loud, so please stop making out that you know better than the people at Konami about the Xbone's hardware. You don't. They have access to Xbone dev kits; you don't.

I was pretty sure a new SDK wouldn't increase resolutions; I thought the best they would get was an improvement in development. Check my posting history, because you are not the only one that was wrong; I was wrong as well.

Unlike you, I just don't have a problem admitting I was wrong, and I am no game developer, so I am not going to sit here and say Konami have no clue what they are talking about, or that they are lying, because they know how to code games and I don't.

The closest I ever got to coding was a book I had for my Commodore 64: you copied the lines of code from the book and, hey presto, Space Invaders.

It's PR, just like Rebellion did, and if you can't see that, it's your problem.

Fact is, Rebellion said the same thing in February 2014: that they were getting 1080p because of a faster SDK. After that there has been a barrage of 900p games on Xbox One, so the whole new SDK means little. Fact is, any game can be 1080p on Xbox One; it is just the quality and frame rate that determine whether it can be reached, and MGS5 and PES should never have been 720p.

The funny thing is, I posted evidence of another developer stating the same more than a year ago, and the disparity continues. It will not end, because the PS4 has more juice: to get resolution parity the Xbox One must give up frames, and to gain frame parity it must give up resolution, and sometimes even quality.

You believe whatever the hell you want. DX12 will be Sony's 4D; mark those words, and 3 years from now, when games still come in at lower resolutions on Xbox One, I will be here quoting you.

You learned so little from last gen: "720p minimum with 4xAA", "4D", "120FPS". It's like you can't learn. The next game with sub-HD graphics is getting you quoted, you, hdcarraher and Stormyjoke.

@StormyJoe said:

Stupid - the SDK update improved the game's performance. End of story. You lose... again.

Yes, he did - your own f*cking link to his comments said so. Do you do selective reading?

It won't last forever - Christ, more and more games are going to 1080p on XB1. What the f*ck is the matter with you, do you even read game articles???

Yeah, on a game that should have been 1080p since day 1; even a freaking 7750 ran it at 1080p, so I lost shit. And you holding on to Diablo 3 and Destiny was a total joke; you lost, and after those 2 a barrage of sub-1080p games have hit, so in other words resolution-gate continues, contrary to what you claimed.

Dude, more and more games continue to be sub-HD on Xbox, idiot. Where the fu** have you been?

Recently.

Project Cars.

Batman

BFH

The witcher 3

F1 2015

Mortal Kombat X

Dying Light

These games were all released this year so far, all inferior resolution-wise on Xbox One, and one is even 720p, which should not even exist, since the SDK that made Diablo 3 and SE3 1080p should have done the same by your logic....lol

@nyadc said:

Damn tormentos is getting destroyed from every possible DirectXion.

There is not even one here which can do that.

@04dcarraher said:

@tormentos: You're so full of crap

This is all I will say to you, you blind, biased lemming....

CPUs with four or more cores need not apply -- Alien: Isolation is seeking an affordable dual-core Core i3 processor but will also accept lowly Celeron, FX or Phenom II chips. Of course, that could change when running either SLI or Crossfire, but throwing that kind of GPU power at Alien: Isolation is a waste.

This is what Techspot says about the crappy game you call CPU intensive, you idiot. As you can fu**ing see, even a damn Celeron G1820 is enough to feed a damn R9 290X at 89FPS, just 17 frames shy of the top-of-the-line CPU in the test; even my crappy FX-6350 runs it 7 frames behind a CPU which costs 3 times what mine cost. What the fu**...

And tell me that the Celeron has a better time feeding that R9 290X than the XBO's 6 cores at 1.75GHz have feeding a damn 7770-like GPU, so I can completely and utterly tag you as a total lost case and an utter fanboy who argues with me just to go against the current.

http://www.techspot.com/review/903-alien-isolation-benchmarks/page5.html

Alien: Isolation is not, I repeat, is NOT CPU intensive...

Just for fun: which has the bigger CPU impact, I wonder? Yet the PS4 with 6 cores beats the Xbox One with 7 on a game that is more CPU intensive; mind you, this test is on a Titan X at 1080p.

@XanderZane said:

Hhhmm.. Konami didn't say anything about TIME fixing anything. They specifically said the NEW SDK is what provided the fix. So either you're a blind fanboy who didn't read the article, or you're damage controlling. Microsoft had already announced that DX12 & Win10 would launch to devs in July 2015.

http://wccftech.com/microsoft-windows-10-os-expected-launch-july-2015-reveals-amd-earnings-call/

Don't think it's that hard to put two and two together. Konami didn't say anything about having fixed their engine or optimized it better. I don't think DX12 is a simple call to an SDK, especially when the hardware in the XB1 has never been fully utilized. Tomb Raider isn't using DX12 and neither is Forza 6 (which runs at 1080p 60fps with weather effects). So if this new SDK doesn't have DX12 in it, as you trolls keep claiming, I can only imagine what will happen when it arrives. I'm sure we'll know the whole story before the year is over.

It's PR, blackace, nothing more.


#102  Edited By NyaDC
Member since 2014 • 8006 Posts

@tormentos said:

WALL OF BULLSHIT AND IGNORANCE

Hey check this out, nobody cares, you got served countless times in this thread, give up, you've lost.


#103 tormentos
Member since 2003 • 33784 Posts

@XanderZane said:

LOL!! Not BX4.

You are all the same, you, NyaDC, B4X and Blackace. Man, stop, you have like 4 freaking accounts all at once..lol

@Zero_epyon said:

So now I'm a fanboy for being skeptical about an assumption? This would be the very first I've ever heard a simple SDK update make this big of a change to a game. So forgive me if I don't quite believe that they just changed a few calls here and there and magically made their game produce 1,336,320 more pixels at twice the rate. Think about that. That would mean that with the same treatment, a game already doing 1080p/60FPS can do around 1440p/120FPS just by using the SDK. That's MisterX stuff right there.

Now I'll say this again, I believe that Konami has been working on their engine all this time and were very close and the SDK pushed them over enough to achieve it. I'm not saying that the SDK did nothing, so I'm not damage controlling. I'm only putting things in perspective so that fanboys don't go foaming at the mouth expecting every game to have similar bumps.

Game engine improvements make a bigger difference than SDK improvements. This is a fact.

That link is for PC. The Xbox One's Win10 features won't launch with PC Windows 10.


So it's legitimate speculation that DX12 is not present in the latest SDK, hence, if you read the article, the rep thanks the SDK, not DX12. You guys are the one interchanging the two and misleading other people.

The two games prove my point. Other games have no problems running at 1080p on the xbox one and some run at 60fps. It's due to their efficient engines, not a magical SDK.

This is like the 6th SDK the Xbox One has received, if not more, and every time there is a new game that hits 1080p, but after that the machine goes back to pumping out 900p games. It's like SDK improvements work on 1 game and nothing more..lol

Rebellion said the same in February 2014, again saying a new, faster SDK allowed for 1080p; then, after their game came out, several more 900p games hit, and 720p as well. There is no fix for the Xbox One's problem, and lemmings just can't grasp that.

@babyjoker1221 said:

Tormentos literally getting destroyed in this thread. I wouldn't be surprised if he fails to show his face in this thread again.

Lol at his thinking Project Cars is cpu intensive. The idiot has made that his bread and butter claim, but is completely oblivious to the fact that PC is gpu intensive.... not cpu. That's how racing games work. Lmao.

You're showing your ass here after all the crap you invented a few days ago? Hahahaaaaaaaaaaaaaaaaaa

But but but Saints Row runs better on Xbox One..lol

Do you still have some of that shit you smoked? Pass it over, man....


#104 StrongBlackVine
Member since 2012 • 13262 Posts

@XanderZane: PES has been confirmed 1080p, but not MGSV.


#105  Edited By tormentos
Member since 2003 • 33784 Posts

@nyadc said:

Hey check this out, nobody cares, you got served countless times in this thread, give up, you've lost.

The fact that you try to claim that I was served makes this even better, B4X....

It just goes to show how ignorant and blind some of you are, but what can we expect from a hypocrite who downplays SF5 because it is on PC but doesn't downplay the Gears collection, which is on PC, 360 and Xbox One...lol


#106  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@nyadc said:
@tormentos said:

WALL OF BULLSHIT AND IGNORANCE

Hey check this out, nobody cares, you got served countless times in this thread, give up, you've lost.

QFT


#107 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos: Why do you keep claiming that Project Cars is an A.I. heavy game when trying to prove that A.I. and CPU power don't correlate? The races are capped to 20 cars on the consoles. That removes the bottleneck for the most part. 19 opponents is hardly an A.I. feat these days. Why do you think the number of NPC's in the PC versions of GTA V and AC Unity are so much greater? Because those things require CPU power.

Capping both consoles at 20 cars minimizes any CPU differences and allows the PS4's GPU to shine. It's as simple as that. But saying "Look the PS4 runs Project Cars better and has A.I. so that proves it's not CPU heavy" is silly.





#108  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@GoldenElementXL:

I mean, look at the CPU usage between what he thinks is more taxing vs another game, totally ignoring the reason why one game doesn't feed the GPU vs the other game.

We can see with the FX6300 that both games do not tax the CPU to the brink. On the hexacore we can see 45% average usage across all cores, while Project Cars uses 57%: a whopping 12-point difference in CPU usage. Even with the quad core and the 8 core, it is only around a 15-point difference in usage between the games. Now let's look at The Witcher 3 as an example of high CPU usage: the quad core has 92% usage vs Project Cars' 80%, which is near full tilt. Then on the 6 core it's 86% with The Witcher vs Cars' 57%, nearly a 30-point difference, and with the eight core, 69% usage vs 51% with Cars.
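Taking the usage figures above at face value, the gaps the post describes can be checked with a few lines (all numbers are the post's own, not fresh measurements):

```python
# Average CPU usage (%) per core count, as quoted in the post above.
# These are the poster's figures, reproduced only to work out the gaps.
usage = {
    "quad core": {"Project Cars": 80, "Witcher 3": 92},
    "hexacore":  {"Project Cars": 57, "Witcher 3": 86, "Alien: Isolation": 45},
    "eight core": {"Project Cars": 51, "Witcher 3": 69},
}

for cores, games in usage.items():
    pc = games["Project Cars"]
    for game, pct in games.items():
        if game != "Project Cars":
            print(f"{cores}: {game} {pct}% vs Project Cars {pc}% "
                  f"= {abs(pct - pc)} point gap")
```

Note the gaps are percentage points of total CPU load, not relative speed differences between the games.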


#109  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

This is like the 6th SDK the Xbox One has received, if not more, and every time there is a new game that hits 1080p, but after that the machine goes back to pumping out 900p games. It's like SDK improvements work on 1 game and nothing more..lol

Rebellion said the same in February 2014, again saying a new, faster SDK allowed for 1080p; then, after their game came out, several more 900p games hit, and 720p as well. There is no fix for the Xbox One's problem, and lemmings just can't grasp that.

SDK-based improvements don't equal "plug in a higher grade GPU" type improvements.

When it comes to extracting performance from hardware, nothing beats a higher grade GPU with fast, large GDDR5 memory, i.e. a straightforward programming model.

BradW has stated the complexity of tiling and iterative access with 32MB ESRAM vs GDC 2015's split rendering target, which combines ESRAM and DDR3 for the frame buffer workload. Xbox 360 is unable to handle this XBO workaround since its ROPs are located in the EDRAM chip.

With each SDK improvement attempt, there's a method to improve performance and there's a programming learning curve to cross.

You can't grasp the programming model details.

The Intel Celeron G1820 *is* a dual-core Intel Haswell, which has the same CPU core design as my Intel Core i7-4770K and i7-4790K. Intel simply makes better CPUs than AMD, i.e. that's how far AMD has fallen from Intel's standard. For the short term, AMD is betting on Windows 10 to change its PC CPU fortunes.

On PC, NVIDIA driver has support for optional DX11.0 command list MT, hence enables some MT scaling.

From the AnandTech article: - On that note, for those of you who have been asking about support for D3D11 Driver Command Lists – an optional D3D11 feature that helps with multithreaded rendering and is NVIDIA’s secret sauce for Civilization V – AMD has still not implemented support for it as of this driver.

From MS website: - Command lists can be created on multiple threads — A command list is a recorded sequence of graphics commands. With Direct3D 11, you can create command lists on multiple CPU threads, which enables parallel traversal of the scene database or physics processing on multiple threads. This frees the main rendering thread to dispatch command buffers to the hardware.

Command lists are purely a CPU-side thing; the difference is between letting the DX11 runtime cache the commands or letting the driver cache and optimise them. It is the latter that the AMD Radeon driver fails to support.

The significance of this omission is that, with driver support, the Radeon graphics driver would be better able to optimise this process for PC Radeon GPUs.

Windows 10's D3D11On12 layer forces command list MT support for AMD PC GPUs.
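The multithreaded command-list pattern described above can be sketched as a toy model: several worker threads record command lists in parallel, and a single "immediate context" submits them in order. Plain Python threads stand in for D3D11 deferred contexts here; none of these names are real Direct3D 11 bindings.

```python
# Toy model of D3D11-style multithreaded command-list recording.
import threading

def record_commands(objects, out, slot):
    # A "deferred context": records draw commands without executing them.
    out[slot] = [f"draw({obj})" for obj in objects]

scene = [f"mesh{i}" for i in range(8)]
chunks = [scene[i::4] for i in range(4)]   # split the scene across 4 threads
recorded = [None] * 4

threads = [threading.Thread(target=record_commands, args=(chunk, recorded, i))
           for i, chunk in enumerate(chunks)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The "immediate context": the only place commands are actually submitted,
# so the hardware still sees one serial stream.
executed = [cmd for command_list in recorded for cmd in command_list]
print(len(executed))  # all 8 draws recorded in parallel, submitted serially
```

The point of the pattern is exactly what the quoted MS text says: recording is parallelised across CPU cores, while submission to the GPU stays on one thread.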

The Xbox One doesn't have an AMD Radeon R9 290X-level GPU nor an Intel Haswell-class CPU, hence your entire post is flawed.

You are comparing oranges to apples.

Your "sub-HD" claim for the XBO is bullshit:

1280x720p = entry level HD

1920x1080p = full HD

Against your points.

1. Does your claim prove DX12 doesn't exist for XBO?

2. Does your claim prove XBO doesn't have any improvements from DX12?

3. What is Wardell's game type?

4. Your equality assignment is flawed.

because you claimed the 7770 didn't represent the xbox one because of bandwidth and wanted to use the firepro 5000, which doesn't represent the xbox one either as it has 32 ROPs

Hence why I posted ROP workaround via TMU usage from GDC 2014.


#110 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

Let's not get things twisted though folks. The Xbox One is NOT getting stronger. Microsoft is just taking some of the juice from Kinect functionality and giving it back to devs for games. DX12 has little to nothing to do with this.


#112  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@GoldenElementXL said:

Let's not get things twisted though folks. The Xbox One is NOT getting stronger. Microsoft is just taking some of the juice from Kinect functionality and giving it back to devs for games. DX12 has little to nothing to do with this.

It's a bit more than that. DX12 is fixing DX11.X's flaws that were carried over from DX11, and is making the console more efficient with the resources it has, which means those savings can be put back into the fold, increasing performance over the current API. Between more refined tools for ESRAM, CPU-side optimizations from DX12, and it opening up GPU areas previously denied to devs.

This new API will mean game devs are not forced to use old and outdated API methods, making games better, while also allowing the PS4 not to be held back by the X1's CPU-side shortcomings under the current API's limits. Everyone will finally be on the same page.


#113  Edited By ronvalencia
Member since 2008 • 29612 Posts

@GoldenElementXL said:

Let's not get things twisted though folks. The Xbox One is NOT getting stronger. Microsoft is just taking some of the juice from Kinect functionality and giving it back to devs for games. DX12 has little to nothing to do with this.

Based on the TC's post, a jump from 720p to 1080p via SDK improvements indicates ESRAM handling improvements.

GDC 2015 showed a 1920x1080p frame buffer split-render workaround, i.e. combining ESRAM and DDR3 as a common frame buffer target.

There's a PC version of the above demo program sample without XBO's split-render workaround.

Removing Kinect's resources is minor, since it's only 10 percent extra resources.

The ESRAM tiling workaround requires substantial changes to the program's structure.
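To see why the split target matters, here is a rough sketch of the arithmetic. The 32-bit-per-pixel format is an assumption for illustration; real render target layouts vary.

```python
# Back-of-the-envelope ESRAM budget, assuming 32 bits (4 bytes) per pixel.
# Shows why a full 1080p multi-target frame buffer can't sit entirely in
# 32 MiB of ESRAM, which is what motivates an ESRAM+DDR3 split target.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4
ESRAM_BYTES = 32 * 1024 * 1024

bytes_per_target = WIDTH * HEIGHT * BYTES_PER_PIXEL
targets_that_fit = ESRAM_BYTES // bytes_per_target

print(bytes_per_target / 2**20)  # ~7.91 MiB per 1080p 32bpp render target
print(targets_that_fit)          # only 4 whole targets fit in 32 MiB
```

With a deferred renderer using several G-buffer targets plus a depth buffer, the budget overflows, so some targets have to spill to DDR3 or be tiled through ESRAM in passes.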


#114 NyaDC
Member since 2014 • 8006 Posts

@GoldenElementXL said:

Let's not get things twisted though folks. The Xbox One is NOT getting stronger. Microsoft is just taking some of the juice from Kinect functionality and giving it back to devs for games. DX12 has little to nothing to do with this.

Technically it is. There are base hardware specifications and capabilities, and then there is the software attached to them. No, the Xbox One GPU will never be as powerful as the GPU in the PlayStation 4. HOWEVER, and this is a very big however: your hardware can only be utilized to the degree that your software allows. Microsoft can add better tools and refine their SDKs and software package over time to make it dramatically more efficient than the PlayStation 4's, thus utilizing the hardware in the Xbox One to a further degree.

In essence they may never hit true parity, but the gap can become dangerously close via software, and Microsoft is the leading software company on Earth, so they are most certainly capable. That is where Sony falls behind: they are hardware wizards, but their software has generally been quite sloppy and rudimentary in comparison.


#115  Edited By tormentos
Member since 2003 • 33784 Posts

@GoldenElementXL said:

@tormentos: Why do you keep claiming that Project Cars is an A.I. heavy game when trying to prove that A.I. and CPU power don't correlate? The races are capped to 20 cars on the consoles. That removes the bottleneck for the most part. 19 opponents is hardly an A.I. feat these days. Why do you think the number of NPC's in the PC versions of GTA V and AC Unity are so much greater? Because those things require CPU power.

Capping both consoles at 20 cars minimizes any CPU differences and allows the PS4's GPU to shine. It's as simple as that. But saying "Look the PS4 runs Project Cars better and has A.I. so that proves it's not CPU heavy" is silly.

The races are what?

As established in our earlier analysis, PS4 does also have an advantage in performance. To re-cap quickly, Project Cars' read-out is typically 60fps in its career mode, with tearing and drops below 50fps once rain kicks in. Stress-tests also show a PS4 advantage once 30+ cars are engaged, while Microsoft's platform takes a bigger hit on hectic races with heavy alpha effects. Tearing is constant in these 30-40fps stress-test scenarios, but in the interest of keeping render times as close as possible to the 16.67ms target, dropping v-sync helps to keep the visual update as rapid as possible - if at a cost to image quality.

Bumping the car count up to 45 is an excellent stress test for both PS4 and Xbox One. Using the solo mode to push for maximum cars on the track, Sony's machine is just ahead on the 24 Hours of Le Mans circuit to start - but it's clear both struggle when rain effects are also introduced.

The cap of 20 is only early in the game; you can run with more than 20 cars, it is not capped at 20. You are misinformed, buddy.


This is running on Xbox One and PS4 with 45 cars; as you can see, the hit to performance is huge, and it is not only on consoles; in fact, consoles somehow take less of a hit than PC.

As you can see here, same number of cars, same settings, but the i3 is producing 16FPS less on that 750 Ti; this game is CPU-bound on PC as well, and I just proved that. In fact, 16FPS is the hit on Alien Isolation going from an i7-4770K to a G1820 Celeron; if you go from an i7-4770K to an i3, the difference is 2 frames.

So yeah, this game is CPU intensive.

Looking at the Nvidia side, the full-fat Project Cars experience is very much designed with top-end machines in mind, and even our standard test setup struggles to go all-out. A top-end card like the GTX 780 Ti or GTX 980 (backed by a Core i7-3770K PC and 16GB of RAM) shows a 60fps return at high settings during clear weather - though just like the console editions it drops as soon as rain effects come into play. The frame-rate is cleaved in half to 30fps on a soaked Azure Circuit here, especially once all 44 rival cars are in view. Dropping all settings to high and reining the car count in to a more palatable 20 vehicles brings us back into locked 60fps territory. It's a compromise that sits well with the career mode, especially as it encourages races of this size by default.

This game is CPU-bound, and it shows from miles away; denying it only makes those who pretend to know about hardware look stupid. No other racer uses this many cars; the impact is huge with 44 cars, even on PC.

However, there's scalability here too if you need it - using approximate console quality settings, a GTX 750 Ti can produce very similar results to the PS4 game. However, a good amount of CPU power is required to run the game - a Core i5 is needed on vehicle-heavy races to hit 60fps [UPDATE 5:58pm: We've updated this article with revised CPU testing: our test rigs were altered after yesterday's DX12 benching - the reality is that a Core i3 processor clearly struggles on packed races in Project Cars - the game is CPU-bound here, even with a GTX 750 Ti].

http://www.eurogamer.net/articles/digitalfoundry-2015-project-cars-face-off

Don't take my word for it: the game is CPU-bound, period. That screen I showed you of the i5 vs the i3 shows it: 16 more frames just by having an i5. Funnily enough, the 750 Ti with an i3 fails to beat the PS4, which is odd, as we know the i3 is more capable than the Jaguar inside the PS4; maybe this game benefits more from having more cores than from the higher speed and efficiency of the i3.

Alien Isolation

i3 + R9 290X = 104 FPS

i7 4770k + R9 290X = 106 FPS

Project Cars

i3+ 750Ti = 34FPS

i5+ 750Ti = 50FPS

The end result is simple: an i3 is enough to feed an R9 290X, a GPU which is miles ahead of the 750 Ti, on Alien Isolation.

But that same i3 isn't enough to feed a 750 Ti on Project Cars, and it suffers frame-wise for it.
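Using the numbers listed above (the post's own quoted figures, not fresh benchmarks), the contrast can be computed directly:

```python
# FPS figures as quoted in the post above, used to compare CPU sensitivity.
alien = {"i3": 104, "i7-4770K": 106}  # Alien: Isolation on an R9 290X
pcars = {"i3": 34, "i5": 50}          # Project Cars on a GTX 750 Ti

alien_gain = (alien["i7-4770K"] - alien["i3"]) / alien["i3"] * 100
pcars_gain = (pcars["i5"] - pcars["i3"]) / pcars["i3"] * 100

print(round(alien_gain))  # ~2%: a faster CPU barely helps -> GPU-bound
print(round(pcars_gain))  # ~47%: a faster CPU lifts frames -> CPU-bound
```

The ~47% figure is where the "simple change of CPU to an i5 boosted frames 47%" claim later in the post comes from.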

You and the other fool, carraher, can deny this, but you will look mighty stupid denying it. I just destroyed not only your post but his too. When you prove to me that there is no drop using an i3 on Project Cars, you will have a point; otherwise, I just ended this argument with my proof.

@04dcarraher said:

@GoldenElementXL:

I mean, look at the CPU usage between what he thinks is more taxing vs another game, totally ignoring the reason why one game doesn't feed the GPU vs the other game.

We can see with the FX6300 that both games do not tax the CPU to the brink. On the hexacore we can see 45% average usage across all cores, while Project Cars uses 57%: a whopping 12-point difference in CPU usage. Even with the quad core and the 8 core, it is only around a 15-point difference in usage between the games. Now let's look at The Witcher 3 as an example of high CPU usage: the quad core has 92% usage vs Project Cars' 80%, which is near full tilt. Then on the 6 core it's 86% with The Witcher vs Cars' 57%, nearly a 30-point difference, and with the eight core, 69% usage vs 51% with Cars.

What do we have here?

An i3 bottlenecking a 750 Ti in Project Cars. Now say this isn't true, you blind, biased MS suck-up. Come on, say it is not true; prove that the i3 is not bottlenecking this ^^ GPU, ass. Come on?

I want to see how far you will go to try to beat me, fool. Fact is, a simple change of CPU to an i5 boosted frames 47% on the same damn 750 Ti, and this is a low-end GPU by today's standards.

So an i3 is enough to feed an R9 290X and blast 104FPS on Alien Isolation, just 2 frames shy of the i7-4770K, which does 106FPS on the same game; yet that same i3 can't get a 750 Ti to 50FPS, it hovers in the low 30's, when an i5 boosts the same GPU 16FPS more.

Now you even bring up The Witcher and drop AI...lol I guess this is the part where you lose and don't have the guts to admit it.

AI is not a CPU-bound game, and I quoted a tech site on it; I even showed how an i3 can be 2 frames apart from a much better and more powerful CPU from the same brand. But I am sure you will again try to spin my point and deny what is undeniable: that Alien Isolation's real problem was being a cash grab on 5 platforms, and not the CPU at all.

@ronvalencia said:

@tormentos:

SDK-based improvements don't equal "plug in a higher grade GPU" type improvements.

When it comes to extracting performance from hardware, nothing beats a higher grade GPU with fast, large GDDR5 memory, i.e. a straightforward programming model.

BradW has stated the complexity of tiling and iterative access with 32MB ESRAM vs GDC 2015's split rendering target, which combines ESRAM and DDR3 for the frame buffer workload. Xbox 360 is unable to handle this XBO workaround since its ROPs are located in the EDRAM chip.

Come on, be a man and admit that Brad Wardell sold a lie, that he was doing PR for MS and DX12. You can't erase what he claimed.

If you can't grasp what PR is and how it is done, you are beyond help, no matter how many years you claim to have studied; you are just being a fool for blindly believing what this dude once claimed, and most of your arguments are based on Brad Wardell's comments about DX12.

"XBox One is the biggest beneficiary; it effectively gives every Xbox One owner a new GPU that is twice as fast as the old one."

This is what he claimed, dude; you can't erase that. That was at the DX12 unveiling. He clearly stated that the Xbox One was the biggest beneficiary and that Xbox One owners would get a GPU twice as fast. I don't know about you, but for me, twice as fast means double, speed-wise.

But that was on April 2014.

“So it’s pretty easy for me to say yes you’ll get a huge impact on PC, but on the console it’s all a theory. They have nothing, they don’t even know. I mean I’ve talked to the development team there on this subject for a while and it basically boils down to, we don’t know how much of an effect it will have because so much of it is in the hands of the developer.”

http://gamingbolt.com/wardell-explains-microsofts-silence-regarding-xbox-one-dx12-benchmarks#jLLL49DKeMc3BQht.99

But in an article 1 year later, where sites like GamingBolt were asking the same question I have been asking, about why there isn't a demo of DX12 on Xbox One, he states that on the console DX12's gains are all theory, and that not even MS knows what the improvement will be. He says they have nothing, which I am sure is a total lie as well; they know, and the reason no benchmark exists on XBO is that most of DX12's features are already on Xbox One, because it is a freaking console, and what DX12 brings to PC is console-like optimization. So DX12 on Xbox One is basically redundant; it is almost all there, with a few things missing that will change nothing.

That is a total bullshit response: "We don't know how much of an effect it will have because so much of it is in the hands of the developer." Well, shoot me, that is the case with all fu**ing games on PC or consoles. Saying DX12 will not shine as well on XBO as on PC because it is up to the developers is a joke; the same is true for PC, where developers have always trailed consoles in the use of efficient APIs and multicore coding. Taking advantage of DX12 on PC or console is up to the developers, and it is a sad excuse used on Xbox One to justify the lack of any benchmark or real number gain. Do you know how stupid you sound every time you say it is a case-by-case thing? That is what happens with every game on every platform; if not, all games would be 4K at 240FPS or better. Not all games perform the same, because it is case by case in all games, and this applies to PC too.

Brad Wardell was doing PR for MS, and he backtracked from his double-performance crap to "no one knows, it is a theory"..lol

He sure looked pretty secure when he claimed the Xbox One would be the biggest beneficiary and that Xbox fans would get a GPU twice as fast....lol


#117 StormyJoe
Member since 2011 • 7806 Posts

@tormentos: Do you think tuning a game to be 1080p is a flick of the switch thing? There's some magic class out there with a method like "MyRender.Render1080p"?

The last SDK came out a few months ago. Dx 12 isn't even on XB1 yet.


#118  Edited By Zero_epyon
Member since 2004 • 20103 Posts

@StormyJoe said:

@tormentos: Do you think tuning a game to be 1080p is a flick of the switch thing? There's some magic class out there with a method like "MyRender.Render1080p"?

The last SDK came out a few months ago. Dx 12 isn't even on XB1 yet.

I'm glad to read this! People think that DX12 was in the last SDK and that it's solely responsible for making these engines run well, when the developers didn't even say this. The engine has to be optimized as well, and that's what they've been doing. The SDK updates just made it easier for them to reach their target. Says a lot about how bad their engine was, as opposed to how good the Xbox One is.


#119 StormyJoe
Member since 2011 • 7806 Posts

@tormentos said:
@nyadc said:

Hey check this out, nobody cares, you got served countless times in this thread, give up, you've lost.

The fact that you try to claim that I was served makes this even better, B4X....

It just goes to show how ignorant and blind some of you are, but what can we expect from a hypocrite who downplays SF5 because it is on PC yet doesn't downplay the Gears collection, which is on PC, 360 and Xbox One...lol

Proving once again that you cannot just admit when you are incorrect.

#120  Edited By Zero_epyon
Member since 2004 • 20103 Posts

@nyadc said:
@GoldenElementXL said:

Let's not get things twisted though folks. The Xbox One is NOT getting stronger. Microsoft is just taking some of the juice from Kinect functionality and giving it back to devs for games. DX12 has little to nothing to do with this.

Technically it is, there are base hardware specifications and capabilities and then there is the software attached to it. No the Xbox One GPU will never be as powerful as the GPU in the PlayStation 4, HOWEVER, and this is a very big however. Your hardware can only be utilized to the degree that your software allows, Microsoft can add better tools, and refine their SDK's and software package over time to make it dramatically more efficient than the PlayStation 4 thus utilizing the hardware in the Xbox One to a further degree.

In essence they may never hit true parity however the gap can become dangerously close via software, and Microsoft is the leading software company on Earth so they are most certainly capable. That is where Sony falls behind, they are hardware wizards however their software has generally been quite sloppy and rudimentary in comparison.

Sony isn't sitting around doing nothing with their SDK. On top of that, Sony's SDK has been praised by developers as very efficient, the closest-to-the-metal SDK there is for consoles. They are certainly not behind in this regard.

Here's an example:

http://wccftech.com/ps4-api-graphics-programmers-love-specific-gpu-optimizations-improve-performance/

Of course SDK's are going to improve, but it's not exclusive to one company.

#121  Edited By tormentos
Member since 2003 • 33784 Posts

@nyadc said:

Technically it is, there are base hardware specifications and capabilities and then there is the software attached to it. No the Xbox One GPU will never be as powerful as the GPU in the PlayStation 4, HOWEVER, and this is a very big however. Your hardware can only be utilized to the degree that your software allows, Microsoft can add better tools, and refine their SDK's and software package over time to make it dramatically more efficient than the PlayStation 4 thus utilizing the hardware in the Xbox One to a further degree.

In essence they may never hit true parity however the gap can become dangerously close via software, and Microsoft is the leading software company on Earth so they are most certainly capable. That is where Sony falls behind, they are hardware wizards however their software has generally been quite sloppy and rudimentary in comparison.

Technically you are a fool B4X.

Sony is giving MS a run for its money API-wise, as it did with the PS3 too. Sony's tools are not tied to DX overhead and legacy crap, which is why the PS4 was doing async compute from day 1 while the Xbox One wasn't. Software will not win this battle for MS, since Sony is even more efficient than MS at getting every ounce of power out of the unit. Do you think the PS4 will remain static and not progress? Just because you don't see Sony making a parade about PS tools doesn't mean their SDK doesn't get updated.

There is no way you can use the Xbox One hardware to a further degree than the PS4, because at the degree you can push the PS4, the Xbox One's limits already end, and you can't pass those, period. The Xbox One has a theoretical peak it can't cross, and the PS4's is higher from the start, so it just can't catch up.

Oh please, spare me the whole "MS is the leading software company on Earth." Mantle has been here for close to 2 years now and DX12 is not; Sony has been using async shaders and MS hasn't even started. But, but, but, the leading software company in the world... no, they are the biggest at making OSes, which is different.

The funny thing is that this gen Sony has given MS a run for its money software-wise, to the point where MS has released something like 7 SDKs and still trails Sony badly...lol

@StormyJoe said:

@tormentos: Do you think tuning a game to be 1080p is a flick of the switch thing? There's some magic class out there with a method like "MyRender.Render1080p"?

The last SDK came out a few months ago. Dx 12 isn't even on XB1 yet.

“We haven’t had the time to get deep into it yet, but there are some upcoming events we’re going to participate that will show some of the potential of it,” game’s director Andrea Basilio said to GamingBolt. “It should be noted that the custom DX11 API of Xbox One partly incorporate some future DX12 features already, and they’re mostly performance related.”

“So what we expect is a slight benefit for Xbox One, and some major performance boosts on PC hardware, making the overall development pipeline more similar and tighter.”

Read more at http://gamingbolt.com/ride-dev-dx12-will-bring-in-slight-benefit-for-xbox-one-current-api-already-has-some-dx12-features#iEkqAJM4LdmRoIAG.99

For god's sake, stop being a fool. The DX12 features related to performance are already inside the Xbox One; things like bundles were even used in Forza 5, which was a launch game.

If those performance-related features are already in, then bringing them over in the full release will do squat, because they are already there.

I proved my point: this is not the first time a developer has claimed 1080p was achieved because of an SDK, only for us to keep seeing 900p games and even lower. It is time for you to grasp that these 2 GPUs on PC have a 20+ frame-per-second difference, and that at optimal performance, both at their very best, there will still be a 20+ FPS gap that can only be closed by giving up detail or resolution, period. There is no other way, and no SDK will close the resolution gap.

Now, that said, all XBO games can match the PS4 resolution-wise, but the question is: will developers be willing to sacrifice frames the way Tomb Raider, Sniper Elite 3 and several others have done?

Because sometimes not even a resolution drop is enough; BF4, Project Cars and several others have proved that. The only way to parity is paid parity $$$$$$$$$$$$$$$$, which is what brought parity to AC Unity.
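The "give up details or resolution" tradeoff can be put in rough numbers. This is only a sketch: it assumes frame time scales linearly with pixel count, which is a simplification, and the 30 fps figure is an illustrative assumption, not a benchmark.

```python
# Back-of-the-envelope for the resolution-vs-frames tradeoff: assume frame
# time scales linearly with pixel count (a simplification; real scaling
# depends on how pixel-bound the renderer is).

def pixels(width, height):
    return width * height

def fps_after_res_change(base_fps, base_res, target_res):
    """Estimated fps when only the render resolution changes."""
    return base_fps * pixels(*base_res) / pixels(*target_res)

# 1080p has 44% more pixels than 900p, so a game at 30 fps in 1080p
# would, under this model, reach roughly 43 fps at 900p:
print(round(fps_after_res_change(30, (1920, 1080), (1600, 900)), 1))
```

Which is why a 900p-vs-1080p gap and a 20+ FPS gap are two faces of the same budget: dropping resolution buys frames, and holding resolution spends them.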

#122 tormentos
Member since 2003 • 33784 Posts

@StormyJoe said:

Proving once again that you cannot just admit when you are incorrect.

So can you point me to where I was incorrect?

@Zero_epyon said:

Sony isn't sitting around doing nothing with their SDK. On top of that, Sony's SDK has been praised by developers as a very efficient, most close to metal SDK there is for consoles. They are certainly not behind in this regard.

Here's an example:

http://wccftech.com/ps4-api-graphics-programmers-love-specific-gpu-optimizations-improve-performance/

Of course SDK's are going to improve, but it's not exclusive to one company.

Oh, they lack the intelligence to see that; they believe only MS can improve its SDK, as if last gen never even existed, when Sony lowered its OS memory footprint from 125MB at launch to about 50MB, giving more memory to games, and improved performance with SPURS, which watched how loaded the SPEs were and tracked performance constantly. Sony has the ICE team working on the SDK, but it is like they don't know it or ignore it on purpose.

If there were one company of the 2 I would pick to better exploit the same hardware under the same specs, it would be Sony. The PS3 officially had the weaker GPU, and they managed to outshine the Xbox 360 graphics-wise because its CPU was very capable, and that was on the PS3, by far the worst console Sony ever had for coding.

The PS4 should give even bigger results.

#123  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos said:

What do we have here.?

And i3 bottlenecking a 750Ti in project cars now say this isn't true you blind biased MS suck up.? Come on say it is not true prove that the i3 is not bottle necking this ^^ GPU ass come one.?

I want to see how far you will go to try to beat me fool,fact is a simple change of CPU to an i5 boosted frames 47% on the same damn 750ti now this is a low end GPU by today's standards.

So an i3 is enough to feed a R290X and blast 104FPS on Alien Isolation just 2 frames shy of the i7 4770k which does 106 FPS on the same game,yet that same i3 can't get a 750Ti to 50FPS it hovers over the low 30's when a i5 boost the same GPU 16 FPS more.

Now you even bring the Witcher and dropped AI...lol I guess this is the part where you loss and don't have the guts to admit it.

AI is not a CPU bound game and i quote a tech site on it,i even showed how an i3 can be 2 frames apart from a much better and powerful CPU from the same brand,but i am sure you will again try to spin my point and deny what is undeniable that Alien Isolation real problem was been a cash grab on 5 platforms and not the CPU at all.

Where do I begin.......

1 - If you are arguing that Project Cars is a CPU intensive game, then why does your first line say "i3 bottlenecking a 750Ti?" That would make Project Cars a GPU intensive game with a CPU that can't get out of its way. You are confusing yourself.

2 - The i3 4130 is a low end, budget priced dual core processor. Project Cars requires a quad core CPU. Could this be the reason????????????????????????????

3 - You are completely failing to prove that Project Cars is a CPU heavy game. What you are showing with all of these screenshots and benchmarks is that the i3 4130 can bottleneck any GPU, not just high end ones. This isn't the point you are trying to make and yet you are acting like you're right.

#124 StormyJoe
Member since 2011 • 7806 Posts

@tormentos: Some features were, but certainly not all. If it were all, then it'd be DX 12.

#125 04dcarraher
Member since 2004 • 23829 Posts

@GoldenElementXL said:
@tormentos said:

What do we have here.?

And i3 bottlenecking a 750Ti in project cars now say this isn't true you blind biased MS suck up.? Come on say it is not true prove that the i3 is not bottle necking this ^^ GPU ass come one.?

I want to see how far you will go to try to beat me fool,fact is a simple change of CPU to an i5 boosted frames 47% on the same damn 750ti now this is a low end GPU by today's standards.

So an i3 is enough to feed a R290X and blast 104FPS on Alien Isolation just 2 frames shy of the i7 4770k which does 106 FPS on the same game,yet that same i3 can't get a 750Ti to 50FPS it hovers over the low 30's when a i5 boost the same GPU 16 FPS more.

Now you even bring the Witcher and dropped AI...lol I guess this is the part where you loss and don't have the guts to admit it.

AI is not a CPU bound game and i quote a tech site on it,i even showed how an i3 can be 2 frames apart from a much better and powerful CPU from the same brand,but i am sure you will again try to spin my point and deny what is undeniable that Alien Isolation real problem was been a cash grab on 5 platforms and not the CPU at all.

Where do I begin.......

1 - If you are arguing that Project Cars is a CPU intensive game, then why does your first line say "i3 bottlenecking a 750Ti?" That would make Project Cars a GPU intensive game but the CPU can't get out of it's way. You are confusing yourself

2 - The i3 4130 is a low end, budget priced dual core processor. Project Cars requires a quad core CPU. Could this be the reason????????????????????????????

3 - You are completely failing to prove that Project Cars is a CPU heavy game. What you are showing with all of these screenshots and benchmarks is that the i3 4130 can bottleneck any GPU, not just high end ones. This isn't the point you are trying to make and yet you are acting like you're right.

lol he's an idiot..... the i3 is a freaking dual core, and Project Cars uses deferred rendering, i.e. more than a single CPU core handling GPU tasks. And of course he totally ignored the CPU usage charts I posted, how nice....... since they show the game using 75-80% of a quad-core CPU. So what happens when you throw a dual core at it... hmm, magic?

#126 tormentos
Member since 2003 • 33784 Posts

@GoldenElementXL said:

Where do I begin.......

1 - If you are arguing that Project Cars is a CPU intensive game, then why does your first line say "i3 bottlenecking a 750Ti?" That would make Project Cars a GPU intensive game but the CPU can't get out of it's way. You are confusing yourself

2 - The i3 4130 is a low end, budget priced dual core processor. Project Cars requires a quad core CPU. Could this be the reason????????????????????????????

3 - You are completely failing to prove that Project Cars is a CPU heavy game. What you are showing with all of these screenshots and benchmarks is that the i3 4130 can bottleneck any GPU, not just high end ones. This isn't the point you are trying to make and yet you are acting like you're right.

1 - WTF,,,hahahaaaaaaaaaaaaaaaaaaaaaaaaa

First, the 750Ti being bottlenecked by an i3 doesn't mean the game is GPU intensive; it means the CPU isn't letting the GPU reach its peak, and that is exactly what a CPU bottleneck is. As you can see in the screen you totally ignored, the same GPU running with an i5 does 50FPS. What does that tell you? That no matter what GPU you have, the game will struggle without a certain CPU. That = CPU bottleneck; the large number of AI cars is what chokes the GPU.

2 - Well, that is the point: it proves it is a CPU intensive game. You are going in circles here, since the game requires a stronger CPU to run well, which isn't the case with Alien Isolation, which even a Celeron will blast at 90FPS, while that same i3 blasts it at 104FPS, just 2 frames from an i7 4770k. AI is so easy to run that a low-end i3 stays within 2 frames of a high-end CPU.

3 - I just proved it, and you are an idiot. Stay in denial, fool; the fact is that to run Project Cars well you need at least an i5, and that is on a 750ti. I don't want to imagine feeding an R290X or a 780GTX.

Apparently you don't know how to read, or you are simply so invested in proving me wrong that you will ignore the evidence presented to you; apparently you don't know what a CPU bottleneck is.

AI runs at 106FPS on an R290X with an i7 4770k; on an i3 with the same R290X it runs at 104FPS. That means the i3 is not a bottleneck in AI even for a GPU as strong as the R290X, yet in Project Cars that same i3 can't even keep up with a 750ti. That is called a CPU bottleneck.

@StormyJoe said:

@tormentos: Some features were, but certainly not all. If it were all, then it'd be DX 12.

That is the problem: the ones already in are mostly performance related, which means that when full DX12 arrives, those will have no effect on the Xbox One because they are already there....

@04dcarraher said:

lol he's an idiot..... i3 is a freaking dual core and project cars uses deferred rendering aka using more than sole cpu core to handle gpu tasks. And of course he totally ignore the cpu usage charts I posted how nice....... since it shows quad cores using 75-80% of a quad core cpu. So what happens when your throwing a dual core into it... hmm magic?

That is a dual-core CPU with hyper-threading; it freaking beats 4-core AMD CPUs and even beats my FX6350, which has 6 cores, so spare me the ridiculous excuse, you butthurt fanboy. AI is not CPU intensive; Project Cars is. That is the point, and you yourself just said it.

The fact that an i5 runs 16 FPS faster says it all; I quoted Techspot on it: AI is not CPU intensive, and you're grasping. You are so emotionally invested in beating me that you will ignore whatever is presented to you. The fact is even a Celeron is enough to power AI; yeah, try that in Project Cars with a Celeron..lol

That same dual core runs AI at 104 FPS, just 2 frames from an i7 4770k....lol ahahahaahahaaaaaaaaaaaa
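The bottleneck logic being argued back and forth here can be sketched in a few lines: a frame can't finish faster than its slowest stage. The millisecond figures are back-derived from the thread's own 750 Ti numbers (34 fps on an i3, 50 fps on an i5) and are assumptions, not measurements.

```python
# Minimal sketch of a CPU bottleneck: effective fps is capped by
# whichever stage (CPU or GPU) takes longer per frame.

def frame_rate(cpu_ms, gpu_ms):
    """Effective fps when the CPU and GPU stages serialize on the slower one."""
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 20.0      # assume the 750 Ti alone needs ~20 ms/frame (a 50 fps ceiling)
CPU_MS_I3 = 29.4   # assume the i3 needs ~29.4 ms of CPU work per frame
CPU_MS_I5 = 18.0   # assume the i5 needs ~18 ms, under the GPU's 20 ms

print(round(frame_rate(CPU_MS_I3, GPU_MS)))  # 34 -> CPU-bound, GPU starved
print(round(frame_rate(CPU_MS_I5, GPU_MS)))  # 50 -> GPU-bound, its real ceiling
```

Under this model, swapping the CPU only helps while `cpu_ms > gpu_ms`; once the GPU is the slower stage, a faster CPU changes nothing, which is why the same i3 can be harmless in one game and crippling in another.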

#127  Edited By 04dcarraher
Member since 2004 • 23829 Posts

lol, you are comparing an i5 vs an i3 and stating that Cars is demanding when it is only moderately taxing on Intel quad cores. Project Cars leaves Intel CPUs with 4+ threads under-utilized... Now you are saying the i3 performs better than an FX6350; how nice. How about some genuinely demanding multithreaded games?

I wonder why an i5 gets 15 fps more ......

4 real cores vs two cores split into 4 threads ..... that is not proof that Project Cars' AI is demanding..... It's a freaking racing game with AI-driven cars, which every single racing game has had since forever.....

As you can see, it's not that taxing on Intel CPUs, and the game shows no major usage beyond 4 threads on Intel CPUs.

Even an i7 4770k run from 2.5 GHz all the way to 4.5 GHz sees only minor differences in performance, showing the game isn't all that taxing on Intel CPUs because of AI, lol...... Average usage on that i7 2600k is a whopping 44%, while The Witcher 3 with the same 2600k averaged 65%. Even on the i3, Project Cars averaged 72% where The Witcher averaged 92%, while BF4 averaged 60% on the i7 2600k and 94% on the i3.....

#128 tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

lol you are comparing an i5 vs an i3 and are stating that Cars is demanding when its only moderately taxing on intel quad cores. Project cars with intel cpu's that have 4+ threads get under utilized... You now saying that i3 is performing better than FX6350 how nice how about some real more demanding cpu demanding multithreaded games

I wonder why an i5 gets 15 fps more ......

4 real cores vs two cores split into 4 threads ..... that is not proof that Project car's AI is demanding..... Its a freaking racing game that has AI driven cars on a track which has been used since forever.

As you can see its not that taxing on intel cpus and you can see that the game with intel cpu's beyond 4 threads sees no major usage after 4 threads.

Now even with i7 4770k at 2.5 ghz all the way to 4.5ghz sees minor differences in performance showing that the game isnt all that taxing on intel cpus because of AI lol

AI does 2 frames less on an i3 than on an i7; PC does 16 frames less on an i3 than on an i5.

That should end the argument, period. You are parading around like a huge moron claiming AI is CPU intensive and that the PS4 is 30FPS because of that, when in reality Project Cars demands more CPU and still runs faster on PS4 than AI does on PS4.

That is because an i7 is more than enough for PC but an i3 is not, idiot, unlike Alien Isolation, where a budget i3 basically matches frame for frame the performance of a much more expensive i7.

It's a freaking racing game that renders 45 cars; how many games do that other than Project Cars? You are rendering double or more the number of cars other racers have.

Just admit you were wrong about AI and move on; have some dignity.

#129 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

Yeah @tormentos, it's pretty much over. If the game were some CPU brute, we would see that evidence here. An 8-frame difference across a 2 GHz clock difference........... You can go ahead and keep comparing your i3 (which is below min required specs) and an i5 all you want. This is apples-to-apples evidence. And Batman AK is far more CPU dependent.

#130 clone01
Member since 2003 • 29824 Posts

okay.

#131  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:

AI does 2 frames less on a i3 than on a i7,PC does 16 frames lass on a i3 than on a i5.

That should end the argument there period you are parading like a huge moron that AI is CPU intensive and that the PS4 is 30FPS because of that when in reality PC demands more CPU and runs faster than AI does on PS4.

That is because an i7 is more than enough for PC but an i3 is not idiot,unlike Alien Isolation were a budget i3 basically match frame per frame the performance of a much more expensive i7.

Is a freaking racing game that render 45 cars,how many games do that other than Project Cars you are rendering double or more the amount of cars other racers have.

Just admit you were wrong about AI and move on have some dignity.

Quit being an idiot; suggesting that Cars is more demanding because an i5 performs better than an i3 is just plain stupid. In a game that uses 4 or more threads, it is the same principle as taking an i5 of the same architecture and downclocking it to half its clock rate.

The only reason AI might be a bit more intensive on the consoles is that they are using crappy CPUs clocked around 1.6ghz. Claiming that Cars is CPU intensive on PC from an i3-vs-i5 comparison is just plain dumb, since you are comparing a CPU with 4 real cores against a CPU with only two cores, each split into two virtual cores sharing the workload. That means the i3 has 25% less processing power per thread than an i5.

What is funny is that you are overlooking and ignoring the facts at hand. Project Cars only uses 4 threads on Intel CPUs to any moderate degree, which means an i3 will perform within 25% of an i5 or i7. For example, an i5 4670k gets 60 fps while an i3 4330 gets 48 fps in Cars with a Titan X.

Alien Isolation is not as CPU intensive, but the engine only used one core to handle all GPU work, which is why we see the i3 perform nearly as well as an i7: real core vs real core, they have similar processing power.

Again, the i3 is able to get 134 fps vs the i5's 166, still within that 25% difference. With The Witcher 3 we see nearly a 40% difference between i3 and i5, while an FX 4300 performs virtually the same as an i3 4330, because that game is much more optimized and demanding on the CPU. The i3 4330's 48% clock-per-clock advantage with only two cores versus the FX 4300's four cores evens out once virtually all CPU resources are in use.

The fact that in Project Cars the FX 4300 is 45% slower than an i3 4330 with a Titan X shows us that this game's CPU-to-GPU communication is unoptimized on PC and should not be used as proof of AI being demanding.....
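The "25% less processing power per thread" claim above can be expressed as a toy model of 2-core/4-thread (i3-like) vs 4-core/4-thread (i5-like) scaling. The 0.75 per-thread yield is the post's own figure, taken here as an assumption, not a measured fact.

```python
# Toy SMT scaling model: up to the physical core count, each thread gets a
# full core; beyond that, hyper-threaded threads each deliver only a
# fraction of a core (0.75 per the post's claim).

def effective_cores(real_cores, threads, per_thread_yield=0.75):
    """Effective 'core units' delivered to a workload of `threads` threads."""
    if threads <= real_cores:
        return float(threads)          # one real core per thread
    return threads * per_thread_yield  # SMT threads share cores

i3 = effective_cores(2, 4)  # 2C/4T
i5 = effective_cores(4, 4)  # 4C/4T
print(i3 / i5)              # 0.75 -> i3 predicted within ~25% of the i5
```

On the post's own Titan X numbers the model is in the right neighborhood: 60 fps × 0.75 predicts 45 fps for the i3 against the 48 fps quoted, so a 4-thread game alone is enough to explain most of the i3/i5 gap without the game being unusually CPU-heavy.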

#132  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

BradW's large-scale RTS game type is the worst case for a CPU bottleneck, and Brad is talking about his game, NOT somebody else's game.

The Alien Isolation PC results with a 7770/7850 vs XBO/PS4 show a situation with near 2X frame-rate improvements. The XBO/PS4's lower-CPU-overhead APIs weren't able to bring the result on par with the PC version. We know the XBO has DX11's deferred-context MT model and the PS4 got a near-straight XBO port, hence the similar result.

Depending on workload, bottlenecks can swing between CPU and GPU. This is why serious PC gamers overkill on both CPU and GPU power, i.e. throwing hardware and $$$$ at the problem.

Intel Haswell is Intel's Bulldozer, i.e. each thread can have two integer units/two-instruction issue allocated, BUT a single thread can scale up to four integer units/four-instruction issue. Intel Ivy Bridge has 3 integer units for two threads, i.e. weaker hyper-threading capability.

AMD's Bulldozer design is missing the ability to create an "uber" single thread from its four integer units with four-instruction issue, i.e. it is forever gimped to near-Jaguar IPC, which is worse than the old AMD Phenom II. AMD Excavator mitigates some of Bulldozer's IPC issues, but it is still behind Intel's solution.

For a 15-watt budget (disregarding IPC equality), the AMD FX-8800p (two-module Excavator) is competitive against Intel's 15-watt solution.

Intel CPUs have an instruction-fusion feature that merges certain X86 instruction sequences into one instruction issue (an extreme CISC or VLIW approach), hence increasing effective IPC.

"Celeron" and "Core i" are just marketing names; what matters is the code name and the actual design, i.e. how many processing units and what instruction issue rate per cycle.

#133  Edited By tormentos
Member since 2003 • 33784 Posts

@GoldenElementXL said:

Yeah @tormentos, it's pretty much over. If the game was some CPU brute we would see that evidence here. And 8 frame difference with a 2 GHz clock difference........... You can go ahead and keep using your i3 (which is below min required specs) and an i5 all you want. This is apples to apples evidence. And Batman AK is far more CPU dependent.

The game is CPU intensive, fool.

You lost the argument already. I proved how Project Cars bottlenecks an i3 while Alien Isolation doesn't; that was my point, that is my whole argument, period. Prove otherwise; prove to me that the 750TI wasn't being held back by the i3's performance in Project Cars.

This ^^^ is a CPU bottleneck, period: same resolution, quality and GPU, but a 16FPS difference just from using a stronger CPU. Yeah, that 750ti is not getting enough juice from that i3. That is a FACT, not my opinion.

@04dcarraher said:

Quite being an idiot, suggesting that Cars is more demanding since an i5 performs better than an i3... That is just plain stupid when the performance is the same principle as taking an i5 downclocking to half the clockrate of the i3 of the same architecture with a game that makes use of 4 threads or more.

The only reason why AI might be abit more intensive on the consoles is because their using crappy cpus clocked around 1.6ghz. Suggesting that Cars is cpu intensive on pc using an i3 vs an i5 is just plain dumb. Since your dealing with a cpu with 4 real cores vs a cpu with only two cores that has two virtual cores splitting up the workloads. Which means that the i3 has 25% less processing power per thread vs an i5.

What is funny is that your overlooking and ignoring the facts at hand. Project cars only uses 4 threads on intel cpus to any moderate degree which means that an i3 will perform within 25% of an i5 or i7. For example i5 4670k gets 60 fps while i3 4330gets 48 fps with Cars with Titan X.

Alien Isolation is a game that is not as cpu intensive but looking the fact that the engine only used one core handling all gpu work which is why we seen i3 perform nearly as well as an i7 because real core vs real core they have similar processing power.

Again i3 is able to get 134 fps vs i5's 166 still within that 25% difference. With the Witcher 3 we see nearly 40% difference between i3 and i5 while we see an FX 4300 perform virtually the same as an i3 4330 because that game is much more optimized and demanding on the cpu. Where we see the i3 4330 being 48% faster clock per clock only have two cores vs FX 4300 have four cores gets evened out when your using virtually all cpu resources.

The fact that with Project cars FX 4300 being 45% slower than i3 4330 with Titan X shows use that this game's cpu to gpu communications is unoptimized on Pc and should not be used as proof of AI being demanding.....

I couldn't care less what you claim the load per core is, you fu**ing moron. Alien Isolation on a fu**ing i3 does 104FPS on a freaking R290X; it does 2 fewer frames than an i7 4770k. That means there is no fu**ing bottleneck on the CPU side, you fool; this game is not intensive at fu**ing all, man...

I even quoted the fu**ing site saying it: the game is not CPU demanding. And you, like a huge moron, follow Ronvalencia saying the same shit, when in reality not even a Celeron would be bottlenecked by this game. This chart ^^ here fu**ing owns your ass, period; this game is not CPU intensive at all.

Now, Project Cars is more CPU intensive, intensive enough to bottleneck a 750ti if you use an i3. That means the Celeron running AI at 89FPS would run like total shit in Project Cars. The game is more CPU demanding than AI, period; it is time you admit it and stop making excuses like a blind moron just because you are upset that I beat your ass in this argument.

FACT.

Alien Isolation.

i3 + R290X = 104 FPS

i7 + R290X = 106 FPS

PS4: 30FPS with drops when nothing is happening.

Xbox One: sub-30 when it has a FASTER CPU.

Project cars.

i3 + 750Ti = 34 FPS

i5 + 750TI = 50 FPS

PS4: 40s to 50s FPS, 6 cores at 1.6ghz

XBO: 30s to 40s FPS, 7 cores at 1.75ghz

This proves beyond doubt that AI's only problem was being a cash grab across 5 platforms; the game should have been locked at 60FPS on PS4 with those 6 cores, since a damn Celeron is enough to move it at 90FPS on a GPU that requires more CPU time to be fed.

Again you are grasping...

@ronvalencia said:

@tormentos:

BradW's large scale RTS game type is the worst case for CPU bottleneck situation and Brad is talking about his game NOT somebody else's game.

Alien Isolation PC with 7770/7850 vs XBO/PS4 result shows a situation for a near 2X frame rate improvements. XBO/PS4's lower CPU overhead APIs wasn't able to make the result on par with PC version. We know XBO has DX11's deferred context MT model and PS4 has a near straight XBO port, hence the similar result.

Depending on workload, bottlenecks can swing between CPU to GPU. This is why serious PC gamers overkills on both CPU and GPU power i.e. throwing hardware and $$$$ at the problem.

Intel Haswell is Intel's Bulldozer i.e. each thread can have two integer unit/two instruction issue allocated, BUT a single thread can scale up to four integer units/four instruction issue. Intel Ivybridge has 3 integer units for two threads i.e. weaker hyper-threading capabilities.

AMD's Bulldozer design is missing the ability create "uber" single thread from it's four integer units with four instruction issue i.e. forever gimped to near Jaguar IPC which is worst than the old AMD Phenom II. AMD Excavator mitigates some of the Bulldozer's IPC issues, but it still behind Intel's solution.

For 15 watts budget(disregard IPC equality), AMD FX-8800p (two module Excavator) is competitive against Intel's 15 watts solution.

Intel CPU has instruction fusion feature for certain X86 instruction sequence into one instruction issue (extreme CISC or VLIW approach), hence increasing it's effective IPC.

Intel "Celeron" or "Core I" series are just marketing names and what's important is it's code name and it's actual design i.e. how many processing units and instruction issue rate per cycle.

No, Brad Wardell wasn't talking about his game, you fool, when he claimed Xbox owners would get a GPU twice as fast, nor when he claimed the Xbox One was the biggest beneficiary.

Then he backtracked, because, like me, some sites started asking what I have been asking for a long time: why are there no tests of DX12 vs DX11.X on Xbox One? The Xbox One is MS's main gaming platform, over PC and the freaking Surface.

He was asked, and he backtracked from "twice as fast GPU" to "no one knows what it will do on Xbox One"..lol

You are an idiot; there are tons of games fu**ing broken on PC. Look at Batman: does that mean PC hardware is bad or bottlenecked by Batman? No, it means the developer did a crappy job on the game. AI's CPU performance says it all: an i3 does 2 fewer frames than an i7, so the game is not CPU intensive at all, while in Project Cars the i3 bottlenecks a 750ti, yet Project Cars gets faster frames on PS4 than AI does on PS4.

AI is not CPU intensive. You are a thick-headed moron who will argue any crap just to prove someone wrong. It is clear the game is not CPU intensive and that it is a screwed-up game on both Xbox One and PS4; if the CPU were the problem, the Xbox One should have performed better, since it has a faster CPU, period.

Not only that, have you seen DF's test of AI's frame drops when nothing at all is happening, on both Xbox One and PS4? This game is screwed up, not CPU limited, and you are too arrogant to admit you chose the wrong game to show that the consoles were being CPU bottlenecked; now you want to defend something you just can't, because the data shows it is not.

That Celeron is total shit, garbage, and the i3 should spank it to hell and beyond, and it does: even in AI it does 15FPS more, and that game is not CPU intensive. In Project Cars that G1820 would run like total garbage.

I leave you with this: look at the difference from an i3 to an i7 4770k, the same 2 CPUs as for Alien Isolation. The gap in Project Cars is 23 frames per second, yet in Alien Isolation the gap between them is 2 frames per second.

This should end the fight here; if you continue to argue, it is because, well, you are a thick-headed moron who can't admit being wrong.

Project Cars is more CPU intensive than Alien Isolation and yet it runs faster on PS4; AI should have been locked at 60 FPS on PS4, but it is not, because it was a cash grab...

#134 Zero_epyon
Member since 2004 • 20103 Posts

All this DX12 discussion is fine and all, but we still need to remember that DX12 did NOT make these gains; it was the devs making changes to the engine, aided by an SDK update from a few months ago. Crediting DX12 is misleading.

#135 deactivated-5cf3bfcedc29b
Member since 2014 • 776 Posts

Someday Lemmings will come to terms with the fact that the Xbox One will always be weaker than the PS4. If not, please stay on XBL :p

#136  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

Using NVIDIA GPU with driver side command list (deferred) MT support example

#137 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos So you are going to continue to use a CPU that doesn't meet the minimum specifications for Project Cars as an example of the game being CPU heavy........ Is it possible that the game was coded in a way that requires 4 or more cores to perform as intended? Who am I kidding? There is no way you are going to admit that.

After reading yet another idiotic post from you, I sat here and tried to figure out why you are trying to prove Project Cars is CPU intensive and why the PS4 version performing better than the Xbox One version is such a big deal. Then it hit me. Are you still upset over AC Unity?


Now excuse me while I take my Titan Blacks out and put a 3DFX Voodoo 2 from 1998 in so I can see what games are GPU heavy.

#138  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos:

You keep proving the point that you have no idea how things work... good job

Love how you keep ignoring the improper CPU-to-GPU communication in both Alien Iso and Project Cars on PC, and still ignore Alien Iso's single-threaded GPU usage on all platforms. And then ignoring the fact that an i3 has half the processing of an i5.....

Then you're mixing up Project Cars being coded properly against each console's API abilities when in fact you're seeing the GPU power difference between them....

#139 tormentos
Member since 2003 • 33784 Posts

@GoldenElementXL said:

@tormentos So you are going to continue to use a CPU that doesn't meet the minimum specifications for Project Cars as an example of the game being CPU heavy........ Is it possible that the game was coded in a way that requires 4 or more cores to perform as intended? Who am I kidding? There is no way you are going to admit that.

After reading yet another idiotic post from you, I sat here and tried to figure out why you are trying to prove Project Cars is CPU intensive and why the PS4 version performing better than the Xbox One version is such a big deal. Then it hit me. Are you still upset over AC Unity?

Now excuse me while I take my Titan Blacks out and put a 3DFX Voodoo 2 from 1998 in so I can see what games are GPU heavy.

Say what?

Minimum requirements :

CPU: 2.66 GHz Intel Core 2 Quad Q8400, 3.0 GHz AMD Phenom II X4 940

GRAPHICS: nVidia GTX 260, ATI Radeon HD 5770

MEMORY: 4Gb RAM, 1Gb VRAM

Recommended specification :

CPU: 3.5 GHz Intel Core i7 3700, 4.0 GHz AMD FX-8350

GRAPHICS: nVidia GT600 series, AMD Radeon HD7000 series

MEMORY: 8Gb RAM, 2Gb VRAM

Apparently you know shit about CPUs. I advise you to stay away from the topic so that you don't look even worse. The i3 4130 in that Project Cars benchmark beats both of the CPUs in the minimum requirements up there, the Core 2 Quad and the Phenom II X4 as well.

That is a 4th-gen Haswell CPU, launched at the end of 2013, and it is a more advanced i3 than previous versions and more advanced and efficient than the Core 2 Quad, so yeah, that i3 certainly meets the minimum requirements.

That dual-core Haswell is hyperthreaded, which means it works more or less like it has 4 cores; it has 2 cores and can execute 2 threads per core, so it's 2 cores, 4 threads.
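
As a rough illustration of why 2 cores/4 threads is not the same as 4 real cores (the 25% SMT gain below is an assumed ballpark for hyperthreading, not a measured number):

```python
# Illustrative model: SMT lets a second thread share each core's execution
# units, typically adding ~20-30% throughput per core rather than doubling it.
def relative_throughput(cores, smt=False, smt_gain=0.25):
    return cores * (1 + smt_gain if smt else 1)

print(relative_throughput(2, smt=True))  # 2.5 -- i3-style 2c/4t
print(relative_throughput(4))            # 4.0 -- a true quad core
```

So a hyperthreaded dual core behaves like somewhat more than 2 cores, but still well short of 4.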

In fact, did you even look at the chart? That i3 with 2 cores at 3.4GHz beats my FX-6350, which has 6 cores at 3.9GHz, and is 3 frames away from the FX-8350, which is an 8-core CPU..

So your excuse is invalid: if the game were coded to perform with more than 4 cores, wouldn't the FX-6350 beat it? It has 6 cores.

You are an idiot who knows shit about CPUs. You just claimed the i3 4130 doesn't meet the minimum spec of Project Cars when in reality it exceeds it, unless you want to argue now that a Core 2 Quad Q8400 is more efficient or better than an i3 4130 lol...

So just move along and go pretend elsewhere, fool.

@ronvalencia said:

@tormentos:

Using NVIDIA GPU with driver side command list (deferred) MT support example

I

What is your point, and why did you link to an SLI benchmark?

@04dcarraher said:

@tormentos:

You keep proving the point you have no idea how things work... good job

Love how you keep on ignoring improper cpu to gpu communications on both alien iso and project cars on pc and still ignore alien iso's single threaded gpu usage on all platforms. and then ignoring the fact that i3 has half the processing of an i5.....

Then your mixing up project cars being coded correctly on each consoles API's abilities when in fact your seeing the gpu power difference between them....

You claimed Alien Isolation is CPU intensive and put that as the reason for the game being 30FPS on PS4.

It's not: an i3 performs within 2 frames of an i7. There is no fu**ing way in hell that an i7 would be just 2 frames away in a CPU-intensive game. Who in his right mind would pay for an i7 if the i3 could actually almost match the performance an i7 has?

http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html

Oh look, the i3 is 13 FPS away from the i7 3770.

Oh, I think it is you who is ignoring that the game is not CPU intensive. The benchmarks have been posted: an i3 is 2 frames shy of an i7, there is nothing CPU intensive about that game.

No, the PS4 GPU isn't powerful enough to explain a 40% advantage in resolution and up to 40% faster frames too. The PS4 version of Project Cars not only runs faster by up to 14 FPS, it is also 1080p vs the xbox one's 900p version.
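
The resolution part of that claim is simple pixel arithmetic: 1920x1080 has 44% more pixels than 1600x900, which is where the roughly 40% figure comes from. A quick check:

```python
# Pixel-count gap between the two render resolutions.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_900p = 1600 * 900    # 1,440,000

advantage = pixels_1080p / pixels_900p - 1
print(f"{advantage:.0%}")   # 44%
```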

#140 FireEmblem_Man
Member since 2004 • 20248 Posts

@tormentos said:
@GoldenElementXL said:

@tormentos So you are going to continue to use a CPU that doesn't meet the minimum specifications for Project Cars as an example of the game being CPU heavy........ Is it possible that the game was coded in a way that requires 4 or more cores to perform as intended? Who am I kidding? There is no way you are going to admit that.

After reading yet another idiotic post from you, I sat here and tried to figure out why you are trying to prove Project Cars is CPU intensive and why the PS4 version performing better than the Xbox One version is such a big deal. Then it hit me. Are you still upset over AC Unity?

Now excuse me while I take my Titan Blacks out and put a 3DFX Voodoo 2 from 1998 in so I can see what games are GPU heavy.

Say what.?

Minimum requirements :

CPU: 2.66 GHz Intel Core 2 Quad Q8400, 3.0 GHz AMD Phenom II X4 940

GRAPHICS: nVidia GTX 260, ATI Radeon HD 5770

MEMORY: 4Gb RAM, 1Gb VRAM

Recommended specification :

CPU: 3.5 GHz Intel Core i7 3700, 4.0 GHz AMD FX-8350

GRAPHICS: nVidia GT600 series, AMD Radeon HD7000 series

MEMORY: 8Gb RAM, 2Gb VRAM

Apparently you know shit about CPU i advice you to stay away of the topic so that you don't look even worse,the i3 4130 on that Project Cars benchmark beat both of the CPU on the minimum requirement up there the core 2 duo quad and the Phenom ll X4 as well.

That is a 4h gen Haswel CPU it launched on the end of 2013 and is a more advance i3 than previous version and more advance and efficient than the core duo quad so yeah that i3 certainly meet the minimum requirements.

That dual core haswell is hyperthreaded which mean it work more or less like it has 4 cores,when it has 2 it has 2 cores and can execute 2 threads per core,so is 2 core 4 threads.

In fact did you even look at the chart.? That i3 with 2 cores 3.4ghz beat my FX6350 which has 6 cores at 3.9ghz,and is 3 frames away from the FX8350 which is a 8 cores CPU..

So your excuse is invalid if the game was code to perform with more than 4 cores wouldn't the FX6350 beat it.? It has 6 cores.

You are an idiot who know shit about CPU, you just claim the i3 4130 doesn't meet the minimum spec of Project cars when in reality it exceed them,unless you wan to argue now that a core 2 duo quad 8400 is more efficient or better than a i2 4130 lol...

So just move alone and go pretend elsewhere fool.

@ronvalencia said:

@tormentos:

Using NVIDIA GPU with driver side command list (deferred) MT support example

I

What is your point and why you link to a SLI benchmark.?

@04dcarraher said:

@tormentos:

You keep proving the point you have no idea how things work... good job

Love how you keep on ignoring improper cpu to gpu communications on both alien iso and project cars on pc and still ignore alien iso's single threaded gpu usage on all platforms. and then ignoring the fact that i3 has half the processing of an i5.....

Then your mixing up project cars being coded correctly on each consoles API's abilities when in fact your seeing the gpu power difference between them....

You claimed Alien Isolation is CPU intensive and put it as the reason for the game been 30FPS on PS4.

Is not an i3 perform 2 frames from an i7 there is no fu**ing way in hell that the i7 is just 2 frames away in CPU intensive games who in his right mind would pay for an i7 if the i3 could actually almost match the performance an i7 has.

http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html

Oh look the i3 13 FPS away from the i7 3770.

Oh i think it is you who is ignoring that the game is not CPU intensive,the benchmarks have been posted an i3 is 2 frames shy of a i7 there is nothing intensive CPU wise about that game.

No the PS4 GPU isn't as powerful as to have 40% advantage in resolution and up to 40% faster frames to,the PS4 version of PC not only runs faster by up to 14 FPS is also 1080p vs the xbox one 900p version.

LOL Tomatoes is having a meltdown! Where are your blog posts now?

#141  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@tormentos said:

You claimed Alien Isolation is CPU intensive and put it as the reason for the game been 30FPS on PS4.

Is not an i3 perform 2 frames from an i7 there is no fu**ing way in hell that the i7 is just 2 frames away in CPU intensive games who in his right mind would pay for an i7 if the i3 could actually almost match the performance an i7 has.

http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html

Oh look the i3 13 FPS away from the i7 3770.

Oh i think it is you who is ignoring that the game is not CPU intensive,the benchmarks have been posted an i3 is 2 frames shy of a i7 there is nothing intensive CPU wise about that game.

No the PS4 GPU isn't as powerful as to have 40% advantage in resolution and up to 40% faster frames to,the PS4 version of PC not only runs faster by up to 14 FPS is also 1080p vs the xbox one 900p version.

lol, you're making crap up again

I pointed out that Project Cars isn't as demanding as you think it is because of your idiotic claims about an i3 vs an i5; as proof I showed the CPU usage between the two games on an i3

Why are you posting a benchmark that uses medium settings with a test run that is not demanding and is a linear, cut-scene-type sequence.....

Right from the techspot bench run used:

"Using FRAPS we recorded 120 seconds of gameplay starting from the first time Geralt mounts Roach (his trusty steed) and rides toward a Griffin attacking a villager. The test ends when the Griffin flies away with the villager's horse."

The same test was done by wccftech as above, and all CPUs tested performed the same, since that scene hardly did anything besides render the three characters and the griffin

Later on...... when benchmarking

"

But once we do a tour of Novigrad City on horseback… well, then we see some big changes. This area can hit 80% utilization across all eight threads on a Core i7 4790K! The less powerful the CPU here, the more CPU stutter you encounter. The G3258 – overclocked to 4.5GHz – doesn’t work out too well.

Also, check out the performance of the FX-6300 and FX-8350 starting from 01:26. In the initial cut-scenes they fall a little short of the Core i3 and i5, but once we hit the heavy Novigrad area, they compete very nicely – the FX 6300 moves ahead of the i3, while the FX 8350 is very close to the 4690K.

So, is this CPU stutter found in the Novigrad stress test a cause for concern? Well, not really. Most in-game stutter is caused when the CPU – not the GPU – is the bottleneck (and that’s what we are testing here). As long as you pair your more budget-orientated CPU with an appropriate GPU, you’ll hit the GPU limit first – and typically that doesn’t cause stutter.

Across all three scenes, the benchmarks work out as follows (low/avg) but do check out the CPU stress test scene to get a better idea of latency and performance under load. A benchmark sums up performance with a single number, but real-life can be a bit more complicated than that:

Core i7 4790K – 65.0/84.4

Core i5 4690K – 52.0/79.2

Core i3 4130 – 38.0/68.9

G3258 at 4.5GHz OC – 27.0/63.8

FX 8350 – 56.0/75.2

FX 6300 – 45.0/69.6

"

Yeah, so "only 2 fps"? Um, yeah, no, try again......

You're the one using a flawed argument based on flawed and incorrect ideas, just because an i3 performs worse than an i5 in a game that is known to have mediocre CPU coding.....

" Project CARS suffers from major CPU issues that will affect all PC gamers. In order to find out whether the game scales well on multiple CPU cores, we simulated a dual-core and a quad-core CPU. At first glance – and by simply looking at the Performance tab of the Task Manager – it appears that even our hexa-core system is being pushed to its limits. However, in this particular case, this tab cannot be trusted. As we can see in the next image, Project CARS used only 50% of our CPU."

Fact is that your ideas are flat-out wrong and based on nothing but your twisted logic.

#142  Edited By deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@tormentos said:

So just move alone and go pretend elsewhere fool.

Who's pretending, the one on top or the one on bottom?

#143  Edited By ronvalencia
Member since 2008 • 29612 Posts

@tormentos:

The reason I used an NVIDIA benchmark is consistency with a particular driver-side feature support.

When talking about CPU issues, you can't compare across different GPU families.

Minimising the GPU bottleneck via NVIDIA SLI, a desktop PC CPU can scale with MT in the Alien Isolation PC build. With the right DX11 driver support, a desktop Intel CPU's single-thread performance can still handle the serialization from deferred-context threads.

Most of your PC arguments have nothing to do with the Alien Isolation XBO/PS4 builds' issues, i.e. problems on the CPU side, which is shown via the PC's superior 7770 and 7850 results. If MS and Sony hadn't disabled AMD Jaguar's turbo overclock feature for narrow-thread-count workloads, they could have avoided the Alien Isolation XBO/PS4 results.

A 2.4GHz Jaguar CPU thread yields about 150 percent of the 1.6GHz performance, i.e. roughly a 50 percent improvement.

The 12-to-34 watt AMD FX-8800P APU has a 3.4GHz turbo overclock from a 2.4GHz base clock, which is better suited to DX11's multithreading model.

For the PS4/XBO GPU level, a single thread at 2.6GHz and 1.6GHz for the other threads would be fine for DX11's deferred-context threads being serialized into the immediate-context thread.

Both Sony and MS went towards X86 and they did NOT fully include the desktop PC X86 feature set. My old Intel Core i7-740QM has a 3.2GHz turbo overclock with a 1.7GHz base clock, i.e. Intel knows about the DX11 multithreading model issue.
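
The serialization being described can be sketched abstractly (a toy Python model of the DX11-style deferred/immediate split, not real D3D API calls): worker threads record command lists in parallel, but a single immediate context must still play every list back, so one thread's speed bounds total submission:

```python
import queue
import threading

def record_commands(worker_id, out):
    # Deferred context: each thread records its own command list in parallel.
    cmds = [f"draw(worker={worker_id}, obj={i})" for i in range(3)]
    out.put((worker_id, cmds))

lists = queue.Queue()
threads = [threading.Thread(target=record_commands, args=(w, lists)) for w in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Immediate context: a single thread submits every recorded list in turn,
# so its single-thread performance caps the whole frame's submission rate.
submitted = []
while not lists.empty():
    worker_id, cmds = lists.get()
    submitted.extend(cmds)

print(len(submitted))  # 12 commands, all funneled through one thread
```

This is why a high single-thread turbo clock helps DX11's multithreading model even when recording scales across cores.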

#144  Edited By tormentos
Member since 2003 • 33784 Posts

@ronvalencia said:

@tormentos:

The reason I used NVIDIA benchmark is for consistency with a particlar driver side feature support.

When talking about CPU issues, you can't compare with different GPU families.

Minimising the GPU bottlenecks via NVIDIA SLI, desktop PC CPU can scale with MT with Alien Isolation PC build. With the right DX11 driver support, desktop Intel CPU's single thread performance can still handle serailization from deferred context threads.

Most of your PC arguments has nothing to do with Alien Isolation XBO/PS4 build issues i.e. problems with the CPU side which is shown via PC's superior 7770 and 7850 results. If MS and Sony didn't disable AMD Jaguar's turbo overclock feature for narrow thread count workloads, it could have avoided the Alien Isolation XBO/PS4 results.

2.4Ghz Jaguar CPU thread yields about 150 percent improvements from 1.6Ghz.

12-to-34 watts AMD FX-8800p APU has 3.4Ghz turbo overclock from 2.4Ghz base clock which is better designed for DX11's multithreading model.

For PS4 and XBO GPU level, a single thread at 2.6Ghz and 1.6Ghz for other threads would be fine for DX11's deferred context threads being serialized into immediate thread model.

Both Sony and MS went towards X86 and they did NOT fully include desktop PC X86's feature set. My old Intel Core i7-740QM has 3.2 Ghz turbo overclock feature with 1.7 Ghz base clock i.e. Intel knows about DX11 multithreading model issue.

Dude, when you prove to me the PS4 has an SLI setup, that will be the moment your post makes sense.

AI is not CPU intensive, all the benchmarks show it; even a Celeron passes 80 frames, which is more than great.

Project Cars is more CPU intensive than AI and still runs faster on PS4 and xbox one.

@GoldenElementXL said:
@tormentos said:

So just move alone and go pretend elsewhere fool.

Who's pretending, the one on top or the one on bottom?

Wait, so because you can throw money away on a great setup, that makes you a CPU expert? You don't need to be a computer engineer to buy a great setup, buffoon, all you need is money. Now, getting back to my point: I completely tore you a new one. You claimed the i3 4130 didn't meet the minimum requirements for Project Cars when in fact it exceeds them; it is a more powerful and efficient CPU than those in the minimum requirements.

I owned your ass, and the best thing you can do about it is brag about your expensive setup...hahahahaa

You claimed that Project Cars may be coded to use 4 cores, when in reality that i3 runs 4 threads, and that is enough to outdo CPUs with 6 cores.

Yeah, go flash your PC elsewhere, you are a fool and proven wrong already.

@04dcarraher said:

lol you making crap up again

I pointed out that Project cars isnt as demanding as you think it is because your idiotic claims about an i3 vs i5, as proof showed cpu usage between the two games with an i3

why are you posting a benchmark using medium settings with a test run that is not demanding, and is linear type cut-scene.....

right from techspot bench run used:

"Using FRAPS we recorded 120 seconds of gameplay starting from the first time Geralt mounts Roach (his trusty steed) and rides toward a Griffin attacking a villager. The test ends when the Griffin flies away with the villager's horse."

Same test was done by wccftech as above and all cpu's tested performed the same since that scene hardly did anything besides render the three characters and the griffin

Later on...... when benchmarking

"

But once we do a tour of Novigrad City on horseback… well, then we see some big changes. This area can hit 80% utilization across all eight threads on a Core i7 4790K! The less powerful the CPU here, the more CPU stutter you encounter. The G3258 – overclocked to 4.5GHz – doesn’t work out too well.

Also, check out the performance of the FX-6300 and FX-8350 starting from 01:26. In the initial cut-scenes they fall a little short of the Core i3 and i5, but once we hit the heavy Novigrad area, they compete very nicely – the FX 6300 moves ahead of the i3, while the FX 8350 is very close to the 4690K.

So, is this CPU stutter found in the Novigrad stress test a cause for concern? Well, not really. Most in-game stutter is caused when the CPU – not the GPU – is the bottleneck (and that’s what we are testing here). As long as you pair your more budget-orientated CPU with an appropriate GPU, you’ll hit the GPU limit first – and typically that doesn’t cause stutter.

Across all three scenes, the benchmarks work out as follows (low/avg) but do check out the CPU stress test scene to get a better idea of latency and performance under load. A benchmark sums up performance with a single number, but real-life can be a bit more complicated than that:

Core i7 4790K – 65.0/84.4

Core i5 4690K – 52.0/79.2

Core i3 4130 – 38.0/68.9

G3258 at 4.5GHz OC – 27.0/63.8

FX 8350 – 56.0/75.2

FX 6300 – 45.0/69.6

"

Yeah so only 2 fps um yea no try again......

Your the one who using the flawed argument based on flawed and incorrect ideas that because an i3 performs worse than an i5 with an game that is known to have mediocre cpu coding.....

" Project CARS suffers from major CPU issues that will affect all PC gamers. In order to find out whether the game scales well on multiple CPU cores, we simulated a dual-core and a quad-core CPU. At first glance – and by simply looking at the Performance tab of the Task Manager – it appears that even our hexa-core system is being pushed to its limits. However, in this particular case, this tab cannot be trusted. As we can see in the next image, Project CARS used only 50% of our CPU."

Fact is that your ideas are flat out wrong and based on nothing but your twisted logic.

Thanks again for proving my point: in CPU-intensive games the difference will never be 2 fu**ing frames per second between an i3 and an i7. This has been my point, so yeah, fu**ing Alien Isolation is not CPU intensive, period.

Second, post the link to the complete Project Cars article you just quoted and stop hiding.

Yeah, my twisted logic...

Looking at the Nvidia side, the full-fat Project Cars experience is very much designed with top-end machines in mind, and even our standard test setup struggles to go all-out. A top-end card like the GTX 780 Ti or GTX 980 (backed by a Core i7-3770K PC and 16GB of RAM) shows a 60fps return at high settings during clear weather - though just like the console editions it drops as soon as rain effects come into play. The frame-rate is cleaved in half to 30fps on a soaked Azure Circuit here, especially once all 44 rival cars are in view. Dropping all settings to high and reining the car count in to a more palatable 20 vehicles brings us back into locked 60fps territory. It's a compromise that sits well with the career mode, especially as it encourages races of this size by default.

Even on PC with an i7 3770K the game drops from 60 to 30 when all 44 AI cars are in view. Rendering the rain and weather effects hits the CPU hard; remember, each drop is a draw call to the GPU.

There's plenty of scope for scalability in Project Cars on PC, but while Nvidia's stalwart GTX 750 Ti makes a good fist of matching PS4 visuals, a quad-core processor is required to maintain frame-rate in vehicle-heavy races. It's rare that we become CPU-bound with an entry-level enthusiast GPU like the 750 Ti, but the results here speak for themselves.

GTFO, it is not my twisted logic, you blind, biased moron, I am quoting a site here..

http://www.eurogamer.net/articles/digitalfoundry-2015-project-cars-face-off

They proved pretty well that the 750 Ti was getting bottlenecked by an i3. I could care less about your lame-ass excuses; the end result is there, 16 FPS less by using an i3. That is a CPU bottleneck, whether you want to admit it or not.

Worse, you have the face to claim AI is CPU intensive when in reality it is not, and that same i3 will perform frame for frame almost as well as a more expensive i7 4770K.

CPUs with four or more cores need not apply -- Alien: Isolation is seeking an affordable dual-core Core i3 processor but will also accept lowly Celeron, FX or Phenom II chips. Of course, that could change when running either SLI or Crossfire, but throwing that kind of GPU power at Alien: Isolation is a waste.

Can you fu**ing read? This isn't my twisted logic, my ultra-butthurt friend, this is Techspot, a damn PC site that does tons of benchmarks, and they are fu**ing telling you that you don't even need a quad-core CPU for Alien Isolation, that an i3 will do, and that the game will even accept lowly Celeron CPUs.

Given how well the Alien: Isolation benchmark ran on a Celeron G1820, we aren't at all surprised to find that there is no difference in performance between the Core i7-4770K clocked at 2.50GHz and 4.50GHz.

They dropped 2GHz in clock speed and not a single frame was lost.

http://www.techspot.com/review/903-alien-isolation-benchmarks/page5.html

Not only are you trying to prove that Project Cars is not CPU intensive, but you are so stupid that you hold on to Alien Isolation being CPU intensive, which is a joke; even a Celeron will run it fine. There is nothing more to say here than get a damn backbone and admit you lost the argument. You look really stupid arguing that AI was CPU intensive while trying to say Project Cars was not..lol

#145 PAL360
Member since 2007 • 30570 Posts

Nice! Devs are realising that native 1080p is the only acceptable resolution for the current generation.

#146  Edited By 04dcarraher
Member since 2004 • 23829 Posts

lol el tormentos

You're just foaming at the mouth with nonsense

Project Cars on PC is not coded to the same quality as on the consoles..... Love how you keep ignoring and misreading the facts.

Ignoring CPU usage between games, ignoring improper CPU-to-GPU threading..... and a lack of processing power does not mean "more demanding"

For example, GRID can also run 20 AI racers and can be run on old dual cores........ compared to the test done by techspot with 20....

The fact that Project Cars' performance tanks with CPUs that are actually stronger than i3's, i.e. AMD FX 8/9's, shows us the lack of good multithreading.

Fact is, if Project Cars were AI intensive and required a lot of CPU resources to run, you would see sizable gains going from stock clocks to 1GHz+ overclocks on the same CPUs. All we see are minor gains, which means this game is not that intensive on the CPU even with a larger GPU load.....

Improper CPU usage that requires the brute-force per-core advantage of a CPU architecture is not proof of how demanding a game is, nor does comparing a weaker CPU to a stronger one prove that a game is actually demanding.....

#147  Edited By tormentos
Member since 2003 • 33784 Posts

@04dcarraher said:

lol el tormentos

Your just foaming at your mouth with nonsense

Project cars Pc is coded not to the same quality as on consoles..... Love how you keep ignoring the facts.

Ignoring cpu usage between games, ignoring improper cpu to gpu threading.....

For example GRID also can 20 AI racers and can be ran on old Dual cores........ compared to test done by techspot

The fact that with project cars performance tanks with cpus that are actually stronger than i3's ie AMD FX 8's shows us the lack of good multithreading.

Fact is if Project cars was AI intensive and required alot of cpu resources to run you would see major gains going from stock clocks to 1ghz+ overclocks on the same cpus which we do not see.......

Where is your proof that it wasn't coded to the same quality on PC? You know Project Cars was a PC game first, right? It was announced for PC before there was a single line of code done on PS4.

http://www.ign.com/articles/2013/11/08/project-cars-now-headed-to-xbox-one-ps4-and-valves-steam-os

Want to talk about forgetting things, fool? Have you forgotten the PS4 and xbox one have major weak-sauce CPUs which don't even compete with an i3?

Yeah, that makes sense, comparing 2 different games to claim one is badly coded to justify it being more CPU intensive.

Dude, that test you are showing is with 30 cars, not 20, and with rain; every drop of rain is a draw call through those CPUs and even more strain. But you didn't know that, did you? Yeah, you knew, you just ignored it.

The game is CPU intensive, more so than the one you claim is CPU intensive, AI...

#148 Zero_epyon
Member since 2004 • 20103 Posts

@tormentos said:
@04dcarraher said:

lol el tormentos

You're just foaming at the mouth with nonsense.

Project Cars on PC is not coded to the same quality as on consoles... love how you keep ignoring the facts.

Ignoring CPU usage between games, ignoring improper CPU-to-GPU threading...

For example, GRID also runs 20 AI racers and can be run on old dual cores, going by tests done by TechSpot.

The fact that Project Cars' performance tanks on CPUs that are actually stronger than i3s, i.e. AMD FX 8-cores, shows the lack of good multithreading.

Fact is, if Project Cars were AI intensive and required a lot of CPU resources to run, you would see major gains going from stock clocks to 1 GHz+ overclocks on the same CPUs, which we do not see...

Where is your proof that it wasn't coded at the same quality on PC? You know Project Cars was a PC game first, right? It was announced for PC before a single line of code was written for PS4.

http://www.ign.com/articles/2013/11/08/project-cars-now-headed-to-xbox-one-ps4-and-valves-steam-os

Want to talk about forgetting things, fool? Have you forgotten that the PS4 and Xbox One have major weak-sauce CPUs that don't even compete with an i3?

Yeah, that makes sense: comparing two different games to claim one is badly coded, just to justify it being more CPU intensive.

Dude, that test you are showing is with 30 cars, not 20, and with rain; every drop of rain is a draw call on those CPUs and even more strain. But you didn't know that, did you? Yeah, you knew, you just ignored it.

The game is more CPU intensive than the one you claim is CPU intensive because of AI...

http://gamingbolt.com/project-cars-ps4-using-99-of-cpu-cores-and-95-gpu-new-tracks-wii-u-version-details-and-more

Project Cars is a CPU-intensive game, straight from the developer's mouth. Next?

#149  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Zero_epyon said:

http://gamingbolt.com/project-cars-ps4-using-99-of-cpu-cores-and-95-gpu-new-tracks-wii-u-version-details-and-more

Project Cars is a CPU-intensive game, straight from the developer's mouth. Next?

That is on PS4 and X1, not on PC...

On PC, Project Cars relies on brute force from the architecture's per-core speed rather than maximizing CPU usage across more than 4 threads, even with top-tier GPUs; i7 2600K usage averages only 44%. Meanwhile, other games like The Witcher 3 are 25-30% more CPU intensive than Project Cars on the same CPU and let the AMD CPUs actually outperform i3s, which means Project Cars does not multithread too well.
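The "brute force per core" point can be illustrated with Amdahl's law: when only a small fraction of the per-frame work is spread across threads, an FX chip's extra cores buy very little, while an i3's stronger per-core throughput speeds up everything. A sketch with an assumed 30% parallel fraction, which is an illustrative number, not a measurement from Project Cars:

```python
# Amdahl's-law sketch: speedup from extra cores vs. a higher clock,
# assuming only a fixed fraction of per-frame work parallelizes.
# The 0.3 parallel fraction is a made-up illustrative number.

def speedup_from_cores(parallel_fraction, cores):
    """Speedup over 1 core when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

def speedup_from_clock(base_ghz, overclocked_ghz):
    """A clock bump speeds up serial and parallel work alike."""
    return overclocked_ghz / base_ghz

# Poorly threaded engine: only 30% of frame work runs in parallel.
print(round(speedup_from_cores(0.3, 8), 2))    # ~1.36x from 8 cores
# A 4.0 -> 5.0 GHz bump in per-core speed gives 1.25x across the board.
print(round(speedup_from_clock(4.0, 5.0), 2))  # 1.25x
```

Under that assumption, eight weak cores barely beat a fast dual-core, which is the shape of the FX-vs-i3 results being argued about.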

#150  Edited By Zero_epyon
Member since 2004 • 20103 Posts

@04dcarraher said:
@Zero_epyon said:

http://gamingbolt.com/project-cars-ps4-using-99-of-cpu-cores-and-95-gpu-new-tracks-wii-u-version-details-and-more

Project Cars is a CPU-intensive game, straight from the developer's mouth. Next?

That is on PS4 and X1, not on PC...

On PC, Project Cars relies on brute force from the architecture rather than maximizing CPU usage, even with top-tier GPUs.

Then what does that have to do with DX12 on Xbox One? Isn't the whole point of this to show whether DX12 will do anything for it? Also, why would it be CPU intensive on a console and not on a PC?