Batman: Arkham Knight system requirements - PC Gamer

#1 biggest_loser
Member since 2007 • 24508 Posts

Minimum System Requirements

  • OS: Win 7 SP1, Win 8.1 (64-bit Operating System Required)
  • Processor: Intel Core i5-750, 2.67 GHz | AMD Phenom II X4 965, 3.4 GHz
  • Memory: 6 GB RAM
  • Graphics Card: NVIDIA GeForce GTX 660
  • Graphics Memory: 2 GB
  • DirectX®: 11
  • Network: Broadband Internet Connection Required
  • Hard Drive Space: 45 GB

Recommended System Requirements

  • OS: Win 7 SP1, Win 8.1 (64-bit Operating System Required)
  • Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
  • Memory: 8 GB RAM
  • Graphics Card: NVIDIA GeForce GTX 760
  • Graphics Memory: 3 GB
  • DirectX®: 11
  • Network: Broadband Internet Connection Required
  • Hard Drive Space: 55 GB

ULTRA System Requirements

  • OS: Win 7 SP1, Win 8.1 (64-bit Operating System Required)
  • Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
  • Memory: 8 GB RAM
  • Graphics Card: NVIDIA GeForce GTX 980
  • Graphics Memory: 3 GB
  • DirectX®: 11
  • Network: Broadband Internet Connection Required
  • Hard Drive Space: 55 GB

http://www.pcgamer.com/arkham-knight-system-requirements-revealed/
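As a quick aside: if you want to check your own card against the 2GB graphics memory minimum above, a minimal sketch using the DXGI adapter query on Windows looks like the following (illustrative code, not from the article; link against dxgi.lib, and note it reports dedicated VRAM only):

    // Query the primary GPU's dedicated VRAM via DXGI and compare it
    // against the 2 GB minimum from the list above.
    #include <windows.h>
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        if (factory->EnumAdapters(0, &adapter) == DXGI_ERROR_NOT_FOUND) {
            factory->Release();
            return 1;
        }

        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);

        const unsigned long long minVram = 2ull << 30; // 2 GB in bytes
        wprintf(L"GPU: %s, dedicated VRAM: %llu MB\n", desc.Description,
                (unsigned long long)desc.DedicatedVideoMemory / (1024 * 1024));
        wprintf(desc.DedicatedVideoMemory >= minVram
                    ? L"Meets the 2 GB minimum.\n"
                    : L"Below the 2 GB minimum.\n");

        adapter->Release();
        factory->Release();
        return 0;
    }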

#2 BassMan
Member since 2002 • 17799 Posts

I'm good for Ultra. I just hope it launches with SLI support.

#3  Edited By Coseniath
Member since 2004 • 3183 Posts

I just read it too.

So the Ultra requirements call for 3GB of VRAM and the GPU power of a GTX 980.

I can't wait to see the in-game graphics. :D

Also from Gamepur:

Batman: Arkham Knight and The Witcher 3 Will Get DX12 Support via Patch, No Details On Xbox One Version

The Witcher 3: Wild Hunt and Batman: Arkham Knight on PC will support Microsoft's upcoming API, DirectX 12, according to new details shared by MSI, the well-known motherboard and graphics card firm. The confirmation came from a new report posted today on MSI's official website. It reads:

"The most exciting part of Windows 10 for gamers is the introduction of DirectX 12. You are likely to start seeing the benefits of the new graphics technology already in The Witcher 3, Batman: Arkham Knight and more games released later this year. Some CPU-bound games like MMOs are able to get a performance bump of up to 50% according to Microsoft."

Microsoft has not issued any confirmation yet, so take it with a grain of salt and consider it a rumor for the time being. We have contacted our sources at Microsoft for official word and will update this post as soon as we hear something from them.

According to rumors on the internet, Windows 10 is scheduled to arrive in July 2015, by which time both of these games will already have been released. Hence the statement on MSI's official website hints that DX12 support would be added via a post-release patch.

So DX12 for Batman and Witcher!!!!
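(If the patch materializes, the runtime side is easy to picture: a hypothetical startup check could try to create a D3D12 device and fall back to the shipped DX11 renderer if the OS or driver can't provide one. A sketch, assuming the Windows 10 SDK; a real game would LoadLibrary d3d12.dll rather than link it, since the DLL doesn't exist before Windows 10:)

    // Hypothetical DX12-or-DX11 fallback check a patched game might run at startup.
    #include <windows.h>
    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        ID3D12Device* device = nullptr;
        HRESULT hr = D3D12CreateDevice(nullptr,                // default adapter
                                       D3D_FEATURE_LEVEL_11_0, // minimum feature level
                                       IID_PPV_ARGS(&device));
        if (SUCCEEDED(hr)) {
            std::printf("DX12 available, using the patched renderer.\n");
            device->Release();
        } else {
            std::printf("No DX12 runtime, staying on the DX11 path.\n");
        }
        return 0;
    }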

#4  Edited By MirkoS77
Member since 2011 • 17657 Posts

@Coseniath: Yeah, just a bit too bad it releases only a month after Batman. Shame. And I can't wait to play this; it's going to be day one.

#5 Addict187
Member since 2008 • 1128 Posts

I am sad now. All I've got is a GTX 670 and an i5 2500k. My PC feels old nowadays.

#6  Edited By KHAndAnime
Member since 2009 • 17565 Posts

Nice. I'd like to SLI my GTX 980 by the release of the game, but I won't get my hopes up.

#7 mjorh
Member since 2011 • 6749 Posts

I don't believe this BS for Recommended:

  • Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz

Doesn't add up ...

#8 mjorh
Member since 2011 • 6749 Posts

@Addict187 said:

I am sad now. All I've got is a GTX 670 and an i5 2500k. My PC feels old nowadays.

I have an i5 2500k, but seriously, I don't get why they've recommended an i7-3770 for both Recommended and Ultra...

#9 horgen  Moderator
Member since 2006 • 127502 Posts

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

#10 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@mjorh said:

@Addict187 said:

I am sad now. All I've got is a GTX 670 and an i5 2500k. My PC feels old nowadays.

I have an i5 2500k, but seriously, I don't get why they've recommended an i7-3770 for both Recommended and Ultra...

.. It's pretty funny they are using the FX 8350 as the AMD equivalent.. An i5 2500k, especially when you overclock it, beats that chip into the ground... Hell, an i5 2500k overclocked to 4.4/4.5GHz surpasses a stock i5 4690k..

#11  Edited By the-obiwan
Member since 2003 • 3747 Posts

hey guys, I'm new to the *gaming pc* department. would my pc be able to run this? at what res and settings?

i5 4590 3.3 GHz, Asus GTX 750 Ti OC 2GB, 8GB RAM.

i just built this computer. it's budget-like, but it's all I could afford xD.

#12 04dcarraher
Member since 2004 • 23829 Posts

@the-obiwan said:

hey guys, I'm new to the *gaming pc* department. would my pc be able to run this? at what res and settings?

i5 4590 3.3 GHz, Asus GTX 750 Ti OC 2GB, 8GB RAM.

i just built this computer. it's budget-like, but it's all I could afford xD.

You might be able to play it on low at resolutions lower than 1080p. Just wait and see; you can't take system requirements at face value.

#13 the-obiwan
Member since 2003 • 3747 Posts

i was aiming for medium :x but i guess it's gonna have to be a wait-and-see thing. thanks though :)

#14  Edited By 04dcarraher
Member since 2004 • 23829 Posts
@horgen said:

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as do the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value. Take them with a grain of salt until the game has been tested and played.

@sSubZerOo said:

@mjorh said:

@Addict187 said:

I am sad now. All I've got is a GTX 670 and an i5 2500k. My PC feels old nowadays.

I have an i5 2500k, but seriously, I don't get why they've recommended an i7-3770 for both Recommended and Ultra...

.. It's pretty funny they are using the FX 8350 as the AMD equivalent.. An i5 2500k, especially when you overclock it, beats that chip into the ground... Hell, an i5 2500k overclocked to 4.4/4.5GHz surpasses a stock i5 4690k..

Chances are the game makes use of eight threads, hence the requirement for the i7 and FX 8. At the same time, the FX 8350 is AMD's most common high-end CPU; you can't recommend anything better from AMD unless you start asking for FX 9000s, which are rare.

#15 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:
@horgen said:

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as do the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value. Take them with a grain of salt until the game has been tested and played.

@sSubZerOo said:

@mjorh said:

@Addict187 said:

I am sad now. All I've got is a GTX 670 and an i5 2500k. My PC feels old nowadays.

I have an i5 2500k, but seriously, I don't get why they've recommended an i7-3770 for both Recommended and Ultra...

.. It's pretty funny they are using the FX 8350 as the AMD equivalent.. An i5 2500k, especially when you overclock it, beats that chip into the ground... Hell, an i5 2500k overclocked to 4.4/4.5GHz surpasses a stock i5 4690k..

Chances are the game makes use of eight threads, hence the requirement for the i7 and FX 8. At the same time, the FX 8350 is AMD's most common high-end CPU; you can't recommend anything better from AMD unless you start asking for FX 9000s, which are rare.

Not sure I am going to buy that this one game, out of countless others, has magically seen a performance increase to where 8 threads are needed all of a sudden.

#16  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@sSubZerOo said:

Not sure I am going to buy that this one game, out of countless others, has magically seen a performance increase to where 8 threads are needed all of a sudden.

Well, don't just ignore it; there are quite a few games that actually use more than 4 threads. While, yeah, there have been a handful of games that stated the need for i7s or FX 8s and then saw no real difference over i5s and AMD quads, the fact that this game is said to be getting DX12 support via a patch after it releases is a good sign that the game is already multithreaded.
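As a toy illustration of what "using more than 4 threads" means (illustrative names only, nothing from the game): split per-frame work across however many hardware threads the CPU reports. An FX 8350 or an i7 reports 8; an i5 reports 4 and simply runs the same chunks with fewer in flight at once. A real engine would use a persistent job system rather than spawning threads every frame.

    // Toy sketch: fan per-frame work out across all hardware threads.
    #include <thread>
    #include <vector>
    #include <cstdio>

    void simulate_chunk(int chunk) {
        // placeholder for one slice of per-frame work (AI, animation, particles...)
        std::printf("chunk %d done\n", chunk);
    }

    int main() {
        unsigned n = std::thread::hardware_concurrency(); // 8 on an FX 8350 / i7
        if (n == 0) n = 4; // the standard allows 0 when the count is unknown
        std::vector<std::thread> workers;
        for (unsigned i = 0; i < n; ++i)
            workers.emplace_back(simulate_chunk, static_cast<int>(i));
        for (auto& t : workers)
            t.join();
        return 0;
    }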

#17 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:

@sSubZerOo said:

Not sure I am going to buy that this one game, out of countless others, has magically seen a performance increase to where 8 threads are needed all of a sudden.

Well, don't just ignore it; there are quite a few games that actually use more than 4 threads. While, yeah, there have been a handful of games that stated the need for i7s or FX 8s and then saw no real difference over i5s and AMD quads, the fact that this game is said to be getting DX12 support via a patch after it releases is a good sign that the game is already multithreaded.

.... Yet again, the i5 2500k/3550k/4590k basically beat the FX 8350 in any game out there.. Significantly at times... While matching the i7 equivalent.. DX12 isn't going to change that; if anything, DX12 has shown that CPU power is going to be even less needed, with huge increases in performance..

#18  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@sSubZerOo said:

@04dcarraher said:

@sSubZerOo said:

Not sure I am going to buy that this one game, out of countless others, has magically seen a performance increase to where 8 threads are needed all of a sudden.

Well, don't just ignore it; there are quite a few games that actually use more than 4 threads. While, yeah, there have been a handful of games that stated the need for i7s or FX 8s and then saw no real difference over i5s and AMD quads, the fact that this game is said to be getting DX12 support via a patch after it releases is a good sign that the game is already multithreaded.

.... Yet again, the i5 2500k/3550k/4590k basically beat the FX 8350 in any game out there.. Significantly at times... While matching the i7 equivalent.. DX12 isn't going to change that; if anything, DX12 has shown that CPU power is going to be even less needed, with huge increases in performance..

Nope. It totally depends on the coding of the game and how it allocates CPU resources.

Take BF4 multiplayer as an example of a game that does make use of 8 threads and is much more CPU bound: the FX 8350 performed virtually the same as an i7 2600k, the i5 4670k performed below both of those CPUs, and it took an FX 9590 to perform close to an i7 4770k. Your idea that games that make use of 8 threads won't allow the FX 8s to perform on par with or better than any i5 is wrong (no overclocking).

Actually no; DX12 basically does the same things as Mantle, plus more. The few games that used Mantle did very little for Intel quad-core CPUs feeding the GPU data, while giving AMD CPUs a massive gain, allowing those FX 8s to perform on par with Intel CPUs. That's because, like DX12, Mantle lowered overhead on the CPU, freeing cycles to be used elsewhere or put back into the task at hand. It also allows all the cores/threads to feed data to the GPU, rather than relying on a single core for all the GPU communication, or having one core do most of it and defer some non-priority workloads to other cores.

Intel's current CPUs are more than 2x faster clock for clock and are able to feed frame data to the GPU much better than AMD's offerings. They are not bottlenecking the GPUs; this is why there are big leads over AMD CPUs in most games. DX12 changes the playing field for FX 6s and 8s, i3s, and AMD quad cores. The only real advantage Intel i5s or i7s will keep over FX 8s is in games that are very CPU bound with DX12 as their base, or that have to power multiple GPUs or stronger GPUs than we have now. The overall processing power of an FX 8 isn't far off from a 3rd-gen i5, and handling 8 threads also gives games more flexibility in assigning separate tasks to specific threads.
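(For what it's worth, the mechanism described above is visible in the D3D12 API itself: every thread can record its own command list, and the main thread submits them in one batch, so no single core carries all the GPU communication. A hypothetical sketch, with error handling and the actual draw calls omitted:)

    // Each worker records one slice of the frame; one batched submission follows.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>
    #pragma comment(lib, "d3d12.lib")
    using Microsoft::WRL::ComPtr;

    void RecordFrame(ID3D12Device* device, ID3D12CommandQueue* queue, int numThreads) {
        std::vector<ComPtr<ID3D12CommandAllocator>> allocs(numThreads);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(numThreads);
        std::vector<std::thread> workers;

        for (int i = 0; i < numThreads; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocs[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocs[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
            // Worker thread records the draw calls for its slice in parallel.
            workers.emplace_back([list = lists[i].Get()] {
                // ... record this slice's draw calls here ...
                list->Close();
            });
        }
        for (auto& t : workers) t.join();

        // One batched submission from the main thread.
        std::vector<ID3D12CommandList*> raw;
        for (auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }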

#19 harry_james_pot  Moderator
Member since 2006 • 11414 Posts

You know.. I bet that "980 for ultra" is for running the game at 4k or something stupid like that.

#20 KHAndAnime
Member since 2009 • 17565 Posts

@harry_james_pot said:

You know.. I bet that "980 for ultra" is for running the game at 4k or something stupid like that.

No way. I'll be surprised if it gets a solid framerate at 1440p. This game makes The Witcher 3 look a generation behind.

#21  Edited By Coseniath
Member since 2004 • 3183 Posts
@KHAndAnime said:

@harry_james_pot said:

You know.. I bet that "980 for ultra" is for running the game at 4k or something stupid like that.

No way. I'll be surprised if it gets a solid framerate at 1440p. This game makes The Witcher 3 look a generation behind.

Well, the newer Batman games have always looked beautiful while playing very well on the systems that were current at their release.

Even a mid-range GTX 260 could play the game smoothly (50fps avg) with 4xAA at high settings at higher than 1440p, in a year (2009) when 1080p was premium, lol, more like what 4K is now.

Judging by these facts, I wouldn't be surprised if the Ultra settings are for 4K.

#22 KHAndAnime
Member since 2009 • 17565 Posts

@Coseniath said:

Well, the newer Batman games have always looked beautiful while playing very well on the systems that were current at their release.

Even a mid-range GTX 260 could play the game smoothly (50fps avg) with 4xAA at high settings at higher than 1440p, in a year (2009) when 1080p was premium, lol, more like what 4K is now.

Judging by these facts, I wouldn't be surprised if the Ultra settings are for 4K.

I'd say that for the time each game was released, they required decently beefy rigs to max out and get a solid 60 FPS (high settings with PhysX off doesn't count), especially Arkham City. You're going to want SLI GTX 980s for this game at 4K (at minimum). I'll be really surprised if a single GTX 980 gets more than 30 FPS average at 4K. I don't even think I'd get more than 30 FPS in Arkham City at 4K with all the settings turned up...

#23  Edited By Lach0121
Member since 2007 • 11782 Posts

I don't think the Ultra settings are for 4K; I think that would need just a little more power. However, I am entertaining the idea that the "Ultra" settings have 1440p in mind. That would make more sense, but I would rather know for sure; maybe more info will be released on it soon.

#24  Edited By Coseniath
Member since 2004 • 3183 Posts
@KHAndAnime said:

I'd say that for the time each game was released, they required decently beefy rigs to max out and get a solid 60 FPS (high settings with PhysX off doesn't count), especially Arkham City. You're going to want SLI GTX 980s for this game at 4K (at minimum). I'll be really surprised if a single GTX 980 gets more than 30 FPS average at 4K. I don't even think I'd get more than 30 FPS in Arkham City at 4K with all the settings turned up...

I didn't mention PhysX because it wouldn't be fair to AMD GPUs. And as I said before, a sub-$200 GPU (I bought a new GTX 275 for $200 back then) could play the game at high settings with 4xAA at more than 1440p with only 1GB (!) of VRAM, while the mainstream back then was 1680x1050. I think that's something we rarely see in great eye-candy games.

PhysX destroys performance, so I wouldn't be surprised if it needs a lot of horsepower.

But in the end you might be right, and the Ultra settings could be for today's mainstream 1080p at 60FPS...

#25 horgen  Moderator
Member since 2006 • 127502 Posts

@04dcarraher said:
@horgen said:

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as do the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value. Take them with a grain of salt until the game has been tested and played.

Of course, but I wouldn't think they'd say 3GB without a reason. Doesn't matter anyway. I'm so far behind on games that my GTX 680 runs them well, even at 3440x1440, so by the time I get to newer games I will switch to a newer GPU.. Maybe even two.

#26 blangenakker
Member since 2006 • 3240 Posts

Would it be safe to say that they may have bloated the requirements?

#28  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@blangenakker said:

Would it be safe to say that they may have bloated the requirements?

Maybe. The game is using a heavily modified UE3 engine, and the difference between the GTX 660 and 760 is only around 20%. I'm thinking the requirements are based on a set resolution for each tier, along with increased settings like PhysX.

#29  Edited By deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:

@sSubZerOo said:

@04dcarraher said:

@sSubZerOo said:

Not sure I am going to buy that this one game, out of countless others, has magically seen a performance increase to where 8 threads are needed all of a sudden.

Well, don't just ignore it; there are quite a few games that actually use more than 4 threads. While, yeah, there have been a handful of games that stated the need for i7s or FX 8s and then saw no real difference over i5s and AMD quads, the fact that this game is said to be getting DX12 support via a patch after it releases is a good sign that the game is already multithreaded.

.... Yet again, the i5 2500k/3550k/4590k basically beat the FX 8350 in any game out there.. Significantly at times... While matching the i7 equivalent.. DX12 isn't going to change that; if anything, DX12 has shown that CPU power is going to be even less needed, with huge increases in performance..

Nope. It totally depends on the coding of the game and how it allocates CPU resources.

Take BF4 multiplayer as an example of a game that does make use of 8 threads and is much more CPU bound: the FX 8350 performed virtually the same as an i7 2600k, the i5 4670k performed below both of those CPUs, and it took an FX 9590 to perform close to an i7 4770k. Your idea that games that make use of 8 threads won't allow the FX 8s to perform on par with or better than any i5 is wrong (no overclocking).

Actually no; DX12 basically does the same things as Mantle, plus more. The few games that used Mantle did very little for Intel quad-core CPUs feeding the GPU data, while giving AMD CPUs a massive gain, allowing those FX 8s to perform on par with Intel CPUs. That's because, like DX12, Mantle lowered overhead on the CPU, freeing cycles to be used elsewhere or put back into the task at hand. It also allows all the cores/threads to feed data to the GPU, rather than relying on a single core for all the GPU communication, or having one core do most of it and defer some non-priority workloads to other cores.

Intel's current CPUs are more than 2x faster clock for clock and are able to feed frame data to the GPU much better than AMD's offerings. They are not bottlenecking the GPUs; this is why there are big leads over AMD CPUs in most games. DX12 changes the playing field for FX 6s and 8s, i3s, and AMD quad cores. The only real advantage Intel i5s or i7s will keep over FX 8s is in games that are very CPU bound with DX12 as their base, or that have to power multiple GPUs or stronger GPUs than we have now. The overall processing power of an FX 8 isn't far off from a 3rd-gen i5, and handling 8 threads also gives games more flexibility in assigning separate tasks to specific threads.

http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4 Not according to early DirectX 12 benchmarks.. In which 4-core and 6-core chips are neck and neck.. And it's only with extremely powerful cards like the 980 that you see a bottleneck on a TWO-core chip. Based on data like that, the recommendations should be even lower if this were a DX12 game, given the insanely high boosts in performance..

Right here we have Shadow of Mordor with similarly nonsensical requirements.. Oh look, an i3 is beating an FX 9590.. And an i5 4690k, when overclocked, beats everything.. http://www.tomshardware.com/reviews/shadow-of-mordor-performance,3996-4.html .. So an i3 is beating an even faster AMD chip than the one recommended.

#30  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@sSubZerOo said:

http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4 Not according to early DirectX 12 benchmarks.. In which 4-core and 6-core chips are neck and neck.. And it's only with extremely powerful cards like the 980 that you see a bottleneck on a TWO-core chip. Based on data like that, the recommendations should be even lower if this were a DX12 game, given the insanely high boosts in performance..

Right here we have Shadow of Mordor with similarly nonsensical requirements.. Oh look, an i3 is beating an FX 9590.. And an i5 4690k, when overclocked, beats everything.. http://www.tomshardware.com/reviews/shadow-of-mordor-performance,3996-4.html .. So an i3 is beating an even faster AMD chip than the one recommended.

You are not understanding: Intel CPUs see smaller improvements than AMD CPUs because they're already fast enough to feed the GPU frame data as needed. Even with two cores, it's only a 17% difference between 2 vs 4 vs 6 cores. An AMD CPU offering up to 8 cores will be able to perform as well as an Intel quad core at feeding data to the GPU. Like I said, unless the game is CPU bound, Intel quad cores are not going to outclass AMD FX 8s with a single GPU in DX12.

A prime example: Thief going from DX11 to Mantle allowed an i3 4130 to perform on par with an i7 4770k under DX11, and allowed an FX 8350 to perform on par with that same i7, while the i7 itself going from DX11 to Mantle only saw an average 5% gain.

Again, your example with Shadow of Mordor is still based on a flawed idea. Did you even look at the benchmarks? A GTX 980 on an FX 4170 vs an i5 4690k OC was a whopping 8 fps average difference and 6 fps minimum difference. All the CPUs used performed within a 10% range of each other, meaning the game didn't use 8 threads, the differences between quad core vs 6 vs 8 are slight, and the game relies on the DX11 method of a main core with deferred workloads feeding GPU data. Any 2nd-gen FX 4, 6, or 8 above 3.5GHz provided more than 90% average GPU usage with a 780 Ti, meaning Intel's single-core performance, being 2x faster per clock, allows 100% usage.

The only reason an i3 beats an AMD FX 8-core is a lack of proper CPU usage, i.e. not using more than 4 threads correctly and still feeding GPU data from a single thread/core.

#31  Edited By deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:

@sSubZerOo said:

http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4 Not according to early DirectX 12 benchmarks.. In which 4-core and 6-core chips are neck and neck.. And it's only with extremely powerful cards like the 980 that you see a bottleneck on a TWO-core chip. Based on data like that, the recommendations should be even lower if this were a DX12 game, given the insanely high boosts in performance..

Right here we have Shadow of Mordor with similarly nonsensical requirements.. Oh look, an i3 is beating an FX 9590.. And an i5 4690k, when overclocked, beats everything.. http://www.tomshardware.com/reviews/shadow-of-mordor-performance,3996-4.html .. So an i3 is beating an even faster AMD chip than the one recommended.

You are not understanding: Intel CPUs see smaller improvements than AMD CPUs because they're already fast enough to feed the GPU frame data as needed. Even with two cores, it's only a 17% difference between 2 vs 4 vs 6 cores. An AMD CPU offering up to 8 cores will be able to perform as well as an Intel quad core at feeding data to the GPU. Like I said, unless the game is CPU bound, Intel quad cores are not going to outclass AMD FX 8s with a single GPU.

Again, your example with Shadow of Mordor is still based on a flawed idea. Did you even look at the benchmarks? A GTX 980 on an FX 4170 vs an i5 4690k OC was a whopping 8 fps average difference and 6 fps minimum difference. All the CPUs used performed within a 10% range of each other, meaning the game didn't use 8 threads, the differences between quad core vs 6 vs 8 are slight, and the game relies on the DX11 method of a main core with deferred workloads feeding GPU data. Any FX 4, 6, or 8 above 3.5GHz provided more than 90% GPU usage with a 780 Ti, meaning Intel's single-core performance, being 2x faster per clock, allows 100% usage.

.. Less improvement.. yet still surpassing the AMD chips in both current-gen games and early DirectX 12 benchmarks.. The fact of the matter is we aren't going to see some magical change where suddenly AMD chips are more competitive.. These chips are not new; they are years old.. Basically all you're pointing to is baseless assumption, when basically everything we've seen runs contrary to that point.. The only thing I stated is that these requirements and recommendations are not accurate; they make little to no sense.. Mordor has the i3 surpassing the 8350, a recommended chip, in performance.. And the i3 gets crushed by i5s, especially the K editions that are overclocked..

This isn't the only game that does this.. The Evil Within is another game that has a "minimum requirement" of an i7 or equivalent..

And wouldn't you know

.. Another low Sandy Bridge i3 beating the 8350.. There we can see a stock 4670k matching the i7 5960x.. And a 2500k overclocked to 4.5GHz WILL outperform a stock 4670k.. My point still stands: developers have no idea what they're talking about. For half a year now, a dozen AAA dev studios have been releasing hilariously inflated requirements and recommendations.. By their minimum requirements alone, an i3 2100 shouldn't even be able to play the game, let alone surpass the 8350.. The 8350 isn't exactly new tech; developers have had years to work with it, and they haven't.. And thus far, early DX12 benchmarking has shown 4 and 6 cores operating very similarly.. Hell, even the dual-core i3s are working amazingly well in DX12..

#32  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@sSubZerOo:

Quit ignoring the facts..... You keep using examples of games that only make good use of 4 threads or less, and/or still use the DX9/11 model of one-core-to-GPU communication.

BF4 MP with proper DX11 deferred workloads:

The fact is that one core from a 2nd-gen iCore or newer is around 2x or a bit more faster clock per clock; that is why an i3 is able to outclass an AMD CPU core, and why you see an FX 9590 performing under a 3.5GHz i3. The CPU cannot feed the GPU its data as fast as it needs it; that is why you see lower performance. You can ignore DX12 or Mantle all you want, but the fact is that any current GPU will be properly fed by an FX eight-core CPU, and the lead Intel has in games will be on much more level ground. And before you label me an AMD fanboy, I'm currently running an i5 4690k with a GTX 970.....

#33  Edited By deactivated-59d151f079814
Member since 2003 • 47239 Posts

@04dcarraher said:

@sSubZerOo:

Quit ignoring the facts..... You keep using examples of games that only make good use of 4 threads or less, and/or still use the DX9/11 model of one-core-to-GPU communication.

The fact is that one core from a 2nd-gen iCore or newer is around 2x or a bit more faster clock per clock; that is why an i3 is able to outclass an AMD CPU core, and why you see an FX 9590 performing under a 3.5GHz i3. It would take a 6.5-7GHz AMD CPU to match a 3rd/4th-gen 3.5GHz Intel core for core. The CPU cannot feed the GPU its data as fast as it needs it; that is why you see lower performance. You can ignore DX12 or Mantle all you want, but the fact is that any current GPU will be properly fed by an FX eight-core CPU, and the lead Intel has in games will be on much more level ground.

I just posted benchmarks of a DX12 test showing the performance of a 4-core and a 6-core processor with a 980 as neck and neck, with an i3 still performing quite well, though not as well as those. I then posted numerous recent games showing how the CPU requirements make absolutely no sense, and this game falls directly in line with that; the new Batman and The Witcher 3 are not going to be somehow magically different from this progression. Yet again, these CPU requirements and recommendations are silly and make little to no sense based on actual performance, especially when you consider that these are MULTIPLAT GAMES, developed for the much weaker PS4 and Xbox One CPUs, that other games released around this time carry equally silly CPU requirements that don't hold up, and that the real-world DX12 FPS benchmark from Tom's Hardware showed the 4-core and 6-core neck and neck.

#34  Edited By Gatygun
Member since 2010 • 2709 Posts

Nice, a DX12 patch; this means AMD GPUs will get a noticeable performance boost. Good to see.

@horgen said:
@04dcarraher said:
@horgen said:

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as do the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value. Take them with a grain of salt until the game has been tested and played.

Of course, but I wouldn't think they'd say 3GB without a reason. Doesn't matter anyway. I'm so far behind on games that my GTX 680 runs them well, even at 3440x1440, so by the time I get to newer games I will switch to a newer GPU.. Maybe even two.

3GB is what's needed these days for taxing new games; it's there for a reason. A GPU powerful enough to tax that 3GB also has to have the horsepower behind it to push itself, especially when the game is open world with heavy GPU load; the i7's extra threads over the i5 will show their use for sure.

I think the requirements are spot on; I just don't see a single 980 pushing 1080p on ultra with this game.

I think you are better off selling your 680 2GB for about 100-120 euros, adding another 100, and buying a 290 for 200 euros second-hand with warranty. I did the same with my 580; best upgrade so far.

With DX12 that 290 will push 970/980 numbers for sure, and on the CPU side, well, if you've got any i7 you are completely set with DX12.

#35  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@sSubZerOo said:

@04dcarraher said:

@sSubZerOo:

Quit ignoring the facts..... You keep using examples of games that only make good use of 4 threads or less, and/or still use the DX9/11 model of one-core-to-GPU communication.

The fact is that one core from a 2nd-gen iCore or newer is around 2x or a bit more faster clock per clock; that is why an i3 is able to outclass an AMD CPU core, and why you see an FX 9590 performing under a 3.5GHz i3. It would take a 6.5-7GHz AMD CPU to match a 3rd/4th-gen 3.5GHz Intel core for core. The CPU cannot feed the GPU its data as fast as it needs it; that is why you see lower performance. You can ignore DX12 or Mantle all you want, but the fact is that any current GPU will be properly fed by an FX eight-core CPU, and the lead Intel has in games will be on much more level ground.

I just posted benchmarks of a DX12 test showing the performance of a 4-core and a 6-core processor with a 980 as neck and neck, with an i3 still performing quite well, though not as well as those. I then posted numerous recent games showing how the CPU requirements make absolutely no sense, and this game falls directly in line with that; the new Batman and The Witcher 3 are not going to be somehow magically different from this progression. Yet again, these CPU requirements and recommendations are silly and make little to no sense based on actual performance, especially when you consider that these are MULTIPLAT GAMES, developed for the much weaker PS4 and Xbox One CPUs, that other games released around this time carry equally silly CPU requirements that don't hold up, and that the real-world DX12 FPS benchmark from Tom's Hardware showed the 4-core and 6-core neck and neck.

Again, you're missing the point about DX12. The reason there is no difference between 4 and 6 cores on an Intel CPU is that 4 cores are more than enough to feed the GPU all the data it needs when it needs it. Hell, even two cores were only 17% slower than 4, which means three cores could do the job. That in turn means using 6-8 cores on an FX CPU will let it perform as well as an Intel quad core at feeding data. Right now, with DX11, Intel CPUs have an advantage because of their per-clock performance gains and because the majority of games do not make full use of 8 threads.

The problem is that it totally depends on the coding of the game and how the devs allocate CPU resources based on its needs. You are mixing up games with low CPU usage and games with overestimated requirements. Also, do not ignore that both series, Batman and The Witcher, have always been good on PC, with extra features. So stating that because they're multiplat games, and because the X1 and PS4 have weak CPUs, the CPU requirements must be overboard is just plain guessing. The fact is the consoles need to use all their CPU cores to get by....

The fact that the CPU requirements for high and ultra are the same, even on the Intel side, again means one thing: 8 threads are going to be used; to what degree, we do not know. The same applies to The Witcher 3. But the news that both games will be getting a DX12 patch suggests these games are already multithreaded, and porting to the high-level coding of DX12, aka DX11.3, will be simple. CPU-to-GPU gains will be seen even with AMD CPUs.

#36  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Gatygun said:

Nice, a DX12 patch; this means AMD GPUs will get a noticeable performance boost. Good to see.

@horgen said:
@04dcarraher said:
@horgen said:

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as do the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value. Take them with a grain of salt until the game has been tested and played.

Of course, but I wouldn't think they'd say 3GB without a reason. Doesn't matter anyway. I'm so far behind on games that my GTX 680 runs them well, even at 3440x1440, so by the time I get to newer games I will switch to a newer GPU.. Maybe even two.

3GB is what's needed these days for taxing new games; it's there for a reason. A GPU powerful enough to tax that 3GB also has to have the horsepower behind it to push itself, especially when the game is open world with heavy GPU load; the i7's extra threads over the i5 will show their use for sure.

I think the requirements are spot on; I just don't see a single 980 pushing 1080p on ultra with this game.

I think you are better off selling your 680 2GB for about 100-120 euros, adding another 100, and buying a 290 for 200 euros second-hand with warranty. I did the same with my 580; best upgrade so far.

With DX12 that 290 will push 970/980 numbers for sure, and on the CPU side, well, if you've got any i7 you are completely set with DX12.

Don't forget the gains the 970/980 will get on top of what the 290 series will see with DX12. Right now, with DX9/11, the 970 equals the 290x at 1080p, and at 1440p the 970 sits between the 290 and 290x overall.

#37  Edited By Gatygun
Member since 2010 • 2709 Posts

@04dcarraher:

@04dcarraher said:

@Gatygun said:

Nice, a DX12 patch; this means AMD GPUs will get a noticeable performance boost. Good to see.

@horgen said:
@04dcarraher said:
@horgen said:

GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but it has more horsepower than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as do the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value. Take them with a grain of salt until the game has been tested and played.

Of course, but I wouldn't think they'd say 3GB without a reason. Doesn't matter anyway. I'm so far behind on games that my GTX 680 runs them well, even at 3440x1440, so by the time I get to newer games I will switch to a newer GPU.. Maybe even two.

3GB is what's needed these days for taxing new games; it's there for a reason. A GPU powerful enough to tax that 3GB also has to have the horsepower behind it to push itself, especially when the game is open world with heavy GPU load; the i7's extra threads over the i5 will show their use for sure.

I think the requirements are spot on; I just don't see a single 980 pushing 1080p on ultra with this game.

I think you are better off selling your 680 2GB for about 100-120 euros, adding another 100, and buying a 290 for 200 euros second-hand with warranty. I did the same with my 580; best upgrade so far.

With DX12 that 290 will push 970/980 numbers for sure, and on the CPU side, well, if you've got any i7 you are completely set with DX12.

Don't forget the gains the 970/980 will get on top of what the 290 series will see with DX12. Right now, with DX9/11, the 970 equals the 290x at 1080p, and at 1440p the 970 sits between the 290 and 290x overall.

AMD's DX11 API is shit; it eats a ton of CPU performance. If you run a 290 on a weak CPU, against an Nvidia card that pushes the same performance as the 290 on that same weak CPU, your 290's performance takes a massive hit, especially when that CPU is also running a game that demands a lot of CPU performance. This is a massively overlooked thing with AMD GPUs in DX11 API land.

Nvidia has their API far better optimized. The reason for this is that AMD went the Mantle way (and now its spiritual successor, DX12, which they have already been working with for a year). Nvidia lags behind on their DX12 development while AMD is basically in front of it.

This means a massive turnaround: CPU demand will be far lower for AMD cards, and games that require a lot of CPU performance won't tank the fps as hard when the GPU needs to flush data through it.

Nvidia, on the other hand, has a well-optimized DX11 API; their somewhat-worse-than-AMD driver solution for DX12 won't bring them much of a leap at all, as their hardware is already functioning as it should. On an i3 you already get 95% of a 960's performance, while with AMD's 280 you won't even get 60% of its performance on that i3. This scales all the way up.

At higher resolutions AMD and Nvidia match more closely, because the CPU overhead makes less of a difference.

Basically, you need a shit-tier CPU for your GPU to be limited by Nvidia's DX11 API, while with AMD you need pretty much the strongest CPU available to not have a bottleneck through their DX11 API. It's a massive difference.

About your 970: the 290x has a ton of stream processors and other units, something the 970 lacks in comparison. It brute-forces performance at the moment because AMD's APIs are that bad; the 290s and 290x's basically throw a ton of hardware at a software deficit, while the 970 and 980 are more midrange cards sold as high-end solutions: less hardware with a more efficient architecture and better DX11 drivers. The reason for this is that AMD doesn't care about DX11 anymore and has moved on to its DX12 solution, which its 300-series cards are also based on. Look at the huge number of stream processors they're rumored to have. They will push past Titan X levels, because that's basically Nvidia's 290x PCB layout.

The 290x will be a lot closer to the 970 and 980 in DX12 than you would think.

#38 RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:

I just read it too.

So the Ultra requirements call for 3GB of VRAM and the GPU power of a GTX 980.

I can't wait to see the in-game graphics. :D

Also from Gamepur:

Batman: Arkham Knight and The Witcher 3 Will Get DX12 Support via Patch, No Details On Xbox One Version

The Witcher 3: Wild Hunt and Batman: Arkham Knight on PC will support Microsoft's upcoming API, DirectX 12, according to new details shared by MSI, the well-known motherboard and graphics card firm. The confirmation came from a new report posted today on MSI's official website. It reads:

"The most exciting part of Windows 10 for gamers is the introduction of DirectX 12. You are likely to start seeing the benefits of the new graphics technology already in The Witcher 3, Batman: Arkham Knight and more games released later this year. Some CPU-bound games like MMOs are able to get a performance bump of up to 50% according to Microsoft."

Microsoft has not issued any confirmation yet, so take it with a grain of salt and consider it a rumor for the time being. We have contacted our sources at Microsoft for official word and will update this post as soon as we hear something from them.

According to rumors on the internet, Windows 10 is scheduled to arrive in July 2015, by which time both of these games will already have been released. Hence the statement on MSI's official website hints that DX12 support would be added via a post-release patch.

So DX12 for Batman and Witcher!!!!

So if they bring patches with DX12, will that mean they support RAM stacking, so that in SLI 2x4GB is actually 8GB instead of just being mirrored at 4GB?

#39  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Gatygun:

Your whole idea there is a bit off...

AMD's DX11 optimization isn't the best, but performance is affected by weaker CPUs with Nvidia too. If you're using any semi-modern Intel CPU, the differences are small.

Even with Mantle, the 290x still can't outright cream a 970.

With Thief (the only known Mantle game that used async shaders), on an i7 4770k at 4.4GHz, Mantle vs DX11 benchmark:

290x with DX11: min 55 / avg 99; Mantle: 67/122. GTX 970 with DX11: 54/104; GTX 980: 67/116. Mantle with proper multithreaded CPU support only gave the 290x a 20-25% lead over a 970 using DX11.

DX12 and Mantle share many of the same features, which is why they perform very close to each other. Even with Mantle, the 290x in Star Swarm was still nearly 50% behind the 980. The difference between the 980 on Nvidia's DX11 and the 290x on Mantle was around 30%, while DX11 to DX12 on a DX12 card was a 250% gain. You cannot overlook the ability to drive a GPU natively with the API it was designed around; you just get more out of it. And since we know the 970 is only around 15% slower than the 980, we can conclude that the 970 will have a lead over the 290x in 1080/1440 gaming.

The major problem with your claim that the 290x has to brute-force past its poor AMD DX11 drivers.... is that calling the 290x a high-end GPU while the 970/980 are midrange screams AMD bias big time. Now, AMD's 300 series has been reworked and tweaked at the hardware level for DX12, and we will see the 380x's and 290x's DX12 performance differ even though they are pretty much the same GPU. The 970 will perform better than the 290x in DX12; DX11-based cards do not get all the low-level coding and features of DX12.

#40 digitm64
Member since 2013 • 470 Posts

I'm pretty sure my 780 Ti will run this on Ultra. Who's to say Rocksteady aren't in bed with Nvidia to try to sell more GTX 980s?

#41 KHAndAnime
Member since 2009 • 17565 Posts

@digitm64 said:

I'm pretty sure my 780 Ti will run this on Ultra. Who's to say Rocksteady aren't in bed with Nvidia to try to sell more GTX 980s?

Looks like you'll have to drop some settings down a tick or two ;)

#42  Edited By Coseniath
Member since 2004 • 3183 Posts
@RyviusARC said:

So if they bring patches with DX12, will that mean they support RAM stacking, so that in SLI 2x4GB is actually 8GB instead of just being mirrored at 4GB?

Wow. I totally forgot that. I guess it will work that way.

That will remove any VRAM bottlenecks for people with SLI and CF.

The people who will benefit most are those who paired GTX 680/770s and GTX 780/Tis in SLI, whose combined GPU power might struggle against the VRAM limit in a lot of new games.

But you have to use Windows 10 too.

I don't know how many people will use Windows 10 in the first days of release :P.
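(The DX12 mechanism behind the stacking idea is explicit multi-adapter: a linked SLI pair appears as one device with two nodes, and each resource is created on an explicit node instead of being mirrored automatically. A hypothetical sketch, error handling omitted; whether any given game actually uses it is entirely up to its developers:)

    // Show the node count of the default adapter and place a heap on node 1.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")
    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            return 1;

        UINT nodes = device->GetNodeCount(); // 2 for a linked SLI pair
        std::printf("GPU nodes: %u\n", nodes);

        // A heap placed explicitly on node 1 (the second GPU): capacity that a
        // mirrored DX11 SLI setup could never use as "extra" memory.
        D3D12_HEAP_PROPERTIES props = {};
        props.Type = D3D12_HEAP_TYPE_DEFAULT;
        props.CreationNodeMask = 1u << 1;       // create on node 1
        props.VisibleNodeMask = (1u << 1) | 1u; // visible to both nodes
        // ... device->CreateCommittedResource(&props, ...) would allocate there ...
        return 0;
    }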

edit: @04dcarraher: I've been reading that GameGPU.ru site for some time, and their results always differ from other sites'.

Most of the time in favor of AMD. They tested BF4 on a campaign map, yet they didn't test on a heavy multiplayer map, which is the reason most people buy BF4 (multiplayer).

People with AMD CPUs were complaining about lower performance in multiplayer.

Another site did multiplayer tests on a 64-player map, where the CPU matters more:

As you can see, the results differ...

ps: I posted Battlefield Hardline results because BF4 is old and might not have as much multithreaded performance (I can still post BF4 results too, but they will be nearly the same...).

#43 RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:
@RyviusARC said:

So if they bring patches with DX12, will that mean they support RAM stacking, so that in SLI 2x4GB is actually 8GB instead of just being mirrored at 4GB?

Wow. I totally forgot that. I guess it will work that way.

That will remove any VRAM bottlenecks for people with SLI and CF.

The people who will benefit most are those who paired GTX 680/770s and GTX 780/Tis in SLI, whose combined GPU power might struggle against the VRAM limit in a lot of new games.

But you have to use Windows 10 too.

I might set up a dual boot of Windows 7 and Windows 10.

And only use Windows 10 for DX12 games, because it will most likely be unstable for older games at release.

#44  Edited By Gatygun
Member since 2010 • 2709 Posts

@RyviusARC said:

@Coseniath said:

I just read it too.

So the Ultra requirements call for 3GB of VRAM and the GPU power of a GTX 980.

I can't wait to see the in-game graphics. :D

Also from Gamepur:

Batman: Arkham Knight and The Witcher 3 Will Get DX12 Support via Patch, No Details On Xbox One Version

The Witcher 3: Wild Hunt and Batman: Arkham Knight on PC will support Microsoft's upcoming API, DirectX 12, according to new details shared by MSI, the well-known motherboard and graphics card firm. The confirmation came from a new report posted today on MSI's official website. It reads:

"The most exciting part of Windows 10 for gamers is the introduction of DirectX 12. You are likely to start seeing the benefits of the new graphics technology already in The Witcher 3, Batman: Arkham Knight and more games released later this year. Some CPU-bound games like MMOs are able to get a performance bump of up to 50% according to Microsoft."

Microsoft has not issued any confirmation yet, so take it with a grain of salt and consider it a rumor for the time being. We have contacted our sources at Microsoft for official word and will update this post as soon as we hear something from them.

According to rumors on the internet, Windows 10 is scheduled to arrive in July 2015, by which time both of these games will already have been released. Hence the statement on MSI's official website hints that DX12 support would be added via a post-release patch.

So DX12 for Batman and Witcher!!!!

So if they bring patches with DX12, will that mean they support RAM stacking, so that in SLI 2x4GB is actually 8GB instead of just being mirrored at 4GB?

Mantle is already capable of this, and as DX12 is an evolved Mantle and will probably be the main API within a year, I could see this happening. It will still have some overhead, but to what extent is unknown at the moment. It will probably happen, but when, for what games, and with what trade-offs is unknown as of yet.

@04dcarraher said:

@Gatygun:

Your whole idea there is a bit off...

AMD's DX11 optimization isn't the best, but performance is affected by weaker CPUs with Nvidia too. If you're using any semi-modern Intel CPU, the differences are small.

Even with Mantle, the 290x still can't outright cream a 970.

With Thief (the only known Mantle game that used async shaders), on an i7 4770k at 4.4GHz, Mantle vs DX11 benchmark:

290x with DX11: min 55 / avg 99; Mantle: 67/122. GTX 970 with DX11: 54/104; GTX 980: 67/116. Mantle with proper multithreaded CPU support only gave the 290x a 20-25% lead over a 970 using DX11.

DX12 and Mantle share many of the same features, which is why they perform very close to each other. Even with Mantle, the 290x in Star Swarm was still nearly 50% behind the 980. The difference between the 980 on Nvidia's DX11 and the 290x on Mantle was around 30%, while DX11 to DX12 on a DX12 card was a 250% gain. You cannot overlook the ability to drive a GPU natively with the API it was designed around; you just get more out of it. And since we know the 970 is only around 15% slower than the 980, we can conclude that the 970 will have a lead over the 290x in 1080/1440 gaming.

The major problem with your claim that the 290x has to brute-force past its poor AMD DX11 drivers.... is that calling the 290x a high-end GPU while the 970/980 are midrange screams AMD bias big time. Now, AMD's 300 series has been reworked and tweaked at the hardware level for DX12, and we will see the 380x's and 290x's DX12 performance differ even though they are pretty much the same GPU. The 970 will perform better than the 290x in DX12; DX11-based cards do not get all the low-level coding and features of DX12.

I never said that the 290/ 290x would be equal towards 970's or 980's. The cards have a better architecture then 290/290x while the 290/290x brute forces through hardware on there cards. This is why 290's on a 1100 clock ( easily hitable with pretty much all 290's ) will push equal numbers as a stock 970. The reason a 290 can even come close towards let alone equal a new gen cards is because the 900 series released by nvidia top models are nothing but midrange cards you can see this at the size of the cards and what they push on it on cuda cores.

DX12 will bring stock 970's on equal food towards 290x/290's oc'ed editions. 980 will still walk away from 290/290x, titan x is a whole different ball game.

I'm not really interested in thiefs performance with mantle it isn't really a good implementend api for that game specifically, it's a butchered one. You are better off using battlefield 4 numbers as it's the only game that really pushes mantle as it should, even while it's a outdated driver version of amd's mantle.

I don't see 290x/290's push past 970 performance with dx12, it will equal it at tops. ( unless nvidia really messes up on there dx12 drivers ).

I do see 290x/290s getting a noticeable performance boost over what we have now relative to Nvidia in current and upcoming DX11 games.

About the "AMD bias":

The 290x and 290 won't overtake the 970 because it's a newer-gen card. It's that simple. Both the 970 and 980 are mid-range new-gen cards, and this explains why the 290/290x still compete after all this time in an artificially made "high-end class" born of no competition. The Titan X is the new 580 (as a comparison, back when The Witcher 2 came out). There is no denying this if you look at its PCB and compare it to the 900-series / 290 cards.

Every one of the 290/290x/970/980 is considered high end by consumers, as only a small percentage of people have PCs with that horsepower in them.

It's not so hard to see this; if you need to play the AMD-bias card because you don't like the results, well, I don't know what to say to you.

@RyviusARC said:

@Coseniath said:
@RyviusARC said:

So if they bring DX12 patches, will that mean they support RAM stacking, so that in SLI 2x4GB is actually 8GB instead of just mirroring it and being 4?

Wow. I totally forgot that. I guess it will work that way.

That will remove any VRAM bottlenecks for people with SLI and CF.

The people who will benefit most from this are those who used GTX 680/770 and GTX 780/Ti cards for SLI, where the combined GPU power is there but the VRAM limit might be a struggle in a lot of new games.

But you have to use Windows 10 too.

I might make a dual boot of Windows 7 and Windows 10.

And only use Windows 10 for the DX12 games because it will most likely be unstable for older games at release.

I would heavily advise people not to update to a new Windows release before Service Pack 1 is out. But if they can't resist, your solution is the best one for sure.

Avatar image for 04dcarraher
04dcarraher

23829

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#45  Edited By 04dcarraher
Member since 2004 • 23829 Posts

@Gatygun

Here is the problem with that: the GTX 970, even at stock, performs better than the 290x in DX11 at 1080p and matches it at 1440p. So implying that DX12 would only bring a "stock" 970 on par with the 290x at those resolutions is again wrong. Non-vanilla 970s are nearly equal to 980s, and ones clocked beyond 1.2GHz are on par with them. Stock 970s are on average only 15% slower, and non-vanilla 970s are under 10%.

Actually, no: BF4 uses the older version of Mantle, while Thief uses a more updated version with async shaders that BF4 lacks. And while Thief may not be as demanding on the CPU as BF4, the game still makes use of 8 threads. The results also aren't CPU-bound, with the CPU unable to feed the GPUs data. Which shows that even with Mantle, the 290x is not as high end as you think.

Again, suggesting that the 290 series is high end while the 970/980 are mid-range is full of bologna; I think your idea of what is mid-range and what is high end is a bit off. The Titan X is only 20-30% faster than the 980 depending on resolution, yet it costs around $500 more than the 980 and around $700 more than the 970, for a measly 33% performance gain over the 970 at 1440p. That's not a high-end GPU, that's a high-end rip-off.

Also, the Titan series is not meant solely for gaming. Even AMD's dual-GPU card, the 295x2, tends to be less than 30% faster than the GTX 980 at 1440p.
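Putting rough prices on that argument makes the point concrete. The launch MSRPs below are my assumption (roughly $330 for the GTX 970 and $999 for the Titan X; they are consistent with the "$700 more" quoted above), and the 1.33x relative performance is the figure claimed in the post:

```cpp
#include <cstdio>

int main() {
    // Assumed launch MSRPs (not from the thread): GTX 970 ~$330, Titan X ~$999.
    // Relative 1440p performance as claimed above: Titan X ~1.33x a GTX 970.
    const double price970 = 330.0, priceTitanX = 999.0;
    const double perf970  = 1.00,  perfTitanX  = 1.33;

    printf("GTX 970: %.4f perf per dollar\n", perf970    / price970);    // ~0.0030
    printf("Titan X: %.4f perf per dollar\n", perfTitanX / priceTitanX); // ~0.0013
}
```

On those numbers the Titan X delivers less than half the performance per dollar of a 970, which is the "rip-off" point in concrete terms.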

Avatar image for horgen
horgen

127502

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#46 horgen  Moderator
Member since 2006 • 127502 Posts

@Gatygun said:

Nice DX12 patch; this means AMD GPUs will get a noticeable performance boost. Good to see.

@horgen said:
@04dcarraher said:
@horgen said:

A GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but I've got more horsepower in it than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as with the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value, take them with a grain of salt until the game has been tested and played.

Of course, but I wouldn't think they'd say 3GB without a reason. It doesn't matter anyway; I'm so far behind on games that my GTX 680 runs well, even at 3440x1440, so by the time I get to newer games I'll switch to a newer GPU... maybe even two.

3GB is what's needed these days for taxing new games; it's there for a reason. A GPU powerful enough to fill that 3GB also has to have the horsepower behind it to push itself, especially in open-world games with heavy GPU load, where an i7's extra threads over an i5 will show their use for sure.

I think the requirements are spot on; I just don't see a single 980 pushing 1080p on ultra in this game.

I think you're better off selling your 2GB 680 for about 100-120 euros, adding another 100, and buying a second-hand 290 with warranty for 200 euros. I did the same with my 580; best upgrade so far.

With DX12 that 290 will push 970/980 numbers for sure, and as for the CPU department, if you've got any i7 you are completely set with DX12.

I'm not buying another 28nm-process GPU. Besides, I have a huge backlog of games my GTX 680 plays at high to ultra... even at 3440x1440. Batman AA runs fine on it, and I expect Batman AC to run well too. Dirt Showdown runs fine, and I expect Dirt 3 to run fine as well.

When I upgrade, I'll get two for SLI or CrossFire. Perhaps not right away, as waterblocks and extra radiator space will also add to the cost.

As for the CPU... my i5 3570K runs at 4.5GHz, and I'm not upgrading before I see a noticeable performance increase on offer. I don't expect to wait until there's another i5 with twice the power mine has. At the moment the only options that give any significant performance increase are the X99 line, but they cost too much. How much of a boost a simple i7 gives over an i5 with DX12... well, I'll wait and see before I decide what to upgrade to in terms of CPU.

Avatar image for attirex
attirex

2448

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#47  Edited By attirex
Member since 2007 • 2448 Posts

My body, mind, and dual 970s are ready.

Avatar image for Gatygun
Gatygun

2709

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#48 Gatygun
Member since 2010 • 2709 Posts

@04dcarraher said:

@Gatygun

Here is the problem with that: the GTX 970, even at stock, performs better than the 290x in DX11 at 1080p and matches it at 1440p. So implying that DX12 would only bring a "stock" 970 on par with the 290x at those resolutions is again wrong. Non-vanilla 970s are nearly equal to 980s, and ones clocked beyond 1.2GHz are on par with them. Stock 970s are on average only 15% slower, and non-vanilla 970s are under 10%.

Actually, no: BF4 uses the older version of Mantle, while Thief uses a more updated version with async shaders that BF4 lacks. And while Thief may not be as demanding on the CPU as BF4, the game still makes use of 8 threads. The results also aren't CPU-bound, with the CPU unable to feed the GPUs data. Which shows that even with Mantle, the 290x is not as high end as you think.

Again, suggesting that the 290 series is high end while the 970/980 are mid-range is full of bologna; I think your idea of what is mid-range and what is high end is a bit off. The Titan X is only 20-30% faster than the 980 depending on resolution, yet it costs around $500 more than the 980 and around $700 more than the 970, for a measly 33% performance gain over the 970 at 1440p. That's not a high-end GPU, that's a high-end rip-off.

Also, the Titan series is not meant solely for gaming. Even AMD's dual-GPU card, the 295x2, tends to be less than 30% faster than the GTX 980 at 1440p.

I can't recall the exact numbers anymore, but a stock 970 seems to be about 10-15% ahead of a 290 at its ~950MHz core, I believe, and an 1100MHz core is pretty much 10-15% faster (1100 over 950 is roughly a 16% clock bump). This is why I compared an OC'd 290 to a stock 970 and not an OC'd 970 to an OC'd 290; an OC'd 970 will walk away from a 290, obviously. (The 290 and 290x are practically equal, a 3-5% difference, so no biggie there.)

About the 1080p vs 1440p discussion: the reason 1440p is less of a bottleneck is that CPU overhead starts to matter less. To give you an example of how bad AMD's DX11 API is, here's a simple test:

As you can see, a 280 gets bottlenecked because the CPU overhead is massive, while Nvidia simply pushes far better frames even on far lower-end CPUs. Here's another example that is a bit more complex:

Now, don't bother with the average fps results or whatever else; they're not interesting. Just look at the minimums.

2x 780 SLI pushes a 233 minimum at best; that minimum is reached on a 4770K.

2x 290x pushes a 182 minimum at best; that minimum is only reachable on a 5960X (the fastest CPU around, from what I remember).

Now look at the i3 4330 on the 780 SLI: it still pushes a 203 minimum out of the 233 maximum.

The same i3 4330 on the 290x setup pushes only 111 of the 182 minimum.

Nvidia GPUs work better on weaker CPUs; that's the garbage DX11 API I was talking about.
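To put those minimums in proportion, here's the same data normalized (the numbers are just the ones quoted above, so treat them as the poster's, not verified):

```cpp
#include <cstdio>

int main() {
    // Quoted minimum fps: 780 SLI 233 (4770K) vs 203 (i3 4330);
    //                     290x CF 182 (5960X) vs 111 (i3 4330).
    printf("780 SLI keeps %.0f%% of its best minimum on the i3\n",
           203.0 / 233.0 * 100); // ~87%
    printf("290x CF keeps %.0f%% of its best minimum on the i3\n",
           111.0 / 182.0 * 100); // ~61%
}
```

On those figures the Nvidia setup loses far less of its minimum frame rate on a weak CPU, which is the driver-overhead point being made here.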

1080p, like I already discussed, is more taxing for AMD because of the DX11 API. This will get solved with Mantle/DX12, but for now we are still stuck with 9 out of 10 games being DX11 (which is soon going to change).

Higher resolutions rely more on the raw hardware on the GPU to crunch the numbers. This is why at higher resolutions the 900 series falls short compared to a Titan / the 290s; it simply doesn't have the hardware on it to push them (if you take the DX11 layer out of the picture).

A good example here with Mantle: 8GB versions are actually a tiny bit slower than 4GB versions, and BF4 will hit about 3200MB max VRAM usage with AA.

The 980s with DX12 will probably have the edge on the 290x, though.

BF4, like I said, is the better game to compare with; Thief is not, because it hardly pushes CPUs the way BF4 does. I basically agree with you there, so I don't know why we're arguing that point.

About mid vs high range: I've explained it twice now and I don't know what else to say to you on that front. It is what it is.

A 30% performance increase = a new generation these days. Most mid-range cards are rebadged older cards: 770 = 680, 660 = 580, 7950 = 280, 7970 = 280x, etc.

Avatar image for Gatygun
Gatygun

2709

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

#49 Gatygun
Member since 2010 • 2709 Posts

@horgen said:

@Gatygun said:

Nice DX12 patch; this means AMD GPUs will get a noticeable performance boost. Good to see.

@horgen said:
@04dcarraher said:
@horgen said:

A GTX 760 for recommended... I don't have 3GB of VRAM on my GTX 680, but I've got more horsepower in it than a 760. Oh well, I'll finish a few other games and upgrade my GPU before I play this game.

The GTX 760 series only comes in two flavors, 2GB and 4GB, as with the 670s and 680s, and 2GB is the most common version of all those GPUs. Just wait and see; you can't take system requirements at face value, take them with a grain of salt until the game has been tested and played.

Of course, but I wouldn't think they'd say 3GB without a reason. It doesn't matter anyway; I'm so far behind on games that my GTX 680 runs well, even at 3440x1440, so by the time I get to newer games I'll switch to a newer GPU... maybe even two.

3GB is what's needed these days for taxing new games; it's there for a reason. A GPU powerful enough to fill that 3GB also has to have the horsepower behind it to push itself, especially in open-world games with heavy GPU load, where an i7's extra threads over an i5 will show their use for sure.

I think the requirements are spot on; I just don't see a single 980 pushing 1080p on ultra in this game.

I think you're better off selling your 2GB 680 for about 100-120 euros, adding another 100, and buying a second-hand 290 with warranty for 200 euros. I did the same with my 580; best upgrade so far.

With DX12 that 290 will push 970/980 numbers for sure, and as for the CPU department, if you've got any i7 you are completely set with DX12.

I'm not buying another 28nm-process GPU. Besides, I have a huge backlog of games my GTX 680 plays at high to ultra... even at 3440x1440. Batman AA runs fine on it, and I expect Batman AC to run well too. Dirt Showdown runs fine, and I expect Dirt 3 to run fine as well.

When I upgrade, I'll get two for SLI or CrossFire. Perhaps not right away, as waterblocks and extra radiator space will also add to the cost.

As for the CPU... my i5 3570K runs at 4.5GHz, and I'm not upgrading before I see a noticeable performance increase on offer. I don't expect to wait until there's another i5 with twice the power mine has. At the moment the only options that give any significant performance increase are the X99 line, but they cost too much. How much of a boost a simple i7 gives over an i5 with DX12... well, I'll wait and see before I decide what to upgrade to in terms of CPU.

I always liked my GTX 580; it's horrible that the card only had 1.5GB of VRAM, which bothered me to no end from the day I bought it. But the GPU performance was always fantastic, pretty much the best GPU I've bought in all my years for sure. Especially after I OC'd my i7 870 from 2.9GHz to 4GHz, which gave the card a second life.

The 1.5GB killed it in Unity, so I had to upgrade (I was basically forced to, as even on low it wouldn't run stably since it always needed more than 1.5GB of VRAM). I can see that with the CPU setup you have, a 680 still gives you perfectly fine performance, especially as the Nvidia drivers are a godsend for optimization even on older models in DX11.

Avatar image for horgen
horgen

127502

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#50 horgen  Moderator
Member since 2006 • 127502 Posts

@Gatygun: The 2GB it has is enough for the games I play for now. When I upgrade I want 6GB or more. If I go for two cards, I don't want to run out of VRAM before the cards are too weak.