Witcher 3 benchmarks and performance thread

#1  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

Lots of threads popping up now that this beast has released! Thought it might be useful if we keep performance stats and impressions in a nice bundle!

Posting impressions: no rules, just share at will.

Posting benchmarks: if you do provide benchmarks, please include the game settings you used and the hardware in your rig! Posting links to professional benchmarks is welcome as well.

I'll start with first impressions of my own, and add some benchmarks later:

This game is gorgeous (playing on Ultra @ 1080p)! Big fan of their lighting and color choices - they know how to create a truly vibrant world.

Combat is much improved over TW2. There are still some awkward things, but it no longer feels like trying to steer a boat around. Geralt is quick and responsive; rolls are useful but not near-mandatory. It's just tighter and more fun than the last game.

I gotta say, after an hour and a half with it I'm even more stoked than I was before.

#2  Edited By ShadowDeathX
Member since 2006 • 11698 Posts

Hardware: Three HD7970s in Trifire / Intel Core i7 3930k at 4.5Ghz

Preference: Ultra settings with Shadows on Medium. Most of the post-processing settings are off; I hate the vast majority of them. SSAO instead of HBAO. (Gamebroke settings OFF, of course.)

2560x1440: 32 to 44fps (single GPU; see problem below).

1920x1080: 48 to 68fps, capped at 60fps (single GPU; see problem below).

Problem: The game doesn't utilize multiple GPUs in Crossfire correctly. GPU1 clocks up to its max, but GPU2 and GPU3 remain at idle clocks. I've used almost every profile that makes sense, but none work correctly: some don't scale, some cause crashes. The logical choice would be The Witcher 2's Crossfire profile, but that won't work since The Witcher 2 is a DX9 title and this is DX11.

On the other hand, on Nvidia the game picks up multiple GPUs and clocks them correctly, but users are also reporting graphical flickering and variable GPU scaling. It also looks like a poor experience is coming from non-900-series GeForce cards; the excuse is that Kepler is weak in compute compared to Maxwell (and AMD's GCN).

Issues on both sides even though this is a Nvidia title.

Edit: Got Crossfire to work and now I'm getting 90 - 105 fps with the same settings from above at 1440p. Capping it at 90fps.

Edit 2: Nevermind. Too many graphical glitches. Can't use Crossfire. Must wait for patch/profile add on.

#3  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

So I'm running

1080p, Ultra preset but with hair effects turned off, post-processing all on the High (max) preset. I'm getting about 40-45fps in forest, 45-50fps in the open, and approaching 60 in more enclosed spaces (like the opening areas of Kaer Morhen).

*Edit: gave approximately your settings a try, Shadow (Ultra but Shadows on Medium, no hair effects, all post-processing off except AA, and SSAO instead of HBAO+). Here I get about 53-60fps (capped at 60) in both open and forest areas.

Hardware:

  • Zotac GTX970 (factory settings)
  • AMD Phenom II X4 975BE @3.4GHz
  • 8GB system RAM

I'd say this is decent performance given my hardware - the CPU utilization gets brought to 100%, but so does the GTX970, so they're both being drawn and quartered by the game! GPU RAM usage hasn't broken 2GB, consistent with most games on the market staying under that level at 1080p.

At some point I'll play with some OC'ing of the CPU and GPU and see where that brings me (but my GPU doesn't have a great cooling solution and is hitting the TDP as it is)

#4 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

The game is running as expected on my system.

Spec:

  • i5 4670K 4.2Ghz
  • 8GB DDR3 RAM
  • GTX 960 2GB Strix 1500Mhz

Settings:

  • Everything on in post processing except Blur, AA and HBAO+
  • High options setting but with Nvidia Hair off

I get 45-55FPS.

It looks good, but the game has yet to take my breath away from a visual standpoint.

#5 deactivated-579f651eab962
Member since 2003 • 5404 Posts

Seems to be running OK for me; I only had a quick go, up to the first ghoul fight. I'll have a proper go after work.

#6 blangenakker
Member since 2006 • 3240 Posts

My 650 Ti Boost struggles for this game.

#7 Alucrd2009
Member since 2007 • 787 Posts

Everything @ Ultra, but HBAO+ off (I chose SSAO instead) and the hair crap off. 60fps steady at 1080p without any issues.

Still waiting for a Crossfire profile so I can turn on the hair crap and HBAO+.

#8 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts

Everything Ultra, Hair on, 1200p. Some postprocessing crap like Chromatic aberration off. Using some config tweaks from the nvidia guide to make shadows look better.

Runs at 40-55fps most of the time on an overclocked 980. If the camera zooms into Geralt's head, the fps drop like a bomb; I think the worst I've seen was 17fps when a cutscene camera was about 6cm from the back of Geralt's head. But most of the time it handles it fine, and the furry monsters don't seem to drop the fps much.

I don't mind the fps drop for HairWorks most of the time. I love seeing Geralt's luscious locks and majestic beard flowing in the wind, and all the cool monster fur. Though using a crossbow is slightly annoying, as it zooms in to Geralt's head when you aim and drops 15fps.

#9  Edited By RyviusARC
Member since 2011 • 5708 Posts

I have everything maxed, plus config tweaks that push some settings beyond Ultra, sometimes by double.

With two 970s in SLI and an i7 4770k, at 2560x1440 downsampled from 3132x1762, I get around 35-45fps.

Downsampling helps get rid of some of the blurry background and most of the leftover jagged edges that post-AA misses.
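To put a rough number on what downsampling costs, here's a quick sketch (just illustrative arithmetic on the resolutions quoted above):

```python
# Quick arithmetic on the downsampling above: rendering at 3132x1762 and
# scaling down to a 2560x1440 display means the GPU shades roughly 50%
# more pixels per frame, which lines up with the fps cost of supersampling.

native = 2560 * 1440   # pixels actually displayed
render = 3132 * 1762   # pixels actually rendered each frame
extra = render / native

print(f"native: {native:,} px, rendered: {render:,} px, load: {extra:.2f}x")
```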

#10  Edited By digitm64
Member since 2013 • 470 Posts

i7 4770, 16GB 1866MHz DDR3, MSI GTX780ti OC.

Average 30-45fps with everything on Ultra 1080p including Hairwanks.

Getting the odd stutter now and again; trying to decide if it would be better to cap the frame rate at 30fps for smoother gameplay. I use a controller, so I'm not certain whether doing this would add any latency.

Also not certain whether I should apply any tweaks, such as the grass one, since fps is already pitiful.

#11 RyviusARC
Member since 2011 • 5708 Posts

Found the SweetFX settings I will most likely stick with.

Not sure if there are any more settings I can force past Ultra that actually make a difference, so I'm content with how it looks and performs.

#12  Edited By Articuno76
Member since 2004 • 19799 Posts

FX8350/970 user here. I set the game to the High Preset (albeit with Hairworks completely off) and I'm getting a locked 60fps 95% of the time with the odd drop to 55-ish fps. This is at 1920x1080 with Post Processing set to High, using SSAO as opposed to HBAO+.

I tried engaging Hairworks (even on Medium, and with Shadow Quality turned to Low) and found the framerate would bounce around more often, between 40-60fps. Apparently Hairworks is demanding enough that even turning down other settings a fair bit isn't enough to compensate. But on reflection I'd rather have the game on High anyway, as that's effectively about as good as it looks (keeping in mind that Ultra looks nigh on identical to High).

I've not properly sat down and played the game, but I've skipped some way in and run around open, densely vegetated areas on horseback. And the 60fps target, again, remains firm 95% of the time.

#13 digitm64
Member since 2013 • 470 Posts

Adjusted Hairworks to use 2xMSAA instead of 8xMSAA (which was hurting the frames); this seems to have improved things a bit without much difference in quality.

In your game install go to Bin > Config > Base > Rendering.ini

There is a line HairWorksAALevel=8
Change that to either 4, 2, or 0

I changed it to 2 which makes it 2xMSAA.

Would love to hear any feedback as to whether this helped.
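If you'd rather script the change than edit the file by hand, something like this works (a small sketch; point `ini_path` at your own install's Bin > Config > Base > Rendering.ini, and the helper name is my own):

```python
# Sketch: change HairWorksAALevel in Rendering.ini programmatically.
# ini_path must point at your own install's Bin/Config/Base/Rendering.ini.
import re
from pathlib import Path

def set_hairworks_aa(ini_path, level):
    """Set HairWorksAALevel to 0, 2, 4, or 8 and return the new file text."""
    if level not in (0, 2, 4, 8):
        raise ValueError("MSAA level must be 0, 2, 4, or 8")
    path = Path(ini_path)
    text = path.read_text()
    # Replace the existing HairWorksAALevel=N line, whatever N currently is
    new_text, count = re.subn(r"HairWorksAALevel=\d+",
                              f"HairWorksAALevel={level}", text)
    if count == 0:
        raise ValueError("HairWorksAALevel not found in " + str(path))
    path.write_text(new_text)
    return new_text
```

Back up Rendering.ini first, since a patch could overwrite or reformat it.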

#14  Edited By deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

I'm a bit confused. I wasn't that confident about how the game would perform on my PC; I was expecting it to run with everything on Low and still struggle to stay around 30fps @ 1080p, since I have a 5-year-old CPU and a low-end GPU.

To my surprise, the game locks at a pretty stable 30fps (if I unlock it, it goes into the 40s but is very unstable). Everything is on High except "background characters" (I think that's what they call it), with the hair thing off. I've tried going to Ultra and the game runs OK, but with some drops to around 20fps. I've turned off some of the post-processing, but I'm still experimenting with that.

I'm still in the first village, and there may be more demanding zones on the map. But so far I'm satisfied with how the game is performing, especially considering this is day one.

Phenom II X6 1090T Black Edition

EVGA GTX 750 Ti FTW

8GB DDR3 RAM

Win 7 64

Strange how changing most of the settings from Low to Ultra appears to have relatively little impact on performance. :/

Not trying to start a system war (I'm a console player myself), but I find it weird how the new consoles struggle to run this at 30fps.

#15 xantufrog  Moderator
Member since 2013 • 17875 Posts

@digitm64 said:

Adjusted Hairworks to use 2xMSAA instead of 8xMSAA (which was hurting the frames); this seems to have improved things a bit without much difference in quality.

In your game install go to Bin > Config > Base > Rendering.ini

There is a line HairWorksAALevel=8

Change that to either 4, 2, or 0

I changed it to 2 which makes it 2xMSAA.

Would love to hear any feedback as to whether this helped.

Cool tip! I'll give this a try and post comparisons. The effect I've seen from Hairworks on my system (sans your tweak) is that my 45fps becomes about 35fps; it definitely hits my card hard. Overall, I'm surprised that my performance meets or exceeds that reported by PCGamesHardware.de; their 970 is a nicer model than mine, and their CPU is as well.

#16  Edited By Ribstaylor1
Member since 2014 • 2186 Posts

Running an i7 3770k @ 3.5GHz, 16GB RAM @ 1600MHz, a GTX 770 4GB @ 1.19GHz, and an SSD for storage. I'll post back when I finally get it running, but from what I'm reading, my computer, which is leagues ahead of the consoles, won't be able to run this game at high settings at anything above 30fps at 1080p... So, essentially console settings on a $2000 PC from two years ago. So far I'm extremely unimpressed, especially with the console parity in place outside of Nvidia GameWorks. But yeah, I'll post back tonight with how it runs on such a system. If what sites are saying is true, though, CD Projekt Red completely downgraded the game in every way to meet parity between all platforms, and barely spent any time optimizing this game for PC outside of the 900 series of cards.

Hell, they are saying a Titan at Ultra settings can't get past 30fps at 1080p. That's just downright unacceptable, and I'm more and more inclined to stop playing games and move on to books, or something else like burying myself in school work. Gaming in its current state is far too expensive for far too little content and support from developers and publishers, who can't even make their games work the way they should and could. I think I've lost all desire to keep this as a hobby if every major release I want is online-only DRM or straight up doesn't take advantage of the gaming hardware I've been sold at ridiculously high prices. So fucking sick of this shit, and to see it come from a company whose main income is the PC gaming market and their GOG storefront is downright a slap in the face to the people that made them a successful company in the first place.

#17 AlexKidd5000
Member since 2005 • 3103 Posts

Is The Witcher 3 really good otherwise? I'm gonna buy it when I get paid.

#18 deactivated-5f3ec00254b0d
Member since 2009 • 6278 Posts

@ribstaylor1: I'm betting you will be able to run this game with most of the settings on Ultra at 1080p.

#19  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@phbz said:

@ribstaylor1: I'm betting you will be able to run this game with most of the settings on Ultra at 1080p.

I dunno about that. Benchmarks with the 770 on Ultra are showing framerates in the 20s. I think ribstaylor would be better off aiming for mid-to-high.

For posterity, here are some of the published benchmarks from around the web that I've referenced (besides our own):

http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark

http://wccftech.com/witcher-3-initial-benchmarks/

#20 Articuno76
Member since 2004 • 19799 Posts

I wonder: isn't the whole push with Maxwell cards that they do tessellation really well? If so, might people with cards like the 770/780 not see massive gains by turning down water detail (which I assume controls tessellation level)?

#21  Edited By Coseniath
Member since 2004 • 3183 Posts

New benchmarks surface:

GameGPU

PCLab

----

P.S.: VRAM doesn't seem to be an issue for this game...

#22  Edited By Ribstaylor1
Member since 2014 • 2186 Posts

A rather in-depth look at The Witcher 3's performance at varying settings across a whack-load of hardware. It's worse than Watch Dogs in how badly optimized this game is across PC hardware; 1440p isn't even properly supported...

http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark

#23  Edited By digitm64
Member since 2013 • 470 Posts

Time to bite the bullet and upgrade from the "unsupported" Kepler GTX 780 Ti. Wondering: GTX 980, Titan X, or hold off just a tad for the 980 Ti? I really want to play this game to its full potential, and I think it will be the same situation with Batman: Arkham Knight, since that will have GameWorks as well. Also, I only play at 1080p.

#24 04dcarraher
Member since 2004 • 23829 Posts

@digitm64 said:

Time to bite the bullet and upgrade from the "unsupported" Kepler GTX 780 Ti. Wondering: GTX 980, Titan X, or hold off just a tad for the 980 Ti? I really want to play this game to its full potential, and I think it will be the same situation with Batman: Arkham Knight, since that will have GameWorks as well. Also, I only play at 1080p.

Wait it out, drivers and patches will be incoming, plus wait for 2016.

#25 digitm64
Member since 2013 • 470 Posts

@04dcarraher said:
@digitm64 said:

Time to bite the bullet and upgrade from the "unsupported" Kepler GTX 780 Ti. Wondering: GTX 980, Titan X, or hold off just a tad for the 980 Ti? I really want to play this game to its full potential, and I think it will be the same situation with Batman: Arkham Knight, since that will have GameWorks as well. Also, I only play at 1080p.

Wait it out, drivers and patches will be incoming, plus wait for 2016.

2016? That's so far away. What's happening in 2016?

#26 EducatingU_PCMR
Member since 2013 • 1581 Posts

Trash console port, as expected; I've read some mad people saying how well optimized it is.

I feel sorry for Kepler owners; this is what you get for going green.

At this performance, you might as well lock the game to 30fps and set a couple of options to Ultra.

#27  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts
@ribstaylor1 said:

well it's worse then watch dogs, in how badly optimized this game is on pc hardware across the board.

I dunno, it seems to run fine on 900-series Nvidia and a broad gamut of Radeon cards. I feel like the really off-mark performance is on older-gen Nvidia hardware. I mean, an R9 285 works surprisingly well for what it is.

#28 NeoGen85
Member since 2003 • 4270 Posts

@digitm64: If you're a working man or woman, I suggest you go ahead and make the purchase. I debated for the past 5 months (while away from my gaming rig on business in Seattle) about making a big upgrade, knowing something else could come later in the year, not to mention the Black Friday prices. If, and only if, you have money to spare, go ahead and upgrade. I jumped in deep with a Titan X, a 4K monitor that supports G-Sync, and a new i7 that required a mobo swap and DDR4 RAM. If a video card releases in the next few months that outperforms the Titan X, you can always sell the Titan X and use that money to buy what you really want. If you can't roll like that, I suggest just holding off.

#29 04dcarraher
Member since 2004 • 23829 Posts

@digitm64 said:
@04dcarraher said:
@digitm64 said:

Time to bite the bullet and upgrade from the "unsupported" Kepler GTX 780 Ti. Wondering: GTX 980, Titan X, or hold off just a tad for the 980 Ti? I really want to play this game to its full potential, and I think it will be the same situation with Batman: Arkham Knight, since that will have GameWorks as well. Also, I only play at 1080p.

Wait it out, drivers and patches will be incoming, plus wait for 2016.

2016? That's so far away. What's happening in 2016?

Volta from Nvidia,

#30  Edited By Coseniath
Member since 2004 • 3183 Posts
@04dcarraher said:
@digitm64 said:

2016? That's so far away. What's happening in 2016?

Volta from Nvidia,

Pascal :P

#31 04dcarraher
Member since 2004 • 23829 Posts

@Coseniath said:
@04dcarraher said:
@digitm64 said:

2016? That's so far away. What's happening in 2016?

Volta from Nvidia,

Pascal :P

oops lol

#32 Ribstaylor1
Member since 2014 • 2186 Posts

@xantufrog: Yeah, it would seem AMD isn't doing too badly with this title, but SLI is barely supported and for most is reported as 100% broken or not working. Whole resolutions for cards meant for 4K aren't even showing up. $1000 GPUs can't do this game at anything above 30fps at 1080p on "uber" or Ultra or whatever the hell they call it. The fact that my i7 3770k, 16GB RAM, SSD, and GTX 770 4GB can't play this game better than a console is PATHETIC coming from a company that says PC is their main platform. Ultra minus Nvidia GameWorks looks virtually no different from what I see coming from the PS4, so it is 100% unoptimized if my computer, which can run computational circles around the PS4 and Xbox One combined, can't even play this game at 60fps on Ultra, or even High for that matter.

4K screens are a thing, and apparently this game, unless you're housing several R9 290s or Titan Xs (if you can even get SLI or Crossfire to work), won't even do 60fps at that resolution. How that isn't worse than Watch Dogs I don't know, as I could run that game flat out at Ultra 1440p, 40-60fps, no issues. Same goes for the entirety of GTA 5, which is a gorgeous game on PC.

Star Citizen is my last grasp of hope in this industry for something that actually gives me what I want out of games in 2015/2016, as apparently even PC-centric companies no longer care to put care into their PC games. I wish I had chosen a different hobby as a kid, as I'm more often left pissed at DRM (since I don't have internet where I use my rig) or left with a piece of shit game that doesn't perform anywhere near what my hardware is capable of. This game may not have the DRM, but I couldn't care less how good the game and story are if the damn thing doesn't run the way it should on my hardware. Performance over graphics any day, and this game has neither going for it.

#34  Edited By Alucrd2009
Member since 2007 • 787 Posts

@Chatch09: That's weird; I'm running one 290 at the moment (since there's no Crossfire support), with the Nvidia crap off, and it's 60fps average!

@ShadowDeathX: How did you get Crossfire to work? I tried everything in ARF but there's lots of flickering and the game is unplayable. Can you share the trick?

#35 BassMan
Member since 2002 • 17811 Posts

I am running max settings at 1440p, but with AA, blur, motion blur, depth of field, and chromatic aberration disabled (I like clear visuals). Getting a steady 60+ fps with rare dips into the 50s. I seem to notice the fps drops most in cutscenes. I am happy with the performance, and the game looks great.

#36 Coseniath
Member since 2004 • 3183 Posts

TechPowerUp usually doesn't do single-game performance analyses.

It seems The Witcher 3 is an exception :P.

The Witcher 3: Performance Analysis

Everything on Ultra, SSAO and Hairworks off:

You can find more tests in the article.

#37  Edited By Articuno76
Member since 2004 • 19799 Posts

I'd suggest most people here just play the game on High, as for all intents and purposes it looks the same. My understanding is that Ultra just caches more assets, so there's a bit less pop-in.

#38 vfibsux
Member since 2003 • 4497 Posts

@ribstaylor1 said:

@xantufrog: Yeah, it would seem AMD isn't doing too badly with this title, but SLI is barely supported and for most is reported as 100% broken or not working. Whole resolutions for cards meant for 4K aren't even showing up. $1000 GPUs can't do this game at anything above 30fps at 1080p on "uber" or Ultra or whatever the hell they call it. The fact that my i7 3770k, 16GB RAM, SSD, and GTX 770 4GB can't play this game better than a console is PATHETIC coming from a company that says PC is their main platform. Ultra minus Nvidia GameWorks looks virtually no different from what I see coming from the PS4, so it is 100% unoptimized if my computer, which can run computational circles around the PS4 and Xbox One combined, can't even play this game at 60fps on Ultra, or even High for that matter.

4K screens are a thing, and apparently this game, unless you're housing several R9 290s or Titan Xs (if you can even get SLI or Crossfire to work), won't even do 60fps at that resolution. How that isn't worse than Watch Dogs I don't know, as I could run that game flat out at Ultra 1440p, 40-60fps, no issues. Same goes for the entirety of GTA 5, which is a gorgeous game on PC.

Star Citizen is my last grasp of hope in this industry for something that actually gives me what I want out of games in 2015/2016, as apparently even PC-centric companies no longer care to put care into their PC games. I wish I had chosen a different hobby as a kid, as I'm more often left pissed at DRM (since I don't have internet where I use my rig) or left with a piece of shit game that doesn't perform anywhere near what my hardware is capable of. This game may not have the DRM, but I couldn't care less how good the game and story are if the damn thing doesn't run the way it should on my hardware. Performance over graphics any day, and this game has neither going for it.

Have to agree, dude. Consoles going the way of a wannabe PC have done nothing but hold this platform back, and it sucks. We do get some games that go the extra mile, but typically with a multi-platform title we get screwed. Thankfully, it does still look good and is a fun game, but this part of it disappoints.

#39  Edited By ShadowDeathX
Member since 2006 • 11698 Posts

@Jawad2007 said:

@ShadowDeathX how did you get crossfire to work ? i tried everything in ARF there lots of flickering the game unplayable ? can you share the trick ?

Crossfire works, but not really. If you force the Ryse: Son of Rome profile onto the game, it will clock up all the GPUs. The problem is that it won't use them at more than 50%. >.>

#40 xantufrog  Moderator
Member since 2013 • 17875 Posts

moved to Hardware on further consideration of the topic content

#41 Coseniath
Member since 2004 • 3183 Posts

Moar benchies inc!

From Guru3D:

The Witcher 3 Graphics Performance Review - Article - Guide - Review

Everything is set to Ultra, AA is enabled, Nvidia HairWorks is disabled, and we use SSAO.

Some of the tests are different from the other sites':

You can read more in the article :).

#42  Edited By insane_metalist
Member since 2006 • 7797 Posts

i7 4770K / Crossfire R9 290's / 16GB DDR3 @ 1866mhz

Running the i7 4770K @ 4.4GHz with a single R9 290 (CFX not supported *yet*).

2560x1440: SSAO instead of HBAO; AA, HairWorks, and depth of field turned off; everything else on Ultra.

Getting 33-40fps.

#43 xantufrog  Moderator
Member since 2013 • 17875 Posts

Alrighty, so I systematically walked through the settings in the in-game menu and tested their outcome on my performance:

My baseline: 1080p, everything on Ultra except hair off, all post-proc on high + HBAO+

Was getting ~47fps in open field areas with grass and water and such, ~40fps in the densest forest with deer and things running around

The first thing I homed in on was the AO type. Interestingly, although some sites recommend going with SSAO, I noticed very little effect from turning HBAO+ off: my benches in the exact same locations were ~49fps in the open and maybe 41fps in the densest forest. So HBAO+ is not that hard a hitter on my system relative to regular SSAO.

Also interesting - turning AA off had basically no effect on my framerate... very modest, 1fps at most.

Ok, so I like HBAO+ and AA, so I'm leaving them on.

Now, the big hitter: foliage render distance. Changing this from Ultra to High bought a lot of frames for a very subtle perceptual hit (you don't really notice except in side-by-side comparisons): my open-field rate went to 60fps (capped at 60) with occasional dips to 57 or so, and my dense-forest rate went to ~49-50fps.

So - GTX970, Phenom II X4 965BE, 8GB RAM - 50-60fps across environments with everything on Ultra except hair turned off and foliage distance turned down to high.
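The walkthrough above boils down to a quick table; this sketch just hard-codes the rough fps figures from this post to make the per-setting gain explicit:

```python
# Tabulate the rough per-setting fps gains measured above. The numbers are
# the approximate averages reported in this post, not precise benchmarks.

baseline = {"open": 47, "forest": 40}  # Ultra, hair off, HBAO+, AA on
changes = {
    "HBAO+ -> SSAO":         {"open": 49, "forest": 41},
    "AA off":                {"open": 48, "forest": 41},
    "foliage Ultra -> High": {"open": 60, "forest": 50},
}

for name, fps in changes.items():
    gain = {scene: fps[scene] - baseline[scene] for scene in baseline}
    print(f"{name:<22} open: +{gain['open']} fps, forest: +{gain['forest']} fps")
```

The takeaway matches the prose: foliage render distance dominates, while AO type and AA are nearly free on this rig.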

#44 Coseniath
Member since 2004 • 3183 Posts

I think we found the source of the performance differences between GPUs in The Witcher 3.

It is very pixel-fillrate dependent.

There is an image showing how well pixel fillrate scales with performance.

From the hardware.fr forums:
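For anyone curious what that scaling means in practice: theoretical pixel fillrate is just ROP count times core clock, so an overclock moves a card up the chart linearly. A tiny sketch (the ROP count and clocks are illustrative round numbers, not official specs for any particular card):

```python
# Pixel fillrate = ROPs x core clock. This is why a heavily overclocked
# card can sit far above its stock position on a fillrate-scaling chart.
# The figures below are illustrative round numbers, not official specs.

def pixel_fillrate_gpps(rops, clock_mhz):
    """Theoretical fillrate in Gpixels/s."""
    return rops * clock_mhz / 1000.0

stock = pixel_fillrate_gpps(64, 1000)  # a 64-ROP card at 1000 MHz
oc = pixel_fillrate_gpps(64, 1300)     # same card pushed to 1300 MHz

print(f"stock: {stock:.1f} Gpix/s, overclocked: {oc:.1f} Gpix/s")
```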

#45 deactivated-579f651eab962
Member since 2003 • 5404 Posts

@Coseniath said:

I think we found the source of the performance differences between GPUs in The Witcher 3.

It is very pixel-fillrate dependent.

There is an image showing how well pixel fillrate scales with performance.

From the hardware.fr forums:

With my current BIOS, look at my pixel fillrate. Lol

#46 Coseniath
Member since 2004 • 3183 Posts
@klunt_bumskrint said:

With my current bios look at my Pixel Fillrate. Lol

Oo.

A 1300MHz Titan X?

It wouldn't be explained otherwise... :P

#47 deactivated-579f651eab962
Member since 2003 • 5404 Posts

@Coseniath: Yeah, roughly.

#48  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

Klunt doesn't mess around! I was so proud of myself for investing in my wee 970 :-P

#49 Coseniath
Member since 2004 • 3183 Posts
@xantufrog said:

I was so proud of myself for investing in my wee 970 :-P

Same here. I did sooo much research (I could write a doctoral thesis on it, lol) for months (there wasn't much else to do with the HD4600 after my GTX570 died) before I bought it.

:P

#50 xantufrog  Moderator
Member since 2013 • 17875 Posts

Well, hey, I DO think it's an amazing card, especially for the price. It's made a huge difference in my system's performance, so I'm happy, even if I'm still taunted by the carrots that always dangle a little higher out of reach.