The Witcher 3 PC Ultra Runs At 1080p 60FPS On GTX 980

#101 GoodKingMog
Member since 2015 • 167 Posts

@KHAndAnime said:
@goodkingmog said:

so much troll... lol

Ignorance is bliss I guess. When people are saying Witcher 2 looks better in many aspects, well...that's just kinda lame. I'm not going to harp on it though.

oh gosh... a link to a Steam thread... because we all know Steam community posts aren't just a bunch of PC elitists stroking each other's e-peens... *rolls eyes*

in no way, shape, or form does the witcher 3 look WORSE than the witcher 2. it is just not true.

#102  Edited By deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts
@KHAndAnime said:
@goodkingmog said:

so much troll... lol

Ignorance is bliss I guess. When people are saying Witcher 2 looks better in many aspects, well...that's just kinda lame. I'm not going to harp on it though.

Oh please, those people have a bad case of nostalgia goggles. Witcher 2 looks nice overall because it had good aesthetics and some nice effects, but even when it released it had many dated graphical features. Witcher 2 has some of the worst shadow dithering issues in any game, an abysmal SSAO implementation, way too much low-quality post-processing slathered on, and terrible-quality bloom where, if you angled the camera just right, Geralt's face would just turn into a blob of light.

People who think Witcher 2 looks better or comparable to Witcher 3 are probably the same people who still think vanilla Crysis is graphically impressive in 2015. They need to try playing those games again to remember what they actually look like.

#103 GoodKingMog
Member since 2015 • 167 Posts

@ferret-gamer said:
@KHAndAnime said:
@goodkingmog said:

so much troll... lol

Ignorance is bliss I guess. When people are saying Witcher 2 looks better in many aspects, well...that's just kinda lame. I'm not going to harp on it though.

Oh please, those people have a bad case of nostalgia goggles. Witcher 2 looks nice overall because it had good aesthetics and some nice effects, but even when it released it had many dated graphical features. Witcher 2 has some of the worst shadow dithering issues in any game, an abysmal SSAO implementation, way too much low-quality post-processing slathered on, and terrible-quality bloom where, if you angled the camera just right, Geralt's face would just turn into a blob of light.

People who think Witcher 2 looks better or comparable to Witcher 3 are probably the same people who still think vanilla Crysis is graphically impressive in 2015. They need to try playing those games again to remember what they actually look like.

how dare you speak out against the mighty king of bullshit KHAndAnime! he is ALWAYS right... a man of his stature couldn't possibly be wrong in any debate that doesn't involve facts or common sense... ever!

#104 Coseniath
Member since 2004 • 3183 Posts

Moar benchies!

From PCLab: GPU benchmarks (a sample without GameWorks) plus CPU tests. The FX8350 is on par with entry Haswell i5s, enjoying 60FPS!!!

From gamegpu.ru: GPU benchmarks at max settings, plus a VRAM test at max settings.

:D

#105  Edited By Ribstaylor1
Member since 2014 • 2186 Posts

So it turns out the company and Nvidia did almost no optimizing for cards outside of the 900 series. Not only that, on max settings a GTX Titan, a fucking Titan, can't even get the game to run at anything above 30fps. So here's another company on my shit list of companies I'll no longer support by buying their games: EA, Ubisoft, now CD Projekt Red. Seems I'm slowly losing companies I'm willing to buy games from. If things don't drastically change for the better in this hobby, then I think I'm leaving for good. I've been getting bored of games lately, and if a release like this can't even take the time to optimize itself for PC, ending up at near parity with the console versions, then I honestly don't think I give two shits about buying from or supporting the industry I've loved for so long.

Like, a Titan can't play this game at anything above 1080p 30fps... a fucking $1000 GPU... Yeah, I'm done with all this fucking bullshit: no more consoles and no more PC upgrades for me, as time and time again I'm shown this industry isn't worth the hassle and the large amounts of cash it takes to enjoy.

#106 ShepardCommandr
Member since 2013 • 4939 Posts

Bad optimization aside, the 980 is 50+% faster than a vanilla Titan.

If you look at those slides, the numbers do indeed add up.....

#107 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

I haven't played the game yet, but I did just mess with the graphics settings to compare with what GeForce Experience says would be best. It wants me to turn grass quality and foliage visibility range down from Ultra to High, and Depth of Field and Nvidia HairWorks from on to off. I guess it's time to upgrade again.

#108  Edited By BassMan
Member since 2002 • 17811 Posts

@ribstaylor1 said:

So it turns out the company and Nvidia did almost no optimizing for cards outside of the 900 series. Not only that, on max settings a GTX Titan, a fucking Titan, can't even get the game to run at anything above 30fps. So here's another company on my shit list of companies I'll no longer support by buying their games: EA, Ubisoft, now CD Projekt Red. Seems I'm slowly losing companies I'm willing to buy games from. If things don't drastically change for the better in this hobby, then I think I'm leaving for good. I've been getting bored of games lately, and if a release like this can't even take the time to optimize itself for PC, ending up at near parity with the console versions, then I honestly don't think I give two shits about buying from or supporting the industry I've loved for so long.

Like, a Titan can't play this game at anything above 1080p 30fps... a fucking $1000 GPU... Yeah, I'm done with all this fucking bullshit: no more consoles and no more PC upgrades for me, as time and time again I'm shown this industry isn't worth the hassle and the large amounts of cash it takes to enjoy.

Sounds like you are just burned out. Take a break and come back when you are ready. By that time, the drivers will be optimized.

#109 insane_metalist
Member since 2006 • 7797 Posts
@dangamit said:

Nvidia is screwing 7-series owners to force them to buy 9-series cards. What a terrible business model. I will never buy an Nvidia card again after this slap in the face. The 280X is owning the 780 when they're not even in the same league.

Nvidia has been very shady lately. People who bought 780s paid around $500 apiece. Now $1000 in GPUs isn't enough to run Witcher 3...? All because they're greedy and want people to buy their 900 series cards (instead of doing what they're supposed to and releasing proper drivers for the 700 series).

#110 Ribstaylor1
Member since 2014 • 2186 Posts

@BassMan I wish I could say burned out was the word, as I've burned out a couple of times before. I'd sooner call it completely and utterly fed up with being treated like an imbecile who'll buy any old junk thrown at him. Tired of being shown fake footage, tired of being lied to, tired of content being cut to charge for later, tired of the now-pervasive microtransactions, and most of all tired of the lack of support companies give to the highest-profiting section of this industry: the PC.

#111 BassMan
Member since 2002 • 17811 Posts

@ribstaylor1: The best thing is to stop pre-ordering games and wait to see if a game is worthwhile. Make them earn your money. Don't support broken games and practices.

#112 04dcarraher
Member since 2004 • 23829 Posts

@GoldenElementXL said:

I haven't played the game yet, but I did just mess with the graphics settings to compare with what GeForce Experience says would be best. It wants me to turn grass quality and foliage visibility range down from Ultra to High, and Depth of Field and Nvidia HairWorks from on to off. I guess it's time to upgrade again.

There is a trick you can apply to let HairWorks run much better. Open the rendering.ini file found in The Witcher 3 Wild Hunt\bin\config\base, find the HairWorks AA level, and set it from 8 to 0. I believe they are applying 8x AA of some sort to all the objects that use the HairWorks physics. Just doing this let my 970 gain another 15-20 fps, giving me a near-60 fps average with most settings on ultra, besides shadows & foliage range on high.
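For anyone who'd rather script that ini tweak than edit the file by hand, here's a minimal Python sketch. The key name (`HairWorksAALevel`) and the install path shown are assumptions based on the post above, so check your own rendering.ini before running:

```python
import re
from pathlib import Path

# Assumed install path -- adjust to your own (hypothetical example).
INI_PATH = Path(r"C:\Games\The Witcher 3 Wild Hunt\bin\config\base\rendering.ini")

def set_hairworks_aa(ini_text: str, level: int = 0) -> str:
    """Return ini_text with the HairWorksAALevel value (assumed key name)
    replaced by `level`, leaving every other line untouched."""
    return re.sub(
        r"(?im)^(\s*HairWorksAALevel\s*=\s*)\d+",
        lambda m: m.group(1) + str(level),
        ini_text,
    )

if __name__ == "__main__" and INI_PATH.exists():
    INI_PATH.write_text(set_hairworks_aa(INI_PATH.read_text(), 0))
```

Reverting is just `set_hairworks_aa(text, 8)`; since the regex only matches that one key, the rest of the file is preserved as-is.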

#113 deactivated-583e460ca986b
Member since 2004 • 7240 Posts

@04dcarraher: Thanks for the trick. Right now I have everything maxed except AA at 1440p (even HairWorks is on) and I'm staying in the low 50s (except cutscenes). The game sure is a looker.

#114 04dcarraher
Member since 2004 • 23829 Posts

@GoldenElementXL said:

@04dcarraher: Thanks for the trick. Right now I have everything maxed except AA at 1440p (even HairWorks is on) and I'm staying in the low 50s (except cutscenes). The game sure is a looker.

So how much did it help you with those titans at 1440?

#115 BassMan
Member since 2002 • 17811 Posts

I am running max settings at 1440p, but with AA, blur, motion blur, depth of field and chromatic aberration disabled (I like clear visuals). Getting a steady 60+ fps with rare dips into the 50s. I seem to notice the fps drops most in cutscenes. I am happy with the performance, and the game looks great.

#116  Edited By Coseniath
Member since 2004 • 3183 Posts

Look at this on Reddit:

How to run Hairworks on AMD cards without crippling performance

So thanks to a wonderful user over at Guru3D, it's been discovered that hairworks can in fact run relatively smoothly on AMD cards. I'm talking 60fps smooth guys, and the trade-off is barely noticeable.

All you need to do is create a profile for witcher3.exe in your Catalyst Control Center, then set Tessellation Mode to "Override application settings", and finally set the Maximum Tessellation Level to either 2x, 4x, 8x or 16x. Picture guide

Important: Depending on what tessellation level you set, the hair quality AND performance will vary. For best performance while maintaining realistic hair I recommend 4x or 8x if your card can handle it. I would recommend against using 2x as it severely reduces quality and looks terrible. Here's a comparison photo that lets you see the differences between 2x, 4x, 8x and 16x tessellation levels. (Thanks to nzweers from G3D for the comparison photo!)

For reference I'm using a r9 290 and I have little to no performance impact if I use 8x, if I bump it up to 16x then I regularly drop to 50 in intense areas (e.g. wolf packs). So there you have it, let me know if this also works for you guys - and also if you happen to find any issues that arise from doing this tweak.

#117  Edited By eva02langley
Member since 2011 • 375 Posts

@thehig1: It took a while for drivers supporting CF to be released for The Witcher 2. It took a month, and I waited until the drivers were out to play the game. It ran quite well after that.

As for The Witcher 3, there is no CF support at the moment, which is a shame, but new AMD drivers are coming next week. That should put an end to this joke. It's a really demanding game; it cannot be played at 4K with a single graphics card. Even then, you need a 295X, CF 290X, or SLI GTX 980.

#118 Coseniath
Member since 2004 • 3183 Posts

New patch!

The Witcher 3: Wild Hunt PC Patch 1.03 Released, Optimizes Nvidia Hairworks (from NextPowerUp). Moar performance.

CD Projekt Red has released a new patch for the PC version of The Witcher 3, just as promised. The Witcher 3: Wild Hunt PC patch 1.03 optimizes gameplay, improves graphics, adds more graphics options for PC gamers to fiddle with, and optimizes Nvidia Hairworks. Read below for the full patch notes.

The Witcher 3: Wild Hunt patch 1.03 changelog:

  • Improves stability in gameplay and the UI
  • Improves performance especially in cutscenes and gameplay
  • Fixes grass and foliage popping that could occur after density parameters were changed
  • Improves Nvidia Hairworks performance
  • Boosted texture anisotropy sampling to 16x on Ultra preset
  • Sharpen Post-process settings extended from Off/On to Off/Low/High
  • Blood particles will now properly appear after killing enemies on the water
  • Corrects a bug where player was able to shoot bolts at friendly NPCs
  • Improves menu handling
  • Corrects an issue with Stamina regeneration while sprinting
  • Fixes a cursor lock issue that sometimes occurred when scrolling the map
  • Generally improves world map focus
  • Improves input responsiveness when using keyboard
  • Corrects some missing translations in the UI
  • Corrects an issue in dialogue selections
  • Rostan Muggs is back
  • Minor SFX improvements
#119 deactivated-59d151f079814
Member since 2003 • 47239 Posts

@Coseniath said:

Moar benchies!

From PCLab: GPU benchmarks (a sample without GameWorks) plus CPU tests. The FX8350 is on par with entry Haswell i5s, enjoying 60FPS!!!

From gamegpu.ru: GPU benchmarks at max settings, plus a VRAM test at max settings.

:D

The i5 2500K and i7 2600K have to be some of the best-value chips ever made. They are over 4 years and 3 months old, and with medium overclocks they still beat the best mainstream chips at stock; even when the newest chips are overclocked too, they are within 5% of one another. Absolutely amazing.

#120 Coseniath
Member since 2004 • 3183 Posts
@sSubZerOo said:

The i5 2500K and i7 2600K have to be some of the best-value chips ever made. They are over 4 years and 3 months old, and with medium overclocks they still beat the best mainstream chips at stock; even when the newest chips are overclocked too, they are within 5% of one another. Absolutely amazing.

I totally agree.

But the reason is that when it launched, it provided 20% more performance for 30% less power. It was one of the best architecture leaps ever, and it brought AMD to its knees; they never reached that performance per core per GHz. And since AMD couldn't catch Intel, Intel decided to slack off, giving minor architecture upgrades while suppressing the performance of their own CPUs by using cheap TIM instead of solder (for Ivy Bridge; for Haswell they decided the TIM shouldn't even touch the CPU). I was screaming that Haswell, with its die shrink and new architecture, should have more MHz. And after a year, a miracle happened: the i7 4790K...

#121  Edited By GeryGo  Moderator
Member since 2006 • 12805 Posts

Currently running a GTX 660 at 30fps, with textures on low and everything else set to high (compared to my old 280X, which ran everything on high at 45-60fps).

There are new beta drivers from AMD that improve Witcher 3 performance by 10%, so that's great news.

#122  Edited By Coseniath
Member since 2004 • 3183 Posts
@PredatorRules said:

There are new beta drivers from AMD that improve Witcher 3 performance by 10%, so that's great news.

Yeap and up to 17% for Project Cars!

#123  Edited By elessarGObonzo
Member since 2008 • 2677 Posts
@NeoGen85 said:

Okay. So here's what I got:

i7 5820k at 4.1ghz

32GB of DDR4 RAM

Titan X

Ultra settings with GameWorks features:

  • 1080p - 60fps
  • 1440p - 55 to 60fps
  • 4K - 30fps

there must be something wrong there. you've got to be stuck on a 60Hz display. i really hope 60fps isn't all that setup can achieve.

8GB 290X \ i7-4790K @ 4.6Ghz \ 8GB DDR3 2400

1080p Ultra Settings with Hairworks\All Effects : ~55fps | without Hairworks\blur turned off: steady 60fps

1440p Ultra\Hair\effects: ~40fps | without Hair\blur\sharpening: ~55fps.

there should be a much bigger margin between our systems with the price you've put out.

#124 elessarGObonzo
Member since 2008 • 2677 Posts

@KHAndAnime: i just played through the first few hours and then the last few hours of Assassins of Kings in the couple of days before Wild Hunt came out. can guarantee Wild Hunt looks a bit better @ 1080p with everything turned all the way up. not as much better as people were expecting, but still a bit better. besides that, there's more going on in any given scene and a much greater draw distance in Wild Hunt.

#125 NeoGen85
Member since 2003 • 4270 Posts

@elessarGObonzo: It's a 60hz monitor.

#126 elessarGObonzo
Member since 2008 • 2677 Posts

@NeoGen85: sucks being limited to 60fps. definitely not worth lowering the resolution to get another 15 frames per second though. my next investment is going to have to be a 120+Hz display

#127 NeoGen85
Member since 2003 • 4270 Posts

@elessarGObonzo: A 4K monitor at 120Hz would have been amazing. Two things I learned while building my new rig: nothing is future-proof, and because of that, always be patient, especially when money is involved. Two GTX 980 Tis are looking good right now. lol

#128 chriscoolguy
Member since 2011 • 729 Posts

Must say I'm pretty happy with the game after a few tweaks. I'm running everything on ultra, no AA, but everything else turned on. HairWorks is on, with a profile made in AMD CCC and tessellation set to 8x. Asus AMD R9 280, i7 4790K, getting about 31-32 FPS on average most of the time, with dips down to about 28 and sometimes up to about 36. An average of 31-32 FPS is fine for me.

#129 Coseniath
Member since 2004 • 3183 Posts
@chriscoolguy said:

Must say I'm pretty happy with the game after a few tweaks. I'm running everything on ultra, no AA, but everything else turned on. HairWorks is on, with a profile made in AMD CCC and tessellation set to 8x. Asus AMD R9 280, i7 4790K, getting about 31-32 FPS on average most of the time, with dips down to about 28 and sometimes up to about 36. An average of 31-32 FPS is fine for me.

Are you using the 15.5 beta drivers? If not, use them; they will give you an extra 3-4 FPS. :)

#130 chriscoolguy
Member since 2011 • 729 Posts

I am. Low 30's for FPS is fine with me.

#131  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@elessarGObonzo said:

@KHAndAnime: i just played through the first few hours and then the last few hours of Assassins of Kings the couple days before Wild Hunt came out. can guarantee Wild Hunt looks a bit better @ 1080p with everything turned all the way up. not as much better as people were thinking it would, but a still a bit better. besides that there's more going on in any given scene and a much greater draw distance in Wild Hunt.

I'm not sure how it compares to The Witcher 2, because I don't have it installed and I haven't played it in a long time. I briefly played The Witcher 3, and this screenshot pretty much sums up my complaints about the game graphically (yes, max settings).

Ten seconds in and the game is throwing early-PS3-quality assets at you...and you don't have to look hard for them (a cutscene takes place at this exact spot, facing the camera at these exact leaves).

The best part of the graphics must be what...the character models maybe? If that's the case, and the game is supposed to look great, why do random NPC slaves in Mordor look way more detailed?

Here, you can see the texture of their skin revealed in the light, as well as their veins, bone outlines, freckles on their shoulders, etc. In comparison, Geralt from that previous screenshot looks rather flat. His textures are really soft and very blurry in spots (like the skin on his arm), and the different pieces of clothing don't cast any shadows on each other (like the necklace or the rope wrapped around Geralt's chest piece). That sort of stuff sticks out to me like a sore thumb, and I feel you'd need quite a biased eye to think The Witcher 3 isn't inconsistent graphically. The game's style is so splotchy, blurry, aliased, and distorted looking. Comparing those two screenshots, in the former I feel like I'm looking at a game released very early in a generation, while in the latter it feels like I'm looking at a game released much later in that generation. The smooth, extra-detailed lighting, tessellation, and variety of shaders really bring the Mordor protagonist to life. Unlike Geralt's model, where there's a distinct lack of shadows, the Mordor dude has a very polished look. These same techniques are why I think even the NPCs in Mordor have the upper hand over Geralt graphically.

I'm not saying The Witcher 3 always looks as bad as that screenshot and never looks better than that (or than Mordor). But I am saying that in the couple of hours I spent with the game, much of the time I couldn't help feeling it looks about as bad as that screenshot. In many circumstances the game's assets come together very well. In others they don't mix well, and it looks like something is missing (particularly in the lighting department), like in the screenshot I used.

And I haven't even gotten started on the game's animations and physics. They must've hired the folks from Bethesda or something, because they perfectly nailed the awkward ragdolls where NPCs randomly fly up into the air after you kill them and then float to the ground with no sense of acceleration. They also nailed many different awkward animations, some of my favorites being the spastic, clunky jump and the climb animation (which usually involves Geralt massively clipping through the edge he climbs).

#132 elessarGObonzo
Member since 2008 • 2677 Posts

@KHAndAnime said:

??

either you've got a crappy system that just can't process it, or your in-game settings are just messed up. it appears you have no anti-aliasing or HBAO+, and other settings are clearly turned down. but the cloth and the rock textures still look better in the Wild Hunt shot.

the characters and the scenery in Wild Hunt also look a lot better than in Shadow of Mordor. even in your screenshots, the protagonist's face and hair look much worse in Shadow of Mordor. the protagonist's facial hair is just "spray-painted" on his face, with almost no wrinkle lines.

funny you take a shot in bright sunlight vs a shot in the dark next to a fire and then compare how much shadow/shading is going on. but i can tell you obviously have those turned down in Wild Hunt too, because they are not blurry blobs with decent settings.

also, if there was any decent foliage in Shadow of Mordor, you would notice the same flatness when standing at awkward angles, like in almost every game to date.

i think they are both great-looking games, but Wild Hunt is clearly higher quality. it's also nowhere near as boring as Shadow of Mordor; at least there are new things going on, not the same orc fight over and over for 30 hours.

#133 BassMan
Member since 2002 • 17811 Posts

The more I play the game, the more impressed I am with the graphics. The scale of everything and amount of detail. It has some of the most beautiful landscapes I have seen. It all feels like a believable world too. We can nit pick all we want, but I think the game looks pretty fucking good. The game does have some bizarre judder or frame pacing issues in large crowds though. FPS is high and nothing bottle-necking and it just doesn't feel smooth like it should.

#134  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@elessarGObonzo said:
@KHAndAnime said:

??

either you've got a crappy system that just can't process it, or your in-game settings are just messed up. it appears you have no anti-aliasing or HBAO+, and other settings are clearly turned down. but the cloth and the rock textures still look better in the Wild Hunt shot.

the characters and the scenery in Wild Hunt also look a lot better than in Shadow of Mordor. even in your screenshots, the protagonist's face and hair look much worse in Shadow of Mordor. the protagonist's facial hair is just "spray-painted" on his face, with almost no wrinkle lines.

funny you take a shot in bright sunlight vs a shot in the dark next to a fire and then compare how much shadow/shading is going on. but i can tell you obviously have those turned down in Wild Hunt too, because they are not blurry blobs with decent settings.

also, if there was any decent foliage in Shadow of Mordor, you would notice the same flatness when standing at awkward angles, like in almost every game to date.

i think they are both great-looking games, but Wild Hunt is clearly higher quality. it's also nowhere near as boring as Shadow of Mordor; at least there are new things going on, not the same orc fight over and over for 30 hours.

Take a screenshot of the same location and prove yours looks any better. You're obviously in denial if you can't accept what the game looks like in that screenshot. The only thing I turned off was the sharpening filter, and if I left it on the screenshots would look a *lot* worse, so I'm actually doing the game a favor by leaving it off here.

I agree that Wild Hunt overall has better graphical quality than Mordor. But I also think that in terms of graphical consistency, Mordor was a lot less distracting. My screenshots don't even do Mordor justice; there's so much more detail and effects going on with the protagonist that it's beyond comparison IMO. But it's also an unfair comparison, because Geralt has lots of different armor to wear, so they couldn't really dedicate enough time to make every piece of armor look as good as the single primary costume on a character in a big-budget game.

#135  Edited By BassMan
Member since 2002 • 17811 Posts

@KHAndAnime: The foliage textures are the only thing that really stands out as bad in that screenshot. It's weird, because most of the SpeedTree foliage in the game is a decent res; the artist must have gotten lazy when creating that custom foliage. Also, I am actually surprised at how efficient the game is with VRAM usage. Most of the textures look pretty good throughout the game, and it uses around 2 GB of VRAM at 1440p. That is impressive. SoM and other open-world games use a lot more than that.

#136 Coseniath
Member since 2004 • 3183 Posts
@BassMan said:

@KHAndAnime: The foliage textures are the only thing that really stands out as bad in that screenshot. It's weird, because most of the SpeedTree foliage in the game is a decent res; the artist must have gotten lazy when creating that custom foliage. Also, I am actually surprised at how efficient the game is with VRAM usage. Most of the textures look pretty good throughout the game, and it uses around 2 GB of VRAM at 1440p. That is impressive. SoM and other open-world games use a lot more than that.

This game is a brilliant star when it comes to memory usage. While providing great visuals, it needs only around 2GB even at 4K!!!

#137  Edited By Truth_Hurts_U
Member since 2006 • 9703 Posts

@Coseniath: The game does look outstanding for the amount of VRAM it uses. I wish the GOG version had a screenshot key. :(

#138 RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:
@BassMan said:

@KHAndAnime: The foliage textures are the only thing that really stands out as bad in that screenshot. It's weird, because most of the SpeedTree foliage in the game is a decent res; the artist must have gotten lazy when creating that custom foliage. Also, I am actually surprised at how efficient the game is with VRAM usage. Most of the textures look pretty good throughout the game, and it uses around 2 GB of VRAM at 1440p. That is impressive. SoM and other open-world games use a lot more than that.

This game is a brilliant star when it comes to memory usage. While providing great visuals, it needs only around 2GB even at 4K!!!

There are some very large, ugly textures once you reach Novigrad.

Also, I can easily get the game to use over 3GB of VRAM without even playing at 4K.

#139 Coseniath
Member since 2004 • 3183 Posts
@Truth_Hurts_U said:

@Coseniath: Game does look outstanding for the amount of VRAM it uses. I wish the gog version had a screen shot key. :(

FRAPS has a key for screenshots. I'm not sure, but MSI Afterburner might have this option too...

@RyviusARC said:

There are some huge, ugly textures once you reach Novigrad.

Also, I can easily get the game to use over 3GB of VRAM without even playing at 4K.

Which GPU do you have, and did you use HairWorks? Because three sites can't all be so wrong...
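For anyone who wants to settle a VRAM argument like this with their own numbers instead of review sites, NVIDIA cards expose usage through `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader`. A minimal sketch of parsing that output follows; the sample line is made up for illustration, and on a live system you would replace it with the real command output via `subprocess`:

```python
# Parse a CSV data row from `nvidia-smi --query-gpu=memory.used,memory.total
# --format=csv,noheader` and report VRAM usage. The sample string below is a
# hypothetical row standing in for real output.

def parse_vram_csv(line):
    """Turn a row like '2048 MiB, 4096 MiB' into (used, total) MiB ints."""
    used_str, total_str = line.split(",")
    used = int(used_str.strip().split()[0])    # take the number before the unit
    total = int(total_str.strip().split()[0])
    return used, total

sample = "2048 MiB, 4096 MiB"  # hypothetical row from a 4 GB card
used, total = parse_vram_csv(sample)
print(f"VRAM: {used} MiB used of {total} MiB ({used / total:.0%})")
```

Note that tools can disagree here: some report per-process allocation while others report total memory committed on the card, which may explain the gap between the review sites and RyviusARC's numbers.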

Avatar image for deactivated-579f651eab962
deactivated-579f651eab962

5404

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#140 deactivated-579f651eab962
Member since 2003 • 5404 Posts

@Coseniath: I'm using 2.3GB at least @ 1440p

@Truth_Hurts_U: Just add it to Steam and use F12, or use AB.

Avatar image for Coseniath
Coseniath

3183

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#141 Coseniath
Member since 2004 • 3183 Posts
@klunt_bumskrint said:

@Coseniath: I'm using 2.3GB at least @ 1440p

Yeah, this number seems more logical, since you have HairWorks on and 12GB of VRAM. :P

Avatar image for genius2365
genius2365

495

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 0

#142 genius2365
Member since 2010 • 495 Posts

Man, games just keep looking better and better...

With games like The Witcher 3 pushing 1080p/60FPS on a GTX 980, is it reasonable to get a 980 Ti and expect it to max out 1080p for at least the next year? Is it even worth discussing 1080p and the 980 Ti in the same sentence? Not to mention that the 980 Ti comes with 6 GB of VRAM for extra security. Would a 980 Ti be the future-proofer's choice for 1080p, or will software advances like DX12 render its extra power irrelevant by the time it's needed?

Avatar image for Truth_Hurts_U
Truth_Hurts_U

9703

Forum Posts

0

Wiki Points

0

Followers

Reviews: 8

User Lists: 0

#143  Edited By Truth_Hurts_U
Member since 2006 • 9703 Posts

@klunt_bumskrint: Yeah, I thought about doing that. I just swapped my mobo again because I can't stand my ASUS board, and gave it a head-to-toe cleaning while I was at it (I'm OCD about cleaning my PC every 1-2 months). Plus my ASRock has really nice spacing for when I get my second 970. I thought about it long and hard... figured I might as well, and two years from now do a brand-new build with the latest tech.

If only I weren't so lazy about adding it to Steam... I'll have to do it some day. :)

Avatar image for deactivated-579f651eab962
deactivated-579f651eab962

5404

Forum Posts

0

Wiki Points

0

Followers

Reviews: 1

User Lists: 0

#144 deactivated-579f651eab962
Member since 2003 • 5404 Posts

@Truth_Hurts_U: It takes 5 seconds, man.

Avatar image for thor_121
thor_121

26

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 5

#145 thor_121
Member since 2014 • 26 Posts

@thehig1: I had a 7850 and rushed into buying the Strix GTX 960 2GB, which I regret. I had the FX-8320, the 7850, and G.Skill Sniper 2x4GB RAM on a shitty Biostar motherboard, and my system couldn't run it at all. So I got the GTX 960, and it was a poor decision, because The Witcher is very demanding and needs the extra VRAM, which CrossFired 7850s won't give you; even if I SLI my 960, it won't run it as well as a 980. So I decided that since my first PC build sucked, I needed a whole new build. I now have the i7-4970K, an MSI Gaming 6 motherboard, the same RAM I had, and a Corsair AX760i PSU. All I need to do now is sell the 960 and get the 980 so I can take advantage of my i7. So my point is: don't waste money on a second 7850 that won't help you out much.

Avatar image for thehig1
thehig1

7537

Forum Posts

0

Wiki Points

0

Followers

Reviews: 2

User Lists: 5

#146 thehig1
Member since 2014 • 7537 Posts

@thor_121: I already had two 7850s at the time I posted that.

Sold them both and bought a 970.

Couldn't justify the extra spend for the 980.