The Witcher 3 PC Ultra Runs At 1080p 60FPS On GTX 980

#51 blangenakker
Member since 2006 • 3240 Posts

Everyone's like "I wonder if my 970 will run it on Ultra" and I'm just sitting here hoping my 650 Ti Boost will be able to have some things on High.

#52  Edited By chaplainDMK
Member since 2008 • 7004 Posts

@blangenakker said:

Everyone's like "I wonder if my 970 will run it on Ultra" and I'm just sitting here hoping my 650 Ti Boost will be able to have some things on High.

Lol check my Radeon 5770. :D

Kinda sucks that I really can't see myself wasting 800 bucks on a new computer, and quite frankly not even 400 bucks on a new console. Just gonna truck on with my 6-year-old rig till she burns out. Luckily I only have a 1280x1024 monitor, so I can still play most games that aren't top of the line pretty decently.

Gaming is an expensive hobby lol, and I'm really not into it enough right now to see myself wasting that much money on anything.

#53  Edited By BassMan
Member since 2002 • 17811 Posts

@chaplainDMK said:
@blangenakker said:

Everyone's like "I wonder if my 970 will run it on Ultra" and I'm just sitting here hoping my 650 Ti Boost will be able to have some things on High.

Lol check my Radeon 5770. :D

Kinda sucks that I really can't see myself wasting 800 bucks on a new computer, and quite frankly not even 400 bucks on a new console. Just gonna truck on with my 6-year-old rig till she burns out. Luckily I only have a 1280x1024 monitor, so I can still play most games that aren't top of the line pretty decently.

Gaming is an expensive hobby lol, and I'm really not into it enough right now to see myself wasting that much money on anything.

I can think of a lot more expensive hobbies and quicker ways to blow through money. I was in Vegas recently. In one night... nice steak dinner, limo, bottle service at the club... I could have bought a nice video card with the money I spent in one night. I think gaming is very affordable. The entertainment per dollar is unmatched.

#54 chaplainDMK
Member since 2008 • 7004 Posts

@BassMan said:
@chaplainDMK said:
@blangenakker said:

Everyone's like "I wonder if my 970 will run it on Ultra" and I'm just sitting here hoping my 650 Ti Boost will be able to have some things on High.

Lol check my Radeon 5770. :D

Kinda sucks that I really can't see myself wasting 800 bucks on a new computer, and quite frankly not even 400 bucks on a new console. Just gonna truck on with my 6-year-old rig till she burns out. Luckily I only have a 1280x1024 monitor, so I can still play most games that aren't top of the line pretty decently.

Gaming is an expensive hobby lol, and I'm really not into it enough right now to see myself wasting that much money on anything.

I can think of a lot more expensive hobbies and quicker ways to blow through money. I was in Vegas recently. In one night... nice steak dinner, limo, bottle service at the club... I could have bought a nice video card with the money I spent in one night. I think gaming is very affordable. The entertainment per dollar is unmatched.

I went on a 3-week backpacking trip for the money it would cost me to build a new PC :)

#55 thereal25
Member since 2011 • 2074 Posts

@chaplainDMK said:
@blangenakker said:

Everyone's like "I wonder if my 970 will run it on Ultra" and I'm just sitting here hoping my 650 Ti Boost will be able to have some things on High.

Lol check my Radeon 5770. :D

Kinda sucks that I really can't see myself wasting 800 bucks on a new computer, and quite frankly not even 400 bucks on a new console. Just gonna truck on with my 6-year-old rig till she burns out. Luckily I only have a 1280x1024 monitor, so I can still play most games that aren't top of the line pretty decently.

Gaming is an expensive hobby lol, and I'm really not into it enough right now to see myself wasting that much money on anything.

I can relate, but one thing that has really piqued my interest in hardware upgrades is all this new VR stuff that's soon to be released.

Even then it may not be worth it though. Hard to tell.

#56  Edited By RyviusARC
Member since 2011 • 5708 Posts

Day one patch added higher settings for PC.

High settings are now the old Ultra, and the new Ultra is a bump up.

Some benchmarks were released for the game, and it looks like you need at least a 970 to get over 30fps, since even the 290X averages below 30fps with max settings at 1080p.

Glad I have 2 970s so I can max the game at 1440p.

In the settings shown, "hairworks from" means HairWorks disabled and "hairworks to" means HairWorks enabled.

#57 Truth_Hurts_U
Member since 2006 • 9703 Posts

Glad I've got my new G-Sync monitor coming today... Just in time.

#58 ShepardCommandr
Member since 2013 • 4939 Posts

1080p is pleb tier

#59  Edited By Alucrd2009
Member since 2007 • 787 Posts

@RyviusARC: If this is true I'm screwed... fuckkkkkk Nvidia and their stupid shit :((((((

#60 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

Looks like I am playing it on medium-high with no AA.

Expected.

#61  Edited By Coseniath
Member since 2004 • 3183 Posts

Well it seems that this was right...

From Nvidia

  • 1920x1080, Low settings: GTX 960
  • 1920x1080, Medium settings: GTX 960
  • 1920x1080, High settings: GTX 960
  • 1920x1080, Uber settings: GTX 970
  • 1920x1080, Uber settings w/ GameWorks: GTX 980
  • 2560x1440, Uber settings: GTX 980
  • 2560x1440, Uber settings w/ GameWorks: GTX TITAN X, or 2-Way SLI GTX 970
  • 3840x2160, Uber settings: GTX TITAN X, or 2-Way SLI GTX 980
  • 3840x2160, Uber settings w/ GameWorks: 2-Way SLI GTX 980 or GTX TITAN X

Test setup: Intel i7-5960X, 16GB DDR4 RAM, The Witcher 3: Wild Hunt Game Ready GeForce GTX driver.

Also, since there are hardly any sites that have dug into the individual settings:

The Witcher 3: Wild Hunt Graphics, Performance & Tweaking Guide
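
A quick back-of-the-envelope way to read that Nvidia chart: the jump in recommended hardware roughly tracks the jump in pixels rendered. The sketch below is just that arithmetic, under the simplifying assumption that GPU load scales with pixel count (real games only approximate this):

```python
# Rough pixel-count comparison for the resolutions in the chart above.
# Assumes GPU load scales roughly with pixels rendered (a simplification).
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K (3840x2160)":    3840 * 2160,
}

base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

1440p pushes about 1.78x the pixels of 1080p and 4K pushes 4x, which lines up with the chart stepping from a single 970/980 at 1080p up to SLI setups at 4K.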

#62  Edited By RyviusARC
Member since 2011 • 5708 Posts

Looks like my two overclocked 970s, which perform similar to or better than stock 980s, can handle even 4K maxed.

Although I find that hard to believe given the benchmark I posted.

I don't mind anyway since I will be playing at 1440p.

#63 BassMan
Member since 2002 • 17811 Posts

I will most likely be playing at 1440p with AA disabled to get some extra fps.

#64 Gamesterpheonix
Member since 2005 • 3676 Posts

You guys are talking as if a GTX 970 won't be able to run the game lol. I mean, I'm sure there are going to be ways to adjust settings to allow for 60 fps with next to no visual degradation. Tweaking really hasn't even started yet. A lot of PC folks don't even have that kind of power. I've held off on a new build partly because of Witcher 3, but also because of all the games that are becoming VRAM whores. One great thing about all the "downgrades" that result from consoles (especially in this case) is that they help lower the requirements needed to max the game. Without DX12 or Mantle or OpenGL, devs aren't gonna give a crap about reducing VRAM loads or optimizing for PC. Douche bags know they will do better on console. So many games look great on PS4 and could perform equally well on PC if the devs made the effort or cared... but they don't. I have my doubts about CD Projekt Red as well.

#65 digitm64
Member since 2013 • 470 Posts

Just tried with my 780ti. Looks like this card is now obsolete; this game is killing it at Ultra with Hairworks. Getting between 20-45fps at the start at 1440p. All the new games this year will make Kepler a dead horse. I was waiting for the 980ti, but might just have to bite the bullet and pick up a Titan X, since I really want to play this game decently. I heard a rumour that the 980ti will be faster than the Titan X but cheaper. What a dilemma.
I'm kind of cheesed that a 780ti has become such crap all of a sudden.

#66 Coseniath
Member since 2004 • 3183 Posts
@digitm64 said:

Just tried with my 780ti. Looks like this card is now obsolete; this game is killing it at Ultra with Hairworks. Getting between 20-45fps at the start at 1440p. All the new games this year will make Kepler a dead horse. I was waiting for the 980ti, but might just have to bite the bullet and pick up a Titan X, since I really want to play this game decently. I heard a rumour that the 980ti will be faster than the Titan X but cheaper. What a dilemma.

I'm kind of cheesed that a 780ti has become such crap all of a sudden.

Wait for the 980 Ti. Rumors say it might be released soon in order to counter whatever AMD has to offer.

Also, the 980 Ti will have 6GB of VRAM but higher clocks, not to mention that Nvidia will let Asus, MSI, Gigabyte, etc. use custom coolers.

The difference could be big. Imagine 10% higher clocks and another 10% from factory-overclocked cards.
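
For what it's worth, those two gains compound rather than add, so the combined headroom would be a bit over 20%. A tiny sketch of the arithmetic (the 10% figures are the rumored estimates above, not measurements, and real performance rarely scales 1:1 with clock speed):

```python
# Hypothetical compounding of the two rumored gains (estimates from the post above).
reference_gain = 0.10   # rumored 980 Ti clock advantage over the Titan X
factory_oc_gain = 0.10  # additional headroom from factory-overclocked partner cards

combined = (1 + reference_gain) * (1 + factory_oc_gain) - 1
print(f"Combined clock headroom: {combined:.0%}")  # -> 21%
```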

#67 dangamit
Member since 2010 • 664 Posts
@digitm64 said:

Just tried with my 780ti. Looks like this card is now obsolete, this game is killing it at Ultra with Hairworks. Getting between 20-45fps at the start at 1440p. All the new games this year will make Kepler a dead horse. I was waiting for the 980ti, but might just have to bite the bullet and pickup a Titan X, since I really want to play this game decently. I heard a rumour that the 980ti will be faster than the Titan X but cheaper. What a dilemma.

I'm kind of cheesed that a 780ti has become such crap all of a sudden.

SLI 780s here. Playing at 2K with everything on high and getting FPS in the low 50s with Hairworks off. Very disappointing. The game looks nice, but it's nothing revolutionary. GTA looks better AND runs much better.

Nvidia is screwing the 7 series owners to force them to buy 9 series cards. What a terrible business model. I will never again buy an Nvidia card after this slap to the face. The 280X is owning the 780 when they're not even in the same league.

#68  Edited By NeoGen85
Member since 2003 • 4270 Posts

Hrm. I see some new benchmarks out today. Ultra settings including GameWorks features (HBAO and Hairworks) seem to have high-end video cards struggling to maintain 60 frames per second at 1080p:

http://wccftech.com/witcher-3-initial-benchmarks/

#69  Edited By Truth_Hurts_U
Member since 2006 • 9703 Posts

Played Witcher 3 for about 60-70 mins so far. Grass is definitely out of place in the game. Maxed out it runs pretty smooth... I can tell even with G-Sync that it's not high FPS, but at least the stuttering is cut down because of it.

#70 xantufrog  Moderator
Member since 2013 • 17875 Posts

@dangamit said:

Nvidia is screwing the 7 series owners to force them to buy 9 series cards. What a terrible business model. I will never again buy an Nvidia card after this slap to the face. The 280X is owning the 780 when they're not even in the same league.

Nvidia's been kinda shady, I feel like

#71  Edited By SaintSatan
Member since 2003 • 1986 Posts

I'd hope it runs decently after not one but two graphical downgrades. There was the 2013 version that looked pretty awesome. Then in 2014 they released a new build showing some serious downgrades. Then again in 2015 they showed yet another version with even more downgrades.

#72 dangamit
Member since 2010 • 664 Posts

For people running the game at 1440p, disabling AA improves performance by 5 to 15 FPS depending on how graphically demanding the area is.
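
If you want to measure a delta like that rather than eyeball it, the usual approach is to log frame rates over the same section with AA on and then off and compare the averages. A minimal sketch, with made-up placeholder samples standing in for numbers you would capture with an FPS overlay or logger:

```python
# Compare two FPS captures of the same area (placeholder values, not real measurements).
from statistics import mean

fps_aa_on  = [52, 55, 49, 57, 53, 50, 56]   # samples with AA enabled
fps_aa_off = [61, 64, 58, 66, 62, 59, 65]   # samples with AA disabled

avg_on, avg_off = mean(fps_aa_on), mean(fps_aa_off)
print(f"AA on : avg {avg_on:.1f} fps (min {min(fps_aa_on)})")
print(f"AA off: avg {avg_off:.1f} fps (min {min(fps_aa_off)})")
print(f"Average gain from disabling AA: {avg_off - avg_on:.1f} fps")
```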

#73  Edited By Gaming-Planet
Member since 2008 • 21064 Posts

The game is pretty demanding. I ran it with a 960 and it goes below 30 fps at times, although that could be because it needs more optimizing, since I didn't really see anything demanding other than Nvidia HairWorks (I'll probably turn that off on my next run so the frame rate isn't so choppy).

The game looks and plays great at High, and Ultra isn't much of a difference. Kinda wish I had gotten a 970 instead, but I think I'll live for now.

#74 dangamit
Member since 2010 • 664 Posts

@xantufrog said:
@dangamit said:

Nvidia is screwing the 7 series owners to force them to buy 9 series cards. What a terrible business model. I will never again buy an Nvidia card after this slap to the face. The 280X is owning the 780 when they're not even in the same league.

Nvidia's been kinda shady, I feel like

You know, I used to think this whole conspiracy about Nvidia ditching old devices (or even crippling their performance on purpose) was a bunch of crap. But a 960 outperforming a 780? Yeah, I think I should invest in a tinfoil hat.

#75  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@dangamit said:

You know, I used to think this whole conspiracy about Nvidia ditching old devices (or even crippling their performance on purpose) was a bunch of crap. But a 960 outperforming a 780? Yeah, I think I should invest in a tinfoil hat.

Same here.

#76 digitm64
Member since 2013 • 470 Posts

I thought it got delayed so they could do optimisations? There are plenty of rumours flying around that Nvidia drivers are crippling the older Kepler series.

#77  Edited By BassMan
Member since 2002 • 17811 Posts

Nvidia had better boost driver performance for Kepler cards. That shit is not right. A 280X and a 960 should not be beating a 780, and driver performance should reflect that.

#78 navyguy21
Member since 2003 • 17427 Posts

Guess I'll be using a high/ultra combination. Don't think I'll be maxing this one.

#79  Edited By NeoGen85
Member since 2003 • 4270 Posts

I have a Titan X, but after looking at those benchmarks and hearing your comments...we'll see. 6% left on the download. >.<

#80 KHAndAnime
Member since 2009 • 17565 Posts

1440p Ultra for me. But I think I'll hold out for Arkham Knight to satisfy my need for beautiful graphics.

#81  Edited By RyviusARC
Member since 2011 • 5708 Posts

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

#82 NeoGen85
Member since 2003 • 4270 Posts

Keyboard & mouse or controller?

#83  Edited By NeoGen85
Member since 2003 • 4270 Posts

Okay. So here's what I got:

i7-5820K at 4.1GHz

32GB of DDR4 RAM

Titan X

Acer 28" 4K monitor with G-sync

Ultra settings with GameWorks features:

  • 1080p - 60fps
  • 1440p - 55 to 60fps
  • 4K - 30fps

By the way, playing at 1440p is beautiful, but at 4K resolution it's stupid ridiculous!

#84  Edited By BassMan
Member since 2002 • 17811 Posts

@RyviusARC said:

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

1440p, everything maxed (ultra) with no AA, and no vsync, what is the fps range and average?

#85  Edited By Coseniath
Member since 2004 • 3183 Posts
@RyviusARC said:

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

Do not tweak the grass distance!!!! :P

#86 RyviusARC
Member since 2011 • 5708 Posts

@Coseniath said:
@RyviusARC said:

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

Do not tweak the grass distance!!!! :P

I left it at 3 for that setting, and performance was still great along with all the other tweaks listed on the Nvidia page.

I didn't find 6 worth it because the improvement was not as much as moving from 1.5 to 3.
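
For anyone wondering how those beyond-ultra tweaks are applied: they are plain text edits to the game's settings file rather than anything exposed in the menus. Below is a generic, hypothetical sketch of that kind of edit; the file path and the key name GrassDistanceScale are placeholders, so check the tweaking guide linked earlier for the actual file and variable names, and keep a backup.

```python
# Hypothetical example of bumping one value in an INI-style settings file.
# PATH and KEY are placeholders; consult the tweaking guide for the real names.
from pathlib import Path
import shutil

PATH = Path.home() / "Documents" / "The Witcher 3" / "user.settings"  # placeholder path
KEY = "GrassDistanceScale"   # placeholder key name
NEW_VALUE = "3"              # e.g. raising it from 1.5 to 3, as discussed above

shutil.copy(PATH, PATH.with_name(PATH.name + ".bak"))  # keep a backup before editing

lines = PATH.read_text().splitlines()
for i, line in enumerate(lines):
    if line.split("=")[0].strip() == KEY:
        lines[i] = f"{KEY}={NEW_VALUE}"
        break
else:
    lines.append(f"{KEY}={NEW_VALUE}")  # append the key if it isn't present

PATH.write_text("\n".join(lines) + "\n")
```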

#87  Edited By RyviusARC
Member since 2011 • 5708 Posts

@BassMan said:
@RyviusARC said:

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

1440p, everything maxed with no AA, and no vsync, what is the fps range and average?

The game uses post-process AA, which has no effect on frame rate.

Not sure what I get without vsync, but with it I am getting around 50fps, and this is with tweaked settings that go beyond ultra.

I believe 60fps average should be possible with regular ultra settings at 1440p.

#88 BassMan
Member since 2002 • 17811 Posts
@RyviusARC said:
@BassMan said:
@RyviusARC said:

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

1440p, everything maxed with no AA, and no vsync, what is the fps range and average?

The game uses post-process AA, which has no effect on frame rate.

Not sure what I get without vsync, but with it I am getting around 50fps, and this is with tweaked settings that go beyond ultra.

I believe 60fps average should be possible with regular ultra settings at 1440p.

I like to keep 60 fps minimum. I guess I will find out tomorrow.

#89 GoodKingMog
Member since 2015 • 167 Posts

@KHAndAnime said:

Meh. The Witcher games have always been pretty ugly - this one is no exception. I can only imagine the barrage of crap recycled animations we're about to endure

lol KH.... always good for a troll.

#90 ShadowDeathX
Member since 2006 • 11698 Posts

@dangamit said:
@digitm64 said:

Just tried with my 780ti. Looks like this card is now obsolete; this game is killing it at Ultra with Hairworks. Getting between 20-45fps at the start at 1440p. All the new games this year will make Kepler a dead horse. I was waiting for the 980ti, but might just have to bite the bullet and pick up a Titan X, since I really want to play this game decently. I heard a rumour that the 980ti will be faster than the Titan X but cheaper. What a dilemma.

I'm kind of cheesed that a 780ti has become such crap all of a sudden.

SLI 780s here. Playing at 2K with everything on high and getting FPS in the low 50s with Hairworks off. Very disappointing. The game looks nice, but it's nothing revolutionary. GTA looks better AND runs much better.

Nvidia is screwing the 7 series owners to force them to buy 9 series cards. What a terrible business model. I will never again buy an Nvidia card after this slap to the face. The 280X is owning the 780 when they're not even in the same league.

wtf???

0.o

#91  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@goodkingmog said:
@KHAndAnime said:

Meh. The Witcher games have always been pretty ugly - this one is no exception. I can only imagine the barrage of crap recycled animations we're about to endure

lol KH.... always good for a troll.

And unfortunately I'm right again. The game releases and many people are really disappointed that the graphics didn't live up to their expectations. Happened with Witcher 2 too. The pre-release screenshots were not very impressive, and the final game was downgraded even from those. I mean, we can pretend the game looks great...but c'mon. It really doesn't. Reminds me of an MMORPG graphically. The grass has that gross brightened pasty look past 10m away...reminds me of Oblivion or something.

#92 FelipeInside
Member since 2003 • 28548 Posts

@KHAndAnime said:
@goodkingmog said:
@KHAndAnime said:

Meh. The Witcher games have always been pretty ugly - this one is no exception. I can only imagine the barrage of crap recycled animations we're about to endure

lol KH.... always good for a troll.

And unfortunately I'm right again. The game releases and many people are really disappointed that the graphics didn't live up to their expectations. Happened with Witcher 2 too. The pre-release screenshots were not very impressive, and the final game was downgraded even from those. I mean, we can pretend the game looks great...but c'mon. It really doesn't.

No you're wrong again:

"The game releases and many people are really disappointed that the graphics didn't live up to their expectations"

- No, people are disappointed that CD Projekt showed one graphics build, but released the game with another

"Happened with Witcher 2 too."

- Witcher 2 looked better with each footage and even better with the final game

"I mean, we can pretend the game looks great...but c'mon. It really doesn't."

- You need glasses

#93 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts

@RyviusARC said:
@BassMan said:
@RyviusARC said:

Game was running so well maxed at 1440p that I forced some settings beyond ultra.

1440p, everything maxed with no AA, and no vsync, what is the fps range and average?

The game uses post-process AA, which has no effect on frame rate.

Not sure what I get without vsync, but with it I am getting around 50fps, and this is with tweaked settings that go beyond ultra.

I believe 60fps average should be possible with regular ultra settings at 1440p.

It does use post-process AA, but it's CD Projekt's own solution, and it does have a noticeable hit. I've actually found it to be a pretty good AA solution, definitely better than crap like FXAA or TXAA.

#94  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@FelipeInside said:
@KHAndAnime said:
@goodkingmog said:
@KHAndAnime said:

Meh. The Witcher games have always been pretty ugly - this one is no exception. I can only imagine the barrage of crap recycled animations we're about to endure

lol KH.... always good for a troll.

And unfortunately I'm right again. The game releases and many people are really disappointed that the graphics didn't live up to their expectations. Happened with Witcher 2 too. The pre-release screenshots were not very impressive, and the final game was downgraded even from those. I mean, we can pretend the game looks great...but c'mon. It really doesn't.

No you're wrong again:

No, people are disappointed that CD Projekt showed one graphics build, but released the game with another

What difference does it make? None - it's the same difference: people are disappointed with Witcher 3's graphics. People thought the graphics were going to look better than what was delivered. You're referring to older screenshots that used a better graphics build. The most recent screenshots released reflected the current state of the game's graphics.

- Witcher 2 looked better with each footage and even better with the final game

If you think so. The trailers and pre-release screenshots looked great but when the game released, many people (including myself) were laughing at the animations and NPC detail quality. Very poor.

- You need glasses

If my vision was blurrier, wouldn't that mean lower-detail objects would appear equivalent to higher-detailed objects? Maybe I'm not the one who needs glasses. If your vision is blurry and there are no low-detail elements that pop out to you, that's fine. The low-quality elements of this game's GFX stick out to me like a sore thumb - and have long before launch.

I'm not sure how I can be objectively wrong about something that's subjective. I just think the animations look blatantly clunky and many elements of the graphics look dated. If you disagree, that's fine. I'm sure the game's fine, but if the game wasn't ugly or disappointing, nobody would really be pointing out downgrades from the earlier builds in the first place. There are parts of the game that look great, but as a whole, visually, it's a little weak for a 2015 title. Often good graphics are enough alone to convince me to pick up titles. Especially considering I'd like to "benchmark" the beauty of my new monitor - but Witcher 3 never really convinced me.

Don't want to slight the game though...sure it's great. Just not my taste.

#95 FelipeInside
Member since 2003 • 28548 Posts

@KHAndAnime said:
@FelipeInside said:
@KHAndAnime said:
@goodkingmog said:
@KHAndAnime said:

Meh. The Witcher games have always been pretty ugly - this one is no exception. I can only imagine the barrage of crap recycled animations we're about to endure

lol KH.... always good for a troll.

And unfortunately I'm right again. The game releases and many people are really disappointed that the graphics didn't live up to their expectations. Happened with Witcher 2 too. The pre-release screenshots were not very impressive, and the final game was downgraded even from those. I mean, we can pretend the game looks great...but c'mon. It really doesn't.

No you're wrong again:

No, people are disappointed that CD Projekt showed one graphics build, but released the game with another

What difference does it make? None - it's the same difference: people are disappointed with Witcher 3's graphics. You're referring to older screenshots that used a better graphics build. The most recent screenshots reflected the current state of the game's graphics.

- Witcher 2 looked better with each footage and even better with the final game

If you think so. The trailers and pre-release screenshots looked great but when the game released, many people (including myself) were laughing at the animations and NPC detail quality. Very poor.

- You need glasses

If my vision was blurrier, wouldn't that mean lower-detail objects would appear equivalent to higher-detailed objects? Maybe I'm not the one who needs glasses. If your vision is blurry and there are no low-detail elements that pop out to you, that's fine. The low-quality elements of this game's GFX stick out to me like a sore thumb - and have long before launch.

I'm not sure how I can be objectively wrong about something that's subjective. I just think the animations look blatantly clunky and many elements of the graphics look dated. If you disagree, that's fine. I'm sure the game's fine, but if the game wasn't ugly or disappointing, nobody would really be pointing out downgrades from the earlier builds in the first place. There are parts of the game that look great, but as a whole, visually, it's a little weak for a 2015 title.

There's a difference between people being unhappy with the downgrade issue and being unhappy with the graphics.

I haven't seen one person complain that the game looks ugly (except you, of course); the game looks great. What people are disappointed about is that it could have looked even better (and they were shown better).

#96  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@FelipeInside said:
@KHAndAnime said:

There's a difference between people being unhappy with the downgrade issue and being unhappy with the graphics.

I haven't seen one person complain that the game looks ugly (except you, of course); the game looks great. What people are disappointed about is that it could have looked even better (and they were shown better).

I'm not necessarily saying the game is outright ugly. Just visually disappointing. It doesn't excel, that's for sure.

You don't really have to look around hard to find people being disappointed with the graphics. Even in the other review round-up thread, the guy in the YouTube video says in some cases it looks worse than The Witcher 2. How would that not be disappointing? Some of the people complaining about the graphics being downgraded are also complaining that the graphics are generally disappointing - not completely mutually exclusive groups.

#97 GoodKingMog
Member since 2015 • 167 Posts

@KHAndAnime said:
@FelipeInside said:
@KHAndAnime said:
@goodkingmog said:
@KHAndAnime said:

Meh. The Witcher games have always been pretty ugly - this one is no exception. I can only imagine the barrage of crap recycled animations we're about to endure

lol KH.... always good for a troll.

And unfortunately I'm right again. The game releases and many people are really disappointed that the graphics didn't live up to their expectations. Happened with Witcher 2 too. The pre-release screenshots were not very impressive, and the final game was downgraded even from those. I mean, we can pretend the game looks great...but c'mon. It really doesn't.

No you're wrong again:

No, people are disappointed that CD Projekt showed one graphics build, but released the game with another

What difference does it make? None - it's the same difference: people are disappointed with Witcher 3's graphics. People thought the graphics were going to look better than what was delivered. You're referring to older screenshots that used a better graphics build. The most recent screenshots released reflected the current state of the game's graphics.

- Witcher 2 looked better with each footage and even better with the final game

If you think so. The trailers and pre-release screenshots looked great but when the game released, many people (including myself) were laughing at the animations and NPC detail quality. Very poor.

- You need glasses

If my vision was blurrier, wouldn't that mean lower-detail objects would appear equivalent to higher-detailed objects? Maybe I'm not the one who needs glasses. If your vision is blurry and there are no low-detail elements that pop out to you, that's fine. The low-quality elements of this game's GFX stick out to me like a sore thumb - and have long before launch.

I'm not sure how I can be objectively wrong about something that's subjective. I just think the animations look blatantly clunky and many elements of the graphics look dated. If you disagree, that's fine. I'm sure the game's fine, but if the game wasn't ugly or disappointing, nobody would really be pointing out downgrades from the earlier builds in the first place. There are parts of the game that look great, but as a whole, visually, it's a little weak for a 2015 title. Often good graphics are enough alone to convince me to pick up titles. Especially considering I'd like to "benchmark" the beauty of my new monitor - but Witcher 3 never really convinced me.

Don't want to slight the game though...sure it's great. Just not my taste.

so much troll... lol

#98  Edited By FelipeInside
Member since 2003 • 28548 Posts

@KHAndAnime said:
@FelipeInside said:
@KHAndAnime said:

There's a difference between people being unhappy with the downgrade issue and being unhappy with the graphics.

I haven't seen one person complain that the game looks ugly (except you, of course); the game looks great. What people are disappointed about is that it could have looked even better (and they were shown better).

I'm not necessarily saying the game is outright ugly. Just visually disappointing. It doesn't excel, that's for sure.

You don't really have to look around hard to find people being disappointed with the graphics. Even in the other review round-up thread, the guy in the YouTube video says in some cases it looks worse than The Witcher 2. How would that not be disappointing? Some of the people complaining about the graphics being downgraded are also complaining that the graphics are generally disappointing - not completely mutually exclusive groups.

I doubt it looks worse than Witcher 2 in parts, but I'll have to judge for myself when I get to play it this week.

I know one thing for sure: just watching HD gameplay videos (even after the downgrade), it's one of the best-looking RPGs on the market (like Witcher 1 and Witcher 2 were at the time). CD Projekt didn't push the tech as far as they could this time, but they still came out with a beautiful product (not to mention that I'm sure the game will be great, since 1 and 2 were).

And this "Often good graphics are enough alone to convince me to pick up titles", lol.... said like a true console player.

#99  Edited By KHAndAnime
Member since 2009 • 17565 Posts

@goodkingmog said:

so much troll... lol

Ignorance is bliss I guess. When people are saying Witcher 2 looks better in many aspects, well...that's just kinda lame. I'm not going to harp on it though.

#100 Malta_1980
Member since 2008 • 11890 Posts

Hmm, guess I'll end up building a new PC towards the end of the year...