FFS please be excited for Far Cry 6

#101 strategyfn
Member since 2012 • 1182 Posts

Lol. From what I gather, the whole early atmosphere is that you’re a poor South American who has to improvise and scrounge materials.

Playing as a woman is nice. The adaptive triggers are great too; it really feels like you are pulling the trigger on a rusty gun.

#102  Edited By BassMan
Member since 2002 • 17886 Posts
@Gatygun said:
@BassMan said:
@Gatygun said:
@BassMan said:

If you have a shit CPU... sure. Any decent 8C/16T CPU should be fine for RT. My 9900K is 3 years old and it has not been a bottleneck.

That's based on a 9900K. Every CPU has this problem, because that's just how RT functions.

The only people who don't have this issue either have weak GPUs or sit at high resolutions, where RT simply hits the GPU bottleneck before the CPU. If you have a 3080 and play at 1080p/1440p, CPU bottlenecks will happen in titles big time.

The Far Cry 6 benchmark video showcases exactly this: low usage on the GPU because the CPU gets hit.

RT demands more CPU resources, but I have never run into a CPU bottleneck when using RT. I usually play at 3440x1440 or 4K. I have also played at 1440p with RT. Who uses their 3080 at 1080p? That is just silly. You would only use a 3080 at 1080p for competitive gaming on a 240/360Hz monitor. As for Far Cry 6, the game has shit CPU optimization and is hammering a single thread. This is independent of RT and is an issue with the game engine.

I don't know what you are rambling about with resolutions, but you clearly have no clue how RT works.

Nobody with an Nvidia GPU uses RT without DLSS or FSR. If you do, performance tanks massively, or the game has some seriously botched AMD version like Deathloop or Far Cry 6 that basically can't even be called RT to begin with, as it looks worse than plain prebaked lighting at that point.

What DLSS does is lower the internal resolution and upscale it with AI; FSR does it without AI. That means when you sit at 4K with DLSS Performance you are basically rendering at 1080p, and with Quality at 1440p.
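A minimal sketch of the preset math being described, assuming the commonly cited DLSS 2.x per-axis scale factors; the factors and the helper function below are illustrative, not taken from the post:

```python
# Hypothetical sketch of the DLSS render-resolution math described above.
# Per-axis scale factors are the commonly cited DLSS 2.x values (assumption).
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output size and preset."""
    scale = DLSS_SCALE[preset]
    return round(out_w * scale), round(out_h * scale)

# 4K output: Performance renders internally at ~1920x1080, Quality at ~2560x1440,
# which is the 1080p/1440p point made in the post.
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```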

This is why I made the 1080p/1440p statement: without DLSS, an actual RT game like The Ascent, Cyberpunk 2077, or Control will take a massive performance hit down to unplayable framerates.

Control, for example, with RT enabled runs at ~45 fps at 3440x1440; with DLSS Quality it's ~80 fps. Why? A far lower internal resolution.

Go play Cyberpunk at 4K with RT on and see how that holds up without DLSS. It ain't happening. So your whole "you only play at high resolutions" line is not really the reality, is it? You would have known this if you had actually used RT in your life, and I highly doubt you have.

So, back to my original statement that you somehow disagree with, namely your claim that RT doesn't hit CPU performance and that a decent modern mid-range CPU will be fine. Let me show you how it's not fine, and how not a single CPU on the market can run RT at high refresh rates, because we need far faster CPUs to do this.

Here you go.

[Embedded benchmark video]

1:25 = no RT + DLSS, ultra settings = 140 fps at 80% GPU usage at 3440x1440 = CPU bound, as the GPU isn't fully utilized

2:20 = RT at 3440x1440 without DLSS = 30-45 fps = 100% GPU bound

3:07 = RT + DLSS Quality = 53 fps = CPU bound, as the GPU is at 78% utilization

So, CPU bottlenecked with RT, it's 53 fps and can't even hit 60. How is that possible? RT destroys CPU performance. You see this in every game, so no, it's not that this specific title is badly optimized. And with DLSS it's easy to see, even at high resolutions, how well your CPU holds up.
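A rough sketch of the rule of thumb those three readings rely on: with an uncapped frame rate, GPU utilization well below full suggests the CPU side is the limiter. The ~95% threshold and the helper below are assumptions for illustration:

```python
# Illustrative rule of thumb, not a rigorous profiler: with an uncapped frame rate,
# GPU utilization well below ~95% suggests the CPU/engine side is the limiter.
# The threshold and the tidied-up numbers below are assumptions for illustration.
def likely_limiter(gpu_util_pct: float, threshold: float = 95.0) -> str:
    return "GPU bound" if gpu_util_pct >= threshold else "CPU/engine bound"

# The three readings quoted above (3440x1440): (label, fps, GPU utilization %)
samples = [
    ("no RT + DLSS, ultra", 140, 80),
    ("RT, no DLSS",          38, 100),   # quoted as 30-45 fps
    ("RT + DLSS Quality",    53, 78),
]
for label, fps, util in samples:
    print(f"{label}: {fps} fps @ {util}% GPU -> {likely_limiter(util)}")
```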

This is exactly what you see with Far Cry 6: only 70 fps with a 3090 at a lower resolution, yet it runs 60 fps at 4K. Since 4K is far more demanding than 1080p, the fps gain at the lower resolution should be far higher, right? Not really, because it's CPU bound.
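The same Far Cry 6 argument as back-of-the-envelope arithmetic; treating frame rate as scaling with pixel count when GPU bound, and assuming the lower resolution is 1080p, are both simplifications for illustration:

```python
# Back-of-the-envelope check using the figures quoted above. The "expected" number
# assumes fps scales with pixel count when purely GPU bound, and that the lower
# resolution is 1080p. Both are simplifying assumptions, not measured data.
fps_4k, fps_low = 60, 70
pixels_4k, pixels_low = 3840 * 2160, 1920 * 1080
gpu_bound_estimate = fps_4k * pixels_4k / pixels_low   # ~240 fps if the GPU were the limit
print(f"GPU-bound estimate at the lower resolution: {gpu_bound_estimate:.0f} fps, actual: {fps_low} fps")
# 70 fps is nowhere near pixel-count scaling, which is the post's argument for a CPU limit.
```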

This is why we need a drastic improvement in CPU performance, more than GPU performance, at this point in time, and why anybody saying even a potato CPU can run every game without problems is wrong. Yeah, sure, without RT; add RT into the mix and that potato CPU is dead.

LOL, I have owned a 2080 Ti since launch (actually owned two) and a 3080 since launch.

I know how RT and DLSS work, and of course I use DLSS because I want the higher frame rate. I am still not outputting at 1080p in any scenario, regardless of DLSS (and its lower internal res) or not. Like I said, you are only going to use a 3080 at 1080p for 240/360Hz competitive gaming, and RT is not even an option at that point, as you want as high a frame rate as possible with the least amount of system lag.

The examples that you show are not showing full GPU usage or CPU usage. This is due to shit optimization and not the games being CPU bound. So, your cry for better CPUs for RT is silly, as they are not even being fully used as is. Of course a CPU can bottleneck a game at a lower resolution, but you have yet to show me an example of a game outputting at 1440p or higher with RT enabled (DLSS or not) that is being bottlenecked by the CPU and not by shit optimization. Also, I never said that RT does not hit the CPU, as it has to work harder to feed both the CUDA cores and the RT cores. Modern CPUs have the headroom to do this.

#103  Edited By Gatygun
Member since 2010 • 2709 Posts
@BassMan said:

(post #102, quoted in full above)

1) The fact that you don't know about the heavy CPU hit with RT means you have never played around with the settings, let alone have any experience with RT on PC. It is instantly noticeable to everybody who got onto higher-performing RT cards and DLSS 2.0, at higher and lower resolutions, and who wants to push to higher framerates with a more aggressive DLSS preset, until they realize they have hit the CPU wall.

2) The fact that you don't know DLSS renders at lower resolutions is, again, you not having experience with it.

3) The fact that you don't understand how RTSS works, even though it is the main benchmarking overlay used in practically every PC benchmark, is again you not having experience with the matter.

4) The fact that you don't know that CPUs are never utilized to 100%, and that IPC is far more important and is the main reason people upgrade their CPU hardware (because nobody is going to design their games around high-end PC CPUs or around more than x cores, which makes IPC king), is also you not knowing how the PC market works. See the sketch below.
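A minimal sketch of the point in item 4, with made-up per-thread loads: aggregate CPU usage can look low while a single saturated thread caps the frame rate.

```python
# Hypothetical illustration of the "CPUs are never at 100%" point: a game that pins
# one thread while the rest idle reports a low aggregate CPU usage even though the
# frame rate is fully limited by that one thread. The loads below are made up.
per_thread_load = [100] + [15] * 15          # e.g. an 8C/16T part, one saturated thread
aggregate = sum(per_thread_load) / len(per_thread_load)
print(f"aggregate CPU usage: {aggregate:.1f}%")   # ~20%, yet the game is CPU bound
# Because the limit is one saturated thread, per-core throughput (IPC x clock) matters
# far more than the headline utilization percentage.
```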

You have no argument. Showcasing a bunch of hardware is cool and all, but you clearly have no clue how things work, or you do and you reach for every casual argument you can imagine to prop up an already lost point, because the point is proven by simple reality.

Your statement that nobody uses 1440p with a 3080, even though practically every person on the planet with higher-end hardware probably sits at that resolution, because 4K 60 fps is to this day still a crapshoot, especially if DLSS isn't in the picture and RT is introduced, or if you find 60 fps acceptable (the absolute minimum for high-end gaming), is just you not knowing how the market functions.

It's obvious from your posts.

Anyway

I showed you an example of a game where the CPU gets absolutely decimated on IPC performance (i.e. single-core performance, for every core; you probably don't even understand what that means, even though it's the main metric for CPU game performance on PC).

The Ascent shows 120 fps minimums (or whatever it was in the video) without RT; with RT and aggressive DLSS usage, which lowers the render resolution way down, you will never get to 100 fps no matter how fast your GPU is, because the CPU gets decimated on core-for-core performance, i.e. IPC. That is the main problem RT adds.

Even if developers optimized 100% for a 9900K and slammed all of its cores to 100%, that gain would also apply to the non-RT scenario, which means RT would again take a big hit on CPU performance, as it simply eats up a metric ton of cycles.

No matter how you want to spin it, you can test this out yourself incredibly easily in every single RT game that isn't a potato implementation like Deathloop.

This is why I stated that 70% GPU usage on a 3090 means nothing when the GPU is being bottlenecked by the CPU. This is also why, even at higher resolutions like 3440x1440, DLSS in many games doesn't add much beyond the Quality preset even with RT, because the CPU will be your main problem, unless RT cripples the GPU to such an extent that you never hit the CPU threshold where it bottlenecks.

Cyberpunk at 4K is a good example of this, where it absolutely destroys performance.

In short: RT kills CPU performance, and that's just the reality. I showed you proof; hell, the whole channel is riddled with videos featuring RT that show this, and you can test it yourself without effort if you have the games.

My arguments are supported by facts and by reality; yours show a lack of knowledge on the subject, to the point that it's not even worth my time anymore. Good luck.

#104  Edited By BassMan
Member since 2002 • 17886 Posts

@Gatygun: Your reading comprehension is terrible and you make false claims and assumptions and label them as facts. Your attempts to inform me on how things work are hilarious and make you come off as an arrogant idiot. Listening to you talk in your video only reinforces that assessment.