Grey_Eyed_Elf's forum posts

#1 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@hardwenzen said:
@Grey_Eyed_Elf said:
@hardwenzen said:
@Grey_Eyed_Elf said:
@hardwenzen said:

More and more pathetic by the day.

1200p

30fps

Way lower graphical settings than PC

No physical versions

If you get this on a console instead of PC, you're clueless.

Clueless?...

So the game running at 1200p 30fps on a 12 TFLOPS console isn't giving you any clues as to how it's going to run on PC?

You do realise that since consoles went x86, console performance is almost a 1:1 indicator of how a game is going to run on PC, meaning you will need a $2k PC to get this thing running at 4K 60fps on high/ultra.

Worse by the day? Oh, how are Cyberpunk and Jedi Survivor running for you? Is that 144Hz monitor being fully utilised?... If it's bad on console, it's really bad on PC.

I'm a PC gamer, but you are just stupid.

You're a PC gamer who doesn't know that there are always one or maybe two graphical settings that can ruin performance, and if you turn them down you get a massive fps boost 🤦‍♂️ You can't do that on consoles, and because devs are always aiming for graphical fidelity to please the masses, if they can use said graphical setting they will, even if their game drops to sub-25fps.

And you're giving me Jedi Survivor as an example, even though the game was literally broken and the fps was always low no matter the settings. Mentioning a broken game doesn't make your example even half valid.

Buddy, this game is running on the Creation engine, heavily updated... It will be broken.

As for your logic about settings: there's a magical setting or two that would make the game run at 60fps, and the developer just wouldn't use it?... In a world where 60fps options are the norm on consoles?

I think you guys laughing at the performance can't seem to connect the dots: just maybe the game will run like dog s*** on everything, which is why I used those two games as an example.

Consoles are no longer custom silicon on some alien-level platform; if a game does not offer 60fps on a PS5 or Series X, get your helmet ready, you're in for a crash course of a PC release.

I am well aware of it running on an engine from seven centuries ago. Bugs are one thing; running poorly is another. Outside of F76, I have played all Bethesda titles since Oblivion, and all of them ran fine on PC. Maxed-out god rays in Fallout 4 are a good example of said magical setting: back in 2015, you turned them down and gained a big fps boost.

The game will be buggy, but it won't have any trouble running at 60fps on PC 🤷‍♂️

Right, well, your brain is clearly switched off.

On PC, all the games running on that engine ran like dog sh**. Fallout 4 required a GTX 980 to run at 60fps at 1440p; on anything weaker you were at 40-50fps at 1080p.

You needed to wait for the 10-series, like I did, to get the game at 4K 60fps.

Current console hardware has no bottleneck and is x86. If a game runs badly on a console like the Series X, as Elden Ring did?... Guess what: the PC equivalent, an RTX 2060, struggled at the same settings and resolution and got the same framerate.

1200p 30fps on a Series X?... Unless you have a 2080 Ti or better, good luck trying to get 60fps at anything above 1440p without running at the console settings you laughed at.
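
To put rough numbers on that last point (my own back-of-envelope, not from DF; assuming "1200p" means 1920x1200 and ignoring the settings gap entirely):

```python
# Back-of-envelope pixel-throughput comparison. Illustrative only: GPU load
# does not scale perfectly linearly with pixels, and PC settings differ.

def pixel_rate(width: int, height: int, fps: int) -> int:
    """Pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

console = pixel_rate(1920, 1200, 30)  # the Series X target quoted above
pc_goal = pixel_rate(2560, 1440, 60)  # a 1440p60 PC target

print(f"Console: {console / 1e6:.0f} MP/s")                 # ~69 MP/s
print(f"PC goal: {pc_goal / 1e6:.0f} MP/s")                 # ~221 MP/s
print(f"Ratio:   {pc_goal / console:.1f}x the throughput")  # ~3.2x
```

Roughly 3.2x the pixel throughput of the console target, before any higher PC settings, which is why a 12 TFLOPS console stuck at 1200p30 implies a much beefier GPU for 1440p60.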

#2 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@hardwenzen said:
@Grey_Eyed_Elf said:
@hardwenzen said:

More and more pathetic by the day.

1200p

30fps

Way lower graphical settings than PC

No physical versions

If you get this on a console instead of PC, you're clueless.

Clueless?...

So the game running at 1200p 30fps on a 12 TFLOPS console isn't giving you any clues as to how it's going to run on PC?

You do realise that since consoles went x86, console performance is almost a 1:1 indicator of how a game is going to run on PC, meaning you will need a $2k PC to get this thing running at 4K 60fps on high/ultra.

Worse by the day? Oh, how are Cyberpunk and Jedi Survivor running for you? Is that 144Hz monitor being fully utilised?... If it's bad on console, it's really bad on PC.

I'm a PC gamer, but you are just stupid.

You're a PC gamer who doesn't know that there are always one or maybe two graphical settings that can ruin performance, and if you turn them down you get a massive fps boost 🤦‍♂️ You can't do that on consoles, and because devs are always aiming for graphical fidelity to please the masses, if they can use said graphical setting they will, even if their game drops to sub-25fps.

And you're giving me Jedi Survivor as an example, even though the game was literally broken and the fps was always low no matter the settings. Mentioning a broken game doesn't make your example even half valid.

Buddy, this game is running on the Creation engine, heavily updated... It will be broken.

As for your logic about settings: there's a magical setting or two that would make the game run at 60fps, and the developer just wouldn't use it?... In a world where 60fps options are the norm on consoles?

I think you guys laughing at the performance can't seem to connect the dots: just maybe the game will run like dog s*** on everything, which is why I used those two games as an example.

Consoles are no longer custom silicon on some alien-level platform; if a game does not offer 60fps on a PS5 or Series X, get your helmet ready, you're in for a crash course of a PC release.

#3 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@hardwenzen said:

More and more pathetic by the day.

1200p

30fps

Way lower graphical settings than PC

No physical versions

If you get this on a console instead of PC, you're clueless.

Clueless?...

So the game running at 1200p 30fps on a 12 TFLOPS console isn't giving you any clues as to how it's going to run on PC?

You do realise that since consoles went x86, console performance is almost a 1:1 indicator of how a game is going to run on PC, meaning you will need a $2k PC to get this thing running at 4K 60fps on high/ultra.

Worse by the day? Oh, how are Cyberpunk and Jedi Survivor running for you? Is that 144Hz monitor being fully utilised?... If it's bad on console, it's really bad on PC.

I'm a PC gamer, but you are just stupid.

#4 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@above_average said:
@Grey_Eyed_Elf said:

1440p 30fps on a £250 console?... I mean no offense, but you forum-dwelling fanboys are just living in a cave if you think a Series S is a bad deal.

You will probably need a £1000 PC to play at 1440p 60fps.

Spoilt and delusional. It's why you guys killed the industry: buying early-access games, asking for better graphics, playing remakes.

DUMB!

Speaking of dumb and living in caves...

The Xbox Series X runs this game at 1296p 30fps, as stated by DF.

The $250 console version of this game won't be anywhere near 1080p.

But don't let that stop you from your heartfelt soapbox sales pitch for the Xbox Series S.

It's called dynamic resolution.

You know, like when Dirt ran at 900p on PS5 with a 1440p dynamic resolution target.
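
For anyone wondering what that means in practice, here's a minimal sketch of a frame-time-driven resolution scaler (my illustration, not any engine's actual code; the thresholds and step size are made up):

```python
# Minimal dynamic-resolution scaler: drop the internal render height when
# the GPU misses its frame budget, raise it back when there's headroom.
# The output/advertised resolution stays at the 1440p target throughout.

TARGET_HEIGHT = 1440
FRAME_BUDGET_MS = 1000 / 30          # 30fps budget
MIN_SCALE, MAX_SCALE, STEP = 0.5, 1.0, 0.05

scale = 1.0

def update_scale(gpu_frame_ms: float) -> int:
    """Return the internal render height for the next frame."""
    global scale
    if gpu_frame_ms > FRAME_BUDGET_MS:           # over budget: fewer pixels
        scale = max(MIN_SCALE, scale - STEP)
    elif gpu_frame_ms < FRAME_BUDGET_MS * 0.85:  # headroom: claw quality back
        scale = min(MAX_SCALE, scale + STEP)
    return round(TARGET_HEIGHT * scale)

# A run of heavy frames drags the internal resolution down, step by step:
for ms in (30, 36, 38, 40, 39, 37):
    print(f"{ms} ms -> rendering at {update_scale(ms)}p")
```

Feed it heavy frames for long enough and it sinks toward the floor; that's how a "1440p" game ends up rendering around 900p in demanding scenes while still being listed as 1440p.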

#5 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

1440p 30fps on a £250 console?... I mean no offense, but you forum-dwelling fanboys are just living in a cave if you think a Series S is a bad deal.

You will probably need a £1000 PC to play at 1440p 60fps.

Spoilt and delusional. It's why you guys killed the industry: buying early-access games, asking for better graphics, playing remakes.

DUMB!

#6 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

I just finished Sanctuary on Netflix; pretty great... It leaves you wanting more, which I hate: each season should end with some sort of conclusion, not a carrot-on-a-stick type of ending.

Either way, great.

@davillain said:

As of now, I'm watching Mandalorian S3 (and at the time of writing this, I'm at the final episode of season 3) and let me tell you, it's been a letdown.

So you can say the writers lost their way?...

#7 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@osan0 said:

@Grey_Eyed_Elf: Yeah, it's a good point. As I alluded to in another post further up: taking completely different systems and boiling it down to saying one is X times more powerful than another is... well... not an exact science.

Any PS4 vs Deck comparison gets even messier, since you are comparing a console to a PC running PC games... and that PC is using Linux and a compatibility layer, which can incur a bit of a performance penalty too. So how much of the performance difference is hardware weakness vs software overhead?

Oh, there are many things that determine performance: the quality of the developer doing the port, whether the game is developed for that hardware first, and then, on top of the hardware differences, the software involved, the OS of that console, and so on.

On PC you can only compare TFLOPS within the same architecture; anything else is dumb.

Console vs console, like PS4 vs X1 or PS5 vs XSX, makes sense due to the SOCs being the same architecture.

But Switch vs anything, or XSX/PS5 vs PC?... It gets complex, and comparing is futile.

It's pointless.

Take a Switch 2 with 1.4 TFLOPS but 2x the memory bandwidth, a wider memory bus, more CPU cores and more VRAM... The difference between what Witcher 3 looks like now at 720p on Switch and a solid 30fps at 1080p on medium settings would be bigger than what the 2x TFLOPS jump from Wii U to Switch gave BOTW: 720p vs 900p at the same settings.

The biggest issues the Switch has are the CPU, memory bandwidth and RAM; the GPU is only part of the problem.
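
The same-architecture point is easy to see with the standard theoretical-FLOPS formula (a sketch; the shader counts and clocks below are the widely published PS4/Xbox One figures):

```python
# Theoretical FP32 throughput = shader cores x 2 ops/cycle (FMA) x clock.
# The number is only meaningful across chips that share an architecture,
# because ops-per-core, scheduling and bandwidth all change between them.

def tflops_fp32(shader_cores: int, clock_ghz: float) -> float:
    return shader_cores * 2 * clock_ghz / 1000

ps4 = tflops_fp32(1152, 0.800)  # GCN, ~1.84 TFLOPS
xb1 = tflops_fp32(768, 0.853)   # GCN, ~1.31 TFLOPS

print(f"PS4: {ps4:.2f} TFLOPS, X1: {xb1:.2f} TFLOPS, ratio {ps4 / xb1:.2f}x")
# Same GCN architecture, so the ~1.4x ratio tracks real-world results.
# Plug a Maxwell (Switch) or RDNA2 (Deck) part into the same formula and
# the ratio stops meaning much, which is exactly the argument above.
```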

#8  Edited By Grey_Eyed_Elf
Member since 2011 • 7970 Posts

Also, a good way to debunk the Switch power and TFLOPS talk all at once... The Wii U is 0.35 TFLOPS and the Switch is 0.768 TFLOPS, right?...

2x the TFLOPS, and you get 900p instead of 720p at the same settings and framerate... in BOTW.
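
The arithmetic behind that, as a quick check (the resolutions and TFLOPS figures are the ones quoted in this thread):

```python
# Doubling TFLOPS only bought BOTW a 720p -> 900p bump, because pixel
# count grows much faster than the resolution label suggests.

wiiu_tflops, switch_tflops = 0.35, 0.768

pixels_720p = 1280 * 720   #   921,600 pixels
pixels_900p = 1600 * 900   # 1,440,000 pixels

print(f"TFLOPS ratio: {switch_tflops / wiiu_tflops:.2f}x")  # ~2.19x
print(f"Pixel ratio:  {pixels_900p / pixels_720p:.2f}x")    # ~1.56x
# ~2.2x the raw compute bought only ~1.6x the pixels at the same settings;
# the rest disappeared into overheads the TFLOPS number doesn't capture.
```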

#9 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@osan0 said:
@silentchief said:
@Mozelleple112 said:

@silentchief: I thought the entire point of measuring flops was 1:1 comparison across hardware.

PS3 and X360 have around 0.25 TFLOPS

Wii U has 0.35 TFLOPS

the Switch has 1 TFLOPS (docked)

X1 has 1.3 TFLOPS

PS4 has 1.84 TFLOPS

PS4 Pro has 4.2 TFLOPS

X1X has 6 TFLOPS

PS5 has 10.52 TFLOPS

XSX has 12 TFLOPS.

If the Switch 2 is 10x faster, that would put it at PS5 level of performance. Not happening imo, even though it shouldn't be unrealistic. Remember, the Switch released four years after the PS4 and the PS4 was still 84% faster.

Yeah, after reading further I don't think the 1 TFLOPS number is accurate. I think it's closer to half that.

It looks like the 1 TFLOPS figure comes from the FP16 measurement with the GPU running at 1GHz (which is what the Tegra X1 does in the Nvidia Shield). The Switch GPU doesn't run at 1GHz, it's only 768MHz docked, and when talking about flops it's FP32 that's referred to. So the Switch is less than half that: around 0.4 TFLOPS docked.

The Tegra X1 specs:

https://en.wikipedia.org/wiki/Tegra

But it should also be noted that using flops as a measure of performance only works when the architecture is the same. So comparing the flops of the PS4 and X1 is fair for relating performance, and comparing a PS5 and XSX using flops is fine.

But using them to measure performance against a Switch holds a lot less weight, as the GPUs are just completely different. Even with the same vendor it doesn't work once the underlying architecture changes; e.g. comparing a PS4 to a PS5 using flops is also on pretty thin ice. RDNA2 (or 1.5... or whatever) is a completely different beast from the GCN 1.0 found in the PS4.

Not to mention the CPU performance and the memory bandwidth.

A good example of the TFLOPS comparison being poor, especially when it comes to handhelds with limited TDP, is the Steam Deck at 1.64 TFLOPS... That puts it at almost PS4 levels, which it is not; it's borderline slower than an Xbox One.
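
osan0's Tegra X1 numbers check out with the same formula (a sketch; the 256 CUDA cores and the 2x FP16 rate are the published Tegra X1 specs):

```python
# Where the "1 TFLOPS Switch" headline comes from: FP16 at Shield clocks
# vs FP32 at the Switch's actual docked clock.

CORES = 256  # Tegra X1 CUDA cores

def gflops(cores: int, ops_per_core_per_clock: int, clock_ghz: float) -> float:
    return cores * ops_per_core_per_clock * clock_ghz

shield_fp16 = gflops(CORES, 4, 1.000)  # 2 FMA ops x 2 (packed FP16) = 4
switch_fp32 = gflops(CORES, 2, 0.768)  # FP32 at the 768MHz docked clock

print(f"Shield FP16 @ 1 GHz:   {shield_fp16:.0f} GFLOPS (the ~1 TFLOPS figure)")
print(f"Switch FP32 @ 768 MHz: {switch_fp32:.0f} GFLOPS (~0.39 TFLOPS docked)")
```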

#10 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

Well, if they are currently working on the Switch 2 and plan on keeping it in the $299-350 price range... chances are it's going to be a high-yield, mass-produced SOC, so the chance of it being based on 4nm Ada Lovelace is SLIM!

There's a very high probability the SOC will be based on 8nm Ampere, and going by RTX 3050 mobile chips, the slowest ones are 40W while a handheld SOC will need to be 12-15W... meaning the slowest 3050 mobile chip cut by a third in performance.

You are looking at a handheld that will be able to play current games at low-medium settings at 30fps... like Elden Ring.

It's not going to change the Switch much, really; all you will get is another round of $60-70 Switch 2 re-releases of games you are playing now, with no 60fps option and lower resolution and quality.

It will be rinse and repeat of what we have now. The only good thing is you will probably be able to play Zelda at 60fps.
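
For what it's worth, the napkin math behind that "cut by a third" guess looks like this (pure illustration on my part; the 12-15W handheld budget is the estimate from the post, and perf-per-watt improving at lower clocks is an assumption, not a measurement):

```python
# Rough power-budget comparison: slowest RTX 3050 mobile vs a plausible
# handheld SOC budget. Performance falls slower than power does, because
# lower clocks run at lower voltage (perf/W improves down the curve).

LAPTOP_TDP_W = 40            # slowest RTX 3050 mobile configs
HANDHELD_TDP_W = (12, 15)    # the handheld budget assumed above

for watts in HANDHELD_TDP_W:
    ratio = watts / LAPTOP_TDP_W
    print(f"{watts} W is {ratio:.0%} of the 40 W part's power budget")
# 30-38% of the power; with the better efficiency at low clocks, that lands
# in the "3050 mobile, heavily cut down" ballpark: low-medium at 30fps.
```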