SuperMighty XSX MonsterBox will cure all that ails you. lol :P
Clearly it is very capable with the power of the cloud. Certain things can only be cured by the secret sauce SSD that Sony has though. Gotta love the console hype. LOL
Since launch the sauce is now 100% mooaar secret so you know it's some sort of alien tech. lol :P
Lmao, gotta give it to Nvidia: stuff like DLSS and DLSS 3.0 just works.
Even if they gave up on PhysX and stuff before, I don't think they ever overpromised unrealistic stuff :)
@BassMan: your $5k GPU is a complete and utter piece of trash compared to what AMD and Nvidia have in their labs: equipment worth millions of dollars, hundreds of millions, even billions. I'm talking about full quantum chips doing things we can't even imagine. Your trash computer isn't worth jack in Nvidia's, AMD's, or Intel's eyes. Let that sink in for a minute.
This is why context is important, and why a $500 Xbox >>> your trash PC.
@BassMan: you do know that those consumer GPUs you have access to aren't even the top-tier GPUs that AMD and Nvidia have cooking, right?? Heck, I'm 100% sure AMD and Intel have stronger CPUs that they don't release to anyone. Ones that hit 10 GHz with 100 cores on a single chip, cooled with an LN2 loop or something that gets even colder. Forget about that, it's likely the researchers even have quantum chips running some kind of visuals that are pretty much reality. Maybe there's even a FULL VR system, running at 100 fps, that they hook up to lab rats using brain electrodes, with full sensations of touch, taste, smell, etc. This kind of thing is probably worth hundreds of millions to the manufacturers, and doesn't get released to the public.
So what's my point then?? Your 3080, 3090, the 6950, the 4090, 4090 Ti, 7950: it's all just leftover trash compared to what they really have. You're arguing over leftover garbage scraps of UE5 while the manufacturers have already made an alternative reality, fully working in their labs. Also, the price point ALWAYS matters. If you disregard the price point, where retail PC GPUs can reach upward of $5,000, then don't disregard those $100 million setups the manufacturers keep in their labs.
This is why a $500 Xbox >>> your trash computer. Let that sink in.
LOL, I made the edit. Point is, why compare a $5,000 PC with a $500 Xbox?? Does that make sense? Or why compare a $50 million hardware setup with a $5,000 PC? It's the same thing.
@BassMan: and you seem to have problems with context not just in console vs. PC, but even in the handheld space. There's freaking context everywhere, yet you always seem happy to throw all that out the window and shill for the PC. It's just a joke.
What unrealistic stuff did they promise? All the tech demos and stuff they reveal always ran on real hardware.
I've said before that so far, everything Nvidia came up with became standard/mainstream. Just look at RT as a recent example: both AMD and Intel are starting to offer it. The problem is that Nvidia likes to keep stuff locked to their hardware. PhysX failed because it was proprietary, not because of what it did in essence. Pretty much all games now use GPU compute for physics, which is basically what PhysX was.
@blaznwiipspman1:
Lmao... I'll give you credit for this. You're probably in the top 10 funniest posters on this board. In a league with hardwenzen, mowgly1, secretpolice, kvallyx, pedro, goldenelementalx, techhog and a few more.
Edit... almost forgot ghosts
@pc_rocks: Most of the tools have not changed much from UE4. The main draw is Nanite and Lumen. Lumen is the real performance killer as is any real time path/ray tracing. However, if we are going to make the switch to more dynamic and simulated worlds, we need major engines like UE5 to take on that burden. Games can still be built with traditional baking and faking with UE5 though. So, it is not like devs have to adopt the entire feature set to make a game in the engine.
I'm not a fan of proprietary stuff: G-Sync, DLSS, PhysX...
AMD does a way better job on that, even if FSR isn't as good or as future-proof as DLSS.
I guess the only unrealistic stuff they overpromised was saying their product launches in some markets with not a single card to be had, all of them going into Chinese mining farms straight off the factory line ;)
Oh, I agree it's not good. These corporations don't understand that a rising tide lifts all boats. They just want to cling to their power, whether it makes sense or not. However, I'm 100% sure that if the tables were reversed, AMD would start to have proprietary stuff and Nvidia would do open APIs. That's just how it is, unfortunately. I mean, when AMD sponsors a game or partners with a dev, they only support their own tech, not Nvidia's or anything else, while when Nvidia does, they don't restrict it.
Nanite and Lumen aren't something I would call game changers. That's more of a marketing thing. As I said, there are other engines with better renderers and performance. I mean, Nanite only works with static geometry, and Lumen is nothing but another version of SVOGI, which accumulates over several frames.
For those wondering, Linus just ran the demo on the Xbox Series S.
PS5 was a bit better:
Nice try, but HZD demolishes everything on the little xbox.
@pc_rocks: They are indeed game changers. Not having to build lighting all the time and having it be dynamic in the scene is huge. Also, Nanite helps with asset budgets and LOD optimizations.
Easy there, pal. Tech demos don't make the game. It may look like it will ease development, especially for artists, but we don't know if it will be performant or something the programmers will like. Blueprint also appears to be great for artists and 'no code' visual scripting, but it can easily lead you down the wrong path before you realize it. Good for prototyping and small games with solo devs, but not so much for a big project. I mean, UE is notorious for shader-compilation stuttering issues, especially since DX12/Vulkan.
@pc_rocks: Blueprints and C++ work together. You don't have to pick one or the other and only stick to that one. Choose the method that is best suited for what you or the team is trying to accomplish. Nanite is indeed very performant and opens up new possibilities. It is not a gimmick.
Here's some news: Epic is working on UE 5.1, which improves Nanite and Lumen performance, allowing 60 fps targets on consoles and PC.
Okay... Sure.
Not going to say Series S is more powerful, but it's not as bad as you're making it out. :-S
Nice one posting a tiny picture where you can't see the Google Maps-like detail.
@IMAHAPYHIPPO: Yeah I reckon everyone wants to be first to "maximize" UE5, then we'll start seeing some moderation and efficiency.
I'm looking forward to both sides of it. My PC is pretty good, can probably run it close to max, but I'd take better performance and good looks over mediocre performance and great looks.
@hardwenzen: Except DF said it’s more or less the same as the Series X version at a lower resolution.