The Order dominates everyone in graphics at Gamescom 2014

#551 Edited by scatteh316 (5021 posts) -

Being able to handle more requests for compute doesn't increase the chip's compute performance. It might slightly increase efficiency, but having 8 ACEs does not increase the amount of work each shader can do, i.e. peak and real-world floating point numbers do not change.

A 7970 might only have 2 ACEs compared to the PS4's 8, but the 7970 would smack the PS4 all over the place when it comes to GPGPU, as it's simply a faster GPU with more cores that can handle compute.
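
Quick back-of-envelope using the commonly quoted shader counts and clocks for these parts (ballpark spec-sheet figures, and the helper below is purely illustrative):

```python
# Peak single-precision throughput = shaders * 2 FLOPs per clock (FMA) * clock.
# Shader counts and clocks are the commonly quoted figures; treat as ballpark.

def peak_tflops(shaders, clock_ghz, flops_per_clock=2):
    return shaders * flops_per_clock * clock_ghz / 1000.0  # GFLOPS -> TFLOPS

gpus = {
    "HD 7970 (2 ACEs)": (2048, 0.925),
    "PS4 GPU (8 ACEs)": (1152, 0.800),
    "HD 7850 (2 ACEs)": (1024, 0.860),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{peak_tflops(shaders, clock):.2f} TFLOPS")

# The ACE count never enters the formula: more ACEs mean more queues feeding
# the shaders, not more shader throughput.
```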

And m3 you're a fucking moron, you have no understanding of anything, and if I want to have a pointless conversation I'll take a dump and talk to whatever comes out of my arse, which would probably still be more intelligent than you.

#552 Edited by m3dude1 (1570 posts) -

herm dismisses a fact, standard. absolutely the dumbest group of gamers who happen to think they know the most. theres a reason gpgpu has gone almost completely unused all these years on pc, and its because of latency wrt gpu <---> cpu communication.

#554 Posted by faizan_faizan (7868 posts) -

@m3dude1: uh, what's a fact?

#555 Edited by Heirren (18799 posts) -

Wait, is this Star Citizen the next big thing for PC gamers? No disrespect, but it looks generic to no end. It reminds me of an iOS Mass Effect knockoff game.

#556 Edited by m3dude1 (1570 posts) -
@faizan_faizan said:

@m3dude1: uh, what's a fact?

that latency with current interconnect tech in pcs makes gpgpu pretty much dead in the water for gaming
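
rough numbers on why that matters (the per round trip cost below is an assumed ballpark for kernel launch + sync + readback over pcie, not a measurement):

```python
# Assumed ballpark cost per CPU <-> GPU round trip (dispatch + sync + small
# readback over PCIe); illustrative, not a measured figure.
ROUND_TRIP_US   = 50.0
FRAME_BUDGET_MS = 1000.0 / 60.0   # 16.67 ms at 60 fps

for trips_per_frame in (1, 10, 50, 100):
    cost_ms = trips_per_frame * ROUND_TRIP_US / 1000.0
    print(f"{trips_per_frame:3d} round trips/frame -> {cost_ms:5.2f} ms "
          f"({100 * cost_ms / FRAME_BUDGET_MS:.0f}% of a 60 fps frame)")

# Gameplay-side GPGPU (e.g. physics whose results the CPU needs every step)
# forces these synchronisations; GPU work whose results stay on the GPU does not.
```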

#558 Posted by 7MDMA (315 posts) -

@scottpsfan14 said:
@m3dude1 said:
@faizan_faizan said:

@m3dude1: uh, what's a fact?

that latency with current interconnect tech in pcs makes gpgpu pretty much dead in the water for gaming

PCs brute force things. And they have hardware PhysX for tasks that GPGPU would normally handle on a console.

Isn't DirectX 12 going to help optimize GPU/CPU latency?

#559 Edited by m3dude1 (1570 posts) -

theres no way to brute force this one. theres a reason why hardware physx has been the same lame shit in every game since nvidia bought the tech. how many years has it been with literally no progress in actual games?

#560 Posted by scatteh316 (5021 posts) -

@scottpsfan14: PCIe controllers are built into the CPU die, so latency will be fine.

Lack of bandwidth for GPU compute in consoles will also stop it dead in the water.

#561 Edited by m3dude1 (1570 posts) -

@7mdma said:

@scottpsfan14 said:
@m3dude1 said:
@faizan_faizan said:

@m3dude1: uh, what's a fact?

that latency with current interconnect tech in pcs makes gpgpu pretty much dead in the water for gaming

PCs brute force things. And they have hardware PhysX for tasks that GPGPU would normally handle on a console.

Isn't DirectX 12 going to help optimize GPU/CPU latency?

i dunno is dx12 going to magically change the physical distances of trace routes on motherboards and change the way pci express operates? if so then it might help.

#562 Posted by 7MDMA (315 posts) -

@m3dude1 said:

@7mdma said:

@scottpsfan14 said:
@m3dude1 said:
@faizan_faizan said:

@m3dude1: uh, what's a fact?

that latency with current interconnect tech in pcs makes gpgpu pretty much dead in the water for gaming

PCs brute force things. And they have hardware PhysX for tasks that GPGPU would normally handle on a console.

Isn't DirectX 12 going to help optimize GPU/CPU latency?

i dunno is dx12 going to magically change the physical distances of trace routes on motherboards and rewrite the way pci express operates? if so then it might help.

Now I'm convinced it will help. Thanks. :D

#564 Posted by 7MDMA (315 posts) -

@scottpsfan14 said:
@7mdma said:

@scottpsfan14 said:
@m3dude1 said:
@faizan_faizan said:

@m3dude1: uh, what's a fact?

that latency with current interconnect tech in pcs makes gpgpu pretty much dead in the water for gaming

PCs brute force things. And they have hardware PhysX for tasks that GPGPU would normally handle on a console.

Isn't DirectX 12 going to help optimize GPU/CPU latency?

It should help. A great deal of the bottlenecks the PC has will no longer be a problem with DX12. Enhancements like 10x the draw calls from the CPU, lower-level access to the GPU and better memory management should all amount to roughly 50% more performance over DX11 in the same game, on the same system, using the same graphics settings, resolution and AA. Assuming the game is coded for DX12 in the first place, of course.

So basically, the main advantage that consoles had was a low level API and single-spec optimizations. With DX12, PC will now have a low level API, though not as low level as a console's, because it still sits on top of Windows GPU drivers. A console can still get everything from the hardware since it's only one spec to work with. But the performance gap between consoles and similarly specced PCs is closing even more with DX12. DX11 doesn't even expose some of the tech in GCN and Kepler GPUs, so there is a great deal of performance lost with DX11.

Thanks mate. Always an interesting, objective poster.

#566 Edited by m3dude1 (1570 posts) -

ummmm the 50% performance increase intel demonstrated was in a specific cpu bound scenario. gpu bound scenarios will not be seeing 50% increase. check gpu bound results in mantle for what you can hope to expect in the best scenarios. dx12 wont be as low level as mantle since it cant be designed specifically by amd for amd. and none of this has any effect on the latency issue ive described above.
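
heres the shape of that argument with made up frame times (illustrative numbers, not benchmarks):

```python
# Invented millisecond figures, purely to show the shape of the argument.
def frame_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)   # frame gated by the slower side, roughly

# CPU bound (the kind of demo Intel showed): 10x cheaper submission is huge.
print(frame_ms(20.0, 12.0), "->", frame_ms(20.0 / 10, 12.0))   # 20.0 -> 12.0 ms

# GPU bound: the GPU was already the limit, so the same API win barely moves it.
print(frame_ms(8.0, 16.0), "->", frame_ms(8.0 / 10, 16.0))     # 16.0 -> 16.0 ms
```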

#568 Edited by m3dude1 (1570 posts) -

the single spec focus is far more of a win than the low level api factor. bf4 on mantle shows the upper range of perf increase you can hope for when gpu limited. again dx12 will have a higher abstraction level than mantle and wont necessarily expose all the hardware functionality of each vendors gpus. its main benefit will be in reducing cpu/driver overhead and allowing for proper multi threading.

i also have concerns about how stable these games will be when the developer is now responsible for proper housekeeping while running code on your pc. ubisoft games....ruh roh

#569 Edited by scatteh316 (5021 posts) -

@scottpsfan14: Consoles are a fixed spec, which is both an advantage and a disadvantage. As developers push the machines with better textures and effects they will also be pushing memory bandwidth utilisation. The more you spend on the graphics side of things, the less you'll have for GPGPU.

That's a benefit of the PC: masses of spare bandwidth for doing things like GPGPU without robbing bandwidth from other intensive tasks.

You can't have both on a console as the specs don't change.
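
Rough illustration with the usual quoted bandwidth figures (176 GB/s for the PS4's shared GDDR5, 264 GB/s for a 7970's dedicated VRAM); how the budget splits between CPU, rendering and compute below is a made-up example, not profiling data:

```python
# 176 GB/s (PS4 shared GDDR5) and 264 GB/s (HD 7970 VRAM) are the usual quoted
# specs; the splits below are invented for illustration, not profiling data.
PS4_SHARED_GBPS = 176.0    # one pool shared by CPU, rendering and GPGPU
HD7970_VRAM_GBPS = 264.0   # dedicated VRAM; system RAM is a separate pool

def left_for_compute(total_gbps, cpu_share, render_share):
    return total_gbps * (1.0 - cpu_share - render_share)

# Example: CPU traffic eats ~15% of the shared pool, rendering ~75%.
print(f"PS4:     ~{left_for_compute(PS4_SHARED_GBPS, 0.15, 0.75):.0f} GB/s left for compute")
# A discrete card spends none of its VRAM bandwidth on CPU traffic and has a bigger pool.
print(f"HD 7970: ~{left_for_compute(HD7970_VRAM_GBPS, 0.00, 0.75):.0f} GB/s left for compute")
```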

And you really have to start seeing through Cerny's bullshit about the PS4's compute performance.

Yes, it has 8 ACEs, so it will be more efficient than the other HD 7000 GPUs when using the GPU for compute tasks, but it won't be any faster than a 7850.

A 1.8 TFLOP GPU with 8 ACEs will not have a higher peak compute performance than a 1.8 TFLOP GPU with 2 ACEs. Maybe slightly better real-world utilisation due to higher efficiency with the 8 ACEs, but you still have bandwidth to think about.

And graphically Resogun is very basic, so the developers could use the spare power and bandwidth for compute. Do you think they could pull off that level of GPU compute in a game that's, say, as graphically complex as The Order?

I know what my answer is.

#572 Posted by shurns (98 posts) -

@Snugenz said:

@shurns said:

@Snugenz said:

Denial about what, you're one of the very few who actually believe the "design choice" bullshit, it is you who's in denial as usual.

Insecurity can play tricks on the mind, especially someone as weak minded as you.

It is a design choice and I don't like their "design choice" either. Whether you like the choice or not is a matter of perspective.

It's a design choice that had to be made because of hardware limitations. To think they'd choose not to have 1080p and every bell and whistle if they had the grunt to do so is naive.

They could clearly have used a different AA method that's less taxing to reach full 1080p on a console. It's clearly a design choice.

#573 Posted by Snugenz (12767 posts) -

@shurns said:

@Snugenz said:

@shurns said:

@Snugenz said:

Denial about what, you're one of the very few who actually believe the "design choice" bullshit, it is you who's in denial as usual.

Insecurity can play tricks on the mind, especially someone as weak minded as you.

It is a design choice and I don't like their "design choice" either. Whether you like the choice or not is a matter of perspective.

It's a design choice that had to be made because of hardware limitations. To think they'd choose not to have 1080p and every bell and whistle if they had the grunt to do so is naive.

They could clearly have used a different AA method that's less taxing to reach full 1080p on a console. It's clearly a design choice.

A necessary choice because of limitations. Contrary to what fanboys believe, the PS4 isn't all-powerful, so in order to get it looking (close to) how they wanted, they had to compromise in places, one of those places being resolution.

#574 Edited by donalbane (16376 posts) -

It does look very good. Ryse looks really good too.

#575 Posted by GarGx1 (3233 posts) -

@m3dude1 said:

the order screen isnt cgi, its 100% realtime. LOL another hermit owned by the order

Dude, that's not the game, it's a pre-rendered cut scene. In the game you see the back of your avatar's head, as in all third-person games. You really need to learn the difference between cut scenes and in-game.

#576 Posted by SoftwareGeek (539 posts) -

@MK-Professor said:
  • 800p - check
  • 30 fps - check
  • black bars - check
  • film grain effect - check
  • lightmaps - check

Yeah, just not a lot to be impressed about really.

#577 Edited by faizan_faizan (7868 posts) -

@m3dude1 said:

herm dismisses a fact, standard. absolutely the dumbest group of gamers who happen to think they know the most. theres a reason gpgpu has gone almost completely unused all these years on pc, and its because of latency wrt gpu <---> cpu communication.

When we say "compute" tasks, we don't just refer to physics, but all compute tasks in general. That includes lighting (dynamic lighting, ambient occlusion, GI etc), post-processing (DOF and the like) and physics.
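
All of those are per-pixel, data-parallel jobs, which is exactly why they map onto compute shaders. Here's a tiny CPU-side sketch of the pattern (a crude depth-of-field-style blend with made-up numbers, not GPU code):

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Crude DOF-style blend on the CPU, just to show the per-pixel, data-parallel
# pattern that compute shaders exploit; all values here are made up.
rng = np.random.default_rng(0)
color = rng.random((720, 1280, 3), dtype=np.float32)   # stand-in for a rendered frame
depth = rng.random((720, 1280, 1), dtype=np.float32)   # 0 = near, 1 = far

blurred = uniform_filter(color, size=(5, 5, 1))         # cheap box blur per channel
focus = 0.3                                             # hypothetical focal depth
blur_amount = np.clip(np.abs(depth - focus) * 2.0, 0.0, 1.0)
dof = (1.0 - blur_amount) * color + blur_amount * blurred

# Every output pixel depends only on its own small neighbourhood - the same
# shape of work as GPU compute for AO, DOF or particle physics.
print(dof.shape, dof.dtype)
```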

@m3dude1 said:

theres no way to brute force this one. theres a reason why hardware physx has been the same lame shit in every game since nvidia bought the tech. how many years has it been with literally no progress in actual games?

Similarly, we have seen real-time path-tracing using OpenCL on PC: http://www.youtube.com/watch?v=BpT6MkCeP7Y

What have consoles brought to the table? In real-time rendering, PC games have always been more innovative.

#578 Posted by m3dude1 (1570 posts) -

@faizan_faizan said:

@m3dude1 said:

herm dismisses a fact, standard. absolutely the dumbest group of gamers who happen to think they know the most. theres a reason gpgpu has gone almost completely unused all these years on pc, and its because of latency wrt gpu <---> cpu communication.

When we say "compute" tasks, we don't just refer to physics, but all compute tasks in general. That includes lighting (dynamic lighting, ambient occlusion, GI etc), post-processing (DOF and the like) and physics.

@m3dude1 said:

theres no way to brute force this one. theres a reason why hardware physx has been the same lame shit in every game since nvidia bought the tech. how many years has it been with literally no progress in actual games?

Similarly, we have seen real-time path-tracing using OpenCL on PC: http://www.youtube.com/watch?v=BpT6MkCeP7Y

What have consoles brought to the table? In real-time rendering, PC games have always been more innovative.

thats all typical graphics work, do you know what gpgpu is?

#579 Posted by Heirren (18799 posts) -

consoles need to go back to custom hardware.

#580 Edited by faizan_faizan (7868 posts) -

@m3dude1 said:

@faizan_faizan said:

@m3dude1 said:

herm dismisses a fact, standard. absolutely the dumbest group of gamers who happen to think they know the most. theres a reason gpgpu has gone almost completely unused all these years on pc, and its because of latency wrt gpu <---> cpu communication.

When we say "compute" tasks, we don't just refer to physics, but all compute tasks in general. That includes lighting (dynamic lighting, ambient occlusion, GI etc), post-processing (DOF and the like) and physics.

@m3dude1 said:

theres no way to brute force this one. theres a reason why hardware physx has been the same lame shit in every game since nvidia bought the tech. how many years has it been with literally no progress in actual games?

Similarly, we have seen real-time path-tracing using OpenCL on PC: http://www.youtube.com/watch?v=BpT6MkCeP7Y

What have consoles brought to the table? In real-time rendering, PC games have always been more innovative.

thats all typical graphics work, do you know what gpgpu is?

Do you know what you're talking about?

http://www.tomshardware.com/reviews/directcompute-opencl-gpu-acceleration,3146-2.html

Also, aren't you the same person who said Crysis 3 doesn't have tessellation? Yeah. Goes to show how much you DO know.

#581 Edited by m3dude1 (1570 posts) -

im referring to non graphics work done on the gpu. and what i said wrt crysis 3 tessellation is that it adds basically nothing to the actual end visual result.

#582 Posted by ButDuuude (566 posts) -

I love my PlayStation 4, but I sense that this game will flop harrrrrd.

#583 Edited by faizan_faizan (7868 posts) -

@m3dude1 said:

im referring to non graphics work done on the gpu. and what i said wrt crysis 3 tessellation is that it adds basically nothing to the actual end visual result.

LOL. It's amazing how you thought you could just change the goal posts and no one would notice it.

Also:

You are a horrible, horrible troll.

#584 Posted by m3dude1 (1570 posts) -

thx for proving my crysis 3 point correct with that comparison. i didnt change any goal posts. doing some graphics work thru direct compute is pretty trivial. latency makes any nontrivial use of gpgpu on pcs dead in the water. do you disagree with this?

#585 Edited by RR360DD (13033 posts) -

Ouch :(

LOL

#586 Edited by faizan_faizan (7868 posts) -

@m3dude1 said:

thx for proving my crysis 3 point correct with that comparison. i didnt change any goal posts.

doing some graphics work thru direct compute is pretty trivial. latency makes any nontrivial use of gpgpu on pcs dead in the water. do you disagree with this?

So, you're implying that you didn't bother to open the images in a new tab? Do you play games at 480p (or whatever the thumbnail size is)?

OK, so I guess you're a programmer. How did you get the compute shader (in DirectX 11) to work with ambient occlusion? What were you trying to do? How did you get around the overhead that you speak of?