Microsoft patents client-side processing by splitting rendering between cloud and local

#1 Edited by Daniel_Su123 (995 posts) -

http://www.freepatentsonline.com/10159901.html

It seems that this patent is a combination of both of the technologies shown below. This is all consistent with rumours of a streaming device where local input, like controls and collisions, is handled on the device itself, while the rest of the game runs in the cloud to deliver graphics and images to the device.
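
To illustrate the idea (this is just my own toy sketch in Python, not anything from the actual patent text; the names and numbers are all made up):

```python
# Toy sketch of the rumoured split: the device keeps input handling and simple
# collision locally so controls feel instant, while the heavy rendering happens
# remotely and comes back as finished frames.

from dataclasses import dataclass

@dataclass
class Player:
    x: float = 0.0
    y: float = 0.0

WALLS = [(5.0, 0.0, 6.0, 10.0)]  # axis-aligned boxes: (min_x, min_y, max_x, max_y)

def apply_input_locally(player: Player, dx: float, dy: float) -> Player:
    """Runs on the device every frame: move the player, reject moves into walls."""
    nx, ny = player.x + dx, player.y + dy
    for (x0, y0, x1, y1) in WALLS:
        if x0 <= nx <= x1 and y0 <= ny <= y1:
            return player          # collision resolved locally, no round trip needed
    return Player(nx, ny)

def render_in_cloud(player: Player) -> str:
    """Stand-in for the remote renderer: receives the latest state, returns a frame."""
    return f"<frame rendered at ({player.x:.1f}, {player.y:.1f})>"

# One simulated tick: input and collision resolve instantly on the device,
# the (slower) cloud frame arrives whenever the network delivers it.
p = Player()
p = apply_input_locally(p, dx=1.0, dy=0.0)   # feels immediate
frame = render_in_cloud(p)                   # tolerates network latency
print(p, frame)
```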

Credit: https://www.resetera.com/threads/microsoft-patents-client-side-processing-by-splitting-rendering-between-cloud-and-local.92294/#post-16715007

#2 Posted by npiet1 (1450 posts) -

That's pretty cool. It should solve some of the latency issues with input lag. It will be like playing an NES: it takes some getting used to if you haven't played one in a while, but the lag isn't big enough to make it a horrible experience.

#3 Edited by R4gn4r0k (29827 posts) -

Maybe it's the future, but I'm totally not interested in streaming.

#4 Posted by Daniel_Su123 (995 posts) -
@R4gn4r0k said:

Maybe it's the future, but I'm totally not interested in streaming.

Traditional console gamers likely aren't; however, the harsh reality is that console gaming will likely become less and less of a focus for these companies.

#5 Posted by navyguy21 (15141 posts) -

That's cool, much better than traditional streaming.

I tried PS Now with the free trial and the latency was insane, even more so since most games are 30fps.

Given this advancement, I wonder what the limitations on resolution will be.

Seems like 1080p+ should be a given.

#6 Edited by SecretPolice (34345 posts) -

Nintendont and Phony are doomed. :P

#7 Posted by jun_aka_pekto (24998 posts) -

This seems fine for those with fast broadband. What about those with shitty ones?

#8 Posted by dzimm (5133 posts) -

@jun_aka_pekto said:

This seems fine for those with fast broadband. What about those with shitty ones?

Imagine watching streaming video with a less than perfect connection when the image suddenly becomes all blocky and smeary. That's what it's going to be like playing games on a streaming service.

#9 Posted by The-A-Baum (1233 posts) -

@dzimm said:
@jun_aka_pekto said:

This seems fine for those with fast broadband. What about those with shitty ones?

Imagine watching streaming video with a less than perfect connection when the image suddenly becomes all blocky and smeary. That's what it's going to be like playing games on a streaming service.

They have it working on as low as 4G. That is most cell phones. If your internet is slower than a cell phone, that sucks man!

#10 Posted by michaelmikado (39 posts) -

OMG no! These two methods seem like the worst possible way to handle this process.

In the Kahawai method, the client essentially renders a low-res version of the game and overlays it with a high-res layer rendered on a remote server. This is fine for mobile games, which it seems to be aiming for, but falls flat with the level of interactivity and physics we would expect from next-gen games. When we are talking about modern games, we have high-complexity models with dynamic physics. Here's a good article on garbage hitboxes and why they ruin gameplay. A good hitbox will match the character's poly model as closely as possible, unless it deviates for some specific design purpose. Objects will (should) interact in accordance with what you see on screen.
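
Roughly, as I understand the Kahawai-style delta approach (my own toy illustration, not their actual implementation), the client renders a cheap version of each frame and the server only ships the difference:

```python
# Toy version of "low-detail local render + high-detail delta from the server".
# Frames are just flat lists of 0-255 pixel values.

def clamp(v: int) -> int:
    return max(0, min(255, v))

def server_delta(high_detail: list[int], low_detail: list[int]) -> list[int]:
    """Server: render both versions, ship only the per-pixel difference."""
    return [h - l for h, l in zip(high_detail, low_detail)]

def client_composite(low_detail: list[int], delta: list[int]) -> list[int]:
    """Client: cheap local frame + received delta ~= the expensive frame."""
    return [clamp(l + d) for l, d in zip(low_detail, delta)]

low  = [10, 20, 30, 40]     # what the phone/console can render itself
high = [12, 25, 28, 90]     # what the server rendered for the same frame
assert client_composite(low, server_delta(high, low)) == high
```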

We already have that same complaint in recent Far Cry comparisons, where the environments and physics are garbage.

https://www.dsogaming.com/articles/far-cry-2-features-more-advanced-physics-than-far-cry-5-despite-being-released-10-years-ago/

Putting a mobile game in an HD wrapper will only exacerbate the problem.

Now, Outatime's solution seems to understand that attempting to split development, or to reduce the complexity of a game and throw lipstick on a pig, isn't the best game-development model (although it's viable in today's market). Rather, their solution is to keep the traditional means of development but replicate possible next-frame scenarios multiple times in the cloud and present the correct frame based on user input. While recognizing that you cannot necessarily split the processing for gameplay, the resources required to render a game would be roughly 4x what a single system would need: one render for each of what it claims would be the four possible predicted frames. It does address incorrect predictions, but that seems like it would be a problem in particularly hectic games, and ineffective there.
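
As I read the Outatime approach, the flow is something like this (again, just my own rough sketch, not their code; the four inputs and frame strings are made up):

```python
# Speculative-frame sketch: the server renders a frame for each input it thinks
# you might press next, and the client shows whichever one matches what you
# actually pressed. A mispredict forces a normal full-latency round trip.

POSSIBLE_INPUTS = ["up", "down", "left", "right"]   # ~4x the render work per frame

def server_speculate(state: str) -> dict[str, str]:
    """Server: render the next frame once per predicted input."""
    return {i: f"<frame: {state} then {i}>" for i in POSSIBLE_INPUTS}

def client_pick(speculated: dict[str, str], actual_input: str, state: str) -> str:
    """Client: show the matching speculative frame, or fall back to a slow re-render."""
    frame = speculated.get(actual_input)
    if frame is None:                       # misprediction: pay the full latency
        return f"<frame: {state} then {actual_input} (re-rendered after round trip)>"
    return frame                            # hit: shown with no extra round trip

frames = server_speculate("standing at spawn")
print(client_pick(frames, "left", "standing at spawn"))    # predicted: instant
print(client_pick(frames, "jump", "standing at spawn"))    # not predicted: slow path
```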

My opinion on split and local processing:

The best use of split/cloud processing (IMO) would be to utilize a high-performance system for rendering high-level physics and calculations, based on the proximity of objects to the user.

I.e., the player and most of the scene surrounding that user would be fully rendered on the local machine. Rasterization could be used on dynamic objects in close proximity, and I would say cloud-based ray tracing would be better suited for distant and non-dynamic elements. MS seemed to try to do that with the Xbox, but it didn't appear that their tools were fleshed out enough to make it viable.
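
Something like this is what I mean, as a bare-bones sketch (my own idea, not anything MS has actually shipped; the distance cutoff and object list are made up):

```python
# Partition the scene by distance from the player: near/dynamic objects go to
# the local GPU, distant mostly-static geometry goes to the cloud, which can
# tolerate a little latency.

from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance: float      # metres from the player
    dynamic: bool        # does it move / react to physics?

NEAR_THRESHOLD = 50.0    # made-up cutoff

def split_scene(objects: list[SceneObject]):
    local, remote = [], []
    for obj in objects:
        # near or dynamic -> local rasterization; far and static -> cloud ray tracing
        (local if obj.dynamic or obj.distance < NEAR_THRESHOLD else remote).append(obj)
    return local, remote

scene = [
    SceneObject("player character", 0.0, True),
    SceneObject("enemy", 12.0, True),
    SceneObject("crate", 30.0, False),
    SceneObject("city skyline", 800.0, False),
    SceneObject("mountain range", 2000.0, False),
]
local, remote = split_scene(scene)
print("render locally:", [o.name for o in local])
print("ray trace in the cloud:", [o.name for o in remote])
```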

Here's a good Intel article on cloud ray tracing that they showed off over 10 years ago:

https://software.intel.com/sites/default/files/m/d/4/1/d/8/Cloud-based_Ray_Tracing_0211.pdf

https://software.intel.com/en-us/articles/tracing-rays-through-the-cloud/

#11 Posted by dzimm (5133 posts) -

@the-a-baum said:
@dzimm said:
@jun_aka_pekto said:

This seems fine for those with fast broadband. What about those with shitty ones?

Imagine watching streaming video with a less than perfect connection when the image suddenly becomes all blocky and smeary. That's what it's going to be like playing games on a streaming service.

They have it working on as low as 4G. That is most cell phones. If your internet is slower than a cell phone, that sucks man!

Oh, I'm sure they have it working. Streaming video services are also designed to work in low bandwidth situations. They just reduce the quality until it looks like shit.

#12 Posted by pyro1245 (4417 posts) -

Damn. Sucks they got a patent.

If it does turn out to be a game changer it could stifle the industry for a while.

Heh. Patents suck unless you own them :)

#13 Edited by dzimm (5133 posts) -

@pyro1245 said:

Damn. Sucks they got a patent.

If it does turn out to be a game changer it could stifle the industry for a while.

Heh. Patents suck unless you own them :)

It's not that patents suck. The problem is that most things that receive patents really shouldn't. In order for an idea to be eligible for a patent, it must be specific, novel, and non-obvious to someone with the relevant technical knowledge. Unfortunately, the people who approve patents don't have the necessary technical knowledge to determine whether a particular idea meets those criteria. The modern patent system is a mess because of it.

#14 Posted by I_P_Daily (9945 posts) -

Streaming you say? Well, I say **** that.