Google Stadia Reveal. Another 'reality check' for Next Gen specs.

#1 loco145
Member since 2006 • 12226 Posts

So, the first next gen 'console' is here. And even with the unlimited power of the cloud we got the following specs:

  • Custom 2.7GHz hyper-threaded x86 CPU with AVX2 SIMD and 9.5MB L2+L3 cache
  • Custom AMD GPU with HBM2 memory and 56 compute units, capable of 10.7 teraflops
  • 16GB of RAM with up to 484GB/s of performance
  • SSD cloud storage

We don't have much info on the CPU, but the GPU info is damning. As Eurogamer said, it's practically the power of a Vega 56. Everything points towards sub-1080 Ti/Vega 64 power for next-gen consoles.

Source.

#2 Pedro  Online
Member since 2002 • 69466 Posts

Next gen is going to be another "mid-gen" upgrade.

#3  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@loco145: I don't think you understand. That's a single GPU instance. A player could use 5 if needed

They even showed this in action in the conference.

An interesting after-effect - I wonder if developer support for this will unintentionally jumpstart SLI/Crossfire support.

#4 loco145
Member since 2006 • 12226 Posts

@xantufrog said:

@loco145: I don't think you understand. That's a single GPU instance. A player could use 5 if needed

But Google didn't even give a hint about how much their service will cost. For how much do you think Google will lease you 5 Vega 56 GPUs? And how does the existence of SLI/XF have any bearing on next-gen console specs? From the conference, 10.7 TF is the base spec of Stadia, with an added nod of "we can do SLI, btw!" Remember that 4xPS3 tech demo? It's the same thing.

#5 pmanden
Member since 2016 • 2927 Posts

Not too excited about this "next-gen console". I'll stick to my Xbox One X for the foreseeable future.

#6  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@loco145 said:

And how does the existence of SLI/XF have any bearing on next-gen console specs?

I didn't say it would? I was musing that support which is basically dead on PCs might return if code bases start adapting to work with multiple GPUs. But I'm not sure if that's true or not - I suspect that for this platform to work, it actually has multiple instances masquerade as a single uberGPU, and handles parallelization of that task between the actual instances completely on its end.

@loco145 said:

From the conference, 10.7 TF is the base spec of Stadia, with an added nod of "we can do SLI, btw!" Remember that 4xPS3 tech demo? It's the same thing.

No. It's not the same thing and that's not what they said. They were very clear - it's an enormous server farm. Each GPU instance in the farm is a 10.7 TF custom GPU, but as many can be recruited as needed for the processing task at hand. They then went on to show this in action, comparing the water physics that could be pushed at 60FPS with one instance vs multiple instances.

You're right, they didn't say how much it will cost, but there's no way it's going to be like "renting" 5 Vega 56s. It's probably going to be more like the following: X/month for the 1080p 60FPS package. 2X/month for the 4K 60FPS package. Beaucoup bux/month for the >4K >60FPS package.

It's a server farm. You aren't renting a discrete computer in some person's shack

#7  Edited By loco145
Member since 2006 • 12226 Posts

@xantufrog: My take is that the CPU will work that way, hence no info about the number of cores. But they were very explicit in giving the GPU specs. I'm sure they don't have GPU virtualization, as there doesn't exist a GPU that can manage several instances of 10.7 TFLOPS, and high-throughput memory pooling would be a nightmare.

#8 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

As above, and I'm sure they can switch out GPUs at will. Need to know the recommended Mbps though; mine is around 60-70.

#9  Edited By xantufrog  Moderator
Member since 2013 • 17875 Posts

@loco145 said:

@xantufrog: I'm sure they don't have GPU virtualization, as there doesn't exist a GPU that can manage several instances of 10.7 TFLOPS.

What? What does "a GPU that can manage several instances of 10.7 TFLOPS" even mean?

It's a server. I use them every day. I run one program and submit its processes to run in parallel over - wait for it - multiple GPUs. It enables enormous volumes of computation to be run in the same timeframe that one of those GPUs alone would take.

Google is a leader in machine learning, AI, and server technology. If my research institute can do it, so can they.
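
To make the pattern concrete, here is a rough Python sketch (purely illustrative, and nothing to do with Google's actual stack) of fanning independent jobs out across several GPUs on one machine; the GPU count and the placeholder workload are assumptions:

# Fan independent jobs out across GPUs; each job runs alone on its assigned
# device, so no GPU-to-GPU communication is ever needed.
import os
from concurrent.futures import ProcessPoolExecutor

NUM_GPUS = 4  # assumed number of devices in the node

def run_job(args):
    job_id, gpu_id = args
    # Pin this worker to one GPU; a real workload (CUDA/OpenCL program,
    # simulation, training run) would be launched here. A dummy sum stands in.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    result = sum(i * i for i in range(100_000))
    return job_id, gpu_id, result

if __name__ == "__main__":
    jobs = [(job, job % NUM_GPUS) for job in range(16)]  # round-robin assignment
    with ProcessPoolExecutor(max_workers=NUM_GPUS) as pool:
        for job_id, gpu_id, _ in pool.map(run_job, jobs):
            print(f"job {job_id} finished on GPU {gpu_id}")

The key point is that the jobs never talk to each other, which is exactly why this kind of workload spreads across GPUs so easily.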

#11 True_Link
Member since 2019 • 243 Posts

So, this is not a console but a streaming service? Well, fine by me as long as traditional gaming is still a thing, options are good.

#12 xhawk27
Member since 2010 • 12183 Posts

@Pedro said:

Next gen is going to be another "mid-gen" upgrade.

I laugh when fanboys say that next year the PS5 is going to be 14Tflops and only cost $400.

#13  Edited By ellos
Member since 2015 • 2532 Posts

Actually, that kinda verifies that next-gen consoles will be more powerful than that. They should at least be more powerful than a single stack. Sure, consoles are usually powered down to save power, but a single stack usually needs to be even more efficient at that. So at least this tells us they will pass the 10.7 TF, just a bit maybe lol.

#14  Edited By loco145
Member since 2006 • 12226 Posts

@xantufrog: Ah, so you're saying SLI/XF would be transparent to game devs? Sorry, but if it were that easy, Nvidia/AMD would have done it ages ago. Unlike other parallel tasks such as AI, graphics rendering is very sensitive to memory latency. There's a reason they showed their multi-GPU capabilities on a specially coded demo (one that already supports SLI/XF) and not on Assassin's Creed Odyssey.

As far as I know, the technology to do graphics rendering over several GPUs acting as one doesn't exist. And if Google invented it, they failed to mention it at GDC 2019.

#15 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@tom-pepper said:
@Random_Matt said:

As above, and I'm sure they can switch out GPUs at will. Need to know the recommended Mbps though; mine is around 60-70.

They mentioned 25 Mbps on Eurogamer.

That's for Google Stream, which ran at 1080p/60... for 4K/60 you will need a lot more, not to mention that if there is anyone watching Netflix at the same time you're f***ed.
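
As a rough back-of-the-envelope estimate (not Google's numbers, just scaling the quoted 25 Mbps figure by pixel count in Python):

# Naive bitrate scaling from 1080p/60 to 4K/60. Real encoders don't scale
# linearly with pixel count and a better codec could bring this down a lot.
base_mbps = 25                      # quoted 1080p/60 bitrate for Google Stream
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
estimate = base_mbps * pixels_4k / pixels_1080p
print(f"naive 4K/60 estimate: {estimate:.0f} Mbps")   # ~100 Mbps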

#16  Edited By Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@loco145 said:

@xantufrog: Ah, so you're saying SLI/XF would be transparent to game devs? Sorry, but if it were that easy, Nvidia/AMD would have done it ages ago. Unlike other parallel tasks such as AI, graphics rendering is very sensitive to memory latency. There's a reason they showed their multi-GPU capabilities on a specially coded demo (one that already supports SLI/XF) and not on Assassin's Creed Odyssey.

As far as I know, the technology to do graphics rendering over several GPUs acting as one doesn't exist. And if Google invented it, they failed to mention it at GDC 2019.

They spoke about being able to stream at 4K/60 while gaming at 4K/60, so I would assume that is what they meant by using more than one system. But for rendering the game, it's pretty clear the 10.7 TF card is what they will be using; otherwise they would have made it very clear that it will use however many of those systems it takes to get the game running at 4K/60.

#17 deactivated-642321fb121ca
Member since 2013 • 7142 Posts
@Grey_Eyed_Elf said:
@tom-pepper said:
@Random_Matt said:

As above, and I'm sure they can switch out GPUs at will. Need to know the recommended Mbps though; mine is around 60-70.

They mentioned 25 Mbps on Eurogamer.

That's for Google Stream, which ran at 1080p/60... for 4K/60 you will need a lot more, not to mention that if there is anyone watching Netflix at the same time you're f***ed.

**** all people will be streaming 4K then.

#20  Edited By Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@ellos said:

Actually, that kinda verifies that next-gen consoles will be more powerful than that. They should at least be more powerful than a single stack. Sure, consoles are usually powered down to save power, but a single stack usually needs to be even more efficient at that. So at least this tells us they will pass the 10.7 TF, just a bit maybe lol.

Not really... consoles are APU-based systems and have to contend with the heat produced by both the CPU and GPU being on one die and cooled by one solution, as opposed to a server farm that has separate cooling for the CPU and GPU, with heavy cooling and monitoring of the temperatures in the room and of the hardware with high-RPM fans.

It's not just power draw that affects a console's specifications, it's the heat, especially when it comes to frequencies, which directly correspond to the TFLOP count of the GPU.

A good example: the X1X has 6 TFLOPS from a 40 CU GPU... The RX 580 is a 6.1 TFLOP GPU but with only 36 CUs. Why does the X1X have fewer TFLOPS with a higher CU count?... Frequencies, heat & TDP!

10-11 TFLOPS is the BEST they can do with consoles for next generation unless Navi drastically improves TDP from current 7nm Radeon VII levels.
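
For reference, those TFLOP numbers fall straight out of CU count times clock. A quick Python check, assuming the commonly cited clocks of ~1172 MHz for the X1X GPU and ~1340 MHz boost for the RX 580:

# FP32 TFLOPS for a GCN GPU = CUs * 64 shaders/CU * 2 ops/clock * clock(MHz) / 1e6
def gcn_tflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1e6

print(f"X1X    (40 CU @ 1172 MHz): {gcn_tflops(40, 1172):.1f} TFLOPS")  # ~6.0
print(f"RX 580 (36 CU @ 1340 MHz): {gcn_tflops(36, 1340):.1f} TFLOPS")  # ~6.2

The clock difference alone closes the gap despite the X1X having more CUs, which is the frequency/heat/TDP trade-off being described.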

#21  Edited By Tigerbalm
Member since 2017 • 1118 Posts

Google said AC trial was 1080p 60fps, but I think DF said it was 30fps output.

#22 kuu2
Member since 2005 • 12062 Posts

So another streaming service.

Sony is going to be left out in the cold.

Watch.............

#23 Howmakewood
Member since 2015 • 7702 Posts

Apparently done with Google's connection... from the DF video.

#24 OniLordAsmodeus
Member since 2010 • 381 Posts

How much will it cost though?

Seems cool enough for single player games (with a bit of lag which will most likely be manageable). But I foresee a bit more lag for shooters and what not, and quite a bit more for fighting games.

I'm a fighting game guy, so this will more than likely be a pass for me.

#25 kuu2
Member since 2005 • 12062 Posts

I thought cloud gaming was a figment of MSoft's imagination?

Also, be very leery of Google. They abandon projects faster than any big company I have ever seen.

#26 sonic_spark
Member since 2003 • 6195 Posts

The problem with game streaming hasn't changed since online gaming became a thing. What bandwidth and overall speed do I need to have a smooth experience? At this point in time, it won't be consistently as smooth as actual hardware. And that's before considering that most people likely share an internet connection in their house.

I do like going across various devices, though. Neat idea and implementation.

#27 xantufrog  Moderator
Member since 2013 • 17875 Posts

@loco145 said:

@xantufrog: Ah, so you're saying SLI/XF would be transparent to game devs? Sorry, but if it were that easy, Nvidia/AMD would have done it ages ago. Unlike other parallel tasks such as AI, graphics rendering is very sensitive to memory latency. There's a reason they showed their multi-GPU capabilities on a specially coded demo (one that already supports SLI/XF) and not on Assassin's Creed Odyssey.

As far as I know, the technology to do graphics rendering over several GPUs acting as one doesn't exist. And if Google invented it, they failed to mention it at GDC 2019.

You're so busy being upset by this that you couldn't properly read my post.

First, I said "cool, this has Crossfire-like tech. I wonder if that means games coded to support that will lead to a bleed-over, whereby we get better crossfire support on PC games since devs are doing it anyway for Google's platform"

Then, I mused "or, maybe that won't be the case because I imagine they'll try to handle parallelization on the server side, such that multiple gpus can be used without a game code change"

I don't know if that second option is true or not. I didn't say it was. But I definitely do think it's possible, for the record. You're overselling the mystique behind this. As I already said, it's not very difficult for me to submit processes in parallel to run on multiple GPUs. Mind you, I'm doing this at the code level. But what Google could do to implement that without explicit SLI/Crossfire support would be to intervene at the render call and distribute the computations over multiple GPUs. This, to me, would imply a CPU-intensive task, which a server is perfect for.

The journalists don't know the answer or the limits, so I won't pretend to. The guy from PCGamesN said it well:

The fact that the Stadia system being in the datacentre means it is going to be able to allocate multiple GPUs to a gaming instance is mind-boggling, and assumedly means not just two, but potentially as many GPUs as are needed to offer the level of gaming performance the end user needs for a particular game. Maybe it will just need to use a single one of the 10.7 TFLOP GPUs, or maybe it's greedy and needs three.

Quite how Google and AMD have been able to create this server-side CrossFire analogue is obviously beyond me, or I'd be working at Google and not about to be made redundant in my position as a PC hardware journalist.

There is simply no way Google is talking about 4k 60 or even 8k if they are limited to a single 10.7TF GPU. It's just not going to happen.

#28  Edited By michaelmikado
Member since 2019 • 406 Posts

@loco145 said:

So, the first next gen 'console' is here. And even with the unlimited power of the cloud we got the following specs:

  • Custom 2.7GHz hyper-threaded x86 CPU with AVX2 SIMD and 9.5MB L2+L3 cache
  • Custom AMD GPU with HBM2 memory and 56 compute units, capable of 10.7 teraflops
  • 16GB of RAM with up to 484GB/s of performance
  • SSD cloud storage

We don't have much info on the CPU, but the GPU info is damning. As Eurogamer said, it's practically the power of a Vega 56. Everything points towards sub-1080 Ti/Vega 64 power for next-gen consoles.

Source.

I've already talked about this multiple times, but 90% of future game streaming services will be running on the newly released V340.

https://www.amd.com/en/products/professional-graphics/radeon-pro-v340

https://www.tomshardware.com/news/amd-radeon-pro-dual-vega-v340,37694.html

The GPU specs, down to the 56 CUs and even the bandwidth, divide completely evenly across these cards.

AMD officially launched it in August with release in Q4 of 2018... the same time Google announced its service and released its public demos.

AMD has also had this beauty out in the wild since October.

https://community.amd.com/community/radeon-pro-graphics/blog/2018/11/13/amd-server-cpus-gpus-the-ultimate-virtualization-solution

It has the exact same specs down to using the Epyc 7601 dual processor.

https://www.amd.com/en/products/cpu/amd-epyc-7601

The idea would be that they could scale cores per game. But the greater point is that the server above has 4 V340s for a total of eight 56-CU GPUs, and 64 cores/128 threads. Anyway, these are also the servers that have been going into AWS datacenters and that PSNow now uses. That being said, it's almost a lock that a PS5 will be 10.2-10.7 TFLOPS, 16GB VRAM and 8 cores. This way, if they virtualized them, they could run PS5 games via streaming and actually share the same servers as Google, much like video streaming services share the same servers. I actually anticipate the servers will get an upgrade once Zen 2 Epyc releases, which I expect will coincide with the PS5 as well. I anticipate Google upgrading to 3.2GHz CPUs by Q1 2020.

Man this thing is beautiful.
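
A quick sanity check on how one of those servers might carve up, using the numbers above; the per-user allocation is my assumption, not something AMD or Google has confirmed:

# One of the AMD reference servers linked above: 2x Epyc 7601 + 4x Radeon Pro V340.
# Each V340 is two Vega 56 GPUs sharing 32 GB of HBM2, so one GPU slice per user
# (assumed) lines up with the published Stadia instance spec.
cores = 2 * 32                # dual Epyc 7601 = 64 cores / 128 threads
gpu_slices = 4 * 2            # four V340 cards, two Vega 56 GPUs each
vram_per_slice = 32 // 2      # 16 GB HBM2 per GPU slice
instances = gpu_slices        # one slice per game instance (assumption)
print(f"{instances} instances per server, {cores // instances} cores and "
      f"{vram_per_slice} GB VRAM each")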

#29  Edited By loco145
Member since 2006 • 12226 Posts

@xantufrog: MS talked about 4K/60fps for their XBX and delivered for games like GeoW4 and Forza 7. And it's not like Google is overhyping their stuff. Also, you are not reading what I'm saying. I know what can be done with parallelization. Distributing something like stochastic gradient descent batches has way looser memory latency requirements than rendering pipelines. I can do it on my home PC over my 10 cores, my 2 (non-symmetric) GPUs and even my Raspberry Pi cluster right now.

On the other hand, while giving their presentation on raytracing in Metro Exodus, the developers said that inter-shader communication was a limit in doing global illumination, and that indirect illumination was faster because they could localize computations, thereby optimizing the cache memory of each shader! It doesn't matter how custom the Stadia servers are, GPU-to-GPU communication is going to be several orders of magnitude slower than intra-GPU data sharing! Real-time graphics rendering is a much different beast than most other parallel computing tasks.
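
To illustrate that point about loose latency requirements: a toy data-parallel SGD step only synchronizes once per batch, so slow links between devices barely matter. A minimal NumPy sketch (arrays standing in for separate GPUs, invented sizes):

# Each simulated "device" computes a gradient on its own slice of the batch;
# the only communication is one gradient average per step.
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(3)                                        # shared model weights
X, y = rng.normal(size=(64, 3)), rng.normal(size=64)   # tiny synthetic dataset
devices = np.array_split(np.arange(64), 4)             # 4 simulated devices

for step in range(10):
    grads = []
    for idx in devices:                   # local forward/backward on each device
        err = X[idx] @ w - y[idx]
        grads.append(2 * X[idx].T @ err / len(idx))
    w -= 0.05 * np.mean(grads, axis=0)    # one all-reduce (average) per step
print("final weights:", w)

A rendering pipeline, by contrast, would need shaders to exchange data many times within a single 16 ms frame, which is where cross-GPU latency bites.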

#30 DaVillain  Moderator
Member since 2014 • 56094 Posts

Cloud stuff. It looks cool but with the data caps and throttling situation for unlimited plans in the United States, I can't really get excited about this.

#31 michaelmikado
Member since 2019 • 406 Posts

@xantufrog:

This isn't splitting the rendering across different GPUs. Google will be using V340s, which are pretty much just a bunch of Vega 56s stuck together: anyone claiming they are using multiple discrete GPUs per instance doesn't know what they are talking about at all. Yes, their servers have multiple GPUs, but it's not stitching and pooling resources across them, it's dividing up the resources of a V340.

#32 loco145
Member since 2006 • 12226 Posts

@michaelmikado: Ah, so they are using their Infinity Fabric stuff to sew the Vega 56s together? That's a good find!

#33 michaelmikado
Member since 2019 • 406 Posts

@loco145:

Well yes and no.

The V340 is literally two Vega 56 cards stuck together with a pool of 32GB of HBM2. Google's specs are them literally just dedicating one card to an instance, with half the VRAM. No SLI or Infinity Fabric needed. Any claims of something "new" in that vein are just marketing magic. Honestly, it's both very mundane and insane that such a server exists.

#34 pyro1245
Member since 2003 • 9397 Posts
@davillain- said:

Cloud stuff. It looks cool but with the data caps and throttling situation for unlimited plans in the United States, I can't really get excited about this.

Google would have to throw fat sacks of money at all the ISPs to keep them from throttling. I think Netflix was/is doing that.

Too bad the old guard ISPs gave Google so much trouble that they weren't able to effectively roll out fiber everywhere they wanted to.

#35  Edited By BassMan  Online
Member since 2002 • 17806 Posts

My concern is for the quality of the experience... input lag, compression, frame time consistency, higher frame rates, ultrawide support, mods, etc. However, this talk of being able to scale the processing power to the developer's needs has me intrigued. If the Vulkan API can scale seamlessly to leverage the distributed computing, then we may really be seeing the "power of the cloud" for the first time. We will have to wait and see. Imagine stringing together even 10 of these Stadia modules for over 100 TFLOPS of processing power.... Physics, ray tracing, etc. Oh my, I am getting hard now. LOL

#36 speedytimsi
Member since 2003 • 1415 Posts

When your list of partners includes Unreal lol

Hahahahahaha!

#37 michaelmikado
Member since 2019 • 406 Posts

@BassMan:

Performance-level GPU pooling isn't possible at this point for real-time graphics rendering and gaming. Virtualization of GPUs into smaller instances is just taking off. Possibly in the future we will have transparent pooled virtual GPUs. Right now we can't even get SLI working properly and transparently with GPUs connected directly next to each other.

#38 IMAHAPYHIPPO
Member since 2004 • 4196 Posts

I only watched the last few minutes of the presentation. Somebody help me out here. So, it's not a physical console, it's a streaming service. So where are these specs coming from? Are they the devices running at Google that power the games being streamed? Does your subscription come with a device sent to you, like a cable box? I'm super confused.

#39  Edited By Grey_Eyed_Elf
Member since 2011 • 7970 Posts

Yeah... It does seem like the 10.7 TFLOP GPU spec is all one user will receive to render games:

"UP TO 4K" gives it away. I am guessing it will depend on the game and how demanding it is, and/or your own internet connection, which will determine the resolution, but they seem to be targeting 60FPS.

When companies are this vague on the details, it's never a good thing.

#40 michaelmikado
Member since 2019 • 406 Posts

@Grey_Eyed_Elf:

That’s correct and 60fps is ideal as it actually helps to mask the lag more believe it or not. I assume it’s because it’s slightly less noticeable if a single frame is skipped due to packet loss.

They will likely scale up more in the future as they get better server hardware and bandwidth costs go down.
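
Rough frame-time arithmetic on why 60fps masks hiccups better (simple math, not anything Google has published):

# A dropped or late frame stalls the picture for one frame interval, and that
# interval is half as long at 60 fps as at 30 fps.
for fps in (30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: one frame lasts {frame_ms:.1f} ms")   # 33.3 ms vs 16.7 ms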

#41  Edited By BassMan  Online
Member since 2002 • 17806 Posts

@michaelmikado said:

@BassMan:

Performance-level GPU pooling isn't possible at this point for real-time graphics rendering and gaming. Virtualization of GPUs into smaller instances is just taking off. Possibly in the future we will have transparent pooled virtual GPUs. Right now we can't even get SLI working properly and transparently with GPUs connected directly next to each other.

Another concern for me though is them partnering with Havok. Correct me if I am wrong, but PhysX is the only physics API that is allowed to use GPU acceleration (I think Nvidia has the patent). However, they recently opened up the PhysX API to be used with AMD hardware as well and it is no longer proprietary to Nvidia GPUs. So, why aren't they using PhysX with Stadia? Imagine distributed computing with GPUs accelerating the physics...

#42 michaelmikado
Member since 2019 • 406 Posts

@IMAHAPYHIPPO:

It's the specs of the underlying servers that these games run on, and technically how much of those server resources they will allocate to a single user at any given time. It's really for reference and marketing, so consumers and devs can compare against current offerings.

#43  Edited By Raining51
Member since 2016 • 1162 Posts

That's next gen. Maybe not in a shocking way, but it counts as next-gen.

I was just thinking about how I think Sony will gradually exit the console market... actually maybe even more quickly than I think... they're just such a big, vast multimedia powerhouse, and I feel like they are just going to say "well, that was fun, but we have better opportunities elsewhere", and so we'll see the last of them with the PS5 or maybe the PS6.

Meanwhile, Nintendo at this point is obviously struggling to keep up with everyone else... they are trying to stretch their philosophies and get every inch out of them, but it's apparent that their machines just come across as medium-powered PCs at this point.

Sega obviously hasn't made a console in a while, so who knows how that will go.

And then finally we have Microsoft, which seems utterly content having basically sold well every gen, and I expect them to exit also, but more slowly.....

I guess basically what I'm saying is that to rejuvenate the system wars/console wars we really need Google and Amazon and Apple to enter with consoles and stuff like this... to make these overpowered machines that attract interest, because relying on the traditional groups at this point is a bad idea.

Personally, I think it's been sustained mainly by Sony, and like I said they're on the way out, so relying on them would be bad.

#44 michaelmikado
Member since 2019 • 406 Posts

@BassMan:

PhysX is just a software API. It can still be used if a developer wants to. I'm sure they have their reasons, but I'm assuming that because they will be using AMD hardware and running PC games, they would have some tie-in with Microsoft, who owns Havok and likely makes a lot of Xbox games.

Basically, Google needs PC games, which may already be cross-developed for Xbox (which MS owns), so getting them running on AMD hardware is likely easier due to their experience with Xbox and MS. Remember, the games would need to support the API in the first place, and few devs will have PhysX running on AMD GPUs since it only recently went open source.

#45 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@michaelmikado: Looks like that is their plan, but I guess that is entirely dependent on the success of the launch and repeat subscribers.

One thing that I am worried about is that a subscription won't work for such an expensive service; I have a feeling it will be a contractual service similar to a mobile phone contract, 12 or 24 months. Users being able to cancel at any moment, like it's Origin Access or something, would be a disaster, especially for a gaming service where someone can subscribe and then unsubscribe right after just to try it out... Not to mention that unless they give a cut to developers, they will have little incentive to support it, unless along with the contract there is a pay store for games. Too many questions, and the "up to"... sneakily put in while talking about 4K/60FPS so confidently, makes me wonder if I should trust anything I see or read from them.

#46 deactivated-642321fb121ca
Member since 2013 • 7142 Posts
@Grey_Eyed_Elf said:

@michaelmikado: Looks like that is their plan, but I guess that is entirely dependent on the success of the launch and repeat subscribers.

One thing that I am worried about is that a subscription won't work for such an expensive service; I have a feeling it will be a contractual service similar to a mobile phone contract, 12 or 24 months. Users being able to cancel at any moment, like it's Origin Access or something, would be a disaster, especially for a gaming service where someone can subscribe and then unsubscribe right after just to try it out... Not to mention that unless they give a cut to developers, they will have little incentive to support it, unless along with the contract there is a pay store for games. Too many questions, and the "up to"... sneakily put in while talking about 4K/60FPS so confidently, makes me wonder if I should trust anything I see or read from them.

I did think: how is Ubisoft making any money from this?

#47 michaelmikado
Member since 2019 • 406 Posts

@Grey_Eyed_Elf:

The cost of the service is a huge concern.

These streaming services are generally priced per hour of streaming. When Netflix streaming first launched, users only got 20 hours of streaming per month.

PSNow, when it first launched, charged $2.99 for a 4-hour rental, or $0.75 per hour.

Shadow charges $35 for 20 hours, and Nvidia Shield is rumored to be priced at $25 for 20 hours of use.

Neither Shadow nor the consumer version of Shield includes games unless you already own them through a digital store. The betas include some games.

So the breakdown is:

PSNow at launch: $0.75 per hour

Nvidia Shield (rumored): $1.25 per hour, no games

Shadow: $1.75 per hour, no games

PSNow can be had for $99 a year now, or about $8 per month, so like Netflix, after several years in the streaming industry they were able to offer an unlimited model for under $10 a month that includes the content itself.

Neither Google nor MS have said anything about their streaming prices, but the going rate appears to be between $1.25-$1.75 an hour. Anything higher than $2 per hour may be a hard sell, even though we pay $5 to rent 2-hour movies for 72 hours.... so new games will likely be out of the question unless the service is very highly priced.
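
Putting the per-hour figures quoted above side by side (the Shield price is a rumor and the others are as stated in this post, so treat them as rough):

# Per-hour cost from the numbers quoted above.
plans = {
    "PSNow at launch": (2.99, 4),          # $2.99 per 4-hour rental
    "Nvidia Shield (rumored)": (25.00, 20),
    "Shadow": (35.00, 20),
}
for name, (price, hours) in plans.items():
    print(f"{name}: ${price / hours:.2f} per hour")
print(f"PSNow today: ${99 / 12:.2f} per month on the $99 yearly plan")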

#48 Grey_Eyed_Elf
Member since 2011 • 7970 Posts

@Random_Matt said:
@Grey_Eyed_Elf said:

@michaelmikado: Looks like that is their plan, but I guess that is entirely dependent on the success of the launch and repeat subscribers.

One thing that I am worried about is that a subscription won't work for such an expensive service; I have a feeling it will be a contractual service similar to a mobile phone contract, 12 or 24 months. Users being able to cancel at any moment, like it's Origin Access or something, would be a disaster, especially for a gaming service where someone can subscribe and then unsubscribe right after just to try it out... Not to mention that unless they give a cut to developers, they will have little incentive to support it, unless along with the contract there is a pay store for games. Too many questions, and the "up to"... sneakily put in while talking about 4K/60FPS so confidently, makes me wonder if I should trust anything I see or read from them.

I did think: how is Ubisoft making any money from this?

Also, you would think that with all these developers behind it, like Jade Raymond, the controller would have a more ergonomic design and/or an asymmetrical thumbstick layout?... How is any gamer going to find that appealing? Are the manufacturers that disconnected, or do they just not game, and so they are unaware?... Does it come with the subscription?...

Too many red flags here, something is off. The "up to 4K"... I have a feeling this thing will be a disaster.

#49  Edited By ronvalencia
Member since 2008 • 29612 Posts

@loco145 said:

So, the first next gen 'console' is here. And even with the unlimited power of the cloud we got the following specs:

  • Custom 2.7GHz hyper-threaded x86 CPU with AVX2 SIMD and 9.5MB L2+L3 cache
  • Custom AMD GPU with HBM2 memory and 56 compute units, capable of 10.7 teraflops
  • 16GB of RAM with up to 484GB/s of performance
  • SSD cloud storage

We don't have much info on the CPU, but the GPU info is damning. As Eurogamer said, it's practically the power of a Vega 56. Everything points towards sub-1080 Ti/Vega 64 power for next-gen consoles.

Source.

CPU: ZEN IP

GPU: Vega v1 or v2 IP? NAVI with HBM v2?

484 GB/s indicates Vega 64's memory bandwidth.

10.7 TFLOPS indicates a higher-clocked GCN 56, i.e. ~1500 MHz.

A normal Vega 56 has a 1138 MHz base clock up to a 1474 MHz boost mode. Game consoles usually remove boost modes.
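
Working backwards from the published figure with the usual GCN math (a quick check, not an official clock):

# clock (MHz) = TFLOPS * 1e6 / (CUs * 64 shaders/CU * 2 ops per clock)
tflops, cus = 10.7, 56
clock_mhz = tflops * 1e6 / (cus * 64 * 2)
print(f"implied clock: {clock_mhz:.0f} MHz")   # ~1493 MHz, i.e. roughly 1.5 GHz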

#50  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Grey_Eyed_Elf said:
@ellos said:

Actually, that kinda verifies that next-gen consoles will be more powerful than that. They should at least be more powerful than a single stack. Sure, consoles are usually powered down to save power, but a single stack usually needs to be even more efficient at that. So at least this tells us they will pass the 10.7 TF, just a bit maybe lol.

Not really... consoles are APU-based systems and have to contend with the heat produced by both the CPU and GPU being on one die and cooled by one solution, as opposed to a server farm that has separate cooling for the CPU and GPU, with heavy cooling and monitoring of the temperatures in the room and of the hardware with high-RPM fans.

It's not just power draw that affects a console's specifications, it's the heat, especially when it comes to frequencies, which directly correspond to the TFLOP count of the GPU.

A good example: the X1X has 6 TFLOPS from a 40 CU GPU... The RX 580 is a 6.1 TFLOP GPU but with only 36 CUs. Why does the X1X have fewer TFLOPS with a higher CU count?... Frequencies, heat & TDP!

10-11 TFLOPS is the BEST they can do with consoles for next generation unless Navi drastically improves TDP from current 7nm Radeon VII levels.

Compared to the X1X's 44 CU GPU, the RX 580 is a smaller chip, which increases the number of chips per wafer. The X1X's 44 CU dev kit GPU and the retail 40 CU part have a similar TDP and cooling solution.

If a game console were spec'd to the RX 580's 6.1 TFLOPS with all 36 CUs, the design would have zero margin for defects, hence the yields would fall.

Google's GPU solution could be a VII with 56 CU (worse than VII's 60 CU yields) and two stacks of HBM2 at 950 MHz.

MI60 = 64 CU with quad-stack HBM2, for a high-cost server/workstation SKU

VII = 60 CU with quad-stack HBM2, for a medium-high-cost gaming SKU

VII = 56 CU with dual-stack HBM2, for a medium-cost gaming SKU

------------

Microsoft tolerated 4 CU defects with their 44 CU GPU

Sony tolerated 4 CU defects with their 40 CU GPU