ARM Confirms That The Nintendo Switch's Chipset Is Very Close To Tegra X1 Spec


#1  Edited By Micropixel
Member since 2005 • 1383 Posts

Ever since reports began to circulate that Nintendo and Nvidia were working together on a new console, there has been rampant speculation about exactly what kind of chipset would be doing the heavy lifting.

When Eurogamer published its report last year, it was optimistically assumed that Nvidia would be supplying its latest Tegra offering to Nintendo - perhaps based on its new Pascal architecture - and not the existing Tegra X1, which had already been seen in the Nvidia Shield Android TV unit, released in 2015.

Eurogamer has since unearthed more information which suggests that X1 is indeed the chip being used, and now ARM - which supplies the Cortex CPUs used in Nvidia's chipsets - has apparently confirmed these reports. [Source]

So, what's your reaction? Are you happy with this? Disappointed?

UPDATE: ARM deletes Facebook post, claims it wasn't "an official statement"


#2 deactivated-5d6bb9cb2ee20
Member since 2006 • 82724 Posts

Past caring, to be honest.


#3  Edited By 22Toothpicks
Member since 2005 • 12546 Posts

@charizard1605:

Yeah, not to mention that had they gone for top-of-the-line stuff, the sucker would be even more expensive (some would say overpriced) and have even worse battery life. I think the Switch is exactly what it needs to be, as there was never any hope of competing specs-wise anyhow.


#5 Basinboy
Member since 2003 • 14495 Posts

Lul, but at least it's not teh Cell.


#6  Edited By ronvalencia
Member since 2008 • 29612 Posts

@xboxiphoneps3 said:

Eh, flagship smartphones are already more powerful than it, but I'll put that aside here. The X1 is still a relatively powerful SoC and definitely a good improvement over the Wii U... if it can get more games to come to it, that would be great. I wonder if it's running at full X1 specs (the full 2.0 GHz clock speed on the 4x A57 cores) and a full GPU clock on par with the docked Shield TV. There has been speculation that it has been underclocked for power-efficiency reasons.

Would have been pretty sweet to see a custom "Pascal X2" rather than the Maxwell X1 in the Switch, and maybe some A72 cores over the A57 cores, but oh well.

https://www.vg247.com/2016/12/19/nintendo-switch-cpu-and-gpu-specs-leak-reveals-console-power-in-portable-and-docked-modes/

The leaked GPU specs show 768 MHz in docked mode; Shield TV's GPU runs at 1 GHz.

Switch's memory bandwidth seems to be similar to the Shield TV's, i.e. 25 GB/s.

Shield TV seems to be mostly memory-bandwidth bound despite its GFLOPS, since it still loses to Xbox 360 game ports. If Switch games are bound by the same 25 GB/s memory bandwidth, its results would be close to Shield TV's.

If Switch's X1 has customised high-speed EDRAM, it may get some performance boost.
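A quick way to sanity-check a "bandwidth bound" claim like this is to compare peak FLOPs per byte of memory traffic. A rough Python sketch, using the rumored figures from this thread (about 512 GFLOPS for Shield TV at 1 GHz, ~393 GFLOPS for Switch docked at 768 MHz, 25 GB/s for both); the workload number is purely hypothetical:

```python
# Rough arithmetic-intensity check: a workload whose FLOPs-per-byte need
# is below the hardware's peak ratio saturates the memory bus before the
# shader cores, i.e. it is memory-bandwidth bound.
def flops_per_byte(gflops, bandwidth_gbs):
    """Peak FLOPs available per byte of DRAM traffic (GFLOPS / GB/s)."""
    return gflops / bandwidth_gbs

shield_tv = flops_per_byte(512.0, 25.0)    # TX1 @ ~1 GHz (rumored figures)
switch_dock = flops_per_byte(393.0, 25.0)  # TX1 @ 768 MHz (leaked docked clock)

workload = 10.0  # FLOPs per byte needed by a hypothetical shader

print(f"Shield TV peak: {shield_tv:.2f} FLOPs/byte")
print(f"Switch docked peak: {switch_dock:.2f} FLOPs/byte")
print("bandwidth-bound" if workload < switch_dock else "compute-bound")
```

On these numbers both chips offer far more FLOPs per byte than a typical shader consumes, which is why extra clock speed buys little once the 25 GB/s bus is saturated.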


#7 superbuuman
Member since 2010 • 6400 Posts

Pass... just disappointed that the Shield GPU is not even being clocked to its fullest weak-sauce potential when docked on Switchy. :P


#8  Edited By ronvalencia
Member since 2008 • 29612 Posts

@superbuuman said:

Pass... just disappointed that the Shield GPU is not even being clocked to its fullest weak-sauce potential when docked on Switchy. :P

Maxwell v2 PC GPUs have received recent driver updates that increased their delta memory compression, i.e. from a 1.56X factor to as high as 2.19X.

Pascal's delta memory compression improvements in the driver also improved Maxwell v2 GPUs' delta memory compression.

There are Beyond3D benchmarks that show the improvements.
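As a rough illustration of why those compression factors matter: delta color compression multiplies the usable bandwidth for compressible framebuffer traffic, so effective bandwidth is roughly physical bandwidth times the compression factor. A sketch with the factors cited above and a hypothetical 256-bit GDDR5-8000 card:

```python
# Effective bandwidth with delta color compression, for compressible
# framebuffer traffic: effective ≈ physical * compression factor.
physical_bw = 256.0  # GB/s: hypothetical 256-bit GDDR5-8000 card

old_factor, new_factor = 1.56, 2.19  # factors cited in the post above

print(f"before driver update: {physical_bw * old_factor:.0f} GB/s effective")
print(f"after driver update:  {physical_bw * new_factor:.0f} GB/s effective")
```

This only applies to traffic the compressor can actually compress; incompressible data still moves at the physical rate.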


#9 onesiphorus
Member since 2014 • 5245 Posts

I've never understood the fascination with chipsets. I'm more concerned with whether a home console's hardware improves over the last generation's.


#10 asylumni
Member since 2003 • 3304 Posts

Alright, just this one time, but I'm not going to make it a habit. @iandizion713, I've got your back on this one. There is hope...

[embedded video]


#11 osan0
Member since 2004 • 17809 Posts

@ronvalencia: any chance Nvidia can also do something similar for Kepler (or find some way to get more performance from it)? Sadly I have to admit that my laptop (780M) has been resolutely beaten by the PS4 in Doom 2016.

Anywho, on topic: this isn't news, we knew this already. It's an underclocked TX1 (very underclocked when undocked). The only question I have (and I'm guessing the answer is no, since there isn't even a whiff of news on it) is whether Nintendo have done anything to counteract the low memory bandwidth in the Switch, e.g. added EDRAM to the TX1 (maybe replacing the little cores with it).


#12 Daniel_Su123
Member since 2015 • 1103 Posts

Though this is off topic: I can't wait to see what ARM can do to improve the PC industry, which is in a slump at the moment aside from gaming and 2-in-1s.


#13  Edited By Gatygun
Member since 2010 • 2709 Posts

@onesiphorus said:

I've never understood the fascination with chipsets. I'm more concerned with whether a home console's hardware improves over the last generation's.

It tells you what to expect in terms of support and the scope of games.

@osan0 said:

@ronvalencia: any chance Nvidia can also do something similar for Kepler (or find some way to get more performance from it)? Sadly I have to admit that my laptop (780M) has been resolutely beaten by the PS4 in Doom 2016.

Anywho, on topic: this isn't news, we knew this already. It's an underclocked TX1 (very underclocked when undocked). The only question I have (and I'm guessing the answer is no, since there isn't even a whiff of news on it) is whether Nintendo have done anything to counteract the low memory bandwidth in the Switch, e.g. added EDRAM to the TX1 (maybe replacing the little cores with it).

That's because the 780M is a mobile chip, and mobile chips always lag behind big time. I think the 780M ends up a bit weaker than the PS4's chipset, so it makes sense.

Maxwell gives a good performance boost over older architectures, so that's something they can exploit to gain advantages.

EDRAM is possible, though I'm honestly not sure they'd want to go down that road and spend more effort on it rather than just upclocking the chip and living with that limitation. So in my view, probably not.

@daniel_su123 said:

Though this is off topic: I can't wait to see what ARM can do to improve the PC industry, which is in a slump at the moment aside from gaming and 2-in-1s.

What is there to improve?


#14 superbuuman
Member since 2010 • 6400 Posts

@asylumni said:

Alright, just this one time, but I'm not going to make it a habit. @iandizion713, I've got your back on this one. There is hope...

[embedded video]

Plausible. :) :P


#15 silversix_
Member since 2010 • 26347 Posts

Graphics are a gimmick so it doesn't matter. Gameplay is where it's at.


#16  Edited By ronvalencia
Member since 2008 • 29612 Posts

@osan0 said:

@ronvalencia: any chance Nvidia can also do something similar for Kepler (or find some way to get more performance from it)? Sadly I have to admit that my laptop (780M) has been resolutely beaten by the PS4 in Doom 2016.

Anywho, on topic: this isn't news, we knew this already. It's an underclocked TX1 (very underclocked when undocked). The only question I have (and I'm guessing the answer is no, since there isn't even a whiff of news on it) is whether Nintendo have done anything to counteract the low memory bandwidth in the Switch, e.g. added EDRAM to the TX1 (maybe replacing the little cores with it).

Sorry, Kepler is in legacy support. Read http://www.nvidia.com/page/legacy.html

My 980 Ti is not yet on the legacy list.

Both Pascal (excluding GP100) and Maxwell v2 have very similar designs.

TX1 actually has a newer SM unit design than desktop Pascal, i.e. TX1's SM units have a double-rate FP16 feature, while the desktop variant's native FP16 path runs at a broken 1/64 rate. Don't worry: the FP32 units emulate FP16 at the same rate as FP32.

All NVIDIA Maxwell GPUs have tile-based rasterization and binning, which relies on a small but very high-speed on-chip cache.

https://en.wikipedia.org/wiki/Tiled_rendering

  • Nvidia GPUs based on the Maxwell architecture and later architectures

  • Xbox 360 (2005): the GPU contains an embedded 10 MiB eDRAM; this is not sufficient to hold the raster for an entire 1280×720 image with 4× multisample anti-aliasing, so a tiling solution is superimposed when running in HD resolutions with 4× MSAA enabled.[9]
  • Xbox One (2013): the GPU contains an embedded 32 MB eSRAM, which can be used to hold all or part of an image. It is not a tiled architecture, but is flexible enough that software developers can emulate tiled rendering.

Xbox One's GCN GPU is not a tiled hardware architecture, i.e. its tiled rendering is software-emulated.

[embedded video]

https://github.com/nlguillemot/trianglebin/releases

A test tool for tiling vs non-tiling rendering; it works on Windows 10.

NVIDIA's tiled rendering is known as immediate tiled rendering, which is backward compatible with traditional non-tiled immediate rendering.

NVIDIA Maxwell automatically adjusts tile size based on data size and complexity (watch the video above). On Xbox One, this is a manual process.

From http://www.anandtech.com/show/11002/the-amd-vega-gpu-architecture-teaser/3

ROPs & Rasterizers: Binning for the Win(ning)

We’ll suitably round-out our overview of AMD’s Vega teaser with a look at the front and back-ends of the GPU architecture. While AMD has clearly put quite a bit of effort into the shader core, shader engines, and memory, they have not ignored the rasterizers at the front-end or the ROPs at the back-end. In fact this could be one of the most important changes to the architecture from an efficiency standpoint.

Back in August, our pal David Kanter discovered one of the important ingredients of the secret sauce that is NVIDIA’s efficiency optimizations. As it turns out, NVIDIA has been doing tile based rasterization and binning since Maxwell, and that this was likely one of the big reasons Maxwell’s efficiency increased by so much. Though NVIDIA still refuses to comment on the matter, from what we can ascertain, breaking up a scene into tiles has allowed NVIDIA to keep a lot more traffic on-chip, which saves memory bandwidth, but also cuts down on very expensive accesses to VRAM.

For Vega, AMD will be doing something similar. The architecture will add support for what AMD calls the Draw Stream Binning Rasterizer, which true to its name, will give Vega the ability to bin polygons by tile. By doing so, AMD will cut down on the amount of memory accesses by working with smaller tiles that can stay-on chip. This will also allow AMD to do a better job of culling hidden pixels, keeping them from making it to the pixel shaders and consuming resources there.

As we have almost no detail on how AMD or NVIDIA are doing tiling and binning, it’s impossible to say with any degree of certainty just how close their implementations are, so I’ll refrain from any speculation on which might be better. But I’m not going to be too surprised if in the future we find out both implementations are quite similar. The important thing to take away from this right now is that AMD is following a very similar path to where we think NVIDIA captured some of their greatest efficiency gains on Maxwell, and that in turn bodes well for Vega.

Meanwhile, on the ROP side of matters, besides baking in the necessary support for the aforementioned binning technology, AMD is also making one other change to cut down on the amount of data that has to go off-chip to VRAM. AMD has significantly reworked how the ROPs (or as they like to call them, the Render Back-Ends) interact with their L2 cache. Starting with Vega, the ROPs are now clients of the L2 cache rather than the memory controller, allowing them to better and more directly use the relatively spacious L2 cache.

---------

When combined with superior delta memory compression and the tile-cache rasterization feature above, the GTX 1070 (6.4 TFLOPS) was able to jump well ahead of the RX 480, even though both cards have the same physical memory bandwidth (256-bit GDDR5-8000).

The problem for Shield TV: the Xbox 360 also uses tile-based rasterization, which is very efficient.

The key design point is that both the ROPs and the shader ALUs are linked to small, high-speed memory.

Xbox 360's ROPs are located next to the EDRAM, with a 30 GB/s link to the GPU's shader ALUs; that link is a bottleneck between the EDRAM and the shader ALUs.

Xbox One's ESRAM can only write at a 109 GB/s rate, which is not fast for tile rendering, i.e. it's nowhere near L2-cache bandwidth.
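To illustrate the binning idea behind tile-based rasterization: triangles are first bucketed into screen tiles, then each tile is shaded entirely out of fast on-chip storage and written to VRAM once. This is a toy sketch of the general technique, not NVIDIA's actual implementation (which, as noted above, is undocumented):

```python
# Toy sketch of tile binning: assign each triangle (approximated by its
# screen-space bounding box) to the 16x16 tiles it overlaps, so a later
# pass can shade one tile at a time out of fast on-chip storage.
TILE = 16  # tile size in pixels; real hardware varies this dynamically

def bin_triangles(tris, width, height):
    """tris: list of (xmin, ymin, xmax, ymax) pixel bounding boxes.
    Returns {(tile_x, tile_y): [triangle indices]}."""
    bins = {}
    for i, (x0, y0, x1, y1) in enumerate(tris):
        for ty in range(max(0, y0) // TILE, min(height - 1, y1) // TILE + 1):
            for tx in range(max(0, x0) // TILE, min(width - 1, x1) // TILE + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

# A triangle spanning two horizontally adjacent tiles lands in both bins:
bins = bin_triangles([(10, 2, 20, 8)], 64, 64)
print(sorted(bins))  # [(0, 0), (1, 0)]
```

The bandwidth win comes from the second pass: all overdraw and blending within a tile happens on-chip, and only the final tile contents go out to DRAM.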


#17 Juub1990
Member since 2013 • 12620 Posts

@22Toothpicks: The new chips are more power efficient.


#18 DaVillain  Moderator
Member since 2014 • 56036 Posts

Never cared about how powerful the Nintendo Switch will be; making the damn thing easier for third parties to port their games to is all I really wanted for Switch. But even then, I don't plan on buying third-party games for Switch.


#19 shellcase86
Member since 2012 • 6846 Posts

@charizard1605 said:

Past caring, to be honest.

Same. It is what it is. Not a Nintendo fan, by any means. But my son is -- I may consider getting one for him.


#20  Edited By FireEmblem_Man
Member since 2004 • 20248 Posts

@charizard1605 said:

Past caring, to be honest.

I think it was also proven false; ARM just repeated the speculative rumors from Eurogamer and an Android website.

EDIT: Yep, it was proven false. ARM has taken down the post due to the confusion, since it was never an official statement; ARM had simply linked to a page repeating rumored speculation.

Source


#21 FireEmblem_Man
Member since 2004 • 20248 Posts

@Micropixel: Check your link again, it's not true at all!


#23 no-scope-AK47
Member since 2012 • 3755 Posts

Not news that the Switch is even weaker than the Nvidia Shield.


#24 osan0
Member since 2004 • 17809 Posts

@Gatygun: on paper a 780M can certainly give the PS4 a run for its money. In GTA5, BF4 and The Witcher 3 my laptop got a good bit ahead of the PS4. But Doom... yeah, not good for my laptop (well, the game is still perfectly playable at pretty high settings). It's interesting, too: the only thing that gets the framerate up is bumping the resolution down; changing all the other settings has a very small impact in comparison.

@ronvalencia: ah, bugger on the Kepler front. Interesting with the TX1 shader model stuff; hopefully it'll be enough. One of the downsides with Wii U games has been the lack of AA and AF in many titles. I reckon Nintendo were just more focused on hitting a certain performance point consistently, and implementing even post-process AA of some kind was causing unexpected performance problems. I'm hoping the Switch will solve this. On early titles the evidence seems to be that it doesn't, but it is early days.


#25 ronvalencia
Member since 2008 • 29612 Posts

@xboxiphoneps3 said:
@daniel_su123 said:

Though this is off topic: I can't wait to see what ARM can do to improve the PC industry, which is in a slump at the moment aside from gaming and 2-in-1s.

ARM CPUs and GPUs are still nowhere near the performance of mid/high-end PCs, and the PC industry is nowhere near a slump at the moment.

Also http://www.gamespot.com/articles/pc-gaming-hardware-revenue-hits-30-billion-in-2016/1100-6447141/

"PC gaming hardware alone drove $30 billion in revenue worldwide in 2016, setting a new record. It wasn't mentioned in the report what the previous record was."

The PC industry is far from a slump; in fact it just keeps growing, and so does PC gaming.

Renewed competition between AMD and Intel CPUs might trigger a larger upgrade cycle.


#27 Pedro
Member since 2002 • 69360 Posts

So, Nintendo is selling a weaker version of the X1 at a higher price? Is that the conclusion?


#28 FireEmblem_Man
Member since 2004 • 20248 Posts

@Pedro said:

So, Nintendo is selling a weaker version of the X1 at a higher price? Is that the conclusion?

Nothing is confirmed if you actually checked the link to the source.


#29  Edited By ronvalencia
Member since 2008 • 29612 Posts

@xboxiphoneps3: Intel could reduce their IGP R&D budget/overheads and license RTG's GPU. http://hexus.net/business/news/legal/102244-intels-nvidia-gpu-licensing-deal-ends-next-month/

OK. Got information back on this. Everything I have mentioned here is definitively correct.

Intel is licensing AMD GPU technology. No money has changed hands yet, so there is not financial impact till late in the year, hence nothing in the current earnings report.

The first product AMD is working on for Intel is a Kaby Lake processor variant that will be positioned in the entry-to-mid-level performance segment. It is in fact an MCM (multi-chip module), and will not be on-die with the Kaby Lake CPU. It is scheduled to come to market before the end of the year. I would expect more collaboration between AMD and Intel in the future on the graphics side.

And you can take all that to the bank.


#30  Edited By AzatiS
Member since 2004 • 14969 Posts

@22Toothpicks said:

@charizard1605:

Yeah, not to mention that had they gone for top-of-the-line stuff, the sucker would be even more expensive (some would say overpriced) and have even worse battery life. I think the Switch is exactly what it needs to be, as there was never any hope of competing specs-wise anyhow.

Let's be honest... if Nintendo wanted, they could have dropped those gimmicky/fancy extras and focused on better hardware at the same cost. But no...


#32 GameboyTroy
Member since 2011 • 9726 Posts

@AzatiS said:
@Pedro said:

So, Nintendo is selling a weaker version of the X1 at a higher price? Is that the conclusion?

They're avoiding giving full specs for a reason ...

And we'll find out the full specs anyway when it launches.


#33  Edited By AzatiS
Member since 2004 • 14969 Posts

@GameboyTroy said:
@AzatiS said:
@Pedro said:

So, Nintendo is selling a weaker version of the X1 at a higher price? Is that the conclusion?

They're avoiding giving full specs for a reason ...

And we'll find out the full specs anyway when it launches.

Haha, and as I've been saying for some weeks now... when people finally understand that $299 is not that cheap considering what's inside this, as many want to call it, "CONSOLE", we might see some backlash.

It's not like die-hard fans or kids buy Nintendo consoles for their power, but everyone else might have a thing or two to say.


#34 wolfpup7
Member since 2004 • 1930 Posts

@Micropixel: Relatively happy. Obviously we were all hoping it would be better still, but it's about as powerful as we could realistically hope for in a mass-market portable console released when it is.


#35  Edited By iandizion713
Member since 2005 • 16025 Posts

@wolfpup7: Agreed, it's insanely powerful for a hybrid device. And Nintendo might release a more powerful update 3 years from now? Would be pretty sweet.


#37 wolfpup7
Member since 2004 • 1930 Posts

@xboxiphoneps3 said:
@wolfpup7 said:

@Micropixel: Relatively happy. Obviously we were all hoping it would be better still, but it's about as powerful as we could realistically hope for in a mass-market portable console released when it is.

Mass-produced flagship smartphones have better-performing chips that are also more power efficient (i.e. the Switch tablet would last longer and run better with a better SoC). Nintendo simply went with $$$$ savings over putting better hardware inside, which was absolutely available to them at the time.

1) No, they don't. Apple's got the only SoCs that more or less match that.

2) Even if they did, they'd be in the $650 models, not $300.

The ONLY chance for something better that's remotely realistic is if the timing worked out for Nvidia's next Tegra.


#38 Micropixel
Member since 2005 • 1383 Posts

@FireEmblem_Man said:

@Micropixel: Check your link again, it's not true at all!

Ah, you're right! They took the link down! Thanks for the heads up!

Well, if any mods feel like this warrants a lock, go ahead! :)


#39  Edited By ronvalencia
Member since 2008 • 29612 Posts

@wolfpup7 said:
@xboxiphoneps3 said:
@wolfpup7 said:

@Micropixel: Relatively happy. Obviously we were all hoping it would be better still, but it's about as powerful as we could realistically hope for in a mass-market portable console released when it is.

Mass-produced flagship smartphones have better-performing chips that are also more power efficient (i.e. the Switch tablet would last longer and run better with a better SoC). Nintendo simply went with $$$$ savings over putting better hardware inside, which was absolutely available to them at the time.

1) No, they don't. Apple's got the only SoCs that more or less match that.

2) Even if they did, they'd be in the $650 models, not $300.

The ONLY chance for something better that's remotely realistic is if the timing worked out for Nvidia's next Tegra.

In handheld mode the Switch's GPU runs at a 307 MHz clock, roughly 1/3 of Shield TV's.

http://www.pcworld.com/article/3006268/tablets/tested-why-the-ipad-pro-really-isnt-as-fast-a-laptop.html

Google's Pixel C (15 watt) tablet's Ice Storm Unlimited graphics score, from http://www.futuremark.com/hardware/mobile/Google+Pixel+C/review:

Ice Storm Unlimited graphics score: 52,536

The Pixel C has a TX1 GPU at 850 MHz with a quad-core ARM Cortex-A57 CPU at 1.9 GHz.

The decline in graphics score from Shield TV to Pixel C matches the reduced GPU clock speed.

At a 307 MHz TX1 GPU clock, the Switch's estimated Ice Storm Unlimited graphics score is 20,576.

There's a reason why no sane phone vendor has shipped NVIDIA's TX1!

The LG G4 has a Qualcomm Snapdragon 808 SoC.

Nvidia's Tegra P1 "Parker" SoC draws 10 watts at 1.5 GHz and delivers 768 GFLOPS; a 5-watt handheld version would be clocked around 600 to 700 MHz.
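The estimate above can be reproduced approximately by scaling the Pixel C score linearly with GPU clock, since every TX1 variant has the same 256-core Maxwell GPU. A quick sketch; the result differs slightly from the post's 20,576 figure, presumably because a different baseline was used:

```python
# Linear clock scaling: every TX1 variant has the same 256-core Maxwell
# GPU, so a benchmark's graphics score scales roughly with GPU clock.
def scale_score(base_score, base_mhz, target_mhz):
    return base_score * target_mhz / base_mhz

pixel_c = 52_536  # Ice Storm Unlimited graphics score, Pixel C @ 850 MHz
switch_handheld = scale_score(pixel_c, 850, 307)
print(round(switch_handheld))  # 18975: same ballpark as the estimate above
```

This is only a first-order estimate; it ignores memory bandwidth, thermals, and CPU differences between the devices.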


#41 ronvalencia
Member since 2008 • 29612 Posts

@xboxiphoneps3 said:

@wolfpup7: Um, yes they do. The Snapdragon 820 has better CPU performance than the X1 and also has a GPU on par with the Switch. This isn't even the 821 I'm talking about, let alone the S835 launching in 2 months in the Galaxy S8, which will just further extend the lead.

The Exynos 8890 in the S7/Edge outperforms the X1 in the Switch.

But I'm glad you also brought up Apple, because the Apple A9 from 2 years ago has a better CPU than the Switch, and the A10 in the 7/7 Plus is better all around than the Switch's SoC and more power efficient too. There are a good number of SoCs on the market that outperform the Switch's X1 implementation very comfortably in handheld mode; in docked mode these SoCs still outperform the Switch, although by a smaller margin.

S7 also has S820 (MSM8996) i.e.

S7 (SM-G9300/A/P/T/U/V)

S7 Edge (SM-G9350/A/P/T/U/V)


#42 skektek
Member since 2004 • 6530 Posts

@davillain- said:

Never cared about how powerful the Nintendo Switch will be; making the damn thing easier for third parties to port their games to is all I really wanted for Switch.

That's the conundrum: for ports to be easy there has to be performance parity (or nearly so). The NS has only about 30% of the power of the Xbone, 22% of the PS4, 10% of the PS4 Pro, and 6% of the Scorpio.
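Those percentages line up with the peak FP32 figures circulating at the time (Switch docked ~393 GFLOPS, Xbox One 1.31 TFLOPS, PS4 1.84, PS4 Pro 4.2, Scorpio 6.0); the ratios are straightforward, with the Switch number being a rumored spec:

```python
# Peak FP32 figures circulating at the time (Switch docked is a rumored spec).
switch_docked = 393  # GFLOPS
consoles = {"Xbox One": 1310, "PS4": 1840, "PS4 Pro": 4200, "Scorpio": 6000}

for name, gflops in consoles.items():
    print(f"Switch is {100 * switch_docked / gflops:.0f}% of {name}")
```

Peak GFLOPS is of course a crude proxy; architecture, memory bandwidth, and FP16 support all shift the real-world gap.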


#44 ronvalencia
Member since 2008 • 29612 Posts

@xboxiphoneps3 said:
@ronvalencia said:
@xboxiphoneps3 said:

@wolfpup7: Um, yes they do. The Snapdragon 820 has better CPU performance than the X1 and also has a GPU on par with the Switch. This isn't even the 821 I'm talking about, let alone the S835 launching in 2 months in the Galaxy S8, which will just further extend the lead.

The Exynos 8890 in the S7/Edge outperforms the X1 in the Switch.

But I'm glad you also brought up Apple, because the Apple A9 from 2 years ago has a better CPU than the Switch, and the A10 in the 7/7 Plus is better all around than the Switch's SoC and more power efficient too. There are a good number of SoCs on the market that outperform the Switch's X1 implementation very comfortably in handheld mode; in docked mode these SoCs still outperform the Switch, although by a smaller margin.

S7 also has S820 (MSM8996) i.e.

S7 (SM-G9300/A/P/T/U/V)

S7 Edge (SM-G9350/A/P/T/U/V)

Yeah, the S820 model for the U.S. and Exynos for the rest of the world; shame that the Exynos model is always better optimized than its Snapdragon counterpart on Samsung devices.

Google's Pixel phones have the S821.


#45  Edited By ronvalencia
Member since 2008 • 29612 Posts

@skektek said:
@davillain- said:

Never cared about how powerful the Nintendo Switch will be; making the damn thing easier for third parties to port their games to is all I really wanted for Switch.

That's the conundrum: for ports to be easy there has to be performance parity (or nearly so). The NS has only about 30% of the power of the Xbone, 22% of the PS4, 10% of the PS4 Pro, and 6% of the Scorpio.

When running Forza 6 Apex, the GeForce 920MX (548.9 GFLOPS FP32), which has similar GFLOPS to Shield TV, delivers roughly 1/8 of XBO's effective 1920×1080p/60 fps result.

The GeForce 920MX's Forza 6 Apex result is 1280×720p at ~30 fps with all settings on low; XBO's graphics settings are higher than low.

In docked mode, the Switch is at the borderline minimum for running XBO-era games.

Switch's double-speed FP16 feature may yield a performance boost (~786 GFLOPS FP16 in docked mode).

Switch's 25 GB/s memory bandwidth is lame compared to XBO's 68 GB/s DDR3 plus 204 GB/s (109 GB/s write) ESRAM.

(68 + 109) × (1/8) ≈ 22 GB/s, which puts TX1's 25 GB/s memory bandwidth at roughly 1/8 of XBO's combined bandwidth.
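The back-of-the-envelope can be checked directly: scale XBO's combined bandwidth (DDR3 plus the usable ESRAM write rate) by the ~1/8 effective-performance factor from the Forza comparison and compare against TX1's bandwidth:

```python
# Scale XBO's combined bandwidth (DDR3 + usable ESRAM write rate) by the
# ~1/8 effective-performance factor seen in the Forza 6 Apex comparison.
xbo_ddr3, xbo_esram_write = 68, 109  # GB/s
tx1_bw = 25                          # GB/s
scaled = (xbo_ddr3 + xbo_esram_write) / 8
print(f"{scaled:.1f} GB/s scaled XBO bandwidth vs TX1's {tx1_bw} GB/s")
```

The scaled figure works out to about 22 GB/s, close to TX1's 25 GB/s, which is the point being made: performance and bandwidth shrink by roughly the same factor.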


#47  Edited By Gatygun
Member since 2010 • 2709 Posts

So why would Nintendo go for those parts inside the Switch if there is so much out there that would be a better fit?

Keep in mind Nintendo is no longer under the terrible leadership of Iwata, so it seems the next guy has a brighter vision when it comes to products.


#48  Edited By ronvalencia
Member since 2008 • 29612 Posts

@Gatygun said:

So why would Nintendo go for those parts inside the Switch if there is so much out there that would be a better fit?

There are other reasons besides hardware.

http://venturebeat.com/2016/11/13/nvidias-ceo-on-everything-from-ais-dangers-to-donald-trump-and-the-nintendo-switch/

VB: There was a day when it seemed like you were happy with just serving the PC gaming market. The console was a less attractive market. I wonder why you guys went after the Nintendo Switch and how you accomplished that.

Huang: We’re dedicated to the gaming market and always have been. Some parts of the market, we just weren’t prepared to serve them. I was fairly open about how, when this current generation of consoles was being considered, we didn’t have x86 CPUs. We weren’t in contention for any of those. However, the other factor is whether we could really make a contribution or not. If a particular game console doesn’t require our special skills, what we can uniquely bring, then it’s a commodity business that may not be suited for us.

In the case of Switch, it was such a ground-breaking design. Performance matters because games are built on great performance, but form factor and energy efficiency matter incredibly because they want to build something that’s portable and transformable. The type of gameplay they want to enable is like nothing the world has so far. It’s a scenario where two great engineering teams, working with their creative teams, needed to hunker down. Several hundred engineering years went into building this new console. It’s the type of project that really inspires us, gets us excited. It’s a classic win-win.

NVIDIA's engineering team can speed up Nintendo's creative teams with the latest rendering techniques.

http://www.neogaf.com/forum/showthread.php?t=1218933

The article is available to subscribers only, however the gist of it is this:

  • Though Nvidia downplayed console margins, their pride was hurt by the loss in console contracts. All the talk about "focusing on Shield" was a cover for the fact that MS and Sony had soured on them and would not enter negotiations.
  • Nvidia team was told to get a console win or "go home." Enter Nintendo, who apparently made off very well in this deal. This to the point that SemiAccurate questions whether this is a "win" at all for Nvidia.
  • SA has heard that Nvidia are promising software, support, and the whole shebang at a very low cost. According to one source, Nvidia may even be taking a loss on this deal. (Take the second sentence here with an extra portion of salt)
  • Not mentioned which generation of Tegra or which process node will be used or when the handheld is scheduled for release.
  • No mention of the home console, but we can speculate what that might be and who might provide the chipset for that one.

Both Microsoft and Sony can support their own selected hardware, and both have software teams that can develop leading-edge in-house 3D engines on modern GPU hardware.


#49 no-scope-AK47
Member since 2012 • 3755 Posts

@Pedro said:

So, Nintendo is selling a weaker version of the X1 at a higher price? Is that the conclusion?

Pretty much, with no multimedia apps or decent online gaming support.


#50  Edited By ronvalencia
Member since 2008 • 29612 Posts

@no-scope-AK47 said:
@Pedro said:

So, Nintendo is selling a weaker version of the X1 at a higher price? Is that the conclusion?

Pretty much with no multimedia app or decent online gaming support.

Nintendo has decent local multiplayer support.