Pixel 2 / Pixel 2 XL Hiding A Custom Google SoC - "Pixel Visual Core"

#1 NVIDIATI
Member since 2010 • 8463 Posts

Google has hidden an 8-core SoC inside its latest Pixel devices. The chip is designed for image processing and machine learning. It is currently inactive; Google will enable it with Android 8.1.

Pixel Visual Core

Last year, Google discussed how future Pixel devices would have custom silicon inside. Well, there's our first taste. With their recent HTC dealings, it looks like future Pixel devices will continue to offer unique in-house hardware.

#2  Edited By musicalmac  Moderator
Member since 2006 • 25098 Posts

I wonder when they'll have the capacity to stamp it all on the same SoC. I'm not sure HTC had anything to do with this, unless they're manufacturing custom silicon and I just didn't know it.

This unorthodox move is probably a decision made necessary by the stagnation of Qualcomm's own SoCs. But they're not alone in that; the Kirin 970 was also disappointing from a performance standpoint.

Edit: I also wonder if it's capable of doing any AI or machine learning locally, or if it still relies upon, and relays everything back to, Google. Being a data vacuum is really Android's purpose anyway, so perhaps it's both.

#3 NVIDIATI
Member since 2010 • 8463 Posts

@musicalmac said:

I wonder when they'll have the capacity to stamp it all on the same SoC. I'm not sure HTC had anything to do with this, unless they're manufacturing custom silicon and I just didn't know it.

This unorthodox move is probably a decision made necessary by the stagnation of Qualcomm's own SoCs. But they're not alone in that; the Kirin 970 was also disappointing from a performance standpoint.

HTC had nothing to do with the design; I was just pointing out that Google's recent deal with them will likely lead to other aspects of the hardware being done in-house. Google has been developing machine learning hardware for quite some time.

Google's Tensor Processing Unit was a good example of their machine learning prowess.

The Kirin 970 was never going to have the best CPU and GPU performance; the NPU, on the other hand, definitely stands out. The most interesting part is that Huawei didn't limit the NPU to their own API: it can also be used through Android's machine learning APIs.
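As a toy sketch of the idea behind such a shared API (all class and backend names here are hypothetical, not real NNAPI calls): apps target one interface, and the runtime dispatches work to the fastest accelerator the device actually has, falling back to GPU or CPU otherwise.

```python
# Toy sketch (hypothetical names) of the idea behind a vendor-neutral ML API:
# apps code against one interface, and the runtime routes each model to the
# best backend present on the device -- NPU if available, else GPU, else CPU.

class Backend:
    def __init__(self, name, tops):
        self.name = name    # e.g. "npu", "gpu", "cpu"
        self.tops = tops    # rough throughput in trillions of ops/s

    def run(self, model):
        return f"{model} executed on {self.name}"

class MLRuntime:
    """Picks the fastest available backend; slower ones serve as fallbacks."""
    def __init__(self, backends):
        # Sort once so the preferred (fastest) backend is tried first.
        self.backends = sorted(backends, key=lambda b: b.tops, reverse=True)

    def execute(self, model):
        return self.backends[0].run(model)

# A Kirin 970-like device: dedicated NPU alongside GPU and CPU fallbacks.
# The 1.92 figure is the commonly cited FP16 rating for its NPU.
runtime = MLRuntime([
    Backend("cpu", 0.01),
    Backend("gpu", 0.3),
    Backend("npu", 1.92),
])
print(runtime.execute("image_classifier"))  # routed to the NPU
```

The point of the design is that an app written against the shared API automatically benefits when a phone ships with a faster accelerator, with no per-vendor code.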

-

AI and machine learning is an area where Google has a huge advantage, and this is just the start of what's to come.

#4 NVIDIATI
Member since 2010 • 8463 Posts

@musicalmac said:

Edit: I also wonder if it's capable of doing any AI or machine learning locally, or if it still relies upon, and relays everything back to, Google. Being a data vacuum is really Android's purpose anyway, so perhaps it's both.

The Pixel already has a number of local features. For example, when offline, it can recognize over 17,000 songs just using its local database.
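A minimal sketch of how that kind of offline recognition can work, assuming a fingerprint-and-lookup design (the hashing and the tiny database here are invented stand-ins, not Google's actual pipeline): audio is reduced to a compact fingerprint, and recognition is just a lookup against a table shipped on the device, so no network round trip is needed.

```python
# Simplified illustration (all data hypothetical) of offline song recognition:
# reduce audio to a compact fingerprint, then look it up in a local table.
import hashlib

def fingerprint(audio_snippet: str) -> str:
    # Real systems fingerprint spectrogram features; a plain hash stands in.
    return hashlib.sha256(audio_snippet.encode()).hexdigest()[:16]

# Tiny stand-in for the ~17,000-song database shipped on the Pixel.
local_db = {
    fingerprint("snippet-of-song-a"): "Song A - Artist A",
    fingerprint("snippet-of-song-b"): "Song B - Artist B",
}

def recognize(audio_snippet: str) -> str:
    return local_db.get(fingerprint(audio_snippet), "unknown")

print(recognize("snippet-of-song-a"))  # matches locally, even offline
```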

#5 NVIDIATI
Member since 2010 • 8463 Posts

UPDATE:

With eight Google-designed custom cores, each with 512 arithmetic logic units (ALUs), the IPU delivers raw performance of more than 3 trillion operations per second on a mobile power budget.

Credit: Google

For comparison, Apple claims their neural engine inside the A11 can perform 600 billion operations per second.
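Working through those two claims (and assuming one operation per ALU per cycle, which is a simplification; real ALUs often count a multiply-add as two operations):

```python
# Back-of-envelope check on the figures quoted above. The implied clock rate
# assumes one op per ALU per cycle, which real hardware may not match exactly.
cores = 8
alus_per_core = 512
pvc_ops = 3.0e12        # Google's claim: >3 trillion ops/s
a11_ops = 600e9         # Apple's claim for the A11 neural engine

total_alus = cores * alus_per_core
print(total_alus)                   # 4096 ALUs in total
print(pvc_ops / total_alus / 1e6)   # ~732 MHz implied per-ALU rate
print(pvc_ops / a11_ops)            # ~5x the A11's claimed throughput
```

So at face value the claimed figures put the Pixel Visual Core at roughly five times the A11's neural engine, though raw ops/s says nothing about what each chip's workload actually is.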

#6 NVIDIATI
Member since 2010 • 8463 Posts

UPDATE 2:

iFixit's teardown shows the Pixel Visual Core inside the Pixel 2 XL (highlighted by the pink box):

Labeled: SR3HX X726C502

For comparison, the Snapdragon 835 is the SEC 731.

#7 musicalmac  Moderator
Member since 2006 • 25098 Posts

That's certainly interesting, though what it means day to day is still a mystery to me. It would be interesting to see a side-by-side breakdown of these separate chips and what they're actually doing functionally. As it stands now, I don't think it's meaningful to compare them on a performance basis without understanding what that performance is responsible for.

It seems to me as though these additional chips living alongside the 970 and 835 in the new Android flagships are all the more necessary for their respective devices, considering the stunted performance of the primary SoC (particularly in comparison to the A11 Bionic).

#8 NVIDIATI
Member since 2010 • 8463 Posts

@musicalmac

The machine learning APIs on both iOS and Android are open for developers to use in their apps. So while a side-by-side comparison isn't exactly possible with similar third-party apps at the moment, we'll likely see developers jumping on board over the next year.

In the case of Huawei, machine learning will be used for a number of features. The first is the camera: using image recognition, it will optimize the camera and the picture based on what is in the scene. For example, it can tell the difference between food, cats, dogs, people, etc., and adjust the image accordingly. Huawei has also worked with Microsoft to pre-load their translator software, which converts images and text between dozens of languages while offline. They're also using the NPU for device optimization based on the user: adjustments to improve battery life, optimize storage, manage multiple processes, and ensure there's no performance degradation over time.

From my understanding, the Pixel Visual Core (PVC) is currently inactive, and Google's first major use will be HDR+ with Android 8.1 in the coming weeks. This is also something they'll allow third-party apps to use shortly after. The camera on the Pixel 2 / 2 XL is actually very unique: every pixel on the sensor is split into two photodiodes, meaning each shot captures double the amount of data. Data is critical for Google, and the raw performance of the PVC can crunch it very quickly. That data can be used for camera features such as HDR+ or portrait mode, and it also benefits image recognition and AR applications.
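Some rough arithmetic on what that doubling means, assuming the Pixel 2's ~12.2 MP sensor and a burst of around ten frames for HDR+ (the burst length is an assumption on my part):

```python
# Rough numbers for how much data a dual-pixel sensor hands the Pixel
# Visual Core. Sensor resolution is the Pixel 2's ~12.2 MP; the HDR+
# burst length of 10 frames is an assumption.
megapixels = 12.2e6
subpixels_per_pixel = 2   # each pixel is split into two photodiodes
burst_frames = 10         # assumed HDR+ burst length

samples_per_shot = megapixels * subpixels_per_pixel
samples_per_burst = samples_per_shot * burst_frames
print(samples_per_shot / 1e6)   # ~24.4 million samples per frame
print(samples_per_burst / 1e6)  # ~244 million samples per HDR+ burst
```

Hundreds of millions of samples per capture is the kind of workload where a dedicated image processor pays off over running the merge on the CPU.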

While CPU compute on the A11 is definitely higher than on the 970 or 835, the graphics performance is only slightly better, so I wouldn't exactly call it a case of stunted performance. That being said, the performance of any app or function that can leverage machine learning will be immensely higher on the PVC than on the A11: raw performance alone is ~5x in favor of Google (~3.2x for Huawei). The PVC is separate from the Snapdragon 835, but, like the A11's neural engine, the Kirin 970's machine learning hardware is part of the SoC. CPU performance for machine learning is almost insignificant compared to the GPU, let alone dedicated hardware.

You also need to take into account the fact that Google is simply better at this than Apple. Looking at Google Assistant vs. Siri, it's easy to see that Apple's offering is well behind. There's a lot on the software side that allows Google to offer offline features, such as limited Google Assistant and music recognition, even without the dedicated hardware. Outside of Face ID, we haven't really seen anything major from Apple that leverages Core ML and the on-board machine learning hardware.

--

Drastically improving the performance of graphics features, such as ray tracing, is something that also becomes possible with machine learning hardware. I'd be curious whether smartphones will ever utilize that for games, AR, or other 3D applications.