Intel reveals its next-gen Iris graphics tech

New integrated GPUs show up to three times the performance of previous-generation chips, challenge AMD and Nvidia solutions.


Intel is promising better gaming performance with a new integrated graphics architecture dubbed Iris. The GPU will debut as part of Haswell, Intel's fourth-generation CPU architecture, which is currently being teased for a June 3 release in both laptop and desktop versions.

On the desktop, Iris will come in two configurations, the Iris 5200 and the Iris 5200 Pro (found in CPUs denoted with an R suffix). Intel's own 3DMark11 tests show more than twice the performance of the HD 4000 with the 5200 and more than three times the performance with the Iris 5200 Pro. That places the GPUs in the same performance range as current midrange dedicated GPUs and AMD's APU chips.

Intel has not yet provided any real-world benchmarks for games.

For high-end laptops, Iris takes the form of the Iris Graphics 5100. Intel promises the 5100 will deliver more than twice the performance of the HD 4000, but it comes with a 28W thermal design power (TDP), much higher than the 17W TDP used by comparable Sandy Bridge and Ivy Bridge CPUs, which currently limits its use to larger laptops.

Smaller Ultrabooks will use the Intel HD Graphics 5000 GPU. Intel's benchmarks show roughly 1.5 times the performance of the HD 4000 part, while also fitting the chips within a 15W TDP envelope for power-constrained devices.

Almost all of the new GPUs will support Direct3D 11.1, OpenGL 4.1, and OpenCL 1.2, along with a faster version of Intel's QuickSync video encoding engine, DisplayPort 1.2, improved support for 2K and 4K resolutions, and the ability to stretch one "logical" display across up to three physical monitors.
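
For developers, the OpenCL 1.2 support is the easiest of those claims to check from code. The sketch below is illustrative rather than anything Intel has published; it assumes an OpenCL SDK is installed and simply asks the first GPU it finds which OpenCL version its driver reports:

/* Illustrative sketch, not Intel sample code: query the OpenCL version an
 * integrated GPU's driver reports. Assumes an OpenCL 1.2 SDK and headers. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    char name[256], version[256];

    /* Take the first platform and its first GPU device. */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL GPU found\n");
        return 1;
    }

    /* CL_DEVICE_VERSION is a string such as "OpenCL 1.2 ...". */
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    clGetDeviceInfo(device, CL_DEVICE_VERSION, sizeof(version), version, NULL);
    printf("%s reports %s\n", name, version);
    return 0;
}

On a system with both integrated and discrete GPUs, the first device returned is not guaranteed to be the Intel part, so a real application would enumerate every platform and device rather than stopping at the first.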

The Iris 5200 Pro also makes use of a small amount of ultra-fast integrated eDRAM, providing a much higher memory bandwidth than system RAM for increased graphics performance.

A similar technique is used in the Xbox 360, which contains 10MB of eDRAM, while current rumours point to the next-generation Xbox using 32MB of eDRAM.
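
To see why the eDRAM matters, note that peak memory bandwidth is roughly channels x bus width x transfer rate, so a conventional dual-channel DDR3-1600 system tops out at about 25.6GB/s; the eDRAM's job is to give the GPU a pool of memory several times faster than that. The arithmetic below uses illustrative figures, not numbers from Intel's announcement:

/* Back-of-the-envelope peak-bandwidth arithmetic with illustrative figures,
 * not numbers from Intel's announcement. */
#include <stdio.h>

int main(void)
{
    /* Dual-channel DDR3-1600: 2 channels x 8-byte bus x 1600 MT/s. */
    double system_ram_gbs = 2 * 8 * 1600e6 / 1e9;  /* = 25.6 GB/s peak */

    printf("Dual-channel DDR3-1600 peak: %.1f GB/s\n", system_ram_gbs);
    /* The on-package eDRAM sits right next to the GPU and is claimed to be
     * several times faster, which is the whole point of including it. */
    return 0;
}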



Discussion

blindmoose1

Seems like a lot of people just dismiss this as garbage automatically because it is integrated...which I get. I use a GTX 680 in my desktop and wouldn't consider using this for gaming in a desktop. This is great news for the laptop/tablet space though. And even more so, I think people aren't seeing the big picture. There are inherent design advantages to using integrated graphics vs a discrete card. You can do the design such that the CPU and GPU have more direct memory access, which can GREATLY speed up operation. When you put things like this together on a chip, there are a lot of things you can do that are simply not possible otherwise. Right now, there are limitations on how well it works simply because the technology isn't quite there to manufacture it well enough, but eventually, using a discrete graphics card will seem outdated and silly, because individual chips will have enormous computing power. This will be the case sooner than you think, I'd wager. It'll start to make high end cards more and more of a niche.
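
A minimal sketch of the shared-memory idea described above, assuming an OpenCL 1.2 runtime on an integrated GPU (the buffer size and flags are illustrative, not tied to Iris specifically): on such a chip, CL_MEM_USE_HOST_PTR can let kernels work on a host allocation without an explicit copy.

/* Illustrative sketch (not Iris-specific): on an integrated GPU that shares
 * system memory, CL_MEM_USE_HOST_PTR can let kernels work on a host buffer
 * without an explicit host-to-device copy. Assumes an OpenCL 1.2 runtime. */
#include <stdlib.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);

    size_t bytes = 1 << 20;              /* arbitrary 1MB working set */
    float *host = malloc(bytes);         /* data the CPU already owns */

    /* Wrap the host allocation instead of copying it to the GPU. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_USE_HOST_PTR,
                                bytes, host, &err);

    /* ... enqueue kernels that read/write buf; no clEnqueueWriteBuffer ... */

    clReleaseMemObject(buf);
    clReleaseContext(ctx);
    free(host);
    return 0;
}

Whether the copy is actually avoided depends on the driver and on how the buffer is used, which is the same "technology isn't quite there yet" caveat.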

radikel

This is great news for HTPC users who do a bit of light gaming. Game for a while on low or medium settings when on a break, then back to square one.

Also, this is fackin Intel, who progress at the speed of light. They were nowhere in the graphics scene two years back, and Iris is already competing with the Nvidia 650. Just can't imagine how mind-blowing next year's Broadwell CPU release is going to be :))))

Majkic666

yawn... laptop technology is not interesting.... wake me up when you have something that is useful for Desktops

TohouAsura

That's nice, for light gaming gone a little less light. But there's nothing groundbreaking about this; the Iris of today is the Nvidia of a decade ago.

This still means nothing to gamers as we have no use for integrated graphics.

"challenge AMD and Nvidia solutions"

HAAH.

Useful for Laptops, surely. No use for power gamers.

tim1935

Two times zero is still zero, lol

PupilsDilated

If this drives down prices of competitors, then I'm all for it.

naryanrobinson

Just to clear this up: you can't use your graphics card and this at the same time.

It's nearly impossible.

If you're even a casual gamer and have a graphics card, this really doesn't mean very much to you, because the actual CPU power hasn't been improved anything like as much as the GPU power; maybe 10% at most.

This is only really news at all for laptops, and even then, only really because the new Haswells are going to feature far better battery life, which is more important than turning a laptop that's terrible at gaming into a laptop that's almost not terrible at gaming.

Dezuria

I had a graphics card BIOS flash go wrong (very scary), and it's thanks to that integrated GPU in my i5 that I was able to boot back into Windows, reflash the BIOS, and get it working!

So yeah, I'd never use an i-gpu normally, but I sure do appreciate that it's there.

gplayer5

Looks awesome. I would love to buy it, but I can't afford to pay thousands of dollars on stuff like that. Guess I'm sticking with my GTX 570 for right now.

godzillavskong

Still integrated. Not too interested, but at least it's a step in the right direction. I guess it's good for laptops or folks who really don't want to invest in a dedicated GPU, or have space constraints, like a media PC.

Jd1680a

I'm wondering if Intel will bring back Slot 1.

lonewolf1044

It still shares the disadvantages of integrated graphics: it cannot be upgraded and it is only good for the moment. However, I will be investing in the new 4770R.

Arab_Spring

While this comment section is probably the most informative/educational, it's also the most boring, nerdy, snoozefest, uninteresting, Java-lecture-feel, AutoCAD-feel, numbers/initials-orgy, headache-inducing comment section.

kalipekona

Cool. I might get a decent laptop to go with my desktop gaming PC for those times I want to game on the go.

botroo3

3x of extreme crap still equals crap.

BravoOneActual

"Mid-range" is moving target in PC gaming and needs to be directly defined in this article.

The GTX 480 is a two-generation-old flagship and last generation's GTX 570 represented enthusiast-grade muscle. Now there are plenty of sub-$200 cards -- and even a few at $160 -- that best them both in current games. Is this the kind of power they're touting with these chips?

If these all-in-one solutions can hang with the 7800s and 660s of the world in 1080p gaming, that would be pretty incredible, but I'd have to see it to believe it.


07wintert

I upgraded my old graphics card to a Radeon 7950, and I wouldn't be able to do that with integrated graphics; I'd have to upgrade the CPU, which is more expensive. That said, the last two computers I had came with Radeons in them and I can't see how it works. Do both graphics set-ups work at the same time? I think I have HD 3000 as well.

Solid_Azeus

The AMD 7990 is coming out, and Nvidia's next GTX monster soon after. When that happens, the current high end becomes mid-range and the current mid-range becomes low-end, along with this Intel make-do.

scorpgul

Looks like Intel is trying to get in the ring with the big boys. Although still primitive on Haswell, this integrated GPU could very well end up being the future. Competition is always good.

eric_neo3

@naryanrobinson Agreed, but I have to ask where the logic is in splitting the CPU into a CPU partition and a GPU partition.

The GPU partition will increase power drain and heat, which will wear out the chip faster. Not to mention you'll now have a GPU you can't use in your CPU and in the chipset of your motherboard, because most computers these days come with a dedicated graphics card.

inaka_rob

What do you mean, "still integrated"? That doesn't even make sense. All Intel chips have integrated GPUs built in. Doesn't mean you have to rely on it. The computer you have now has an integrated GPU. Maybe if you got interested you would know what you were talking about.

TohouAsura

@i-blast-brain So? You don't really need Iris. It's a stock improvement, but it's nothing worth trading your PC in for.

Wahab_MinSeo

@i-blast-brain My PC is old but good; I can't stand it much longer.
Core i7 960

AMD HD 5970

And I can play most games well in 1080p, but not at maximum graphics!

godfather830

@BravoOneActual Not quite. 3 times the performance of Intel HD 4000 puts them at the GTS 450 level or Radeon HD 7750 level.

Still pretty impressive if actually true.

Problem is: on desktops they only come with high-end CPUs. No one who cares about graphics will buy a high-end CPU and get a GTS 450-level GPU.

But on high-end laptops, it gets very interesting. A lot of them use mid-range or low-end Nvidia/AMD graphics, but these Intel chips may render them unnecessary.

godfather830

@07wintert No, you can only use one at a time. If you have a discrete desktop graphics card (like the Radeon HD 7950), the Intel integrated graphics get disabled.

Ravenlore_basic

@Solid_Azeus YES... and this repeats itself every year or two when a WHOLE new chip is announced, and every 6 months when a new variant of the chip launches.

Thus in order to have the best you have to have the $$$$$$ to do so.  

u1tradt

Intel getting in the ring with the big boys? Intel IS the big boy in the ring, mate.

naryanrobinson

@eric_neo3 Well that's what Intel are hoping to change, and they're succeeding.

Vendors are putting more and more fully fledged desktop PCs on the market that don't have dedicated graphics at all, and I don't blame them. The result is cheaper PCs that (nowadays) can do all the mid-range gaming and video editing you'd ever want.

Everything but the high-end graphics market will cease to exist in 5 years. Intel is being very aggressive about making it irrelevant.

XeonForce

@slayn03 -- I'm just looking at the picture that says "Incredible Desktop Graphics Performance"; what are you looking at?

Saketume

@godfather830 You know, I ran the HD 4000 on my non-OC 3770K for a while before I got my graphics card up and running, and it's actually not that bad.

I could play modern 3D MMOs and stuff at decent settings.

Games like Lego Batman 2 ran with no problem. Basically any non-demanding 3D game ran just fine. Much better than I thought they would. I even played some CoD4.

It's no replacement for a proper card, of course, but if this new chip is three times as powerful in its desktop version then it will definitely have its uses.

goggles123456

@Ravenlore_basic Man, I couldn't agree more. The last time I upgraded my desktop was 10 years ago, and I just bought a new laptop 2 years ago for gaming... it runs everything from my emulators to Skyrim and Dark Souls. All my friends are always building new PCs every 1-2 years... I keep trying to get them to see this point, but... THEY NEED MORE POWER OMG

scorpgul

@u1tradt In processors only. In Desktop Gaming Graphics processing, Intel is still a baby.

naryanrobinson

@quantumtheo High-end GPUs won't be irrelevant any time in the near future.

Even if they're made by Intel and not a third company, there will *always* be a market in taking a chip and pushing it hard enough that it's too hot to integrate into the CPU, and there will *always* be games that take advantage of whatever the sharpest, most realistic graphics happen to be.

It's the temperature that's restricting faster speeds right now, and the temperature of the processors being shipped isn't getting lower from year to year; they're being pushed nearly to their limit, then shipped like that.

Silicon has almost run its course and before long it will stop being useful. When that time comes, one of two things will happen.

Either we base processors on graphene or other carbon-based materials such as nanotubes, or we develop processors that use light to carry data and are no longer restricted by the speed of electricity.

quantumtheo

@naryanrobinson @eric_neo3 I think Moore's Law has more to do with it than anything else. I've been waiting for this day. And I'm glad; this will hopefully drive the prices of the highest-end video cards down. In 10 or 20 years, who knows, maybe integrated GPUs will be powerful enough to make high-end GPUs irrelevant.

I look at how far we've come over the last 20 years... in '93, you probably wouldn't have believed how powerful computers and GPUs have become. Again, the next 20 should offer huge leaps.

slayn03

@Arab_Spring @Saketume heh, 1up for u guys, but honestly that's a fairly good step for a mobile GPU... considering current chips, it provides something!

Ravenlore_basic

@scorpgul @u1tradt But they have the BILLIONS to do R&D and improve very fast, or just buy Nvidia, but I don't think that would be allowed.

Intel is just now being hit hard with competition in the chip department, and AMD is getting better with its APUs.

IF Intel does nothing, then they know their days are numbered.

N0tYrBeezin

@Solid_Azeus @u1tradt Not to mention these processors are not even on the market yet, whereas lots of PC gamers have been using AMD and Nvidia graphics cards for ages.

Solid_Azeus

@u1tradt No one is disputing that. But as far as graphics processing power is concerned, line of APUs or not, Intel is still a baby.

u1tradt

Well, Intel aren't challenging Nvidia by introducing their own line of APUs, are they? They're challenging their nearest competitor in the CPU market, which Intel is completely dominating. Hardly makes sense to say Intel is still a baby in that context.