Mammoth chip-maker to pay GPU manufacturer in five annual installments; all legal matters between the two dropped.
Intel will gain access to Nvidia's patents while paying the graphics chip supplier $1.5 billion in licensing fees as part of a new six-year agreement.
"For the future use of Nvidia's technology, Intel will pay Nvidia an aggregate of $1.5 billion in licensing fees payable in five annual installments, beginning Jan. 18, 2011," Nvidia announced today.
Furthermore, Nvidia and Intel have agreed to drop all outstanding legal disputes between them.
The crux of the agreement is that Intel gains access to all of Nvidia's GPU (graphics processing unit) patents but Nvidia gains access to only certain Intel patents. To compensate for the lopsided patent access (which favors Intel), Intel pays Nvidia $1.5 billion.
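The installment arithmetic behind the headline number is simple; as a minimal sketch (the announcement only states the total, the installment count, and the first payment date -- the equal split per year is confirmed by the five-installment structure):

```python
# Figures from the Nvidia announcement: $1.5 billion total,
# payable in five annual installments beginning Jan. 18, 2011.
TOTAL_FEE = 1_500_000_000
INSTALLMENTS = 5

per_year = TOTAL_FEE // INSTALLMENTS
print(per_year)  # 300000000 -> $300 million per year to Nvidia
```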
Intel and Nvidia had sued each other in early 2009 in a dispute that originally centered on a chipset license agreement. Intel contended that the cross-license did not extend to its future-generation processors, and Nvidia countersued, blocking Intel's access to its patent portfolio.
In effect, Nvidia was barred from building Intel-compatible chipsets beyond the Core 2 Duo generation of processors. For example, the second generation of Apple's MacBook Air used an Nvidia chipset along with Intel's Core 2 Duo processor. However, Nvidia could not build chipsets for the newest generation of Intel Core i3, i5, and i7 processors. This, in effect, forced Apple to stay with Intel's older-generation Core 2 Duo processors in its newest MacBook Airs, because staying with the older chips allowed Apple to legally continue using Nvidia chipsets.
The agreement announced Monday still bars Nvidia from using any of Intel's x86 technology, and as a result, Nvidia cannot build x86-compatible chipsets, according to Intel. But Nvidia CEO Jen-Hsun Huang made it clear he's not interested. "We've already said many times that we have no intention to build chipsets for Intel processors," he said in the conference call Monday afternoon. And many PC makers (including Apple) still use discrete (stand-alone) Nvidia GPUs that attach to Intel chipsets.
Huang expounded on Nvidia's traditional strong suit, GPUs--the technology covered by the patents Intel is now paying for, and which Nvidia incorporates into its ARM processors. GPUs excel at parallel processing, whereas CPUs (central processing units)--such as Intel's x86 chips--do sequential processing. Both types of processors have their merits, though GPUs have the potential to be much faster than CPUs at tasks such as visual processing and scientific number-crunching.
"I don't think you can build a modern computer today without a state-of-the-art GPU technology. Anytime you can do something in parallel, it's better than sequential," Huang said.
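For readers unfamiliar with the distinction Huang is drawing, here is a minimal Python sketch of the same operation expressed sequentially (CPU-style, one element at a time) and as a single data-parallel array operation (the style GPUs accelerate). NumPy stands in purely as an illustration of the parallel programming model; nothing here is Nvidia code:

```python
import numpy as np

# Sequential model: the CPU works through the elements one at a time.
def scale_sequential(values, factor):
    out = []
    for v in values:  # each element handled in turn
        out.append(v * factor)
    return out

# Data-parallel model: the same operation expressed over the whole
# array at once, which hardware like a GPU can apply to many
# elements simultaneously.
def scale_parallel(values, factor):
    return (np.asarray(values) * factor).tolist()

data = [1.0, 2.0, 3.0, 4.0]
assert scale_sequential(data, 2.0) == scale_parallel(data, 2.0)
```

Both functions compute the same result; the difference is that the second form exposes the whole workload at once, which is what makes parallel hardware effective.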
"Our cross license with Intel reflects the substantial value of our visual and parallel computing technologies. It also underscores the importance of our inventions to the future of personal computing, as well as the expanding markets for mobile and cloud computing," said Huang in an official statement today.
Huang went on to say that the company's focus is now on ARM processors--which compete with Intel's x86 chips in small devices like Netbooks and tablets. "It's a foregone conclusion that ARM is the most important [chip] architecture. ARM will be the largest installed-base processor. It's pervasive and open. We will extend the ARM processor with our GPU," he said.
Huang pointed to Microsoft's announcement at CES to port its next major release of Windows to ARM processors and to Nvidia's CES announcement of Project Denver, in which it will design high-performance ARM chips for desktops and supercomputers. Those future Nvidia chips will be hybrids--much like Intel's just-announced Sandy Bridge processor. "Project Denver...features an Nvidia CPU running the ARM instruction set, which will be fully integrated on the same chip as the Nvidia GPU," Bill Dally, Nvidia's chief scientist, said last week.
LOL. Sad to see all this lawsuit stuff being exploited here on GameSpot. Hey, we just buy and play the games. That's all.
That pretty much PROVES PC gaming is alive and kicking. Otherwise Intel couldn't care less about Nvidia. I'm sure they could think of other ways to spend 1.5 billion.
@KuRf You can overclock both Nvidia and ATI. ATI just has an "even morons can overclock now" interface for doing it.
Interesting development. -----Oh look, someone's already tied this article to a "pc gaming is dead" conclusion! Bravo! /sarcasm
"$1.5 billion is a lot of money to most people, but that's a drop in the bucket to Intel." No, but $300,000,000 a year going into Nvidia's coffers is a nice bonus for them, and probably considerably more profit (not income) than they would have generated from continuing chipsets. Nvidia also have a very strong lineup of GPUs at the moment, with the emerging 500-series GeForce cards being power efficient and lightning quick, plus the still successful GTX 460, which is reasonably priced for its performance. Nvidia are also succeeding in the supercomputing market, selling a large number of Tesla GPGPUs to corporations in China and the US, and to the European Space Agency.
"Intel is saying that part of Intel Insider is the ability for a media company or streaming media service to check whether your PC supports Intel Insider. If it does, you get the video. If not, they might decide not to deliver it to you." Guess I won't be buying an Intel CPU again. Thx, Intel!
Ok, so now Intel will make quality low-end GPUs. Nvidia doesn't care, since they left mobo chipsets last year and are focusing on ARM SoCs and GPGPU. But AMD should be really scared, since they have absolutely no chance to compete with Sandy Bridge atm, and Bulldozer is not coming soon (and btw, we still have to see how good it is, because Fusion is an epic fail). Intel already has 80% of the market share, but now has a chance to strike a killing blow to AMD in the next 2-3 years. And that's bad.
All I see is a further shift away from the PC gaming market. We've got incredibly powerful CPU/GPU combos right now that are lying dormant for the most part because no one will grow a pair and fund a game that pushes the hardware envelope. Crysis is still the most technically impressive game released and it's been out a little over 3 years now. Really wish Futuremark would make a game around their 3DMark 11 engine, some very impressive visuals going on in there. As far as their DRM on-chip, more power to them, would love to see a choice between a hardware-based disc check or online activation. No doubt there will still be ways around it, would make my life a lot less difficult if I didn't have to authorize and deauthorize games all the damned time.
AMD's CEO also resigned today: http://news.cnet.com/8301-1001_3-20028061-92.html
With that, three become two. Thus begins an AMD-ATI head-to-head with Intel-Nvidia. Who will win? Overclockers or casuals?
@MichaeltheCM $1.5 billion is a lot of money to most people, but that's a drop in the bucket to Intel.
Yes, I read about DMA on Google. Wow, that's really awesome for hackers: disable the protection inside the processor to unlock it, then copy movies to another blank disc.
Core i7 with Sandy Bridge will prevent you from copying HD movies like Blu-ray and online HD streams. Sandy Bridge has strict copy protection built inside.
I really don't care, as long as I can get awesome performance from my PNY GTX 480 and my quad-core (Q9550) processor.
I want real time, high resolution photorealism in games and virtual environments. It's unclear how this agreement will affect the push toward that.
I've got an Intel dual-core E5300 and an MSI Radeon 4670. Since upgrading from a Pentium 4 630 to the dual core, I've encountered many weird problems. For example, in the Gears of War final boss I get stuck between the three vehicles of the train, so I have to shake the mouse a little bit to move from one vehicle to another.
Just got my Sandy Bridge i7 and it's meh... it bottlenecks my GTX 580. Very disappointed in Intel. Make them work together, goddamnit!
Intel and nVidia. Together like chocolate and peanut butter. Mmmmm..... These two companies have both powered all my systems since 1994 and I'm happy to say, unlike Creative Labs, that I will continue to go with this combo for a long time to come.
Good to see the legacy instructions of x86 might finally fade out. How many PC users still run 16-bit apps regularly?
Gotta wonder what NVIDIA is going to do with all that money. Possibly a massive price reduction on their top end video cards!? *Sigh* If only...
@CPM_basic Not really. Just because the package says it has the same speed (3.2GHz, etc.) doesn't mean Intel's are overpriced compared to AMD's. The reason AMD's CPUs are cheaper at the same GHz that Intel offers is that they're less effective. Here's a benchmark example (laptop CPUs; Intel's is a duo, while the AMD is a triple, but it's almost the exact same score): Intel Core i3 330M @ 2.13GHz = 1,956; AMD Phenom II N830 Triple-Core @ 2.1GHz = 1,958; AMD Turion II Dual-Core Mobile M500 @ 2.2GHz = 1,322. Intel is a lot more powerful, but at the moment you really don't need it. That's a reason why I prefer AMD: it works fine and it's cheap : ) But if I was a super hardcore gamer who had the money, Intel all the way.
I lost all respect for Intel when they said they were going to implement DRM for their Insider feature. If I buy a product from a store, I want to be able to do whatever I want with it, because I'm the one that PAID for it. I give it about 2-3 days after its release date before hackers manage to bypass the DRM, and chances are that Intel knows this anyway... they're just raking in the cash from media companies for now.
They hopefully won't have any more of these "disputes," because I never buy any chipsets from AMD/ATI...
@telefanatic Got my G73 for the same reason as well, quite a nice setup :) Too bad for all this bad blood in the Nvidia/Intel camp, glad they got it sorted.
I'm all about AMD/ATI setups. My last rig rocked, but I had to sell it because I needed a laptop. Now I've got an Asus G73 with a Core i7 and an ATI 5870, and I max everything, so I'm happy. But my next rig, when I get my taxes, will be all ATI/AMD, because you just can't beat the price.
@sstravisd Won't happen. ARM processors are designed for low-power devices like smartphones, netbooks, and tablets. They can't compete with or replace x86-based processors, because there is too much legacy code out there and the x86 chips are extremely powerful. Have you seen the latest Intel offering, the Core i7-2600K? Even downclocked to ARM's speed, the desktop part would be many times faster.
So why were Apple Macs picked as the example out of ALL the high-end computers in the article? Subliminal messages, anyone?