So why wouldn't next-generation consoles use Nvidia GPUs instead of AMD?


This topic is locked from further discussion.

#1 ShadowDeathX
Member since 2006 • 11698 Posts

Oh look, a high-end computer graphics card that is only 7 inches (177 mm) long. This is a full Nvidia GTX 670. It requires very little power compared to other high-end PC graphics cards, it produces much less heat (see those two small fans? That's all it needs), and it is small.

Now imagine this chip customized for consoles, integrated onto the console mainboard with customized cooling. You could have a powerful console in a very small size. AMD's offerings are huge compared to this.

This would be a neat GPU to have on a console.

oh3r5.jpg

#2 sneslover
Member since 2005 • 957 Posts

Expensive?

MS wised up after the first Xbox, and Sony seems to have done the same with the PS4.

#3 ShadowDeathX
Member since 2006 • 11698 Posts

[QUOTE="sneslover"] Expensive? MS wised up after the first Xbox, Sony seems to have done the same with PS4. [/QUOTE]

These chips probably don't cost the AIB partners much money to produce. Microsoft or Sony can ask for a similar design and buy the chips in bulk at a much bigger discount than we consumers pay.
#4 famicommander
Member since 2008 • 8524 Posts
Microsoft and Nintendo's current consoles use AMD graphics. Switching to Nvidia would likely complicate backwards compatibility a lot, as it did when Microsoft went from Nvidia (Xbox) to ATI (360).
#5 Stevo_the_gamer  Moderator
Member since 2004 • 49576 Posts
It would be bloody impractical to put something like that in a small console. Cost, size, power, and heat. Not ideal.
#6 deactivated-5cf4b2c19c4ab
Member since 2008 • 17476 Posts

AMD probably offered the better deal/support/technology for the consoles. The console manufacturers didn't decide who to go with on a fleeting whim; it would have been a very important choice, and we know barely anything about what went on when the hardware was chosen.

#7 NoodleFighter
Member since 2011 • 11803 Posts

AMD has the better low-end hardware, which is probably what the next-gen consoles are going to use.

#8 theuncharted34
Member since 2010 • 14529 Posts

It's pretty crazy how the same company that could come up with the **** that was Fermi could come up with a beast such as Kepler.

Architecture, features, and power aside, MS won't use them because Nvidia really screwed them over with the Xbox.

That, and I'm pretty sure both Sony and MS have already signed deals with AMD; at least MS has. It'd be cool if Sony would use Kepler, but that's hoping for a bit too much. It's entirely possible, but it isn't a given that the next-gen consoles will be that powerful.

#9 ionusX
Member since 2009 • 25777 Posts

Power consumption is primarily to blame. Console makers can't afford to put a good power supply in their systems, so a tiny, weak mofo of a PSU is basically a guarantee. It then comes down to the lowest power consumption with the most well-rounded performance, and in that regard AMD holds all the cards. Nvidia can't stack the ---- up.

#10 Inconsistancy
Member since 2004 • 8094 Posts

Die size.

The smaller the die, the more dice you get per wafer, and the less likely each one is to be damaged and rendered useless, so yields are higher.
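The yield argument above can be sketched with a standard Poisson defect model, where the fraction of defect-free dice falls exponentially with die area. The defect density and die areas below are illustrative assumptions, not figures from this thread:

```python
import math

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    """Fraction of defect-free dice under a simple Poisson defect model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

D = 0.4  # assumed defect density (defects per cm^2) on a maturing process

small_die = poisson_yield(2.94, D)  # ~294 mm^2, roughly a GK104-class die
large_die = poisson_yield(3.65, D)  # ~365 mm^2, roughly a Tahiti-class die

print(f"small die yield: {small_die:.1%}, large die yield: {large_die:.1%}")
```

Shrinking the die helps twice over: more candidate dice fit on each wafer, and each one is less likely to catch a defect.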

#11 Cali3350
Member since 2003 • 16134 Posts

Because AMD offers similar performance at much better power targets with an APU, which seems all but certain to be the basis for both consoles at this point. Technologically there is nothing Nvidia offers that AMD doesn't, except that AMD is willing to sell the designs outright and forgo royalties, something Nvidia won't do.

#12 topgunmv
Member since 2003 • 10880 Posts

Because nvidia is a terrible partner that has boned every console manufacturer that they've signed deals with.

#13 mitu123
Member since 2006 • 155290 Posts

[QUOTE="NoodleFighter"] AMD has the better low end hardware which is probably what the next gen consoles are going to use [/QUOTE]

Maybe so.

[QUOTE="theuncharted34"] It's pretty crazy how the same company that could come up with the **** that was Fermi could come up with a beast such as Kepler. [/QUOTE]

460s are awesome at least!

#14 Mozuckint
Member since 2012 • 831 Posts

As a few have pointed out, it's most likely the pricing on Nvidia's end compared to what AMD may be asking/offering.

#15 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="mitu123"] Maybe so. 460s are awesome at least! [/QUOTE]

Don't believe me? Check out the benchmarks of Nvidia's GT series versus AMD's low-range cards.

Low-range cards are where Nvidia needs to improve; they pretty much half-ass it all the time.

#16 theuncharted34
Member since 2010 • 14529 Posts

[QUOTE="topgunmv"] Because nvidia is a terrible partner that has boned every console manufacturer that they've signed deals with. [/QUOTE]

Just MS; it was Sony's own fault for signing with Nvidia when they knew what Xenos was capable of. I bet it would have saved Sony money to go with ATI as well.

Every possible hardware decision Sony made for the PS3 was awful.

#17 mitu123
Member since 2006 • 155290 Posts

[QUOTE="NoodleFighter"] Don't believe me? Check out the benchmarks of Nvidia's GT series versus AMD's low range cards. Low range cards is where Nvidia needs to improve, they pretty much half ass it all the time. [/QUOTE]

I know about that; AMD is awesome in that area.

#18 Jebus213
Member since 2010 • 10056 Posts
Maybe because AMD doesn't care much about size?
#19 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ShadowDeathX"] Oh look a high end computer graphics card that is only 7 inches/177mm in length. This is a full Nvidia GTX 670. It requires very little power compared to other high end PC graphics cards, it produces much less heat, and it is small. Now imagine this chip customized for consoles, integrated onto the console mainboard with customized cooling. AMD's offerings are huge compared to this. [/QUOTE]

With the Radeon HD 7870, NVIDIA hasn't beaten AMD on performance per watt or performance per die size.

Compared to other high-end PC graphics cards, the Radeon HD 79x0 supports 64-bit floating-point math better than the GTX 680.

The Radeon HD 79x0 will receive its own power-reduction improvements via the Radeon HD 7970 GHz Edition's version 2 stepping.

Tahiti XT2, or Radeon HD 7970 GHz Edition, will ship with a core clock speed of 1100 MHz, 175 MHz faster than the HD 7970. The GPU core voltage of Tahiti XT2 will be lower, at 1.020 V, compared to 1.175 V for the Tahiti XT.

The Radeon HD 7970 (Tahiti XT1) consumes about 210 watts(1) at 1.175 V. If the current draw remains the same, Tahiti XT2's 1.020 V may yield about 182 watts, i.e. lower than the GTX 680's 190 watts. Tahiti XT2 at 1.020 V is also clocked faster, at 1100 MHz, i.e. 175 MHz faster than Tahiti XT1's 925 MHz.

References

1. The 7970 consumes 210 watts: http://www.atomicmpc.com.au/Feature/296004,amd-radeon-hd-7970-reference-disassembly-guide.aspx

From http://www.guru3d.com/article/asus-radeon-hd-7970-directcu-ii-review/4 : "Believe it or not, the high end Radeon HD 7970 has a rated peak TDP (maximum power draw) of just 210 Watt, and that's really all right for a product of this caliber, features and performance."

From http://www.techspot.com/review/481-amd-radeon-7970/page2.html : "it still chugs up to 210 watts under load"
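The ~182 W estimate above is just linear P = V x I scaling, under the poster's assumption that current draw stays the same between steppings. A quick sketch using only the numbers quoted in the post:

```python
# Figures quoted above for Tahiti XT1 (Radeon HD 7970)
P_XT1_WATTS = 210.0  # load power from the linked reviews
V_XT1 = 1.175        # core voltage, Tahiti XT1
V_XT2 = 1.020        # core voltage, Tahiti XT2 (7970 GHz Edition)

# Assumption from the post: current stays the same, so power scales with voltage
current = P_XT1_WATTS / V_XT1     # implied aggregate current draw
P_XT2_WATTS = current * V_XT2     # estimated Tahiti XT2 power

print(f"estimated Tahiti XT2 power: {P_XT2_WATTS:.0f} W")  # about 182 W
```

This is a back-of-envelope figure; real power also depends on the clock increase and leakage, which the simple scaling ignores.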

#20 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Cali3350"] Because AMD offers similar performance at much better power targets with an APU, which seems all but certain to be the basis for both consoles at this point. [/QUOTE]

AMD also offers high-performance CPU cores, unlike NVIDIA's cut-and-paste ARM Cortex-A9 efforts.
#21 reach3
Member since 2012 • 1600 Posts

Because Nvidia is overpriced garbage?

AMD offers the same specs for less money. Nvidia can go out of business for all I care.

#22 Kahuna_1
Member since 2006 • 7948 Posts

If they are both using AMD CPUs, then they probably get a discount on the GPUs.

#23 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="reach3"] Because Nvidia is overpriced garbage? AMD offers the same specs for less money. [/QUOTE]

There is more to it than just specs.

On average, Nvidia has the better driver support, and they have exclusive features such as GPU-accelerated PhysX (non-Nvidia users can use it, but it runs better on an Nvidia system).

#24 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="NoodleFighter"] There is more than just specs. On average, Nvidia has the better driver support and they have exclusive features such as GPU-accelerated PhysX. [/QUOTE]

AMD also has GPGPU-based physics via the open-source Bullet Physics and DMM2 middleware. From http://www.itproportal.com/2010/03/08/amd-gives-away-gpu-physics-tools/ AMD gives away GPU physics tools.

The latest Futuremark 3DMark 11 has dropped NVIDIA's proprietary PhysX middleware.

On Windows, Microsoft's C++ AMP reduces the need for NVIDIA's proprietary middleware.

#25 ionusX
Member since 2009 • 25777 Posts

[QUOTE="ronvalencia"] AMD also has GPGPU-based physics via the open-source Bullet Physics and DMM2 middleware. The latest Futuremark 3DMark 11 has dropped NVIDIA's CUDA/PhysX middleware. [/QUOTE]

I was so stoked when that happened. One less synthetic benchmark that won't be one-sided. FTW!

#26 Inconsistancy
Member since 2004 • 8094 Posts

[QUOTE="NoodleFighter"] There is more than just specs. On average, Nvidia has the better driver support and they have exclusive features such as GPU-accelerated PhysX (non-Nvidia users can use it, but it runs better on an Nvidia system). [/QUOTE]

PhysX isn't a good feature, because of its unnecessary exclusivity; it's by no means a feature that can't be done on AMD hardware. If anything, the exclusivity increases development time, since you still have to develop an alternative for the other vendor, use the terrible CPU version, or exclude the feature altogether.
#27 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="ronvalencia"] AMD also has GPGPU-based physics via open-source Bullet Physics and DMM2 middleware. The latest Futuremark 3DMark 11 has dropped NVIDIA's proprietary PhysX middleware. On Windows, Microsoft's C++ AMP reduces the need for NVIDIA's proprietary middleware. [/QUOTE]

So what games use it?

#28 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="Inconsistancy"] PhysX isn't a good feature because of its unnecessary exclusivity; it's by no means a feature that can't be done on AMD hardware. If anything, the exclusive features it provides increase development time. [/QUOTE]

Yeah, but it is part of the cost of Nvidia GPUs. I'm just glad that PhysX is getting better CPU support so we can see it used in more games, but Nvidia should be kind enough to at least let the GPU do some of the work.

#29 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="NoodleFighter"] So what games use it? [/QUOTE]

The focus is on next-gen games. EA's Frostbite 2 and CryEngine 3 don't need NVIDIA's proprietary PhysX middleware.

From http://bulletphysics.org/wordpress/?p=279

"The presentation is titled 'game physics artifacts' and covers similar material to chapter 2 of the Game Physics Pearls book. We will use both Bullet and the new Sony Physics Effects SDK to illustrate some examples. You can download the presentation for the Game Developers Conference Physics Tutorial and the preliminary Physics Effects-Bullet integration from http://code.google.com/p/bullet/downloads/list."

ARM, Sony, AMD, Microsoft, Google, and Intel are actively working against NVIDIA's proprietary middleware.

#31 lostrib
Member since 2009 • 49999 Posts

Doesn't Nvidia have supply problems right now for the 680 and 670? In addition, a low power draw for a PC may still be too much for a console.

#32 ionusX
Member since 2009 • 25777 Posts

OGRE engine games also take advantage of this. Torchlight 1 & 2 and the upcoming indie game Nekkro are all based on the OGRE engine.

#33 ShadowDeathX
Member since 2006 • 11698 Posts
Now now, but later. :) How many games use GPU-accelerated PhysX? Three a year in a good year, two on average.
#34 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="lostrib"] Doesn't Nvidia have supply problems right now for the 680 and 670? In addition, a low power draw for a PC may still be too much for a console. [/QUOTE]

Well, some guy upgraded his Alienware X51 to a GTX 670 and he had no problems, and the PSU was only 330 watts.

#35 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="famicommander"] Microsoft and Nintendo's current consoles use AMD video. Switching to Nvidia would likely complicate backwards compatibility a lot, such as when Microsoft went from Nvidia (Xbox) to ATI (360). [/QUOTE]

For Nintendo, Wii backwards compatibility on NVIDIA GPU hardware wouldn't be a problem; for example, refer to the Wii emulator D*lphin.

The Wii's GPU is nothing like a Radeon HD GPU.

#36 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="ShadowDeathX"] Now now, but later. :) How many games use GPU Accelerated PhysX? Three a year in a good year. Two average. [/QUOTE]

Well, PhysX is used in over 300 games, but GPU-accelerated PhysX is really in only about 24.

#37 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="NoodleFighter"] Yeah, but it is part of the cost of Nvidia GPUs. I'm just glad that PhysX is getting better CPU support so we can see it used in more games, but Nvidia should be kind enough to at least let the GPU do some of the work. [/QUOTE]

From http://http.developer.nvidia.com/GPUGems3/gpugems3_ch16.html for Crysis 1:

"Vegetation in games has always been mainly static, with some sort of simple bending to give the illusion of wind. Our game scenes can have thousands of different vegetations, but still we pushed the envelope further by making vegetation react to global and local wind sources, and we bend not only the vegetation but also the leaves, in detail, with all computations procedurally and efficiently done on the GPU."

Note that Crysis doesn't use NVIDIA's PhysX.

#38 NoodleFighter
Member since 2011 • 11803 Posts

[QUOTE="ronvalencia"] From http://http.developer.nvidia.com/GPUGems3/gpugems3_ch16.html for Crysis 1: vegetation reacts to global and local wind sources, with all computations done procedurally and efficiently on the GPU. Note that Crysis doesn't use NVIDIA's PhysX. [/QUOTE]

There are games that use PhysX and there are games that don't.

I'm guessing your point is that if devs put in the effort, they don't need PhysX or CPU-based PhysX to process this stuff.

#39 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="NoodleFighter"] Well, PhysX is used in over 300 games, but GPU-accelerated PhysX is really in only about 24. [/QUOTE]

Those are mostly current-gen games, and they are usually tied to Epic's Unreal Engine (a known NVIDIA partner).

Sony and Microsoft have their own ideas.

#40 Aidenfury19
Member since 2007 • 2488 Posts

Nvidia is run by an *** and has screwed two major console manufacturers during the last two generations; consoles have to finalize hardware relatively late, and major contract renegotiations at this stage are a big no-no; AMD has had better GPUs with little exception for the last several years; Nvidia doesn't make CPUs; and AMD is probably much more likely to cut a deal with slim profit margins.

So you know, just a few reasons.

#41 ShadowDeathX
Member since 2006 • 11698 Posts
Looks like it is Sony's move on whether they will use Nvidia this time around. Radeon HD 4770 GPU for the Wii U; AMD Kaveri APU for the next Xbox.
#42 reach3
Member since 2012 • 1600 Posts
[QUOTE="ShadowDeathX"] Looks like it is Sony's move on whether they will use Nvidia this time around. Radeon HD 4770 GPU for the Wii U; AMD Kaveri APU for the next Xbox. [/QUOTE]

Also looks like the Wii U will be shunned by third parties for being so weak compared to the PS4 and 720.
#43 ShadowDeathX
Member since 2006 • 11698 Posts

[QUOTE="reach3"] Also looks like the Wii U will be shunned by third parties for being so weak compared to the PS4 and 720. [/QUOTE]

I don't think so. If developers can get a game running on the next Xbox at 1080p, they can downscale that same game to run on the Wii U at 720p.

Multiplats might change next generation, with each platform being so different from one another.
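The downscaling argument is mostly about pixel count: 720p has well under half the pixels of 1080p, so a GPU budget sized for 1080p leaves plenty of headroom at 720p. A quick check of the ratio:

```python
pixels_1080p = 1920 * 1080  # 2,073,600 pixels per frame
pixels_720p = 1280 * 720    # 921,600 pixels per frame

ratio = pixels_1080p / pixels_720p
print(f"1080p pushes {ratio:.2f}x as many pixels as 720p")  # 2.25x
```

Of course fill rate isn't the only bottleneck (geometry, memory bandwidth, and CPU work don't shrink with resolution), so this is only a rough argument.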

#44 jsmoke03
Member since 2004 • 13717 Posts

I wish I knew what you guys were talking about, lol.

#45 Mr_BillGates
Member since 2005 • 3211 Posts

Because a second-fiddle company like AMD offers cheaper solutions.

#46 Xboxdroolz
Member since 2012 • 386 Posts
Nvidia and Microsoft got into a lawsuit over the original Xbox GPU, lol. Nvidia claimed MS owed them royalties, plus Nvidia couldn't produce enough of the chips, some BS like that.
#47 Xboxdroolz
Member since 2012 • 386 Posts
Yeah, maybe Nvidia wouldn't let Microsoft produce original Xbox GPUs either. I remember reading an article about the Nvidia/Microsoft issues with the Xbox GPU, lol.
#48 Xboxdroolz
Member since 2012 • 386 Posts
Most noobs don't know that the GPU for the Xbox was supposed to be clocked at 300 MHz in the announced specs, but it was lowered to 250 MHz because they were getting low yields of working chips at 300 MHz.
#49 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="Mr_BillGates"] Because a second fiddle company like AMD offers cheaper solutions. [/QUOTE]

AMD's licensing model and partnership with IBM/GlobalFoundries is superior, i.e. it enabled an IBM PPE + AMD Xenos fusion chip.

At least AMD hasn't gimped 64-bit floating-point compute in their Radeon HD 79x0.

AMD's PC GPU market share is larger than NVIDIA's. http://www.guru3d.com/news/jpr-q1-2012-gpu-shipments-down-338-percent-yearoveryear/

#50 CwlHeddwyn
Member since 2005 • 5314 Posts
[QUOTE="Stevo_the_gamer"] It would be bloody impractical to have something like so in a small console. Cost, size, power and heat. Not ideal. [/QUOTE]

Indeed; by the time technology advanced and the die size could be shrunk enough to make the GTX 670 small enough and cheap enough, it would be out of date.