Will the next gen consoles determine how long our graphics cards will last?

This topic is locked from further discussion.

#1 deactivated-5d98e9b222525
Member since 2008 • 162 Posts

I was thinking of buying a 5850 and wondered how many years I could realistically go before having to upgrade again. This led me to thinking that PC games and hardware are always going to be on a par with or better than the current generation of consoles. How intense would a game have to be to push today's top-end cards down to only 60 fps? Is that level of detail on the horizon?
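
To put the 60 fps question in concrete terms, here is the basic frame-budget arithmetic (a rough sketch; the example render times in the comments are illustrative assumptions, not benchmarks):

```python
# Frame-budget arithmetic: the milliseconds a GPU gets per frame.
def frame_budget_ms(target_fps: float) -> float:
    """Time available to render one frame at the target frame rate."""
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

# A scene that renders in 10 ms holds 60 fps with headroom; roughly
# double the shading work (resolution, AA, effects) and the same scene
# blows past the 16.67 ms budget and drops below 60 fps.
```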

#2 04dcarraher
Member since 2004 • 23832 Posts

Well, console makers are focusing more on software than hardware nowadays. I don't expect the next set of consoles to be as expensive and as powerful, relative to a semi-modern PC, as the consoles of 2005-2006 were. I think the next set of consoles will only have 2-4 GB of memory, modified versions of the CPUs they're using now, and a $100-range PC GPU that we can pick up today. They are going to keep the new consoles under $500, more likely in the $400 range. Consolers will think it's a new generation of graphics (detail, level sizes, etc.) when it's what we PC users have been enjoying for the last 4-5 years.

#3 topsemag55
Member since 2007 • 19063 Posts

StormtrooprDave said:

"I was thinking of buying a 5850 and wondered how many years I could realistically go before having to upgrade again. This led me to thinking that PC games and hardware are always going to be on a par with or better than the current generation of consoles. How intense would a game have to be to push today's top-end cards down to only 60 fps? Is that level of detail on the horizon?"

If you think about what's inside the casings of the consoles, then you won't have to worry about them affecting a 5850. :)

One console uses the R600 foundation of the Radeon HD 2000 Series, and the other uses a GeForce 7800 GTX.

Both are outdated - look at how many series of GPUs both nVidia and ATI have engineered in the last 5 years.

Aside from that, the 7800 GTX is DirectX 9.0 and older only - it doesn't support anything else associated with nVidia, such as CUDA, PureVideo, DX10 & 11, or PhysX.

My prediction is the next gen of consoles will use the 8800 GTX and probably the Radeon 3000 series, or maybe one series higher for both.

#4 aura_enchanted
Member since 2006 • 7942 Posts

Odds are the best a console will be is:

Phenom X4 2.6 GHz, 4 GB RAM, and an HD 3870 at best... if you can top that, any console game is pie...

And FYI, the first AMD quad cores suck eggs... basic Core 2 Duos own them...

#5 9mmSpliff
Member since 2005 • 21751 Posts

The next gen of consoles is about two years away, which will make the current GPUs three years old by then. It would not blow me away if the next gen consoles stepped it up to take on the PC a lot more vigorously. I think we will see at least DX10.1 implemented in the next gen, with a chance of DX11 in there. If that is the case, it could hurt the PC, or bring it alongside the console as more of a multi-platform buddy.

I think ATI is working with M$ on the Xbox 720, and Nintendo has used ATI for the Gamecube and Wii, so maybe the Wii 2 also. nVidia and Sony worked together on the PS3, Intel's Larrabee was dropped as the next GPU for the PS4, and nVidia also just shrunk the GPU for Sony in the Slim, so they could be working on something for the PS4. Keep in mind nVidia is all about 3D right now, and so is Sony.

The next gen of GPUs will need around 1 GB of VRAM for 1080p graphics, and physical memory around 2 GB for quicker load times (512 MB currently). We already know the Cell is in the PS4 again; I believe IBM is doing the CPU for the Xbox 3, and I am sure the same goes for the Wii 2. The current generation uses a 7800GT 256 MB or a mobile 7800GTX 256 MB in the PS3, the Xbox 360 uses the X1900 series from ATI, and the Wii is somewhere around an X600 from ATI.
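
For a sense of where a figure like 1 GB comes from, here's the back-of-envelope framebuffer math (a rough sketch; the buffer setup below is an assumption, and in practice textures and geometry eat most of the VRAM):

```python
# Back-of-envelope framebuffer math for 1080p rendering.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4  # 8-bit RGBA

def buffer_mb(samples_per_pixel: int = 1) -> float:
    """Size in MB of one 1080p buffer at the given sample count."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * samples_per_pixel / 1024**2

color = buffer_mb()
# Assumed setup: double-buffered color, one depth/stencil buffer,
# and a 4x multisampled render target.
total = 2 * color + buffer_mb() + buffer_mb(samples_per_pixel=4)
print(f"One 1080p color buffer: {color:.1f} MB")
print(f"Double buffer + depth + 4x MSAA target: {total:.1f} MB")
# Well under 100 MB -- the rest of a 1 GB budget goes to textures,
# geometry, and shader data, which is what actually fills VRAM.
```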

Keep in mind, these GPUs were out only a year before the current consoles launched (minus Nintendo). So if that trend holds, we could see the next gen using 5850-6850s and GTX 470-GTX 570s. But that is only if they implement DX11; if not, we will probably see GTX 260-GTX 280s from nVidia and 4870s from ATI.

That's just my opinion, and everyone's will be different, as we have no idea what the big players are doing (WHICH SUCKS, hahaha).

#6 mouthforbathory
Member since 2006 • 2114 Posts

The PS3 has a G71 derivative which was indeed the same GPU used in the 7800 series.

The 360 doesn't use an X1900 derivative but a precursor to the R600: it is a unified shader part and was essentially developed into the R600 series. It also relies on an eDRAM die and isn't fully DX10 compatible (it can do some DX10-level things, though). It isn't related to the X1900 at all.

The Wii doesn't use an X300/X600 part, nor is the GPU in the Wii even equivalent to one; it's much weaker than those ATi GPUs. It's just a smaller-process, higher-clocked version of the GPU used in the Gamecube, to preserve perfect backwards compatibility, and the same goes for the CPU. The big increase for the Wii was the amount of RAM and the much higher memory bandwidth.

As for the next gen systems, we'll see derivatives/developments of what the current consoles use now. They will need backwards compatibility in hardware, with the exception of the Wii's successor. The Wii is quite easily emulated; homebrew developers have proven that with the Dolphin Wii/GC emulator.

I think it would be in Ninty's best interest to go PPC again, but with a newer architecture, and to use a whole new GPU. Honestly, it wouldn't take much to satisfy me in a new Wii, especially if low development cost is desired: I'd go with a quad-core PPC CPU and a Radeon 5670 with 1 GB of GDDR5 RAM as a unified memory pool. As paltry as the Wii is now, even a dual-core PPC, a Radeon 5450-series part, and 512 MB of GDDR5 would satisfy most developers, and it would make porting extremely easy. A current GPU feature set would be there for devs, and 720p could easily be done. The 80 stream units in the 5450 would still be only roughly a third of what the 360 can do now, but the graphics would still be roughly 4 to 5x better than what we see with the Wii, and at 720p. Shaders would be easily implemented, and global lighting and shadowing would be easy to do. I'd still recommend a 5670, though: it's 5x more powerful than a 5450 (400 vs. 80 stream processors), and 1080p would be easy to do.
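
The ratios above follow from simple stream-processor counts (a rough sketch; counting Xenos's 48 unified shader units as roughly 240 scalar ALUs is a common assumption, not an exact equivalence):

```python
# Comparing GPUs by raw stream-processor (scalar ALU) counts.
gpus = {
    "Xbox 360 Xenos": 48 * 5,  # 48 unified units x ~5 ALUs each (assumed)
    "Radeon 5450": 80,
    "Radeon 5670": 400,
}

xenos = gpus["Xbox 360 Xenos"]
for name, alus in gpus.items():
    print(f"{name:>14}: {alus:>3} ALUs ({alus / xenos:.2f}x Xenos)")

# 80 / 240 = 1/3 of Xenos, and 400 / 80 = 5x the 5450 -- the same
# ratios as in the post. Raw ALU counts ignore clock speed, memory
# bandwidth, and architecture, so treat them as rough scale only.
```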

It is still highly possible that Sony will try to develop the Cell further, despite IBM virtually abandoning it. 2 PPEs + 16 SPEs is quite plausible, perhaps 32; it was on IBM's research roadmap before they ended Cell development on their end. Surely Sony will go with another nVidia GPU in order to have all the data they need to make sure BC isn't an issue, probably something along the lines of a "medium end" GeForce GTX/S 4xx-series part. It's also possible that Sony will hardly revise the Cell BE at all and use a high-end nVidia GPU to augment it, not only for graphics but for GPGPU functions as well.
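
For scale, the usual single-precision peak arithmetic for a scaled-up Cell looks like this (a rough sketch; 8 FLOPs per cycle per SPE at 3.2 GHz is the commonly cited peak figure, and peak numbers never translate directly into real performance):

```python
# Peak single-precision GFLOPS for hypothetical Cell configurations.
CLOCK_GHZ = 3.2
FLOPS_PER_CYCLE = 8  # 4-wide SIMD multiply-add per SPE, peak figure

def spe_gflops(num_spes: int) -> float:
    """Theoretical peak GFLOPS from the SPEs alone."""
    return num_spes * CLOCK_GHZ * FLOPS_PER_CYCLE

for spes in (8, 16, 32):  # the PS3's Cell has 8 SPEs on die
    print(f"{spes:>2} SPEs -> {spe_gflops(spes):6.1f} GFLOPS peak")

# 16 SPEs would double the PS3's ~204.8 GFLOPS of SPE throughput,
# and 32 would quadruple it.
```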

The next Xbox will probably use a newer PPC quad-core, one that wipes the floor with the Xenon's older PPC architecture per clock yet is still compatible. I think they will surely do some GPGPU implementation, using an ATi 57xx- or 58xx-equivalent part that affords relatively huge amounts of graphics power while leaving plenty for on-GPU physics and the like at the same time.

#7 SakusEnvoy
Member since 2009 • 4764 Posts

I think you can buy a good graphics card today with confidence that you won't be forced into upgrading it anytime soon. Most people predict an extended console generation, and even when more powerful consoles do arrive, I expect they will not be sold at a loss as tradition would normally dictate. In other words, consoles' technical capabilities may start to plateau as manufacturers put more of an emphasis on software and on bringing in broad audiences through, for example, new control interfaces. This trend appears to be both clear and irreversible to me. While more powerful GPUs are inevitable, I don't expect Microsoft or Sony to break the bank trying to put them into their next consoles.

#8 9mmSpliff
Member since 2005 • 21751 Posts

I was using the X1900 as something people could relate to for speed. It is a precursor to the R600, but its performance is more around an X1900. Also, I remember years ago when it launched, Guru3D (I believe) did a report on the Wii, finding its graphical capabilities were near an X600 from ATI.

I was also reading in a tech mag a year ago that M$ was looking at dual cores for the next gen, not quads (but that could change). The Cell was staying, and they were trying to make it easier for developers to offload some of the graphics onto the Cell. As everyone knows, Sony rushed to get the G71 into the PS3 after failing to get the Cell to run the graphics, though the Cell and GPU can communicate with each other. Now they want it easier to program for, as that was the complaint about developing games for the PS3: hard to program for, and expensive dev kits.