PS4 rumoured to use same graphics card as next Xbox - Report

IGN is reporting that Sony's next console will use AMD's A8-3850 APU and an HD 7670 GPU.

Sony's next-generation console, reportedly codenamed Orbis, is rumoured to have the same graphics processing power as the next Xbox.

Will the PS4 be graphically aligned to the next Xbox?

According to a report on IGN citing anonymous sources, the PlayStation 4 is rumoured to utilise custom chips based on AMD's A8-3850 APU and the Radeon HD 7670 GPU.

The report states that the HD 7670, a DirectX 11-enabled card clocked to 1GHz with up to 1GB of dedicated VRAM, is a "re-branded" version of the HD 6670 with identical specifications that will work together with the APU (the A8-3850 has a quad-core 2.9GHz processor with an integrated graphics chip).

According to IGN, the HD 6670 is the same card that will reportedly be used in the next Xbox console; if true, this would mean that Sony's next-gen console would be graphically identical to Microsoft's next-gen effort.

Whether this will help cross-platform development remains to be seen; IGN's sources say Sony will "custom tool" the chipsets mentioned above for the PS4 ahead of the console's launch.

Discussion

205 comments
XspidervenomX

Fanboys talked themselves into believing the PS3 was more powerful than nuclear fusion... turned out it was weak crap. It had RAM amounts last seen in the late 80s, downloading was a chore, installing took days, ports were a mess, they took away backwards compatibility, hundreds of titles got canceled... and you still couldn't chat with someone. People need to stop fooling themselves.

XspidervenomX

Next consoles will be overbearing, overpriced, over-restrictive junk. Fanboys never stop talking themselves into it.

GH05T-666

They won't be a big change from what we have today in the current Xbox 360 and PS3.

Disappointing, but maybe the next-gen consoles will get better after the PS4 and new Xbox.

Too bad they couldn't make these bigger and better than they will be :(

Build a PC if you want the best gaming machine ever!

jockie_chan

The more I'm reading about the next-gen consoles, the more I think they'll be crap.

Hopefully these rumours aren't true.

Slyster1181

If they are indeed using an APU, then that means they are shooting for a hybrid CrossFire setup, which means the graphics power will be better than just a 6670.

vishisluv7

At this point in time, how does anyone justify buying a weak, non-upgradable computer (a console) rather than just having a nice desktop computer that you keep up to date and use a controller with? On PC you have mods and better graphics. I don't understand why you'd have two computers for the same purpose.

So the next gen consoles will be outdated on the day they release, and both will likely have hardware issues for the first year.

orangepeel1972

Isn't it funny how, when the latest piece of next-gen console hardware news comes along, all the PC snobs come out of their holes to try and amuse us with their boring geek tech talk.

TheOnlyConan

Using a video card that's already outdated compared to current PC cards is stupid. I realize they do this to keep costs down, but then don't ask why PC gamers laugh at consoles. Add to that, the next-gen console won't come out for another 18 months (my guess)...

Frosty192

Make it have 2GB of VRAM and the console might be worth purchasing. Most new games are already pushing 1GB of VRAM; these consoles are outdated before launch.

krishnanmc3

The visual quality comparison might not be accurate. Mainly because: 

1) The architecture of both machines might be different. (I am talking about how the various components are utilised).

2) Either system could boost performance merely by using different versions of the same graphics card and APU (assuming it's the master unit at the heart of the machine), or use a different architecture to speed up the processing.

I wish I could find work at such places, as both a games enthusiast and a computing enthusiast.

theKSMM

I have a suspicion that Sony and Microsoft are trying to avoid a mistake they made with their really expensive console launches in the last generation. Sony priced themselves completely out of the market at $500 / $600, and Microsoft almost did at $300 / $400, and neither wants to repeat that.

 

This means that the components they use won't be cutting edge, even by today's standards. A high-end graphics card on a PC will run you about $400 to $1,000, more than the entire console should cost, so they'll probably look to deliver something you could get from today's $150-$200 graphics card... not bleeding edge, but respectable gaming. And since it will only need to go up to 1920 x 1080 rather than the insane resolutions that some PC gamers use (2560 x 1600 or somesuch), it should do the job nicely. But it will definitely push more polygons than an Xbox 360 or PS3.

rarson

GAMESPOT: GPU chips are NOT "cards." Someone needs to clue your writers in on this. A graphics card is a physical PCB with a GPU on it that plugs into an expansion slot.

Gooeykat

Wait, both are going AMD?  How did Nvidia get left out?  They can't be happy about this.

ssj2los

I want cross-platform multiplayer... at least PlayStation with PC, and Xbox with PC. It doesn't really have to be PlayStation with Xbox, as I know that would eliminate a form of competition in the market and create chaos.

DarthRevan

Isn't that an entry-level card? Maybe we can hook the PS4 and X720 to the iPad 4 and let it handle the graphics :-) MS and Sony can offer a shell for each console with a Blu-ray drive in it, and consumers can save $800 that way... So what if we see an Apple logo when it loads?

nyran125

My AMD 6870 from 2 years ago already owns that.

syler4815162342

Come on!! Nvidia really has better technology. All the game devs and digital artists are using Nvidia's graphics cards! Nvidia has amazing physics tech! Also, Unreal Engine 4 only works with the GTX 690 at the moment!! Only with CUDA tech can they bring all of those details, and now they are using ATI for the next-gen consoles!!! 0.O!

billlabowski

@constantterror Reported...for multiple reasons :D

Hvac0120

The next generation is going to be defined by the services and experience the system can provide more than by comparisons of its graphics and processing capabilities.

Hvac0120

The PS3 and the Xbox 360 also have variants of the same CPU. The differences are in the GPU and the motherboard architecture, with the PS3's Cell architecture being the biggest difference. What's been interesting is seeing developers continue to push the PS3 further while hinting that the 360 is pretty much maxing out its potential with current games. While that's all good for PS3 exclusives, there isn't much difference between the 360 and PS3 versions of cross-platform games.

Ladiesman17

If you're a casual gamer, buy a Wii / Wii U. If you're a hardcore gamer, buy a PS4 or next-gen Xbox. If you consider yourself a superior / advanced hardcore, buy a PC. Easy enough :cool:

nonfanboygamer1

Hopefully Sony will be smart enough to compete with the next Xbox's price instead of assuming that people will refinance their house just to have the latest Sony product.

billlabowski

@constantterror Also, side note. Do you talk like that when you play CoD? Just asking...

billlabowski

@constantterror Too much candy from the easter bunny today? Reported for lack of self control and too much sugar.

poopinpat

 @orangepeel1972 I did not find this comment funny or amusing.

 

Isn't it funny when people make shitty comments on really old articles.

 

Now I can get back in my cozy hole while you drag your knuckles back under the bridge from whence you came.

badwins

@theKSMM You forgot that the Xbox 360 only has 48 shader units (ALUs) and is a DirectX 9 (really 9.5) part. The difference that allowed such good graphics on these aging machines was the direct low-level access to the hardware, which let developers squeeze more out of the system. PC graphics cards have a lot of overhead and a ton of wasted cycles. The 7670 should be more than enough, and the special modifications that could be made for the console architectures, such as unique caches to reduce memory latency and other techniques, are going to keep the graphics comparable to the PC for at least 3 to 4 years, I believe.

00Joseph00

@rarson The unfortunate thing with language, and English in particular, is that people like hints instead of knowing things one hundred percent accurately.

rarson

 @Gooeykat

 

AMD is going to be in all the next-gen consoles, Sony, MS, and Nintendo. I blame Nvidia's management and their apparent inability to deliver decent products on time and at decent prices.

rarson

 @nyran125

 

The console chips will surely be custom variants optimized for console use and will surely be faster than the PC equivalent part. Plus, it should have less overhead being a game console, assuming Microsoft and Sony do a good enough job on the operating systems.

topeira

 @nyran125

I got the HD 6870 as well. Actually, I'm pretty happy that my card is on par with the consoles. Maybe (MAYBE) the PC versions of these games won't require me to upgrade my GPU, but that's only if they are optimized. Two GPUs ago I had an ATI 1900XT, which was on par with the 360's equivalent of an 1800 (I think), and that card barely made it.

rarson

 @syler4815162342

 

Wrong. AMD just released the GHz Edition 7970 which is neck-and-neck with the 680... they're about the same, they just trade blows depending on the specific game. And the 680 blows at GPU compute (the 580 is faster).

 

PhysX isn't supported by a lot of titles and never will be as long as it only runs decently on Nvidia hardware (which is by design, by the way, as Nvidia cripples it for other platforms to make it look like the Nvidia hardware is actually doing something special). Just like Glide. The problem for Nvidia is that there are plenty of other physics APIs out there that work on all hardware that are just as good or better than PhysX.

 

"also Unreal Engine 4 only works with GTX 690 at the moment!!"

 

Completely false. Lots of fanboy misinformation from you.

topeira

@percuvius2 Not accurate. They can't push the tech to the max, otherwise the consoles will cost too much. And with THESE specs, the games will be optimized to use them properly. Games can look good already. The 7600 series is a lot better than the 360's ATI 1800XT (or whatever it is), so imagine games looking twice as good on the consoles and even better on well-endowed PCs.

Nishank93

@percuvius2 If they're going to come out next year, then this hardware sounds reasonable. Games for consoles are optimised for that specific hardware, so you get 'more performance'. Regardless, a console will never trump a gaming PC in terms of raw processing power.

vault-boy

@Hvac0120 Oh, if you know that, I can only assume you're part of the Microsoft/Sony planning committee. I mean, if you're not, then I guess you would have to be projecting so hard we could point your head at a wall and show off PowerPoints.

rarson

 @Hvac0120

 

"The PS3 and the Xbox 360 also have variants of the same CPU. The differences are in the GPU and the motherboard architecture. The PS3's Cell Architecture being the biggest difference."

 

Does not compute. Cell IS the CPU of the PS3. The design of the PPE core inside the Cell is similar to the cores inside the Xbox 360's CPU, but the 360 doesn't have the 7 extra SPE units that Cell has (which handle the bulk of the processing). Cell probably has more processing power than Xenon, but is much harder to program for.

 

Also, the GPU in the Xbox is more powerful than the GPU in the PS3.

digi-demon

@rarson It's probably down to price. But tbh, I've got both the X360 and PS3, and graphics generally run smoother on the X360, despite Nvidia's claim of the PS3 GPU's graphical superiority over the older AMD/ATI X360 GPU.

syler4815162342

@rarson I speak as a digital artist who uses software to make games and VFX and sees the differences in action. Just do some research and you will see; I speak the truth. I myself have an ATI 4670 and an AMD Phenom II X4 940, and it sucks at rendering and calculating simulations in real time... AMD will work for running video games, but its technology is way behind Nvidia's in terms of real-time work and rendering... http://blog.renderstream.com/ And as I said in other posts (GS deleted my links, et cetera), only with Nvidia physics can you calculate great real-time simulations... http://www.3daliens.com/tpv2/

TheOnlyConan

 @topeira  @percuvius2

They only "look good already" if you don't have a decent PC to compare them to. Take Skyrim: compare the console version to the modded PC version. No contest. Yes, consoles will always be behind PCs, but using cards that are already outdated (in a system not even coming out for another 18 months, my guess) is just horrible.

vault-boy

@Nishank93 When the 360 came out, it had a tri-core 3.2GHz processor. That beat the everliving fuck out of almost every computer at the time. If this is true, then the next gen is going to be pretty weak.

rarson

 @digi-demon

 

Yeah, the idea that the PS3's GPU is more powerful than the 360's is laughable. The 360's GPU is what gives it the advantage over the PS3 (well, that, and the fact that the Xenon CPU is much easier to deal with than Cell).

 

In a way, you're right, it is due to price; Nvidia's relationship with Microsoft eroded due to pricing disputes over the original Xbox hardware (which is the entire reason they went with ATI for the 360). I'm not sure whether Nvidia has done the same to Sony, although Sony surely can't be happy with the results of the PS3 hardware (which is really their own fault). But I strongly feel that the huge delay in releasing Kepler, along with non-existent availability (both of which seem to be becoming recurring problems for new Nvidia hardware) is starting to make hardware manufacturers nervous. Nvidia has been screwing up in the market that they made their name in, which is not a good sign.

syler4815162342

@rarson Haha, what a guy! My writing is clear; don't make excuses for that!!

 

That video you just saw is not a simple thing to skip. Go watch it; it's the latest Unreal Engine 4 technology, which ran only on the Nvidia GTX 680 / GTX 690 at GDC 2012.

 

And you said they have the same technology and that all of them (any hardware, AMD or Nvidia) can run physics simulations just fine, that it's about the software,

 

And I know about the marketing competition between them, but that's not the topic I was pointing at!

 

Right now, only with CUDA technology, which has hundreds of cores, can they simulate thousands of particles, fur... in real time, whereas AMD GPUs have only 4 cores!

 

Surely, if AMD could run that, there would be articles or videos about it! Or Epic would have used AMD for the show instead!

 

This is the advantage of Nvidia GPUs, so you clearly cannot deny that.

 

And about Havok physics: game engines like UDK (the one I work with and know well), CryEngine 3, and Unity3D have their own physics simulation, and there is no need for them to use Havok in most productions! And there is nothing special about the Havok engine that requires special hardware to run! It's the game engines that aren't good at physics that use the Havok physics engine!

 

You actually are the one who didn't get what I said! My first comment was about Nvidia technology that can run games much better, which is obvious to all, but since it's more expensive they don't use it!

 

Instead, you wrote me the history of AMD and Nvidia! Thanks for your time, though.

 

Who knows, maybe one day AMD comes up with better technology to help the games industry. I'm not a fan of Nvidia or anything;

I have an ATI 4670 in my PC! I speak as a digital artist who has seen the differences in production and how helpful Nvidia is for achieving the best results. I just wished that for next gen they would use Nvidia, for a better experience.

 

rarson

 @syler4815162342

 

First of all, I'm having a hard time parsing your comment because it's so poorly worded. I'm assuming English isn't your first language? Anyway, I've seen all those demos before. Showing me a YouTube video does nothing to support your argument. Let me break it down for you:

 

PhysX was originally designed to run on Ageia's hardware, the PPU, before Nvidia bought the company. The PPU consisted of "a general purpose RISC core controlling an array of custom SIMD floating point VLIW processors." If this sounds familiar, it should, because that's basically what a GPU is. When Nvidia bought Ageia, all they did was port the software over to Nvidia hardware. They also eventually dropped support for PPU acceleration because Nvidia didn't feel like supporting old hardware that they didn't design or sell.

 

Secondly, up until version 3.0, which wasn't released until last year, PhysX lacked multiprocessor support and used largely outdated x87 code. That's one of the reasons why PhysX used to run so much slower on CPUs. Nvidia arbitrarily locked AMD GPUs out of PhysX as a marketing ploy (if you want PhysX, you have to buy Nvidia hardware) and even went as far as to use driver lockouts to disable PhysX ON NVIDIA HARDWARE if any ATI/AMD graphics cards were detected in the system (in other words, if you tried to use an AMD graphics card and add an Nvidia GPU for PhysX, it would disable PhysX on the Nvidia card).

 

The PhysX SDK is available across a number of platforms, including Windows, OSX, and all three current consoles (two of those consoles don't have any Nvidia hardware in them). So you don't need Nvidia hardware to run it. The caveat is that Nvidia doesn't bother to provide optimization for any hardware other than its own, so it's not guaranteed to run as smoothly as it would on Nvidia hardware... but as I pointed out before, that's NOT because the hardware is more suited to calculating physics, but because of the inefficiency of the API.

 

If anything, calculating physics on the CPU is probably a better use of resources, especially now with the prevalence of multi-core processors that are rarely taxed heavily in most modern games, but Nvidia will never tell you that because they use PhysX to try to sell graphics cards.

 

"Surly you are smart enough that you wouldn't compare Bullet with Nvidia physics for games!!"

 

They are both physics APIs used in games. I don't understand why you're not getting this. THERE IS NO HARDWARE COMPONENT TO PHYSX, IT IS A SOFTWARE API. There is nothing magical about the SIMDs in Nvidia's GPUs that make them able to do anything that the SIMDs in AMD's GPUs cannot do.

 

"Second Havok is a different story it's an engine like a game engine helping for simulation I am talking about Nvidia Graphic card technology!!!!!"

 

Havok is a physics engine. Actually, Havok offers a variety of tools for game developers, including Physics, Cloth, Destruction, the Vision Engine, and more. Go look at their web site, you're clearly confused by the products that the company offers.

 

So yeah, you obviously don't know what you're talking about.

syler4815162342

@rarson Let us not insult each other when we can't find a good answer to reply with! I am aware of them. First, surely you are smart enough that you wouldn't compare Bullet with Nvidia's physics for games!!

http://www.youtube.com/watch?v=rFqNh-hxme8&feature=plcp 

Second, Havok is a different story; it's an engine, like a game engine, that helps with simulation. I am talking about Nvidia graphics card technology!!!!!

So yeah, I'm clear on what I am talking about!

rarson

 @syler4815162342

 

"i my self have ATI 4670 and AMD phenom 2 x4 940 and it sucks at rendering"

 

That's because you're using an entry-level card that isn't even well-suited to gaming. Only a moron would attempt GPGPU on a bargain-bin gaming card.

 

"only with Nvidia Physics you can Calculate great real time simulations"

 

Wrong. There are several physics APIs out there that do real-time calculations on a myriad of hardware, including CPUs, such as Havok and Bullet. Bullet has even been used in several major films. You clearly don't know what you're talking about.