id Software developer John Carmack says Sony made "wise engineering choices" for just-announced next-gen platform.
Industry veteran and id Software designer John Carmack has only kind things to say about the just-announced PlayStation 4. Writing on Twitter, Carmack said he is pleased with the specifications for Sony's next-generation platform.
"I can’t speak freely about PS4, but now that some specs have been made public, I can say that Sony made wise engineering choices," he said.
Carmack commented last year on next-generation platforms, saying he was "not all that excited" about these consoles. He noted Microsoft and Sony are too focused on boosting technical power when they should instead be working on building new experiences for future consoles.
"Sony and Microsoft are going to fight over gigaflops and teraflops and GPUs and all this," he said at the time. "In the end, it won't make that much difference."
Sony officially announced the PS4 this week at a New York City event. The system will be available during holiday 2013 and features a custom chip that contains eight x86-64 cores and an enhanced PC GPU graphics processor.
Anyone remember that xbox fanboy who kept taking Carmack's opinion that "Xbox is the greatest system ever" as a fact?
Wish he could've seen this...
Wait, a tweet is now news? How about an interview, fellas. What is this? This guy could have been taking a poop while he tweeted that out. Let's step up the journalism a bit, please.
@Incubus420 Check out ESPN.com. Half the news stories are about tweets. Same with all the major news web sites. This is the new reality. This morning's local news here in Los Angeles opened with the feature story that Kobe Bryant tweeted "Amnesty that". Seriously.
@jakesnakeel Honestly I just noticed your email after I was reading an article about how Kobe Bryant took to Twitter to bash Mark Cuban of the Dallas Mavericks. I see your point.
I'm happy with it. I know the RAM is sufficient for the multitasking that's gonna be featured with the PS4. I just hope that AMD's onboard GPU setup will be sufficient, even though they've said it works as well as, if not better than [in some minor aspects], a dedicated graphics card.
I think you're confused. The GPU is a separate part. Sony is using a CPU, not an APU. It's like any other GPU that you'd find on a discrete graphics card, only tweaked a little to Sony's needs.
Wait wait wait forget about the CPU. The GPU is the graphics processing chip. It's either located on a dedicated video card (like any gaming PC), or embedded on the motherboard.
Sony, I believe, are going with AMD's new setup of having the GPU embedded on the motherboard. That's the point I was making - unless of course I'm wrong, and/or I've used the wrong terminology? But I'm happy to be corrected.
AMD's blog seems to confirm that the PS4 is using a semi-custom APU, but it's not really clear whether they mean an APU plus a discrete GPU, nor whether the near-2 TFLOPS figure is total system power. I guess we'll just have to wait for final specs.
I see. That's pretty interesting.
I can't wait to see the full spec sheet in the near future. Hope I find you in a similar setting, so I can throw you some more brainteasers.
Well, packaging, price, and in AMD's case, a heterogeneous architecture. By 2014 or so, we should see them releasing chips that can do CPU and GPU instructions automatically, fully heterogeneous. They won't be separate components anymore. This will really help push GPGPU compute, which in turn means that more software will take advantage of the processing power provided. I'm sure you'll still have dedicated GPU cards, but it won't be necessary to buy one to get good GPU performance (as it is, the current AMD APUs have pretty decent GPU performance anyway).
I said that Sony's not using an APU, but I'm basing that off the reports that they are using a "CPU." If you don't know, "Jaguar" is the core architecture that is replacing the "Bobcat" cores in Brazos 2.0, the low-power chips AMD is currently using in smaller devices, laptops, and a few desktops. Brazos is an APU design, and Jaguar is meant to be used in APUs as well (Kabini and Temash).
My point in mentioning it is that Jaguar is not slated to be put into anything that ISN'T an APU, and the initial reports when Sony decided to go with an x86 architecture is that they were planning on using an APU along with a discrete GPU chip. I would imagine that aside from a few console-specific optimizations, whatever AMD is providing Sony is going to be largely based off the products that were already in the pipeline. Hence, it would not surprise me if the Jaguar-based "CPU" in the PS4 is actually an APU. I can think of a couple reasons why Sony would report it as a CPU, primarily because that's the term that most people are familiar with. Sony was also rumored to be working on some kind of APU-GPU synchronization technology with AMD similar to, but different from, Hybrid Crossfire (which would marry the GPU power of the APU with the power of the GPU, for even more performance).
In other words, if those initial rumors were correct, and Sony just hasn't elaborated on the full specs yet, it could have even more GPU power than we're expecting, and a pretty interesting APU/GPU architecture that could show a lot of potential. I have no doubt that Jaguar will be robust, but it's interesting to see how Sony has stepped away from making the CPU the central, powerful center of the console and is going with something far more GPU-based. That, along with the choice to use 8GB of fast GDDR5 instead of regular DRAM, is a really good choice that I think will offer an excellent tradeoff between performance and cost. The move to x86 also means it's going to be MUCH more developer-friendly, which is great.
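The GDDR5-versus-regular-DRAM point above comes down to raw bandwidth. A rough sketch of the arithmetic, assuming a 256-bit bus at 5.5 Gbps for the GDDR5 (a common config at the time, not a confirmed PS4 spec) versus dual-channel DDR3-1600:

```python
# Peak memory bandwidth = (bus width in bytes) * effective transfer rate.
# Bus widths and speeds below are illustrative assumptions, not official specs.
def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_gbps):
    """Bytes per transfer (bus width / 8) times effective transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_gbps

gddr5 = peak_bandwidth_gb_s(256, 5.5)  # 256-bit GDDR5 @ 5.5 Gbps -> 176.0 GB/s
ddr3 = peak_bandwidth_gb_s(128, 1.6)   # dual-channel DDR3-1600   -> 25.6 GB/s
```

Under those assumptions the GDDR5 setup has roughly 7x the bandwidth of a typical PC's system RAM, which is why it matters so much for feeding the GPU.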
Gotcha! Thanks for taking the time to correct me.
If the lack of a dedicated/unique GPU decreases shader counts, what real benefit does that faster communication provide with an APU setup?
You're still confused. An APU is a CPU with the GPU on the same die. It's not on the motherboard anymore, it's moved directly onto the same die as the CPU.
There's no physical difference between a GPU that is located on a graphics card versus one that is located on a motherboard (other than PC motherboards with integrated graphics tend to use system RAM instead of graphics RAM for the GPU). All console GPUs are located on the motherboard (that is, they were until the Xbox 360 die shrink that moved the GPU onto the CPU); the reason that it's on the motherboard is because the GPU is fixed hardware and not designed to be replaced. The only thing the card does is make upgrades easier by making the construction modular.
There is, however, a big difference between putting a GPU on die, versus on a card. Having the two on the same die enables them to communicate faster, but it also creates packaging limitations. You're physically limited by size and thermally by TDP. For this reason, shader counts tend to be lower on APUs than, say, a mid-range graphics card. But every console GPU ever has been "on-board" in the same way, so I'm not sure what your concern is. It is no more or less on-board than any other console GPU.
The game looked good and it played well, but its story was the only part that pulled it down. It even had good voice acting. Definitely not mediocre.
It's good that he's happy. Sony would have had to go back to the drawing board if he wasn't! That's sarcasm for the Sheldons out there :P
Is he even relevant anymore? The last thing id and this guy made that I cared for was Doom 3. Besides, first he said the 360/PS3 had great specs and would be great, then he said he regretted making games on 360/PS3 and porting them to PC, 'cause they were terribly limiting.
He says whatever benefits himself and id at the moment, just to change their tune down the line.
@tsunami2311 No, he simply learnt and realized that porting from console to PC was not a good idea after all, when proven by actually doing it. It's called learning and adapting, and everyone has the right to do that. Only idiots are unable to change their minds when facts show that they were wrong.
Do I have a problem if I don't give a crap about this guy? I respect his opinion as a developer and I recognise that he has done great things in the past, but he has failed to impress not just me but pretty much everyone in the past 5-10 years. However, if he says Sony has made good choices, then I guess we can relax a bit.
John Carmack is happy with the PlayStation 4 specs. Great! Now start making games for PS4. Snap to it!
Here to laugh at all the tough kids who're like:
"Carmack hasn't made me a video game I liked recently. He doesn't know shit anyways... just another idiot."
He's still among the top 10 most experienced devs in the business, whether you like it or not. And he knows more about hardware than most ever will.
The PS4 has almost the exact same processing power as the PS3, it's the RAM that makes the difference, it's the RAM he told them to put in, and it's the RAM they improved.
Now the world of gaming will be amazing for everyone because Carmack was listened to, while you bitches were busy stamping your feet and screaming about Rage.
The man's a wizard. If you can't recognise his genius then at least shut your mouth and stop embarrassing yourself.
@naryanrobinson The CPU is very different in terms of architecture, which makes it very easy to program for. The GPU is way better, and yes, there is more RAM. Not to mention the GPU is integrated, which increases bus speeds. It's not JUST the RAM they improved on.
@FreedomPrime I didn't say it *was* a PS3 with more RAM, I just said there's no difference in terms of computing power between a PS3 and PS4.
I'm aware of the implications of an x86 CPU.
@naryanrobinson The GPU definitely has more power than the PS3's. However, as you say, the CPU might not differ much.
@Eldeorn The whole system has just under two teraflops of computing power, the same as a PS3.
The bandwidth and amount of RAM is the real difference.
@rarson If it is indeed an APU, then perhaps they incorporated an HD 7870M-7970M, underclocked with low TDP?
@Eldeorn From what I've read, the RSX is based on the G70/G71 chip, with performance equivalent to a 7800 GTX, which has around 400 GFLOPS SP. The Cell is rated around 200 GFLOPS. That 1.8 TFLOPS is PR BS for sure. I agree, the PS3's real compute performance should be around 400-700 GFLOPS. My HD 4870 has 1.2 TFLOPS SP and was released in 2008...
AMD's blog says it's an APU, but it's pretty ambiguous about what they mean. It does seem to indicate that "near 2 TFLOPS" is the total system power. I guess we'll just have to wait for the full specs.
I have my doubts. For starters, we don't really know much about the CPU. In fact, Jaguar is slated for APU use only, so it's entirely possible that Sony is using an APU and simply calling it a CPU to either hide an ace up their sleeve or for simplicity's sake. It was rumored back when the x86 switch leaked that Sony was using an APU and something akin to Hybrid Crossfire (but different). Assuming it does use an APU (an 8-core, no less), then it's certainly possible the machine could have over 2 TFLOPS of processing power.
I just find it a little unlikely that they'd use Jaguar cores if they weren't going to go with an APU. Not that Sony can't afford it, but I doubt AMD would go back and redesign something as a CPU that was never meant to be in the first place, and the benefits of having a heterogeneous architecture are obvious. I hope they are using an APU, because that would make the machine even more interesting from a tech standpoint.
@putonekas17 Care to back that up with any *actual* facts?
@putonekas17 @naryanrobinson @Eldeorn If you mean based on its target of 1080p/60 Hz, then it does require roughly 4x the power (pixel fillrate and pixel shader operations)?
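The "roughly 4x" claim above is easy to sanity-check as pixels per second. Assuming a 720p/30 fps baseline (an assumption on my part; the comment doesn't state one) against a 1080p/60 fps target:

```python
# Pixels pushed per second at each rendering target.
px_1080p60 = 1920 * 1080 * 60  # 124,416,000 pixels/s
px_720p30 = 1280 * 720 * 30    #  27,648,000 pixels/s

# Ratio of pixel work between the two targets.
ratio = px_1080p60 / px_720p30  # 4.5
```

So jumping from 720p/30 to 1080p/60 means 4.5x the pixel throughput, which is in the same ballpark as the 4x figure; a 720p/60 baseline would give 2.25x instead.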
I've been digging through more material on the net, and while it's quite easy to find specs on the PS3 (Wikipedia), there is very little on the actual PS4 chip.
According to Swedish Wikipedia (most likely based on rumors from devkits), the PS4 GPU has 1.84 TFLOPS.
According to Sony, the PS3 RSX has 1.8 TFLOPS, and a total of 2.1 TFLOPS with the Cell included. People question this, however, since in real-world applications it would measure at around 400-1000 GFLOPS on the GPU.
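For what it's worth, that 1.84 TFLOPS figure is consistent with the standard peak-FLOPS formula for GPUs. A quick sketch, assuming the rumored devkit config of 1152 shader ALUs at 800 MHz (rumor, not a confirmed spec):

```python
# Theoretical single-precision peak = shader ALUs * clock * FLOPs per cycle.
# Modern shaders do a fused multiply-add per cycle, counted as 2 FLOPs.
def peak_tflops(shader_count, clock_ghz, flops_per_cycle=2):
    """Peak throughput in TFLOPS (GFLOPS / 1000)."""
    return shader_count * clock_ghz * flops_per_cycle / 1000.0

# Rumored PS4 GPU config: 1152 shaders @ 0.8 GHz -> ~1.84 TFLOPS.
ps4_gpu = peak_tflops(1152, 0.8)
```

That lines up with the Swedish Wikipedia number, which suggests the 1.84 TFLOPS is the GPU's theoretical peak, not a measured real-world figure, exactly the distinction being made about the PS3's PR numbers.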
@naryanrobinson @Eldeorn Yeah, I stand corrected after looking over some spec papers out there. The GPU mainly differs in the updated feature set it has, which could be considered "more powerful" even if it's not actually faster - some operations it does are either simpler or more efficient. The ability to flush more data from faster RAM is of course what really makes it more powerful.
Raiden is a cool character, but the game feels really shallow and simplistic - at least that's the impression I got from playing the demo. Sure, there is some strategy involved in how you approach the encounters, but still. It's not an MGS game (which I didn't expect either). I won't buy this game based on the demo. 4/10 rating from me.
@Eldeorn I didn't play the demo that so many fanboys hated. But the game is hot. A lot more depth than a demo can reveal.
@Eldeorn sure you got the right article there?
Does that mean it will support Oculus Rift? That's all I really give a shit about at this point for next-gen gaming.
The love for Carmack has gone. That's right, Carmack, you is outta my FB Inspiration list. How could you be such a kissass? -_-
@AZE160 He's just not a rabid fanboy that can't embrace new things and see positive things in something that is new and has obviously changed for the better.
Just because he didn't like the PS3 doesn't mean he has to hate the PS4. That'd be plain stupid.
What have you made lately? .....hypocrite!
For one thing, he has shown just how far you can push the hardware with streaming textures/graphics from a disk while still maintaining great framerate, which is very valuable knowledge for other companies in the business.
He also has proven, at a big cost, that the road to get there is way too complex and costly, and simply not worth it compared to the more generic solutions out there. He didn't strike gold, but it's still valuable, since no one else has ever gone that far to try it out - which is most likely exactly why he did it.
ALL the engines he produced up until Rage were stunning and quite frankly genius solutions - like the original Quake engine, and the Doom 3 engine with its dynamic lighting and shadow techniques. I'll give him the benefit of the doubt and see what he comes up with for his next game, assuming it's not just another id Tech 5 Rage 2.
His biggest fault is probably that he CAN make generic engines like everyone else does, but gets bored with stuff that isn't new or already tested and common knowledge.
@jeffrobin Have to agree - the actual shooter part of the gameplay was really fun. AI was unpredictable, weapons had a nice feel to them and they all were fun and useful.
One thing that sucked, though, was the way-too-short and non-epic ending. You collected all these toys and explored every little thing, and then it suddenly was over, just like that. The last third of the game should have been harder, and three times as long, to make it feel just about right. IMO.
Another thing was the user interface mouse cursor. When you lowered the sensitivity for mouse look, it affected the mouse cursor too for some stupid reason, and made it nearly unusable. That, I hated most of all.
@Thanatos2k @Eldeorn @edant79 It was a good game with great graphics for a console and awesome level design. Check out each environment in that game - so different, very impressive. Its only problems were its length and the fetch quests. If id had made it under 10 hours without so much of the "get the fandoogle for the conflibralator", it would have been much better.
Sucked though? Why?