Dante_1990 / Member

Forum Posts: 1098 | Following: 65 | Followers: 53

Dante_1990 Blog

Hideo Kojima hosting podcasts

The creator of Metal Gear Solid is now producing his own online talk show, dropping MGS4 details.

TOKYO--Months after he started keeping a blog, Metal Gear Solid creator Hideo Kojima has begun his own podcast. Titled "Hidechan! Radio," it features various guests, most of whom are Metal Gear Solid series developers. The podcast now appears to be the primary source of information about the ongoing development of Metal Gear Solid 4, since Kojima has neglected to update his blog in recent weeks. Gamers with an understanding of Japanese can download Kojima's broadcasts (in MP3 format) from his Japanese blog.

In his latest show, Kojima started a new segment called "Hidechan's Café." Its first guest is Akio Otsuka, the seasoned actor who voices Snake in the Japanese MGS games. Otsuka's conversation with Kojima will be available in installments over the next few weeks and has already revealed details about the development of MGS4. In his first talk with Otsuka, Kojima let slip that dialogue for MGS4 will be recorded throughout 2006, and the recording process will take as long as it did with previous entries in the series.

Grand Theft Auto 4: Europe?

UK games mag hints at next-gen Grand Theft Auto - is it going to be set a bit closer to home this time?
While the Italian-American hijinks present in the Grand Theft Auto series have an undeniable appeal, the patriots in us tingle at the thought of some more UK-centric action.

According to PSM, a Rockstar rep has revealed that the next generation GTA will be set in Europe, with a total of 6 cities featured in the game. The mag also states that the game will be officially unveiled at this year's E3 show and will be a PlayStation 3 exclusive.

Interestingly, PSM also suggests that GTA4 will be released as early as the end of this year, alongside the PS3 itself. The mag writes that a "proper" GTA is released every two years, and speculates that 2006 is ripe for the next-gen sequel.

What with the GTA developer being located here in good old Blighty, it's only a matter of time before Europe gets the full-on GTA treatment. As for whether there's any truth to the rumour, though, we'll only have to wait a few months to find out.

Alan Esad

Dead Or Alive Coming to PS3?

According to an update at TVG, the Tecmo statement that sparked speculation that the Dead or Alive series was about to switch over to the PlayStation 3 appears to have been the victim of a typo; the statement was apparently supposed to say that the next two DoA titles will appear on the Xbox 360, not the PlayStation 3.

Well it was nice while it lasted :)

...........................

A recent article in a Japanese newspaper suggested that Team Ninja is currently working on two titles for the PS3. Whether one of these titles will be Dead or Alive 4 is still unknown, so for now we can only hope.

Dead or Alive made its debut in the arcades 10 years ago, followed by Saturn and PlayStation ports a few years later. Shortly after the most excellent Dreamcast port of Dead or Alive 2, the franchise appeared on the PS2 in 2000, so the series is no stranger to the PlayStation platform. However, after the highly successful Dead or Alive 3, released on the Xbox's launch day in fall 2001, Tecmo and Microsoft seemed to form some sort of love affair. How deep this relationship goes is a bit unclear, but unlike Bungie (Halo), which is now owned by Microsoft, Tecmo still owns the rights to the Dead or Alive series. So whether we will see Dead or Alive on the PS3 will be solely up to Tecmo and Team Ninja. Disregarding any future bribes from Microsoft, that is.

Sony Confirms DevStation PS3 Developer EU Conference

According to Gamasutra.com, DevStation, Sony Europe's yearly development conference for Sony platform developers, will this year focus exclusively on game development for the PS3.

The conference, which will take place in London's Brick Lane from March 1st to March 2nd, has posted an extensive agenda online, which includes significant discussion on the Cell chip, graphics, art, performance, audio, networking, and a new tool named PSSG, which is described as "a PS3 optimised cross-platform graphics engine and tool-chain."

There will also be a third-party demonstration and a lecture from Unreal Engine creator Epic, as well as middleware input from companies such as Havok and Ageia, who have already officially signed up to provide their physics technology to PlayStation 3 developers.

More information on the conference is available on its official website, which notes: "DevStation 2006 is a PlayStation 3 Development Conference, to receive a delegate space your company would have to have an executed TMLA agreement with Sony Computer Entertainment Europe."

PlayStation 3 price - $500?


While there was little doubt the Xbox 360 was going to be a hit in the just-completed holiday season, no one was really sure how consumers would react to the $399 price tag.

Despite the $100 bump over the launch price of the original Xbox, few seemed to mind. Now, with Sony's PlayStation 3 looming, it appears another price threshold may be crossed before the year is out.

Sony hasn't commented on specific pricing figures, though Ken Kutaragi, president of Sony Computer Entertainment, reportedly told attendees of a 2005 corporate meeting "it'll be expensive." Analysts and many video game developers, though, suspect the system may debut with a price tag reaching nearly $500.

"[Sony] could now consider launching its PlayStation 3 at a price range of $399 to $499, with the $499 price point more likely," said American Technology Research's P.J. McNealy in a note to clients Monday.

Sony, as you might guess, didn't have much to say about McNealy's theory.

"We haven't made an official announcement about pricing yet," said Ryan Bowling, PR manager for Sony. "At this point, that's all speculation."

The strongest argument behind the $499 price point is the PS3's inclusion of a Blu-ray drive. This bleeding-edge technology will give Sony significant bragging rights, but it comes at a cost. Pioneer last week at the Consumer Electronics Show unveiled a standalone Blu-ray player for $1,800.

Obviously, Pioneer's earning some profit there – and Sony will almost certainly subsidize the cost of the drives – but you're still looking at an expensive bit of hardware. The PS3 will also feature other pricey items, such as a hard drive, the Cell processor and a new graphics chip from nVidia.

Developers, for the most part, say they, too, are expecting the PS3 to be more expensive than the Xbox 360's highest price package. Sony, they said, has been sending mixed messages to the gaming world, but several developers I spoke with (under the condition of anonymity) said their studios were expecting the system to launch at $499.

There wasn't universal consensus, though. Some predicted the price would be closer to $450, others said they wouldn't be surprised if it was as high as $600. $700 was mentioned by a couple of developers, though even they said the number seemed unreasonable. And one game maker felt Sony would try to stay in line with Microsoft, offering the PS3 for just $399.

None of the developers, by the way, echoed my theory that Sony might be pulling a head-fake on Microsoft with the high price warnings, though a couple did bring up the months of speculation leading up to the PSP's launch. Analysts, journalists and even publishers were wildly grasping for a solid clue about the launch price of the handheld device. (Atari's CEO even publicly proclaimed he expected the PSP to sell for $500.)

Sony, while this went on, smiled enigmatically and did nothing to dissuade anyone that the device would be $300 or more. It launched at $249, still incredibly expensive by handheld standards, but lower than some consumers were expecting.

We're seeing much the same thing with the PS3. After an onslaught of information last May, the company hasn't released anything of substance. Even at CES, the device was essentially a no-show. (A hardware design was there to be gawked at, along with a video loop of potential gameplay footage, but no new information was announced.)

There's one other possibility about the PS3 that few people have discussed: Dual-pricing strategies. It's frustrating from a consumer standpoint, but Microsoft proved it can work – at least in the U.S. Whether Sony's willing to risk fragmenting the market by offering both "bare bones" and "bells and whistles" versions of the PS3 is another matter.

For one thing, it would look as if Sony were following Microsoft's lead – an image the market leader does not want to project. Offering a PS3 with reduced features would also chip away at the company's stance that the system is much more than a video game machine.

Whatever Sony decides, we should start to get some sort of clarification in the next few months – almost certainly by E3 in May. One thing's for sure, though: For saying a whole lot of nothing, Sony has somehow managed to keep everybody talking about its product.

Sony PS3 Video Report from CES 2006 - Sony Displayed the PS3 Controller

Reporter: Alan Esad
We have a video report from Nikki Cash at the Sony booth at CES 2006, where she reports on the PS3.

There is no new information, of course, but the video shows that the Sony PS3 controller was on display. Gizmodo reported that it was not there when they visited the booth and theorized that Sony might be redesigning the controller.
It could be that Sony, after seeing how quickly that news spread, decided to display the controller after all. The PS3 controller isn't getting much love yet because of its design, but a sign in the glass cube noted that these are conceptual designs, so things can still change.

Will PS3 be Ready for E3?


Though most of us had hoped for some big PS3 announcements at the recent CES 06 show in Las Vegas, deep down I guess we knew it wasn't very likely. CES 06 is already just a faint memory, and we are now looking ahead to E3 2006, where we will be playing MGS4 at the hundreds of PS3 booths available. Or maybe not?

If Sony is serious about its plans to launch the PlayStation 3 in spring/summer 2006, this year's E3 is no doubt going to play a huge role in its launch strategy. I think both gamers and most members of the gaming press are expecting Sony to have the final build of both the machine and the controller ready for E3. Another round of tech demos is not going to cut it; this time only real playable demos of the launch titles running on actual PS3 hardware will please the crowd. Of course Sony knows this, and E3 2006 has probably been marked off on its calendar ever since work on the PS3 began.

The only problem is that people working on PS3s are reporting that the console is far from done. Apparently Sony is still working out bugs in the hardware, which leaves us guessing whether a spring launch in Japan is realistically possible at all or just wishful thinking from the gaming community. So even if they won't have PS3 booths at E3 06, they will have tons of playable demos running off dev kits, right? If not tons of demos, I think we will at least see playable versions of MGS4, WarHawk and possibly even Killzone. Though I suspect that Sony is not planning to show any real footage of Killzone until it at least resembles the stunning trailer shown last year. The current status of Killzone is anyone's guess, since Guerrilla hasn't talked to the press about the game since May 2005, when it first showed the PS3 Killzone trailer. We are of course keeping our fingers crossed, hoping that both Sony and Guerrilla will deliver the goods. Feel free to share your own E3 2006 predictions in our comment section.


Microsoft Abandons 90-Day Sales Forecast


January 9, 2006 - According to a report published in this morning's Financial Times newspaper, Microsoft has abandoned its highly ambitious sales target of 3 million Xbox 360s delivered into the hands of consumers by the end of the console's initial 90-day introduction period. Instead, the company will focus on the goal announced at last week's Consumer Electronics Show of 4 million to 5.5 million units sold by the close of the company's fiscal year at the end of June.

According to Peter Moore, chief of the company's Xbox 360 division, the revision of sales estimates reflected the fact that "nothing's perfect - [the Xbox 360 is] a complex piece of hardware that includes 1,700 different parts. Every now and again the line will slow down because something's happened and there'll be a component that didn't make it that morning."

Moore added that by shifting the company's projections to the summer, Microsoft will be better able to fulfill both consumer and analyst expectations. Moore believes that with the addition of another hardware manufacturer (Celestica), the company will be able to meet its revised sales projections.


Cell's nine processors make it a supercomputer on a chip


We're flying at about Mach 1.5 around Mount Saint Helens, in Washington state. IBM Corp. senior programmer Barry L. Minor is at the controls, rocketing us over the crater and then down to the lake at its base to skim over the tree trunks that have been floating there since the volcano exploded over 25 years ago. The flight is exhilarating, even though it's just a simulation projected on a widescreen monitor in a cluttered testing lab.

Then, at the flick of a switch, Minor turns the simulation over from his new Cell processor to a dual-processor Apple Power Mac G5, and the scenery freezes. The G5 almost audibly groans under the burden, though it's no slouch. In fact, it's currently the top of the line for PCs. But Cell is something different entirely. It's a bet on what consumers will do with data and how best to suit microprocessors to the task—and it's really, really fast. Cell, which is shorthand for Cell Broadband Engine Architecture, is a US $400 million joint effort of IBM, Sony, and Toshiba. It was originally conceived as the microprocessor to power Sony's third-generation game console, PlayStation 3, to be released this spring, but it is expected to find a home in lots of other broadband-connected consumer items and in servers too.

Executives at Sony Corp., in Tokyo, wanted more than just an incremental improvement over PlayStation 2's processor, the Emotion Engine. What they got was a 36-fold acceleration, to a whopping 192 billion floating-point operations per second (192 gigaflops). Because Cell is a combination of general-purpose and multimedia processors, it defies an exact comparison with other upcoming chips, but it's thought to be more powerful than the chips driving competing game systems.

Cell can calculate at such blazing speed, in part, because it's made up of nine processors on a single chip of silicon, optimized for the kind of real-time calculations needed in today's broadband, media-rich environment. A specially designed 300-gigabit-per-second bus knits the processors into a single machine, and interface technology from Rambus Inc., Los Altos, Calif., gives it fast access to memory and other off-chip systems.

So far, microprocessor watchers have been impressed with what they've seen of Cell. "To bring huge parallel processing onto a single chip in a clean and efficient way is a real accomplishment," says Ruby B. Lee, a professor of electrical engineering at Princeton University and an IEEE Fellow.

A graphics-heavy item such as PlayStation 3 isn't just a showcase for an unusual chip. For IBM it's a philosophical statement. "Gaming is the next interface driving computing," says James A. Kahle, Cell's chief architect with the IBM Technology Group, in Austin, Texas [see photo, "Multicellular"]. Just as moving from punch cards to electronic displays changed what people expected of computers, the highly collaborative, real-time realism of today's games will set the standard for what people want from computers in the future.

But even now, the sheer desire for power in the gaming market guarantees that Cell will be made in volumes that more than make up for the loss last year of IBM's highest-profile customer, Apple Computer Inc. Market research firm iSuppli Corp., in El Segundo, Calif., predicts that 37 million game consoles will be sold this year alone worldwide. By 2007, when all three game console makers will have released their next-generation products, the market will have grown to 44 million. And though Cell is exclusive to the PlayStation 3, IBM has a lock on the rest of the console market. Its microprocessors will also power the consoles of Sony's competitors, Microsoft's Xbox 360 and Nintendo's Revolution.

The Cell-powered PlayStation 3 can expect to pick up a little less than half of what could become a market worth up to $9.5 billion in 2007, according to iSuppli senior analyst Chris Crotty. And, of course, there are other high-volume plans for Cell.

Toshiba Corp., in Tokyo, for one, plans to build television sets around it. The company has already shown that a single Cell processor can decode and display 48 compressed video streams at once, potentially allowing a television viewer to choose a channel based on dozens of thumbnail videos displayed simultaneously on the screen. And in a smaller market, Cell has already found its first outside customer in medical- and military-systems maker Mercury Computer Systems Inc., in Chelmsford, Mass., which is developing a two-Cell blade server due out by April.

With two such massive consumer electronics makers as Toshiba and Sony behind it, Cell is an obvious attempt to control the "digital living room," as technology executives have dubbed their dream of a home where all the media players are intelligent and networked together. "[Sony's] goal is to make a computer fun...to make it an entertainment platform," says Sony's Cell director Masakazu Suzuoki. "But even if we make the Cell system an entertainment platform, there's nothing if there's no content."

Indeed, experts say Cell's success hinges on whether programmers outside IBM, Sony, and Toshiba will be able to exploit the gigaflops that Cell has to offer. Tony Massimini, chief of technology at the consulting firm Semico Research Corp., in Phoenix, puts it bluntly: "Cell has strong potential, assuming that the game developers satisfy their customers' needs. But if the games suck, who wants to buy it?"

That Cell has more than one processor core on a single chip is more a sign of the times than a revolution. All the microprocessor stalwarts are moving to multicore design. The principal reason is that the old way of doing things—increasing the number of calculations per second by shrinking the processors into a tighter knot of tinier transistors and then dialing up the clock speed—has essentially crashed headlong into the brick wall of heat generation.

Because transistors using today's technology are so small, even when they are supposed to be in the "off" state, infinitesimal currents still leak through them. That leakage warms them constantly, and with the extra heat generated when transistors switch "on" or "off," it produces a microfurnace on a chip. If chip makers had continued on their old path, by the year 2015, microprocessors would be throwing off more watts per square millimeter than the surface of the sun.

As a result, the industry has shifted from maximizing performance to maximizing performance per watt, mainly by putting more than one microprocessor on a single chip and running them all well below their top speed. Because the transistors are switching less frequently, the processors generate less heat. And because there are at least two hot spots on each chip, the heat is spread more evenly over it, so it's less damaging to the circuitry and easier to get rid of with fans and heat sinks.

Multicore processors on the market today are generally symmetrical—that is, they have two copies of essentially the same core on one chip. Cell, on the other hand, has an asymmetric architecture that contains two different kinds of cores [see photo, "Cell City Map"]. One, the Power processing element, is similar to the CPU in a Mac; it runs the Linux operating system and divides up work for the other eight processors to do. Those eight—called Synergistic processing elements—are designed specifically to juggle multimedia applications: video compression and decompression, encryption and decryption of copyrighted content, and, especially, rendering and modifying graphics.

The Synergistic elements were built from the ground up to do what are called single-precision floating-point calculations—the kind of operations needed for dazzling three-dimensional graphics and a host of other multimedia tasks. The design traded flexibility—a Synergistic element is not versatile enough to run the Linux operating system on its own—for eye-popping speed. When pushed to its 5.6-gigahertz limits, a single unit can do 44.8 billion single-precision floating-point calculations per second. Not wanting to cut Cell off from a role in scientific computing, its designers included circuitry in each Synergistic element that can do the more exacting calculations, called double-precision, that scientists demand, but its performance is only about one-tenth that of the single-precision unit.
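The quoted per-element figure is easy to sanity-check. Assuming each Synergistic element retires eight single-precision operations per cycle — a four-wide fused multiply-add, which is an assumption made for this sketch rather than a figure stated above — the 44.8-gigaflop number follows directly:

```python
# Back-of-envelope check of the quoted 44.8-gigaflop figure.
# Assumption (not from the article): each Synergistic element performs
# a 4-wide fused multiply-add, i.e. 8 single-precision flops per cycle.
lanes = 4
flops_per_lane_per_cycle = 2        # one multiply + one add, fused
clock_ghz = 5.6                     # the 5.6-GHz limit quoted above
per_element_gflops = lanes * flops_per_lane_per_cycle * clock_ghz
print(per_element_gflops)           # 44.8
```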

In fact, the Synergistic elements are so fast that a single one could easily consume the entire bandwidth on the interconnects to the off-chip memory, leaving its siblings starved for data and stalled out. IBM and its partners had to design a special chunk of circuitry into Cell just to prevent that problem.

Apart from its raw power, Cell has content-protection tricks that should make it attractive to multimedia applications makers. For instance, the Synergistic element's architecture prevents any application or external device from accessing the element's local memory, so that, for instance, a program cannot steal a music file that is being decrypted by the processor. "Once you bring your code in and decrypt it, it can execute in a virtually trusted environment," says IBM's Cell architect Charles R. Johns. "All the data it calculates on, sends out, and brings in is fully protected."

The isolation function can be used in several ways, says Kahle. "We knew we couldn't anticipate all the different security needs in the future, but we wanted to know we had the right hardware to support a very robust security system."

Barry Minor's Mount Saint Helens simulator is a good example of how Cell's different processors work together. His program takes a satellite photo of the volcano, lines it up with an elevation map, and then turns it into a detailed 3-D terrain on the fly. The Mount Saint Helens data has a resolution of 2.4 meters. The city of Austin, where the Cell design center is, once gave Minor access to its 15.4-centimeter-resolution satellite map. "You could land in Michael Dell's backyard and check out his view," Minor says with a grin.

What's happening inside the processor is a finely choreographed dance. The Power processing element starts by figuring out where the joystick is pointing the simulator in the stored 2-D maps. Then it divides that scene into 32 portions, four for each Synergistic element. Though perfectly capable of it, the Power processing element does no calculations on the actual data. Instead, it plays to its strength as a controller, figuring out which chunk of work should go to each of the other cores according to how complex the scene is and which cores have more or less time on their hands.

The Synergistic elements then go to work. They pull their portion of the data into their local memories, which they can access at great speed. Then each runs a rendering algorithm on the data and stores it off the chip in the system memory. When the processors are done, they signal the Power element, which instructs one of the synergistic units to run a video compression algorithm. That processor compresses its sister units' finished products and then pushes them out to be displayed on the screen or streamed to a PDA or some other device.

Because the compression takes less time than rendering the graphics, the compressing processor automatically switches gears when it's finished and runs the rendering algorithm on a portion of data until it's needed for compression again. With each frame, the process starts over.
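The controller/worker split described above can be sketched as a simple load-balancing loop. This is an illustrative Python model, not Cell code: the cost values, worker count, and function names are all invented for the sketch.

```python
import heapq

def scatter(costs, n_workers=8):
    """Greedily hand each portion of the frame to the currently
    least-loaded worker, mimicking how the Power element weighs scene
    complexity and core idle time when dividing work among the eight
    Synergistic elements. (Illustrative only; not a real Cell scheduler.)"""
    heap = [(0.0, w) for w in range(n_workers)]   # (accumulated load, worker)
    heapq.heapify(heap)
    queues = [[] for _ in range(n_workers)]
    # Hand out the biggest portions first, so a large job never lands
    # on an already-busy core at the end.
    for idx in sorted(range(len(costs)), key=lambda i: -costs[i]):
        load, w = heapq.heappop(heap)
        queues[w].append(idx)
        heapq.heappush(heap, (load + costs[idx], w))
    return queues

# 32 equally complex portions across 8 workers -> 4 portions each.
queues = scatter([1.0] * 32)
```

With uneven costs the same loop naturally gives busy regions of the scene to whichever cores "have more or less time on their hands," as the article puts it.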

This dance works so well for two reasons. The first has to do with the way Cell handles memory. Rather than waste several clock cycles waiting for the right data to arrive from memory, a Synergistic element works only on data stored in its own 256 kilobytes of memory, to which it has a high-bandwidth connection. More important, Cell's memory-handling engines can be programmed to keep data streaming through the processor. "We can get over 128 memory transactions going in flight at once," boasts Michael N. Day, a distinguished engineer at IBM.

The memory-access engine takes in new data and sends out the old just in time for the synergistic unit to perform the necessary calculations. When Cell runs Minor's volcano simulator, it waits for data to arrive from memory for only 1 percent of the time; the G5, in contrast, stands idle for about 40 percent of the time.
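The just-in-time streaming Day describes is classic double buffering: fetch the next chunk while computing on the current one. Here is a minimal Python sketch, with a plain assignment standing in for the asynchronous DMA transfer a real Synergistic element would issue:

```python
def stream_process(chunks, compute):
    """Double-buffered streaming: while one local buffer is being
    computed on, the next chunk is already 'in flight' into the other,
    so the compute unit (ideally) never stalls waiting on memory.
    Sketch only -- real Cell code issues asynchronous DMA transfers."""
    buffers = [None, None]
    if chunks:
        buffers[0] = chunks[0]                     # prefetch the first chunk
    results = []
    for i in range(len(chunks)):
        if i + 1 < len(chunks):
            buffers[(i + 1) % 2] = chunks[i + 1]   # "DMA in" the next chunk
        results.append(compute(buffers[i % 2]))    # compute on the current one
    return results
```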

Cell's other key to speed has to do with breaking problems into parts that can be done in parallel. In Minor's simulation, it probably seems obvious that an image can be divided up into eight strips and these worked on independently. What wasn't so obvious was that the 3-D rendering could be done four pieces of data at a time within each synergistic processor. Such four-way parallel computing is called single instruction multiple data, or SIMD, and it is particularly well suited to the manipulation of graphics and other multimedia.

In these problems, you typically want to perform the same operation on each of the elements in a large chunk of data. For example, to increase the brightness of an image, you'd want to add the same number to every pixel in it. Since around the mid-1990s, general-purpose processors such as the Intel x86 architectures have been doing SIMD computing using a set of multimedia-specific instructions, explains Princeton's Lee, a multimedia instructions pioneer.
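Lee's brightness example maps directly onto the SIMD pattern: one instruction's worth of work applied to a whole vector of pixels at once. The following plain-Python sketch spells out the four-wide version scalar by scalar (real SIMD hardware handles each group in a single instruction; the clamp to 255 assumes 8-bit pixels):

```python
def brighten_simd4(pixels, delta):
    """Add the same value to every pixel, four lanes at a time --
    the shape of a 4-wide SIMD saturating add, written out in
    scalar Python for illustration."""
    out = []
    for i in range(0, len(pixels), 4):
        lane = pixels[i:i + 4]                         # one 4-wide vector register
        out.extend(min(p + delta, 255) for p in lane)  # saturating add per lane
    return out
```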

But SIMD instructions run far faster on Cell's Synergistic processors, because the Cell processors were designed from the start to handle them. And don't forget: there are eight such processors on each chip. Cell programmers spend most of their time turning complex algorithms into efficient SIMD algorithms, says Minor. "Once you've done that, you're 80 percent done."

The chip's commercial success will depend on whether programmers can learn to exploit its full potential. To that end, the developers have from the beginning put a high priority on crafting the appropriate software tools.

One of the key deadlines the Cell development team had to meet was having its software ready and tested in time for the arrival of the first chips, in spring 2004. The software team was running programs on a Cell simulator two full years before it got the first chip—and when the chip finally arrived, both the operating system and the applications worked on the first try. "Had we waited to do software development until the chip came back, it would have been a disaster," says Theodore R. Maeurer, software manager at IBM.

With such a head start on the software, the group could focus on how to familiarize new programmers with Cell. "A programmer has to do a really nice job of laying out the data transfers and so forth," says Day. But soon that job will be turned over to the compiler and the programming tools. IBM software engineers are also developing tools that will make it easier for programmers to divide tasks between the Power element and the Synergistic cores, and they're making others to automatically find solutions to problems that fit well with the Synergistic units' SIMD strengths. The company has already released more than 700 pages of documents to applications developers and will begin releasing tools and compilers, as well.


Cell's asymmetric architecture signals the beginning of a big shift in how computers are programmed, says Craig Steffen, a senior research scientist at the National Center for Supercomputing Applications, Urbana-Champaign, Ill., who gained some fame lashing together 70 PlayStation 2 consoles to form a $50,000 supercomputer.

"How do you program with eight engines running full speed without them constantly stopping and waiting for data?" Steffen asks. Cell will force mainstream programmers to wrestle with that question. But ultimately, parallel programming will become fairly routine, he predicts. "Over the next several years, we won't think of an asymmetric processor as anything different."

Indeed, some think Cell is an indication of what's to come in other microprocessors. "In the future, we'll see convergence of general-purpose multiprocessors and game- and media-oriented processors," says Princeton's Lee. "Media processors will become more general purpose, and general purpose, more multimedia." And with any luck, that will make your living room a more entertaining place.

Cell Microprocessor

Goal: Make a new microprocessor architecture that beats all others at handling graphics and broadband multimedia.

Why it's a winner: It met that goal and is being designed into high-volume mass-market items like game consoles and televisions.

Organizations: IBM, Sony, and Toshiba.

Center of activity: Austin, Texas.

Number of people on the project: 400 at its peak.

Budget: US $400 million.