Welcome to the apocalyptic edition of GameSpotting, in which our editors spill their guts about games and everything...one final time. Yes, that's right. After nearly four years of weekly installments, GameSpotting is done. "Why?" you ask. "Why must this mortal coil torture me thus, as to wrest my beloved GameSpotting from my heart and body? Alas! Would that I could have been cut down in my prime instead!" Oh, go on now, dry those tears, sonny boy. It's not like we won't continue taking time out of your busy schedule to regale you with litanies of complaints and occasionally involuntary insights about gaming. It just isn't going to happen here anymore, because this old place, it has passed. Thanks for joining us as we put GameSpotting out to pasture, proper.
Jeff Gerstmann/Senior Editor
"If this business is going to keep on taking money away from the movie and TV business, it needs to be aiming higher than reflective surfaces and two tons of light bloom."
Ricardo Torres/Senior Editor
"I think the competition between the next generation of platforms may end up feeling a lot like the now-classic battle between the Super Nintendo and the Genesis back in the 16-bit day."
Alex Navarro/Associate Editor
"If there's one thing I hear more often than anything else out of the hardcore gaming set (and it upsets me to no end), it's one tirade or another about how much they hate casual gamers. You know? People like me."
Jason Ocampo/Associate PC Editor
"All Tor could do was whoop it up and holler as every little detail came in about how the consoles kick ass. But you know what? I'm not taking it any longer."
Brian Ekberg/Sports Editor
"Imagine a Mario Bros. game as designed by Hideo Kojima. How would playing the next Madden NFL game change if you mixed in Valve's vaunted Half-Life 2 physics model?"
Justin Calvert/Associate Editor
"Since The Streets Of is a game that exists only in my own mind, it seems reasonable to assume that the following preview is a GameSpot exclusive."
Steve Palley/Chief Editor, Mobile Games
"We can infer a lot about the stereotypical gamer simply by looking at the types of products that have won the money race over the past few years, putting on our Masters in Business Administration caps and working backwards."
Avery Score/Assistant Editor, Mobile Games
"While transitioning to the next generation of hardware, console manufacturers have announced plans to strip their new systems of features and functionality we'd taken for granted."
Greg Mueller/Associate Editor
"There are dozens of factors that contribute to the success or failure of a system launch. But in the end, any gamer will tell you that it's not the color of the box, the shape of the controller, or the system specs that sell a console. It's the games."
Kurt Collins/Associate Software Engineer
"It's ridiculous to think that while games have changed drastically over the years, we've somehow managed to get rid of intuitive interfaces in favor of more-complicated setups involving multiple keyboard configurations and seven-button mice."
Adam Buchen/Associate Software Engineer
"As long as you ignore the rules of logic and sensibility, you will never have to worry about whether or not you made a good investment. That sure helps all the insecurity go away, doesn't it?"
| Greg Kasavin|
Confessions of a Poseur
One of the most cynical expressions I've ever heard goes like this: "Those who can, do. Those who can't, teach." Some of the most world-weary purveyors of this belief also like to provide the following addendum: "And those who can't teach, criticize."
To them I say: Fair enough. The first rule of being a critic is, "Be able to take at least as much as you're willing to dish out." The only way for critics to contend with society's natural mistrust and disdain for their willingness to act authoritatively, without having any real credentials to do so (in most cases), is to be accountable for their commentaries and to communicate with conviction. They need to stick to their guns. At any rate, I'm not really opposed to all the negative views of this profession, despite being in it. More to the point, I happen to believe that the art of designing games is inherently nobler than the art of critiquing them. Put everybody on my scale, and it'd have a bad game designer on about the level of a good game critic. Why do I critique games? Because I can't make them myself, and because I can't teach anyone how to make them either. I prove the cynical theory. I'm a cliché.
Critics always open themselves up to retorts to the effect of, "Well, if you're so smart, let's see you try to do better." Such responses are predictable and understandable. After all, what drives the critic, if not a cynicism similar to what's behind the "those who can't, teach" expression? Do critics get off on dragging the hard work of others through the mud? Or maybe criticism is their only outlet for venting an underlying, even subconscious, envy at not being in the position (or not even being able) to create that which they've focused on so intently.
You could say that last thing about me. I'd certainly say it if I were to critique myself as though it (meaning "I") were "just" a game. People are just as deserving a subject of criticism as anything, after all. So there it is: I'm no critic. I'm just a poseur who's come as close as possible to that which I admire. Ten years into my career, I still feel guilty getting to do what I do for a living.
As mentioned, one of the problems with critics is that they tend to present themselves as authority figures without necessarily having any legitimate credentials to give them authority. There's no barrier to entry in this line of work, ergo everyone's a critic. Sure, I've spent a lot of time playing games. But what right do I have to deem them successes or failures, in the grand scheme of things, when all I ever have to go on is just my own personal experience, colored by my own private subjectivity?
Certainly I've wondered what it must be like to work on the "other" side. I've wondered a lot about what it must be like to spend years focused on a single project, as opposed to having the luxury of being able to move through games (and therefore different experiences) as quickly as possible. Never a dull moment. For the most part, my few close acquaintances who've worked on games have flatly told me it's not worth it--too much toil, too many late nights for something that's such a gamble. I understand. I nod and smile and express my sympathy. But it's not like they can scare me. I've never thought the process of making games would be a bed of roses, and in fact, the main reason I admire the profession as much as I do is because the tireless work ethic that many game developers have in common appeals to me. So I've wondered whether I have what it takes to be a game designer. Do I dare consider myself among the ranks of those people whose work I critique for a living?
There came a point when I more than just wanted to know the answer to that question. Specifically, I challenged myself to actually formulate an original game design document. That is, I decided to think through an idea of mine and commit it to writing, spelling it out in such a way so that anyone else could read the draft and hopefully get as excited about the idea as I would. They'd picture the game I had in mind as clearly as I could.
The catalyst for this personal challenge was the Developer for a Day contest hosted on GameSpot's own community message boards--an informal-but-serious contest (featuring a respectable cash prize) intended for would-be game designers willing to participate in a voluntary test that would see their work judged by a panel of so-called experts: namely, GameSpot editors. To make a long story short, and to confess to something I've never told even to my closest colleagues, I entered that contest for two consecutive years under a pseudonym. Let me explain.
The first year of the contest was 2003. I agreed to be one of the judges and was impressed with the quality and dedication evident in the work that was submitted. It was at around this time that it occurred to me that, for once, I had the opportunity to be a doer rather than a critic. The following year, when Developer for a Day 2 was getting started, I informed my colleague and contest proprietor, Adam Buchen, that I could no longer participate in the judging. Of course, I didn't get too specific about why.
I wanted to see if I had what it took to win the contest. However, I obviously couldn't enter using my real identity, because there's no way that the judges could be expected to objectively evaluate my work if they knew it was mine. And furthermore, the other contest participants would justifiably be able to cry foul play. I was also concerned that some might think I had an unfair advantage in the contest, since, being a GameSpot editor myself, arguably I have an awareness of what the other judges might specifically be looking for. Plus, I also write about and evaluate games for a living. What I didn't want to do--and what I sincerely hope won't happen as a result of my confession here--was to subvert the contest in any way. It's a great chance for any would-be game designers to feel like they're in that element.
Eventually, I reconciled my decision in several ways: One, I would enter the contest anonymously and only with the intent to take it seriously. Thus my work would hopefully help legitimize the contest itself as something worthwhile. I already had a lot of respect for the grassroots effort that went into it and wanted to contribute again in a meaningful way. Two, I, in fact, had no unfair advantage over the intelligent, resourceful members of the community who would be my competition. The judges of the contest wouldn't be announced until the submissions were finished. And, presumably, none of us had any real previous experience working on games or game design documents. I sure didn't. Three, it was clearly stated that the quality of the prose wasn't key to the contest. So my ability to write flowery sentences wouldn't help me much if I couldn't come up with well-formulated ideas. And four: Frankly, I wasn't breaking any of the contest rules anyway.
My 2004 contest entry was an adaptation of my favorite fantasy novella, something I've always thought would make a great video game. I combined it with ideas I've had about the action adventure and fighting genres, along with my interest in melodramatic, self-important storytelling (begging your pardon). I won the contest that year, a thrilling moment that I was hoping for but wasn't expecting. I was elated, certain that my concept could make for an outstanding game. Like all feelings, this one was fleeting, though it lasted for a good couple of weeks.
The following year--this year--I decided I'd enter the contest again to see if I could go two for two. Some of my critics from the previous year surely thought I was a one-trick pony, and so I was motivated to prove them wrong. There's nothing like the potential to prove people wrong. This time I spelled out a more original idea for a comedic action adventure game styled after teen-geek movies from the '80s. I wanted to "show my range as an author" with this second concept. I produced it much more quickly and fluidly than the first one, resulting in a document I felt better about than the last one. And it too ended up taking home the grand prize, though the competition was fierce. (The contestants had read the winning entries from the previous two years, after all.) And in case you were wondering... No, I never claimed Adam's prize money. September23 sincerely told him, in so many words, that the journey was its own reward.
What was I hoping to accomplish with this two-year experiment? I didn't really know at the time. This, then, is perhaps what I was trying to accomplish.
I try to rationalize my actions as sort of a wannabe Fast Times at Ridgemont High type of thing--before writing which, author Cameron Crowe actually posed as a high school student for a year in order to better inform his work. And then there's the less-idolized '80s classic, Just One of the Guys, a similar source of inspiration. Ultimately, I simply hope all this makes for interesting reading. And if you proceed to take a look at the design documents, I hope they give you more insight as to where I'm coming from as a critic: This is what I'm into and how I am. And the next time your reaction to one of my reviews is, "If you're so smart, let's see you try to do better," well, there you go.
As for me, what did I get out of the experience? I got the satisfaction of seeing a lengthy personal project through from start to finish, and I got to step out of the role I play on a daily basis at this site by doing something a little different. I got the experience of trying to think about a game design holistically. I came closer than ever to realizing how tempting it must be to keep adding new features while working within the constraints of the reality of a project that has a limited budget of time and other resources. I got the benefit of critical feedback from the community, as well as from my colleagues--the toughest, best game critics out there--about my ideas and my presentation. Afterwards, I even submitted my documents to the few acquaintances I know personally in the development community to get their input. Theirs was a reality check. "That's nice, Greg. But you know games aren't built on paper."
They're right, of course. Great games aren't built on paper. And for that matter, they're not built by individuals. They're built by great teams. Great big teams. A good concept is wasted without good execution. And good execution is no weekend project. It takes years of toil and tons of money. Most of us who think we'd like to make games merely like the idea of it.
So, at the end of the day, I still have no compelling evidence to suggest I'd be capable of being a game designer--even a bad one. I have no reason to think I'd be capable of doing a better job on the "other" side than the sort of work I've been training in for about a decade. And so the contest was everything and nothing to me; it was literally the pursuit of a dream. This pursuit is something that's always motivated me to be involved in this field, and to deeply appreciate those who create the work that I then criticize using much less time and effort.
In the past, I've debated both for and against the idea that games could be considered an art form. For my final column in this format--a format intended to be personal but specific to games--it's only fitting that I settle the debate at least for my own edification, if not for yours: There is no nobler pursuit than making games. Not for my time and money there isn't. I am and forever will be a humble servant of this art in whatever capacity.
| Jeff Gerstmann|
You Are Being Lied To
"If this business is going to keep on taking money away from the movie and TV business, it needs to be aiming higher than reflective surfaces and two tons of light bloom."
As we move from one generation of gaming hardware to the next, right now the only real piece of next-generation content we have to look at is the graphics. While we've seen playable Xbox 360 games, they aren't really in a state where you can accurately judge their gameplay...or their graphics, for that matter. As for the PlayStation 3, all we've really seen is a series of videos that may or may not be real. Graphics, as you probably understand, are important. They draw you in, right off the bat, before you've figured out if a game plays well or not. But there's a dark, seedy underbelly to game graphics. While I wouldn't classify it as purposeful deceit, I can't help but feel like we're all being lied to. No, I'm not talking about the water-cooler question du jour about whether that silly-ass Killzone movie is real or not. The lie goes much deeper than that. The lie's been going on since 1984.
In 1984, Universal released a movie called The Last Starfighter. While most circles wouldn't really regard the film as a classic, it's got a slightly higher profile to people who have spent the last 20-plus years playing games. I was nine when it was released, and at the time, it was awesome. Having just rewatched the movie recently, one thing stood out to me. God damn, that ship looks good! Too good. The Gunstar was well-lit, with great use of shadows, and get this: the edges of the ship? Totally smooth. No jagged edges. No aliasing whatsoever. On top of that, the ship moved with a startling amount of fluidity. No need to stop and try to guess at the frame rate of the footage. It simply qualifies as "smooth."
The Gunstar and a handful of other spacecraft and scenes were rendered by a crazy supercomputer from Cray. The Cray X-MP was said to perform 180 million calculations per second, and like any good supercomputer from the '80s, it weighed thousands of pounds. Today's computers, as you might expect, make the Cray machine look like a calculator with a knife sticking out of it. Totally worthless, at least on paper. But in practice, the CGI footage shown in The Last Starfighter and other movies of the era looks better than the games of today. How the hell is that even possible?
Today, we seem to be constantly worried about the same sorts of problems with game visuals. Does the game run at a smooth rate? Nothing will rip you out of an immersed state faster than a chunky, wild frame rate. How do the models look? How's the animation? Do things move smoothly? What's the image fidelity like? Do the models and environments look smooth and lifelike? Most modern games seem to scrape by, but let's just be clear. Even today's best-looking games are sort of all over the place. On a high-end supermachine, Battlefield 2 still occasionally bogs down a slight bit. Grand Theft Auto: San Andreas, one of the most popular games around, sports jagged-edged characters and draw-in that might be fine by today's standards, but anyone who doesn't play games on a regular basis probably wonders why the hell the cars take so long to show up onscreen. Halo 2 has noticeable level-of-detail issues. If the Cray could render such flawless-looking ships in 1984, why is any of this still a problem?
Larger areas are part of the problem. Obviously, creating and animating one or a small handful of ships against a black-star background isn't as taxing as rendering the inside of some Covenant outpost. Plus, the Gunstar stuff wasn't rendered in real time. But after watching the movie, I couldn't help but think that 20 years of technological improvements should at least have gotten us to the fidelity level of a 1984 sci-fi film. Yet here we are--2005, new consoles on the horizon, new PC technology in development--and we're still wondering about the same stuff. Meanwhile, someone's working on a processor that will specifically handle physics modeling. OK, fine. How about a processor that just says "Sixty frames per second at all times." on it, while we're at it? If I were a console manufacturer, that'd be the first step of my approval process, because I feel that's totally key to the experience, especially when trying to draw in people who haven't been playing games for decades and don't understand the concept of dropped frames. If you're trying to build a human reactor with G3n3rat10n R3m1xXx's world-famous VelocityGirl at the center, that reactor should be locked at 60.
So there you have it. The new high-water mark is more than two decades old. All your games look like garbage. Photo-realism is a weak goal. It's "heck no, we won't go" time, and we ain't leavin' until there are no jaggies, no choppy frame rates, and no more wooden, lifeless characters. If this business is going to keep on taking money away from the movie and TV business, it needs to be aiming higher than reflective surfaces and two tons of light bloom. Otherwise this next generation is going to end up looking a whole lot like--oh, I don't know--the PC games of today. Now that's a weak goal. Go buy The Last Starfighter on DVD right now.
| Andrew Park|
When Looks Aren't Enough
Stop me if you've heard this before, but the way a game looks is important. Unfortunately, there are still people out there (maybe even some of you) who will read the previous sentence and immediately blurt out, "I'd rather have a game that's fun" or "Gameplay is more important than graphics." For the past six or seven years (at least), that kind of thinking has been completely wrong, because it suggests that, somehow, you can't have a game that both looks great and plays well. To prove you wrong, I could have made a list of games--on all major platforms--as long as your arm, where each game had excellent production values (graphics, voice acting, music, special effects, and so on) for its time and each was also fantastic to play, in addition to being that much better-looking and better-sounding. That's not the point of this article, however.
Some years ago, I met with Revolution Software's Charles Cecil at the now-extinct ECTS game show in London, England. Mr. Cecil was very excited to talk with me about the game his studio was working on, and for some reason, he talked for quite a while about how his game's characters had a highly stylized, cartoonlike look to them. In his words (I'm paraphrasing), "It was crucial for the studio to make a game that looked different, because so many other games in development were desperately attempting to have photo-realistic graphics. As a result, all these games tended to look alike, and each one would have a much tougher time distinguishing itself to catch the eye of people who were looking to pick up a new game." He was right, of course. Fast-forward a few years, and we have games (and plenty of them) that are difficult to tell apart at a glance.
That's not to say that graphics aren't important or that they don't have room to improve (they just have far less ground to cover). They're still the first thing anyone looks at, literally, the first time anyone sees a game, and they remain the most reliable way to catch someone's attention. Unfortunately, they're also the most expensive and time-consuming thing to get right for most games. And considering that there's a new generation of console hardware on the horizon and a new generation of PC processor hardware already upon us, are we going to start seeing a new crop of games that will be harder to tell apart? Obviously, most upcoming games will take advantage of DirectX 9.0 native features, including pixel shaders and per-pixel lighting, and if this year's GDC is any indication, many new games will put serious emphasis on object physics--items that can be realistically moved and manipulated within a game, like in Half-Life 2 or Silent Storm. Because the future of console hardware, most notably the PlayStation 3, remains an X factor, more studios are now looking to middleware, like Unreal technology, to get them up and running quickly on the new platforms, thereby letting them, hopefully, face a much gentler learning curve.
With the increasing costs of games (especially on the new hardware), I'm pretty sure we'll see more game companies attempting to mitigate their risks (or recoup their losses) by going multiplatform (and by "multiplatform" I mean "Xbox 360 and PC"). We'll also probably see more sequels, because it's already insanely difficult to launch an all-new property, and it will be even more difficult once the barrier to entry is raised even further with the advent of more-powerful hardware that also requires more time, effort, and money to develop for. I wouldn't be surprised if we also see games getting shorter on average, just because creating all that content (levels, character models, textures, animations, and so on) for games longer than 10 hours will take much longer and cost a mint.
Unfortunately, while graphics can get people to look, they're not as effective in getting people to stay and play. What's going to set games apart? I'll go on record right now and say that I seriously doubt the use of in-game physics is going to help any games stand out in the next generation. In fact, I'm pretty sure that most new games will use physics only to look pretty or as a gimmick, rather than making physics actually mean something in the game.

Because a lot of studios may need to make sequels and spin-offs to help them cover their initial costs, it seems like it will be more and more important to focus on things like fan communities and editing tools that fans can use to give the game a much longer lifetime, or at least one long enough to keep people interested until the sequel rolls around. (For this reason, it seems that niche games might still be feasible, as long as game companies can actually reach their intended audiences and make the games those fans want.) If you wanted to wax poetic, you might even say that creating a successful game requires companies to try to create not just an entertainment product, but a living, breathing entity that has to expand beyond just hitting the store shelves by effectively moving into fan-community, mod-making, fan-content, and follow-up-game territories.

Maybe so-called "emergent" gameplay (games that don't use predetermined, scripted events, but instead let you create your own experience in an open, sandbox-like world) will help set games apart. For example, the Grand Theft Auto series continues to be incredibly successful, and at least one upcoming game, Spore, seems to offer almost unlimited possibilities. I don't know if we'll ever truly get past graphics being as important as they are and as difficult and expensive to produce as they are (and will become), but hopefully, at some point, we can get past graphics and start focusing on other things. But that won't be for a while, if it happens.
| Ricardo Torres|
So another transition year is upon us. But unlike the previous changing of the guard between console generations, this one feels a bit off. Part of the problem is, in the case of the Xbox, Microsoft has gone off and taken Old Yeller out back and shot him in the back of the head too soon. On the one hand, it's as good a time as any to shift first-party development weight to the new platform, meaning no more current-gen stuff from MS, given the change in graphics chip manufacturers. Probably good business, too, since it's not like MS wants to be juggling two sets of partners working on two different platforms. However, it's still a shame, because the Xbox was really hitting its groove in the way a mature platform does once it's been out on the market for a while. This year saw the release of some really strong content for the platform that showed off its considerable power. If things had worked out differently, odds are they would have had a killer Christmas this year, with people snapping up Xboxes, which would have only helped their next Box. Instead, I expect they're going to have a weird holiday season with folks clamoring for 360s...and the Xbox probably getting left in its shadow. In that regard, there's nobody to blame but us...well, at least the sizable chunk of hardcore gamers who'll adopt quickly and early when there's new hardware. For better or worse, we all love shiny cool-looking things, and there's rarely anything shinier or cooler than new hardware.
But here's the thing about the promise of the next generation, which will surely see its fair share of epic bickering across IRC, forums, and blogs across the globe as gamers argue over who's kicking whose ass because of their platform's specs or even finer bits of minutiae: I think this next round of the console wars is going to end up being heavy on the déjà vu. This may well be the generation of platforms where the hardware isn't as relevant as it used to be. What does that mean? (Other than the fact that people may think I've lost my mind.) Well, consider this: The next generation of consoles are all fairly similar in terms of power and ability. Granted, the jury's still out on exactly how the Revolution is going to pan out, but I don't doubt there will be some amazing-looking games on it. In any case, the Xbox 360 and PlayStation 3 should be in the same ballpark in terms of power.
Each will obviously have its strengths and weaknesses, but at the end of the day, it's going to be about the content on the platform rather than a collection of specs on a piece of paper. This is why I think the competition between the next generation of platforms may end up feeling a lot like the now-classic battle between the Super Nintendo and the Genesis back in the 16-bit day. Both platforms were extremely capable machines that were basically equals, give or take a color palette or processor speed. Without any overwhelming superiority, developers were forced to focus on making the content on their machines unique and inventive. No matter how much each hardware manufacturer crows about its box's specs, the proof will be in the games. I can't imagine caring too much about the polygon-pushing might of any of the upcoming platforms, because they're all pretty freaking powerful. I expect the majority of games on the next boxes will look good, regardless of their poly counts. Yes, we'll still see some crappy-looking games, but most developers know how to make a decent-looking 3D game. At this point, it's going to come down to the very basic question of: Who's got the best games?
The wild card to the above theory will be third-party support. Platform exclusivity has become far less common, as publishers try to find the sweet spot between going wide with a title or sticking to one platform in order to make the most cash. We won't know how that's going to pan out for a while. Once developers start to get a proper feel for what developing for all the boxes is like, we might see some publishers choosing to focus on one platform. We might even see a further reduction in the number of platform-exclusive titles. Basically, however, it all depends on ease of development and the costs that go along with making a next-gen game.
Yes, this is all crazy theoretical talk, since all the boxes won't be out for another year or so, but it's worth some ruminating. The stage is set for arguably one of the most interesting and pivotal battles our industry has ever seen. Will Microsoft take out Sony? Will Sony be able to maintain its overwhelming lead over its competitors? Where will Nintendo be in all this? At this point, it's anyone's guess. But I'd say in the next battle, what you play on the consoles will likely be more important than what's under the hood.
| Alex Navarro|
Casual Every Day
I've been playing games for an exceedingly long period of time. I'm not an original, mind you. I came into this whole thing during the NES days. But since then, I've played me the hell out of some games. Admittedly, though, it wasn't until I got this job that I really got to see what the hardcore gaming audience was really like. Throughout my early years, I didn't really talk about games much with anyone else, save for a few friends I'd play games like Contra and Life Force with. It wasn't that I was ashamed of being into games or anything--hell, at that point in time, having a Nintendo Entertainment System made you one of the most popular kids around--it's just that I didn't really think of it as anything more than a hobby...a way to pass the time during dull summer days, and nothing more. In fact, I didn't even really start getting into the gaming "scene" until my teenage years, when I started writing for fan sites, again, mostly just as something to do. I'd start reading these Internet forums dedicated to the specific games I was writing about, and I'd marvel that there were people with these utterly insane levels of devotion to things like wrestling games, and RPGs, and what have you. I kind of didn't know what to make of it, really.
Once I started working here, the size and scope of people's devotion to their favorite games finally became clear as day to me. Gaming isn't just a hobby to a lot of people; it's a lifestyle. I know that's probably not anything even close to resembling a revelation to those reading this column, but it wasn't something I was altogether familiar with prior to my employment here. I was just a guy who liked games and liked writing (you can see the attraction to the profession, then). I wasn't entrenched in any major gaming scenes, I barely had anyone I talked to about games regularly, and, frankly, I just wasn't all that hardcore about them.
I read our forums and hear about people spending thousands of dollars to upgrade their PCs just to play one game or another, as well as people going out and buying dozens of games in seemingly minuscule periods of time. I also hear about people who get incredibly huffy when one negative thing or another comes out in the news or in reviews about a game they've been desperately anticipating for God knows how long. I've since gotten used to these typical bits of dialogue between members of our audience, and very little shocks me these days. However, if there's one thing I hear more often than anything else out of the hardcore gaming set (and it upsets me to no end), it's one tirade or another about how much they hate casual gamers. You know, the kinds of people who only play Grand Theft Auto clones and Madden? The kind of people whose short attention spans have led to the shortening and simplification of games? The deadly plague that threatens to tear the gaming industry apart, and throw it into a pit of overly commercialized doom the depths of which are utterly inescapable? You know? People like me.
OK, so those statements are basically brash generalizations, and I don't think I'm exactly the definition of a casual gamer. I own all three major consoles, most handhelds, and a fairly competent PC. I don't just play Madden or GTA. And my attention span does last for more than five hours' worth of gaming. However, on the flip side, I don't think I really qualify under the "hardcore" definition either. The idea of a 50-hour RPG is just about the most unappealing thing I can think of; I don't really buy all that many games (fewer than 10 since the calendar year began); and I happen to like some of that trite commercial junk that some of you out there claim to be the Devil incarnate. One of my absolute favorite games of this year (and of this console generation) is God of War. It seems like, for the most part, people have latched onto it as well, so it's not like it's just me. However, I've read plenty of snide comments from people talking about how it isn't hard enough and how it's way too short (it took me around 15 hours to beat it, incidentally), as well as people going nuts over really nitpicky stuff, like how it has a sexually themed minigame and whatnot. I'll admit that God of War is neither overly challenging nor overly long, but that's what I loved about it. It provided a solid challenge; it was just the right length to keep me interested but didn't wear me down; and the whole thing was just fun from beginning to end. But, hey, I also like wrestling games and football, so my opinion's meaningless, right? No, I'm not just being dramatic. That exact sentiment has been sent my way a couple of times via e-mail. Harumph.
I suppose I shouldn't take it personally. The casual gaming market really has become extremely big in recent years, and it has pushed some quality titles, aimed at a more dedicated audience, off to the side. Another one of my favorite games this year, Psychonauts, straight-up sold like ass its first month. That sucks. But, honestly, I can't say I'm surprised. For as awesome as that game is, most people are just going to look at that box art, raise an eyebrow, and then look for something more recognizable. It's unfortunate that it's come to that, but facts are facts: People flock toward recognizable names and franchises, as well as what looks real purdy on the TV ads. That doesn't mean the game industry is broken; it just means that it's finally grown up, and it can now lump itself in with all the other entertainment mediums that cater to the masses. Maybe it sucks for the hardcore, but in order for this industry to survive on the level it's aiming for, it's going to need that casual money.
I'm sure I'll get at least a few e-mails from folks telling me, "Whatever, man. You're just a sellout. You don't understand real games. You just play popular crap. You're the problem, man, not the solution!" That's fine. Just realize that claiming to be into the "real" stuff--the hardcore, the rebellious aspect of gaming--doesn't make you special. It just makes you the gaming equivalent of all those punk-rocker wannabes who shop at Hot Topic, get pissed when their favorite bands get played on the radio, and desperately try to be counterculture, only to end up being another brand of conformist regardless. You "comic book guy" wannabes who refused to even play Mercenaries simply because of its goofy subtitle, "Playground of Destruction," aren't doing anything except robbing yourselves of a potentially enjoyable gaming experience. Hate the game because of what it actually is, not because of what you presume to think it represents. Unless all you're doing is downloading home-brew games by crazy people who refuse to make money from any mainstream publisher, you're not really all that different from those of us who buy Madden every year. There is no publisher that's not out to make money, and more often than not, the publishers putting out the commercialized crap you love to lobby against are the same ones putting out the esoteric games that please the hardcore audience.
What I want people to understand is that the casual gaming audience is not your enemy. Just because someone buys Madden every year and doesn't understand your fascination with Atlus-published RPGs doesn't make that person your enemy. Sure, you may not want to hang out with that person necessarily, but you should at least respect the fact that he or she also enjoys games, even if it is on a more modest and mainstream scale than perhaps you do. Casualness by itself is not going to be the downfall of this industry. In fact, it's probably going to help it in the long run. Hardcore players and casual players can coexist in this industry, and, in fact, they'll have to if it is to continue to succeed.
| Ryan Davis|
When Games Go on Vacation
Just as surely as you can predict the end of the regular school year, the arrival of Jerry Bruckheimer movies, county fairs, illegal firecrackers, and the beginning of summer school, you can expect video game releases to come to a standstill during the summertime. But I'm not here to bellyache about how foolhardy this is, how everyone's on vacation, or whatever NPD sales data I can drag up about Star Wars: Knights of the Old Republic. Companies release all of their games at the end of the year because they have massive research evidence to back up the idea that, even in the horribly overcrowded holiday season, games sell better than they do in the summer, regardless of their quality. End of discussion. But the practical question remains, "Just what the hell are you supposed to do with all your video game systems during the dry summer months?"
You could replay those games you've already finished. (I've got a short stack of games that I wouldn't mind playing through a second time, including Alien Hominid, Mercenaries, and Psychonauts.) And, of course, then there are those games that I never got around to finishing in the first place--a list longer than I'd care to admit, and definitely longer than I'll commit to in this column. But even with these stacks of unplayed and highly replayable games staring me down, I've been more interested in flexing some of my systems' secondary skills.
Take my PSP, for example. Ever since I found my ceiling in Lumines (which I am loath to admit is a shamefully pathetic 98,000 points) and my copy of Tiger Woods PGA Tour mysteriously stopped working, there's been a distinct shortage of interesting games. Coupled with the fact that UMD movies are still ridiculously costly (I mean, seriously, how can Miramax justify charging $30 for Kill Bill on UMD when I can pick up the DVD for $15? The whole thing whiffs of MiniDisc.), the primary job of my PSP has gone from game machine to portable VCR. Through the machinations of my home Wi-Fi network, TiVoToGo, my 1GB Memory Stick Pro Duo, and a little personal-computing black magic, my bus rides to and from work have been made infinitely more tolerable now that I can watch last night's episode of Entourage or catch up on my backlog of Deadwood episodes. The process of getting TV shows from my TiVo onto my PSP is an arduous one--requiring a ramshackle suite of funky third-party and home-brewed software and more than a little bit of trial and error--but the point is that it works.
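The column only gestures at that ramshackle pipeline, so here, as a purely illustrative aside, is a sketch of its last step: getting a finished file onto the Memory Stick where the PSP will actually find it. The folder layout and naming scheme below (MP_ROOT/100MNV01/ and M4V00001.MP4-style filenames) reflect the convention early PSP firmware reportedly used for video playback, and the `copy_to_psp` helper is entirely hypothetical--none of it comes from the article itself. It assumes the show has already been transcoded to a PSP-playable MP4 with some other tool.

```python
# Hypothetical last step of a TiVo-to-PSP pipeline: copy an already-transcoded
# MP4 onto the Memory Stick using the M4V-numbered naming scheme that early
# PSP firmware is said to expect. Assumption, not taken from the column.
import re
import shutil
from pathlib import Path

def copy_to_psp(video: Path, stick_root: Path) -> Path:
    """Copy a transcoded MP4 to the stick under MP_ROOT/100MNV01/ with the
    next free M4V-style name, and return the destination path."""
    video_dir = stick_root / "MP_ROOT" / "100MNV01"
    video_dir.mkdir(parents=True, exist_ok=True)
    # Find the highest M4V number already on the stick so we never
    # clobber an episode that's still waiting to be watched.
    taken = [int(m.group(1)) for p in video_dir.glob("M4V*.MP4")
             if (m := re.match(r"M4V(\d{5})\.MP4$", p.name))]
    next_num = max(taken, default=0) + 1
    dest = video_dir / f"M4V{next_num:05d}.MP4"
    shutil.copy2(video, dest)
    return dest
```

Point the function at a mounted Memory Stick and each episode lands in the next free slot, which is roughly the sort of "black magic" the trial-and-error process above automates.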
Similarly, I recently messed around with a PSP that someone had gotten the original Doom running on. Doom has been popping up on all manner of unexpected platforms ever since id Software started giving away the source code some years ago, so I wasn't surprised to see a jury-rigged version of it running on the PSP--though, I will admit that it made me wonder how long it would be until someone constructed a Doom 3 WAD for the original Doom engine. Getting Doom to load onto a PSP requires two Memory Sticks and the nerve to swap out one for the other in the middle of the loading process. This is not for the faint of heart, and I have yet to roll the dice and take this risk with my own PSP.
What's most interesting to me is that all this gray-area use of console hardware is being incorporated into the infrastructure of the next-gen consoles. Hackers have been using mod chips to stream video and music from their PCs to their Xboxes for a while now, and this very functionality is one of the more highly touted secondary capabilities of the Xbox 360. Console hardware emulators have long been the bane of the copyright holders of old video games. Emulation of the NES, SNES, and even the N64 runs rampant on a wide variety of platforms, so Nintendo has decided that, rather than sit by and watch pirates play its games for free, it'll let you emulate those games on the Revolution--for a modest fee, of course.
I guess I'm a little surprised, and maybe even impressed, that these massive corporations are taking some of their plays from pirates and hackers--the kinds of folks that they've been openly vilifying for years now. But by doing so, video game manufacturers are closer than they've ever been to capturing the dream of an all-in-one set-top box, a dream that people have been chasing since Trip Hawkins frothed at the mouth while preaching the revolutionary wonders of the 3DO. This is all very exciting for the evolution of the console video game business. But frankly, I'd probably be happier if they just learned to put out more games during the summer.
| Brad Shoemaker|
Jumped the Gun
It's now an unavoidable fact that the next generation of consoles will start this very year--in 2005--not next year, as tradition would dictate. If it seems like only a few years have passed since we received the newest members of this generation (those being the Xbox and GameCube--released just days apart), well, that's because it's only been a few years. Those systems came out in late 2001, and I happen to think that we could get at least another year out of them--and out of the slightly more aged workhorse, the PlayStation 2--beyond what we're being given. But like it or not, Microsoft is forcibly birthing the next-next big thing sooner rather than later, and we, as the faithful hardcore, must capitulate to Bill Gates' new digital order...or languish on diminishing technology.
New consoles would have come out eventually, of course. That's the way of things. But I find it a little ironic that Microsoft, which currently has the most technically capable console, is now rushing to market with its next one, which will likely render the Xbox 360 inferior to the PlayStation 3 in one respect or another. That's also the way of things: Being first out of the gate means more time for your competitor to one-up you. What happens right now when a new multiplatform game comes out? We say, "I'll get the Xbox one. I'll wait for the Xbox port. That game? Looks best on the Xbox." Will it strike anyone as strange to say, "I'm waiting for the PlayStation version"?
Everybody and his or her brother (except for Nintendo, God bless 'em) has been preaching for years about the inevitable convergence of digital vectors into one all-inclusive device that will sit atop your television. Gaming, home theater, home computing... One day it'll all come from the same place, they say, and it will be glorious. Somehow, it seems the video game console has been appointed the vessel of this new revolution. The PlayStation and Nintendo 64 were the last pure consoles. Even the Dreamcast had a modem. Now you buy a console and it has any combination of Ethernet, hard drives, USB, "Powered by ATI," and "Intel Inside." It's not hard to imagine the day when the game machine will be indistinguishable from the TiVo, the PC, and the stereo receiver.
I used to be staunchly against this sort of full-scale media integration with my console. You buy this little box to do the one thing it does best: play games. You watch movies on that box there, and you play music with this other one. Keep it modular and you can swap out the ones that are outmoded or just plain broken down. In light of that fact, I'm surprised at how drawn I suddenly am to Microsoft's Xbox 360 media implementation. Maybe it's the seamless switching from music to high-def television to movies, back to games and your persistent online profile. Maybe it's the idea of networking to my home PC or the possibility, however distant, of adding in DVR functionality. Maybe I'm just sick of juggling a bunch of remote controls.
People seemed really down on the 360 after Sony's E3 press-conference-to-end-all-press-conferences, but I can't really figure out why. Think about it this way: Microsoft and its developers were all showing off real games running on unfinished hardware, all of which has a good several months yet in which to mature before it hits shelves. Sony had the luxury of showing off what some consider to be smoke-and-mirrors tech demos, and we certainly know from past experience that the final product is rarely, if ever, exemplified by tech demos (I'm still waiting for real-time Toy Story on the PS2, thanks). So it wouldn't surprise me much if the two systems equalize to some extent, when all is said and done.
Of course, the point has been made (and belabored ad nauseam) in the past weeks that game design is neither evolving nor keeping up with the speed of hardware development. Some of the next-gen stuff they've been showing off looks mildly cool, but nothing is blowing my mind the way I'd hoped...certainly not to the degree that the last consoles did. Need for Speed: Most Wanted was (I thought) the best-looking playable 360 game at E3, but the demo was literally about 30 seconds long, and it didn't seem to be doing anything, from a gameplay standpoint, that you couldn't do on existing machines. And many of the upcoming 360 games are just games that are also coming to the PS2 or Xbox but are slightly gussied up for their capable new platform. Great games will come out on the 360, no doubt, but the stuff they've shown off so far isn't doing a lot for me.
Maybe that's why I'm so excited about the media-box features of the Xbox 360 and why I want one of those things right when it comes out just so I can play every conceivable piece of digital content I own from one spot. I've never been much of an early adopter. In fact, I came a year late to the PS2 and Xbox while I waited for games I really wanted to play. And I'd assumed the 360 would be no different. Kudos to Microsoft, I guess, for finding compelling reasons to own a video game console besides the video games. Let's just hope those quality titles don't get lost in the shuffle.
| Jason Ocampo|
Associate PC Editor
E3 Like a Bat Out of Hell
So there I was, sitting in the GameSpot E3 booth during the big Monday press conferences. To my left was Tor, who was getting text-messaged all the big details as they happened from Jeff. At least until Jeff's battery ran out. But all Tor could do was whoop it up and holler as every little detail came in about how the consoles kick ass. And as a PC guy, I had to sit and take it. But you know what? I'm not taking it any longer. So here are my thoughts on E3.
Sony - Yeah, yeah, yeah. I've heard it all before. Look at all those pretty prerendered (that is, fake) videos. I've seen this before, Sony. It was called your PS2 hype. And you know what? The real thing didn't even come close. I'm just surprised that Sony didn't promise that North Korea would use the PS3 to make nuclear weapons (remember all those media stories about Saddam buying PS2s because they were so powerful that he could research weapons of mass destruction with them?) or that the PS3 would calculate the cure for cancer. Fool me once, shame on you. Fool me twice, shame on me. And by the way, I'll get excited about Killzone once Guerrilla proves it can make even a halfway decent game, because its track record to date is downright dismal.
Microsoft - You screwed up, badly. Imagine that you're running the ball in for the winning touchdown in the Super Bowl, and you fumble it without anyone having touched you. That's what you did, Microsoft. Where to begin? Let's see, people come to E3 to see games, of which you showed very little. All we saw were marketing snippets, for the most part. When your big Square Enix announcement is that you're getting a port of a PS2 game, your next-generation console is looking like a has-been before it's even hit the shelves. Dead or Alive 4 looked like Dead or Alive 3. Your supposed ace in the hole, Perfect Dark Zero, was MIA. When Peter Moore was talking about 250,000 Xbox 360 users being able to watch a virtual racing game, I'm not sure if he was talking to gamers or to potential corporate marketing sponsors. And where the hell was Halo 3? It's your ultimate killer app. It's the sequel to the game that sold the Xbox. I realize Halo 3 must not be ready for prime time, but you could at least have cobbled together a fake teaser (like Sony did with Killzone) to get people pumped. And don't even get me started about that MTV debacle...
Nintendo - Don't get me wrong. We all love Reggie. He's great. He's the kind of executive that Sony and Microsoft both need. I even saw him quietly come into our booth a couple of hours after his appearance on our E3 stage to give our makeup technician some free stuff. He didn't have a lackey do it for him; he did it himself. That's class. But come on, Nintendo, show us the goods. All we got was mumbo jumbo about revolution and Revolution. E3 is all about hype. Sony knows that, which is why it kicked Microsoft's ass. But it's like Sony and Microsoft couldn't even be bothered with you.
PC - Forget all that fake console stuff, the PC owned this show. Spore was the talk of E3. F.E.A.R. kicked all sorts of ass. Battlefield 2 was simply gorgeous. Prey looks pretty cool, especially how it messes with your sense of direction. But they all got lost amid the noise that is E3. Here's a nice example. Microsoft had by far the biggest booth at the show. It was the size of Sony's and Nintendo's booths combined. And it was Xbox-only. Microsoft had a much smaller booth dedicated to Windows games, which goes to show where PC gaming fits in an industry that will soon have six major consoles, two major handhelds, and mobile games. Still, the PC games that were there looked amazing. And if I hear one more PS3 fanboy talk about how the PC will never catch up, I'm going to throttle him. The PS3 basically uses the next Nvidia graphics chip, which we already have.
Random thoughts - Like many editors, I have a real love/hate relationship with E3. I love the show itself, as well as the chance to see amazing games. But I hate the crowds, the noise, the weeklong disruption to my sleep pattern, etc., etc. However, my best memory of E3 is standing on the stage, right at the end, while all of us were hanging out and talking about what we liked. It felt like a triumphant moment. And now, we have 49 or so more weeks to go until next year's E3. And if you thought this year was bad, next year is going to be positively insane.
| Brian Ekberg|
The Art of the Remix
J Allard recently called it "Generation Remix," the vaguely defined mass of post-Gen Y consumers that will presumably take to the streets in armed revolt unless they get their chance to purchase $30-50 customized faceplates for their Xbox 360s. As silly as that sounds--and as soul-crunchingly dull as a tiger-striped Xbox will most assuredly be--I think there's merit in the idea of the video game industry embracing the ability to "cover" one another's work, for lack of a better term.
After all, in nearly every other form of pop culture, "covers" are an acceptable sidebar to original material. The time-honored tradition of covering other artists' work is most visible in music, where a good chunk of today's performers make names for themselves by reinterpreting (or, in some cases, butchering) other people's tunes.
Just last night, I saw Howard Jones perform a perfectly moving rendition of the Dido song "White Flag," a tune I normally can't turn off fast enough when I hear it on the radio. Something about Jones' slightly aged voice and pared-down delivery brought out a new sensibility to the tune that I had never heard before, so I had to rewind it three times to get it out of my system. Of course, on the very same show, I saw Wang Chung do an atrocious cover of Nelly's "Hot in Herre." So, to be sure, we're not batting a thousand here.
Consider comic books. The mainstream stuff--that coming from DC and Marvel--is an entire industry dedicated to the art of the remix. Because the real moneymakers in the comics industry are the properties themselves--Batman, Spidey, and all the rest--and because they are, at the same time, many decades old, it's up to the individual creators to bring fresh new takes to characters and universes that were created and re-created, sometimes several times over, before many of them were born. Of course, mainstream comics perhaps take the remix example too far in an ugly direction. Take, for instance, the continual dependence on established intellectual properties as interpreted by creators who haven't brought anything new to the table in years (I'm talking to you, Chris Claremont), prioritized over new, creator-owned projects. It's an anchor that has been weighing down the industry for years. However, for every Claremont, there's a Warren Ellis, Peter Milligan, or Mark Millar who can bring appealing and original takes on established tropes and characters, or at least make them fun to read again.
The precious little lit-mag, The Believer, just published its music issue, complete with a CD of indie bands that cover other indie bands' material. Christopher Nolan is back to revitalize the Batman movie franchise after the Joel Schumacher bastardization, bringing the series closer to the darker tone of Tim Burton's original duo of films...themselves inspired in part by Frank Miller's gloomy reimagining of the character in The Dark Knight Returns. Tim O'Brien put out a Bob Dylan tribute album in the late '90s that, in my opinion, is just as strong an artistic statement as anything Dylan himself ever did. O'Brien manages to infuse the songs with enough of his own invention and fire to truly make them his own--even if he didn't write them.
So why not take the remix concept and apply it to games? No, I'm not talking about one more DDR sequel with yet another variation of "Butterfly." Rather, I'm talking about the essential and vital mixing of genres and talents that has the potential to create powerful pieces of interactive entertainment (God, that sounds so '90s). Imagine a Mario Bros. game as designed by Hideo Kojima. How would playing the next Madden NFL game change if you mixed in Valve's vaunted Half-Life 2 physics model? Shigeru Miyamoto helms a Forza Motorsport sequel. Resident Evil by Blizzard Entertainment. Grand Theft Auto: Ivalice. How about Gran Turismo as brought to you by the developers of Katamari Damacy? Of course, the examples don't have to be this forcibly disparate. What would NCsoft do with an EverQuest expansion pack? The list goes on.
Certainly, with the sheer volume of movie-licensed pabulum that passes as games these days, it's difficult to suggest that remix games could or should take any sort of precedence over original ideas and new game types. But the games industry has its staid concepts that have moved well past the point of welcome...the gaming equivalents of the Bond film franchise, so to speak. World War II shooters, for instance. Or platformers. Or, yeah, I'll say it: sports games. I'm not suggesting games like these go anywhere--God knows I practically wouldn't game at all if the sports genre didn't exist--but that it is never a bad thing to mix things up.
Is this likely to happen? Considering the ever-consolidating state of the industry and the massive amount of green being exchanged in exclusive licensing deals these days, it doesn't seem likely. Would Warner Bros. Interactive Entertainment have bet the farm on a turbulent reimagining of the Matrix franchise for The Matrix Online? Probably not. But considering the success of the title and the fate of the game's development house, perhaps they should have.
Be it the strict ties of intellectual property and licensing arrangements, or the relative youth of the industry, "cover games" simply aren't a huge part of the industry's output...yet. There are exceptions, like the creative reimaginings of currently existing properties, such as The Legend of Zelda: The Wind Waker or Metal Gear Acid, among others. And to be sure, not all of these experiments would work (come to think of it, perhaps Katamari Turismo isn't such a good idea after all). But at the very least, it would be an interesting alternative to the buckets of movie-licensed junk that line retail stores these days.
| Justin Calvert|
The Streets Of Q&A
Since The Streets Of is a game that exists only in my own mind, it seems reasonable to assume that the following preview is a GameSpot exclusive. Development on the game is currently zero percent complete, and there has been absolutely no interest expressed by any of the publishers that I come into contact with on a daily basis. Being in the unique position to have spent a little time playing the game in my head, however, I am pleased to report that it's coming along nicely.
GS: How long has The Streets Of been in development?
JC: The Streets Of exists only in my own imagination. It's not a real game, and I'm pretty sure it never will be.
GS: Can you give us a brief overview of the game?
JC: The Streets Of is a massively multiplayer online action game that places a strong emphasis on player-versus-player gameplay. The three "factions" in the game are the mob, the Thrill Killers street gang, and law enforcement. There'll be plenty of weapons to play with, of course, and players will be able to specialize in a number of different skills, many of which are faction-specific. Law enforcement players, for example, will be able to wear body armor, will learn advanced driving techniques, and will be more skilled in the use of many firearms than their mob and gang counterparts. Gang players, on the other hand, will be able to steal cars, GTA-style, use readily available everyday objects as weapons, perform drive-by shootings, and receive significant attribute bonuses when playing in a group with other gang members. Current ideas for mob players include stealth moves, bullet-proof limousines, and the ability to enlist the help of non-player characters.
GS: Will players be able to opt in and out of PvP?
JC: No, I've always felt that option is a little too artificial. All of the city servers will be PvP-enabled, and many of the missions that you carry out to earn money will require you to hunt down players from other factions, as well as non-player characters. Mob and gang players will have the ability to attack anyone in the game at any time, while law enforcement players will only be able to use their weapons in self-defense, to aid another law enforcement player, or against mob and gang players, whose criminal activities make them among the most wanted players in the city. Law enforcement players will see a red arrow above these players' heads at all times, indicating that they are viable targets.
GS: You mentioned missions. Can you give us any idea of what a typical mission in The Streets Of will involve?
JC: Since this is an action game rather than a role-playing game, most of the missions will simply require players to kill NPCs and players from opposing factions. A law enforcement player might have to prevent a crime by shooting any number of NPCs and players at a specific location, for example. A gang player might have to assassinate a mob boss who is being protected by gang players on an escort mission. Driving missions will also be included, along with organized street races that'll take place on the same streets where other players are busy killing one another.
The money that you earn for completing missions will be used to purchase better equipment for your character, incidentally, but it'll be your skills that win you fights in The Streets Of, not great equipment, and certainly not some number next to your name that represents nothing more than how many hours you've poured into the game.
GS: You also mentioned city servers. Where is the game going to be set?
JC: Each server will feature one of three accurately re-created cities: London, San Francisco, or Hong Kong. Players will only be able to create one character on each server, and the plan right now is that each player account will be good for up to three characters. So you'll be able to play as a gang member in all three cities, play all three factions in your favorite city, or anything in between.
GS: How do you plan to deal with death in The Streets Of?
JC: When players are killed, they'll be taken out of the game for a short time, during which they'll be able to choose where they respawn. Options will include the location where they just died, as well as a number of faction bases scattered throughout the city.
GS: Thanks for your time.
Like Paparazzi: Celebrity Shooter and Pub Quest before it, The Streets Of is currently in development for no platforms and has a tentative release date of never. We'll bring you more information on the game as soon as it becomes available.
| Carrie Gouskos|
Associate Editor, Mobile Games
The E3 Me
Being a developer at E3 and being a journalist there are fundamentally different experiences. But they have one thing in common: They remind you that E3 isn't supposed to be all fun. I've taken this long to write about my time there because this is exactly how long it has taken me to recuperate. Of my six years attending the show, this past one (my first as a GameSpot employee) reminds me most of my first E3, when I worked the better part of every day and didn't see any games that I didn't happen to be working on. But I've worn a number of different hats to the show, and each of them highlights different aspects of it.
If you work in quality assurance for a game developer or publisher and are lucky enough to get to go to E3, then chances are that they want you to work the show floor. Of the three years that I attended the show on behalf of Acclaim Entertainment, my first two consisted of exactly that. I was assigned to a section of games--in my case, the company's extreme sports lineup--and I had to stay there to explain the games (in the best light possible, of course) and reset the systems if they crashed. It's a simple enough task, and since I enjoy talking to people, I had no problem standing around making small talk for 10 hours a day. The problem arises when you throw the E3 atmosphere into the mix. Why would anyone come play Jeremy McGrath's Supercross game when Jeremy McGrath is standing four feet to my right, looking beautiful and signing autographs? What's the appeal of checking out the new tricks in the Dave Mirra sequel when he's pulling backflips on the halfpipe over my shoulder? If E3 is supposed to be about video games, why does all the E3 fluff seem so tempting?

When your role isn't clearly defined, it seems too easy to fall into these E3 pitfalls. For two years I attended E3 with almost no purpose whatsoever, once as a guest of my former publisher-employer and once as an employee of a retail chain. Without a place to be or appointments to meet, I merely wandered the show floor, taking in all of the games but also getting way too preoccupied with the events and the swag. Those years I had to return home with additional suitcases because I had picked up so much game paraphernalia. And for what? To wear it for a few years and then throw it out when I moved out to California...because nobody in the universe needs 80 video game T-shirts. Although I didn't fall prey to them, there were definitely events, like autograph sessions and lines for game demos and movies, that could eat up four or five hours of time.
As a visitor to E3, even if you have no place to be, you still have to monitor your time carefully, being wary of time sinks with low reward, no matter how tempting meeting Carmen Electra might be.
It was almost a relief that, as a journalist, I didn't even have time to stop and see what they were passing out or who was in what booth, let alone actually take part in it. This year, if I wasn't in an appointment, I was running to an appointment. And if I wasn't running to an appointment, then I was writing about one. Gone were the simple days of leisurely strolling around the show floor; if I wasn't hustling, I was late. While a mere year before, I would have hung around in the middle of a crowd to see exactly how much of Vin Diesel's shoulder I could glimpse, this year I had to know about the big signings only so that I could plan my route around them while sprinting to my next appointment. And to think, I have yet to go to E3 as an investor, a PR person, or the president of a major company (all entirely possible, of course).
E3 is my favorite time of the year, and it ranks up there in my top life experiences, somewhere directly behind being born and meeting Ice-T (which also happened at E3, incidentally). But it's amazing how one event can mean so many different things to many different people. It's the neat kind of amalgamation that you might only be able to get in the game industry, the cross section where work, business, interactivity, and entertainment all come together.
Steve Palley
Chief Editor, Mobile Games
The Social Animal
These days, lots of upbeat marketing surveys seem to indicate that games are hitting the mainstream and that the complexion of the average gamer is changing dramatically. Poppycock! Basically, this poor soul has degenerated from an already wretched creature into subhuman, gurgling filth. He (there are no female gamers, remember) is somewhere between 12 and 17 years of age and is afflicted with terrible acne. His sweatpants are in rags, his haircut is terrible, and his toothbrush needs to be replaced. He is surveilled by the CDC for information on his dietary habits, which some believe hold the key to the obesity epidemic. His libido knows no bounds, as he just met a new girl on IRC while hunting for Lara Croft pics, and he's trying to get her to model her schoolgirl outfit on her webcam. He's an OCD-afflicted completist, so he can be counted on to buy anything publishers throw in his general direction, especially if the gameplay is choked by product placement that's rife with digitized T&A and is cross-marketed with a Hollywood blockbuster starring completely vacuous, mass-produced characters. Oh, yes: The edgier, the better. Basically, he aspires to nerd status but is neither intelligent nor discerning enough to deserve the moniker. Why bother helping this luckless jerk socialize himself or expand his mind through gaming when a simple deathmatch will suffice? Other than his seemingly inexhaustible gaming budget, he's an easily replaceable cipher.
This is a rather harsh assessment, but it's not necessarily a controversial one. A lot of respected games professionals--including some dude at Nintendo named Miyamoto--have railed about the direction video games seem to be headed in: an expensive, generic, and dumb-as-rocks one. These people are taking steps in a myriad of directions to try to keep the industry from entering another cannibalization phase, which seems to happen about once a decade. Some of these ideas have to do with fundamental game design. Y'know? Boring nuts-and-bolts stuff like "writing," "controls," and "characters." Another possibility lies in the next generation of gaming hardware, which will apparently have the capability to do some pretty crazy things, if properly utilized. Still others are searching promising new frontiers, like mobile phones, for the "promised land," but it doesn't even seem like they've discovered the compass yet, let alone drawn a map.
In my humble opinion, all of these people are wrong to varying degrees, because they're misunderstanding the goal. These well-intentioned souls are stumbling around, testing out one theory after another to try to figure out what people consider entertaining. They're looking for a silver bullet that can be built up on today's consumer technology.
Folks, there isn't a silver bullet, because the gun's changing way, way too fast. Are you ready to start burning all of your HD movies to Blu-ray? How about buying stuff by pointing your mobile phone at it? No? Those are just two small examples from the landslide of progress that we, in the developed world, are currently floundering in. Now that everything's possible, we're really busy trying to figure out what stuff we actually want. Just because we can do something doesn't mean that it's a wonderful idea (why doesn't anyone else own a videophone?!).
There's really only one constant here in the 21st century, and that's ubiquitous, broadband Internet. Assuming that cold fusion and faster-than-light travel are impossible (and maybe even if they aren't), it is likely to be the most important innovation of the next hundred years. This is because human beings are social animals. On balance, we've never really been able to do much as individuals; we need to collaborate with, argue against, and simply absorb the ideas of other people to make any meaningful progress in any sphere of our lives. This is true even of the great geniuses, whose brilliant leaps would be quite meaningless if divorced from their context. Simply put, the Internet is the medium that will catalyze progress across all areas of human endeavor simply by bringing us into constant contact with other people and their ideas. It's already in the process of becoming second nature, and there's literally no way to get rid of it, or to slow it down for long.
Don't worry, I have a point! Real "next-generation gaming" isn't likely to unfold with the advent of the new consoles, or even new gaming software. These are really just tools that dictate the environmental conditions of a particular game, so they will only continue to develop until they reach the threshold of realism. And eventually, the particular Internet terminal we use to access a game world simply won't matter that much. I don't have the foggiest idea what kind of capabilities mobile devices, for instance, will have in five or 10 years, but I bet they'll be able to do whatever you need them to, games-wise. After that, people are going to have to take over: the people you play against, the people you play with, and, perhaps most importantly, the people that administer the entire process.
Next-generation gaming won't have "connected features"; it will be synonymous with communications. I'm sure that deathmatches on Xbox Live will have a market for a good, long while, but the possibilities will far exceed these basic capabilities. You need only glance at Steam, casual Internet gaming channels, and the new consoles' online capabilities to see that games are already moving toward a "post-retail" sales model. Soon, you won't be sold a product at all; you'll be a subscriber to a carefully modulated entertainment service. And concepts like release dates, sequels, and new physics engines will be swept right out the door in favor of a constantly improving and changing social experience. At this point, human capital will be far more important than anything the hardware could ever do--barring the creation of actual artificial intelligence to do it for us. Today, there are a couple of MMO games that employ economists to keep tabs on their rudimentary monetary and trading systems. I can foresee the rise of an entire class of industry professionals devoted to administering these gaming services. They'll monitor a service's dizzying array of economic details, constantly adjust various balance levels, and even codify laws and dole out justice--all to keep the game manageable and to keep you and your dwarf rogue (or virtual supervillain, or uranium tycoon, or whatever) happily playing.
As always, the better your team, the better the product. So I expect fierce competition for the services of the brightest gamers, as well as political scientists, ethicists, economists, sociologists, demographers, and so forth. Who knows? Maybe the social sciences will become relevant again thanks to games. I've had crazier ideas.
Avery Score
Assistant Editor, Mobile Games
As the saying goes, "One step forward. Two steps back." While transitioning to the next generation of hardware, console manufacturers have announced plans to strip their new systems of features and functionality we'd taken for granted. Some of these moves make sense, while others are a bit more surprising. Here's a list of what I see to be unusual moves, along with my comments and predictions.
The PS3's "Batarang" Controller
The Dual Shock controller is the PlayStation brand. It has emerged victorious as the gold standard of the past two console generations, and it has become iconic of video gaming as a whole. I bought mine for the first Ape Escape and never looked back. The change is likely related to Immersion's successful lawsuit against Sony, which illegally reverse-engineered Immersion's vibration technology, using it to power the Dual Shock's rumble feature. Still, one wonders why the change in shape was necessary and whether the new winged grip will prove as ergonomically pleasing.
The Cell Processor
Every day, I hear about a new company that worked on some facet of the Cell processor, which I'm convinced is the greatest collective achievement of mankind since Smucker's Goober. Elegant multithreading code is going to be needed to cater to its seven autonomous cores, which we're promised operate with the staggering synchronicity of honeybees.
Defective Cell processors will be paired for use in home servers, according to mad scientist Ken Kutaragi. The unique architecture of the unit, specifically its ability to tap an untold number of SPEs (Synergistic Processor Elements), allows for this kind of portability and expandability. Sounds like things will be peachy server-side.
What about the games, though? So far, Sony hasn't made a great case for the usefulness of multicored design on this level. We've heard things like, "a developer can use one core for physics, one for AI, and one for graphics." OK. What about the four other available cores? Those should be used to count Sony's money.
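For what it's worth, the "one core for physics, one for AI, one for graphics" pitch is just task parallelism. Here's a purely illustrative Python sketch of that model--the subsystem functions and state fields are invented for the example, and this is obviously not Cell code: independent per-frame jobs get farmed out to separate workers, one per notional "core."

```python
from concurrent.futures import ThreadPoolExecutor

# Invented stand-ins for per-frame subsystem work; nothing here is real
# engine or Cell code.
def update_physics(state):
    return [p + v for p, v in zip(state["positions"], state["velocities"])]

def update_ai(state):
    return ["chase" if p > 5 else "patrol" for p in state["positions"]]

def update_audio(state):
    return [f"footstep at {p}" for p in state["positions"]]

state = {"positions": [0, 10], "velocities": [1, -1]}

# One worker per "core": each subsystem reads the shared frame state and
# produces its result independently, in parallel with the others.
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(fn, state)
               for fn in (update_physics, update_ai, update_audio)]
    physics, ai, audio = [f.result() for f in futures]
```

The catch, as the skeptics note, is that a frame's subsystems aren't actually independent--physics feeds AI, AI feeds animation--so keeping seven cores busy is much harder than this toy makes it look.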
The Price of Prettiness
Sony has predicted that increased development costs will trickle down to the consumer, resulting in more-expensive games. In a way, that makes perfect sense. The advent of cheap removable media allowed for a long period of relatively inexpensive games. However, until another key technological advance occurs, such as the creation of more-powerful, intuitive modeling software, we can expect these increases in high-profile game prices to exceed the rate of inflation.
No HD Support for the Revolution
It's safe to say that the Nintendo GameCube's 480p option has been enjoyed by only a minority of players, even among those with HDTVs. This is because Nintendo completely botched the marketing of this feature. While myriad Xbox products are available in brick-and-mortar retailers to cater to high-definition enthusiasts, the GCN's component cables must be specially ordered, directly from Nintendo. This alienates just about all potential users.
What's worse, just tracking down those cables, plugging them in, and inserting a game with the "Progressive Scan" designation isn't enough. You have to enable 480p every time you want to play in high definition by holding down the "B" button while booting a game until a dialog box appears. When I first got my HD setup going, it took me a whole two days to realize I hadn't enabled high-def support on my Cube.
It's therefore not too shocking that Nintendo has eschewed all high-definition support in the Revolution. They probably conducted a survey to figure out what percentage of their users played in 480p mode, when available. Then they likely concluded that this paltry percentage was attributable to the younger demographic enjoying their games, or something. I have a g******** GameCube logo tattooed on my right bicep, and this makes me want to have laser surgery performed.
Many of you might be asking, "But Avram, could it be that you're just a huge snob? Is the difference between 480i and 480p even significant?" I'm glad you brought that up. Even the average Yoseph--without unruly hair or the tendency to throw up "the mule" symbol--can benefit from progressive scan. While both 480i (the default analog NTSC resolution) and 480p show pictures at 704x480 pixels, the progressive feed looks much better in motion. This is because an interlaced feed, which nominally refreshes 60 times per second, draws only half the scan lines on each pass and so delivers just 29.97 complete frames per second. The progressive picture works more like a computer monitor, showing each frame in its entirety. A progressive-scan image is the only way to enjoy flawless motion in your games. This is why a 720p output is often preferable to a 1080i signal, even though the latter has a higher resolution (1080i's 1920x1080 pixels versus 720p's 1280x720 pixels).
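To put those numbers side by side, here's a small, purely illustrative calculation. The resolutions and rates are the nominal NTSC/ATSC figures cited above, not measurements of any particular console:

```python
# Nominal video modes discussed above: (width, height, complete frames/s).
# An interlaced mode refreshes ~59.94 times per second but draws only half
# the scan lines on each pass, so it completes ~29.97 full frames per second.
modes = {
    "480i":  (704, 480, 29.97),
    "480p":  (704, 480, 59.94),
    "720p":  (1280, 720, 59.94),
    "1080i": (1920, 1080, 29.97),
}

for name, (w, h, fps) in modes.items():
    print(f"{name:>5}: {w * h:>9,} pixels per frame, {fps} complete frames/s")
```

Note that 1080i actually packs more than twice the pixels of 720p into each frame; 720p wins because it completes twice as many frames per second, and that frame rate, not raw pixel count, is what makes motion look smooth.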
The PS3 will be capable of outputting two 1080p feeds to two HD monitors. Granted, very few people will enjoy games at that level of visual fidelity, but you can be sure I'll be one of them. It's going to be pretty silly to play one game at 3840x1080 progressive (on my PS3) and another at 704x480 interlaced (on my Revolution). Have the sheep truly been owned, or will Nintendo's Revolution have graphical features to offset this huge visual-quality disparity? We'll see. Literally.
The Xbox 360 Unit's Design
Now let's take this from the interior to the exterior. I'll bet my posterior that no one's first priority is what their new console looks like. I'm also dubious about whether the 360's chassis design was truly "based on the concept of an inhale." A more likely suggestion is that it's based on the idea that a Microsoft product should never be fully compatible with opposing platforms. If you've ever tried to integrate Macs and PCs on the same network, you know what I'm talking about. An even better example: Turn on "correct as you type" in Microsoft Word, and pound out "Xbox." Now try "GameCube" or any permutation thereof.
With the 360, J Allard's Xbox team has produced yet another unstackable console. This is a huge bummer for anyone with a "Magicker" line entertainment center. Contrary to Microsoft's claims, we're not "Generation Remix"; we're "Generation IKEA."
Also, the faceplates are dumb. I'm sorry. They are. I have no need to personalize highly impersonal hardware in small, prescribed ways. That doesn't empower me; it makes me feel like a tool, in both the colloquial and conventional senses. Sure, I can't stack my 360, thereby saving space and headache, but at least I can get a phat Ninja Gaiden motif going! I appreciate the wireless controllers. That's enough of an aesthetic improvement right there.
Consumer technology can and should be visually appealing, so long as it remains functional. You can elevate a pair of pants to the level of artwork by adding interesting stitching, patchwork, leopard-print bondage straps, and more. They must, however, conform to the basic shape of pants. One can't model pants after the concept of an inhale, because one doesn't breathe with one's legs. At least I don't. Although that would admittedly be a great way to win our semiannual GameSpot breath-holding contests: "Check it out, chumps! My torso is stationary! No breathing going on here!"
If you've made it this far, I commend you. Adapting to this next generation of craziness will require a similar level of dedication. Of course, you could just upgrade your PC and tell NinSonySoft to take a hike. I don't recommend it, though.
Matthew Rorie
Game Guides Editor
One of the most intriguing aspects of the upcoming console generation, at least for me, involves the controllers. Controllers are a bit more mutable than the actual guts of a console's hardware (witness the shift from the Xbox Controller to the Controller S after it came out), but I'm going to go out on a limb and say that a console's controller is just as important as, if not more important than, its graphical capabilities in determining how people feel about playing it. After all, the controller is the most tactile interface people have with the machine.
The current generation of consoles, for instance, has seen some pretty interesting controller designs. The Dual Shock 2 was, to no one's surprise, just a remake of the original Dual Shock. But that's all the PS2 really needed, as the controller there works fine for almost any application you can think of, save perhaps throttling an accelerator in a racing game. The Xbox's Controller S is a far better racing game controller, but it has some other weird issues going on with it, such as the often-difficult-to-hit black and white buttons and the unsightly gaps in the top of the controller for memory cards and the XBL hookup. There's also the matter of the four rubber nubs on top of each analog stick, which are purposeless and annoying and take absolutely forever to rub away to the point where you can't feel them. While the original Xbox controller was a disaster for a number of reasons, Microsoft corrected their mistakes and wound up with something quite usable.
Then, of course, there's the oddball GameCube controller, which, at least in my opinion, is the worst controller I've ever used on a console...and I've used a lot of them. Nintendo obviously gets a kick out of their iconoclastic reputation when compared to the other two big console makers, and they've had a habit of incorporating that reputation even into their controllers, with mixed results. Obviously, some people find the GameCube and N64 controllers perfectly fine. But it's fairly apparent that they're designed to be...well...simpler than the other consoles' controllers, such as how they only have eight buttons compared to the PS2 and the Xbox's 12 (if you count start, select, and the two clickable analog sticks on each of them). For games that are designed specifically for the Cube, this isn't that big of a deal, but it's still somewhat restrictive. (Who didn't want to have a weapon-switch button while playing through RE4, for instance? Or a strafe button? It'll be interesting to see how the PlayStation port of that game takes advantage of the extra button real estate.) Beyond the button deficiency, there's the matter of the oddball button placement, the awful octagonal divots around each of the analog sticks, and the tiny C stick. Again, I know some people who prefer the GameCube controller over the other consoles' controllers, but I'm not a member of that camp.
This is one of the reasons that I'm looking forward to getting some hands-on time with the next-gen console controllers, whenever they happen to start rolling out. Aside from ergonomic concerns, it appears that the major technological change here is the standard inclusion of wireless controllers, which is something both welcome and frightening. Speaking as someone who has an inordinate fear of input lag, I can only hope that these things are as responsive as the old-school corded controllers, or at least that the console makers incorporate input ports for people who just prefer to have some kind of tether to their consoles. You know, in case we get blown away by the graphics. You're reading the words of someone who discarded a perfectly good Logitech MX900--by all accounts one of the finest wireless mice available--on account of a slight bit of input lag in online FPSes. I'm all for the appeal of being able to sit back without worrying about tangled cords, but I'm not willing to trade the least bit of responsiveness just for this convenience. The jury's out on this for now, and it will remain so until we get our hands on some of these consoles to try them out.
So far as the actual controller designs go, at this point it seems like the Xbox 360 is going to be my favorite of the three, if only because they're adopting Sony's old strategy of staying the course and sticking with what works. While I personally prefer the Dual Shock 2 to the Controller S, I have no gripes with it (other than those I listed above), and they seem to have made some nice changes to it, such as moving the black and white buttons to the shoulders and softening some of the edges to make it more space-agey. The rubber nubs are still on the analog sticks, so I can look forward to a few months--at the very least--of annoying thumb imprints before they rub away. On the whole, though, it appears that Microsoft is doing their best not to screw things up. By sticking with what works, Microsoft isn't being particularly innovative, but that's never been their strong suit anyway.
The Dual Shock 3, or whatever you want to call it, is a different matter. At this point, it's pretty hard to get a straight answer on it, as we've all read the Sony statements about it being a work in progress, with more refinement still to come. Despite the naysayers who came out of the woodwork when the "boomerang" was first glimpsed during Sony's pre-E3 press conference, the fact remains that there's still very little hard information on it, and hands-on impressions of the ergonomics are hard to come by. Given that Sony is spending untold millions of dollars to develop the PS3, we can only hope that they're taking the time to make sure that the controller is somewhat comfortable to hold for long periods of time. The looks of a controller aren't particularly important, so long as it does the job well. My main concern at this point is the size of the thing, because from the pictures that've been released so far, it seems like the analog sticks are going to be tiny.
Of course, the X factor of the next-gen controller wars is the Revolution input device. This thing has been talked about more than a hot girl at a Dungeons & Dragons convention, with little real information coming to light. Will it have a touch screen? A gyroscope? An ancient oni inside it that's bound to your will? All of the above? At this point, it's difficult to really imagine what Nintendo has up their sleeve, and with the console still more than a year away from launch, it's going to be interesting to see how the community reacts when the hard details come out.
What I'm more curious about, however, is how the likely wackiness of the controller will affect single-console gamers. If Nintendo does opt for some weird, proprietary input system, how is that going to affect developers' ability to port games to it? Will Revolution owners wind up getting superb first-party games from Nintendo but be neglected when it comes to multiconsole games due to a hard-to-adapt-to controller?
Say what you will about Nintendo, but it's a company that commits to its decisions, even when people on the outside find them to be...difficult to comprehend, such as when they downplayed the desirability of online play during this generation or when they apparently decided not to include HD support in the Revolution. Speaking as someone whose GameCube spends far, far more time unplugged and on the shelf than actually being played, I'm hoping that the spirit of innovation that seems to be guiding the development of the Revolution controller will spill over to the design of the games themselves. One can only find so many Triforce pieces before realizing that you've been playing essentially the same Zelda game since the NES, however consistent their quality may be.
No matter how things work out, it's obvious that the next round of console wars is going to be a high-stakes one. Despite the debates over the graphical capabilities of the various consoles, however, I doubt there's going to be as big a gap next generation as there was between the Xbox and the PlayStation 2...especially when you consider games that are developed for all the platforms simultaneously. Rather than choosing a specific platform for a game based on the marginal graphical differences, then, I'm likely going to decide based on the controller, making the next generation a controller war rather than a console war.
Greg Mueller
Fuel to Feed the Next-Gen Fire
"There are dozens of factors that contribute to the success or failure of a system launch. But in the end, any gamer will tell you that it's not the color of the box, the shape of the controller, or the system specs that sell a console. It's the games."
The launch of a new generation of gaming consoles is one of the most exciting events in the video game industry. Aside from the new technology and games, console launches are particularly exciting landmarks because they are few and far between. If the next generation of systems lasts as long as the current generation has, it'll be 2011 before we see another launch. However, with the upcoming systems, Sony and Microsoft seem to be preparing for an extra-long cycle by building the PlayStation 3 and Xbox 360 around technology that could easily take us to 2015 and beyond. By then I'll be a feeble old man of 35, which means the upcoming system launches will likely be the biggest events in my adult gaming life. I'm not alone either, as there are millions of gamers in the same situation, which means there's tremendous pressure on the big three to launch their systems with a bang. There are dozens of factors that contribute to the success or failure of a system launch. But in the end, any gamer will tell you that it's not the color of the box, the shape of the controller, or the system specs that sell a console. It's the games.
Little is known right now about which games are scheduled to launch with any of the new systems, so I've been thinking of past launch lineups and of what kind of games Sony, Microsoft, and Nintendo will need to come up with to sell their machines. Launch lineups are typically the same, with 20 or so titles that are either first-party or are from the largest of the large third-party publishers. Also, there's a secret rule among game publishers that launch titles have to follow specific guidelines for what types of games can be made available at launch, as well as what types of games must be avoided. Invariably, there will be at least two or three racing games, like
While there have been some strong launch titles in the past, with the exception of
By looking at past system launches, we can come up with a pretty good guess of what will be available when the Xbox 360, PlayStation 3, and Nintendo Revolution launch within the next 18 months or so. Here are my predictions based entirely on what has been announced, in addition to some speculation on my part. And remember: I'm not privy to any sort of top secret information just because I work at GameSpot.
Enough EA Sports games to fill out the rest of the "25 to 40" games planned for the Xbox 360 launch.
Need for Speed Most Wanted
Tomb Raider Legend
Ghost Recon 3
Again, all the perennial franchises from EA will make appearances.
...and many more.
...and a bunch of EA games.
You're probably wondering why I didn't place any of the big guns, like
Of course, you won't buy a system based solely on the lineup of games available at launch. But for most people, it's the biggest deciding factor. Even if a system has some great-looking games in the pipeline, if the launch games are underwhelming, you'll be much more likely to wait for a price drop rather than making the investment right away. As I said, these lists are purely speculation. But for gamers who have been trying to decide which next-gen console to buy first, it's definitely food for thought.
Kurt Collins
Associate Software Engineer
The Revolution is Here
Sit down, all you Nintendo fanboys. Sit down. I've had many a problem with Nintendo in recent memory, but I'm forced to give them credit where credit is due. Gaming needs an overhaul of biblical proportions.
Over the years, games have gotten more sophisticated, while the interfaces we use have languished in the realm of mediocrity. On the PC side, we're still using joysticks, keyboards, and mice (with the occasional wheel for those of you with the dough to spend on them). For strategy games, role-playing games, and any other games that don't require the use of your well-refined gamer's-twitch muscle, everything is fine. However, once you get into the realm of playing Battlefield 2, Half-Life, and Counter-Strike, you're left with nothing but carpal tunnel syndrome.
Meanwhile, on the console side, we're using a slightly modified version of the NES controller. Sure, there are a few more buttons to deal with, along with the addition of analog sticks, but on the whole, we're still using the same interface. For everyone claiming the Dual Shock is the greatest thing to happen to controller design, I agree with you; it is definitely a beautiful controller, only improved upon by Logitech's wireless controller. However, the fact that something is good doesn't make it ideal.
It's ridiculous to think that while games have changed drastically over the years, we've somehow managed to get rid of intuitive interfaces in favor of more-complicated setups involving multiple keyboard configurations and seven-button mice. When I'm playing a first-person shooter, which, almost by definition, is some sort of simulation of me shooting a gun, I don't want to have to determine where and when to fire by pressing W-A-S-D on my keyboard in conjunction with one of seven buttons on my mouse. Does anyone remember Duck Hunt? Of course you do. If I want to shoot ducks, I can do it very simply with a controller that can properly simulate shooting ducks: a gun.
So why did we give up the simple for the more complicated? Contrary to popular belief, it's not because gamers are stupid. In my opinion, it's because gamers want more options. They want the ability to change to any of 50 different weapons on the fly while strafing to the left and looking above them. How else would one be able to achieve the right altitude while rocket jumping?
However, in gaining more control, have we lost some of the fun? Things have become a lot more intense in the gaming world than they've ever been. In the online world, competition is strong, and gameplay requires concentration and focus. The intensity of the game seems to turn things more into a life experience rather than a mere round of shooting a few ducks with a friend laughing nearby. The amount of control given to us by the interfaces we use to interact with the video games seems to have been both a blessing and a curse. We now have the capability to control almost every aspect of our gaming experience. There are 104-plus different keys on a given keyboard that allow for any number of key combinations and multiple control mechanisms.
Unfortunately for me, the more control I get, the less enjoyment I receive from video games. The days of Duck Hunt are gone. The DDR dancepad is left as one of the last remaining vestiges of a long-forgotten past filled with ideas of new, funky, and intuitive controllers. The Virtual Boy was an attempt at changing the way things work, as was/is the EyeToy. Neither the VB nor the EyeToy took off as it should have. The fact that nonstandard peripherals that don't ship with a console traditionally sell poorly only adds to the dearth of intuitive controller interfaces.
The next generation of consoles brings about the opportunity for some interesting control methods. Voice control is no longer a technology of tomorrow but an instrument of today. While it's not even remotely close to perfect, we're finally starting to see voice better integrated into video games in simple ways, such as the use of voice over IP in Battlefield 2.
During Sony's PS3 press conference, while every gamer was slack-jawed over the Killzone trailer, I was more intrigued by the Eyedentify trailer. The idea of my image and voice interacting with the game to control the outcome is more interesting to me than any improvement in graphics could ever be. Can they pull it off? Who knows. Sony has overpromised and underdelivered in the past. Nintendo is keeping its mouth shut about its new controller, leaving us all breathless and angry with hyped-up anticipation.
At the end of the day, though, if their demonstrations and talk are any indication, Sony, Microsoft, and Nintendo seem to agree on one thing: The next generation isn't only about more power. It's also about a new interface. As long as that new interface extends to include different control schemes, then I can't wait.
| Adam Buchen |
Associate Software Engineer
An Open Letter to Fanboys
Since this is the last GameSpotting, I decided it was my last chance to convey my thoughts on a subject so near and dear to my heart: fanboys. Console and PC fanboys are an interesting breed, and they always bring a unique discourse to our forums. Their well-written and wonderfully articulated posts and emails give them a special place in my heart. I don't believe I've ever addressed them directly before, so this is my chance.
I have a few pieces of advice I want to pass on to you in this GameSpotting. I know that in an age of multiple platforms and a huge number of games, it can be difficult to remain loyal to one platform or one company. It's a real challenge, and you are all noble and courageous for taking on this quest. So here are a few suggestions that I hope you all take to heart. They will help guide you down the path of the fanboy.
GameSpot is the most reliable source for reviews. Except when another publication rates the game you're hyping higher than us. Then that publication is the most reliable source. Until next week, anyway.
A person who owns just one console is inherently less biased than editors at GameSpot who have access to all consoles.
The term "exclusive" is tricky. Its definition changes depending upon whether a hot game appears on your platform or not.
Innovation is what it's called when your console has a unique feature. Otherwise, it's called a gimmick.
Your specific needs are shaped around your platform's features. For instance, if your console doesn't feature HDTV support, then it really wasn't that necessary to have, anyway.
A game that is scored less than a 9.5 on any platform you don't own is called a flop. If it does achieve that score, it's called overrated. (According to the GameSpot rating system, a game rated 8 or above is "great," but we all know that it really means "flop.")
Sales numbers are everything, as long as the numbers are best for your favorite company or console. If that means you have to dig up numbers for sales in New Zealand stores from seven months ago to prove your point, so be it.
Likewise, it is up to you to monitor how companies are doing and report every dip in stock price of manufacturers of other consoles. You're all experts in the fields of economics and finance, so your analyses will always prove correct.
You must realize that some companies are evil and some are inherently good. Some companies do it for the money; others do it so, well, you can brag about them.
Perhaps the most clever thing you as fanboys can do is to come up with derogatory names for competing consoles, such as Xbrick instead of Xbox or FlopStation 2 instead of PlayStation 2.
On message boards, the best way to convey your point is to write in all caps, ignore the rules of grammar, and include lots of "LMAOs" and "LOLs" in your post.
It makes perfect sense to insult a company's lineup one moment and then hope that same company will go third party and develop for your console the next.
Playing a game on a different console for ten minutes at a friend's house qualifies you as an expert on that game and console.
You are obviously the most knowledgeable person about hardware. So when you're talking about the difference between CISC and RISC processors, everyone should stop and listen.
A sequel on another console is a "rehash," while a sequel on your console is not; in fact, it's highly anticipated.
Your platform is the best one in existence. Any time you see what might be evidence to the contrary, it's clearly because others are blinded by their idiotic fanboyism.
So you see, dear readers, the best part about being a fanboy is that you can never be proven wrong. As long as you ignore the rules of logic and sensibility, you will never have to worry about whether or not you made a good investment. That sure helps all the insecurity go away, doesn't it?
In the meantime, the rest of us will have to go on using our brains to make decisions when it comes to video games. We will have to actually think about whether or not a console is worth buying based on our needs. We will have to consider forking over more money for a new machine if its games are excellent. We will have to play the best games rather than bash them when they're not on our consoles.
In other words, the rest of us non-fanboys are doomed to a life of playing the best games and enjoying our hobby, while you will have the luxury of being able to ignore the majority of games that come out, because they're not on your console. That is indeed very lucky for you all.
And with that, I must wrap up my letter today. I hope I was able to teach you all a thing or two.