Who hates the Everyman? He just doesn't fit in


In response to the "Why Hate the Everyman" feature over here:

http://asia.gamespot.com/prototype-2/videos/why-hate-the-everyman-6374168/?tag=Topslot;TheEveryman;WhyHateTheEveryman

Are game protagonists being super-powered anything new at all? Games are all about challenge and overcoming it, and the natural progression of that has been to amp up the challenge and amp up the powers given to the player to surmount it. That eventually leads to turning players into super-beings, or just totally unrealistic, crazy bad-ass dudes. How is that not perfectly obvious?

Isn't it the same in movies? Movies rely on conflict, and the easiest way to resolve a conflict into a happy ending is by making the heroes more and more powerful. Conflict is most easily depicted through violence, so naturally we've moved from cowboy westerns to Iron Man.

What about stories? I can go on and on.

Where movies and literature differ from games is this: the works that stand as the brightest beacons of their respective art forms are often those that shine a spotlight on harsh realities and focus on failure - failure of the human spirit and of our value systems. Requiem for a Dream is an obvious, if clichéd, example (clichéd in the sense of how many people use it as an example) of man's failures making for moving, gut-wrenching cinema.

The other category of movies is those that take people to extreme sadness and agony, and make us watch them emerge from those tragedies as heroes. This is an emotional translation of the "conflict" concept, but it still results in having to give the protagonist extraordinary emotional strength to get through the conflict. Movies achieve empathy even for characters who are stronger - sometimes - than any real person could be, either by showing us more of what's unbelievably great about them (Life Is Beautiful) or by showing us their flaws (no good examples come to mind right now).


We're starting to see people talk about the second variety of conflict being introduced to games in the form of emotional conflict and character progression through story rather than points menus. Heavy Rain is but one example of this. And that's fine.

However, we're quite far from the first variety: games that focus on failure. Games do not gel well with failure at all, and it'll be a while before we see such games come to life, or before someone even understands how that would work. But it's when a successful game can be made about failure that we could possibly see TRUE everyman characters (not ones with superhuman emotional strength) walk their way through games.

A Utopian World of Copyright


This is something that's been on my mind for a very long time, and I was provoked into this post by Tom Magrino's rant on "always-on DRM" being the ultimate solution to the world's copyright problems.

I take that idea further into a parallel universe utopia:


All information and software is encoded with a copyright id. Some software or information - public domain, open source or user produced - would not necessarily have a copyright id.

Parallel to this, all computing moves to the cloud, and all devices turn into screens and keyboards for the data and processing in the cloud. Bandwidth is free and people are only billed a FLAT rate for the CPU cycles they use in the cloud. FLAT, and cpu cycles. Remember those two because they're going to be very important.

Now record the copyright id associated with the executable and data processed with every CPU cycle. Tally this over the duration of a month and you have a comprehensive bill of all copyright ids that I have consumed in that period. I only pay the cloud service provider ONE bill every month for my CPU usage, multiplied by the FLAT rate. But the money from that gets split across all the copyright owners, by the cloud providers.

I don't pay for content, ever. I pay for processing and time. The cloud providers figure out the copyright stuff and distribute the money accordingly.
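To make the tally-and-split idea concrete, here is a minimal sketch of it in Python. Everything here is an illustrative assumption: the `FLAT_RATE` value, the `(copyright_id, cpu_cycles)` log format, and the function names are all made up for this sketch, not part of any real system.

```python
from collections import Counter

FLAT_RATE = 1e-9  # hypothetical price per CPU cycle; illustrative only

def monthly_bill(usage_log):
    """usage_log: iterable of (copyright_id, cpu_cycles) records collected
    by the cloud provider over one month.

    Returns the user's single flat bill and the per-owner payout split."""
    cycles_per_id = Counter()
    for copyright_id, cycles in usage_log:
        cycles_per_id[copyright_id] += cycles

    total_cycles = sum(cycles_per_id.values())
    bill = total_cycles * FLAT_RATE  # the ONE number the user ever sees

    # Each copyright owner is paid in proportion to the cycles their
    # content consumed: watch a movie ten times, its owner is paid ten
    # times as much.
    split = {cid: cycles * FLAT_RATE
             for cid, cycles in cycles_per_id.items()}
    return bill, split
```

Note that by construction the payouts sum exactly to the user's bill, so the provider passes the money through rather than pricing content itself - which is the whole point of the scheme.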


Such a system would be fair, because if I only watched a movie for (say) 10 CPU cycles, the movie's owner gets paid only a couple of cents. However, if I watched a movie tens of times, for thousands or millions of CPU cycles, the movie's owner gets paid that much more. So the really cool stuff that gets watched or played a LOT - a good indicator of quality - even if only by a few people, gets due recognition and MONEY this way.

So a massive RPG with hundreds of hours of gameplay, with only a few hundred thousand users buying it or even knowing about it, could possibly make a LOT more money than a Call of Duty-esque game that offers only 5 or so hours of an okay experience, but gets bought by millions. (I'm thinking single-player, of course. Multiplayer is an entirely different proposition, and very different in the value it provides.)

Such a solution would eliminate the need to buy anything digital anymore - nor would publishers have to worry about distribution - which I think is the primary point of friction and the primary origin point for piracy. Instead, all validation and billing happens in the cloud AS YOU USE. It doesn't matter where you got the software or movie from. Maybe a copy from a friend, maybe free discs at the mall, maybe a torrent. It doesn't matter. Your cloud provider (or their CPUs) figures out which copyright holder's content you're watching and pays them accordingly.

This also levels the playing field in terms of pricing. A fair and workable system cannot have differential pricing for different content. All content is billed in terms of CPU cycles, which become the new currency.

Ways of cheating by corporations - such as writing code that consumes a lot of CPU to tally up a huge bill - will be easily caught by millions of angry users, or even audited by the cloud providers, who don't want to waste resources. People who try to pad content to increase the time users spend in their movies or games will become boring, and will eventually get played less, and by fewer people. Such a system would thus be self-regulating.

I think a cloud CPU-cycle-based billing system - by extension, one based on the time users spend with content - could be the ultimate solution not only to beat piracy, but also to make pricing fair and simple for users, and to make producing great content (and art) that doesn't get consumed by a lot of people, but gets consumed a LOT by some people, an actually viable business proposition. Also, artists keep getting paid as long as their work remains relevant.


P.S. To answer some obvious questions on logistical issues:


a. Who manages all this copyright stuff?

Copyrights would be consolidated globally by select corporations - who may collect a nice fee on the side from copyright owners.

Cloud providers don't manage the copyright stuff themselves. They just figure out which copyright aggregator the content gets billed to - let's say there are 2 or 3 of those at most - and send them a log of all copyright ids used in a month and the CPU spent on each.

The aggregator then splits money to all individual copyright owners.

Privacy issues get addressed by the cloud providers aggregating the copyright usage information provided to copyright aggregators, rather than providing per-user information. It's just a question of setting up the right legal framework around this - a MUST for any such far reaching system.
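The privacy step described above can be sketched in a few lines. This is a hypothetical illustration only: the per-user log structure and the function name are assumptions made for the sketch. The point is simply that user identities are collapsed away before anything is forwarded to a copyright aggregator.

```python
from collections import defaultdict

def aggregate_for_copyright_aggregator(per_user_logs):
    """per_user_logs: dict mapping user_id -> list of
    (copyright_id, cpu_cycles) records.

    Returns only per-copyright totals; user ids never leave
    the cloud provider's systems."""
    totals = defaultdict(int)
    # Iterate over values only, so user ids are discarded here.
    for records in per_user_logs.values():
        for copyright_id, cycles in records:
            totals[copyright_id] += cycles
    return dict(totals)
```

The aggregator receives enough to split the money correctly, but nothing that ties a particular movie or game to a particular user.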


b. Copyright ids could still be stripped using custom software.

True, but there would be zero benefit to a pirate. Since the billing rates are FLAT, they get billed for what they see or use anyway. The only difference is that the money may not go to a copyright owner - it might just sit in a slush fund or add to the cloud provider's profits.

The cloud providers can be audited for these kinds of slimy tactics anyway, and all instances of copyright-free usage can still be logged and analyzed if required - the percentage would be vanishingly small, as there is no major benefit to stripping copyright ids in the first place.

The only remaining possibility is people replacing somebody else's copyright ids with their own, so as to make money as "Trojans". Since lay people won't have copyright ids, and would have to register as a company to get one, any companies which indulge in such tactics would be subject to regulatory restrictions, and can be audited against their monthly or quarterly earnings anyway.

Again, the nice thing about all this is that it takes the entire copyright mess away from users, and makes it the headache of companies and auditors/regulators, where it rightfully belongs.


c. People could set up rogue clouds where piracy can still flourish.

True, but these would be consolidated areas of illegal activity - easy targets that could be tracked and shut down.

Legal and illegal use could be viably separated this way.


d. What about offline devices?

In a world of cloud-based devices, offline devices would be prohibitively expensive, limited in power and speed compared to what's in the cloud, and limited by the need for hardware compatible with the cloud's - incredibly difficult to pull off if the Intels of the world control and know who they sell to, and prohibitively expensive for any company to set up a rogue business around.


e. People would have reduced choice of OS.

True, and that's an undeniable side effect. It's entirely up to the cloud service provider to decide which OS to provide. Eventually, multiple OSes may even be eliminated, commoditizing instead different aspects of the OS, such as the memory/CPU supported (high-performance or consumer OS), the user interface (think about how easy this is to do in Linux, where the interface is just a pluggable component) and the screen sizes or usage paradigms supported (touch, mobile, gesture-based, desktop/laptop, others).


f. Massive commoditization could kill hardware and software innovation.

Hardware innovation will move to the servers and to user interfaces. The former (CPU innovation, chipset innovation) is super complex even in a non-cloud world. Software innovation will move to FUNCTIONALITY rather than technology such as drivers - which will be taken care of largely by cloud providers and their R&D teams. Developers will instead be able to focus on creative applications (like the awesome "Action Movie FX" on the iPhone - which in my view is rendered possible as a simple commercial product only because of the incredibly easy-to-use video editing/splicing APIs already available in iOS, and the sheer ease of App Store distribution: complex technology, simple commoditized APIs and market) rather than worry about multiple platforms or device hardware capabilities.


Wrap-up: There are many more such questions and issues, which may or may not have solutions. But the point is that this is a completely utopian concept, one that needs the entire world - every device, every OS, and the fundamental WAY in which copyrighted content gets packaged, distributed and billed - to change. That by itself won't happen, but we can think about it, dream about it, and hope that there's a version of this utopian idea that can perhaps be implemented anyway, in a way which makes life easier and simpler for users and developers. Anyhow.

What makes a great game?


This is a follow up to the Demon's Souls GOTY HotSpot podcast.

While I'm not at a level of dexterity or hand-eye coordination where I could tackle Demon's Souls, I could certainly appreciate the love the panelists had for the game and what made it great for them. I've recently rediscovered fun with Team Fortress 2 (which I'd been laying off due to my formerly spitting and stuttering "broadband" connection), and for the first time EVER, I find myself caring about statistics, achievements and especially gameplay/class strategy - something I'd never given importance to, but find myself constantly thinking about now, with TF2.

Now, from a n00b's perspective, online competitive multiplayer (especially when you've been 'pwned' repeatedly in CS) seemed like such a minefield to step into, and indeed dying was only too easy when I first began (yes, I'm that bad). But as I stayed on, I slowly discovered and was amazed by how much thought had gone into balancing - into giving every class and every type of player a chance to survive and eventually even contribute to the team's efforts. Learning newer strategies from wikis, other players and just my own mistakes, the fun I have with this game, and consequently my investment in it through the achievements, unlockables etc., is just growing every day.

Now, as I listened to the podcast, it dawned on me how similar the NATURE of my experience as someone just learning the ropes of competitive multiplayer with TF2 (starting with apprehension, gradually uncovering the layers of gameplay, and eventually falling in love) was to that of the panelists, as experienced gamers, with Demon's Souls. TF2 and Demon's Souls are two VERY different games, and I'm not drawing any parallels here, other than remarking on the fact that at the core of both these games lies an extremely well-designed SYSTEM that is getting us excited and making us fall in love with these games!

But I find this completely at odds with what has usually defined a great game for me, which is good but not necessarily great gameplay, integrated really well with great storytelling. Gameplay is something I've usually seen as another enabler in the process of immersion into another world, alongside say the graphics and voice-acting. So while I appreciated the skill/augmentation system in Deus Ex, that's not what MADE the game great for me - it was an element, a necessary part of this world, required to flesh out the experience.

So the question is: what makes a great game? Gameplay/systems? Story? Does it depend on the type of game? Is it possible to identify a single element of game design as the artistic core of a great game, with the other elements merely required to 'do their job' and not screw up? For example, for a great movie, it might be the screenplay. Great movies may have great photography or sound, but these aspects are not REQUIRED to be great. Without a good screenplay, though, any movie would be lost. Given that, can we say that good gameplay is REQUIRED to be at the heart of every great game? Is gameplay in fact the one true, distinguishing quality that separates and identifies games as art among other media? Is gameplay/game-system design the true artistic identity of our medium? After thinking this through, I'd have to vote YES on that, but I'd love to know what others think.

Morality in video-games - Not all about consequence


I was reading the excellent GameSpot AU feature (link) on morality in video games, and this being a pet topic of mine (or a subset of one), I couldn't contain myself to the comments section... Let me jump straight in.

Do moral choices really have to have visible in-game consequences? I don't remember being given a bag of gold coins the last time I was nice to someone. (That would have been REALLY neat, though!)

The problem with how present-day games approach morality is that they treat it as just another game MECHANIC, with a risk-reward equation at play. Morality is more often about subtext - subtle social consequences, in some cases. (From the article, it looks like Dragon Age: Origins 'gets' this.) The ONE thing that is a direct consequence of a moral choice EVERY single time is self-appraisal. The game doesn't need to tell me how to feel about an action; I'm doing that myself. So that's one job off the shoulders of the developers.

In my opinion, developers need to introduce only two things in any video game to infuse it with moral complexity: choice and context.

One brilliant example of this was Deus Ex. Now, Deus Ex presented you with some larger moral choices through the game, and even at the end, that affected the world in significant ways. But most of these were typical video-game-type choices that in some cases even had risk-reward elements sewn in. What made Deus Ex morally relevant for me, though, were the minute-to-minute choices I faced - akin to real life in their frequency and relative insignificance - when being told to kill guard 'X' to get to the other side of the road, while having the option of simply sedating him or even bypassing him completely.

These choices by and of themselves did not make Deus Ex morally compelling - context did. Notes scribbled to a 'comrade', or eloquent diary entries, gave us a peek into the minds of these foot-soldiers, and nudged one into thinking of them as more than cardboard cut-outs simply "put there" to obstruct your progress. While - in the game universe - they were only following orders, you felt like they still had individual reasons, or even compulsions, to be there. Maybe they didn't even like doing what they did.

This kind of empathy could make one think twice before killing them, and could EVEN make one feel happy/satisfied or guilty for having chosen to do whatever it is one did, regardless of how difficult that choice was to execute. Which, of course, IS what morality is all about!

I'm not sure if Ion Storm intended for this kind of an interpretation, and this is just one of the MANY (and often much larger) experiences you can draw from this game. But this is a great example of how the right mix of choice and context can add moral texture, in ways perhaps not even imagined by the creators. And I hear that's what great art is all about!

(P.S.: Yep, I think we've already had our 'Citizen Kane' in Deus Ex, and one only needs to understand what made it 'art' - the interaction elements, the choice, the story, the pacing? Or all of these coming together to create a world with room for varying moral interpretations, THUS imbuing the game with true artistic merit and social value? - to find one more direction in which to further games as art, instead of trying to reinvent the wheel. And please, technology is NOT the answer - looking at you, Activision - it is an enabler... at best.)

Innovation without publisher support - Sustainable?


I was listening to the 17th Nov HotSpot, where Brendan & Co. were lamenting EA's decision the week past to trim down development work on new/experimental IP and focus on annualized franchises instead. It was argued that this is understandable, considering how the core gamer market that could actually nurture and support such innovation and risk-taking has shrunk to a really small slice of any "game" company's overall revenue pie.

While that is (sadly) true, how is it different from any other media or forms of entertainment?

Works that push the boundaries of an art form or medium get noticed only by a select few, even fewer of whom are able to give the work or its achievements historical context. The big names that get covered on the evening news for making a gazillion dollars are most often just HEAVILY promoted remixes of old hits - with, of course, certain notable but extremely rare exceptions like 'The Dark Knight' or 'Modern Warfare'.

BUT then, even these 'exceptions' were basically established franchises that corporations were willing to pour their money into!

So be it movies or games, it seems revenue is directly proportional only to marketing spend and pre-established brand (recall) value. EA's move thus fits perfectly in line with how a publicly listed corporation is expected to work. But when did that really stop any art form from moving forward?

That said, games ARE a different equation from movies. You can make a low-budget film with not-so-great production values and get praised to hell and back - it is the sensibility that matters.

But games are, by and large, EXPECTED to have a certain level of polish visually, apart from in the gameplay or storytelling. Bugs and crashes are mostly inexcusable. And polish, testing - all that costs a LOT of man-hours. A LOT of initial investment.

Games are longer, sell in much smaller numbers, and are pricier to 'experience' (especially if you factor in the cost of a console plus any subscriptions). They're harder to reach, too. Cinema tickets are usually cheap enough for most. Movies also air on TV, for free (pretty much). At festivals, you can catch a large number of obscure movies and spread word about them. You cannot experience a whole game at E3, just a sample - which may even turn out to be a 'best-parts' collection of the game. It is much harder to get excited about that, and to KEEP that product in memory until the time it releases.

Then there is the fact that films are widely recognised as art, which by itself creates a separate market and SPACE for the consumption and discussion of bolder films. Video games, though, are still seen more as entertainment than art, which means there is practically no separation in the dialogue between 'commercial' and the more 'artsy' games. Perhaps that's also because it is much harder to define what makes a game 'artsy' (other than the obvious, like quirky art styles and the absence of the usual progression incentives) - and, more troublingly, whether a new game or idea is even GOOD for the medium, and whether it expands the medium's scope or reach.

So it does seem difficult for innovation in games to survive for long, or to be meaningful, without active, committed publisher support. Intrinsic problems with the format make it even harder - even in the age of the Internet - for whatever innovation does come through to be spotlighted and adequately appreciated. Maybe we should accept that these problems ARE simply intrinsic to the medium, and that the current innovative-to-derivative ratio, while perhaps not comparable to other media, is OK for games anyway. It could be that all these questions are simply signs of the teething troubles faced by any new art form that has yet to find a strong identity or language, and we should quit comparing the game-evolution arc to that of movies. Or maybe that comparison is not too far off. I don't know exactly how movies were perceived in the '30s and '40s, but I DO know we have come very far in that art. And I sure would like to hope the same for games.

What do you think? Leave your comments below...

More on storytelling in videogames


I was reading the article on storytelling in video games and had to post. Adventure games are fine and dandy (I love them, actually), but that's really not how storytelling can move forward in video games. Most examples of good storytelling quoted in that article, and the ones I remember, are cases where there was little if any control in the hands of the player during the story's pivotal moments. That directly defeats the prime purpose of PLAYING a game, and indeed the medium's greatest strength.

That doesn't mean it should all be open (like, I don't know, Second Life) - that would just be pointless. It doesn't mean binary choices - like in BioShock - either. I mean, kudos to the developers/writers/artists for coming up with such a rich and beautifully detailed world, but if you're gonna try to shove that down my throat as a step forward in video-game storytelling? Sorry, folks.

The primary criterion for a good storytelling medium is its ability to offer the listeners/viewers/readers/players freedom in INTERpreting the plot - its meaning, the actions and their consequences. Crucial to that, especially in an interactive medium, is for the GAME to set some rules and then just lay off, letting the player perform and INTERpret the meaning of his actions, rather than have the game do that with a binary black/white (or dark-grey/light-grey) judgemental reward mechanism. I am not referring to sandbox games either, because those are not even remotely attempts at serious storytelling. I mean, they try, but come on. GTA? In a review of storytelling in games? Really? I had to say this: those are at best amusement parks, where you jump from one amusing ride to another. And as long as there is enough variety, it is fun. But I do not remember being moved by a roller coaster.

Well anyway, the example that I am shooting for is Deus Ex. It created a rich, complex world you were glad to spend time in. And by way of equally rich and complex gameplay mechanics - ranging from simple shooting, to using power-ups such as skills or augmentations, to using the environment to do your job - it gave you an incredible range of choice in approaching objectives, choices that were at times acknowledged directly by other characters, but mostly just left alone for you to experience.

An example: most of the missions could be accomplished by shooting your way through, or by using stealth to down some of the opponents, OR (as I gradually discovered) by not involving them AT all! By the time I was playing the game the third or fourth time, I realized that here, in this game, I had a REAL choice whether or not to kill to reach my aims, and the game was not necessarily judging me for it. This meant that my actions MEANT something beyond a stupid reward. Because the moment a reward is introduced, the very idea of a morally higher choice is beaten. I mean, what is the big deal about being nice to your neighbours if after seven days of doing it you are going to win a million dollars? See my point?

Games have been trying to get heavier and more serious in ABSOLutely the wrong way. Eidos had it nailed YEARS ago: put the choice in there, but don't judge or reward anything. That way the ONLY reward I get is the satisfaction of having made the choice. I have no control over whether the game rewards me for my 'moral' actions, just like in the real world. And THAT way an empathy is established with the game world, thereby achieving THE goal that both legitimate storytelling media and art forms strive for.

Anyway, none of this can be achieved by any reward-based mechanism, no matter how 'cool' - which is all games seem to be about these days. That way only stupidity and eternal hell lie. But I fear we have already started on that downhill journey...

The REAL reason why games will never replace movies


This is in response to Danny_DM_Moore's rant "On the one valid reason why games will never replace movies". Link below:

http://www.gamespot.com/users/danny_dm_moore/show_blog_entry.php?topic_id=m-100-25697349&om_act=convert&om_clk=soapbox&tag=soapbox%3Bsubject%3B1

I have to (politely) disagree with Danny here. What separates games - or movies like Transformers 2 and T4: Salvation - from great movies is not the technology; it is the inability of the WRITING in most games to evoke an emotional response from the player.

The technology to tell a good story has been there since the age of 2D sprites and text. The Longest Journey looks 'horrible' by today's standards, yet tells a compelling story and manages to make you care about the characters regardless! Just like a book. A book does not have to be an ILLUSTRATED one (or a WELL-illustrated one) for it to establish a connection. I know that sounds stupid the moment you read it, because you're used to the idea of books using your IMAGINATION to create beautiful landscapes or horribly scarred villains. The same applies to movies. The scariest horror movies are not those that use graphic images or blood to put the point across. Just as in the best of Hitchcock's work, it is what is NOT shown - what is to be IMAGINED - that has the greatest impact on the viewer.

We need to move beyond stupidly gazing at our monitors and expecting the game to do all the work for us. This expectation is part of the problem - part of the reason why Crysis (a technically accomplished game that did NOTHING with all that amazing technology) inexplicably got 9.0 scores. We have forgotten about gems like Deus Ex (and I'm sure there are several other great games that did this too; my knowledge is limited), which relied on solid writing (often combined with engaging and RELEVANT gameplay), and on creating an atmosphere and characters, to suck you into the game and keep you there.

The only REAL constraint preventing a game from becoming as good as a movie is the involvement of a player - any player. In a given situation within the story, when a protagonist is involved only emotionally, and is reacting emotionally rather than physically, how is the writer supposed to justify the presence of what is effectively a third person (YOU) in the scene? What role could you possibly play at that point in the game/story, other than a mechanical one where you instruct the player model to merely MOVE this way or that?

Indigo Prophecy/Fahrenheit countered this limitation cleverly, by tying some physical responses to emotional cues - basically, mapping your inputs to the body language of your character, and connecting that to the emotional responses of other characters. To use a sledgehammer example, an awkward handshake may end a scene differently than a solid one. But even that happened maybe only once or twice in the otherwise excellent game. There too, all the possibilities had to be pre-determined, immediately limiting the number, range and nuance of such possibilities. :)

So, with such a basic level of interaction, how can one reasonably expect games to ever be as emotionally complex as movies? I know: cutscenes. But that's exactly my point. The only way most writers are able to push the story forward is by momentarily cutting you out of the interaction loop. :) Touché.

P.S:

1. Milo may have an answer to my last rhetorical question. We'll have to wait and see, although I must say I am VERY skeptical. Not of the technology, but how well it will/could be used to engage the player any deeper than we already are engaged via buttons on a controller. But that is another story.

2. I have to reiterate that I'm not talking about video games beating box-office megaliths like Transformers 1 & 2, T4: Salvation or even the next Bond movie at their own game. That is easy, and has been done several times over. I am talking about achieving the kind of response a 'Schindler's List' or 'Life Is Beautiful' manages to evoke. About games - well-known games - truly becoming art (instead of merely being a possible MEDIUM in which an artistic vision COULD be expressed). We are a long, long, long way away from that.