Microsoft's new reputation system for Xbox Live on the Xbox One promises "no more cheats or jerks." In a blog post, Xbox Live program manager Michael Dunn explained that the new community-powered system will help "filter out" users that gamers do not want to play with.
"No question that Xbox Live is a distinct community of passionate gamers. We love that. But just like in life, there are all types of people--some shy, some polite, some aggressive, some snarky, some annoying, and some that can't avoid swearing at #$%^ happens to them," Dunn said. "Most Xbox Live players are polite online and know how to socially adjust to people they're playing with. But not everyone does this. And, it can be challenging to pick up on social cues when you are connected online and not face-to-face in the same room."
This is why Microsoft conceived the new reputation system for the Xbox One. The new model will "expose" people who "aren't fun to be around" and will implement "real consequences" for gamers who harass others.
To do this, Microsoft will incorporate actions like "block" and "mute player" into the feedback model. This model will take a player's online ratings and feed them into a system with a "crazy algorithm" created by Microsoft and "validated" by a Microsoft Research PhD.
A player's reputation score will determine which category they are assigned to: "Green = Good Player," "Yellow = Needs Improvement," or "Red = Avoid Me." Gamers will be able to view this data by looking at someone's gamer card.
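Microsoft has not published how scores map to the three labels, but the bucketing Dunn describes amounts to a simple threshold check. A minimal sketch, with entirely hypothetical score ranges and thresholds:

```python
def reputation_category(score: float) -> str:
    """Map a 0-100 reputation score to a gamer-card label.
    The 0-100 scale and the cutoffs are illustrative assumptions;
    Microsoft has not disclosed the real ones."""
    if score >= 70:
        return "Green = Good Player"
    if score >= 40:
        return "Yellow = Needs Improvement"
    return "Red = Avoid Me"
```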
"And, your reputation score is ultimately up to you. The more hours you play online without being a jerk, the better your reputation will be," Dunn said. "Similar to the more hours you drive without an accident, the better your driving record and insurance rates will be."
Overall, Dunn said most players will have good reputations, as the algorithm is designed to identify players who are "repeatedly disruptive" on Xbox Live. Dunn described the algorithm as "sophisticated" and one that won't penalize players for "a few bad reports."
"Even good players might receive a few player feedback reports each month and that is OK," Dunn said. "The algorithm weighs the data collected so if a dozen people suddenly report a single user, the system will look at a variety of factors before docking their reputation. We'll verify if those people actually played in an online game with the person reported--if not, all of those players' feedback won't matter as much as a single person who spent 15 minutes playing with the reported person. The system also looks at the reputation of the person reporting and the alleged offender, frequency of reports from a single user, and a number of other factors."
Dunn did not detail any of the "real consequences" gamers will suffer if they are found to be harassing other players.