So... here's an interesting read.
When you consider the main purpose of a review is to answer the question of whether something is actually as good as it appears to be in ads and previews before you decide to spend your time and money on it, there’s no greater waste of everybody’s time and effort than telling people that something they’ve never heard of isn’t good. This is what I tell people when they ask why so few reviews on IGN end up on the bottom half of that scale. There are a few reasons for this and none of them have to do with a reluctance to give low scores when they’re warranted.
The reality is there are simply far too many things coming out to possibly review them all. In a typical week you might see countless new games, movies and episodes of TV. But all of these are not created equal: most are underwhelming or clearly bad and go completely unnoticed by the vast majority of people. Some are middle-of-the-road and generally worthwhile, and just a few are heavily marketed, highly anticipated events. The rarest of all are the surprise-hit gems that come out of nowhere and are all but impossible to predict. So given that IGN can only handle so many reviews at once – we generally review around 1,000 things per year, across all categories – we have to pick and choose which to review and which to let sail by.
We make these decisions in a number of ways. When it’s not obvious that something is a big deal, like a Grand Theft Auto or an Avengers movie or a Game of Thrones-level show, we use metrics like traffic on IGN for news, trailers, and previews to see if the wider audience is interested. We also use publicly available tools like Google Trends and YouTube. Did a lot of people check out the trailer for a new movie? It’s a safe bet they’ll be interested to know more. Did just a few thousand view it on YouTube? Maybe it’s just not clicking with IGN’s audience and a review would suffer the same fate. When those raw numbers leave us uncertain – or even sometimes when they tell us most people aren’t interested, but we are – we often take a chance on something we think is special and should be highlighted, even though it probably won’t do a lot of traffic for us. That’s when you’ll see smaller things make it onto our review list.
Importantly, when we don’t review something, it doesn’t necessarily mean we thought nobody cared or assumed it wasn’t good. Oftentimes it’s just an issue of timing, where we have too many irons in the fire during a busy time of year, and by the time we’d be able to circle back and review it the question of whether it’s good or not would’ve long been settled. In those cases we generally have to wait for another opportunity – such as a port to another platform that opens a game up to a new audience, or a streaming date – to make something relevant again before we go back and review it.
So why is it that the big-name stuff rarely seems to score below a 7, which means “Good” on IGN’s review scale? Simply put: if something doesn’t at least look like it might be great you probably weren’t paying attention to it in the first place. People didn’t click on it or Google it or watch YouTube videos about it, and we probably didn’t review it as a result of that. But if it does look great enough to pique your interest, it usually ends up being at least okay. And especially in the case of big-budget games, publishers have to be feeling pretty confident about them being reasonably well received for them to make it to the point of being released at all.
Imagine a developer is working on an unannounced video game. (Typically, a significant game is in development for a couple of years before it’s publicly announced or shown, if not longer.) Try as they might, their ideas just aren’t working out, and most of the testers who’ve played it say it’s simply not fun. Maybe the publisher has even hired some freelance writers to do mock reviews – written under non-disclosure agreements for their eyes only – to see how it might score on release, and many of them came back with 5s or lower. In that scenario, assuming there’s not a lot of faith that its problems can be fixed with a delay, most of the time the publisher is going to cancel that game and pivot to a more promising project. They might’ve spent millions to get to this point, but it still makes sense to cut their losses there rather than continue spending many millions more on developing and marketing a bad game. When that happens it never sees the light of day, much less gets reviewed. And games are canceled like this all the time, before we ever even know they exist.
Of course, some very rough productions still come out for whatever reason, and when they do we have no issues giving a 5 to a highly marketed game like Gotham Knights or a big-name movie like Black Adam. But it’s relatively rare that so much time and money is spent on creating something that falls short of being at least “okay” (6), if not “good” (7), “great” (8) or even “amazing” (9), which is why the great majority of our reviews end up on the top half of the scale. When it comes to the big events they’re usually safe bets, and in most cases the question isn’t if they’ll be good but how good they’ll be. But when we have to choose, we’d much rather tell you about something we think you should experience than hammer something you shouldn’t. That’s just more fun for everybody.
I mean... it makes sense, I guess. But even after reading this, I still don't get why they do it this way. Netizens are going to read this the way a grocery store customer asks a cashier whether the store is open – the answer's already obvious. I've seen the review score memes.
What do you think?