Metacritic's Fallout: Calling Metacritic on its Bullshit Edition


This blog was originally a place for me to vent. After about a year, I found the same things making me angry, but I had already written about them. I don't know whether that means I don't have many things to be angry about because all my buttons have already been hit, or I am a very angry person and it took me a whole year to get it all out. I certainly did not expect to change anything - the blog is cathartic, so once it's out I have no compelling need to write about publishers' mistreatment of developers, bad marketing, myopic product releases, used games, ratings, or parental responsibility again. But every once in a while something so egregious comes along that I just can't help being redundant. One of those things is Metacritic's continued feigning of objectivity in the face of blatant bias. I wish I could say this particular rant is unique, but it's not. It is just something that set me off before and has set me off again.

I have written in detail about why Metacritic's methods are flawed, as well as about Metacritic's statement of non-affiliation with any studios when it is actually owned by CBS, a Viacom sister company. But the most relevant post was the one about skewing the data on Wet, a game released by an affiliated company.

For a while the Metacritic score on Wet was riding at around 80; it is now a 64. Part of the boost was attributable to Armchair Empire's 88, which you can see here. But if you click through to the actual review you will see a score of 75. This has never been corrected. At the time I was not saying this error had anything to do with Les Moonves's position as CEO of CBS and his seat on the board of Bethesda parent ZeniMax, but I thought it was interesting.

Today I started reading about Fallout: New Vegas and saw a lot of griping about bugs in the game. Given all the heat around the game and its development time, this was surprising. More significantly, Bethesda's business development guy told me Metacritic scores are very important - so important he will not even talk to a studio without an 80-plus game. Admittedly, curiosity got the better of me, and probably a bit of gloating, so I went to Metacritic to see the scores for the game with the crash bugs. Surprisingly, it stands at 84, with mostly positive reviews. A cynic could say the positive reviews were written by large outlets which enjoyed Bethesda's marketing largesse. But in reality, the game is strong and very ambitious, and its adherence to the brand and its sheer scope, while it is running, seem to outweigh the bugs. I guess it is kind of like ignoring the continuity errors in Star Wars.

I could go along with the 84, and most of the reviews support the score, but the same giant, glaring, festering boil of impropriety marring the WET score makes an appearance here. The highest score on the list never really happened. Sure, a bit more finesse was used this time, but the result is the same. Fallout: New Vegas's highest score is a 100. Notable, because not many games receive a 100 from any outlet, and if a game does achieve a 100 it is worth a look. After all, someone thought it was a perfect game. In this case it was The Guardian - not a game site, but a significant mainstream presence with a history of solid game reviews. I clicked through to the review and noticed it had no score. This is not unusual. Metacritic has a policy to address this situation:

Many critics include some sort of grade for the movie, album, TV show, or game they are reviewing, whether it is on a 5-star scale, a 100-point scale, a letter grade, or other mark. However, plenty of other reviewers choose not to do this. Hey, that's great... they want you to actually read their review rather than just glance at a number. (Personally, we at Metacritic like to read reviews, which is one of the reasons we include a link to every full review on our site....we want you to read them too!)

However, this does pose a problem for our METASCORE computations, which are based on numbers, not qualitative concepts like art and emotions. (If only all of life were like that!) Thus, our staff must assign a numeric score, from 0-100, to each review that is not already scored by the critic. Naturally, there is some discretion involved here, and there will be times when you disagree with the score we assigned. However, our staffers have read a lot of reviews--and we mean a lot--and thus through experience are able to maintain consistency both from film to film and from reviewer to reviewer. When you read over 200 reviews from Manohla Dargis, you begin to develop a decent idea about when she's indicating a 90 and when she's indicating an 80.

Note, however, that our staff will not attempt to assign super-exact scores like 87 or 43, as doing so would be impossible. Typically, we will work in increments of 10 (so a good review will get a 60, 70, 80, 90, or 100), although in some instances we may also fall halfway in-between (such as a 75).
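
As an aside, here is a minimal sketch of what that policy amounts to in practice, assuming a staffer starts from a rough 0-100 impression of an unscored review. The function name and the snapping rule are my own illustration, not anything Metacritic publishes:

```python
def assign_metascore(impression: float, allow_halfway: bool = False) -> int:
    """Snap a staffer's rough 0-100 impression of an unscored review to
    the granularity Metacritic describes: increments of 10, with an
    occasional halfway value such as 75. Purely illustrative."""
    step = 5 if allow_halfway else 10
    snapped = round(impression / step) * step
    return max(0, min(100, int(snapped)))

# A glowing review with a serious caveat reads more like a 90 than a 100.
print(assign_metascore(88))                      # -> 90
print(assign_metascore(77, allow_halfway=True))  # -> 75
```

The point of the sketch is that the policy builds a judgment call into every step, which matters for what follows.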


All of that is a long way of saying they make up a score if the reviewer does not provide one. No big deal - unless the score happens to be an outlier, happens to be objectively inappropriate, and happens to favor an affiliated company. While the review is generally favorable, it has a few negative statements, like:

Familiar problems, such as regular crashes – I've had to switch my Xbox off using the power button roughly once every two hours so far – and a lack of signposting for irrevocably game-altering decisions can be frustrating, though perhaps understandable given the huge scope of the game. Getting into the habit of regular saving is more important than ever.

Had Metacritic read past this statement and assigned a 90, I would not be saying anything. However, in spite of this line, Metacritic interpreted the Guardian's review as saying the game is perfect. I am sure perfection was at the front of the reviewer's mind each time he had to stand up to reboot his Xbox. This may be a big deal, or it may not. There is no way of knowing how much impact the 100 had, because not all scores are treated equally: different reviews are given different weight based on Metacritic's perception of the reviewer and the publication. For comparison, the PlayStation 3 version's score, which did not include the Guardian's 100 - or about 18 other reviews - is a full 3 points lower.
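
Since Metacritic does not publish its weights, we can only illustrate the mechanics. Here is a minimal sketch with invented scores and weights showing how a single heavily weighted 100 can move an aggregate by a few points - roughly the size of the gap between the two platform scores:

```python
def weighted_metascore(reviews: list[tuple[int, float]]) -> int:
    """Weighted mean of (score, weight) pairs, rounded the way Metacritic
    displays it. The weights here are invented; Metacritic does not
    publish the ones it actually uses."""
    total = sum(score * weight for score, weight in reviews)
    return round(total / sum(weight for _, weight in reviews))

# A hypothetical pool of reviews hovering in the low 80s...
pool = [(85, 1.0), (80, 1.0), (82, 1.5), (78, 1.0), (83, 1.0)]
print(weighted_metascore(pool))                  # -> 82

# ...and the same pool after one heavily weighted 100 joins it.
print(weighted_metascore(pool + [(100, 1.25)]))  # -> 85, a 3-point swing
```

None of this tells us what actually happened; it just shows that, with opaque weights, one assigned outlier is enough to explain a swing of that size.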

I am not saying Metacritic was swayed by its parent to place an artificially high top score on two Bethesda games, and I didn't say it last time. The first time seems like a simple - if uncorrected - error. But the same poetic license applied to a high score a second time may merit a head scratch. Will there be a three-peat?

