Rating The Best: GOTYs and MetaCritic

Written by: Dalibor Dimovski, Managing Editor

Around this time of year, we begin to hear talk about which games will take a publication’s crown as the Game of the Year.  Oftentimes we agree; other times we don’t.  That’s just the nature of taste.

While watching the 2010 Spike VGAs, and complaining all the way throughout, it dawned on me that the highest rated game of the year (and almost of all time) per Metacritic and GameRankings, Super Mario Galaxy 2, was not even nominated for the show’s prestige award.  How often does this happen, and how should a game’s rating affect a publication’s GOTY choice?

The typical method used to choose a GOTY is to poll an outlet’s staff for their faves, assign points, and discern the “winner” via arithmetic.  The VGAs are similar, soliciting input from individuals at said outlets to select their winners (or at least their nominees).  Super Mario Galaxy 2 was the highest-rated game of the year according to Metacritic, a weighted average of review scores from those same supplying outlets.  Yet, Mario wasn’t even nominated.  Mario‘s score, 97, was higher than any of the other nominees’: Call of Duty: Black Ops at 89, God of War III at 92, Halo: Reach at 91, Red Dead Redemption at 95, and Mass Effect 2 at 96.

In fact, Rock Band 3, StarCraft II, Shantae: Risky’s Revenge, and Bayonetta were all ranked higher than some of the nominees.

But, this is Spike TV… we knew going in that we shouldn’t take this award show seriously.  There was always going to be a heavy dose of entertainment value thrown in, as well as a play for mainstream visibility.  The average viewer may not know what Alan Wake is, but they’ve heard of the Haloz.

But what about media outlets?  Ratings are a strong factor in the games they recommend.  Should the highest-rated game automatically win a site’s GOTY?  Should games that weren’t received well by the majority of the staff even be recommended?  I suspect games like Alan Wake, Heavy Rain, Dragon Quest IX, and Angry Birds will be top picks for many a website and media outlet, yet none of these received the respective outlet’s highest review scores.

So there it is: Review Scores and GOTYs more often than not don’t align.  Does that take anything away from the review score, or does it reduce the GOTY to an on-package sticker?  Or does it further “game” the effectiveness of Metacritic at assessing the actual ranking of games?