“Don’t give me your opinion, just tell me if this game is good or not.” I wish I were making that quote up, or that the person who wrote it in the comments of a review was being satirical, but unless they were extremely dedicated to the bit I fear it was a genuine request from someone who doesn’t understand that reviews are, by definition, opinions. And that’s not an uncommon misconception: a lot of people believe that when they disagree with a critic’s opinion it must be because that critic isn’t being “objective,” which they take to mean evaluating a work without any emotion or personal preferences factored in – which is of course impossible for a human being to do, as the entire point of any form of art is to inspire an emotional reaction in the people who experience it.
If it were possible to review art objectively, based purely on the facts and without human emotion, AI would be the ultimate critic: it has no concept of things like hype or disappointment; it can’t be swayed by genre preferences, brand loyalties, or personal grudges against a developer that ran over its dog; it can’t be intentionally contrarian; and it can’t be bribed.
But here’s the problem: while an AI certainly could learn to play a game (we play against forms of AI in games all the time), it wouldn’t play like you or me, because it doesn’t have any concept of what’s fun to do and what’s not. It would only try to complete the game in the most efficient way possible, never missing a shot or being hit by an avoidable attack. It wouldn’t be able to tell you if a weapon is creatively designed or if an enemy is annoying to deal with. It wouldn’t know if managing your inventory was tedious or if upgrades were meaningful and rewarding. It couldn’t distinguish good graphics from bad ones beyond counting pixels and frame rates, and it would be incapable of telling you if the music got stuck in its head. The same is true of movies or TV shows – it can analyze the data on a screen and even identify people and objects present, but it simply doesn’t understand what makes writing and acting work.
As an experiment, I asked ChatGPT to tell me if Forspoken was a good game. It thought for a moment before responding: “There are a few different ways to approach this problem, but one possible method is to use the Pythagorean theorem to find the missing side length,” and then rambled on for over a page describing various uses for the famous geometric equation.
But that was a fluke; I then asked it if Top Gun: Maverick was a good movie, and it responded appropriately: “I am an AI language model and do not have the ability to form opinions or provide personal recommendations. However, the quality of a movie is subjective and opinions on Top Gun: Maverick may vary. It is best to read reviews or watch the trailer to get an idea of whether the movie may be to your liking.” That’s good advice, and I can appreciate that it knows and is honest about its own limitations.
Then I told it to write a review of Top Gun: Maverick, and it went right ahead in spite of itself.
A bit generic, but I’ve certainly read worse reviews written by humans. But wait, didn’t you just say you don’t have the ability to form opinions? Where did this come from then?
In other words, AI can’t answer the question of whether a game or movie is good, and why, because it doesn’t feel. Whether that’s fun, sadness, anger, awe at beauty, or anything else, it simply isn’t capable of experiencing any of that first-hand. And since it can’t have those experiences itself, when you ask it whether something is good or not it’s simply searching the internet and regurgitating the opinions of people who have experienced it – no different from an aggregator.
That means AI does pose a threat to professional critics in that it can very effectively gather and summarize their thoughts without paying them anything (a problem that will affect every type of content creator, and something that will certainly have to be worked out in court before too long), but it cannot replace them. Without critics watching and playing things before they’re widely available, AI would have nothing to draw from when people ask it these questions until after they’re out and people are posting about them on social media. Which would be fine for a lot of people, but we know from Google Trends that interest in reviews is highest just before things come out to the public, after which point demand drops off considerably. So when most people are asking the question of whether something is good, AI is not equipped to answer it.
And really, that’s for the best: the day an AI can tell you if it thinks a game or movie is good or bad will be the same day it forms an opinion on whether the human race deserves to exist, and we all know how that’s likely to go. In fact, IGN can recommend you a few good movies on the subject.
Dan Stapleton is IGN's Director of Reviews. Follow him on Twitter.