Every Sunday when I call my parents, at some point in the conversation with my father we will begin talking about movies. I’ll ask what movies he’s seen that week and he’ll ask the same. And then, when talking about a movie he’s not sure if he wants to see, he’ll say something along the lines of “well, it did pretty well on Rotten Tomatoes.” Inevitably, he will base his decision on that movie’s score on Rotten Tomatoes.
It seems many people simply judge the quality of a movie by whatever arbitrary number it receives on Rotten Tomatoes. For those of you who have no idea what Rotten Tomatoes is, it’s a website that aggregates hundreds of film critics’ reviews of recent movies and classifies each review as either “rotten” or “fresh.”
Each critic’s review is assigned a score on a scale from 0 to 100 percent. A review scoring below 60 percent is classified as “rotten,” while a review scoring 60 percent or higher is classified as “fresh.” The number displayed prominently as the film’s “tomato-meter rating” is not the average of these scores: it is the percentage of “fresh” reviews out of the total number of ratings. In effect, every rotten review counts as a 0 and every fresh review counts as a 100, no matter what its actual score was.
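To make the difference concrete, here is a minimal sketch of the two calculations side by side. The review scores below are invented for illustration; they are not real Rotten Tomatoes data.

```python
# Hypothetical per-review scores on a 0-10 scale for an imaginary film.
reviews = [6.5, 7.0, 6.0, 6.5, 7.5, 4.0, 6.0, 7.0, 6.5, 3.5]

# A review counts as "fresh" when it scores 6.0/10 (60 percent) or higher.
fresh = sum(1 for r in reviews if r >= 6.0)

# Tomato-meter: the share of fresh reviews, ignoring how fresh or rotten
# each one actually was.
tomatometer = 100 * fresh / len(reviews)

# Average: the plain mean of the scores themselves.
average = sum(reviews) / len(reviews)

print(f"Tomato-meter: {tomatometer:.0f}%")   # -> Tomato-meter: 80%
print(f"Average:      {average:.2f}/10")     # -> Average:      6.05/10
```

A batch of mildly positive reviews produces an 80 percent tomato-meter but only a 6.05/10 average, which is exactly the kind of gap described below.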
For example, The Dark Knight has a tomato-meter rating of 94 percent because 271 of its 289 ratings were “fresh.” The actual average score the movie received, 8.5/10 (a much more accurate measure of critical consensus), is displayed much smaller. This system is in place, in my opinion, for two reasons.
1.) To create a dichotomy for consumers. Most movies with a high tomato-meter score are not rated that highly by the critics themselves, and most movies with a low tomato-meter score actually have a much higher average rating. For example, the 2011 film Green Lantern has a score of 24 percent, which is pretty abysmal. However, its average rating is a 4.6/10, which looks much better: that’s actually 46 percent. On the other end of the spectrum, 2009’s Crazy Heart has a score of 91 percent, but its actual average score is 7.1/10, a huge separation. The system simply glorifies some movies and squashes others that are not as good as they could have been.
2.) The average movie consumer does not want to sit down and read an organized review, a task that would take 10 minutes tops. We have submitted to the convenience of Rotten Tomatoes, even though it’s not accurate. There is a reason I don’t assign ratings to the movies I review: we would like to think the quality of a film is black and white, but it’s often a grey area. Films mean different things to different people: many are so open to interpretation that two viewers can come away with completely different experiences of the same film. In addition, different films appeal to different audiences: I don’t care about a Twilight movie, so I would rate it low simply because it doesn’t interest me. A die-hard fan of the series, however, will view it differently because that franchise is important to them. Their score will be much higher than mine.
Assigning movies arbitrary numbers to qualify them is inaccurate, subjective, and not a true representation of what movies should be.