At this point in history, people have been watching movies for well over 100 years, meaning that movie debates have been happening for just as long. As an art form, movie-making is subject to all manner of criticism and, as the saying goes, everyone’s a critic.

The more recent emergence of social media has only magnified that phenomenon, as everyone now has access to a nearly endless supply of people with whom to talk about movies. Thanks to the internet, an abundance of data is now available at our fingertips, and since it’s easier to simply cite "objective" data than to break down more abstract concepts and opinions, audiences gravitate toward using that data as a means of judging the quality of a movie.

People look to a variety of sources to provide this information, but two metrics are most often cited by moviegoers in search of evidence to justify their love or hatred of a particular film: the Rotten Tomatoes Tomatometer and box office numbers. Unfortunately, while both of these statistics have their proper uses, neither has any place in an analysis of a movie’s quality.

Rotten Tomatoes

Before the age of the internet, and particularly before the advent of social media, film criticism wasn’t all that different from what it is now, except people looked far more to individual voices to learn about the quality of new releases. There were some clear pillars of criticism that people looked to en masse, like the late, great Roger Ebert, but beyond that it was a collection of individual voices from each person’s preferred source - be it a local newspaper or an international magazine.

As such, the “collective” opinion on a movie was not gauged by an aggregate critical score. People would find a specific critic or handful of critics they respected, and those were the ones they’d read. After all, why put any stock in someone you know you typically disagree with, unless you simply value a dissenting opinion?

So how does Rotten Tomatoes change that? Well, on a small scale, it doesn’t. Individual critics still have individual opinions, and people are still free to seek out the critics whose opinions they respect. However, now we also have the almighty Tomatometer - a fairly sterile grading scale by which hundreds of different opinions are boiled down to a single number and graded as “fresh” or “rotten.”

Before getting too far into this, it’s important to note that there is value to the Rotten Tomatoes system. It helps draw attention to movies critics are buzzing about - sometimes even films nobody knew were coming - helping moviegoers who don’t already have their minds made up decide which ticket to purchase. It's a great way to answer the question "do people like this movie?" without getting into any sort of analytical weeds or spoilers.

The Tomatometer's weakness lies in how it's misused - specifically, as an objective indicator of quality, which it is not. The final Rotten Tomatoes percentage is not the average score of all eligible reviews, but a reflection of what percentage of those reviews are positive: a film whose reviews are all positive scores 100%, and a film whose reviews are all negative scores 0%, regardless of how strongly each reviewer felt.

So, a movie scoring 90% does not mean the average reviewer gave it 4.5 out of 5 stars. It could mean that, if the numbers happen to line up that way, but it could just as easily mean that 90% of reviewers gave it 3 out of 5 stars or better. A movie where 90% of reviewers score it at 3 out of 5 and the other 10% score it at 2.5 out of 5 still sits at 90% on the Tomatometer - even though the average score is just under 3 stars. And that's not even accounting for the times that audiences diverge significantly from critics.
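To make the distinction concrete, here's a minimal sketch in Python, using made-up review scores and assuming - per the example above - that 3 out of 5 or better counts as a positive review (that cutoff is an illustration, not Rotten Tomatoes' published methodology):

```python
# Hypothetical illustration: percent-positive vs. average score.
# The review scores and the 3/5 "fresh" cutoff are assumptions for
# this example, not Rotten Tomatoes' actual methodology.

FRESH_THRESHOLD = 3.0  # assumed: 3/5 stars or better counts as positive

def tomatometer(scores):
    """Percentage of reviews at or above the fresh threshold."""
    fresh = sum(1 for s in scores if s >= FRESH_THRESHOLD)
    return 100 * fresh / len(scores)

def average(scores):
    """Plain mean of the same reviews, on the 5-star scale."""
    return sum(scores) / len(scores)

# 90 reviewers at 3/5, 10 reviewers at 2.5/5 (the example above)
lukewarm = [3.0] * 90 + [2.5] * 10
# 90 reviewers at 4.5/5, 10 at 2/5 - a very different movie
beloved = [4.5] * 90 + [2.0] * 10

for name, scores in [("lukewarm", lukewarm), ("beloved", beloved)]:
    print(f"{name}: {tomatometer(scores):.0f}% fresh, "
          f"average {average(scores):.2f}/5")
# Both movies print "90% fresh" even though their averages
# are 2.95/5 and 4.25/5 respectively.
```

Two movies that audiences would describe in completely different terms end up wearing the same number.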

The problem with this approach should be fairly apparent. While it's reasonably effective at reflecting the general outlook, the score carries too little nuance to separate an okay movie from a great one, and divisive, polarizing movies suffer the most, since the Tomatometer rewards consensus.


Gal Gadot as Wonder Woman crossing No Man's Land in the 2017 solo movie

What's the Definitive Value of the Tomatometer?

Ranking a few popular, well-scored comic book movies by the Tomatometer from best to worst, we'd get: 1 - The Dark Knight (94%), 2 - Wonder Woman (93%), 3 - Logan (93%), and 4 - The Avengers (92%). Yet taking the actual average critical scores of those movies, the ranking changes to: 1 - The Dark Knight (8.6/10), 2 - The Avengers (8/10), 3 - Logan (7.9/10), and 4 - Wonder Woman (7.6/10). Even so, can either of those rankings be considered definitive when they’re comparing movies that tell drastically different stories and have widely varying tones and themes?
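For the curious, the flip is easy to verify. Here's a small Python sketch using only the figures quoted above - the numbers are the article's, nothing else is assumed:

```python
# The four films and the two metrics quoted above.
movies = {
    "The Dark Knight": {"tomatometer": 94, "avg_score": 8.6},
    "Wonder Woman":    {"tomatometer": 93, "avg_score": 7.6},
    "Logan":           {"tomatometer": 93, "avg_score": 7.9},
    "The Avengers":    {"tomatometer": 92, "avg_score": 8.0},
}

for metric in ("tomatometer", "avg_score"):
    # Stable sort: the 93% tie keeps its original order.
    ranking = sorted(movies, key=lambda m: movies[m][metric], reverse=True)
    print(f"By {metric}: {', '.join(ranking)}")

# By tomatometer: The Dark Knight, Wonder Woman, Logan, The Avengers
# By avg_score: The Dark Knight, The Avengers, Logan, Wonder Woman
```

Same four movies, same reviews, two different "definitive" orderings.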

The Dark Knight is an established Batman grappling with a madman who challenges the very nature of his principles. The Avengers is a team-up film that represents the culmination of five previous movies. Logan is a gritty, R-rated send-off to Hugh Jackman’s 17-year legacy, and Wonder Woman is a period-piece origin story about the horrors of war and the power of love. It seems a disservice to every single one of those movies to even place them on the same scale by boiling their value down to a simple number.

The Avengers 2012 line-up

Rotten Tomatoes, and critical reviews in general, also suffer from a lack of historical context, which skews the perceived relevance of the Tomatometer. The site does hold scores for movies released well before Rotten Tomatoes was established, but those scores were all entered retroactively, meaning the Tomatometer doesn’t always reflect a film's initial reception and suffers from a sort of revisionist history. The result is that we’re incredibly short-sighted in the way we see movies over time, treating the Tomatometer as a permanent stamp of approval or rejection based on a movie's initial reception, leaving little room for reevaluation.

Verified classics like 2001: A Space Odyssey, Psycho, The Shining, Blade Runner, Alien, and Star Wars all boast Tomatometer scores from the high 80s to the upper 90s, yet each experienced an initial critical reception that ranged from mixed to outright polarizing. Now they all benefit from a red tomato stamp of approval, even though the Tomatometer would have scored them significantly lower - possibly into the rotten range in some cases - upon their initial release, due not to a lack of praise, but a lack of consensus.

This isn't to say Rotten Tomatoes is flawed in its conception and shouldn’t exist, but that citing the Tomatometer as a definitive representation of a film’s quality is incredibly reductive. When discussions or debates about the quality of a film are reduced to “my preferred movie is better than yours because it scored higher on Rotten Tomatoes,” we condense everything from acting, writing, direction, cinematography, sound design, visual effects, music, and editing into a single, meaningless number.


What Does Box Office Success Mean?

Box office numbers are something else entirely, yet they’re used in a fairly similar capacity to Rotten Tomatoes scores as a measuring stick for comparing movies. But what does the box office really say about quality?

A quick look at the biggest box office hits of all time makes the answer fairly clear, with the top 100 earners running the gamut from critical darlings to near-universally derided flicks. There is a real correlation between well-reviewed movies and high box office takes, but it’s not an absolute rule, and it only runs one way: good reviews will help a movie perform at the box office, but a good box office performance does not indicate good reviews.

Also, where audiences and critics are partial to quality, the box office isn’t. A ticket is a ticket, regardless of how the buyer felt about the movie, so a box office dollar from someone who gives a movie a 5-star review is counted identically to one from someone who gives it zero stars. There are also numerous other factors at play - rating, marketing, release season, competition, social or political reception, and general buzz - that can boost or depress a movie’s box office turnout.

A related statistic that people sometimes cite when using financials to gauge the quality of a movie is the amount of profit it made, but that has even less bearing on actual quality than the total gross does. Sure, the filmmakers care if the movie makes a profit, but it’s ultimately a behind-the-scenes detail - one that has about as much relevance to the finished product as the quality of catering available on set during production.

So, while box office numbers make for fun discussion from a business or general trivia angle, using them as a means of determining quality is a non-starter - and plainly arbitrary when lower-budget movies beloved by critics and audiences, like John Wick or 10 Cloverfield Lane, were never going to earn as much as the movies reaching for the billion-dollar mark.

Are some movies objectively better than others? Sure. But that’s not because of Rotten Tomatoes scores or box office numbers, and there are far better ways to establish it than bringing them into the discussion.

If seemingly objective metrics like review aggregators and box office revenue aren’t sufficient trump cards for movie evaluation, then how do we know which movies are good or bad? That’s where you’re in luck. You can like whatever you want to like, for whatever reasons you want to like it. If you want to share that appreciation with others, explain why you like it. Gush over the cinematography, dissect the themes, fawn over the actors' performances, and so on. Or do the reverse when you find something you dislike. You don’t need validation from arbitrary metrics to do that.

As long as movies are being made, there will be disagreement about them. That disagreement can come in the form of fans simply declaring one movie good and another bad based on some statistic that ultimately holds little meaning, or it can be conducted in a way that actually sheds light on how cinematic qualities are perceived by somebody else. Or we can just grab some popcorn and enjoy ourselves. They're just movies.

NEXT: What Wonder Woman Reviews Get Wrong About The DCEU