At this point in history, people have been seeing movies for well over 100 years, meaning that movie debates have been happening for just as long. As a form of art, movie-making is subject to all forms of criticism and, as the saying goes, everyone’s a critic.
The more recent emergence of social media has only magnified that phenomenon, as everyone now has access to a nearly endless supply of people with whom to talk about movies. Thanks to the internet, an abundance of data is now available at our fingertips, and since it’s easier to simply cite "objective" data than to break down more abstract concepts and opinions, audiences gravitate toward using that data as a means of judging the quality of a movie.
People look to a variety of sources to provide this information, but two metrics are most often cited by moviegoers in search of evidence to justify their love or hate of a particular film: the Rotten Tomatoes Tomatometer and box office numbers. Unfortunately, while both of these statistics have their proper uses, neither has any place in an analysis of a movie’s quality.
Before the age of the internet, and particularly before the advent of social media, film criticism wasn’t all that different from what it is now, except that people looked far more to individual voices to learn about the quality of new releases. There were a few clear pillars of criticism that people followed en masse, like the late, great Roger Ebert, but beyond that it was a collection of individual voices from each person’s preferred source, be it a local newspaper or an international magazine.
As such, the “collective” opinion on a movie was not gauged by an aggregate critical score. People would find a specific critic, or a handful of critics, they respected, and those were the reviews they’d read. After all, why invest any stake in someone you know you typically disagree with, unless you simply value a dissenting opinion?
So how does Rotten Tomatoes change that? Well, on a small scale, it doesn’t. Individual critics still have individual opinions, and people are free to seek out critics whose opinions they respect. However, now we also have the almighty Tomatometer - a fairly sterile grading scale by which hundreds of different opinions are boiled down to a single number and graded as “fresh” or “rotten.”
Before getting too far into this, it’s important to note that there is value to the Rotten Tomatoes system. It draws attention to movies critics are buzzing about - sometimes even films nobody knew were coming - and helps moviegoers who haven’t already made up their minds decide which ticket to purchase. It’s a great way to answer the question “do people like this movie?” without getting into any sort of analytical weeds or spoilers.
The Tomatometer’s weakness lies in its misuse - specifically, its use as an objective indicator of quality, which it is not. The final Rotten Tomatoes percentage does not represent the average score of all eligible reviews, but the percentage of those reviews that are positive: a movie whose reviews are all positive (regardless of score) earns 100%, and one whose reviews are all negative (regardless of score) earns 0%.
So, a movie scoring 90% does not mean the average reviewer gave it 4.5 out of 5 stars. It could mean that, if the numbers line up correctly, but it could also mean that 90% of reviewers gave it 3 out of 5 stars or better. Consider a movie where 90% of reviewers score it at 3 out of 5 and the other 10% score it at 2.5 out of 5: that movie sits at a fresh 90%, even though its average score is 2.95 stars - just under 3. And that’s not even accounting for the times audiences diverge significantly from critics.
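The gap between the two metrics is easy to see in a quick sketch. The snippet below is a hypothetical illustration, not Rotten Tomatoes’ actual code; the function names and the assumption that 3 out of 5 stars counts as a positive review are taken from the article’s example, not from any official specification.

```python
# Hypothetical sketch contrasting a percent-positive, Tomatometer-style
# score with a plain average of star ratings.
# Assumption (from the example above): 3 stars or more counts as positive.

def tomatometer(scores, positive_threshold=3.0):
    """Percentage of reviews at or above the positive threshold."""
    positive = sum(1 for s in scores if s >= positive_threshold)
    return 100 * positive / len(scores)

def average_stars(scores):
    """Plain average of the star ratings."""
    return sum(scores) / len(scores)

# The article's example: 90 reviewers give 3/5, 10 give 2.5/5.
scores = [3.0] * 90 + [2.5] * 10

print(tomatometer(scores))    # 90.0 - reads as a great movie
print(average_stars(scores))  # 2.95 - reads as a middling one
```

The same set of reviews produces a glowing 90% on one scale and a below-3-star average on the other, which is exactly why the single percentage hides so much.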
The problem with this approach should be fairly apparent. While it’s effective at reflecting general sentiment, it lacks the nuance to separate an okay movie from a great one, and divisive, polarizing movies suffer the most, since the Tomatometer rewards consensus.