Rotten Tomatoes’ audience score is broken, and it’s only getting worse. Rotten Tomatoes’ audience review system has always had far more problems than its critic review system, but some important data about audience reviews reveals just how broken the system is. While Rotten Tomatoes has made some changes to the data and the way it’s displayed over the years, most of those changes have only made the data even harder to use.

The audience review system on Rotten Tomatoes has always been criticized for its vulnerability to review bombing and brigading, and for the perception that audiences are more biased than professional critics. While the relative quality of critic reviews and audience reviews is a subjective debate, some very clear issues with the quality of the audience review data and its presentation diminish the value of user-driven reviews.

Rotten Tomatoes' Audience Review Score Data is Broken

A number of popular movies have significant issues with their audience review data on Rotten Tomatoes. Screen Rant has previously covered the problem with the Rotten Tomatoes audience score for Sam Raimi’s Spider-Man, which mysteriously gained over 30 million new reviews and dropped 27 percent, and a virtually identical problem with Star Wars: Revenge of the Sith's audience score, which dropped 21 percent. The issue extends to numerous other movies released before 2010, including The Lord of the Rings: The Return of the King, Pirates of the Caribbean: The Curse of the Black Pearl, King Kong, Gladiator, Ocean's Eleven, and likely many more.

The issue with the data wouldn’t be as notable if it weren’t for the seemingly uniform behavior. Tracking the movies' pages with The Wayback Machine at different points in time shows that every movie in question suddenly jumped from a few thousand user-submitted reviews to over 30 million in October 2010. In some cases, such as Spider-Man and Star Wars: Revenge of the Sith, the audience score dropped significantly as a result of this influx, but the score change isn’t consistent across the board.
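For anyone who wants to reproduce that kind of check, here is a minimal sketch using the Internet Archive's Wayback Machine availability API to pull up snapshots of a movie page on either side of October 2010. The movie URL and the timestamps are illustrative, and reading the review count out of each archived page is still a manual step.

```python
# Minimal sketch: find the nearest Wayback Machine snapshots of a
# Rotten Tomatoes movie page before and after October 2010.
import requests

MOVIE_URL = "https://www.rottentomatoes.com/m/spiderman"  # example page
CHECKPOINTS = ["20100901", "20101101"]  # before and after October 2010

for ts in CHECKPOINTS:
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": MOVIE_URL, "timestamp": ts},
        timeout=30,
    )
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        # Each snapshot URL can be opened to read the audience review
        # count as it appeared at that point in time.
        print(ts, "->", closest["timestamp"], closest["url"])
    else:
        print(ts, "-> no snapshot found")
```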

There Might be an Obvious Answer For Why Rotten Tomatoes' Data is So Bad


The most notable thing about the data is the consistency in the number of new reviews and the timing of when they appeared in Rotten Tomatoes' system. In most of the cases identified, the number of new reviews hovers around 30 million, and they all appeared in October 2010. Given that behavior, it's easy to rule out an organic surge or review bombing as the cause of the massive influx, since neither explanation accounts for the similarity in timing and volume across numerous movies with no obvious connection other than being popular releases of the 2000s.

This behavior is particularly notable because Flixster acquired Rotten Tomatoes in 2010. Flixster was a movie recommendation app designed to collect user reviews and use that data to recommend movies to its users. Flixster boasted over 35 million downloads and was linked with Facebook, making it easy for users to submit reviews, especially for recent releases, which were featured in the app and could be rated with only a couple of clicks or taps. If Flixster merged its user review data into Rotten Tomatoes, it would explain all the anomalies that organic submissions or review bombing can't.

Why Combining Flixster Data Breaks Rotten Tomatoes' Audience System


If the data was simply merged, as it appears to have been, that presents a few major problems. First, the data was gathered from different audiences in different ways, so it can't simply be combined as if a 5-star review on Flixster were equivalent to a 5-star review on Rotten Tomatoes, or vice versa. Second, there's no way to de-duplicate reviews submitted by the same person on both platforms. Third, the fact that some movies gained 30 million reviews and others didn't means the impact is massively disproportionate depending on a movie's popularity: some scores are made up of nearly 100 percent Flixster data, while others are mostly Rotten Tomatoes reviews.
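To see how lopsided that weighting gets, here is a back-of-the-envelope calculation with entirely hypothetical numbers: once tens of millions of imported ratings are pooled with a few thousand native ones, the native reviews contribute almost nothing to the combined score.

```python
# Hypothetical illustration of the dilution problem when pooling two
# review sets of very different sizes into one "percent positive" score.
def combined_score(native_count, native_pct, imported_count, imported_pct):
    """Weighted average of two 'percent positive' scores."""
    total = native_count + imported_count
    return (native_count * native_pct + imported_count * imported_pct) / total

# Made-up figures: ~5,000 native Rotten Tomatoes reviews at 90% positive,
# swamped by ~30 million imported ratings at 63% positive.
print(round(combined_score(5_000, 90.0, 30_000_000, 63.0), 1))  # ~63.0
```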

An analysis by Margins in 2019 identified discrepancies between review scores on different platforms like Rotten Tomatoes, Fandango, IMDb, and Metacritic. While there's no available data to show how far apart Rotten Tomatoes and Flixster scores were pre-integration, it's logical to assume there was a similar disparity. The value of the data on any given platform doesn't come from the actual scores, but from the scores relative to other movies in the same data set. For example, IMDb ratings can be compared to each other, but an IMDb rating can't be compared to a Fandango rating, so arbitrarily combining data from two platforms only makes the data less usable.
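One hedged illustration of that point: if scores are only meaningful relative to their own data set, then the closest thing to a fair comparison is standardizing each platform's ratings against that platform's own average and spread before looking across platforms. The numbers below are made up purely to show the idea.

```python
# Sketch: raw star ratings from two platforms with different baselines
# can't be compared directly, but standardizing within each platform
# preserves the relative ordering that actually carries the information.
from statistics import mean, stdev

platform_a = {"Movie X": 4.4, "Movie Y": 3.1, "Movie Z": 3.8}  # skews high
platform_b = {"Movie X": 3.6, "Movie Y": 2.2, "Movie Z": 3.0}  # skews low

def standardized(ratings):
    mu, sigma = mean(ratings.values()), stdev(ratings.values())
    return {title: round((score - mu) / sigma, 2) for title, score in ratings.items()}

print(standardized(platform_a))
print(standardized(platform_b))
```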

Rotten Tomatoes' Fixes Broke The Audience Score Even More


Rotten Tomatoes has made a few changes to the rating system in the years since, most of which seem designed to fix these very issues. Unfortunately, the underlying data is still broken, and the superficial fixes have only made the audience scoring system on Rotten Tomatoes even more broken over time. The reason this data is only identifiable through The Wayback Machine is that Rotten Tomatoes began hiding the true review counts for each movie, capping the counter at "250,000+", so there's no other way to tell which movies had 30 million Flixster reviews added and which ones didn't.

Additionally, Rotten Tomatoes' newer "Verified Audience Score" further fragments the data. To be fair, ticket-purchase verification to prevent spam reviews is a smart idea, but the verified score only exists for movies released after the feature was implemented, meaning there are now essentially three different data sets at play - unverified audience reviews, verified audience reviews, and Flixster reviews. Different movies have different combinations of these three data sets, but there's no way to identify which without using The Wayback Machine, and, as the review data for Spider-Man and Star Wars: Revenge of the Sith shows, the difference in the data can account for massive changes to the score.

Whether Rotten Tomatoes implemented these changes explicitly to cover up or correct for the fact that the bad data broke the audience review system isn't clear, although it is odd to cap the review counter at 250,000 when many movies have over 30 million reviews submitted. Normally, a review aggregator like Rotten Tomatoes would want to highlight that some of its movies have more than 30 million reviews, although that would look odd alongside other massively popular movies like Avengers: Endgame, which has only around 50,000 user reviews.

Rotten Tomatoes reviews will always be subjective, especially the audience reviews, so any scoring system should be taken with a grain of salt; however, large collections of review data can be a valuable tool for comparing audience reactions to different movies. Unfortunately, whatever exactly happened to Rotten Tomatoes' data in 2010, whether it was the Flixster integration or not, totally broke the audience review system, and the reduction in transparency and other changes to the Rotten Tomatoes audience review system in the years since have only made things worse.