Rotten Tomatoes, Metacritic, IMDb & CinemaScore Explained

Movie review aggregators like Rotten Tomatoes have become a big part of the conversation around movies, but how are the scores calculated and what do they mean?

Art and story analysis is by no means a recent cultural phenomenon, but thanks to the internet, modern-day criticism, particularly film criticism, is far more widespread, leading to the creation of tools that sort and present data to quantify various qualities of movies. The most prevalent of these tools is obviously Rotten Tomatoes and the Tomatometer, but Metacritic, IMDb, and CinemaScore are also commonly referenced.

Of course, because film is a subjective medium, using "objective" numerical values to represent a movie's cumulative critical evaluation rubs many people the wrong way. Rotten Tomatoes, for example, regularly finds itself at the center of audience controversy for ranking a given film too high or too low. It's an understandable disagreement; it's impossible to take a variety of differing opinions, convert those opinions into data, average that data out, and claim the result represents a "definitive" opinion, when in reality that average only coincides with the opinions in the middle of the road. So if there are three groups of critics - one that views a film poorly, one that is neutral, and one that sees the film positively - the average opinion is going to land closer to the neutral ground, where it disagrees with two-thirds of the sampled critics. This middle ground becomes narrower for movies with a strong consensus amongst critics and broader for polarizing films.
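To make that averaging point concrete, here is a quick illustrative sketch. The scores and group sizes below are invented for demonstration and don't reflect any aggregator's actual formula; it simply shows how a straight average of three opposed groups of hypothetical 0-10 scores settles near the neutral group.

```python
# Hypothetical example only - not how any real aggregator scores films.
from statistics import mean

negative = [2, 3, 3]   # critics who viewed the film poorly
neutral = [5, 5, 6]    # middle-of-the-road critics
positive = [8, 9, 9]   # critics who saw the film positively

all_scores = negative + neutral + positive
average = mean(all_scores)
print(round(average, 2))  # prints 5.56 - closest to the neutral group
```

The result sits beside the neutral scores, so it "disagrees" with the six critics in the negative and positive groups even though it was built from their opinions too.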

Some tools lean into that difference, while others apply even more math to make the divide less significant, but unless there is a rare unanimous opinion, there can't be an objective way to establish a consensus (even by average) when the subject matter is subjective. Fortunately, despite being pointed to as arbiters of quality, that's not something any of these tools is intended to be. Rotten Tomatoes, Metacritic, IMDb, and CinemaScore all have widely differing methods of analysis, and their reported numbers mean very different things about a given film.

Even so, audiences will commonly cite these figures as proof that a movie is good or bad, and sometimes the response will be to cite the numbers from another tool, saying "see, Metacritic is better than Rotten Tomatoes" when, in fact, each scale is totally different and comparing the two is like comparing tomatoes and oranges. Each tool has a distinct objective in its reporting, so in order to fully understand what each review aggregator score means, we're going to look at each one's method of reporting and what the results imply about movie quality.

Page 1: Intro (this page)

Page 2: Rotten Tomatoes

Page 3: Metacritic

Page 4: IMDb

Page 5: CinemaScore


Page 2: Rotten Tomatoes
