The Promise, starring Christian Bale and Oscar Isaac, is a true labor of love. The historical drama, directed by Terry George (Hotel Rwanda), is a love story told against the backdrop of the Armenian Genocide, and comes with a $100m budget bolstered by support from the late businessman Kirk Kerkorian. It hasn’t made much of an impact with the general public, but visitors to the film’s IMDb page may have noticed something very peculiar about its ratings. By the end of October 2016, less than a month after the film’s small screening at the Toronto Film Festival, IMDb had registered over 55,000 one-star ratings.
A medium-sized Canadian cinema clearly didn’t house all those voters; something else was going on. In fact, a Turkish website had organized a mass one-star campaign to sink the film’s IMDb rating, as well as to down-vote any trailers on YouTube. The country to this day denies the Armenian Genocide, and this small act of opposition signaled a wider problem with both IMDb’s ratings system and the film industry’s struggles with online abuse.
This is not a new issue. As noted by the Hollywood Reporter, “a search of 4chan reveals multiple campaigns against everything from Star Wars spinoff Rogue One to indie Holocaust-denier drama Denial to Justin Simien’s upcoming Netflix series Dear White People, many with step-by-step instructions on how to negatively impact films on sites like IMDb and YouTube.” Last year’s reboot of Ghostbusters, a frequent target for misogyny and hate campaigns (most notably the attacks on its star Leslie Jones), faced similar tactics. Recently, Netflix ditched its own star ratings system and switched to a thumbs-up-or-down one following a Reddit-led campaign against Amy Schumer’s newest stand-up special, The Leather Special.
Many fans offered a fire-with-fire reply to such attacks, and The Promise’s rating has climbed to an average of 4.2 stars thanks to a dramatic influx of over 35,000 10-star ratings. This is an admirable effort, and an appropriately loud measure to help drown out the negativity, but it also highlights the inherent faults of the site’s easily abused and consistently rigged system.
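The arithmetic behind these swings is simple to sketch. The snippet below uses invented vote counts (a hypothetical block of 5,000 organic 7-star votes) alongside the article’s brigade figures to show how a plain, unweighted mean lurches under mass voting; note that IMDb itself publishes a weighted average, so this is an illustration of the dynamic, not of IMDb’s actual formula.

```python
def mean_rating(vote_counts):
    """Unweighted arithmetic mean of star ratings, given a
    {star_value: number_of_votes} mapping."""
    total_votes = sum(vote_counts.values())
    total_stars = sum(star * n for star, n in vote_counts.items())
    return total_stars / total_votes

# Hypothetical organic audience: 5,000 viewers at 7 stars.
organic = {7: 5000}

# The same audience plus a 55,000-strong one-star brigade.
brigaded = {7: 5000, 1: 55000}

# And a 35,000-strong ten-star counter-campaign on top of that.
countered = {7: 5000, 1: 55000, 10: 35000}

print(mean_rating(organic))              # 7.0
print(mean_rating(brigaded))             # 1.5
print(round(mean_rating(countered), 2))  # 4.63
```

Even with the counter-campaign, the score lands nowhere near the organic audience’s view: the two brigades simply cancel into noise, which is the point of the paragraph above.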
It’s easy to invest in a system as simple as IMDb’s ratings, or indeed those of review-aggregation sites like Rotten Tomatoes and Metacritic. There’s a comfort in easy solutions, and a convenience in cutting through the talk to an answer on whether a film is any good. Separate from the industry’s and the internet’s growing problems with abuse and toxicity, IMDb’s ratings system is an annoyance, but not one that keeps Hollywood up at night. IMDb rankings simply aren’t viewed with the prestige or accuracy of Rotten Tomatoes or Metacritic scores. Those sites are not without their own issues, but they are generally accepted as more reflective of a film’s quality and critical reception. Where IMDb’s power lies, however, is in the way it can be appropriated to cause maximum damage to properties opposed by hate groups and trolls.
Walt Hickey’s FiveThirtyEight investigation into the gender gap in IMDb’s user ratings noted the ways in which male users sank the average ratings of shows primarily enjoyed by women. Tastes differ, and a bare number cannot capture a show’s qualities, but as online abuse grows louder and harder to overlook, the ways in which online hubs exacerbate these problems cannot be ignored; even relatively benign systems can become defining tools of the culture wars. A slew of one-star ratings may not seem like much, but combined with organized online hate campaigns, Twitter trolls and the bombarding of comments sections, a manipulated user score becomes part of a larger, uglier tapestry.
Unfortunately, this is not a problem with any easy solutions, and IMDb is but a drop in an increasingly deep ocean. Still, there are measures the site could take to prevent the obvious rigging of its system. IMDb already incorporates each film’s Metacritic score into its page, with links to external critics to give users a variety of perspectives. Metacritic, like IMDb and Rotten Tomatoes, also has a user score and user reviews so that the “average” audience member can weigh in, but the problem with each of these user-generated scores is that there is no way to know whether the person giving the rating has actually seen the movie, making it incredibly easy to organize efforts to artificially deflate (or inflate) an IMDb score.
One potential solution would be for IMDb to ditch the user score in favor of embedding the movie’s CinemaScore as the main measure of the average audience member’s opinion. CinemaScore is a market research firm whose ratings are based on polls of moviegoers across North America, who are given a ballot card when they go into a screening and hand it back with their score when they exit. This ensures that CinemaScore ratings only represent the opinions of people who have seen the movie, making them far more useful than any anonymous online poll, while also providing an alternative to the consensus of professional critics.
In addition to its user scores, IMDb has a wealth of user reviews, which are often very personal and written to express specific thoughts or stances, but currently those reviews are somewhat hidden away. A TripAdvisor-style tailoring of ratings from users with profiles similar to one’s own could make the site not only more useful but fairer in how it approaches critical consensus. This would require users to provide more details about themselves, which has its advantages and disadvantages, but the Wild West anonymity of the current model is clearly not working. IMDb is aware of this, and the recent, controversial move to scrap the site’s oft-maligned yet wildly popular message boards signals as much. The ratings system remains a defining element of the site, but in its current form it’s simply too vulnerable to manipulation to serve its designed purpose.
At its best, a user-driven film ratings system can reveal the tastes of the populist mainstream, allowing for a sample-size analysis of the general audiences who make or break Hollywood’s blockbuster slates. Two days after its release, The Dark Knight shot to number one on the Top 250, signaling the power of public fervor and fandom clout. There’s value in that. However, it has been clear for several years now that IMDb’s ratings are too vulnerable to manipulation to function as a reliable metric for movie quality.
No ratings system is foolproof, and there will always be arguments over their inherent worth and the reductive nature of putting a number or a yes/no marker on anything. Whatever stock you put in the power of ratings, and even if you find them oversimplified and detrimental to how you make your viewing choices, there is worth in making the system a fairer game for all.