Facebook and Twitter received most of the attention and blame for spreading disinformation in the 2016 presidential election. But a new report from New York University’s Center for Business and Human Rights says that Instagram and WhatsApp may be the bigger risks to democracy in 2020.
Last year, a Senate Intelligence Committee report found that Instagram was a greater tool of Russian disinformation than Facebook in the 2016 election. Its defenses are weaker, too. “Instagram didn’t have the same set of rules or capacities for identifying false information as its older brother Facebook,” says Paul Barrett, the NYU law professor who wrote the new report. Meanwhile, disinformation on WhatsApp has already influenced elections in Brazil and India.
What to do?
Both platforms have started taking the issue of disinformation more seriously. In August, Instagram began testing a tool to flag false information. WhatsApp has limited how easily messages can be forwarded. Users could once forward messages to 256 chat groups, each of which could have 256 members. Now users can forward to only five chat groups.
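To put rough numbers on what that cap changes: a single forward's maximum immediate audience drops from 256 × 256 = 65,536 recipients to 5 × 256 = 1,280. The sketch below is a back-of-the-envelope illustration only (the function name is hypothetical, not WhatsApp's code):

```python
MAX_GROUP_SIZE = 256  # members per WhatsApp chat group, per the report

def max_reach(forward_limit: int, group_size: int = MAX_GROUP_SIZE) -> int:
    """Upper bound on immediate recipients of one forwarded message."""
    return forward_limit * group_size

before = max_reach(256)  # old limit: forward to 256 chat groups
after = max_reach(5)     # current limit: forward to 5 chat groups
print(before, after)     # 65536 1280
```

Messages can still spread further as recipients re-forward them, so this bounds only one hop, but it shows why Barrett argues a one-group limit would shrink viral reach further still.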
But this might not be enough, Barrett says. He thinks that, at a minimum, Instagram needs to adopt all the tools Facebook is using. These include 54 fact-checking partners working in 24 languages. (One of Facebook’s partners, UK-based Full Fact, recently released a 46-page report highlighting the program’s weaknesses, such as not sharing enough data with the fact-checkers. The Full Fact report also recommended that fact-checking extend to Instagram.) As for WhatsApp, Barrett thinks the platform should “go all the way” and limit users to forwarding messages to no more than a single group.
Other causes for worry:
The report also mentions the risk of deepfakes, voter suppression, and disinformation campaigns from Iran and China. It recommends that tech companies improve deepfake detection and support the Honest Ads Act, a piece of legislation that would require large platforms to maintain a public file of election advertising.