Less than two months before Brazil’s 2022 election, a report by the international NGO Global Witness found that Facebook parent Meta failed “appallingly” to detect false political ads. The group tested Facebook’s ability to catch election-related misinformation by submitting 10 ads of its own.
Five of the ads contained blatantly false information about the election, such as wrong election dates and incorrect methods citizens could use to vote. The other five sought to discredit Brazil’s electoral process, including the country’s electronic voting system, which has been in use since 1996. Of the 10 ads, Facebook initially rejected only one, and even that one was later approved without Global Witness taking any further action.
Beyond their content, the ads carried other red flags that Global Witness believes Meta should have spotted. First, the nonprofit did not complete the company’s ad authorization process. “It’s a safeguard Meta has in place to prevent election interference, but we can easily get around that,” Global Witness said.
In addition, the organization submitted the ads from London and Nairobi without needing a VPN or a local payment method to mask its location. The ads also lacked a “paid for by” disclaimer, which Meta says all “social issue” ads in Brazil must carry as of June 22, 2022.
“What’s clear from the results of this and other investigations is that the content moderation capabilities and integrity systems Meta has put in place to mitigate some of the risks during elections just don’t work,” said Global Witness senior consultant Jon Lloyd.
Meta did not immediately respond to Engadget’s request for comment. A Meta spokesperson told The Associated Press that the company had “made a lot of preparations” for the upcoming Brazilian election. “We launched tools to promote credible information and flag election-related posts, established a direct channel for the Superior Electoral Court (Brazil’s electoral authority) to send us potentially harmful content for review, and continue to work closely with Brazilian authorities and researchers,” the company said.
This isn’t the first time Global Witness has found Facebook’s election safeguards to be inadequate. The nonprofit conducted a similar investigation earlier this year and came to many of the same conclusions. Then, as now, Global Witness called on Meta to strengthen and expand its content moderation and integrity systems.