Global giant Meta, the parent company of popular social media platforms Facebook, WhatsApp and Instagram, says its systems are ready to fend off disinformation and voter fraud campaigns meant to influence the upcoming general elections.
The company said this during an interview with Sunday World on Thursday at its Johannesburg offices, warning that it will remove from its platforms any content aimed at misinforming the public about this year’s fiercely contested national and provincial elections.
Meta’s public policy manager Balkissa Ide Siddo emphasised that the company has election-specific content policies, which help prevent voter fraud and election interference.
“On misinformation, we have adopted a three-step approach: remove, reduce and inform. Any content, whether it is misinformation, just regular content or generated by artificial intelligence, if it violates our community standards we will remove it.
“In an election context, we are going to remove any incitement to violence, any content that could lead to imminent harm, physical damage or voter suppression. Under our policies, we will remove it as soon as it comes to our attention,” stated Siddo.
Siddo revealed that over the past eight years, the company had developed industry-leading tools and standards when it comes to safety and security in elections.
“We work with independent fact-checkers across the globe, about 100 partners, and they fact-check content in 60 languages. In South Africa, we work with AFP and Africa Check. They fact-check in English and also in local languages such as isiZulu, Sesotho and Setswana.”
The independent fact-checkers review and rate the content, and Meta will attach a warning to content that has been debunked by the fact-checkers.
She said in the coming weeks, Meta will make it possible for users and advertisers to disclose when they post content that is AI-generated. “If they don’t disclose it and it is brought to our attention, or we detect it with our tools, they will get a warning, and if they don’t change that they will get further warnings.”
Siddo said Meta has globally invested more than $20-billion in this area of dealing with disinformation campaigns, safety and security during elections.
She stated that Meta has ad transparency tools on its platforms: anyone who wants to run political or election ads must first complete an authorisation process and disclose certain information. This is because users are demanding more transparency and want to know who is behind an ad, she said.
“When people want to advertise political or election-related ads in South Africa, they have to tell us where they are located and … who is paying for the ads, and they provide this information which will be stored in the ads library and the information is stored there for seven years and everyone can access the information,” she said.
“We also know that no two elections are the same, so we have a tailored approach for the South African election … We also collaborate in South Africa with a number of organisations to prepare for the elections and we are also going to put in place a dedicated elections operations centre, pulling together resources from across the company, [including] threat intelligence experts, data scientists, legal people and comms people.”

She said Meta has covered more than 200 elections, and the company will prioritise listening to experts on the ground, civil society, regulators and authorities to understand the trends, potential challenges and gaps.
“That is the reason why for us in South Africa it is really important that we remove anything that is harmful content from our platform.”
The company has also partnered with 10 national and community radio stations to raise awareness of misinformation and hate speech, teaching listeners how to detect such content and what to do when they come across it.