Innovation

Disinformation and Deepfakes Fuel Growing Mistrust

The IABC Trends Watch Task Force identified and researched six key trends shaping communication in 2021 and beyond. In 2020, the growing mistrust in agreed-upon facts and a global decline in trust of all forms of media accelerated. This article presents research about disinformation and deepfakes and why communicators must take a stand in identifying and stopping the damage disinformation could cause in their own organizations. Read on to understand why this trend is relevant to IABC members, and download the full 2021 Trend Report here.

The growing mistrust in agreed-upon facts and a global decline in trust of all forms of media accelerated in 2020. In the most recent Edelman Trust Barometer of 33,000 participants worldwide, the threat and negative effects of disinformation campaigns were a key takeaway. (For additional analysis and talking points, see the Catalyst article from Mike Klein, IABC task force member, “Talking about Trust: IABC Insights From the Edelman Trust Barometer.”)

Heightened polarization in many regions of the world and the rise of tribalism contribute to a narrowcasting of news that reaches only those who already share a point of view. On social media forums in particular, there appears to be both a free-for-all of sources purporting to be experts and a censorship of content and sources applied inconsistently, by standards that seem to shift in response to crowd thinking.

The declining credibility of the media is tied to polarized viewpoints (i.e., “my station is the only one that does not spread lies”), but that does not explain everything that makes us skeptical about content on all media formats, including social media. The trends that may well be deepening polarization are the rapidly increasing technical capabilities for news and information manipulation, along with the continued power, growth and consolidation of giant news platforms. These include cable, network and publishing conglomerates, and platforms like Google, Facebook, Snapchat and others.

Potential Impact and Scope

Since 2017, when the first fake videos appeared on the internet as pornography, huge leaps in the sophistication of technology and video production have made the capability attractive to good actors and bad actors alike. Now-accessible, inexpensive technology makes it incredibly easy to manipulate content and facts for entertainment, marketing and information, as well as for sinister purposes.

Deepfakes are among the communication tools that pose a significant risk to trust in communications. A blend of artificial intelligence (AI) “deep learning” and “fakes,” deepfakes use facial recognition and audio technology to produce AI-synthesized media. According to Sentinel, a global organization founded by ex-NATO cybersecurity experts to detect deepfakes, primarily for government clients, the number of deepfakes online has grown 900% in the last year, amassing close to 6 billion views.

The impact of deepfakes on credibility cannot be overstated in a world where a false image can be shared millions of times in a few seconds. Videos of politicians and business leaders appearing to say words they never used have targeted figures including Speaker of the United States House of Representatives Nancy Pelosi, the president of Gabon, a major UK financial institution, Jeff Bezos and Tesla’s Elon Musk.

In Gabon, a suspected deepfake video of the country’s president (who had been in ill health) sparked an unsuccessful coup. In the United States, a manipulated video of Pelosi altered her speech to make her appear drunk; it was widely circulated by her political opposition.

Disinformation and the Conditions That Enable It

In January 2021, Edelman published its latest survey, completed by 33,000 people in 28 countries, which indicates a growing "epidemic of misinformation," according to Edelman CEO Richard Edelman, as reported in a CBS News commentary.

"We have an infodemic, and in short, we don't trust the sources of information, meaning we don't trust the media, it's seen as politicized, biased, and we don't trust the people who are speaking," Edelman said to CBSN.

Advances in AI also allow “black box” algorithms to automate precise distribution control for media publishers and online platforms. These algorithms can determine which news, which version, how often, and where and how content is displayed. For marketers, micro-personalization of content is a dream come true. For media companies, and for those who rely on the media as a viable third-party source of accurate information, it is both boon and threat. For the public and for governments, the jury is still out.

While AI image/text monitoring allows platforms to remove violent images or incitement to violence (think Christchurch massacre images pulled country by country on Facebook), it also enables content suppression, a “cancel culture.” Growing populist fury at real or imagined suppression of information fuels mistrust and could make it almost impossible for organizations to get the facts out about unpopular positions.

In late 2020, Twitter, Facebook and YouTube joined in a fight to remove conspiracy theories and any disinformation about new COVID-19 vaccines. The problem is that while much disinformation still gets through by substituting code words, organizations trying to reach their employees and community members have had their channels temporarily blocked without warning.

Other examples of business and financial damage are included in an excellent report by Alex Moltzau on Medium, published in January 2020.

Who Is Most Directly Impacted by This Trend?

Communicators for governments and government agencies already face critical challenges related to disinformation and the impact of mistrust in leaders and in institutions. This threat could easily migrate to business leaders; wherever they may stand on political issues, they are an “authority” institution that, by its leaders’ actions or inaction, could be a target.

Organizations that are now in the public eye, or may be in the future, and those who need to reach constituents through the media will be most impacted by these issues. External communications and public affairs will be increasingly challenged, as traditional approaches will be inadequate if they don’t address new and constantly changing technology and political realities.

Corporate and investor relations professionals for publicly traded companies rely on timely, accurate third-party information about their stocks and the company, including media reports. If even an unchecked rumor can send a stock plunging or skyrocketing, calculated disinformation using deepfake technology could be too difficult to detect until much damage had been done.

Public affairs and communications professionals who operate in a climate of division on public/private issues will be affected, as opposition groups have new tools that are not as easy to monitor and that require fast responses.

What Do IABC Members Need to Know and Do by Audience?

Employees

Businesses and business communicators have a huge opportunity, according to the Edelman survey. Employees have long placed more trust in business leaders than in their governments, but this year that gap reached a record level: 61% of respondents said they trusted business leaders, versus 53% who trusted government (the survey covered 28 countries).

Other research shows that employees increasingly want their companies to take a stand on social issues. Often, that stand involves communicating through key influencers, such as respected media. With declining trust and narrowcast channels, companies need to be able to demonstrate they are taking a public stand while giving great attention to where and how the message is delivered, as many media channels now carry their own bias baggage, real or perceived.

The 2019 Edelman Trust Survey showed that while trust in many institutions and the media had declined, employees were much more likely to trust managers in their own organizations. While we do not know if this will be true in 2021 and beyond, we do see some corporate leaders embracing an antidote to declining trust. That antidote, for some, is radical transparency. See more on the use of radical transparency in trend No. 6 below.

Customers

In a world of deepfakes and misinformation, branding takes on a new imperative. Owning, protecting and consistently using your brand over trusted channels will be essential to maintaining customer trust. Some notable marketing campaigns and commercials have recently used AI technology to produce ads in which celebrities or sports personalities appear to be together, or in a digitally created location. Disclosure of these techniques is vital to maintaining trust, even for the most entertaining content.

The use of manipulated images of people, whether to protect identities or purely for promotional reasons (a happy gallery of Zoom call participants in a screenshot), should also be disclosed. The law may not be there yet; be ahead of the regulations.

Influencers/Media

Communicators should work with media channels to understand their own capabilities to identify and eliminate the threats of disinformation and deepfakes. When corporate leadership is being interviewed, it is even more critical to work to identify who else is providing information and to have the knowledge and resources to assess their credibility and veracity.

Investors

Understanding the potential and monitoring for deepfakes and other disinformation that can impact a stock will be a critical skill. Investor relations and corporate communication professionals may need to find immediate and alternate channels to correct information when a media report is false. This will require working with attorneys and following stock trading laws and regulations that are unique to each country and market exchange.