Daniel Kimmage is Principal Deputy Coordinator of the State Department’s Global Engagement Center.
A senior U.S. official discusses how the government and its partners are using technological innovation, social media engagement, and other tools to challenge false narratives spread by state and nonstate actors.
On February 17, The Washington Institute held a virtual Policy Forum with Daniel Kimmage, the Principal Deputy Coordinator of the State Department’s Global Engagement Center (GEC). The following rapporteur’s summary focuses mainly on the event’s question-and-answer session; for his full remarks, download the PDF transcript or watch the video.
Recognizing the burgeoning threat posed by disinformation, Congress expanded the Global Engagement Center’s mission in 2017. Today, the GEC has positioned itself as the hub of a broader effort to counter disinformation by heightening inter- and intra-governmental coordination. Its mission is driven by data and analysis. The center works with a team of more than thirty data scientists who analyze open-source materials and produce actionable products for partners. It also strives to position itself at the forefront of technological innovation, especially technologies for countering disinformation.
To deprive bad actors of the space to conduct their malign influence activities, the GEC works on a global scale. Over the past year, it has conducted analysis touching on more than seventy-seven countries. These analytic products are designed to be actionable and unclassified in order to encourage information-sharing and meet the needs of customers and partners both foreign and domestic. The GEC’s Technology and Engagement Team tracks various social media platforms while communicating with threat teams to link its research to relevant national security matters. Likewise, GEC analytic products draw on information gleaned from threat teams, often at the local level. Such local knowledge is essential because preferred modes of communication and propaganda dissemination differ across borders and ideologies; in Taiwan, for example, Line is the most popular social messaging app. Ultimately, analytic products in a given country seek to answer the most critical questions: what platforms are people using, where, and for what purpose?
Crucially, the GEC values its relationships with academic institutions and think tanks, embracing the desire for open-source information and striving to foster public-private collaboration. For example, it recently embarked on a one-year research project with the U.S. Institute of Peace to examine online communications exchanged by foreign racially and ethnically motivated violent extremist (REMVE) actors.
Countering Terrorist Propaganda and State-Driven Disinformation
Terrorist organizations of various ideologies use social media and disinformation to multiply the impact of their operations by sharing multimedia content, sensationalizing their attacks, and appealing to audiences around the world. They also use social media platforms to recruit new members.
At its zenith, the Islamic State (IS) produced a great deal of high-quality content that allowed it to craft a particular narrative and project an image of power. The group also enjoyed greater freedom of online movement at the time, which made its content and contacts more readily accessible to potential recruits. Today, IS propaganda production is far more difficult. In addition to losing its “caliphate,” the group now operates in a more hostile social media landscape, making the content it does produce far less accessible. Thanks to more effective coordination among social media companies, orchestrating cross-platform terrorist propaganda campaigns has become nearly impossible. To take advantage of this new space and disrupt extremist narratives, the GEC works with local and state-level partners to provide counter-narratives from former extremists, religious figures, and community leaders.
GEC threat teams track, analyze, and counter state-driven disinformation efforts as well. The center began reporting on coronavirus-centered propaganda in January 2020, and it continues to work with partners in and out of government to identify falsehoods. In particular, Russia and China have capitalized on the uncertain pandemic information landscape, with Moscow encouraging the spread of conspiracy theories and Beijing pushing dangerously false narratives about COVID-19’s origins. In fact, the center has witnessed a convergence of Russian and Chinese narratives, with each utilizing well-developed and sophisticated disinformation infrastructure to spin politically advantageous narratives. Further, Beijing has now adopted the Russian approach of using troll farms and similar tactics to further its disinformation campaigns. To counter these strategies, the GEC has provided rapid-response grants to community-level organizers working to expose disinformation and reverse the flood of falsehoods.
The center also works to educate global partners about the threat posed by Iranian propaganda, including efforts to erode confidence in democratic elections. The Islamic Republic uses state-driven disinformation to improve its global standing, minimize its domestic issues, undermine U.S. credibility, and conduct influence operations. The GEC’s efforts to address these campaigns often intersect with its counter-terrorist propaganda work, since Iran remains one of the foremost state sponsors of terror. Lebanese Hezbollah and other Iranian militia proxies often operate media outlets and carry out online influence and cyber operations that serve Tehran’s interests. To counter such narratives, restore local faith in media, and provide reliable sources of news, the GEC supports fact-based journalism in the Middle East.
Technological Innovation and Resiliency Programs
Understanding how adversaries spread disinformation and what tools they have at their disposal helps the GEC devise tactics to confront their constantly evolving strategies. For example, the center hosts biweekly Tech Demos in which private-sector firms discuss emerging technologies, as well as Tech Challenges in which foreign companies showcase new tools that can be used to counter disinformation. The GEC also created the Technology Testbed and Disinfo Cloud—the former allows U.S. agencies to experiment with promising tools from the Tech Demos, while the latter is an online, open-source platform that shares findings on new technologies.
Although the GEC furthers its mission by working with major social media platforms like Facebook and Twitter, the truth is that any platform can host disinformation and terrorist propaganda. Fortunately, these same platforms can also be employed against dangerous narratives. The GEC places great value on building resilience among audiences likely to be targeted by propaganda campaigns. In collaboration with the Department of Homeland Security, it released “Harmony Square,” a virtual game in which users spread disinformation and work to undermine community trust. The game draws on “inoculation theory” to show users how disinformation is shared and build their resiliency against real-world propaganda efforts.
Still, the GEC recognizes that the U.S. government is not always the best communicator to a foreign audience, so its partnerships with embassies, local actors, and religious leaders are all the more essential. The center often coordinates with community-level partners to assess major threats, better understand the affected audiences, and collaborate on best practices. In particular, it has worked with teachers and youth leaders in East Africa to detect signs of radicalization and build local resiliency to disinformation. This includes the far-reaching “Somali Voices” program, in which local partners build websites and social media platforms focused on counter-messaging and interrupting terrorist propaganda.
This summary was prepared by Lauren Fredericks. The Policy Forum series is made possible through the generosity of the Florence and Robert Kaufman Family.