New ASU center to fight disinformation campaigns that threaten democracy



The internet is open to everyone, and that democratization has a dark side. Disinformation is flourishing and affects everything from elections to public health.

The Global Security Initiative at Arizona State University has always focused on how disinformation influences people, and it has now dedicated a new unit to that research — the Center on Narrative, Disinformation, and Strategic Influence.

The center will use an interdisciplinary approach to researching disinformation campaigns and developing tools to combat them, according to Scott Ruston, a research professor who will lead the new center, housed within the Global Security Initiative. He has already done extensive work in this area, tracking propaganda operations around the world.

“We aspire to facilitate research that would go in this general problem space of how humans understand the world and how nefarious actors manipulate the environment for degradation of democratic values and faith and trust in institutions that undergird our democracy,” he said.

“This is the kind of work that tells people in the federal government that this is worth pursuing, this is a valid and valuable way of going after tough problems,” he said.

The spread of disinformation is a major security challenge that is harming our country and the world, according to Nadya Bliss, executive director of the Global Security Initiative.

"Whether spread by domestic actors for political gain or by foreign adversaries as a tool in geopolitical competition for global influence, disinformation is tearing away at the fabric of democratic institutions and our ability to unify around a common purpose," she said.

"At ASU, we have been working on this issue for years. The launch of this center builds on that work and brings together strengths in narrative theory, computer science, social sciences, humanities, journalism and other disciplines so we can better understand, identify and combat the purposeful spread of false or misleading information.”

Ruston answered some questions from ASU News:

Question: What will the center do?

Answer: The goal is to fuse the theories and methods of humanities research, and scientific research methods, with computer science and other technologies to better understand how humans make sense of the world around them, and how adversarial actors exploit that sense-making in the contemporary information environment.

What we’re doing is helping preserve the democratic values and principles that we have seen come under threat in the last few years.

Most of the phenomena we study are not new. It’s how they’re being leveraged and accelerated that’s new.

Q: You approach disinformation from a “narrative” perspective. What does that mean?

A: The underlying premise is that humans make sense of the world around them via the stories they tell themselves, and the stories they are told, and how they put them into meaning-making structures. That’s how we conceive narrative. It's a slippery concept.

What comes to mind is the story of George Washington and the cherry tree. It’s a national myth taught to schoolchildren as a hero origin story about honesty. It’s part of a larger ethos of the American Revolution and what America stands for.

The Boston Tea Party is a specific story about a bunch of Massachusetts figures who dress up, jump on a ship and throw some tea overboard. But it communicates certain values: resistance in the face of oppression, resistance to a distant government and resistance to taxation without representation.

Taken together, these stories create the narrative arc of the revolution and the birth of the country. Political protest is a valued principle. Honesty is a valued principle. There’s a narrative consistency to the stories.

At the end of the arc is the founding of the U.S.

Q: How does that apply to what the center does?

A: We look at how the contemporary environment tells stories and how those stories nest into larger narratives.

One research project focused on Russian propaganda in Latvia, where they flooded the information environment with stories about corruption in the Latvian legislature and in Latvian municipal government. They told stories about how pending legislation would be discriminatory against Russian speakers residing in Latvia. They told stories about how the Latvian government is beholden to NATO as a puppet master. They told stories about how NATO military exercises in Latvia tear up farmers’ fields and destroy their livelihoods.

We have all these little stories that paint a larger arc – that Latvia is a country in conflict, beset by corruption that is directed against Russian speakers. NATO is a global elite that doesn’t care about the little guy. The conclusion is that you should become an ally of Russia.

Q: So is it “fake news”?

A: To equate disinformation with things that are untrue is a misunderstanding of the phenomenon.

A campaign may include specific pieces of information that are false, misleading or inaccurate, but it is not built solely from untruths. Those pieces are accumulated into a narrative, and the narrative does the work of disinformation, which has the goal of political influence.

That’s the sweet spot of the center.

Q: Is Russian propaganda a focus?

A: We’re agnostic on what part of the globe we look at. We’ve had projects in the Baltics, Eastern Europe, Southeast Asia.

The consistent piece is that we’re always looking at how humans understand the world, and how that understanding is affected by 21st-century information exchanges that are accelerated by new technologies and layered with some degree of malign influence – particularly influence perpetrated by an adversarial actor for the purposes of strategic influence.

If there’s a lone voice in the wilderness spouting all kinds of disinformation about some topic, we’re not going to pay attention to that single actor. We’re interested in larger-scale disinformation campaigns tied to geopolitics.

Q: How do you apply social science research?

A: A good example of how we approach studying disinformation and influence and propaganda was the project in Latvia.

We collected a large set of news articles published by known Russian state-sponsored propaganda outlets and by known news outlets with a pro-Russia bias.

We read the articles and looked for evidence of framing, a communication and rhetorical practice that guides the interpretation of a set of facts. The principle is that framing tells you not what you’re supposed to think, but what lens to think about it through.

An example is an article that said that yesterday in Latvia, veterans of the Latvian Legion marched to commemorate their service in World War II, and there were cheering crowds.

The Latvian Legion fought against the Red Army. From the Russian, or Soviet, perspective, these were Nazis. From the Latvian perspective, they were nationalist heroes who happened to be funded by the Nazis.

The actual guys are in their 90s now. There were a few hundred people in downtown Riga watching.

The Russian press frames it as cheering crowds, and that framing presents the event as evidence of the appeal of fascism in the Baltic country.

The theories behind framing come from the social science of human communication. We use social science to detect it reliably.

Q: How do you do that?

A: We had thousands of texts, like news articles. It was far more than we could read on our own.

We wanted to train a computer program that would read the texts and add the same labels that the humans would have. It uses data mining techniques to distinguish a sentence that frames the rise of fascism or corruption in local government from a sentence indicative of discrimination against ethnic Russians, and from sentences that do none of those things. The machine classifier was able to detect those frames with a high degree of accuracy and consistency.
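As a minimal sketch of that kind of supervised frame classifier, the code below assumes hand-labeled sentences and the scikit-learn library; the example sentences, frame labels and model choice are illustrative assumptions, not the center’s actual data or classifier.

```python
# Illustrative frame classifier: TF-IDF features plus logistic regression.
# The sentences, labels and model choice are assumptions for this sketch,
# not the actual data or classifier used in the Latvia project.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical sentences hand-labeled by human coders with the frame they carry.
train_sentences = [
    "City officials were accused of funneling public funds to relatives.",
    "Prosecutors say the mayor's office ignored years of bribery complaints.",
    "The new language law unfairly targets Russian-speaking residents.",
    "Ethnic Russians say they are shut out of government jobs.",
    "Veterans' marches show fascist sympathies are growing in the capital.",
    "Commentators warn that extremist nationalism is on the rise.",
    "The weather in Riga was mild over the weekend.",
    "The museum opened a new exhibit on medieval trade routes.",
]
train_labels = [
    "corruption", "corruption",
    "discrimination", "discrimination",
    "fascism", "fascism",
    "none", "none",
]

# Fit the classifier on the human-labeled examples.
frame_clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
frame_clf.fit(train_sentences, train_labels)

# Apply the same kinds of labels automatically to sentences no human has read.
new_sentences = [
    "Auditors found missing money in the municipal budget.",
    "The parade was framed as proof of resurgent fascism.",
]
print(list(zip(new_sentences, frame_clf.predict(new_sentences))))
```

In practice, a model like this would be checked against held-out, human-labeled sentences before being trusted on thousands of unread texts, in line with the accuracy and consistency testing Ruston describes.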

Q: Why is the social science component important?

A: The vast majority of work in this area from other universities and think tanks tends to be computer science heavy – data mining, information science, network science, social network analysis. They bring in social science, sociological and communication principles, but not with the level of sophistication we apply. They’re not truly interdisciplinary.

We are adapting social science to meet computer science and adapting computer science to meet social science.

Our approach to narrative draws heavily on the humanities: how literary, film and media studies and the whole subfield of narratology approach the study and analysis of narrative.

Q: What was the result of the Latvia project?

A: We could plot curves on a graph that showed the relative distribution of the frames within the propaganda material. The most important thing we could determine was when Russia changed its mind. It had been pushing the discrimination theme a fair bit, and the corruption theme a fair bit, but pretty suddenly in early 2018 it started pushing the discrimination element significantly more than the fascism element.

My thought about why they shifted gears is that they decided the discrimination angle was the thing that would be most influential in the parliamentary election in fall 2018.
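One plausible way to produce curves like the ones described above is to compute each frame’s monthly share of classifier-labeled items and plot those shares over time; the sketch below uses synthetic dates and labels, not the project’s data, so a sudden shift such as the early-2018 one is only simulated here.

```python
# Rough sketch of a frame-prevalence trend analysis with synthetic data.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical classifier output: one row per labeled item, with its date.
df = pd.DataFrame({
    "date": pd.to_datetime([
        "2017-10-03", "2017-11-12", "2017-12-01",
        "2018-01-15", "2018-02-20", "2018-03-08",
    ]),
    "frame": [
        "corruption", "fascism", "discrimination",
        "discrimination", "discrimination", "corruption",
    ],
})

# Monthly counts per frame, converted to each frame's share of that month.
monthly = (
    df.groupby([df["date"].dt.to_period("M"), "frame"])
      .size()
      .unstack(fill_value=0)
)
shares = monthly.div(monthly.sum(axis=1), axis=0)

# A sharp rise in one curve (e.g., "discrimination" in early 2018) is the
# kind of shift the researchers describe spotting in the propaganda material.
shares.plot(marker="o")
plt.ylabel("Share of propaganda items per month")
plt.show()
```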

Q: What kind of outcomes will the center produce?

A: A lot of our research is sponsored by organizations affiliated with the government and particularly agencies that are tied closely to national security. Right now, we have four projects funded by the Department of Defense and one anticipated to be funded by the Department of Homeland Security.

So internal reports to the sponsoring agency are one outcome. The Latvia project was funded by the Department of State. We filed periodic reports through the course of that project that went into their library of available insights about disinformation and propaganda in different parts of the world.

Publications that contribute to academic fields are another. One of the computer science graduate students published a paper about his approach to identifying inauthentic behavior.

Other projects generate things like computer algorithms. The computer algorithm that the graduate students produced would be available to the State Department to use.

We have a project now in which the goal is to develop a computer system that incorporates a wide range of different algorithms that analyze media in different ways. Its ultimate application would be in a government or security agency to analyze manipulated media, like deep fakes.

What is commonly meant by deep fakes are videos altered to make it appear that a person is doing or saying things they didn’t do or say, in a way that looks natural. It’s important to determine whether that was done for entertainment or humorous purposes or for nefarious purposes.

The program is called “SemaFor,” short for “semantic forensics.”

This follows the interdisciplinary model: the bulk of the work is computer science, but the ASU team’s contribution comes from the Walter Cronkite School of Journalism and Mass Communication, bringing insights about how the journalism industry works.

At the end of this three-year project, the prime contractor, a company, intends to deliver a computer system to the government that would be able to ingest any piece of data, such as a video, news article or social media post, run those algorithms and spit out a score that assesses whether the piece of information has been falsified or manipulated. Then it attributes what kind of manipulation has taken place and characterizes whether it’s disinformation.

That’s sponsored by DARPA, which takes really tough problems and throws as much as it can at them, solving bits and pieces that can be put together.
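A highly simplified sketch of that multi-algorithm scoring idea appears below; the detector functions, names and aggregation rule are placeholders assumed for illustration, not the actual SemaFor components or architecture.

```python
# Illustrative ensemble scoring pipeline: run several independent detectors
# over a media item and aggregate their outputs into one manipulation score.
# Detector names and logic here are hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class MediaItem:
    kind: str       # e.g., "video", "news_article", "social_post"
    content: bytes

# Each detector returns a manipulation score in [0, 1] for one kind of check.
Detector = Callable[[MediaItem], float]

def face_swap_detector(item: MediaItem) -> float:
    return 0.0  # placeholder for a video-manipulation check

def text_style_detector(item: MediaItem) -> float:
    return 0.0  # placeholder for a text-inconsistency check

def metadata_consistency_detector(item: MediaItem) -> float:
    return 0.0  # placeholder for a provenance/metadata check

DETECTORS: Dict[str, Detector] = {
    "face_swap": face_swap_detector,
    "text_style": text_style_detector,
    "metadata": metadata_consistency_detector,
}

def score_item(item: MediaItem) -> Dict[str, float]:
    """Run every detector and aggregate into a single manipulation score."""
    scores = {name: detect(item) for name, detect in DETECTORS.items()}
    scores["overall"] = max(scores.values())  # pessimistic aggregation
    return scores

if __name__ == "__main__":
    item = MediaItem(kind="news_article", content=b"example bytes")
    print(score_item(item))
```

In a real system of this sort, attribution and characterization of the manipulation would sit on top of scores like these; this sketch only shows how multiple analyses might feed a single assessment.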

Q: How do you work across the ASU community?

A: We want to leverage the wide range of talent at ASU, and to that end we run a disinformation working group with approximately 20 faculty members from departments and schools across the university and across all campuses. We have faculty from library science, theater and psychology. We get together regularly to triage what’s going on.

The field is moving incredibly fast. When we first started the working group about a year ago, about half of the participants hadn’t heard of the term deep fake.

Q: Will you look at disinformation locally?

A: We recognize that as researchers within a public university, we owe back to the public some of the benefits of our research, so it doesn’t just get published in an esoteric journal that other scholars in our field read, and so the benefit doesn’t just go back to the sponsoring agencies but also is realized by the ASU community, the Phoenix metro community and the state of Arizona.

We do some small-scale efforts scanning the information environment of Arizona to identify trending elements of disinformation. We are aspiring to develop that capability so we could provide answers to the public if we caught wind of a disinformation campaign in Arizona.

Q: Are we living in a unique period of disinformation?

A: There are not a lot of people left who were adults at the turn of the 20th century, when there was a rapid transformation to electrified communication with the invention of the telephone and, later, the television. There were massive changes in society in a short period of time. There were significant changes in social norms.

What we’ve experienced in the last 30 years is a similar epochal shift in information exchange.

The production and distribution of information have been democratized, and in the early days of the internet, that was viewed with utopian zeal. It was, “More information produced by more people should be better for everyone.”

But we’re seeing the dark side, that the wild west of the information environment is ripe for exploitation.

And we ask, “How can our citizens and institutions – educators, the press, the judiciary, the legislature, civic groups – better defend against that exploitation?”

Top image by Pixabay.com
