
Can computing combat climate change?

October 12, 2021

ASU expert highlights role of computing research in addressing climate change-induced challenges

The public may be divided over climate change issues, but the Pentagon and national security community are not. Secretary of Defense Lloyd Austin has said climate change is making the world more unsafe, citing accelerating security issues like pandemics and instability, mass migration, conflict over resources and natural disasters.

Nadya Bliss, executive director of Arizona State University’s Global Security Initiative, recently co-authored a white paper for the Computing Community Consortium that highlighted the role of computing research in addressing climate change-induced challenges.

The consortium catalyzes the computing research community with debate and discussion.

“When you have boots on the ground, you're not going to be stressing about partisan politics,” Bliss said. “You're going to be like, it's hotter in our areas of deployment. We need to protect our troops.”

Bliss and her co-authors examined six key areas where these challenges will arise: energy, environmental justice, transportation, infrastructure, agriculture, and environmental monitoring and forecasting. They identified specific ways in which computing research could help, using devices and architectures, software, algorithms/AI/robotics and sociotechnical computing.

Question: You've talked about monitoring energy in the environment. Where do you see gaps in these areas? The government and a lot of the private sector have sensors all over the Earth. What's not being covered? And what are we missing from them that computing research could fill in?

Answer: There are a bunch of different layers in various energy systems and in how they interact with climate and weather patterns. Often those things are not connected. So if you think about it from a systemic perspective, a particular power company may have clear monitoring of the energy needs of the community it serves. But if it's not connected to the state next door, what is happening there? What kinds of things are happening that would potentially cause people to increase or reduce their energy consumption, and how could the company optimize accordingly?

If we think about the blackout in Texas this year, it was a complex confluence: a very specialized measurement infrastructure, optimized for efficiency as opposed to resiliency. A lot of our systems tend to be optimized for efficiency rather than resiliency. You can essentially create a system that's so complex, complicated and multilayered that all you're doing is trying to figure out how to do analysis, instead of building systems that actually help you make decisions. Accepting that a lot of different facets of these challenges are interconnected, and implementing for those interconnections, could benefit society at large.

Q: Transportation: If all these elements — traffic, goods, people, pollution, energy availability and needs, etc. — are monitored, what could be improved in the name of climate crisis?

A: If we have a clear sense of various distribution patterns, we can absolutely increase the efficiency of transportation networks — for example, if we could understand where we need to have electric vehicle charging stations based on traffic patterns. Another thing that we can understand is the flow of how people go to certain places, right? Minimize congestion. It's well documented that when non-electric cars are idling in traffic, they are very wasteful with their gas consumption. If you could dynamically reroute traffic based on what is happening on the road, you could potentially reduce that congestion. This is a very computationally intensive problem.
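At its core, the rerouting Bliss describes is a shortest-path computation over a road network whose travel times change as sensors report congestion. A minimal sketch, using a hypothetical toy network (the road names and travel times are invented for illustration, not drawn from any real system):

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a road network whose edge weights are
    current travel times (e.g., as updated from roadway sensors)."""
    # graph: {node: [(neighbor, travel_time_minutes), ...]}
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Reconstruct the path by walking predecessors back from the goal.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

# Congestion has tripled travel time on the direct road A -> B,
# so the router prefers the detour through C.
roads = {
    "A": [("B", 30), ("C", 10)],
    "C": [("B", 10)],
}
route, minutes = shortest_route(roads, "A", "B")  # ["A", "C", "B"], 20
```

The computational intensity Bliss mentions comes from running this kind of search continuously, city-wide, as edge weights shift in real time.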

Q: Infrastructure: Any kind of sensor that could monitor the state of a bridge or road (and a database to manage the information) is easy to understand. How would AI/robotics/algorithms assist that?

A: You can have very cheap sensors embedded in roads: low-power, specialized computing chips that sense the load on the roads and can communicate broadly and say, well, this traffic area is really congested, but this one isn't. So how do we do this? Some of this, I imagine, is already being done, but you can optimize traffic lights to adjust to traffic patterns. These are just some examples of optimizations, but I think broadly there's a lot of opportunity to make a lot of these systems more efficient.

Q: Finally, agriculture. Modern farms are swarming with tech, measuring things like the moisture content of soils. Tell me how tech, like systems for the deployment and control of autonomous vehicles (e.g., UAVs, or unmanned aerial vehicles) and algorithms that leverage rich sensor data together with real-time information about economic factors and transportation networks for planning and risk assessment, could help improve farming.

A: There could be optimization of things that are predictive, like weather and climate patterns. I am definitely not an agriculture expert, but it's a similar set of things, right? It's embedded sensing connected to broad multilayer modeling that could allow you to make decisions and optimize away inefficiencies, potentially leading to savings in energy.

Q: What impact does your collaborative research bring for the ASU community?

A: One of the things I really appreciate about the (Computing Community Consortium) is working with my colleagues across the country. I learned a ton from each and every one of them. It also gives me an opportunity to have ASU be at the table for these conversations, as we're shaping and informing the national agenda for computing research. I get to bring to bear all of the experiences and expertise that we have here at ASU to that discussion.

Top image by Comfreak/Pixabay

Scott Seckel

Reporter, ASU News



Cybersecurity competition challenges next generation of security experts

September 30, 2021

ASU's Global Security Initiative wraps up Capture the Flag competition at DEF CON

Every year, the gladiators of hacking meet to sharpen their skills and compete in the world’s most elite digital coliseum — DEF CON.

A pillar of the cybersecurity industry, DEF CON is one of the world’s largest hacking conventions, with its first event taking place in 1993. It offers hands-on hacking opportunities, workshops and presentations from government, industry and education experts in the field. Attendees include those interested in protecting software, computer architecture, digital infrastructure and anything vulnerable to hacking.

Since 2018, faculty, students and staff with the ASU Global Security Initiative’s Center for Cybersecurity and Digital Forensics have organized DEF CON’s signature event, the Capture the Flag competition, which has multiple security challenges that competitors must identify and resolve. Hundreds of teams from all over the world compete each year to make the final round, with 16 teams emerging as finalists.

“Our goal is to identify the best hackers on the planet. We designed this competition to demonstrate just that,” says Adam Doupé, director of the Center for Cybersecurity and Digital Forensics and associate professor in ASU’s School of Computing and Augmented Intelligence.

Through the Capture the Flag event, Arizona State University has helped thousands of people develop an adversarial mindset — an understanding of how an adversary thinks, what information is valuable to them and what sort of tactics they may deploy. This knowledge is crucial in today’s world where cybersecurity professionals need to identify vulnerabilities before bad actors do.


DEF CON’s Capture the Flag (CTF) is an example of ASU putting into practice its mission of creating social impact and helping learners build the knowledge and skills needed to thrive in today’s workforce.

“The university has a huge appetite for real impact, and one challenge we face in academia is showing that ideas being explored are relevant — DEF CON allows us to do that,” said Yan Shoshitaishvili, Center for Cybersecurity and Digital Forensics researcher and assistant professor at the School of Computing and Augmented Intelligence. “ASU is the top university to attend for cybersecurity. The people in charge of the ‘Olympics of hacking’ are also professors you can learn from.”   

This year’s DEF CON, which was held in Las Vegas on Aug. 5–8, concluded ASU's hosting of the Capture the Flag competition, as organization of the event rotates every few years. In 2020, the team pivoted to a fully virtual environment due to COVID-19. This year, the event was half remote, half in person.

“The team persevered, and I am proud to call this our last year hosting DEF CON CTF,” Doupé said.

As the United States continues to see threats to the nation's security and infrastructure, ASU professors have found that this competition brings to light just how much impact education and research can provide.

“With so much of our lives taking place online, cybersecurity is everyone’s concern. By organizing one of the world’s premier cybersecurity competitions, the university’s Global Security Initiative demonstrates the importance ASU puts on solving problems that affect everyone, all while training the next generation of security experts," said Sally C. Morton, executive vice president of Knowledge Enterprise at ASU.

Doupé said, “We try to translate academic research into practical application, which is where we’ve seen some of the best ideas and techniques disseminate. It’s very difficult to apply a theoretical concept from an academic paper until you’ve actually done it.”

One distinctive characteristic of the Capture the Flag competition is that despite the high caliber of competitors, anyone can try these hands-on challenges. The game’s architects are dedicated to the philosophy of applying theory to everyday situations and providing these kinds of advanced skill-building opportunities to anyone who is interested. To accomplish this, they have uploaded challenges used in the tournament for easy access.
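To give a flavor of what "anyone can try" means, here is a deliberately tiny, hypothetical challenge in the CTF spirit (not an actual DEF CON challenge): the flag is stored XOR-encoded, and a player must reverse the encoding to recover it.

```python
# A toy reversing challenge: the flag is XOR-encoded with a one-byte
# key, and the checker only reveals whether a guess is correct.
# (Both the flag and the key here are invented for illustration.)
SECRET = bytes([0x0f, 0x34, 0x36, 0x35, 0x39, 0x31, 0x17, 0x3f])

def check_flag(candidate: str, key: int = 0x5A) -> bool:
    """Return True if the submitted flag XOR-encodes to SECRET."""
    encoded = bytes(b ^ key for b in candidate.encode())
    return encoded == SECRET

# A player can brute-force the 256 possible one-byte keys, decode
# SECRET with each, and spot the readable plaintext: "UnlockMe".
```

Real tournament challenges layer many such techniques, but the loop is the same: understand what the program does, then defeat its check.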

“When we look back at the history of DEF CON CTF, the same techniques and challenges we do now will be standardized five to 10 years from now for anyone in cybersecurity,” Doupé said.

Student competing in Capture the Flag, DEF CON 29

A Capture the Flag participant.

Organizers are embedded within ASU’s network of cyber educators, and the Global Security Initiative team tailored competitions from their own areas of expertise. Over the four years of the initiative's involvement, 3,229 teams from around the world competed in the Capture the Flag qualifiers and finals, logging 276 hours of active game time. ASU faculty, staff, graduate students and external collaborators created 176 custom challenges.

Zion Basque, an ASU student pursuing a PhD in computer science with a focus on cybersecurity who competed at DEF CON 29, aims to be the best of the best hackers while making the world a better place with his technical skills.

"The competition really puts your field into perspective. Engaging with and against world-class hackers makes you understand just how much this field has to offer,” Basque said. “As a PhD student, publishing papers is not enough. I believe good security should be applied to real-world situations. I am inspired by everything at DEF CON, helping the community and working hard toward my dreams.”

The Global Security Initiative will continue to stay connected to the Capture the Flag community by inspiring DEF CON collaborators and competitors.

“We wanted our last year to be exceptional — pulling out all the stops on the novelty and scale of our challenges,” Shoshitaishvili said. “I am passionate about what DEF CON represents: an opportunity for aspiring hackers to find resources and inspiration.”

Shoshitaishvili and Doupé host a podcast exclusively focused on CTF competitions called CTF Radio.

The development of technical skills and applied, accessible knowledge is central to DEF CON and Capture the Flag. GSI is committed to increasing cybersecurity literacy for all learners, and DEF CON Capture the Flag has been a key pillar of these efforts.

“The best thing about DEF CON CTF is that it brings people together,” says Debbie Kyle, Center for Cybersecurity and Digital Forensics (CDF) project manager. “As the realm of cybersecurity continues to evolve, players will continue to rise to that challenge, and that’s exactly where CDF wants to be – right in the middle of the action.”

Top photo: The Capture the Flag team at DEF CON 29.

Oliver Dean

Communications Specialist, Global Security Initiative



Addressing software bugs

September 27, 2021

ASU tackles the problem of fixing software vulnerabilities through micropatching

From the recent Facebook data breach of over 500 million accounts, to stories of hackers getting into family homes through baby monitors, we are constantly bombarded by headlines of hackers taking advantage of security vulnerabilities in the software we use every day.

The truth is, humans create software, and humans are imperfect. Sometimes developers accidentally introduce a bug that leaves the software vulnerable to attacks. As soon as a company or software developer discovers a vulnerability that could harm the user or leak data, the critical solution is to provide a fix as soon as possible. This is where patching comes in.

Patching can be thought of as a fix for a computer program, kind of like duct tape around a loose wire to keep it from wiggling. We are all familiar with the alerts on our devices from Apple or Microsoft asking us to update our systems. One of the best things a user can do is install those updates as soon as possible, protecting against those known vulnerabilities. There are helpful mechanisms in place, such as automatic software updates — an automated safety feature on most commonly used internet browsers.

However, software updates are not the only solution needed.

“What happens if the software company no longer exists? How and who can fix those bugs?” asks Adam Doupé, director of the Center for Cybersecurity and Digital Forensics, part of the Global Security Initiative at Arizona State University. “What if the company goes bankrupt and somebody finds a bug — a vulnerability that allows a remote hacker to have access to your system? How do we actually fix those problems?”

ASU is tackling this problem through a four-year Defense Advanced Research Projects Agency (DARPA) contract awarded to the center, which is contributing research and development efforts to the Assured Micropatching program (AMP). We spoke to Doupé about the importance of this research and the impact it provides.

What is a micropatch?

A micropatch is a small patch that fixes one vulnerability without jeopardizing functionality.

“The goal of a micropatch is to figure out how to reduce the size of the patch so that we change few parts of the program,” says Doupé, who is also an associate professor in ASU’s School of Computing and Augmented Intelligence. “Ultimately, we want to increase our confidence that we will not break the functionality of the application — the less you change, the less you have to worry about in terms of collateral damage.”
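The "change as little as possible" principle can be pictured concretely. The sketch below is an illustration of the idea, not the actual AMP or VOLT tooling: a micropatch replaces a few bytes of a compiled program, same size in and out, after verifying the patch site, so nothing outside the targeted vulnerability is disturbed.

```python
def apply_micropatch(binary: bytes, offset: int,
                     original: bytes, patched: bytes) -> bytes:
    """Replace a small byte sequence in a compiled program image.

    The patch must be the same size as the code it replaces, and the
    expected original bytes are verified first, so every other byte of
    the program is left exactly as it was.
    """
    if len(patched) != len(original):
        raise ValueError("micropatch must not change the binary's size")
    if binary[offset:offset + len(original)] != original:
        raise ValueError("unexpected bytes at patch site; wrong build?")
    return binary[:offset] + patched + binary[offset + len(original):]

# Hypothetical example: flip a single conditional-jump opcode
# (x86 JZ, 0x74, into JNZ, 0x75) at a known offset to invert a
# faulty security check, leaving all surrounding bytes untouched.
firmware = bytes([0x55, 0x89, 0xe5, 0x74, 0x0c, 0xc9, 0xc3])
fixed = apply_micropatch(firmware, 3, b"\x74", b"\x75")
```

The hard research problem AMP addresses is everything this sketch assumes away: finding the right offset and bytes in a real binary, and proving the one-byte change fixes the bug without breaking anything else.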

What is the Center for Cybersecurity and Digital Forensics bringing to the table?

The center has put together VOLT (a Viscous, Orchestrated Lifting and Translation framework), which aims to reverse engineer the software it is applied to so that efficient and effective patches can be created.

“I have worked on software reverse engineering for over 10 years, and much to my surprise, no one has created techniques to make effortless binary patching possible,” says Ruoyu (Fish) Wang, the lead project investigator of the Assured Micropatching project. “Our VOLT framework, upon success, will be the first of its kind that enables easy bug fixing on deployed software. This capability will mean a lot to both industry and national security. We really appreciate DARPA’s interest in supporting our research on this front.”

One of the core strengths of the Center for Cybersecurity and Digital Forensics is “angr” — an open-source framework created by core center researchers Yan Shoshitaishvili and Wang with the goal of analyzing binary code to learn what the program it’s applied to does. Shoshitaishvili and Wang will lead a team of researchers to significantly improve the state of the art in binary decompilation techniques (transforming a binary program back into readable and understandable source code). As the technical foundation of VOLT, these techniques will enable sound and faithful translation between binary code and its corresponding decompilation output.

“The ‘angr’ framework enables us to perform 'binary analysis,' which is able to take the ones and zeros of a binary program and allows us to make sense of what the program does,” says Doupé. “On HACCS (Harnessing Autonomy for Countering Cyberadversary Systems), an additional DARPA program we’re involved with, we use 'angr' to automatically identify and exploit bugs in a binary program.”
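Making "sense of what the program does" from compiled code can be illustrated with a deliberately simple stand-in. The sketch below does not use angr itself (whose real API operates on native machine code); it uses Python's standard-library `dis` module on Python bytecode as a rough, miniature analogy for the same idea: walking compiled instructions to recover facts about a program's behavior, here the constant a password check compares against.

```python
import dis

def check(password):
    # A toy "binary": this function's compiled bytecode is what we inspect.
    return password == "hunter2"

# Walk the compiled instructions and pull out the string constant the
# comparison is made against -- a miniature version of what binary
# analysis frameworks do for native ones-and-zeros programs.
comparison_constants = [
    ins.argval
    for ins in dis.get_instructions(check)
    if ins.opname == "LOAD_CONST" and isinstance(ins.argval, str)
]
# comparison_constants now reveals the hard-coded check: ["hunter2"]
```

Real binary analysis must first recover instructions, functions and control flow from raw machine code, which is exactly the reverse-engineering problem angr and VOLT tackle.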

How can this improve defense in the United States?

Imagine this scenario: A modern warfare vehicle, like a tank, has software that runs a vast number of components — from movement mechanisms and the speed of the tread to directional navigation and targeting technology.

“We would not want a security vulnerability that exists (for example) in the wireless communications that allows someone to jam or shut down your systems,” says Doupé. “In this context, it would be very impactful if the tanks were all down while systems reboot. It’s frustrating enough in our everyday lives, let alone within the setting of warfare — it could be catastrophic.

“Governments buy these systems and related software, procure contracts with various companies that build those binary systems to specification and run them. However, even if the government gains access to the source code, they may not have the tool chain of how to build and recompile them. The goal of AMP is to completely automate this process, through mathematical proofs and testing.”

Another challenge from a security perspective is that some control systems run on Windows 98, an operating system that hasn’t been updated in well over a decade. The operating system has accrued a vast history of vulnerabilities and known exploits, which makes securing such systems difficult.

On a national level, the Department of Defense is very interested in this type of research.

“The DOD has a lot of manpower that they can direct at a problem, but the flipside is understanding what kind of things they don’t necessarily have power over,” says Doupé. “The key to addressing any security problem is, once identified, you need to actually act on it. One of the key concepts of security is if you find something, you should assume that someone else — say, your adversary — can find it as well.”

What is the problem and the solution?

In general, software and device manufacturers do a good job of fixing problems as they come up, but there are areas where consumers are more vulnerable.

Under DARPA Assured Micropatching, the Center for Cybersecurity and Digital Forensics team is developing new automated methods for “understanding” the machine-readable form of software, reversing the translation process, and generating human-readable source code. They can then repair small segments of code, retranslate the repaired segments and integrate them back into the deployed software. This will allow the team to address security issues in deployed mission-critical software in a timely, cost-effective and scalable manner.

“Operating systems, cellphones, web browsers — all typically have very good systems for pushing out patches, as everyone understands the security importance,” says Doupé. “Phones are another great example where companies are efficient with deploying fixes. Yes, you may be unable to use your phone for a short while, but it’s really important to keep it up to date.”

But not all fixes come to your attention, and not all of them are in your control. For example, when was the last time you updated your Wi-Fi router for security vulnerabilities? And what if the vendor no longer supports your router? This puts you in a difficult position, because you cannot personally apply a fix.

“Ultimately, there should be changes at the policy level to handle cases of companies willingly selling a system that has known security vulnerabilities. They should not have the choice to simply not update software,” says Doupé. “Regulators and policymakers should be thinking about the exact aspect of companies going bankrupt, or no longer supporting the security updates on people’s devices.

“It’s worse from a security perspective if the device works but never receives updates, especially home-based devices that connect to other systems. From the individual level, it’s difficult. My recommendation is to enable automatic updates on every system possible, thus doing your bit for cyber hygiene.

“The unfortunate thing here is that this puts more burden on the consumer to do that research.”

Oliver Dean

Communications Specialist, Global Security Initiative



How 9/11 changed the ways these faculty teach and research

September 9, 2021

ASU faculty reflect on the day that changed our lives

From the global response to terrorism and the subversive weaponization of narratives, to the evolution of crisis management and guardians of civil liberties — 9/11 forced us to think differently; to rise to new challenges; and to confront the vulnerabilities of our democracy.

Twenty years after the attacks and in observance of the anniversary, ASU News reached out to faculty experts across Arizona State University to share their observations, research and reflections on 9/11’s cultural and global impact on our world — and on their work.

Stepping up safety and security after 9/11

Melanie Gall and Brian Gerber, co-directors of the Center for Emergency Management and Homeland Security in ASU’s Watts College of Public Service and Community Solutions on how the events of Sept. 11 heightened safety protocols and standardized emergency management practices across the nation.

Brian Gerber: Before 9/11 we didn't have a particularly strong nationalized system. After 9/11, we have a much higher level of consistent and uniform practice in how all aspects of the emergency management process are carried out by federal, state and local governments. In other words, we created a set of national standards and everybody follows them now. So that's a pretty dramatic change.

Melanie Gall: As a result of 9/11, the National Incident Management System (NIMS) was developed following Homeland Security Presidential Directive 5. Part of NIMS is the so-called Incident Command System, which was adopted from the fire service, where it was used for interagency coordination for wildfires within and between California and Arizona. What is unique about ICS is the fact that it is a command-and-control structure with a clear chain of command and defined roles and responsibilities. Think of it as a system with defined job descriptions and job titles, allowing responders coming to the incident site(s) to easily slot in, boosting local capacities.

Gerber: Over the last two decades there's been considerable progress made in managing or creating tools to help manage a really complex set of interacting hazards. We used to not talk about community resilience as part of dealing with emergencies and disasters. Now we've moved in that direction to figure out how different members of a community are affected differently. Social equity is now a consideration as well. All those kinds of issues in the year 2000 really weren't on the table.

Gall: Our field has been constantly growing and evolving since 9/11. Our field is in flux. As an emergency management and homeland security academic, you have to keep up with a constant flow of new policy guidance, new or updated planning guidelines, new or updated training requirements and more. What has perhaps become most challenging for many academic programs in emergency management and homeland security is finding the right balance between teaching students the EMHS vocabulary, systems and structures, and going beyond that to expand students' thinking and incorporate topics such as climate change, sustainability, resilience, social vulnerability, the differential impacts of disasters and more. Those are topics that training courses from FEMA generally do not touch on.


William Terrill, professor in the School of Criminology and Criminal Justice in ASU’s Watts College of Public Service and Community Solutions, says evolving policing practices reversed course after 9/11.

Much of the 1990s was trying to get away from the military approach of policing. After 9/11, it really went back to this basic, fundamental aspect of public safety and the fear that we could be attacked in our homeland so we need a very strong police force. I do think law enforcement needs to get away from the “lock-up-everyone,” “everyone-is-a-threat” policing and think more toward policing the community.


Nadya Bliss, executive director of ASU’s Global Security Initiative, says technological advancements made in cybersecurity since Sept. 11 are helping to keep us safe from other attacks, but there is still much work to be done.

I started working professionally in the field of national security right after 9/11. My personal research and my own doctoral dissertation are on the analysis of graphs and networks, so I understand networks mathematically. And while this is a very important aspect of security advancement efforts, it is not sufficient to address the problems of understanding the emergence of dangerous patterns or potentially dangerous themes.

The national security research and development community has been very techno-centric over the last 20 years. But we have come to realize that the community can't advance in isolation — away from the human or cultural elements that drive our society. Interdisciplinarity is very important, and we have to do it together. Bringing together a bunch of engineers with a bunch of humanists is a non-trivial task, but where we really make amazing progress on some of these challenges is where we do in fact bring people together.

We in the national security research community have gotten much better at widening our scope when it comes to developing technology, and I think we're much better at appreciating that developing technology in isolation is not sufficient. We appreciate the need to bring non-technical disciplines into the development of national security.

Fighting America’s longest war

Daniel Rothenberg is the co-director of the Center on the Future of War, a partnership between ASU and New America created in the 2014–15 academic year to explore the changing character of war, and the social and political impacts of recent conflict in the U.S. and around the world.

After the shock, devastation and national trauma of the 9/11 terrorist attacks, the U.S. invaded Afghanistan and later Iraq as part of a reconceptualization of security policy known as the Global War on Terror. Through these processes, we have witnessed the extraordinary capacity of the U.S. to project military force and to complexly impact world politics. These actions produced many incoherent policies, led to enormous suffering abroad, and left our country divided and disillusioned.

This can be partly explained by two notable and interrelated elements of post-9/11 wars. First, the U.S. never clearly articulated the purpose, objective or broad strategy justifying these wars, despite their enormous cost. The goals that were consistently voiced — defeating terrorism, defending the homeland, protecting freedom and supporting democracy — were often too general and too vague for the specifics of the practice of war and required a political vision at odds with the humanitarian significance of large-scale military deployments. Second, the wars were managed with minimal serious recognition of the places and peoples where U.S. forces were deployed, commonly making assumptions about the ease of political transformation abroad — even as Americans took for granted the impossibility of mutual understanding and compromise here at home.

While it is still too early to fully comprehend the meaning of the post-9/11 era, the last 20 years reveal how difficult it is to create a unifying narrative of what it means to commit a country to the danger and sacrifice of war as well as the profound impact and unexpected consequences, at home and abroad, of failing to do so.


Brooks Simpson, Foundation Professor of history in the College of Integrative Sciences and Arts, compares the war in Afghanistan — launched in response to 9/11 — to previous U.S. wars.

The withdrawal of United States military forces from Afghanistan has brought to an end nearly two decades of direct involvement in that nation’s affairs after the Sept. 11 attacks on U.S. soil, winning it the characterization of “America’s Longest War.” The departure of Americans from Kabul in August 2021 seemed all too familiar to those people who remembered the final American evacuation of Saigon in 1975, two years after most U.S. troops had left the struggling Republic of South Vietnam to fend for itself. That the cost in American dead was far less in Afghanistan does not assuage feelings of anger over what appears to have been the futility of the mission — although that mission shifted from destroying the ability of terrorist groups to attack the United States and its allies to an effort to establish a new regime in a nation that has long weathered external interventions, as the former Soviet Union could attest.

Vietnam casts a long shadow over Afghanistan. Less obvious, but in many ways as telling, was the United States’ effort to subdue first the Confederacy during the American Civil War, followed by a seemingly endless endeavor to quell white supremacist terrorist violence and protect shaky Republican state governments throughout the South. In all three cases, whatever the original mission, efforts to engage in state-building failed before a resilient enemy who disrupted efforts to establish peace, often through unconventional military operations, as public support for the endeavor faltered. In all three cases the final disengagement came under heavy criticism as people wondered whether the effort had been worth it. In the cases of Reconstruction and Vietnam, Americans continue to wrestle with the consequences of retreat and defeat. We can expect the same when we begin to review the failure of American intervention in Afghanistan and what that means for American foreign policy in the future.

Culture, communication and counterterrorism shifts

Keith Brown, anthropologist and director of the Melikian Center: Russian, Eurasian and East European Studies at ASU, on how the aftermath of 9/11 called for an urgent re-examination of our cultural competency in foreign policy and education.

Two long-standing concerns for the field in the U.S. have been to maintain a distance from U.S. intelligence and military operations, and to challenge easy stereotypes about other cultures. The shock of 9/11, and the pressure on Americans (or foreign residents) to demonstrate their national loyalty by supporting a military response, called those concerns into question. This has changed the field, highlighting tension between those willing to take the risk of “weaponizing” anthropological knowledge to advance national security, and those who maintain that critical analysis of growing U.S. militarism — including, for example, the up-arming of civilian police — is the best way to serve the common good.

After 9/11, U.S. anthropologists were asked to answer very specific questions, including “Why do they — lumping together Middle Easterners, Muslims, Arabs and, after 2003, Iraqis — hate us (Americans)?” And “How can our military win hearts and minds?” It was harder to raise public awareness of more complex analyses of the long-term impacts of U.S. foreign policy — including supplying arms to the mujahedeen, supporting authoritarian regimes like Saddam Hussein’s and denying Palestinian rights of return or restitution — on attitudes and livelihoods around the world. By opening new lines of social science research funding for focused work on topics like religious fundamentalism, tribal structures or youth radicalization, this shift emphasized anthropology’s local applications rather than its global sweep.

After 9/11, I expanded my own teaching and research interests in the cultural dimensions of conflict, especially in insurgencies against imperial or occupying powers. I have worked closely with military colleagues to explore the comparative experience of irregular warfare, including developing reading lists and syllabi for military personnel and, especially, for veterans of the long wars of Iraq and Afghanistan. The civilian-military disconnect in the contemporary U.S. — in particular, civilian ignorance of what it is like to go to war, and the challenges that veterans face in leaving war behind — constitutes an enduring threat to civic discourse that demands attention.


Steve Corman, professor and director of the Center for Strategic Communication in the Hugh Downs School of Human Communication, provided academic research to support counterterrorism efforts against the group behind the Sept. 11 attacks.

I was a typical unfunded researcher in a small academic field, studying organizational communication. After the 9/11 attacks, I researched texts from the extremist group al-Qaida and became alarmed. People on an academic mailing list I was a part of were suggesting al-Qaida had legitimate grievances and maybe the United States had it coming. I replied with some al-Qaida quotes and noted that if they got their way and took over, academics like those on the list would be among the first to be targeted for violent acts. This was not a popular point of view on the list.

Apparently, someone in the Department of Defense saw this exchange and invited me to participate in a workshop for the Joint Warfare Analysis Center on countering al-Qaida. I presented some work I had done on how to stress organizational communication and activity systems. This led to other such invitations, and eventually a large grant from the Office of Naval Research in 2009 to study how al-Qaida and similar groups use narratives to recruit people to their cause. This in turn led to the establishment of the Center for Strategic Communication here at ASU, which I currently direct. CSC has since received several more large grants from the Department of Defense and other agencies.


Ehsan Zaffar, founding executive director of The Difference Engine at ASU and professor of practice at the Sandra Day O’Connor College of Law, says that 9/11 “essentially created my field: the nexus of civil rights and security.”

Prior to 9/11, the field of counterterrorism (much less civil rights policy around counterterrorism) was a sleepy, relatively niche professional field, and the subfield of civil rights related to security was more niche still — focusing almost exclusively on American activities fighting terrorism abroad, which sit more squarely in human rights than in civil rights. Sept. 11 resulted in an unprecedented influx of cash and the largest reorganization of the federal government since World War II, with new departments like the Department of Homeland Security established from scratch.

My reaction to 9/11 was fear, but not fear of terrorism or future terrorist attacks since I was relatively familiar with terrorism, having fled war myself. Instead, I feared for the reactions and reprisal that would ensue against people who looked like me or had a background similar to mine. That concern, among other reasons, drove me to become a civil rights lawyer, and I have spent the better part of that career directly trying to curb the worst excesses of the government, and now broader society, as they seek to achieve an ostensibly laudable goal at the expense of our rights. 

A turning point for teaching

John Carlson, interim director of the Center for the Study of Religion and Conflict and associate professor of religious studies in the School of Historical, Philosophical and Religious Studies, on how 9/11 increased study interest in religion, especially Islam.

The terrorist attacks of 9/11 hit the academy like a hurricane. And the field of religion was right in the eye of it. Scholars of religion have always understood the importance of their subject — so 9/11 didn’t change that. But society urgently began calling upon them to explain religion’s influence in public life — especially in international affairs. Rather suddenly, religion became chic.

Foremost, people wanted to learn more about Islam, given the role of Osama bin Laden, al-Qaida and the Taliban in the terrorist attacks. So, scholars of Islam had an incredible burden to educate entire societies (especially in the West) about a religion that few people knew about — no, Islam doesn’t command terrorism; no, most Muslims don’t want to live under Taliban rule; no, Shariah law isn’t coming for the Constitution.

In the course of confronting such stereotypes, many Americans learned — or relearned — about the struggles for acceptance that minority religions have faced throughout the nation’s history. Catholics, Jews, and Mormons (among others) had been here before — either persecuted or pressed to reconcile their faith with their allegiance to the United States or both. Violence often has played a part in that national story, and 9/11 forced many of us — scholars and citizens alike — to retell and update that story.


Sarah Risha, senior lecturer of Arabic studies in the School of International Letters and Cultures, on how educators can create a safer learning experience for students of diverse cultures by offering a broader perspective of 9/11 and its aftermath.

Sept. 11 disrupted the lives of many of us in the Muslim community, especially women wearing scarves. To this day I, as a Muslim woman, still feel fearful of those who blame Islam for the attacks. In the days and weeks following the attacks 20 years ago, some people screamed at me, "GO BACK HOME!" when I went to stores, and others made me feel uncomfortable with their stares. I would leave stores quickly without getting all that I needed. However, one day when I arrived home from school, I found a box full of vegetables, fruits and snacks for my children with a note "I am here when you need anything." It was from our next-door neighbor, and it really made me grateful.

Now as a lecturer, I use actual examples of daily life to help students face their own unconscious beliefs in my Arabic culture and Islam class. I introduce Islam and discuss Muslim women who wear traditional dress. I am encouraged to hear students say that they have a much better understanding of Islam and Muslim women after our lessons and readings. I am always happy with their honest opinion, and most importantly that they have learned something new.

Amy Dawn Shinabarger, a lecturer of English and linguistics in the College of Integrative Sciences and Arts, on how the aftermath of 9/11 fortified her own commitment to teaching.

In the days after 9/11, four years into a teaching career I never planned, I experienced my first crisis in faith as a teacher. My students were not necessarily safe, and I could not assure them otherwise. In all of the classes I taught and visited, students left ASU and returned home. The United Arab Emirates, most notably, pulled their students out of American universities, but others left, too. One class shrunk by nearly half. A female student who had previously worn a hijab returned with her hair uncovered, and I bit my lip to hold back tears which sprang forth in spite of my best effort. My students and I discussed the events at length; then we discussed the vandalism and rocks thrown through windows of Arab and Muslim places near our campus, our place that was supposed to be safe. We discussed the hate-crime murder of Balbir Singh Sodhi in our community days later. I was a green teacher but already trying to make the world a more peaceful place through education, studying and working with the framework of Paulo Freire's critical pedagogy, a precursor to today's critical race theory.

I still wasn't sure what I was going to do once I actually had the doctorate in hand until after 9/11. My mom is an elementary teacher, and I was always aware of the power of teachers to change lives. Teaching can be so powerful that sometimes it frightens me, then and now. The events of that day and the days that followed changed me as a scholar, as a teacher and as a human being. But, even more so, going through the aftermath of those events with my students changed me. I committed my future to postsecondary education as a result.

9/11: An astronaut’s story

Cady Coleman, global explorer in residence in the School of Earth and Space Exploration, describes her experience learning about the 9/11 attacks as an astronaut and the fears around the safety of Mission Control as the attacks were unfolding.


Related 9/11 stories

Mary Beth Faller, Emma Greguska, Mark Scarp, Scott Seckel and Penny Walker also contributed to the creation of this article.

Top photo of the 9/11 Memorial in New York City by Jesse Mills

Sr. Media Relations Officer, Media Relations & Strategic Communications



Equipping the next generation of cybersecurity professionals

July 28, 2021

The cybersecurity workforce gap continues to grow, with hundreds of thousands of jobs left vacant. What can we do?

This year alone, roughly 465,000 cybersecurity jobs remain unfilled across the United States, according to Cyberseek. At the same time, ransomware and other dangers continue to threaten organizations and individuals. The recent Colonial Pipeline hack and other ransomware attacks on government and private industry highlight the urgent need for more cybersecurity professionals.

This is why the National Security Agency (NSA) and National Science Foundation (NSF) have come together to bring grant-funded GenCyber cybersecurity camps to high schoolers across the country. 

GenCyber camps bring together students from grades eight to 12 for a weeklong cyber exploration. Topics include network security, cyber awareness and information security. The New College of Interdisciplinary Arts and Sciences and the Cybersecurity Education Consortium, a unit of the Global Security Initiative, collaborated to host the first GenCyber camp at ASU through grant funding from the NSF and NSA.

The goal of the grant funding is to "provide as many opportunities as possible for students to be exposed to this new and evolving area so critical to our nation's security,” said Diane M. Janosek, training director of the NSA's Central Security Service.

Students competed for spots in the free, weeklong experience. Funds were specifically allocated to give underrepresented students opportunities to attend the camp, as there is an increased need to diversify the field of cybersecurity.

“(The camp helped to) expand my knowledge and understand what cybersecurity is without having to commit to it like you would with a degree,” said high school student Osmond Chong, a camp attendee.

A cybersecurity curriculum designed for high school students was the basis for this year's GenCyber camp. Students worked through the content using the U.S. Cyber Range, a virtual machine-based learning environment that allows students to run programs, simulate malware and build cyber know-how. Together, the curriculum and the range gave high schoolers an immersive experience in cyber.

Curriculum director Chuck Gardner explained how students discover issues and threats in the field of cybersecurity: “It was a wonderful opportunity to see ‘aha’ moments develop for high school students as they saw passwords get cracked and realized that many pictures they see on the internet can contain hidden messages using encryption methods, such as steganography.”
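Gardner’s mention of steganography can be made concrete with a toy sketch. The snippet below is illustrative only (the function names and the stand-in “image” buffer are invented, not taken from the camp’s curriculum): it hides a message in the least significant bit of each byte of pixel data, a change far too small for the eye to notice.

```python
# Toy least-significant-bit (LSB) steganography on a byte buffer that stands
# in for raw image pixel data. Illustrative only.

def hide(pixels: bytearray, message: bytes) -> bytearray:
    """Store each bit of `message` in the lowest bit of one 'pixel' byte."""
    out = bytearray(pixels)  # copy; the cover image is left untouched
    for i, byte in enumerate(message):
        for bit in range(8):
            idx = i * 8 + bit
            bit_val = (byte >> (7 - bit)) & 1      # message bits, MSB first
            out[idx] = (out[idx] & 0xFE) | bit_val  # overwrite the lowest bit
    return out

def reveal(pixels: bytearray, length: int) -> bytes:
    """Read `length` hidden bytes back out of the low bits."""
    msg = bytearray()
    for i in range(length):
        byte = 0
        for bit in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit] & 1)
        msg.append(byte)
    return bytes(msg)

# "Image": 256 bytes of mid-gray pixels. Each hidden bit changes a pixel
# value by at most 1, which is why the technique is so hard to spot.
cover = bytearray([128] * 256)
stego = hide(cover, b"hi")
print(reveal(stego, 2))  # b'hi'
```

Real tools apply the same idea to actual image formats and add encryption on top, but the core trick is just this bit-level sleight of hand.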

Throughout the week, campers learned from a variety of presenters, including industry professionals, ASU faculty and cyber employees from the Arizona Department of Homeland Security. “We keep learning so that if there is a new attack, we can be better prepared,” Feng Wang, a professor in the School of Mathematical and Natural Sciences, explained to students. Wang stood alongside a number of speakers who helped students appreciate the nuanced perspectives of working in the cybersecurity field.

Another goal of the camp is to destigmatize careers in cyber.

“Hacker, in its original meaning, is someone who applied ingenuity to create a clever result, called a hack. In our minds, in our popular culture, we think of a hacker as someone in a hoodie hunched over," said Adam Doupé, associate professor in the School of Computing, Informatics, and Decision Systems Engineering. “Hackers look like you, me and everyone. To be a hacker is (to be) someone who understands how to do something.”

Through the GenCyber program and the efforts of Arizona State University and others, including the Center for the Future of Arizona, a strong mission is underway to locate and educate cyber talent right now, in high school classrooms across the state of Arizona.

Written by Javier Carlos; top photo from Shutterstock


How will we protect American infrastructure from cyberattacks?

June 10, 2021

Infrastructure — it’s one of those words we think we understand, but it can be a hard concept to wrap our brains around. We may vaguely imagine electrical grids or railroads, but infrastructure also includes many other services that are essential for keeping our homes, schools and businesses thriving. It includes roads and transportation, telecommunications networks, water and sewage systems, and electricity. And today, much of it is connected to the internet.

“We're all connected so deeply through the internet in so many different ways, whether we realize it or not,” said Jamie Winterton, director of strategy for Arizona State University’s Global Security Initiative. “Having your credit card number stolen by an online thief would obviously be a terrible thing. But if the water to your home became unsafe because it was tampered with, or if the power was out and it's summer in Arizona, or it's winter in the Northeast, this is where that coupling of the internet to our lives becomes very direct, affecting not just our quality of life, but our life source.”

As the Colonial Pipeline hack and subsequent shutdown reminded us so recently, our infrastructure’s digital connectedness — while bringing benefits like convenience, better monitoring and remote problem-solving — leaves it vulnerable to cyberattacks.

MORE: Global Security Initiative Executive Director Nadya Bliss talks about the pipeline hack and the effects of ransomware

As the Biden administration looks to implement the American Jobs Plan, which includes expanding U.S. infrastructure, cybersecurity needs to be a key consideration to prevent even more costly and dangerous attacks.

ASU is home to a bevy of experts on cybersecurity — in fields from computer science and law to business and humanities — who come together to understand and find solutions to this complex, far-reaching problem.

What’s my incentive?

Jamie Winterton

Jamie Winterton is the director of strategy for ASU’s Global Security Initiative.

Winterton studies the role of incentives in cybersecurity and how policy can affect those incentives.

“The federal laws that address computer fraud were invented in 1984,” she said. “Computers and the internet have changed considerably since then, but policy has had a really hard time keeping up.”

One result of this lag is the difficulty in figuring out the responsible party. For example, if a company experienced a large data breach, is it the fault of the company, the software provider, the person who didn’t patch the system or the CFO who didn’t fund security adequately?

Compounding this confusion is the fact that cybersecurity is spread out over multiple federal agencies. In this example, the overlaps and gaps between them would make it unclear who in government should oversee investigation into the data breach.

Yet another vague area is pinpointing the impact of the data breach, such as how much it cost the companies and individuals involved.

“You know exactly the cost if you total your car — it's the worth of your car. But if you lose 145 million credit records, like Equifax did, what's the dollar amount that's assigned to that? We really don't know,” Winterton said.

The lack of clear and up-to-date policy, she argues, ultimately means that companies may have a greater incentive to buy a data breach insurance policy than to be proactive about cybersecurity. This leaves individuals’ data inadequately protected.

Individuals may also lack proper incentives to make smart cybersecurity moves. That’s because messaging to the public tends to hyperfocus on all of the dangers and neglect practical advice, leaving people feeling overwhelmed.

“I would always ask people, ‘What's your biggest concern?’ And the answer I got most frequently was, ‘I don't know. I know I should be worried, but I don't even know what to worry about,’” Winterton said. “And in that case, as technologists we've failed. We haven't done a good job of helping them understand what their real threats are and the steps that they can take.

“If we rethink the laws and policies around cybersecurity and assess what effects they're actually having, then I think we'll start to see how our incentives often undermine security. Then, we can figure out how to change them to make people, companies and the nation safer."

Kidnapping valuable data

Adam Doupe

Adam Doupé is the acting director of the Global Security Initiative’s Center for Cybersecurity and Digital Forensics and an associate professor in the School of Computing, Informatics, and Decision Systems Engineering.

Imagine you decide to become a neighborhood burglar. Your first step would be to go around the block, trying a bunch of front doors to see which are left unlocked.

“You could run around and do maybe a hundred doors in a day, but you're limited by geography and physics in terms of how many you could potentially test,” said Adam Doupé. “Whereas once you attach a computer to the internet, literally anybody on Earth could say, ‘I'm going to jiggle the front door of a billion devices on the internet,’ and it takes about a half hour.”
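Doupé’s door-jiggling analogy maps onto only a few lines of code. The sketch below (a hypothetical host and port, using just Python’s standard library) checks whether a single TCP “front door” accepts connections; wrapping it in a loop over address ranges is what makes internet-scale scanning so cheap. It should only ever be pointed at systems you own.

```python
# Minimal sketch of "jiggling" one TCP door: does anything answer on this
# host and port? Run only against hosts you own or have permission to test.
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, etc.
        return False

# Example: check whether anything is listening on local port 80
# (hypothetical target; the result depends on your machine).
print(port_open("127.0.0.1", 80))
```

An attacker’s scanner is essentially this check run concurrently across millions of addresses, which is why an unlocked door anywhere on the internet tends to be found quickly.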

Doupé is the acting director of the Global Security Initiative’s Center for Cybersecurity and Digital Forensics and an associate professor in the School of Computing, Informatics, and Decision Systems Engineering. The center’s aim is to keep users safe, whether they’re working with an iPhone, a browser, an electrical grid or (ahem) an oil pipeline.

“Adding computers to these systems is oftentimes good because we get more efficiency, but we also don't think about the security implications of what we're doing,” he said.

Critical infrastructure presents a unique cybersecurity problem. Because it’s critical, there’s a tendency to avoid software security updates since the update could potentially mess up other parts of the system. Many critical infrastructure companies don’t have procedures to check that a patch won’t interfere with the system’s function, Doupé said.

Malicious hackers will often try to exploit this dilemma by infecting machines before the fix for a known vulnerability is applied to everyone.

Ransomware is a type of cyberattack in which hackers take data hostage by encrypting it, only releasing it when the data’s owner pays a ransom. This type of attack was behind the Colonial Pipeline’s temporary shutdown, and in general it is becoming more common.

“One of the risks in my mind is we get so focused on just preventing this specific incident from happening again, instead of trying to identify a root cause and apply that to all the critical infrastructure that we have,” he said.

It used to be that ransomware was used against random people and earned criminals smaller sums of a few hundred dollars. Now, however, malicious hackers are doing their research and attacking valuable data that they can hold for millions in ransom money.

“It’s almost like targeting and kidnapping the children of billionaires,” Doupé said.

Organizations are often told that the best way to protect themselves against ransomware attacks is to have backups of data that aren’t connected to their machines. The problem is that they don’t test how quickly they can restore their system from their backups.

“What we've seen is if it takes you a month to get restored, you're going to pay the money for the ransomware, even if you have backups,” Doupé said.

Sometimes companies think they can relax on security if they isolate their computer systems from the public internet and other unsecured systems, which is called air-gapping. However, criminals can still infiltrate air-gapped networks using innovative methods like infected USB drives, nearby mobile phones and even audio frequencies undetectable to human ears.

It’s a serious problem with sometimes deadly consequences, such as a hospital being suddenly unavailable during a medical emergency because it was dealing with an attack. And as companies continue paying these ransoms, it only inspires more cybercriminals to demand them.

Doupé and others are working on solutions so that cybersecurity isn’t such a hassle for large companies.

He’s currently involved in a project with the Defense Advanced Research Projects Agency on “assured micro-patching.” They aim to create the smallest patch possible (changing as little as possible) and use mathematical proofs to guarantee that a system will still work after the patch is deployed.

As the U.S. expands its infrastructure under the American Jobs Plan, it will be critical to build cybersecurity into that infrastructure from the beginning rather than trying to bolt it on at the end, Doupé argues.

Choose wisely … with the help of AI

Tiffany Bao

Tiffany Bao is an assistant professor in the School of Computing, Informatics, and Decision Systems Engineering.

Usually when people think of software security, they think of finding bugs. But Tiffany Bao is interested in all of the decisions that come after the moment of discovery. For example, when should the cybersecurity professional report it to the security software vendor? When is the best time to patch the bug? Does it even need to be patched?

Bao is an assistant professor in the School of Computing, Informatics, and Decision Systems Engineering. She researches how to address software vulnerabilities by combining artificial intelligence and game theory, a method that aims to find the best solution at the lowest cost.

All actions in cybersecurity have a cost — even well-intentioned measures to prevent harm. For example, a patch may cause other issues in the system.

“It’s like when you shop for a big piece of furniture. You really love it, you know that it will make your house pretty, but it’s huge. So then you need to think about whether you really want to buy it,” Bao said. “If you buy it, it comes with costs. You need to pay for it, maybe you will have to hire a mover or adjust your other furniture. Game theory gives you this nice way to model the costs and benefits.”

Creating a model to find the optimal strategy, however, can take considerable time and effort. This is where artificial intelligence comes in — it can give approximate results that reveal what is most likely to lead to the optimal strategy. A system’s ability to search out its own bugs and suggest solutions is called cyberautonomy.

It will be a little while before cyberautonomous systems are deployed; right now they’re still in research and development. However, Bao and other researchers are motivated to understand and implement them.

“If there's a system that can find bugs and also make intelligent decisions based on the situation, then that would definitely make the network and the computer more secure,” she said.

Currently, it’s up to humans to do the work of scanning for bugs and figuring out a course of action.

“People try to make the best decision, but sometimes they just don't know what the best decision is,” Bao said. Cyberautonomous systems can better compute the ideal decision, which they can then present to humans who make the final call on what to do.

Game theory can also provide valuable information on issues surrounding cybersecurity efforts, such as the ideal cyberdefense budget. For example, game theory models can help predict the outcomes with a $10 million vs. a $20 million budget.

“In recent work, we found it’s not always the case that the more budget we allocate, the better,” Bao said. “You don’t want to spend too much money, because the outcome is not going to be as good as spending less money.”
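Bao’s point about budgets can be illustrated with a toy expected-cost model. The numbers and the probability curve below are invented for illustration, not drawn from her research; they simply show how diminishing returns can make a larger budget worse, in total expected cost, than a smaller one.

```python
# Toy model: total expected cost = defense budget + (probability of a breach
# given that budget) x (expected loss from a breach). All figures assumed.

EXPECTED_BREACH_LOSS = 50.0  # $M loss if a breach occurs (assumed)

def breach_probability(budget_m: float) -> float:
    # Assumed shape: each additional $10M roughly halves breach probability,
    # so early spending buys far more protection than later spending.
    return 0.5 ** (budget_m / 10.0)

def total_expected_cost(budget_m: float) -> float:
    return budget_m + breach_probability(budget_m) * EXPECTED_BREACH_LOSS

for budget in (0, 10, 20, 30, 40):
    print(f"${budget}M budget -> total expected cost "
          f"${total_expected_cost(budget):.1f}M")
```

Under these assumptions the total expected cost bottoms out around $20 million and then climbs again: a $40 million budget costs more overall than a $20 million one, echoing Bao’s finding that more budget is not always better. Game-theoretic models make the same trade-off precise while also accounting for how an adversary responds.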

Bao sees the cybersecurity arena moving toward using cyberautonomous systems in the future.

“The world is becoming more complicated,” she said. “We definitely need computers to help us gain a more comprehensive understanding and make good decisions.”

Avoiding legal tech lag

Diana Bowman

Diana Bowman is a professor in the Sandra Day O'Connor College of Law and the School for the Future of Innovation in Society.

“I find that everything in the Global Security Initiative has a legal, ethical or policy dimension to it,” said Diana Bowman, who studies the international governance of emerging technologies. Bowman is a professor in the Sandra Day O'Connor College of Law and the School for the Future of Innovation in Society, and she frequently collaborates with the Global Security Initiative on her research.

Although international governance includes things like laws and treaties, much of what Bowman focuses on is what she calls soft law — nonlegally binding methods that sway how parties act by encouraging certain behavior. The World Economic Forum and the Organisation for Economic Co-operation and Development are two examples of powerful soft law influencers.

Soft law becomes critical in regulating emerging technology because traditional law can’t keep up with technological development.

“When we're talking about having legislation passed or a new treaty, it can literally take years of negotiation. By the time you actually get an agreement, the technology has moved on in many different ways. In many cases it's not really the most effective way to regulate a technology or its applications,” Bowman said.

On the other hand, soft law is more agile. It’s easier to have a standard-setting organization encourage companies to self-regulate, creating a kind of quick governance. Even so, there’s still a lag with soft law, and enforcement is trickier, relying more on incentives than consequences. As such, not everyone is a big proponent of soft law as a governance tool.

The reason that traditional and soft law both fall behind technology to some degree is that technological evolution is faster than ever before.

“Also, a lot of platform technologies have many different applications in various different realms. It's a lot harder to keep up when the trajectory of a technology is less certain than what we would have seen four decades ago,” Bowman said.

She cites nanotechnology as an example. When it was first introduced, it was hard to imagine all of its possible applications. Now, it’s used in everything from cosmetics to airplanes and involves many different regulatory agencies.

Technology law is also tricky because it has to encompass not only governments and federal agencies, but also privately owned companies, including multinationals. Despite the difficulty, it’s becoming increasingly important. Technology is deeply embedded in our critical infrastructure, and the Colonial Pipeline hack highlights how infrastructure is vulnerable on both the cybersecurity and the legal front. Bowman believes that targeting critical infrastructure will be a powerful tool for adversaries in the future.

“It's a lot easier for a foreign nation to pay 10,000 hackers to target critical infrastructure than to build bombs and deploy them,” she said. “The type of attacks that we've seen will continue, and national interest suggests that we do have to create solutions that are far more proactive and agile.”

Diplomacy will have a key role going forward, since a cyberattack could come from outside the U.S., limiting our ability to bring someone to justice. In general, a better understanding of how to influence behavior around emerging technology, such as international trade agreements, will be important.

“A lot of people think of governance only in terms of the ability to bring somebody to court, but there are many different ways that you could encourage or discourage behavior,” Bowman said. “That’s what we really are talking about when talking about governance.”

The challenge of cybersecurity is too complex for one person — or even one academic field — to have the knowledge necessary to solve it. That’s why the Global Security Initiative brings together expert minds from across ASU’s research community; a problem that affects so many also requires many perspectives to understand it.

“This kind of research is going to be valuable whether you're talking about the laptop on your desk or large-scale industrial control systems like we see in infrastructure,” Winterton said.

For more information about ASU's cybersecurity expertise, please see our media resources folder.

Mikala Kass

Communications Specialist, ASU Knowledge Enterprise



ASU engineering experts reframe infrastructure security

May 25, 2021

The multifaceted nature of modern infrastructure systems demands a more thorough approach

Infrastructure has always been a target in warfare, according to Mikhail Chester, an associate professor of civil and environmental engineering at Arizona State University.

"Think about military aircraft dropping bombs on bridges or railroad lines. But battles today are not just army versus army. They are society versus society, and this change means we need to change how we think about infrastructure.”

Chester points to the recent ransomware attack that shut down one of America’s largest fuel pipeline networks. The incident sparked surges in the price of gasoline, panic buying and several days of shortages across the southeastern United States.

“This kind of problem is growing, and it can’t be solved through remedial repairs to old infrastructure,” Chester said. “We need to take a step back and ask what a pipeline is in 2021 or in 2100. Yes, it’s a means to move fuel. But it’s also a network of sensors and an information conduit, and that integrated purpose makes it both valuable and vulnerable amid intensifying global competition and conflict.”

Chester and his faculty peers in the Ira A. Fulton Schools of Engineering at ASU believe broader perspectives need to be part of the current debate about improving America’s infrastructure systems. One issue is the way we frame our thinking about security.

“It’s a multidomain factor,” said Brad Allenby, a professor of civil, environmental and sustainable engineering in the School of Sustainable Engineering and the Built Environment, one of the six Fulton Schools. “It’s not a power grid issue, nor a drinking water issue nor an issue for fuel pipelines. It’s a factor everywhere, and that’s the problem. It’s not addressed as it should be because nobody ‘owns’ it.”

Allenby says the time has arrived for the United States to create national cybersecurity requirements for infrastructure development. As part of those requirements, he says any new civil engineering project supported by federal money needs to have a cybersecurity assessment.

Adam Doupé, a Fulton Schools associate professor of computer science and acting director of ASU’s Center for Cybersecurity and Digital Forensics, agrees with Allenby but adds that even a thorough security assessment is only a snapshot in time. Since threats keep evolving, seeking and solving vulnerabilities is not something that can be done once and considered finished.

“We also know as engineers that changing things after a build is more expensive and difficult than it is at the design phase,” Doupé said. “In software, for example, we conduct a security analysis of a system before it’s actually created. And we keep doing that at every stage of development, fixing the problems we find as we go. This software engineering concept could be applied to civil engineering and yield significant benefits as we upgrade our national infrastructure.”

Chester says this kind of cross-disciplinary collaboration is vital to advancing resilience in the power, transportation, water and other systems that enable society to function. But alongside research and development, he says we need to expand our view of engineering education.

“We already do a lot to enable emerging and disruptive technologies that expand the functionality and improve the efficiency of our civil systems,” he said. “But successfully managing the transitions we face as a society includes thinking about how we see a civil engineer today versus what we envision a civil engineer as needing to be in the years ahead.”

Chester and Allenby say that engineering’s core competencies now need to include cybersecurity. Doupé notes that the Accreditation Board for Engineering and Technology, or ABET, requires that all university computer science students be exposed to security concepts.

“And at ASU, we’ve been very proactive in requiring that all computer science students take our introduction to cybersecurity course. But we can broaden that requirement,” Doupé said.

“Yes,” said Allenby. “We can make security a mandatory part of all engineering degrees. Engineers need to understand the security implications of what they do. But education is not just about universities. Professional associations have a role to play in training. There are many ways in which we can stay on top of security.”

Doupé added that equipping engineers to stay on top of security is vital to avoiding more ransomware attacks of the kind that struck the fuel pipeline network in the southeastern U.S. — or worse.

“If that situation caused as much disruption as it did, imagine what would happen if an entire city’s power grid gets seized for ransom.”

Basic hygiene

Engineers build bridges to carry remarkable loads and withstand powerful winds. But the challenges of infrastructure security are not just physical forces or Mother Nature.

The dangers include human adversaries who learn how a system works and then plan to make trouble. And while there are cybercriminals capable of defeating almost any security measures, a great deal of protection can be achieved through what Doupé calls basic hygiene.

“When a computer requests user permission to conduct an important security update, most people click ‘remind me later’,” said Doupé, an associate professor of computer science in the School of Computing, Informatics, and Decision Systems Engineering, one of the six Fulton Schools at ASU. “However, if it’s one of the computers running a power grid, it’s probably vital to make sure those updates are in place.”

But Doupé said even these simple measures can be difficult to implement for a variety of valid reasons. One example may be when computers supporting a utility system have been certified for one particular use case.

“The company running that system could be very hesitant to apply updates out of fear that it will cause operational problems,” Doupé said. “That’s one of the key challenges of computer security right now. We tend to push the task of testing onto end users. So, we need to think about how to offer these security patches with assurance that they are not going to compromise functionality. It’s not necessarily a complex issue, but we need to address it.”

Top photo: Modern infrastructure systems serve multiple purposes. Pipelines are a means to move liquids, but they also represent networks of sensors and information conduits connected to the wider world. Such advanced functionality demands more advanced security. Graphic courtesy of Shutterstock

Gary Werner

Science writer, Ira A. Fulton Schools of Engineering



Colonial Pipeline hack is latest example of cybersecurity threats to physical infrastructure

May 13, 2021

ASU expert says growing ransomware business model poses a threat to more than just information networks

Colonial Pipeline Co., which operates 5,500 miles of pipeline delivering roughly 45% of the gasoline and jet fuel consumed on the East Coast of the U.S., was shut down on May 7 by an organization now identified as the ransomware group DarkSide.

DarkSide has issued statements since the attack noting that it is an apolitical group with a goal “to make money, and not creating problems for society.” It follows the ransomware-as-a-service (RaaS) business model.

“Ransomware is hugely profitable,” said Nadya Bliss, executive director of the Global Security Initiative at Arizona State University. “Considering the amount of money involved, it’s not surprising that some groups would establish a business model selling ransomware as a service.

“The trend has escalated as technology development and adoption have outpaced policies and regulations, which contributes to cyber vulnerability,” she said.

According to Homeland Security adviser Elizabeth Sherwood-Randall, the hackers broke into networks devoted to the company’s business operations but did not reach the computers that control the physical infrastructure that transports gasoline and other fuel.

Colonial shut down the network as a precautionary measure and has called in external cybersecurity experts to ensure the hack is not dangerous to the overall network.

According to Sherwood-Randall, “Our nation’s critical infrastructure is largely owned and operated by private sector companies.”

On Wednesday, May 12, the White House issued a new executive order that aims to improve U.S. cybersecurity.

“We’re finding ourselves more frequently in an interesting space — the intersection of federal and private jurisdictions where security regulations may be different,” said Bliss. “At a national level, we’re still trying to figure out what policies make sense in the context of cybersecurity.”

Prior to the Colonial hack, the Biden administration had already launched an initiative to improve the cybersecurity of critical infrastructure, focused on high-priority collaboration with private sector partners to harden defenses, according to Sherwood-Randall.

But the broader perspective, according to Bliss, is that domains that don’t typically see themselves in the computer science space — schools, hospitals, utility companies and, in this case, pipelines — are increasingly at risk from outside attacks.

“Some of these entities don’t have in-house security staff trained to assess and thwart risks,” Bliss said. “The need to have dedicated cybersecurity protocols has become increasingly important as more sophisticated versions of software-based attacks are developed.”

Bliss shared some insights about who is vulnerable to ransomware and ways to protect data from attack.

Question: What is ransomware?

Answer: Ransomware is a type of malicious software that encrypts the data on your hard drive and prevents access until the responsible hackers are paid to release your data. It’s like putting it behind a lock – and you can’t unlock it until you pay the fee. In some cases, the hackers not only lock your data, they threaten to make sensitive information publicly available.

Common targets have been schools and health care providers. For example, hospitals can’t access medical records unless they pay the ransom. Because this data is critical to their operation, they are motivated to pay the fee.
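The “lock” Bliss describes can be illustrated with a toy sketch. Real ransomware uses strong cryptography, and the sample record below is invented, but even a simple XOR cipher in Python shows why encrypted data is unrecoverable without the attacker’s key:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'cipher': XOR each byte against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Invented example of the kind of record a hospital can't afford to lose:
record = b"patient 4471: blood type O+, allergy: penicillin"
key = secrets.token_bytes(16)      # the secret only the attacker holds

locked = xor_bytes(record, key)    # what the victim is left with
unlocked = xor_bytes(locked, key)  # recovery requires the same key

assert locked != record            # the data is unreadable...
assert unlocked == record          # ...until the key is applied
```

A real attack differs in every practical detail (strong ciphers, keys held on the attacker’s servers), but the asymmetry is the same: without the key, paying the ransom can look like the only way back to the data.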

Q: Who is most vulnerable to these attacks?

A: First, everyone is vulnerable to some extent, because of the human factor. Cybersecurity is not only about having the most up-to-date technology, it’s about making sure people understand the risks and practice good cyber hygiene, like avoiding clicking on phishing emails and setting strong passwords.

With that said, older systems that don’t update regularly are particularly vulnerable to ransomware attacks.

Q: How can businesses and municipalities protect themselves?

A: Engage in good cyber hygiene:

  • Back up your data regularly on a separate system that is not connected to the internet. This will give you access to your critical data if your systems are unavailable due to encryption or other attack.
  • Adopt a proactive security profile — don’t assume you won’t be hacked. If you leave the door open, at some point someone will come in. And, the more enticing your house is, the more attractive you are as a target.
  • Update your software on a regular basis — software companies are monitoring for vulnerabilities and providing patches and updates that secure against threats as they become known. When the software no longer offers updates, it’s likely time to reconsider the platforms you are using.
  • Recognize cybersecurity as a cost of doing business. Skimping on it may end up costing you much more in the long run.
  • Don’t click on downloads you aren’t expecting or that aren’t from a reliable source. Doing so can end up subjecting your entire system to malware.
  • Seek out support and resources from experts. During the COVID-19 pandemic, for example, the American Hospital Association and the Department of Homeland Security partnered to protect hospitals from malicious activity.
  • Take advantage of resources from trusted sources like the Cybersecurity and Infrastructure Security Agency, which, among other things, provides extensive ransomware guidance and resources and generally tracks vulnerabilities.
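As a minimal sketch of the first item on this list, here is a backup copied to a separate location and then verified byte-for-byte. The function names and the SHA-256 checksum choice are illustrative assumptions, not tools mentioned in the article:

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file so the original and the copy can be compared exactly."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_and_verify(source_dir: Path, backup_dir: Path) -> dict:
    """Copy every file under source_dir, then confirm each copy matches."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    report = {}
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        dest = backup_dir / src.relative_to(source_dir)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)  # copy contents and metadata
        report[str(src)] = sha256_of(src) == sha256_of(dest)
    return report  # every value should be True; investigate any False
```

In practice the backup destination would be a drive or system kept disconnected from the network, per the guidance above.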

Q: With much of America’s critical infrastructure managed by private companies, what can the U.S. federal government do to improve security?

A: Improved policies and regulations can help set basic standards for cybersecurity of critical infrastructure. For example:

  • Create a national information gathering and sharing mechanism that will enable all components of the country’s infrastructure ecosystem — whether in the public or private sector — to get real-time updates of the latest threats and suggested security measures.
  • Incentivize adoption of cutting-edge research into practice.
  • Standardize and provide educational resources and training for local governments and companies.
  • Work closely with international partners to better understand the evolving threat landscape. Cyber doesn’t care about physical borders between countries. It’s important that these attacks are tracked broadly and that we share information on a global level.

Terry Grant

Media Relations Officer, Media Relations and Strategic Communications



ASU, UNLV students collaborate to solve homeland security challenges

April 23, 2021

Students develop innovative ways to secure crowded places in Hardening Soft Targets challenge

In the parlance of homeland security, soft targets are places that are easily accessible to the general public and relatively unprotected: marathons, large gatherings, sporting events and shopping malls, for example. Because such places are challenging to secure, the Department of Homeland Security works directly with students through programs like its Centers of Excellence to discover new and innovative solutions to the problem.

During the weekend of March 26–28, innovative students from both Arizona State University and the University of Nevada, Las Vegas competed in "Hardening Soft Targets" — a design challenge hosted by the Center for Accelerating Operational Efficiency, a DHS Center of Excellence led by ASU.

Hardening Soft Targets was organized as part of Devils Invent, a series of engineering and design challenges from ASU’s Ira A. Fulton Schools of Engineering. During this three-day event, students worked directly with experts from the Department of Homeland Security and the Phoenix Police Department, as well as industry leaders and members of academia who participated as mentors and judges. 

The students were given the choice of three timely soft target challenges to work on solving: 1) protecting the perimeter of a marathon from vehicle ramming; 2) designing city infrastructure to prevent vehicle ramming attacks; and 3) ensuring that municipalities’ water systems are protected from cyberattacks. 

Bill Bryan, undersecretary (acting) for DHS Science and Technology Directorate, provided opening remarks.

“Those places we call soft targets are usually accessible to a very large number of people, which makes it a real challenge to harden or protect these open and crowded locations such as transportation systems or neighborhood parks,” he said. 

He also stated that “bad actors select soft targets because of the number of people in one location and the perceived value of the target.”

Bryan discussed the need for data analytics and database management professionals to help counter the risks of soft targets.

“There’s a growing need: the ability to collect, manage and share real-time information for the security of our citizens while protecting their privacy and civil liberties,” Bryan said.

He closed his remarks by emphasizing how much need there is for today’s students to “reshape the homeland security workforce of the future.”

Former DHS presidential appointee and current adjunct professor at Seton Hall University Mohamad Mirghahari also kicked off the event. Mirghahari gave the students a real-world overview of possible attack scenarios, adversarial actors, security gaps and countermeasure strategies currently being deployed. 

“(Attacks on) soft targets can be executed with little or no planning or expertise, and they’re often able to remain undetected until operational,” Mirghahari said. “Together with the massive number of soft target locations, this presents a significant security challenge.”

He used the example of the Super Bowl to illustrate the substantial number of possible soft targets.

“Instead of potentially targeting the actual Super Bowl there are pregame and lead-up events around the city and near the site, which do not have the (same) security posture as the main event,” he said.

Mirghahari then discussed the importance of design challenges such as this one to give students the opportunities to “engineer the next solutions and ideas.”

Personnel from the Phoenix Police Department’s Homeland Defense Bureau and the Arizona Fusion Center provided technical expertise and resources for planning the event. Sgt. Chris Scranton of the Homeland Defense Bureau led the planning efforts and acted as a mentor, providing a local law enforcement perspective on the challenge of soft targets.

At the beginning of the event, students organized themselves into hybrid teams of expertise across major academic disciplines and universities. They then named their teams, selected their scenario of choice and developed strategic approaches to the challenge statements.

Academic mentors were led by Ross Maciejewski, Center for Accelerating Operational Efficiency director and associate professor in ASU's School of Computing, Informatics, and Decision Systems Engineering; Dan McCarville, Center for Accelerating Operational Efficiency associate director of education and professor of practice in the School of Computing, Informatics, and Decision Systems Engineering; and Mohamed Trabia, associate dean for research, graduate studies and computing at UNLV. They were joined by noted industry and government experts to provide the eight competing teams with hands-on guidance and feedback during the challenge.

Over 30 students and 15 judges and mentors worked hard over the three-day challenge to offer creative solutions to hardening soft targets. 

The winning teams

First place: Team RAM, ASU

Elisa Magtoto, computer science; Maya Muir, computer science and math; Mohan Parekh, mechanical engineering; Zain Sidhwa, business data analytics; Mitchell Laukonen, computer science.

The team developed a mechanical barrier device for marathon event protection. It was designed as a mobile, easily transportable, cost-effective layered system that allowed for future technological integration. The presentation included a prototype of their design.

screenshot of Zoom presentation of marathon barricades

Winning design challenge team RAM.

Second place: SECURiVISION, UNLV

Jannelle Domantay, computer science; Yuria Mann, computer science; Dylan Obata, computer science.

SECURiVISION proposed a solution that uses blockchain and machine learning to address equipment and human-factor vulnerabilities, including insider threats, in securing water facilities from attack.

Third place: Tie 

Project NEMO, ASU and UNLV

Curtiss Brouthers, graduate student, learning sciences, ASU; Abraham Castaneda, electrical engineering, UNLV; Zeinab Mohammed, graduate student, engineering management, ASU; Niranjana Venkatesan, mechanical engineering, UNLV.

The group took an environmental design approach, treating the protection of municipal water facilities as an ecosystem. Their concept spanned everything from employee credential validation to the deployment and integrity of network firewalls.


BAWaN, ASU

Andrew Desos, mechanical engineering; Alexander Hollar, chemical engineering; Nihar Masurkar, graduate student, robotics and autonomous systems, mechanical and aerospace engineering; Will Noll, biomedical engineering; Ben Weber, aerospace engineering.

BAWaN designed deployable vehicle barricades for marathon security, a strong steel structure with sensor ability for adapting to various vehicle sizes. The presentation included a prototype of their design.

Funding sources: U.S. Department of Homeland Security under Grant Award Number 17STQAC00001-04-02.

Written by Dawn Janssen, communications manager, Center for Accelerating Operational Efficiency

Top photo: Students were asked to design innovative solutions for soft target events like marathons. 


New ASU center to fight disinformation campaigns that threaten democracy

March 30, 2021

Center on Narrative, Disinformation, and Strategic Influence to use interdisciplinary approach

The internet is open to everyone, and that democratization has a dark side. Disinformation is flourishing and affects everything from elections to public health.

The Global Security Initiative at Arizona State University has always focused on how disinformation influences people, and it has now dedicated a new unit to that research — the Center on Narrative, Disinformation, and Strategic Influence.

The center will use an interdisciplinary method of researching disinformation campaigns and developing tools to combat them, according to Scott Ruston, a research professor who will lead the new center, housed within the Global Security Initiative. He has already done extensive work in this area, tracking propaganda operations around the world.

“We aspire to facilitate research that would go in this general problem space of how humans understand the world and how nefarious actors manipulate the environment for degradation of democratic values and faith and trust in institutions that undergird our democracy,” he said.

“This is the kind of work that tells people in the federal government that this is worth pursuing, this is a valid and valuable way of going after tough problems,” he said.

The spread of disinformation is a major security challenge that is significantly impacting our country and the world in negative ways, according to Nadya Bliss, executive director of the Global Security Initiative.

"Whether spread by domestic actors for political gain or by foreign adversaries as a tool in geopolitical competition for global influence, disinformation is tearing away at the fabric of democratic institutions and our ability to unify around a common purpose," she said.

"At ASU, we have been working on this issue for years. The launch of this center builds on that work and brings together strengths in narrative theory, computer science, social sciences, humanities, journalism and other disciplines so we can better understand, identify and combat the purposeful spread of false or misleading information.”

Ruston answered some questions from ASU News:

Question: What will the center do?

Answer: The goal is to fuse humanities research, theories and methods and scientific research and methods with computer science and other technologies to better understand how humans make sense of the world around them, and how adversarial actors exploit that in the contemporary information environment.

What we’re doing is helping preserve the democratic values and principles that we’ve seen come under threat in the last few years.

Most of the phenomena we study are not new. It’s how they’re being leveraged and accelerated that’s new.

Q: You approach disinformation from a “narrative” perspective. What does that mean?

A: The underlying premise is that humans make sense of the world around them via the stories they tell themselves, and the stories they are told, and how they put them into meaning-making structures. That’s how we conceive narrative. It's a slippery concept.

What comes to mind is the story of George Washington and the cherry tree. It’s a national myth taught to schoolchildren to learn the national hero origin story about honesty. It’s part of a larger ethos of the American Revolution and what America stands for.

The Boston Tea Party is a specific story about a bunch of Massachusetts figures who dress up, jump on a ship and throw some tea overboard. But it communicates certain values: resistance in the face of oppression, resistance to a distant government and resistance to taxation without representation.

Taken together, these stories create the narrative arc of the revolution and the birth of the country. Political protest is a valued principle. Honesty is a valued principle. There’s a narrative consistency to the stories.

At the end of the arc is the founding of the U.S.

Q: How does that apply to what the center does?

A: We look at how the contemporary environment tells stories and how those stories nest into narrative styles.

One research project focused on Russian propaganda in Latvia, where they flooded the information environment with stories about corruption in the Latvian legislature and in Latvian municipal government. They told stories about how pending legislation would be discriminatory against Russian speakers residing in Latvia. They told stories about how the Latvian government is beholden to NATO as a puppet master. They told stories about how NATO military exercises in Latvia tear up farmers’ fields and destroy their livelihoods.

We have all these little stories that paint a larger arc – that Latvia is a country in conflict beset by corruption manifested against Russian speakers. NATO is a global elite that doesn’t care about the little guy. The conclusion is that you should become an ally of Russia.

Q: So is it “fake news”?

A: To equate disinformation with things that are untrue is a misunderstanding of the phenomenon.

It could be specific pieces of information that may be false or misleading or inaccurate, but not solely untrue. Those pieces are accumulated into the narrative so the narrative does the work of disinformation, which has the goal of political influence.

That’s the sweet spot of the center.

Q: Is Russian propaganda a focus?

A: We’re agnostic on what part of the globe we look at. We’ve had projects in the Baltics, Eastern Europe, Southeast Asia.

The consistent piece is that we’re always looking at how humans understand the world and how that’s impacted by 21st-century information exchanges that are accelerated by new technologies that layer in some degree of malignant influence — and particularly those perpetrated by an adversarial actor for the purposes of strategic influence.

If there’s a lone voice in the wilderness spouting all kinds of disinformation about some topic, we’re not going to pay attention to a single actor. We’re interested in larger-scale disinformation campaigns tied to geopolitics.

Q: How do you apply social science research?

A: A good example of how we approach studying disinformation and influence and propaganda was the project in Latvia.

We collected a bunch of news articles that were published by known Russian state-sponsored propaganda outlets and also known news outlets that had a Russia bias.

We read the articles and looked for evidence of framing, a human communication or rhetorical practice that guides the interpretation of a set of facts. The principle is that framing tells you not what you’re supposed to think, but what lens to think about it through.

An example is an article that said that yesterday in Latvia, the veterans of the Latvia Legion marched to commemorate their service in World War II, and there were cheering crowds.

The Latvia Legion fought against the Red Army. From the Russian, or Soviet, perspective, these were Nazis. From the Latvian perspective, they were nationalist heroes who happened to be funded by the Nazis.

The actual guys are in their 90s now. There were a few hundred people in downtown Riga watching.

The Russian press frames it as cheering crowds, and that framing is meant to suggest a rising appeal of fascism in the Baltic country.

The theories behind framing come from the social science of human communication. We use social science to detect it reliably.

Q: How do you do that?

A: We had thousands of texts, like news articles. It was far more than we could read on our own.

We wanted to train a computer program that would read the texts and add the same labels that the humans would have. It uses data mining techniques to distinguish a sentence that frames the rise of fascism or corruption in local government from a sentence indicative of discrimination against ethnic Russians, and from sentences that do none of those things. The machine classifier was able to detect these framings with a high degree of accuracy and consistency.
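The kind of labeling program Ruston describes can be illustrated with a minimal sketch. The training sentences and labels below are invented stand-ins, and the real project used far more sophisticated methods than this toy bag-of-words Naive Bayes classifier:

```python
import math
from collections import Counter, defaultdict

def tokenize(text: str) -> list:
    return text.lower().split()

class TinyNaiveBayes:
    """Bag-of-words Naive Bayes: a classic baseline for labeling text."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)   # per-label word frequencies
        self.label_counts = Counter(labels)       # documents per label
        for text, label in zip(texts, labels):
            self.word_counts[label].update(tokenize(text))
        self.vocab = {w for counts in self.word_counts.values() for w in counts}
        return self

    def predict(self, text):
        n_total = sum(self.label_counts.values())
        best_label, best_score = None, -math.inf
        for label, n_docs in self.label_counts.items():
            counts = self.word_counts[label]
            total = sum(counts.values())
            score = math.log(n_docs / n_total)     # class prior
            for word in tokenize(text):
                # Laplace smoothing: unseen words don't zero out the score
                score += math.log((counts[word] + 1) / (total + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Invented training sentences loosely modeled on the framings described above:
texts = [
    "officials accused of bribery and corruption",
    "new law discriminates against russian speakers",
    "corruption scandal in municipal government",
    "speakers face discrimination under pending legislation",
]
labels = ["corruption", "discrimination", "corruption", "discrimination"]
clf = TinyNaiveBayes().fit(texts, labels)
```

With a handful of human-labeled examples per framing, a classifier like this can then label thousands of articles the researchers could never read by hand, which is the scaling problem described above.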

Q: Why is the social science component important?

A: The vast majority of work in this area from other universities and think tanks tends to be computer science heavy – data mining, information science, network science, social network analysis. They bring in social science, sociological and communication principles, but not with the level of sophistication that we do. They’re not truly interdisciplinary.

We are adapting social science to meet computer science and adapting computer science to meet social science.

Our approach to narrative draws heavily on the humanities, how literary and film and media studies and the whole subfield of narratology approaches the study and analysis of narrative.

Q: What was the result of the Latvia project?

A: We could plot the curves on a graph that showed the relative distribution within the propaganda material. The most important thing we could determine was when Russia changed their minds. They were pushing the discrimination thing a fair bit, and then pushing the corruption thing a fair bit, but pretty suddenly in early 2018 they started pushing the discrimination element significantly more than fascism.

My thought about why they shifted gears is they decided the discrimination angle was the thing that would be most influential in the parliamentary election in fall 2018.

Q: What kind of outcomes will the center produce?

A: A lot of our research is sponsored by organizations affiliated with the government and particularly agencies that are tied closely to national security. Right now, we have four projects funded by the Department of Defense and one anticipated to be funded by the Department of Homeland Security.

So internal reports to the sponsoring agency is one outcome. The Latvia project was funded by the Department of State. We filed periodic reports through the course of that project that went into their library of available insights about disinformation and propaganda in different parts of the world.

Publications that contribute to academic fields are another. One of the computer science graduate students published a paper about his approach to identifying inauthentic behavior.

Other projects generate things like computer algorithms. The computer algorithm that the graduate students produced would be available to the State Department to use.

We have a project now in which the goal is to develop a computer system that incorporates a wide range of different algorithms that analyze media in different ways. Its ultimate application would be in a government or security agency to analyze manipulated media, like deep fakes.

What is commonly meant by deep fakes are videos altered to make it appear that a person is acting or saying words they didn’t say and making it look natural. It’s important to determine whether it was done for entertainment or humorous purposes or nefarious purposes.

The program is called "SemaFor," short for "semantic forensics."

This follows the model of interdisciplinary applications: the bulk of the work is computer science, but the ASU team’s contribution comes from the Walter Cronkite School of Journalism and Mass Communication, bringing insights about how the journalism industry works.

At the end of this three-year project, the prime contractor, a company, intends to deliver a computer system to the government that would be able to ingest any piece of data, such as a video, news article or social media post, run those algorithms and output a score that assesses whether the piece of information has been falsified or manipulated. Then it attributes what kind of manipulation has taken place and characterizes whether it’s disinformation.
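The fusion step described here, where many analysis algorithms feed one overall score, can be sketched generically. The detector names, weights and threshold below are invented for illustration; they do not describe how SemaFor actually combines its analyses:

```python
def fuse_scores(detector_scores: dict, weights: dict) -> float:
    """Combine per-detector manipulation scores (each in [0, 1])
    into a single weighted overall score, also in [0, 1]."""
    total_weight = sum(weights[name] for name in detector_scores)
    weighted = sum(score * weights[name]
                   for name, score in detector_scores.items())
    return weighted / total_weight

# Hypothetical detectors, each returning "probability manipulated":
scores = {"face_swap": 0.91, "audio_splice": 0.12, "metadata": 0.78}
weights = {"face_swap": 3.0, "audio_splice": 1.0, "metadata": 2.0}

overall = fuse_scores(scores, weights)  # 0.735 for these inputs
flagged = overall >= 0.5                # threshold chosen arbitrarily here
```

The design point is that no single detector is trusted alone: a piece of media is flagged by the weight of evidence across independent analyses, which is closer to the "throw many algorithms at it" approach attributed to the DARPA program.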

That’s sponsored by DARPA, which takes really tough problems and throws as much as they can at it to solve bits and pieces that can be put together.

Q: How do you work across the ASU community?

A: We want to leverage the wide range of talent at ASU, and to that end we run a disinformation working group with approximately 20 faculty members from departments and schools across the university and across all campuses. We have faculty from library science, theater, psychology. We get together regularly to triage what’s going on.

The field is moving incredibly fast. When we first started the working group about a year ago, about half of the participants hadn’t heard of the term deep fake.

Q: Will you look at disinformation locally?

A: We recognize that as researchers within a public university, we owe back to the public some of the benefits of our research, so it doesn’t just get published in an esoteric journal that other scholars in our field read. And that the benefit doesn’t just go back to the sponsoring agencies, but also is realized by the ASU community, the Phoenix metro community and the state of Arizona.

We do some small-scale efforts scanning the information environment of Arizona to identify trending elements of disinformation. We are aspiring to develop that capability so we could provide answers to the public if we caught wind of a disinformation campaign in Arizona.

Q: Are we living in a unique period of disinformation?

A: There’s not a lot of people left who were adults at the turn of the 20th century, when there was a rapid transformation of electrified communication, with the invention of the telephone and the television. There were massive changes in society in a short period of time. There were significant changes in social norms.

What we’ve experienced in the last 30 years is a similar epochal shift in information exchange.

The production and distribution of information has been democratized, and in the early days of the internet, that was viewed with utopian zeal. It was, “More information produced by more people should be better for everyone.”

But we’re seeing the dark side, that the wild west of the information environment is ripe for exploitation.

And we ask, “How can our citizens and institutions – educators, the press, the judiciary, the legislature, civic groups – better defend against that exploitation?”

Top image by

Mary Beth Faller

Reporter, ASU News