The latest tech often misses a key ingredient — security — and an ASU expert is helping launch a drive for industry-wide change
Social media and other technological innovations brought to market in recent years by the private sector have had a significant impact on national security. While these technologies have delivered clear benefits, they have also expanded the opportunities for national security threats.
For example, social media platforms allow foreign adversaries to influence our politics through disinformation at a scale not seen before; massive amounts of data collected online can be a goldmine for foreign intelligence organizations; and the use of artificial intelligence in decision-making means algorithms can be manipulated to steer outcomes toward desired policies.
“Capabilities and convenience are important, but they need to be balanced with security,” said Nadya Bliss, the executive director of ASU’s Global Security Initiative. “Technologists typically prioritize capability over security, and that means we are constantly playing catch-up, trying to patch vulnerabilities when they are already in the wild and being exploited.”
Bliss said that while security measures such as encryption and authentication have been widely adopted, questions of security tend to be secondary to application capability. She and colleagues from other institutions are urgently calling for a profound change in the way new technologies are designed.
Through her role as a member of the Computing Research Association’s (CRA) Computing Community Consortium, Bliss and her colleagues are driving the national conversation around the need to build security into the design of new technologies, prioritizing it alongside capability.
This is part of a broader effort by the consortium to catalyze computing research that addresses national security priorities. Every four years, on behalf of the computing community, the CRA releases a series of white papers detailing research directions; challenges, such as how to combat disinformation or how to prepare for the transition to quantum computing and its potential impacts on digital privacy and security; and recommendations for policymakers and the research community.
“White papers like CRA’s Quadrennial Papers — authored by top researchers in the field and released by trusted organizations like CRA — are kind of a ‘coin of the realm’ in science policy circles. They’re prized by federal policymakers and program managers who use them to help buttress new visions for research or bolster the research ecosystem. They can be key to launching national initiatives, or reshaping programs, or helping push agencies in new directions,” said Peter Harsha, CRA’s director of government affairs.
Bliss co-authored a white paper that outlined a series of steps to incentivize security in the design and development of new technologies. Those steps include:
● Sustained investment in computer science research across both basic science and mission-focused agencies.
● Creation of mission/sector-focused accelerators to support transition of relevant cybersecurity research into application and industry.
● A multidisciplinary effort and public/private partnership around metrics and incentives for security with a goal of continuously producing policy recommendations.
● Investment in lifelong learning and training to support a “security mindset” across the entire U.S. population.
“We can no longer afford as a nation or as individuals for security to be an afterthought as we build out new capability. Security goals and their enforcement are part of a system’s foundation. Retrofit is expensive and too disruptive,” said Fred Schneider, the Samuel B. Eckert Professor of Computer Science at Cornell University and co-author of the white paper.
“Technological advancements like quantum computing or the next generation of artificial intelligence are not created or distributed in a vacuum,” Bliss said. “They plug into a much bigger ecosystem — our society — and if we don’t take the time to think through their potential negative impacts on that ecosystem, we are not being responsible and are creating larger problems we will eventually have to deal with.”
Written by Nathan Evans