ASU engineers study radiation-caused mutations in brain-like computing systems
Computers have a lot to learn from the brain’s ability to process information quickly, but we also need to ensure they don’t suffer from similar weaknesses.
Brain-like neuromorphic computing platforms are taking electronics closer to a brain’s processing speed, but before they’re ready for critical applications where they’ll be exposed to radiation — such as in space and in warfare — they need some additional work.
Just as our cells are susceptible to radiation-induced mutations, electronic materials are susceptible to harmful radiation effects, which can lead to unexpected behavior and breakdown.
A team of researchers at Arizona State University’s Ira A. Fulton Schools of Engineering is working to design experimental neuromorphic architectures, test their ability to withstand radiation and determine what happens when they fail.
Electrical engineering associate professor and radiation effects expert Hugh Barnaby leads this effort, a five-year, $1.75 million project funded by the Defense Threat Reduction Agency. Co-principal investigators professor Michael Kozicki, assistant professor Shimeng Yu and associate professor Keith Holbert bring their own expertise in memory, higher-level computing architectures and nuclear engineering, respectively. Together with Sandia National Laboratories’ testing capabilities and facilities, they will set out to understand how radiation impacts these systems.
The DTRA is interested in studying emerging technologies in five areas, including radiation hardness. This U.S. Department of Defense organization seeks to understand these technologies’ vulnerabilities and how to protect systems against physical attacks and harmful radiation environments. Neuromorphic computing architectures are a promising next generation of computing, but they are largely untested in harsh environments.
Learning from the human brain
Traditional von Neumann computing architectures are what we find in nearly all our electronics today. The processor performs a computation, stores the result in separate memory, fetches that information back to perform the next computation and stores it again, repeating this two-step sequence of reading and writing data in separate areas of a chip. While modern computers do this very quickly, the read-write repetition limits their full potential.
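The read-compute-write loop described above can be sketched in a few lines of Python. This is a hypothetical illustration, not actual project code: the `memory` dictionary stands in for separate storage and `processor_add` for the stateless processor.

```python
# Hypothetical sketch of the von Neumann read-compute-write cycle:
# every operation shuttles data between separate memory and the processor.

memory = {"a": 3, "b": 4, "result": None}  # separate storage area

def processor_add(x, y):
    """The processor computes but holds no long-term state."""
    return x + y

# Each step: read operands from memory, compute, write the result back.
a = memory["a"]              # read
b = memory["b"]              # read
total = processor_add(a, b)  # compute
memory["result"] = total     # write back to memory

print(memory["result"])  # 7
```

Every round trip between the two areas of the chip costs time and energy, which is the bottleneck neuromorphic designs aim to remove.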
To get over this obstacle in processing speeds, engineers have begun taking inspiration from another great processing platform — the human brain — in the form of neuromorphic computing.
In the brain, storage and processing mechanisms are intertwined and can therefore happen more quickly. Information is processed as a signal flows between two neurons in the brain through a synapse “bridge” that stores new memory within. Consolidating this process leads to faster processing and lower power consumption.
Neural systems inspired some of the earliest computing systems, but as the computing field grew, it followed the von Neumann approach instead for 70 years. Now engineers are revisiting neuromorphic computing as a way to break through the bottlenecks caused by conventional architectures.
“The computation in neuromorphic architectures is highly parallel and distributed into massively connected neural networks, so they can accelerate data-intensive operations such as matrix multiplications very efficiently,” said Yu, who was awarded a National Science Foundation CAREER Award for his work in creating a self-learning microchip based on neuro-inspired computing. “The redundancy in millions of neurons and billions of synapses may tolerate the random errors or faults of individual cells, which makes it attractive for harsh radiation environments that can cause the failure of individual cells.”
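Yu’s point about redundancy can be demonstrated with a toy example. In this hypothetical sketch (not the team’s actual model), a layer’s output is an averaged dot product over 1,000 redundant “synapses”; knocking out a handful of them barely shifts the result.

```python
# Hypothetical sketch: a highly redundant layer computed as a dot
# product. Zeroing out a few "cells" barely changes the output,
# mirroring the fault tolerance of massively parallel neural networks.
import random

N = 1000                 # many redundant synapses
weights = [1.0] * N      # each synapse contributes equally
inputs = [0.5] * N

def layer_output(w, x):
    # the core neuromorphic operation: a (normalized) vector dot product
    return sum(wi * xi for wi, xi in zip(w, x)) / len(w)

healthy = layer_output(weights, inputs)

# simulate radiation knocking out 10 random cells (weight -> 0)
damaged = weights[:]
for i in random.sample(range(N), 10):
    damaged[i] = 0.0

faulty = layer_output(damaged, inputs)
print(healthy, faulty)  # the two outputs differ by only about 1%
```

Losing 1% of the cells shifts the output by only about 1%, whereas in a conventional architecture a single corrupted value can be fatal to a computation.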
The team will look at two architectures used in neuromorphic computing: static RAM, the current state-of-the-art methodology for designing neuromorphic platforms, and resistive RAM, a low energy resistive memory.
SRAM architectures improve on conventional designs, but they use a lot of space and power and are known to be vulnerable to ionizing radiation. RRAM is an emerging nanotechnology proposed as a replacement for SRAM-based synapses because RRAM cells are smaller and more power efficient.
“RRAM cells have inherent electrical characteristics that are very similar to biological synapses — connections grow within the devices that become stronger the more they are stimulated, which is quite similar to what happens in the brain as it learns,” Kozicki said. “These cells are also the smallest devices that can be made in an integrated circuit, much smaller than a single transistor, and unlike regular transistors they can be stacked on top of each other, leading to huge synaptic density.”
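The strengthening behavior Kozicki describes can be modeled with a simple saturating update rule. This is a hypothetical toy model for illustration, not a physical device equation: each stimulation pulse grows the cell’s conductance toward a maximum, loosely analogous to synaptic potentiation.

```python
# Hypothetical toy model of an RRAM-like synapse: repeated voltage
# pulses grow the conductive filament, so conductance increases with
# stimulation and saturates at a maximum value.

def stimulate(conductance, pulses, g_max=1.0, rate=0.2):
    """Each pulse closes a fraction of the gap to the maximum conductance."""
    for _ in range(pulses):
        conductance += rate * (g_max - conductance)
    return conductance

g = 0.0                        # unstimulated cell: weak connection
g_after_1 = stimulate(g, 1)    # slightly stronger after one pulse
g_after_10 = stimulate(g, 10)  # much stronger after repeated stimulation
print(g_after_1, g_after_10)
```

The more the cell is stimulated, the stronger (more conductive) the connection becomes, which is the brain-like learning behavior that makes RRAM attractive for synapses.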
The team’s preliminary work on their RRAM variant, the Programmable Metallization Cell, shows promise for radiation hardness.
“The Programmable Metallization Cell is extremely radiation-tolerant, so we have high confidence that rad-hard (radiation-hardened) systems based on the technology will be possible,” Kozicki said.
Barnaby’s team will study each architecture at the device, circuit and system levels in hardware and through simulation. With the help of Sandia National Laboratories’ facilities, they will expose them to different types of radiation, observe the effects and create models that help explain why those effects occurred.
Understanding radiation effects
In organic tissue, high-energy radiation is a common cause of cell mutations. Gamma radiation, for example, consists of high-energy photons with a high frequency and a short wavelength, roughly the size of a DNA molecule. When gamma rays strike DNA molecules, they can cause mutations in our DNA.
Radiation is also a concern for electronics, Barnaby says. Exposure to gamma rays, high-energy charged particles like electrons and ions, and even high-energy neutrons can cause electronic systems to fail. For the RRAM devices the team plans to use in their novel neural systems, Barnaby believes neutrons and heavy ions may pose a particularly significant threat.
“In electronics, neutrons and heavy ions can reconfigure the structure of the material,” Barnaby said. “The energy radiation absorbed from heavy ions can also create erroneous signals that can change the synaptic strength artificially. This can lead to errors in neuro-processing.”
Structural reorganization and transient ion effects are two potential radiation threats the team has identified so far; they will also look for other primary radiation threats to neuromorphic computing systems.
They’ll look at chip-scale and system-level effects from transient damage, which includes single and multiple instances of radiation damage, and effects from cumulative radiation damage over time. The team will also try to understand which stage of a neuro-inspired algorithm’s execution is most vulnerable, and how radiation-caused errors propagate through the layers of a neural computing network.
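How a single radiation-caused error propagates through the layers of a network can be illustrated with a tiny feedforward sketch. This example is hypothetical (the network, weights and fault are invented for illustration): one corrupted weight in the first layer, such as a single-event upset, changes the final output.

```python
# Hypothetical sketch of fault propagation: corrupting a single weight
# in an early layer of a small feedforward network alters the output.
import copy

def relu(v):
    return [max(0.0, x) for x in v]

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def forward(layers, x):
    for W in layers:
        x = relu(matvec(W, x))
    return x

layers = [
    [[0.5, 0.5], [0.5, -0.5]],  # layer 1 weights
    [[1.0, 1.0]],               # layer 2 weights
]
x = [1.0, 1.0]

clean = forward(layers, x)

# simulate a radiation-induced fault: flip one weight in the first layer
faulty_layers = copy.deepcopy(layers)
faulty_layers[0][0][0] = -0.5

faulty = forward(faulty_layers, x)
print(clean, faulty)  # [1.0] vs. [0.0]
```

Because each layer feeds the next, an error injected early can be transformed, amplified or masked downstream, which is exactly the propagation behavior the team wants to characterize.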
Developing electronics for harsh environments
From their testing, Barnaby’s team will determine which hardware platform has the best prospects for radiation hardness and is the most beneficial for applications where radiation exposure is likely.
“This work may enable major breakthroughs in low-power co-processors for remote sensing and data analysis,” Barnaby said. “Neuromorphic co-processors could benefit many Department of Defense strategic and space mission applications that operate in radiation environments and are limited by traditional von Neumann computing, such as machine learning, computer vision or control of non-linear dynamic systems — object recognition by drones, for example.”
By better understanding the basic radiation effects on mechanisms in neuro-processing elements and virtual synapses, the team hopes to speed the development of radiation-hardened neuromorphic computing architectures.
“This will allow a quicker adoption of the neuromorphic systems for use in radiation environments and shorten the timeline for developing necessary hardening techniques for neuromorphic systems,” Barnaby said.