The nascent field of neuromorphic computing – perhaps best known through Intel’s “Loihi” chip – uses simulated neurons to mimic the behavior of the brain. Ordinarily, the “brain” in that phrase would be a human brain – but what if it weren’t?
“Traditionally, we look at brain-inspired computers and think of ourselves – humans – and how complex our brain is and what enormous capabilities it has,” said Angel Yanguas-Gil, senior materials scientist at Argonne National Laboratory, in a webinar yesterday. “What we’ve done at Argonne is take a step back and look not just at people, but at other types of inspirations that can help us develop systems that are essentially capable of carrying out this kind of intelligent learning.”
“And in particular,” he continued, “one of the most promising things we’ve found is insects.”
The road to neuromorphics
The need for neuromorphic computing, Yanguas-Gil explained, stems from the fundamental limitations of how current AI algorithms work.
Angel Yanguas-Gil, senior materials scientist at Argonne National Laboratory. Image courtesy Argonne.
“Once you have a trained system, that system can be deployed – let’s say on your smartphone or in a chip,” he said. “But once this system is deployed, it’s static. If there are changes or disturbances in the environment that the system needs to respond to, it cannot do so unless the change falls within the dataset on which it was trained. That’s very different from the way we work. We – humans, and animals in general – have an enormous ability to learn spontaneously, to react to new information and to adapt to changes in the environment.
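The contrast Yanguas-Gil draws – a frozen deployed model versus one that keeps learning – can be sketched with a toy online learner. This is a minimal illustration, not Argonne’s code: a nearest-centroid classifier whose class centroids are hypothetical running means that can be updated from new examples after deployment.

```python
# Illustrative sketch (not Argonne's code): a nearest-centroid classifier
# that can keep adapting online after "deployment", unlike a frozen model.

class CentroidClassifier:
    def __init__(self):
        self.centroids = {}  # label -> (running-mean vector, sample count)

    def update(self, x, label):
        # Incrementally fold a new example into the running mean for its label.
        mean, n = self.centroids.get(label, ([0.0] * len(x), 0))
        n += 1
        mean = [m + (xi - m) / n for m, xi in zip(mean, x)]
        self.centroids[label] = (mean, n)

    def predict(self, x):
        # Return the label of the closest centroid (squared Euclidean distance).
        return min(self.centroids,
                   key=lambda c: sum((xi - mi) ** 2
                                     for xi, mi in zip(x, self.centroids[c][0])))

# "Train" offline on two well-separated clusters.
clf = CentroidClassifier()
for x in [[0.0, 0.0], [0.1, -0.1]]:
    clf.update(x, "A")
for x in [[5.0, 5.0], [4.9, 5.1]]:
    clf.update(x, "B")

# The environment drifts: class A now also appears near (2, 2). A static
# model would keep its stale centroid; an adaptive one folds the new
# evidence in and shifts its decision boundary accordingly.
clf.update([2.0, 2.0], "A")
print(clf.predict([1.8, 2.1]))  # -> A
```

A frozen network would need full retraining and redeployment to absorb that drift; the adaptive learner absorbs it in a single incremental update, which is the behavior Yanguas-Gil describes.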
“You want,” he continued, “to have a system that is able to recognize that something has changed, to adapt from a few examples and to recover.”
According to Yanguas-Gil, insects are not just an inspiration for compact AI: insects such as bees behave like intelligent sensors that are able to operate in a noisy environment, collect information and – crucially – adapt to that information. “That,” he said, “is the kind of flexibility that we were very interested in.”
Part of this flexibility stems from compactness: some insects, for example, make better use of their limited number of neurons by adapting the functionality of their brain connections based on context.
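That idea – the same neurons computing different functions depending on context – can be sketched as a context-gated layer. This is a loose, hypothetical analogy to neuromodulation, not the Argonne model: a context signal multiplicatively gates a layer’s base weights, so one set of connections is reused across behaviors.

```python
# Illustrative sketch (not the Argonne model): a context signal gates the
# same base weights, so one small layer serves multiple behaviors --
# loosely analogous to neuromodulation in insect brains.

def gated_layer(x, weights, gates, context):
    # Effective weight = base weight * context-specific gate.
    g = gates[context]
    return [sum(w * gi * xi for w, gi, xi in zip(row, g, x))
            for row in weights]

weights = [[1.0, 1.0],
           [1.0, -1.0]]

# Two hypothetical contexts reuse the same base weights with different gatings.
gates = {
    "foraging": [1.0, 0.0],  # attend only to the first input channel
    "homing":   [0.0, 1.0],  # attend only to the second input channel
}

x = [0.3, 0.8]
print(gated_layer(x, weights, gates, "foraging"))  # -> [0.3, 0.3]
print(gated_layer(x, weights, gates, "homing"))    # -> [0.8, -0.8]
```

The payoff is exactly the compactness the article describes: two behaviors share one weight matrix instead of requiring two separate networks.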
Argonne’s pincer maneuver
Yanguas-Gil stated that Argonne’s research on insect-inspired neuromorphic computing is a two-pronged approach. First comes the math: an exploration of the state of the art in insect neuroscience and behavior, and work on extracting the mathematical principles that underlie insect performance.
“Once we have these mathematical principles, we can implement them just like you would a machine learning or AI algorithm,” said Yanguas-Gil. The researchers then take these networks and compare them against machine learning benchmarks – especially in the subfield of continual learning. “It turns out that while they are very small and very agile, they can work just as well on some tasks as the state-of-the-art algorithms that are out there,” he said.
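The continual-learning benchmarks mentioned here follow a standard protocol: train on a sequence of tasks and, after each one, re-test on every task seen so far to quantify forgetting. The sketch below shows that evaluation loop with a hypothetical toy learner; it is an illustration of the protocol, not Argonne’s benchmark suite.

```python
# Illustrative sketch of a continual-learning evaluation loop (not
# Argonne's benchmark): train on tasks in sequence, then measure accuracy
# on all previously seen tasks to quantify forgetting.

def evaluate(predict_fn, test_set):
    # Fraction of test examples the model labels correctly.
    correct = sum(predict_fn(x) == y for x, y in test_set)
    return correct / len(test_set)

def continual_eval(train_fn, predict_fn, tasks):
    # tasks: list of (train_set, test_set) pairs presented in order.
    accuracies = []
    for i, (train_set, _) in enumerate(tasks):
        train_fn(train_set)  # learn the new task
        # After each task, test on every task seen so far.
        accuracies.append([evaluate(predict_fn, t[1]) for t in tasks[:i + 1]])
    return accuracies

# Toy "model": a lookup table that memorizes and therefore never forgets.
memory = {}
def train_fn(train_set):
    memory.update(dict(train_set))

def predict_fn(x):
    return memory.get(x)

tasks = [
    ([(1, "a"), (2, "a")], [(1, "a"), (2, "a")]),
    ([(3, "b"), (4, "b")], [(3, "b"), (4, "b")]),
]
print(continual_eval(train_fn, predict_fn, tasks))
# -> [[1.0], [1.0, 1.0]]  (no drop on task 1 after learning task 2)
```

A conventional network trained the same way would typically show accuracy on earlier tasks collapsing in the lower-left entries of that matrix – the catastrophic forgetting that the insect-inspired models are built to avoid.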
The neuromorphic Loihi chip. Image courtesy Tim Herman / Intel.
Second, with these algorithms in hand, Yanguas-Gil said, “you want to port them into hardware.” He outlined the three hardware approaches Argonne has explored to harness insect-inspired neuromorphic computing, including research on standard devices like FPGAs with collaborators such as the University of Texas at San Antonio, and work with cutting-edge neuromorphic chips like Intel’s Loihi.
Designing a tougher shell
“The last thing, though, is that we can take these ideas and figure out how we would change the way we design chips with novel materials,” he said. Yanguas-Gil outlined how researchers are using Argonne’s “extremely powerful program” in the field of atomic layer deposition – a thin-film technology used in semiconductor manufacturing – to conduct advanced co-design research for neuromorphic computing.
“We use [the neuromorphic application] as a goal not only to design the architecture at the same time … but [to identify] what kind of new materials we need, or how we can best integrate the existing materials into this architecture in order to optimize the learning ability in real time.”
Part of that research, Yanguas-Gil said, was aimed at making these platforms more resilient to extreme environments.
“We found that combining these novel materials with other non-silicon platforms – like silicon carbide – can help you maximize computational capability while minimizing the number of components required,” he said, “which is very important at temperatures of 300 to 400 degrees Celsius – and even in environments with high radiation.” One material that Argonne developed years ago allows its resistivity to be tuned over “many orders of magnitude” and withstands temperatures of up to 500 degrees Celsius.
Yanguas-Gil sees several uses for this type of ultra-compact, adaptable, heavy-duty neuromorphic hardware, citing self-driving cars (“if you want the vehicle to respond to a change that it has not been trained for without failing catastrophically, this is one application”) as well as brain-computer interfaces for controlling prostheses.