This new brain-inspired chip is 23 times faster and needs 28 times less energy
Mike Huang first knew he wanted to become a chip design engineer in 2009, while building a 16-bit microprocessor from scratch with his classmates in his university lab.
Soon, Huang grew interested in understanding how the mind works — and “as an engineer, to understand and prove how the mind works is to effectively reverse engineer the brain,” he says. He reached out to Professor Steve Furber at the University of Manchester, who was designing a supercomputer with 1 million ARM cores inspired by the communication architecture of the brain, and began working closely with him.
A decade later, Huang has developed a brain-inspired microchip that processes large amounts of data faster and at lower power than conventional hardware, improving performance and energy efficiency for AI applications.
“Compared to conventional AI solutions, the technology closely mimics how biological neural networks work,” Huang says.
Huang first created the chip as part of his PhD in Neuromorphic Computing at the University of Edinburgh School of Engineering, funded by UK-based radiation detection company Kromek and the US Defense Threat Reduction Agency (DTRA). He then launched a startup, Rigpa, and joined Cohort IV at Conception X to learn how to commercialise his technology.
Huang originally set out to cut power consumption and inference times relative to traditional chip architectures. He did this by designing a spiking neural network chip to accelerate the next generation of AI — efficient, sustainable and brain-like.
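For readers unfamiliar with the approach, the sketch below is a minimal, generic illustration of a leaky integrate-and-fire neuron, the basic building block of spiking neural networks. It is not Rigpa’s design, and the threshold, leak and input values are purely illustrative. A neuron like this only “fires” when its accumulated input crosses a threshold, so most of the time there is nothing to compute, which is where the power savings of spiking hardware come from.

```python
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Simulate a single leaky integrate-and-fire neuron (illustrative only).

    The membrane potential integrates incoming current, leaks over time,
    and emits a binary spike (1) whenever it crosses the threshold,
    after which it resets to zero.
    """
    v = 0.0                               # membrane potential
    spikes = []
    for current in input_current:
        v = leak * v + current            # leaky integration of the input
        if v >= threshold:
            spikes.append(1)              # fire a spike
            v = 0.0                       # reset after firing
        else:
            spikes.append(0)              # stay silent: no downstream work needed
    return np.array(spikes)

# A weak, noisy input produces only a handful of spikes per 100 time steps,
# so downstream computation (and energy use) is event-driven and sparse.
rng = np.random.default_rng(0)
current = 0.15 + 0.05 * rng.standard_normal(100)
print(lif_neuron(current).sum(), "spikes in 100 time steps")
```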
“GPUs are an old technology — they were originally designed for video games,” Huang says. “A typical GPU consumes a huge amount of power. Training the GPT-3 language model generated an estimated 552 metric tons of carbon dioxide — that’s the CO2 the average American produces over more than 34 years. It’s not a sustainable solution.”
Rigpa’s chip consumes 28 times less power and runs inference 23 times faster than conventional architectures, with key applications wherever reliable, real-time computation is needed — think computer vision, drones, smart home appliances, self-driving cars, wearables, high-frequency trading and more.
“National security is a good example of how this technology could be used,” Huang says. “Imagine a police officer working in counterterrorism who’s equipped with a handheld radiation detector connected to their mobile phone, which processes the data from the detector. Rigpa’s new chip can be integrated directly into the detector so that everything happens in there, and the phone is no longer needed.”
At Conception X, Huang learned how to turn blue-sky ideas into something tangible. “Conception X has helped to sharpen my mindset. I’m still doing research, but it’s completely different from one year ago,” he says. “Before, I wasn’t sure when or how this technology would be useful. Now, I know in which direction to go and I’m constantly thinking about how a new piece of research I’m working on will feed into my technology.”
Rigpa plans to bring its product to market in 2024, when demand for neuromorphic technologies is set to take off, and is currently looking to raise funding.