Following are excerpts from an interview conducted by ventureLAB’s Kathryn Ross and published on 12 October 2021. The complete interview can be found at: Creating Sustainable AI
Tell us about what Blumind seeks to accomplish.
Artificial intelligence has made rapid advances in recent years to improve our lives. Entire industries are being transformed by replacing and enhancing legacy processes with neural network-based algorithms. From natural language processing to recommendation engines, machine learning has demonstrated remarkable capability to solve a variety of problems.
“At Blumind, we believe that to democratize AI, and for it to proliferate further, we must move AI inferencing – applying a trained neural network to new data – closer to the devices and sensors that interact with our world.”
However, due to the limitations of today’s semiconductor chips, machine learning functions are confined to large, heavily controlled data centres or, at best, to devices connected to wired power or with large batteries, like laptops. This has significant impacts on latency (the time it takes for data to be transferred between nodes), power consumption, and data security. At Blumind, we believe that to democratize AI, and for it to proliferate further, we must move AI inferencing – applying a trained neural network to new data – closer to the devices and sensors that interact with our world, thereby creating autonomy at the Edge.
To accomplish this goal, Blumind has developed a proprietary semiconductor technology that delivers unmatched latency, power consumption, and cost for AI inference. We will initially target smart sensors and smart devices, but our architecture can be deployed anywhere AI processing is done. We like to call it – AI for Everyone, Everywhere.
There are huge environmental benefits to Blumind’s ability to dramatically lower power consumption. Can you contextualize the scope of this?
Minimizing AI’s ecological footprint is vital for sustainable growth. It’s a multi-faceted problem that needs holistic optimization of the entire technology stack to make meaningful improvements. For instance, it’s not simply the electricity it takes to run a graphics processing unit (GPU) for AI training, but also the impact of manufacturing that GPU; creating, transporting, and storing the data; and building the infrastructure that hosts the GPU. Although data centres have become more power-efficient over time, other factors such as building infrastructure, cooling systems, and wastewater greatly increase their CO2 footprint.
At the device and sensor level, power consumption, production, and disposal have a huge environmental impact. As the number of calculations to run AI algorithms increases exponentially, we must continuously optimize the associated hardware and software for power and size.
Blumind’s technology achieves both of these objectives by treating AI as a signal processing problem rather than shoehorning it into traditional architectures. Interestingly, this approach mirrors how our brains function. It’s a fundamental simplification of the software and hardware stack, resulting in a power reduction of 100-1000x, all in a tiny silicon footprint. Additionally, because data movement between sensors and computers is eliminated, latency along the signal path is greatly reduced.
“Minimizing AI’s ecological footprint is vital for sustainable growth.”