
Optical computing for more sustainable artificial intelligence

An Oxford University spin-out is developing faster and more efficient optical neural networks

Spotted: While many people are focused on the implications – both beneficial and frightening – of artificial intelligence (AI), much less has been said about its energy consumption.  

As AI models become more complex, the need for energy-consuming hardware to process them grows exponentially. And according to startup Lumai, today’s electronics just can’t provide sufficient capacity to accommodate the expected explosion of machine learning applications. To tackle this problem, the company is developing faster and more sustainable computing hardware to train the next generation of AI.  

Traditional ‘neural networks’ – a form of AI inspired by the human brain – run on digital electronics, with information carried as electric currents through wires. Lumai’s system, by contrast, uses optical neural networks that transfer information and perform calculations with light. In practice, data is encoded in laser beams, passed through a display, and the beams are then combined by a lens. This is both much faster and more energy-efficient than the digital electronics in use today. 
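Conceptually, the display-and-lens arrangement computes the same multiply-accumulate operation at the heart of every neural network layer, but in a single pass of light. A minimal numerical sketch of that idea (all values are hypothetical, and the variable names are ours, not Lumai's):

```python
# Hypothetical sketch: how a display-plus-lens optical setup can compute
# a matrix-vector product in one pass of light. Values are illustrative.

# Input activations, encoded as laser intensities on the display pixels.
inputs = [0.2, 0.9, 0.5]

# Weight matrix, encoded as the transmittance of each display pixel
# (0.0 = opaque, 1.0 = fully transparent).
weights = [
    [0.8, 0.1, 0.4],
    [0.3, 0.7, 0.2],
]

# A lens focuses the light transmitted by one row of pixels onto a single
# detector, so each detector reading is the dot product of the inputs with
# that row -- the multiply-accumulate an electronic chip would do serially.
outputs = [sum(w * x for w, x in zip(row, inputs)) for row in weights]

print(outputs)  # one detector reading per output neuron
```

The point of the analogy is that the optical system performs all of these multiplications and sums at once, as light propagates, rather than stepping through them one at a time in silicon.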

Many innovators are exploring optical computing, but Lumai has made an important breakthrough by creating a complete system that lets neural networks learn by themselves using light. And as these neural networks grow in size, the system becomes faster, which means that the technology can be scaled up far more easily than traditional electronic systems, providing the capacity needed for the dawning age of AI.  

As computing demands more and more energy, innovators are increasingly focusing on ways to improve the sustainability of the industry. Some ideas that Springwise has recently spotted include using microfluidics to cool computers, and a fully recyclable computer chip substrate.