Optical neural networks, which use photons instead of electrons, have advantages over traditional systems. They also face major obstacles.
Moore’s law is already pretty fast. It holds that computer chips pack in twice as many transistors every two years or so, producing major jumps in speed and efficiency. But the computing demands of the deep-learning era are growing even faster than that, at a pace that is likely not sustainable. The International Energy Agency predicts that artificial intelligence will consume 10 times as much power in 2026 as it did in 2023, and that data centers in that year will use as much energy as Japan. “The amount of [computing power] that AI needs doubles every three months,” said Nick Harris, founder and CEO of the computing-hardware company Lightmatter. That is far faster than Moore’s law predicts. “It’s going to break companies and economies.”
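To put those two growth rates side by side: a quantity that doubles every three months grows by a factor of 256 over two years, while Moore’s law yields just a single doubling in the same period. A minimal sketch of that arithmetic, using only the doubling periods quoted above (everything else is illustrative):

```python
# Back-of-the-envelope comparison of the two growth rates quoted above:
# Moore's law doubles transistor counts roughly every 24 months, while
# AI compute demand is said to double roughly every 3 months.
moore_doubling_months = 24
ai_doubling_months = 3

months = 2 * 12  # a two-year window

moore_growth = 2 ** (months / moore_doubling_months)  # one doubling -> 2x
ai_growth = 2 ** (months / ai_doubling_months)        # eight doublings -> 256x

print(f"Over two years, Moore's law predicts ~{moore_growth:.0f}x growth")
print(f"Over two years, AI compute demand grows ~{ai_growth:.0f}x")
```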
One of the most promising ways forward involves processing information not with trusty electrons, which have dominated computing for over 50 years, but with the flow of photons, minuscule packets of light. Recent results suggest that, for certain computational tasks fundamental to modern artificial intelligence, light-based “optical computers” may offer an advantage.
The development of optical computing is “paving the way for breakthroughs in fields that demand high-speed and high-efficiency processing, such as artificial intelligence,” said the University of Cambridge physicist Natalia Berloff.
Optimal Optical
In theory, light provides tantalizing potential benefits. For one, optical signals can carry more information than electrical ones: they have more bandwidth. Optical frequencies are also much higher than electrical ones, so optical systems can run more computing steps in less time and with less latency.
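For a rough sense of scale, one can compare a typical electronic clock rate with an optical carrier frequency, and note that a single optical link can also carry many wavelengths at once (wavelength-division multiplexing). The figures in this sketch are round-number assumptions for illustration, not measurements of any particular system:

```python
# Illustrative only: assumed, round-number figures showing why optical
# signals offer more bandwidth than electrical ones.
electronic_clock_hz = 5e9   # ~5 GHz, a typical CPU clock (assumption)
optical_carrier_hz = 2e14   # ~200 THz, near-infrared light (assumption)

# One waveguide can carry many wavelengths in parallel via
# wavelength-division multiplexing; assume 64 channels, each
# modulated at ~10 GHz (both figures are assumptions).
wdm_channels = 64
per_channel_rate_hz = 1e10

print(f"Optical carrier vs. electronic clock: "
      f"{optical_carrier_hz / electronic_clock_hz:,.0f}x higher frequency")
print(f"Aggregate rate across {wdm_channels} channels: "
      f"{wdm_channels * per_channel_rate_hz:.1e} Hz")
```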