Neural networks shine at difficult problems such as facial and voice recognition, but conventional electronic versions are limited in speed and hungry for power. In principle, optics could beat digital electronic computers at the matrix calculations used in neural networks, yet optics have been held back by their inability to perform some complex calculations that have required electronics. Now new experiments show that all-optical neural networks can tackle those problems.
The main attraction of neural networks is the massive interconnection among their processors, comparable to the dense web of connections among neurons in the brain. This lets them perform many operations at once, as the human brain does when looking at faces or listening to speech, making them more efficient at facial and voice recognition than traditional electronic computers that execute one instruction at a time.
The latest electronic neural networks have reached eight million neurons, but their future use in artificial intelligence may be limited by high power consumption and the limited parallelism of their connections. Optical connections through lenses are inherently parallel. The lens in your eye simultaneously focuses light from across your field of view onto the retina at the back of the eye, where an array of light-detecting nerve cells registers it. Each cell then relays its signal to neurons in the brain that process the visual information into an image.
Glass lenses process optical signals by focusing light, which performs an elaborate mathematical operation called a Fourier transform that preserves the information in the original scene but rearranges it completely. One use of Fourier transforms is converting the time variations of a signal's intensity into a plot of the frequencies present in the signal. The military used this trick in the 1950s to convert raw radar returns recorded by an aircraft in flight into a three-dimensional image of the landscape the plane had flown over. Today that conversion is done electronically, but the vacuum-tube computers of the 1950s were not up to the task.
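As a rough numerical analogue of what a lens does optically, here is a minimal NumPy sketch that turns a time-varying signal into its frequency content. The two tones, the sample rate, and every other value are invented for illustration and have nothing to do with the radar work described above.

```python
import numpy as np

# Illustrative only: a synthetic signal containing two tones (5 Hz and 12 Hz).
# A lens performs an analogous transform on light fields in a single optical pass.
sample_rate = 100.0                       # samples per second (arbitrary choice)
t = np.arange(0, 2.0, 1.0 / sample_rate)  # two seconds of samples
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# Fourier transform: time variations -> strength of each frequency component.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)

# The two tones appear as the dominant peaks in the spectrum.
for f in freqs[np.argsort(np.abs(spectrum))[-2:]]:
    print(f"strong component near {f:.1f} Hz")
```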
Development of neural networks for artificial intelligence started with electronics, but their applications have been limited by slow processing and the need for extensive computing resources. Some researchers have built hybrid neural networks, in which optics perform the simple linear operations while electronics handle the more complex nonlinear calculations. Now two groups have demonstrated simple all-optical neural networks that do all processing with light.
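To make that division of labor concrete, here is a minimal sketch of a single neural-network layer in NumPy. The comments mark which step the optics handle in the hybrid systems (the linear weighted sums) and which step has traditionally required electronics (the nonlinear activation); the layer sizes, weights, and the tanh nonlinearity are illustrative choices, not parameters from either demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 16 input signals feeding 4 neurons (values are arbitrary).
inputs = rng.random(16)            # e.g. intensities of 16 incoming light beams
weights = rng.random((4, 16))      # connection strengths ("synapses")

# Linear step: weighted sums of the inputs. This is the matrix math that
# lenses and other passive optics can perform in parallel, in a single pass.
linear_out = weights @ inputs

# Nonlinear step: each neuron's activation function. In hybrid systems this
# is done electronically; the new demonstrations perform it with light too
# (phase-change materials in one case, electromagnetically induced
# transparency in the other).
activations = np.tanh(linear_out)  # tanh is a generic stand-in nonlinearity

print(activations)
```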
In May, Wolfram Pernice of the Institute of Physics at the University of Münster in Germany and colleagues reported testing an all-optical "neuron" in which signals change target materials between liquid and solid states, an effect that has been used for optical data storage. They demonstrated nonlinear processing and produced output pulses like those from organic neurons. They then built an integrated photonic circuit that incorporated four optical neurons operating at different wavelengths, each connected to 15 optical synapses. The photonic circuit contained more than 140 components and could recognize simple optical patterns. The group wrote that their device is scalable, and that the technology promises "access to the high speed and high bandwidth inherent to optical systems, thus enabling the direct processing of optical telecommunication and visual data."
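To give a feel for how such a small layer can recognize simple patterns, the toy sketch below mirrors the reported layout, four neurons fed by 15 weighted synapses each, with a generic saturating function standing in for the phase-change neuron's nonlinear response. It is an abstract illustration only; it does not model the photonic hardware, and the patterns and weights are invented.

```python
import numpy as np

# Toy illustration of the reported layout: 4 neurons, 15 synapses each.
# Patterns and weights are invented; this is not a model of the device.
patterns = np.array([
    [1, 1, 1, 0, 0] * 3,   # pattern A
    [0, 0, 1, 1, 1] * 3,   # pattern B
    [1, 0, 1, 0, 1] * 3,   # pattern C
    [0, 1, 0, 1, 0] * 3,   # pattern D
], dtype=float)

# Each neuron's 15 synaptic weights are tuned to one stored pattern.
weights = patterns / np.linalg.norm(patterns, axis=1, keepdims=True)

def classify(image15):
    """Weighted sums over 15 synapses per neuron, then a saturating nonlinearity."""
    summed = weights @ image15      # linear: 15 inputs -> 4 neurons
    fired = np.tanh(3 * summed)     # nonlinear stand-in for the neuron response
    return int(np.argmax(fired))    # index of the most strongly responding neuron

noisy_input = patterns[2] + 0.2 * np.random.default_rng(1).standard_normal(15)
print("recognized pattern:", classify(noisy_input))   # expected: 2
```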
Now a team at the Hong Kong University of Science and Technology reports in Optica that they have made an all-optical neural network based on a different process, electromagnetically induced transparency, in which incident light affects how atoms shift between quantum-mechanical energy levels. The process is nonlinear and can be triggered by very weak light signals, says Shengwang Du, a physics professor and coauthor of the paper.
In their demonstration, they illuminated rubidium-85 atoms cooled by lasers to about 10 microkelvin (10 millionths of a degree above absolute zero). Although the technique may seem unusually complex, Du says the system was the most accessible one in the lab that could produce the desired effects. "As a pure quantum atomic system [it] is ideal for this proof-of-principle experiment," he says.
Next, they plan to scale up the demonstration using a hot atomic vapor cell, which is low-cost, does not require time-consuming preparation of cold atoms, and can be integrated with photonic chips. Du says the major challenges are reducing the cost of the nonlinear processing medium and increasing the scale of the all-optical neural network for more complex tasks.
"Their demonstration seems valid," says Volker Sorger, an electrical engineer at George Washington University in Washington who was not involved in either demonstration. He says the all-optical approach is interesting because it offers very high parallelism, but he notes that the update rate is limited to about 100 hertz because of the liquid crystals used in the test, and he is not totally convinced the approach can be scaled up error-free.