Optical computing, which uses photons instead of electrons, has been one of computing's great promises for decades.
According to Moore's law (in truth more a forecast than a law, formulated in 1965 by Intel co-founder Gordon Moore), the number of transistors in a microprocessor doubles about every two years, boosting the power of chips without increasing their energy consumption. For half a century, Moore's prescient vision has presided over the spectacular progress of computing. In 2015, however, the engineer himself predicted that current technology was approaching a saturation point. Today, quantum computing holds out hope for a new technological leap, but there is another option on which many are pinning their hopes: optical computing, which replaces electronics (electrons) with light (photons).
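As a rough illustration of that doubling rule, here is a minimal Python sketch. The starting figure is the roughly 2,300 transistors of Intel's 4004 from 1971; the function name and parameters are ours, for illustration only.

```python
def transistors(years_elapsed, initial_count=2_300, doubling_period=2):
    """Projected transistor count under Moore's law: the count doubles
    every `doubling_period` years from an `initial_count` baseline."""
    return initial_count * 2 ** (years_elapsed / doubling_period)

# Projecting 50 years forward from the 4004 (1971):
print(f"{transistors(50):,.0f}")  # on the order of tens of billions
```

The projection lands in the tens of billions of transistors, which is indeed the territory of today's largest chips, a sign of how well the forecast has held up until now.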
The end of Moore's law is a natural consequence of physics: to pack more transistors into the same space, they have to be shrunk down, which increases their speed while reducing their energy consumption. The miniaturisation of silicon transistors has broken the 7-nanometre barrier once considered the limit, but this shrinking cannot continue indefinitely. And although more powerful systems can always be built by adding more transistors, doing so slows processing and increases the heat the chips give off.