The computer of the next century: This is how we imagined the photonic revolution in 1997, and what has really happened since then

A historical article by Avi Blizovsky from 1997 about optical computers, in a restored and updated version, with a look back at the place of photonics in modern computing.

Optical communication. Illustration: depositphotos.com

In the 1990s, it seemed that the world of computing was on the verge of a dramatic leap. The personal computer was proliferating rapidly, the Internet was beginning to move from academia to the general public, and chip manufacturers were pushing microelectronics to new limits. It was precisely against the backdrop of this success that many researchers wanted to look beyond the horizon and ask what would come after silicon. One of the most intriguing answers was the optical computer: a computer that would be based not on electrons and electric current, but on light.

Researchers then developed technologies intended to enable the construction of optical computers of enormous speed. In Japan, for example, the field was seen as a strategic goal. If electronics was the defining science of the twentieth century, some believed that optics would be the science of the twenty-first. Behind this declaration was a simple thought: as computers become faster, the question becomes more acute as to how much further it is possible to accelerate systems based on electric current, dense wiring, and tiny transistors increasingly crowded onto a chip.

The foundations for optical processing were not born in the 1990s. They were laid as early as World War II, in research on radar signal processing. Later, systems were built that combined optics and electronics, and in the world of computers, interfaces between the two technologies had already been created, for example in optical storage media. However, interest in light as the basis for future computers intensified especially in the 1970s and 1980s, when researchers estimated that microelectronic devices might soon approach the limits of their capabilities.

But electronics itself delivered a surprise. Chip manufacturers managed to keep improving processors, cramming more and more transistors into them and increasing their speed. So, instead of a sudden revolution in which the electronic computer gives way to an optical one, an intermediate direction began to emerge: a combination of optics and electronics. Fiber-optic communication was already practical at the time, so researchers tried to build electronic parallel computers that would use light to improve system performance. The research focused on finding interfaces between microelectronics and photonics, by developing light transmitters, light receivers, and components that could translate electronic information into light signals and back again.

Still, the more ambitious vision was not abandoned. If the transistor is the heart of the electronic computer, then an optical computer would require an optical equivalent: an optical switch. Just as the electrical transistor is based on the transmission or blocking of an electric current, the optical switch was supposed to be based on the transmission or blocking of light. The idea sounds simple, but its implementation proved immensely complex. A real optical computer would require not just a single such switch, but entire optical microprocessors, each containing a huge number of tiny light switches. Here, the research had already touched on an area that at the time seemed almost like science fiction.
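The analogy between an optical switch and a transistor can be made concrete with a toy model. The sketch below is purely illustrative, not a description of any actual photonic device: it treats a switch as an element that passes or blocks a signal beam depending on a control beam, and shows how cascading such elements yields logic gates, just as transistors do in silicon. All function names are invented for this example.

```python
# Toy model: an optical switch passes or blocks a signal beam
# depending on a control beam, analogous to a transistor gating
# electric current. True = light present, False = light blocked.

def optical_switch(signal: bool, control: bool) -> bool:
    """Transmit the signal beam only when the control beam is on."""
    return signal and control

def and_gate(a: bool, b: bool) -> bool:
    # Beam 'a' gated by control beam 'b': light emerges only if both are on.
    return optical_switch(a, b)

def or_gate(a: bool, b: bool) -> bool:
    # Two parallel paths merging at one output: light from either suffices.
    return a or b

if __name__ == "__main__":
    # Print the truth table for the two composed gates.
    for a in (False, True):
        for b in (False, True):
            print(f"a={a!s:5} b={b!s:5}  AND={and_gate(a, b)!s:5}  OR={or_gate(a, b)}")
```

The point of the exercise is only that a device with "transmit or block" behavior is logically complete enough to build computation on, which is why the search for a practical, mass-producible optical switch was seen as the key obstacle.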

Building such devices required special materials, and many laboratories worked on their development. One of the directions that stood out was the use of structures known by the acronym MQW, meaning Multiple Quantum Well. The intention was to exploit the unique physical properties of very thin layers to build devices that could respond to light, control it, and perhaps in the future also process information using it. If it were indeed possible to produce tiny light switches in huge numbers, it would be possible to imagine computers that operate at extraordinary speeds and with new communication patterns.

One of the great theoretical advantages of the optical computer concerned not only speed, but also architecture. In conventional electronic systems, when processors or components operate in parallel, communication between them is relatively limited and based mainly on predefined links. In an optical system, at least in principle, it is possible to allow more flexible connectivity between distant components. Therefore, proponents of the field believed that optical computers could not only perform calculations faster, but also reorganize the way information flows within the computer.

Almost three decades after the original article was published, one can look back and ask what of all this has come to fruition. The answer is complex. On the one hand, the general optical computer, the one that was supposed to replace the electronic computer, has not become an everyday reality. Even today, most of the logic, memory, and control in computers is performed by electronic means. The transistor has not been replaced, and silicon has not left the stage. In this sense, the prediction of a complete revolution was premature.

On the other hand, the basic direction was right. Light has indeed taken on an increasingly important place in modern computing systems, but not where many expected it to in the 1990s. Instead of replacing the central processor, photonics has penetrated mainly into the areas of high-speed communication: between chips, between processors, between servers and between systems in data centers. As the amount of data grows, and as artificial intelligence systems require faster transfer of information between components, it becomes increasingly clear that the central bottleneck is not always the computation itself but the traffic around it. Here, light has a clear advantage.

The forecast from 1997. From the third millennium by Avi Blizovsky.

In other words, what has been realized is not a pure optical computer, but hybrid computing. Electronics continues to do the bulk of the processing, but photonics helps it deal with limitations of speed, heat, power, and communication. This may be less dramatic than the original dream of a “light computer,” but from a practical perspective it is a very profound development. The world has not given up on the transistor, but it has not given up on light either. Instead of choosing between them, it is learning to combine them.

Therefore, that 1997 article is interesting today not only as a historical document, but also as a reminder of the way in which science and technology actually progress. Sometimes the vision comes true exactly as described. Sometimes it changes form along the way. In the case of the optical computer, the promise did not disappear. It simply found a different path. Not a complete revolution that would erase electronics, but a gradual penetration of photonics into the heart of the computational infrastructure of the new era. In this sense, one could say that the prediction was correct, but at a different pace, and in a different form than it seemed in 1997.

