Quantum computers promise a revolution in our relationship with the world – 09/25/2022 – Alvaro Machado Dias

The great technological myth of today is that artificial intelligence will become infinitely more skilled than us. It’s a promise with dystopian overtones, but one that also includes curing cancer, achieving limitless food productivity, and more.

Its biggest pitfalls are not the ethical debates that dominate the media, but the monumental computing capacity that this possibility requires, even in its most sober versions. The same applies to dynamic simulations of physical environments, known as digital twins, which are central to any metaverse that becomes a relevant part of the real world.

As we debate the pros and cons of these promises, ordinary people grow impatient, because we still have a long way to go, while inventing more powerful processors becomes increasingly difficult.

Technological progress in recent decades owes much to the manifold increase in the capacity of digital devices, which is largely determined by the density of transistors on their chips. Since 1975, the number of transistors has doubled at regular intervals, and processing power has effectively doubled with it.

Current versions of this component are less than 10 nanometers wide; a silicon atom measures about 0.2 nanometers. Below roughly 0.5 nanometers, quantum fluctuations make it impossible to know whether an electron is inside or outside a transistor, rendering digital computing unviable. The dominant view is that this limit will be reached within five years.

We are witnessing the exhaustion of a trend of almost half a century, one that underwrites a good part of the technological promises of the coming decades.

This is not the only problem. Many of our technological dreams are simply not well suited to digital processing. For example, it would take thousands of years of processing to run realistic simulations of the cross-benefits and risks of a portfolio of infrastructure projects across the country and project them into the metaverse.

The promise of quantum computing is to overcome this barrier. This should start happening in four to five years, but it should only gain momentum at the turn of the decade, which could mark the beginning of a new technological era.

More immediately, the point is that a new futuristic imaginary, of quantum inspiration, is entering its consolidation phase. I called it “the quantum of everything” in a previous article, which describes the key concepts at play. If you haven’t had a chance to read it, I recommend doing so when you can.

A cat that is dead and alive at the same time has much more information to offer

In digital computers, electricity flowing through a transistor identifies an on/off state, giving rise to the basic unit of information, the bit, which can be either 0 or 1.

A quantum computer works in a completely different way. Instead of this arrangement, it uses the amplitudes of atomic or subatomic particles, which, besides interfering with one another like waves in the sea, exhibit a wide spectrum of quantum properties, provided they are isolated from the rest of the world.

One of these properties is superposition, which allows a particle to simultaneously occupy the peak and the trough of a wave whose amplitude has not yet been measured, as well as any position in between. Because of this, it carries much more information than its binary counterpart, in which the particle would be in only one place at a time.
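The contrast with a bit can be made concrete. In the sketch below (plain Python, no quantum hardware involved; the variable names are purely illustrative), a qubit’s state is just a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# A classical bit is 0 or 1. A qubit's state is a pair of amplitudes
# (a, b) with |a|^2 + |b|^2 = 1; measuring it yields 0 with
# probability |a|^2 and 1 with probability |b|^2.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)  # equal superposition: "peak and trough at once"

p0, p1 = abs(a) ** 2, abs(b) ** 2
assert math.isclose(p0 + p1, 1.0)  # probabilities must sum to 1

# Until measured, the state carries both possibilities (p0 and p1 are both
# about 0.5); a register of n qubits carries 2**n such amplitudes at once.
```

That exponential growth of the amplitude list is where the counting arguments later in the article come from.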

Inside these new machines, electrons or photons are held in superposition for the time it takes to process the amplitudes as quantum bits, or qubits. These are also held in a state of entanglement, a kind of hyper-strong correlation across the register: the state of one qubit instantly yields information about all the others with which it is entangled. This makes processing almost instantaneous, as it does not require decoding each bit individually.
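Entanglement can be illustrated the same way. The toy simulation below (standard-library Python; a sketch, not real hardware) samples measurements of a Bell pair, the simplest entangled state, whose joint state gives amplitude only to the outcomes 00 and 11, so reading one qubit always determines the other:

```python
import random

# Bell pair: joint state (|00> + |11>) / sqrt(2). Only two of the four
# two-qubit basis states carry any amplitude.
amplitudes = {"00": 2 ** -0.5, "11": 2 ** -0.5}
probs = {state: amp ** 2 for state, amp in amplitudes.items()}

def measure_pair():
    # Sample a joint outcome according to the squared amplitudes.
    return random.choices(list(probs), weights=list(probs.values()))[0]

# The two outcomes are always perfectly correlated.
for _ in range(1_000):
    q0, q1 = measure_pair()
    assert q0 == q1
```

The sampling here is classical, which is exactly why it scales so badly: simulating n entangled qubits honestly requires tracking up to 2**n amplitudes.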

Much of the effort in quantum computing consists of isolating particles from outside interference so that they keep exhibiting their special characteristics. Strictly speaking, these properties are never actually lost; what happens is that exposure causes the particles to become entangled with the external environment and to behave like everything else, a process known as decoherence. The spell is broken.

Quantum attributes radically change the approach to hard problems. While digital processing searches for solutions the way one finds a way out of a maze, testing paths one at a time, quantum computing works as if it were testing them all simultaneously.

The result is staggering: a device with only 300 qubits can represent more states of a problem than there are atoms in the observable universe, while routine operations on 1,000 qubits, which IBM promises for next year, would take more than twenty times the time elapsed since the Big Bang to simulate on traditional computers.
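The first of those counts is easy to check with integer arithmetic. Taking the common order-of-magnitude estimate of 10^80 atoms in the observable universe (an assumption of this sketch, not a figure from the article):

```python
# A register of n qubits spans 2**n basis states.
states_300_qubits = 2 ** 300
atoms_in_universe = 10 ** 80  # common order-of-magnitude estimate

assert states_300_qubits > atoms_in_universe
# 2**300 is a 91-digit number, roughly ten orders of magnitude
# larger than the estimated atom count.
assert len(str(states_300_qubits)) == 91
```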

Currently, quantum computers from different manufacturers are competing for the title of most powerful, all with fewer than 260 qubits, and previously impossible operations are timidly beginning to appear.

However, there are limitations, starting with the impossibility of completely isolating a qubit from the outside world, not least because the result of the computation must eventually be read out. The inadvertent leakage of quantum information from the machine, whether through radiation, observation or heat, is a major challenge to overcome before the technology can take off.

The strategy used today involves compensating for this information leakage, which requires adding many extra entangled qubits for error correction – something like a million physical qubits for a thousand to do the actual job.

Unless physics advances significantly, which the subject-matter experts I spoke with do not expect, it will be necessary to find a way to put millions of qubits into action, and to improve software, before transformative applications emerge.


The idea that quantum computing will replace digital computing is wrong. Its most likely role is to solve problems with many dimensions and few solutions. For example, during Operation Satyagraha, the FBI tried unsuccessfully to break the encryption of Daniel Dantas’s hard drives. That was over ten years ago, and to this day the alleged evidence against the Opportunity banker, whose conviction was overturned shortly afterwards, has never been found. The idea is that a quantum computer could settle this kind of question.

The ability to break even the most complex codes makes this innovation particularly strategic in the military sphere. Quantum cyberattacks are expected to hit other countries’ infrastructure hard, enable eavesdropping on encrypted conversations and, ultimately, allow the theft of military secrets.
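What such code-breaking targets can be sketched with a toy RSA exchange (tiny, insecure numbers chosen purely for illustration). RSA’s security rests on the difficulty of factoring the public modulus; Shor’s algorithm factors efficiently on a quantum computer, which is precisely what makes post-quantum cryptography necessary:

```python
# Toy RSA: real keys use moduli of 2048 bits or more.
p, q = 61, 53                # secret primes
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

message = 42
cipher = pow(message, e, n)          # anyone can encrypt with (n, e)
assert pow(cipher, d, n) == message  # only the holder of d can decrypt

# Shor's algorithm recovers p and q from n in polynomial time,
# exposing d and, with it, every message.
```

At this scale a laptop factors n instantly; at 2048 bits, no classical machine can, and that asymmetry is the entire security argument a large quantum computer would collapse.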

This factor is present in the fierce race for quantum hegemony between the West and China, which China is currently winning. Quantum advances are expected to further exacerbate global power asymmetries from 2025 onwards. As Peter Shor, one of the fathers of quantum computing, told me, this should encourage the development of post-quantum cryptography, which will serve as an antidote to these attacks in the countries where it is available.

Quantum computing is also expected to change the way we deal with global warming. The case of fertilizers is illustrative. Today, we consume 170 million tons of them per year, which accounts for about 1.4% of the total carbon emissions reaching the planet’s atmosphere. Half of the world’s food depends on fertilizers, which can be of natural or synthetic origin. Producing the synthetic type is very costly in ecological terms. If we could mimic what nature does, we could produce cheaper and more sustainable food.

The core of the problem lies in the catalysts for the nitrogen-fixation process, a key step in fertilizer production. Researchers estimate that, using current techniques, it would take some 800,000 years to identify molecules that could effectively replace the catalysts we know, a search that quantum computing may one day make feasible.

The same applies to the production of batteries, solar panels, polymers and electronics, the optimization of airline routes, and many other environmentally costly processes, including computing itself. A quantum computer is not only exponentially more powerful; it is also smaller and more economical than the supercomputers now found in companies, research centers and the like.

The new technology should also boost our climate models, both directly and indirectly, discrediting environmental denialism once and for all. This task depends on complex simulations of human behavior, which is where I see its greatest social contributions, and also its greatest controversies.

Consider, for example, the economic debate, a domain where technical and ideological understandings merge. Values such as intolerance of inequality and intolerance of change, along with hidden group interests, are responsible for the deepest divisions.

Still, it is also true that our inability to map the immediate effects of economic decisions across all their spheres, or to elucidate their long-term effects, given that interactions grow exponentially, contributes to this situation.

Quantum algorithms promise to bring unprecedented precision to economic simulations. Their conclusions will rest on billions of times more dimensions and interactions than we can imagine or calculate with the current state of the art in AI, which should clarify what is at stake in each proposal and make the debate more enlightened. Public policies, international relations and legal frameworks are unlikely to pass unscathed.

On the one hand, this will be great. On the other, it is likely to create a new computational hegemony across most fields of knowledge, more or less as has happened in radiology and the actuarial sciences, where algorithms perform basic tasks better than humans, who end up acting as managers of synthetically produced results.

Will we miss something important along the way? If so, will that loss steer us down worse paths than the current ones? One reading of the 2008 crisis, for example, is that it was fueled by an over-mathematization of economic thinking, which fed the proliferation of obscure financial products. This criticism falls particularly on the use of the Black-Scholes pricing model and its various formal offshoots in the markets.

It’s still too early to know. In general, new technologies introduce distortions that are corrected over time, yielding a balance more positive than negative. Smartphones and the web are examples, as is the history of civilization itself.

What seems plausible is that, long before the long-awaited conscious artificial intelligence arrives, we will have remote, and often indirect, access to far less interesting machines with even greater potential to alter reality.

I thank Prof. Peter Shor (MIT) for his valuable contribution.
