Quantum computing

The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.

Quantum computing is a computing paradigm different from classical computing. It is based on the use of qubits, which can exist in a superposition of ones and zeros. A bit in classical computing can be either 1 or 0, but only one state at a time, whereas a qubit can hold both states simultaneously. This gives rise to new logic gates that make new algorithms possible.

The same task can have different complexity in classical computing compared to that in quantum computing, which has given rise to great expectations, as some intractable problems become tractable. While a classical computer is equivalent to a Turing machine, a quantum computer is equivalent to a quantum Turing machine.

The promise of quantum computers is to solve problems in a fundamentally new way. Researchers hope that with this new approach to computing they can begin to explore problems that we would never be able to solve in any other way.

Dr. Talia Gershon (Director of Research Strategy and Growth Initiatives at IBM) describes quantum computing, very broadly, as a combination of three factors: superposition of spins, entanglement of two objects, and interference, which helps control quantum states, amplifying the kinds of signals that lead toward the correct answer and cancelling those that lead toward the wrong one.

Origin of quantum computing

As technology evolves and transistors shrink to produce ever smaller microchips, processing speeds increase. However, chips cannot be made infinitely small, as there is a limit beyond which they stop working properly. When the nanometre scale is reached, electrons escape from the channels through which they should circulate; this is known as the tunnel effect.

A classical particle that encounters an obstacle it cannot pass through simply bounces off it. But electrons, being quantum particles that behave like waves, have a certain probability of passing through a wall if it is thin enough; in this way the signal can leak into channels where it should not circulate, and the chip stops working correctly.

The idea of quantum computing arose in 1981, when Paul Benioff presented his theory for exploiting the laws of quantum mechanics in the computing environment. Instead of working at the level of electrical voltages, one works at the quantum level. In digital computing, a bit can take only one of two values: 0 or 1. In quantum computing, by contrast, the laws of quantum mechanics allow the particle to be in a coherent superposition: it can be 0, 1, or 0 and 1 at the same time (two orthogonal states of a subatomic particle). This allows several operations to be performed at the same time, depending on the number of qubits.

The number of qubits indicates how many bits can be in superposition. With conventional bits, a three-bit register has eight possible values but can hold only one of them at a time. A register of three qubits, on the other hand, can hold all eight values at once thanks to quantum superposition, which would allow a total of eight parallel operations. As expected, the number of simultaneous values grows exponentially with the number of qubits.
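
To make this concrete, the sketch below (a minimal illustration in Python with NumPy, not taken from the original article) builds the state vector of a 3-qubit register: 2³ = 8 complex amplitudes, one per basis state. A Hadamard gate on each qubit produces the equal superposition, yet a simulated measurement still returns only one of the eight values.

    import numpy as np

    n = 3
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0                        # start in the classical state |000>

    # A Hadamard gate on each qubit produces an equal superposition of all 8 states.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    H_all = H
    for _ in range(n - 1):
        H_all = np.kron(H_all, H)         # build the 8x8 operator H (x) H (x) H
    state = H_all @ state                 # every amplitude is now 1/sqrt(8)

    # A measurement yields exactly one of the 8 values, with probability |amplitude|^2.
    probs = np.abs(state) ** 2
    outcome = np.random.default_rng().choice(2**n, p=probs)
    print(np.round(state.real, 3))        # eight equal amplitudes of about 0.354
    print(f"measured |{outcome:03b}>")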

To get an idea of the breakthrough, a 30-qubit quantum computer would be equivalent to a conventional processor of 10 teraflops (10 trillion floating-point operations per second); for comparison, the conventional Summit supercomputer can process 200 petaflops (200,000 trillion operations per second).

How does quantum computing work?

In the traditional computing model the bit is the minimum unit of information. It corresponds to a binary system and can take only two values, represented by 0 and 1. By using more bits, they can be combined to represent a greater amount of information.

In the quantum computing system, by contrast, the minimum unit of information is the qubit, which obeys the principle of quantum superposition. Thanks to this property a qubit can take on several values at the same time: it can be 0 and 1, and when several qubits are combined the superposition extends over all of them simultaneously, so a set of two qubits can represent a superposition of the values 00, 01, 10 and 11 at the same time. This increase in the capacity for superposition is equivalent to a greater capacity for representing information.

Entanglement: a property by which two qubits that have been entangled (placed in correlation) can be manipulated so as to behave in exactly the same way, guaranteeing that operations can be carried out in parallel or simultaneously. This principle is known as quantum parallelism; it allows the capacity to perform parallel operations to grow exponentially with the number of qubits on which the computer can operate.
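
The following minimal sketch (again Python with NumPy, illustrative only) shows the correlation described above using a Bell state of two entangled qubits: whenever one qubit is measured as 0 or 1, the other yields the same result.

    import numpy as np

    # Two-qubit basis order: |00>, |01>, |10>, |11>
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    rng = np.random.default_rng()
    probs = np.abs(bell) ** 2
    for _ in range(5):
        outcome = rng.choice(4, p=probs)
        a, b = outcome >> 1, outcome & 1          # split the result into the two qubits' bits
        print(f"qubit A = {a}, qubit B = {b}")    # the two bits always agree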

Quantum computing problems

One of the main obstacles to quantum computing is the problem of quantum decoherence, which causes loss of the unitary character (and, more specifically, the reversibility) of the steps of the quantum algorithm. The decoherence times for the candidate systems, in particular the transverse relaxation time (in the terminology used in nuclear magnetic resonance and magnetic resonance imaging technology), are typically between nanoseconds and seconds, at low temperatures. Error rates are typically proportional to the ratio of the operation time to the decoherence time, so any given operation must be completed in a time much shorter than the decoherence time. If the error rate is low enough, it is possible to use quantum error correction effectively, which would allow computation times longer than the decoherence time and, in principle, arbitrarily long. A limiting error rate of 10⁻⁴ is often quoted, below which efficient application of quantum error correction is assumed to be possible.
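
As a rough back-of-the-envelope illustration of this ratio (the gate and decoherence times below are assumed for the example, not taken from the text), one can compare an assumed operation time and decoherence time against the often-quoted 10⁻⁴ threshold:

    # Assumed, illustrative numbers: a 20 ns gate time and a 100 us decoherence time.
    t_gate = 20e-9
    t_decoherence = 100e-6

    error_per_gate = t_gate / t_decoherence        # ~2e-4 with these assumptions
    threshold = 1e-4                               # often-quoted limiting error rate

    print(f"estimated error per operation: {error_per_gate:.1e}")
    print("efficient error correction plausible" if error_per_gate < threshold
          else "above the threshold: operations must get faster or coherence longer")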

Dr. Steven Girvin (Professor of Physics at the Yale Quantum Institute), whose main focus is quantum error correction and understanding the concept of fault tolerance, says that "everyone thinks they know it when they see it, but nobody in the quantum case can define it precisely." He also notes that in a quantum system, when observations or measurements are made, the system can change in ways that are out of our control.

Another major problem is scalability, especially considering the considerable increase in the number of qubits needed for any computation involving error correction. For none of the currently proposed systems is it trivial to design hardware capable of handling a high enough number of qubits to solve computationally interesting problems today.

Hardware for quantum computing

The problem of which hardware would be ideal for quantum computing has not yet been resolved. A series of conditions that it must meet has been defined, known as DiVincenzo's criteria, and there are currently several candidates.

Google engineers work (2018) on a quantum processor called "Bristlecone".

Conditions to be met

  • The system must be able to be initialized, that is, brought to a known, controlled starting state.
  • It must be possible to manipulate the qubits in a controlled manner, with a set of operations that forms a universal set of logic gates (able to reproduce any other possible logic gate).
  • The system must maintain its quantum coherence throughout the experiment.
  • The final state of the system must be readable after the computation.
  • The system must be scalable: there must be a well-defined way to increase the number of qubits, in order to tackle problems of higher computational cost.

Candidates

  • Nuclear spins of molecules in solution, in an NMR apparatus.
  • Electric current in SQUIDs.
  • Ions trapped in a vacuum.
  • Quantum dots on solid surfaces.
  • Molecular magnets in micro-SQUIDs.
  • The Kane quantum computer.
  • Adiabatic computation, based on the adiabatic theorem.

Processors

In 2004, scientists at the Institute for Applied Physics at the University of Bonn published results on an experimental quantum register. To do this, they used neutral atoms to store the quantum information, which is why these are called "qubits" by analogy with "bits". Their current goal is to build a quantum gate, which would provide the basic elements that make up processors, the heart of today's computers. It should be noted that a VLSI technology chip currently contains more than 100,000 gates, so practical use is still on the far horizon.

Data transmission

Scientists from the Max Planck and Niels Bohr laboratories published in the journal Nature in November 2004 results on the transmission of quantum information over distances of 100 km using light as the carrier, obtaining success rates of 70%, a level of quality that allows the use of self-correcting transmission protocols. Work is currently being done on the design of repeaters, which would allow information to be transmitted over greater distances than those already achieved.

Computer programs

Quantum algorithms

Quantum algorithms are based on a known margin of error in the basic operations and work by reducing that margin to exponentially small levels, comparable to the error level of today's machines. A statevector sketch of Grover's algorithm is shown after the list below.

  • Shor's algorithm
  • Grover's algorithm
  • Deutsch–Jozsa algorithm
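
The sketch below is a plain classical statevector simulation of Grover's algorithm for 3 qubits (an illustration in Python with NumPy, not a quantum implementation; the marked index is an arbitrary choice). After about (π/4)·√N iterations, measuring the register returns the marked item with high probability.

    import numpy as np

    n = 3
    N = 2**n
    marked = 5                                   # the item we want to find, |101> (arbitrary)

    state = np.full(N, 1 / np.sqrt(N))           # equal superposition over all N items

    iterations = int(round(np.pi / 4 * np.sqrt(N)))   # about (pi/4)*sqrt(N) iterations
    for _ in range(iterations):
        state[marked] *= -1                      # oracle: flip the sign of the marked item
        state = 2 * state.mean() - state         # diffusion: inversion about the mean

    probs = state**2
    print(f"probability of measuring the marked item: {probs[marked]:.3f}")   # about 0.945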

Models

  • Benioff quantum computer
  • Feynman quantum computer
  • Deutsch quantum computer

Complexity

The BQP complexity class studies the cost of quantum algorithms with low margin of error.

Proposed problems

Quantum computing has been suggested as a superior alternative to classical computing for a number of problems, including:

  • Integer factoring
  • Discrete logarithm
  • Simulation of quantum systems: Richard Feynman argued in 1982 that quantum computers would be effective as universal simulators of quantum systems, and in 1996 it was proved that the conjecture was correct.

Timeline

1980s

At the beginning of the 1980s, the first theories that pointed to the possibility of performing calculations of a quantum nature began to emerge.

1981 - Paul Benioff

The essential ideas of quantum computing came from the mind of Paul Benioff, who worked at the Argonne National Laboratory in Illinois (United States). He imagined a traditional computer (Turing machine) that worked with some principles of quantum mechanics.

1981-1982 Richard Feynman

Richard Feynman, a physicist at the California Institute of Technology (United States) and winner of the Nobel Prize in 1965, presented a paper at the First Conference on the Physics of Computation, held at the Massachusetts Institute of Technology (United States). His talk, entitled "Simulating physics with computers", proposed the use of quantum phenomena to perform computational calculations and stated that, given their nature, some highly complex calculations would be performed faster on a quantum computer.

1985 - David Deutsch

David Deutsch, an Israeli-born physicist at the University of Oxford (England), described the first universal quantum computer, that is, one capable of simulating any other quantum computer (extended Church–Turing principle). Thus the idea arose that a quantum computer could run different quantum algorithms.[citation needed]

1990s

At this time, theory began to be translated into practice: the first quantum algorithms, the first quantum applications, and the first machines capable of performing quantum calculations appeared.

1993 - Dan Simon

From Microsoft's research department (Microsoft Research) came a theoretical problem that demonstrated the practical advantage a quantum computer would have over a traditional one.

He compared the classical probability model with the quantum model, and his ideas served as the basis for the development of later algorithms (such as Shor's).

1993 - Charles Bennett

This researcher at the IBM research center in New York discovered quantum teleportation, which opened a new avenue of research towards the development of quantum communications.

1994-1995 Peter Shor

This American scientist from AT&T Bell Laboratories defined the algorithm that bears his name, which allows the prime factors of numbers to be calculated much faster than on any traditional computer. In addition, his algorithm would make it possible to break many of the cryptographic systems currently in use. His algorithm served to demonstrate to a large part of the scientific community, which had been looking incredulously at the possibilities of quantum computing, that it was a field of research with great potential. A year later, he also proposed a system of error correction for quantum computation.

1996 - Lov Grover

He invented the database search algorithm that bears his name, Grover's algorithm. Although the speed-up achieved is not as dramatic as in factoring or in physical simulations, its range of applications is much broader. Like the rest of the quantum algorithms, it is a probabilistic algorithm with a high success rate.

1997 - First experiments

In 1997 the first practical experiments began, opening the door to implementing all those calculations and experiments that had until then only been described theoretically. The first secure communication experiment using quantum cryptography was successfully performed over a distance of 23 km. In addition, the first quantum teleportation of a photon was carried out.

1998-1999 First qubits

Researchers at Los Alamos and the Massachusetts Institute of Technology managed to propagate the first qubit through a solution of amino acids. It was the first step towards analyzing the information carried by a qubit. During that same period, the first 2-qubit machine was built and presented at the University of California, Berkeley (USA). A year later, in 1999, the first 3-qubit machine was created in the IBM Almaden laboratories; it was also able to execute Grover's search algorithm for the first time.

Year 2000 to now

2000 - Progress Continues

Once again IBM, led by Isaac Chuang, created a 5-qubit quantum computer capable of running an order-finding algorithm, which is part of Shor's algorithm. This algorithm was executed in a single step, whereas a traditional computer would require numerous iterations. That same year, scientists at the Los Alamos National Laboratory (USA) announced the development of a 7-qubit quantum computer. Using a nuclear magnetic resonance device, electromagnetic pulses can be applied, making it possible to emulate the bit encoding of traditional computers.

2001 - Shor's algorithm executed

IBM and Stanford University succeeded in executing Shor's algorithm for the first time, on the 7-qubit quantum computer developed at Los Alamos. In the experiment the prime factors of 15 were calculated, giving the correct result of 3 and 5, using 10¹⁸ molecules, each with seven atoms.

2005 - The first Qbyte

The Institute for Quantum Optics and Quantum Information at the University of Innsbruck (Austria) announced that its scientists had created the first qubyte, an array of 8 qubits, using ion traps.

2006 - Quantum control improvements

Scientists in Waterloo and Massachusetts devised methods for improved quantum control and succeeded in developing a 12-qubit system. Quantum control becomes increasingly complex as the number of qubits used by computers increases.

2007 - D-Wave

The Canadian company D-Wave Systems reportedly unveiled, on February 13, 2007 in Silicon Valley, the first commercial general-purpose 16-qubit quantum computer; the same company later admitted that the machine, called Orion, is not really a quantum computer, but a kind of general-purpose machine that uses some quantum mechanics to solve problems.[citation needed]

2007 - Quantum Bus

In September 2007, two American research teams, the National Institute of Standards and Technology (NIST) in Boulder and Yale University in New Haven, succeeded in joining quantum components through superconductors.

Thus the first quantum bus appeared; this device can also be used as a quantum memory, retaining quantum information for a short time before it is transferred to the next device.

2008 - Storage

According to the US National Science Foundation (NSF), a team of scientists managed to store a qubit inside the nucleus of a phosphorus atom for the first time, and they were able to keep the information intact for 1.75 seconds. This period can be extended through error correction methods, making it a great advance in information storage.

2009 - Solid State Quantum Processor

An American team of researchers led by Professor Robert Schoelkopf of Yale University, who had already developed the quantum bus in 2007, created the first solid-state quantum processor, a mechanism that resembles and works in a similar way to a conventional microprocessor, albeit with the ability to perform only a few very simple tasks, such as arithmetic or data lookups.

Communication within the device is done by means of photons traveling on the quantum bus, an electronic circuit that stores and measures microwave photons, artificially increasing the size of an atom.

2011 - First quantum computer sold

The first commercial quantum computer is sold by D-Wave Systems, founded in 1999, to Lockheed Martin for $10 million.

2012 - Advances in quantum chips

IBM announces that it has created a chip stable enough to allow quantum computing to reach homes and businesses. It is estimated that in about 10 or 12 years the first quantum systems may be on the market.

2013 - Quantum computer faster than a conventional computer

In April the company D-Wave Systems launched the new quantum computer D-Wave Two, claimed to be 500,000 times more powerful than its predecessor, the D-Wave One, with a computing power of 439 qubits. In practice, the D-Wave Two ran into serious problems, since it did not deliver the theoretical processing improvements over the D-Wave One. It was compared against a computer based on the 2.9 GHz Intel Xeon E5-2690 microprocessor, against which its results were on average about 4,000 times faster.

In 2016, Intel began working on silicon as a platform for its first quantum computer.

In May 2017, IBM introduces a new commercial quantum processor, its most powerful to date with 17 qubits.

2019 - First quantum computer for commercial use

At CES 2019, IBM introduced the IBM Q System One, the first quantum computer for commercial use. It combines both quantum and "traditional" computing to offer a 20-qubit system for use in research and large calculations.[3]

On September 18, IBM announced that it would soon launch its fourteenth quantum computer, with 53 qubits, the largest and most powerful commercially available to date.

On September 20, the Financial Times first reported that "Google claims to have achieved quantum supremacy".

2022 - 433-qubit quantum processor

On November 9, 2022, as part of the IBM Quantum Summit, IBM presented Osprey, its 433-qubit quantum processor.[4]

Quantum computing and its impact on communications security

It is a fact that quantum computing will revolutionize various sectors, including telecommunications, and this will also affect the cryptography and security of internet communications. Today we make use of asymmetric cryptography on every website. The two most widely used asymmetric encryption algorithms are RSA, whose security rests on the difficulty of factoring large numbers, and cryptography based on the mathematical structure of elliptic curves (ECC). The problem is that these cryptographic methods would be easily broken by a quantum computer, since algorithms can be developed that exploit quantum parallelism to solve such complex mathematical problems.
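
As a toy illustration of why efficient factoring matters (the numbers below are deliberately tiny and purely pedagogical, nothing like real key sizes), the following Python sketch shows that whoever knows the prime factors p and q of the public RSA modulus can derive the private key and decrypt:

    # Secret primes; a large-scale run of Shor's algorithm would recover them from n.
    p, q = 61, 53
    n = p * q                      # public modulus: 3233
    e = 17                         # public exponent
    phi = (p - 1) * (q - 1)        # Euler's totient, computable only if p and q are known
    d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)        # encryption with the public key
    recovered = pow(ciphertext, d, n)      # decryption with the derived private key
    print(recovered == message)            # True: knowing the factors reveals the secret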

Quantum computing and privacy on the internet

Quantum computing will not end privacy on the internet; however, it will affect and render obsolete the main current encryption methods, such as RSA and ECC.

New asymmetric encryption schemes that guarantee privacy in communications and are not vulnerable to quantum computing are currently being investigated. This research is grouped into four main families:

  1. Lattice-based asymmetric cryptography: instead of a number-factoring problem, it relies on the hard lattice problems of finding the shortest vector (SVP) or finding the closest vector (CVP). Algorithms already developed on this basis: "Goldreich-Goldwasser-Halevi (GGH)" and "Number Theory Research Unit (NTRU)".
  2. Multivariate cryptography, based on multivariate polynomials over a finite field. Algorithms already developed on this basis: "Unbalanced Oil and Vinegar".
  3. Cryptography based on error-correcting codes. Algorithms already developed on this basis: "McEliece".
  4. Hash-based digital signature schemes. Algorithms already developed on this basis: "Lamport" and "Merkle" (a minimal sketch of a Lamport signature follows this list).
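
As an example of the fourth family, here is a minimal Python sketch of a Lamport one-time signature (illustrative only; real deployments combine it with Merkle trees so that many messages can be signed, and add many practical details):

    import hashlib
    import secrets

    def H(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def keygen():
        # 256 pairs of random secrets; the public key is their hashes.
        sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def sign(message: bytes, sk):
        digest = H(message)
        bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
        # Reveal one secret of each pair, chosen by the corresponding message bit.
        return [sk[i][bit] for i, bit in enumerate(bits)]

    def verify(message: bytes, signature, pk) -> bool:
        digest = H(message)
        bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
        return all(H(sig) == pk[i][bit] for i, (sig, bit) in enumerate(zip(signature, bits)))

    sk, pk = keygen()
    sig = sign(b"hello post-quantum world", sk)
    print(verify(b"hello post-quantum world", sig, pk))   # True
    print(verify(b"tampered message", sig, pk))           # False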

Possibilities offered by quantum cryptography

Thanks to the characteristics of quantum mechanics and Heisenberg's uncertainty principle, quantum cryptography can guarantee absolute confidentiality by enabling the secure exchange of keys between two parties even when the communication channel is being listened to by an outsider, since the intruder's interaction would modify the information being transmitted. Quantum key exchange algorithms take advantage of this to guarantee (probabilistically) that only the sender and receiver know the key.
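
A simplified classical simulation of the BB84 key-exchange idea is sketched below in Python (no quantum channel or eavesdropper is modeled; it only shows the basis-matching step that produces the shared key, which is where an intruder's disturbances would later show up as errors):

    import secrets

    n = 32
    alice_bits  = [secrets.randbelow(2) for _ in range(n)]
    alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n)]

    # Without an eavesdropper, Bob reads Alice's bit when the bases match,
    # and gets a random bit otherwise.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Publicly compare bases (not bits) and keep the matching positions as the key.
    key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    print(key_alice == key_bob)   # True in this noiseless, eavesdropper-free sketch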

Complementary bibliography

  • Agustín Rayo, «Quantum Computing», Investigación y Ciencia, 405, June 2010, pp. 92-93.
  • Mastriani, Mario (4 September 2014). Quantum correlated matrix memories, simple and improved: a proposal for their study and simulation on GPGPU. p. 268. Retrieved 12 September 2014.
  • Gershenfeld, Neil, and Isaac L. Chuang. "Quantum computing with molecules." Scientific American 278.6 (1998): 66-71.
  • Gershon, Talia (2018). Quantum Computing Expert Explains One Concept in 5 Levels of Difficulty. Retrieved 2021, from WIRED.
