Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory,[1][2][3] and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of Von Neumann entropy and the general computational term.
It is an interdisciplinary field that involves quantum mechanics, computer science, information theory, philosophy and cryptography among other fields.[4][5][6] Its study is also relevant to disciplines such as cognitive science, psychology and neuroscience.[7][8][9][10] Its main focus is on extracting information from matter at the microscopic scale. Observation in science is one of the most important ways of acquiring information, and measurement is required in order to quantify the observation, making this crucial to the scientific method. In quantum mechanics, due to the uncertainty principle, non-commuting observables cannot be precisely measured simultaneously, as an eigenstate in one basis is not an eigenstate in the other basis. According to the eigenstate–eigenvalue link, an observable is well-defined (definite) when the state of the system is an eigenstate of the observable.[11] Since any two non-commuting observables are not simultaneously well-defined, a quantum state can never contain definitive information about both non-commuting observables.[8]
Data can be encoded into the quantum state of a quantum system as quantum information.[12] While quantum mechanics deals with examining properties of matter at the microscopic level,[13][8] quantum information science focuses on extracting information from those properties,[8] and quantum computation manipulates and processes information – performs logical operations – using quantum information processing techniques.[14]
Quantum information, like classical information, can be processed using digital computers, transmitted from one location to another, manipulated with algorithms, and analyzed with computer science and mathematics. Just as the basic unit of classical information is the bit, the basic unit of quantum information is the qubit.[15] Quantum information can be measured using Von Neumann entropy.
Recently, the field of quantum computing has become an active research area because of its potential to disrupt modern computation, communication, and cryptography.[14][16]
The history of quantum information theory began at the turn of the 20th century when classical physics was revolutionized into quantum physics. The theories of classical physics were predicting absurdities such as the ultraviolet catastrophe, or electrons spiraling into the nucleus. At first these problems were brushed aside by adding ad hoc hypotheses to classical physics. Soon, it became apparent that a new theory must be created in order to make sense of these absurdities, and the theory of quantum mechanics was born.[2]
Quantum mechanics was formulated by Erwin Schrödinger using wave mechanics and Werner Heisenberg using matrix mechanics.[17] The equivalence of these methods was proven later.[18] Their formulations described the dynamics of microscopic systems but had several unsatisfactory aspects in describing measurement processes. Von Neumann formulated quantum theory using operator algebra in a way that it described measurement as well as dynamics.[19] These studies emphasized the philosophical aspects of measurement rather than a quantitative approach to extracting information via measurements.
See: Dynamical Pictures
Evolution of: | Schrödinger picture (S) | Heisenberg picture (H) | Interaction picture (I)
Ket state | $|\psi_S(t)\rangle = e^{-iH_S t/\hbar}|\psi_S(0)\rangle$ | constant | $|\psi_I(t)\rangle = e^{iH_{0,S} t/\hbar}|\psi_S(t)\rangle$
Observable | constant | $A_H(t) = e^{iH_S t/\hbar} A_S e^{-iH_S t/\hbar}$ | $A_I(t) = e^{iH_{0,S} t/\hbar} A_S e^{-iH_{0,S} t/\hbar}$
Density matrix | $\rho_S(t) = e^{-iH_S t/\hbar} \rho_S(0) e^{iH_S t/\hbar}$ | constant | $\rho_I(t) = e^{iH_{0,S} t/\hbar} \rho_S(t) e^{-iH_{0,S} t/\hbar}$
In the 1960s, Ruslan Stratonovich, Carl Helstrom and Gordon[20] proposed a formulation of optical communications using quantum mechanics. This was the first historical appearance of quantum information theory. They mainly studied error probabilities and channel capacities for communication.[20][21][22] Later, Alexander Holevo obtained an upper bound of communication speed in the transmission of a classical message via a quantum channel.[23][24]
In the 1970s, techniques for manipulating single-atom quantum states, such as the atom trap and the scanning tunneling microscope, began to be developed, making it possible to isolate single atoms and arrange them in arrays. Prior to these developments, precise control over single quantum systems was not possible, and experiments used coarser, simultaneous control over a large number of quantum systems.[2] The development of viable single-state manipulation techniques led to increased interest in the field of quantum information and computation.
In the 1980s, interest arose in whether it might be possible to use quantum effects to disprove Einstein's theory of relativity. If it were possible to clone an unknown quantum state, it would be possible to use entangled quantum states to transmit information faster than the speed of light, disproving Einstein's theory. However, the no-cloning theorem showed that such cloning is impossible. The theorem was one of the earliest results of quantum information theory.[2]
Despite all the excitement and interest over studying isolated quantum systems and trying to find a way to circumvent the theory of relativity, research in quantum information theory became stagnant in the 1980s. However, around the same time another avenue began to draw on quantum information and computation: cryptography. In a general sense, cryptography is the problem of doing communication or computation involving two or more parties who may not trust one another.[2]
Bennett and Brassard developed a communication channel on which it is impossible to eavesdrop without being detected, a way of communicating secretly at long distances using the BB84 quantum cryptographic protocol.[25] The key idea was the use of the fundamental principle of quantum mechanics that observation disturbs the observed: an eavesdropper on a secure communication line will immediately make their presence known to the two parties trying to communicate.
Alan Turing's revolutionary idea of a programmable computer, the Turing machine, showed that any real-world computation can be translated into an equivalent computation involving a Turing machine.[26][27] This is known as the Church–Turing thesis.
Soon enough, the first computers were made, and computer hardware grew at such a fast pace that the growth, through experience in production, was codified into an empirical relationship called Moore's law. This 'law' is a projective trend that states that the number of transistors in an integrated circuit doubles every two years.[28] As transistors began to become smaller and smaller in order to pack more power per surface area, quantum effects started to show up in the electronics resulting in inadvertent interference. This led to the advent of quantum computing, which uses quantum mechanics to design algorithms.
At this point, quantum computers showed promise of being much faster than classical computers for certain specific problems. One such example problem was developed by David Deutsch and Richard Jozsa, known as the Deutsch–Jozsa algorithm. This problem, however, held little to no practical application.[2] In 1994, Peter Shor turned to a very important and practical problem: finding the prime factors of an integer. This problem, together with the closely related discrete logarithm problem, could theoretically be solved efficiently on a quantum computer but not by any known efficient classical algorithm, suggesting that quantum computers are more powerful than classical computers for certain tasks.
Around the time computer science was making a revolution, so were information theory and communication, through Claude Shannon.[29][30][31] Shannon developed two fundamental theorems of information theory: the noiseless channel coding theorem and the noisy channel coding theorem. He also showed that error-correcting codes could be used to protect information being sent.
Quantum information theory followed a similar trajectory: in 1995, Ben Schumacher proved an analogue of Shannon's noiseless coding theorem using the qubit. A theory of quantum error correction also developed, which allows quantum computers to compute reliably in the presence of noise and to communicate reliably over noisy quantum channels.[2]
Quantum information differs strongly from classical information, epitomized by the bit, in many striking and unfamiliar ways. While the fundamental unit of classical information is the bit, the most basic unit of quantum information is the qubit. Classical information is measured using Shannon entropy, while the quantum mechanical analogue is Von Neumann entropy. Given a statistical ensemble of quantum mechanical systems with the density matrix $\rho$, it is given by[2] $S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho)$. Many of the same entropy measures in classical information theory can also be generalized to the quantum case, such as Holevo entropy[32] and the conditional quantum entropy.
Unlike classical digital states (which are discrete), a qubit is continuous-valued, describable by a direction on the Bloch sphere. Despite being continuously valued in this way, a qubit is the smallest possible unit of quantum information, and it is impossible to measure its continuous value precisely. Five famous theorems describe the limits on manipulation of quantum information: the no-teleportation theorem, the no-cloning theorem, the no-deleting theorem, the no-broadcasting theorem, and the no-hiding theorem.[2]
These theorems are proven from unitarity, which according to Leonard Susskind is the technical term for the statement that quantum information within the universe is conserved.[33]: 94 The five theorems open possibilities in quantum information processing.
The state of a qubit contains all of its information. This state is frequently expressed as a vector on the Bloch sphere. It can be changed by applying linear transformations, or quantum gates, to it. These unitary transformations are described as rotations on the Bloch sphere. While classical gates correspond to the familiar operations of Boolean logic, quantum gates are physical unitary operators.
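As a concrete illustration, the following sketch (assuming NumPy; the particular state is a made-up example) represents a qubit as a two-component complex vector and applies a Hadamard gate, a unitary transformation corresponding to a rotation of the Bloch sphere:

```python
import numpy as np

# A qubit state a|0> + b|1> as a normalized complex vector (example: the |0> state)
psi = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate: a unitary operator (a 180-degree rotation of the Bloch
# sphere about the axis halfway between x and z)
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi_out = H @ psi                                  # apply the gate
print(psi_out)                                     # (|0> + |1>)/sqrt(2)
print(np.allclose(H.conj().T @ H, np.eye(2)))      # unitarity check: H†H = I
```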
The study of the above topics and differences comprises quantum information theory.
Quantum mechanics is the study of how microscopic physical systems change dynamically in nature. In the field of quantum information theory, the quantum systems studied are abstracted away from any real world counterpart. A qubit might for instance physically be a photon in a linear optical quantum computer, an ion in a trapped ion quantum computer, or it might be a large collection of atoms as in a superconducting quantum computer. Regardless of the physical implementation, the limits and features of qubits implied by quantum information theory hold as all these systems are mathematically described by the same apparatus of density matrices over the complex numbers. Another important difference with quantum mechanics is that while quantum mechanics often studies infinite-dimensional systems such as a harmonic oscillator, quantum information theory is concerned with both continuous-variable systems[34] and finite-dimensional systems.[8][35][36]
Entropy measures the uncertainty in the state of a physical system.[2] Entropy can be studied from the point of view of both the classical and quantum information theories.
Classical information is based on the concepts of information laid out by Claude Shannon. Classical information, in principle, can be stored as bits in binary strings. Any system having two states can serve as a bit.[37]
Shannon entropy is the quantification of the information gained by measuring the value of a random variable. Another way of thinking about it is by looking at the uncertainty of a system prior to measurement. As a result, entropy, as pictured by Shannon, can be seen either as a measure of the uncertainty prior to making a measurement or as a measure of information gained after making said measurement.[2]
Shannon entropy, written as a functional of a discrete probability distribution, $P(x_1), P(x_2), \ldots, P(x_n)$, associated with events $x_1, x_2, \ldots, x_n$, can be seen as the average information associated with this set of events, in units of bits: $$H(X) = -\sum_{i=1}^{n} P(x_i)\log_2 P(x_i)$$
This definition of entropy can be used to quantify the physical resources required to store the output of an information source. The ways of interpreting Shannon entropy discussed above are usually only meaningful when the number of samples of an experiment is large.[35]
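As a minimal illustration of the definition above (a sketch assuming NumPy; the distributions are arbitrary examples), Shannon entropy in bits can be computed directly from a probability vector:

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_i p_i log2 p_i, ignoring zero-probability events."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits: a biased coin
print(shannon_entropy([1.0]))        # 0.0 bits: no uncertainty at all
```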
The Rényi entropy is a generalization of the Shannon entropy defined above. The Rényi entropy of order r, written as a function of a discrete probability distribution, $P(x_1), P(x_2), \ldots, P(x_n)$, associated with events $x_1, x_2, \ldots, x_n$, is defined as:[37]
$$H_r(X) = \frac{1}{1-r}\log_2\left(\sum_{i=1}^{n} P(x_i)^r\right)$$ for $0 < r < \infty$ and $r \neq 1$.
We arrive at the definition of Shannon entropy from Rényi when $r \to 1$, of Hartley entropy (or max-entropy) when $r \to 0$, and of min-entropy when $r \to \infty$.
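The limiting cases can be checked numerically. The sketch below (assuming NumPy; the distribution is an arbitrary example) evaluates the Rényi entropy near the three limits mentioned above:

```python
import numpy as np

def renyi_entropy(p, r):
    """H_r(X) = (1/(1-r)) * log2(sum_i p_i^r), for r > 0 and r != 1."""
    p = np.asarray(p, dtype=float)
    return float(np.log2(np.sum(p ** r)) / (1 - r))

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0.999999))  # ~1.5 bits: approaches the Shannon entropy as r -> 1
print(renyi_entropy(p, 1e-9))      # ~log2(3) ≈ 1.585: Hartley (max-) entropy as r -> 0
print(renyi_entropy(p, 100))       # ~1.0 = -log2(max p): min-entropy as r -> infinity
```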
Quantum information theory is largely an extension of classical information theory to quantum systems. Classical information is produced when measurements of quantum systems are made.[37]
One interpretation of Shannon entropy was the uncertainty associated with a probability distribution. When we want to describe the information or the uncertainty of a quantum state, the probability distributions are simply replaced by density operators $\rho$: $$S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho) = -\sum_i \lambda_i \log_2 \lambda_i,$$
where $\lambda_i$ are the eigenvalues of $\rho$.
Von Neumann entropy plays a role in quantum information similar to the role Shannon entropy plays in classical information.
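As a sketch (assuming NumPy; the density matrices below are standard textbook examples), the Von Neumann entropy can be computed from the eigenvalues of a density operator exactly as in the formula above:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lambda_i log2 lambda_i over the nonzero eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)   # eigenvalues of a Hermitian matrix
    evals = evals[evals > 1e-12]      # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

pure = np.array([[1, 0], [0, 0]], dtype=complex)   # pure state |0><0|
mixed = np.eye(2, dtype=complex) / 2               # maximally mixed qubit
print(von_neumann_entropy(pure))    # 0.0: a pure state carries no uncertainty
print(von_neumann_entropy(mixed))   # 1.0: one bit of uncertainty
```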
Quantum communication is one of the applications of quantum physics and quantum information. There are some famous theorems, such as the no-cloning theorem, that illustrate some important properties in quantum communication. Dense coding and quantum teleportation are also applications of quantum communication. They are two opposite ways to communicate using qubits. While teleportation transfers one qubit from Alice to Bob by communicating two classical bits, under the assumption that Alice and Bob share a Bell state in advance, dense coding transfers two classical bits from Alice to Bob by using one qubit, again under the same assumption that Alice and Bob have a pre-shared Bell state.
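The resource trade-off of dense coding can be made concrete with a small simulation (a sketch assuming NumPy; the mapping of bit pairs to Pauli operators is one common convention, not the only one). Alice encodes two classical bits by applying a Pauli operator to her half of the shared Bell state, and Bob recovers them with a Bell-basis measurement:

```python
import numpy as np

# Pauli matrices and identity (single-qubit operators)
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Pre-shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Alice encodes two classical bits by applying I, X, Z or ZX to her qubit
encodings = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

# The four Bell states Bob distinguishes with his Bell-basis measurement
bell_basis = {
    (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),   # |Phi+>
    (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),   # |Psi+>
    (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),  # |Phi->
    (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),  # |Psi->
}

for bits, gate in encodings.items():
    # Alice acts only on her qubit (the first tensor factor), then sends it to Bob
    state = np.kron(gate, I) @ phi_plus
    # Bob's Bell measurement: probability of each Bell-basis outcome
    probs = {b: abs(np.vdot(v, state)) ** 2 for b, v in bell_basis.items()}
    decoded = max(probs, key=probs.get)
    print(f"sent {bits} -> decoded {decoded}")   # each pair is recovered exactly
```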
One of the best known applications of quantum cryptography is quantum key distribution, which provides a theoretical solution to the security issue of a classical key. The advantage of quantum key distribution is that it is impossible to copy a quantum key because of the no-cloning theorem. If someone tries to read the encoded data, the quantum state being transmitted will change. This can be used to detect eavesdropping.
The first quantum key distribution scheme, BB84, was developed by Charles Bennett and Gilles Brassard in 1984. It is usually explained as a method of securely communicating a private key from one party to another for use in one-time pad encryption.[2]
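A minimal sketch of the sifting step of a BB84-style protocol is shown below (plain Python; idealized, noiseless, and with no eavesdropper, and the basis labels and key length are arbitrary choices for the example):

```python
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("ZX") for _ in range(n)]   # Z = rectilinear, X = diagonal
bob_bases   = [random.choice("ZX") for _ in range(n)]

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        bob_results.append(bit)                  # same basis: Bob recovers Alice's bit
    else:
        bob_results.append(random.randint(0, 1))  # wrong basis: the outcome is random

# Public discussion: keep only positions where the bases matched (the sifted key)
sifted_key = [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
print(sifted_key)
```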
E91 was made by Artur Ekert in 1991. His scheme uses entangled pairs of photons. These two photons can be created by Alice, Bob, or by a third party including eavesdropper Eve. One of the photons is distributed to Alice and the other to Bob so that each one ends up with one photon from the pair.
This scheme relies on two properties of quantum entanglement: first, the measurement outcomes on the two photons are perfectly correlated when Alice and Bob measure in the same basis; second, any attempt at eavesdropping by Eve disturbs these correlations in a way that Alice and Bob can detect.
B92 is a simpler version of BB84.[38]
The main difference between B92 and BB84 is that B92 needs only two non-orthogonal states, whereas BB84 uses four polarization states drawn from two conjugate bases.
As in BB84, Alice transmits to Bob a string of photons encoded with randomly chosen bits, but this time the bit values determine which states she must send. Bob still randomly chooses a basis in which to measure, but if he chooses the wrong basis, he will not measure anything conclusive, a result guaranteed by quantum mechanics. Bob can simply tell Alice after each bit she sends whether or not he obtained a conclusive measurement.[39]
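The conclusive/inconclusive logic described above can be sketched as follows (plain Python; idealized and noiseless, and the encoding of bit 0 as |0⟩ and bit 1 as |+⟩ follows the usual presentation of B92):

```python
import random

# B92 sketch: bit 0 -> |0>, bit 1 -> |+>.  Bob's conclusive outcomes are
# "1" in the Z basis (impossible for |0>) and "-" in the X basis (impossible for |+>).

def bob_measures(alice_bit, bob_basis):
    """Return Bob's inferred bit, or None if the outcome is inconclusive."""
    if bob_basis == "Z":
        if alice_bit == 0:
            return None                                  # |0> in Z always gives "0"
        return 1 if random.random() < 0.5 else None      # |+> gives "1" half the time
    else:  # X basis
        if alice_bit == 1:
            return None                                  # |+> in X always gives "+"
        return 0 if random.random() < 0.5 else None      # |0> gives "-" half the time

alice_bits = [random.randint(0, 1) for _ in range(20)]
sifted = []
for bit in alice_bits:
    result = bob_measures(bit, random.choice("ZX"))
    if result is not None:      # Bob announces only whether the result was conclusive
        sifted.append(result)

print(alice_bits)
print(sifted)   # every conclusive bit matches the corresponding bit Alice sent
```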
The most widely used model in quantum computation is the quantum circuit, which is based on the quantum bit, or "qubit". The qubit is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0 quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are measured, the result of the measurement is always either a 0 or a 1; the probabilities of these two outcomes depend on the quantum state that the qubits were in immediately prior to the measurement.
Any quantum computation algorithm can be represented as a network of quantum logic gates.
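As an illustration of such a network of gates (a sketch assuming NumPy), the following builds the standard two-gate circuit that prepares a Bell state, a Hadamard gate followed by a CNOT, and reads off the measurement probabilities:

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to the first qubit, then CNOT with the first qubit as control
state = np.zeros(4)
state[0] = 1.0
state = CNOT @ np.kron(H, I) @ state

# Measurement probabilities for outcomes 00, 01, 10, 11
probs = np.abs(state) ** 2
for outcome, p in zip(["00", "01", "10", "11"], probs):
    print(outcome, p)   # 00 and 11 each occur with probability 0.5
```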
If a quantum system were perfectly isolated, it would maintain coherence indefinitely, but it would be impossible to manipulate or investigate it. If it is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time; this process is called quantum decoherence. As a result of this process, quantum behavior is apparently lost, just as energy appears to be lost by friction in classical mechanics.
Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is essential if one is to achieve fault-tolerant quantum computation that can deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty quantum preparation, and faulty measurements.
Peter Shor first discovered this method of formulating a quantum error correcting code by storing the information of one qubit onto a highly entangled state of ancilla qubits. A quantum error correcting code protects quantum information against errors.
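As a simplified sketch of the idea (assuming NumPy), the three-qubit bit-flip repetition code below, a much smaller cousin of Shor's nine-qubit code, encodes one logical qubit into three physical qubits, detects a single X error through parity-check syndromes, and corrects it. In a real device the syndromes would be extracted with ancilla qubits and measurements rather than computed directly from the state vector:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def kron(*ops):
    """Tensor product of single-qubit operators, leftmost qubit most significant."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Logical encoding of an example qubit a|0> + b|1>  ->  a|000> + b|111>
a, b = 0.6, 0.8
logical = np.zeros(8)
logical[0b000] = a
logical[0b111] = b

# Introduce a bit-flip (X) error on the middle qubit
corrupted = kron(I, X, I) @ logical

# Syndrome operators: parity checks Z0Z1 and Z1Z2 (expectation values are +1 or -1)
s1 = np.vdot(corrupted, kron(Z, Z, I) @ corrupted).real
s2 = np.vdot(corrupted, kron(I, Z, Z) @ corrupted).real

# Decode the syndrome: (-1,-1) -> flip qubit 1, (-1,+1) -> qubit 0, (+1,-1) -> qubit 2
if s1 < 0 and s2 < 0:
    correction = kron(I, X, I)
elif s1 < 0:
    correction = kron(X, I, I)
elif s2 < 0:
    correction = kron(I, I, X)
else:
    correction = kron(I, I, I)

recovered = correction @ corrupted
print(np.allclose(recovered, logical))   # True: the logical state is restored
```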
Many journals publish research in quantum information science, although only a few are dedicated to this area. Among these are: