
New quantum computing algorithm skips past time limits imposed by decoherence

This could be important!


A new algorithm that fast-forwards simulations could bring greater usability to current and near-term quantum computers, opening the way for applications to run past the strict time limits that hamper many quantum calculations.

“Quantum computers have a limited time to perform calculations before their useful quantum nature, which we call coherence, breaks down,” said Andrew Sornborger of the Computer, Computational, and Statistical Sciences division at Los Alamos National Laboratory, and senior author on a paper announcing the research. “With a new algorithm we have developed and tested, we will be able to fast-forward quantum simulations to solve problems that were previously out of reach.”
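
The sketch below shows, in plain classical linear algebra, one standard way such fast forwarding can work, assuming the approach amounts to (approximately) diagonalizing the short-time evolution operator and then raising the diagonal part to a power, so that long simulation times cost no extra circuit depth. The toy two-spin Hamiltonian and all names here are illustrative, not the authors' code.

import numpy as np
from scipy.linalg import expm

# Toy two-spin Hamiltonian (hypothetical example).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I2) + np.kron(I2, X))

dt, n_steps = 0.1, 500                 # short step; target time t = n_steps * dt
U_dt = expm(-1j * H * dt)              # short-time evolution operator

# "Compile" U(dt) once into diagonal form, U(dt) = W D W^{-1} ...
eigvals, W = np.linalg.eig(U_dt)

# ... then fast-forward: U(n*dt) ~= W D^n W^{-1}, with no extra depth.
U_fast = W @ np.diag(eigvals ** n_steps) @ np.linalg.inv(W)

# Check against direct long-time evolution; the deviation should be tiny.
U_direct = expm(-1j * H * dt * n_steps)
print("max deviation:", np.max(np.abs(U_fast - U_direct)))

On a quantum computer the diagonalizing circuit W and the diagonal D would have to be found within the coherence time (for example variationally); here ordinary linear algebra stands in for that step.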

Computers built of quantum components, known as qubits, can potentially solve extremely difficult problems that exceed the capabilities of even the most powerful modern supercomputers. Applications include faster analysis of large data sets and unraveling the mysteries of superconductivity, to name a few of the possibilities that could lead to major technological and scientific breakthroughs in the near future.

Bringing the promise of quantum computing to nuclear physics

Quantum mechanics, the physics of atoms and subatomic particles, can be strange, especially compared to the everyday physics of Isaac Newton’s falling apples. But this unusual science is enabling researchers to develop new ideas and tools, including quantum computers, that can help demystify the quantum realm and solve complex everyday problems.

That’s the goal behind a new U.S. Department of Energy Office of Science (DOE-SC) grant, awarded to Michigan State University (MSU) researchers, led by physicists at the Facility for Rare Isotope Beams (FRIB). Working with Los Alamos National Laboratory, the team is developing algorithms – essentially programming instructions – for quantum computers to help these machines address problems that are difficult for conventional computers, such as explaining the fundamental quantum science that keeps an atomic nucleus from falling apart.

The $750,000 award, provided by the Office of Nuclear Physics within DOE-SC, is the latest in a growing list of grants supporting MSU researchers developing new quantum theories and technology.

An electrical trigger fires single, identical photons

Secure telecommunications networks and rapid information processing make much of modern life possible. To provide more secure, faster, and higher-performance information sharing than is currently possible, scientists and engineers are designing next-generation devices that harness the rules of quantum physics. Those designs rely on single photons to encode and transmit information across quantum networks and between quantum chips. However, tools for generating single photons do not yet offer the precision and stability required for quantum information technology.

Now, as reported recently in the journal Science Advances, researchers have found a way to generate single, identical photons on demand. By positioning a metallic probe over a designated point in a common 2-D semiconductor material, the team led by researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has triggered single-photon emission electrically. The photon’s properties can be adjusted simply by changing the applied electrical signal.

“The demonstration of electrically driven single-photon emission at a precise point constitutes a big step in the quest for integrable quantum technologies,” said Alex Weber-Bargioni, a staff scientist at Berkeley Lab’s Molecular Foundry who led the project. The research is part of the Center for Novel Pathways to Quantum Coherence in Materials (NPQC), an Energy Frontier Research Center sponsored by the Department of Energy, whose overarching goal is to find new approaches to protect and control quantum memory that can provide new insights into novel materials and designs for quantum computing technology.

A new interpretation of quantum mechanics suggests that reality does not depend on the person measuring it

Quantum mechanics arose in the 1920s, and since then scientists have disagreed on how best to interpret it. Many interpretations, including the Copenhagen interpretation presented by Niels Bohr and Werner Heisenberg, and in particular the von Neumann–Wigner interpretation, state that the consciousness of the person conducting the test affects its result. On the other hand, Karl Popper and Albert Einstein thought that an objective reality exists. Erwin Schrödinger put forward his famous thought experiment involving the fate of an unfortunate cat, which aimed to describe the imperfections of quantum mechanics.

In their most recent article, Finnish civil servants Jussi Lindgren and Jukka Liukkonen, who study quantum mechanics in their free time, take a look at the uncertainty principle that was developed by Heisenberg in 1927. According to the traditional interpretation of the principle, location and momentum cannot be determined simultaneously to an arbitrary degree of precision, as the person conducting the measurement always affects the values.

However, in their study Lindgren and Liukkonen concluded that the correlation between a location and momentum, i.e., their relationship, is fixed. In other words, reality is an object that does not depend on the person measuring it. Lindgren and Liukkonen utilized stochastic dynamic optimization in their study. In their theory’s frame of reference, Heisenberg’s uncertainty principle is a manifestation of thermodynamic equilibrium, in which correlations of random variables do not vanish.
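
For reference, the relation being reinterpreted is the textbook Heisenberg bound on the standard deviations of position and momentum; this is the standard statement, not the authors' stochastic-optimization derivation:

% Standard Heisenberg uncertainty relation (textbook form, for reference).
\[
  \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}
\]

In Lindgren and Liukkonen's reading, this bound reflects non-vanishing correlations between position and momentum at thermodynamic equilibrium rather than a disturbance introduced by the observer.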

The Coding School, IBM Quantum Provide Free Quantum Education to 5,000 Students Around the World

LOS ANGELES, Oct. 6, 2020 /PRNewswire/ — The Coding School is collaborating with IBM Quantum to offer a first-of-its-kind quantum computing course for 5,000 high school students and above, designed to make quantum education globally accessible and to provide high-quality virtual STEM education. To ensure an equitable future quantum workforce, the course is free. Students can apply here.

Verified quantum information scrambling

Circa 2019


Quantum scrambling is the dispersal of local information into many-body quantum entanglements and correlations distributed throughout an entire system. This concept accompanies the dynamics of thermalization in closed quantum systems, and has recently emerged as a powerful tool for characterizing chaos in black holes [1-4]. However, the direct experimental measurement of quantum scrambling is difficult, owing to the exponential complexity of ergodic many-body entangled states. One way to characterize quantum scrambling is to measure an out-of-time-ordered correlation function (OTOC); however, because scrambling leads to their decay, OTOCs do not generally discriminate between quantum scrambling and ordinary decoherence. Here we implement a quantum circuit that provides a positive test for the scrambling features of a given unitary process [5,6]. This approach conditionally teleports a quantum state through the circuit, providing an unambiguous test for whether scrambling has occurred, while simultaneously measuring an OTOC. We engineer quantum scrambling processes through a tunable three-qubit unitary operation as part of a seven-qubit circuit on an ion trap quantum computer. Measured teleportation fidelities are typically about 80 per cent, and enable us to experimentally bound the scrambling-induced decay of the corresponding OTOC measurement.
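
The OTOC at the centre of this test is simple to state: with W(t) = U† W U, the infinite-temperature correlator F = Tr[W(t)† V† W(t) V]/d stays near 1 when U does not spread the local operator W, and decays when U scrambles. Below is a toy numerical sketch, with three qubits and a Haar-random unitary standing in for the scrambling process; it is illustrative only, not the seven-qubit ion-trap circuit or its teleportation-based test.

import numpy as np
from scipy.stats import unitary_group

n = 3                                   # qubits; Hilbert-space dimension d = 2^n
d = 2 ** n
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

W = np.kron(X, np.kron(I2, I2))         # local operator on qubit 0
V = np.kron(I2, np.kron(I2, Z))         # local operator on qubit 2

def otoc(U):
    """Infinite-temperature OTOC F = Tr[W(t)^† V^† W(t) V] / d, with W(t) = U^† W U."""
    Wt = U.conj().T @ W @ U
    return (np.trace(Wt.conj().T @ V.conj().T @ Wt @ V) / d).real

print("no evolution (U = I):      ", otoc(np.eye(d)))             # 1.0 -- nothing scrambled
print("Haar-random (scrambling) U:", otoc(unitary_group.rvs(d)))  # small -- W has spread onto qubit 2

As the abstract notes, ordinary decoherence also drives the measured OTOC down, which is exactly why the experiment pairs the OTOC measurement with a conditional-teleportation check that only succeeds for genuine scrambling.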

Researchers crack quantum physics puzzle

Scientists have re-investigated a sixty-year-old idea by an American physicist and provided new insights into the quantum world.

The research, which took seven years to complete, could lead to improved laser techniques, interferometric high-precision measurements, and atomic beam applications.

Quantum physics is the study of everything around us at the atomic level: atoms, electrons and particles. Atoms and electrons are so small that one billion of them placed side by side could fit within a centimeter. Because of the way atoms and electrons behave, scientists describe their behavior as being like waves.

Intel created a superconducting test chip for quantum computing

Circa 2017


Quantum computing is the next big technological revolution, and it’s coming sooner than you might think. IBM unveiled its own quantum processor this past May, scientists have been experimenting with silicon-laced diamonds (and basic silicon, too) as a quantum computing substrate, Google is already looking at cloud-based solutions and Microsoft is already creating a new coding language for the technology. Now Intel has taken another big step towards a quantum computing reality: the company has created a new superconducting chip using advanced material science and manufacturing techniques, and delivered it to Intel’s research partner in the Netherlands, QuTech.

Solid-state qubits integrated with superconducting through-silicon vias



As superconducting qubit circuits become more complex, addressing a large array of qubits becomes a challenging engineering problem. Dense arrays of qubits benefit from, and may require, access via the third dimension to alleviate interconnect crowding. Through-silicon vias (TSVs) represent a promising approach to three-dimensional (3D) integration in superconducting qubit arrays—provided they are compact enough to support densely-packed qubit systems without compromising qubit performance or low-loss signal and control routing. In this work, we demonstrate the integration of superconducting, high-aspect ratio TSVs—10 μm wide by 20 μm long by 200 μm deep—with superconducting qubits. We utilize TSVs for baseband control and high-fidelity microwave readout of qubits using a two-chip, bump-bonded architecture. We also validate the fabrication of qubits directly upon the surface of a TSV-integrated chip. These key 3D-integration milestones pave the way for the control and readout of high-density superconducting qubit arrays using superconducting TSVs.