A team of scientists from the Max Planck Institute of Quantum Optics recently performed a record-breaking experiment that could turn the quantum computing industry on its head.
Researchers looking to synthesize a brighter and more stable nanoparticle for optical applications found that their creation instead exhibited a more surprising property: bursts of superfluorescence that occurred both at room temperature and at regular intervals. The work could lead to the development of faster microchips, neurosensors, or materials for quantum computing applications, and could also support a number of biological studies.
Superfluorescence occurs when atoms within a material synchronize and simultaneously emit a short but intense burst of light. The property is valuable for quantum optical applications, but it is extremely difficult to achieve at room temperature and for intervals long enough to be useful.
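To put numbers behind "short but intense": in the standard Dicke picture of collective emission (invoked here as background; the article itself gives no formulas), $N$ synchronized emitters radiate with a peak intensity that grows as the square of their number while the burst duration shrinks in proportion,

$$ I_{\text{peak}} \propto N^{2}, \qquad \tau_{\text{burst}} \sim \frac{\tau_{\text{spont}}}{N}, $$

where $\tau_{\text{spont}}$ is the spontaneous-emission lifetime of a single, unsynchronized emitter. Synchronization therefore concentrates the stored energy into an ever shorter, brighter flash rather than a steady glow.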
The material in question, a lanthanide-doped upconversion nanoparticle (UCNP), was synthesized by the research team in an effort to create a “brighter” optical material. They produced hexagonal ceramic crystals ranging from 50 nanometers (nm) to 500 nm in size and began testing their lasing properties, which yielded several impressive breakthroughs.
Aug. 24, 2022 — Training a quantum neural network requires only a small amount of data, according to a new proof that upends previous assumptions stemming from classical computing’s huge appetite for data in machine learning, or artificial intelligence. The theorem has several direct applications, including more efficient compiling for quantum computers and distinguishing phases of matter for materials discovery.
“Many people believe that quantum machine learning will require a lot of data. We have rigorously shown that for many relevant problems, this is not the case,” said Lukasz Cincio, a quantum theorist at Los Alamos National Laboratory and co-author of the paper containing the proof published in the journal Nature Communications. “This provides new hope for quantum machine learning. We’re closing the gap between what we have today and what’s needed for quantum advantage, when quantum computers outperform classical computers.”
“The need for large data sets could have been a roadblock to quantum AI, but our work removes this roadblock. While other issues for quantum AI could still exist, at least now we know that the size of the data set is not an issue,” said Patrick Coles, a quantum theorist at the Laboratory and co-author of the paper.
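For readers who want the shape of the result: the bound proved in the paper is, roughly (notation simplified here, so read it as a sketch of the statement rather than the theorem itself),

$$ \text{generalization error} \;\in\; O\!\left(\sqrt{\frac{T \log T}{N}}\right), $$

where $T$ is the number of trainable gates in the quantum model and $N$ is the number of training data points. Driving the error down therefore requires $N$ to grow only about linearly with $T$, rather than exponentially with the size of the problem, which is the sense in which "a lot of data" is not needed.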
Results like these could prove very useful in advancing the fields of quantum computing and communication.
Researchers at the Max Planck Institute of Quantum Optics set a new record by entangling 14 photons, the largest number achieved so far, an institutional press release said.
Quantum entanglement, famously described by Albert Einstein as “spooky action at a distance,” is a phenomenon in which particles become intertwined in such a way that they cease to exist individually, and changing a specific property of one results in an instant change in its partner, even if it is far away.
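A minimal numerical sketch of what "intertwined" means for the simplest entangled state, a two-particle Bell pair (my illustration, not code from the researchers): sampling joint measurements of the state (|00⟩ + |11⟩)/√2 always yields identical outcomes on the two particles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): amplitudes for |00>, |01>, |10>, |11>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # Born rule: probability of each joint outcome

for _ in range(5):
    outcome = rng.choice(4, p=probs)  # measure both particles at once
    a, b = divmod(outcome, 2)         # split into results for A and B
    print(f"A = {a}, B = {b}")        # the two results always agree
```

Neither particle has a definite value on its own; only the correlation is fixed, which is the sense in which they "cease to exist individually."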
Physicists at the Max Planck Institute of Quantum Optics have managed to entangle more than a dozen photons efficiently and in a defined way. In doing so, they have created a basis for a new type of quantum computer.
To use a quantum computer effectively, a large number of specially prepared basic building blocks (entangled, in technical terms) are needed to carry out computational operations. A team of physicists at the Max Planck Institute of Quantum Optics in Garching has now demonstrated this task for the very first time with photons emitted by a single atom. Using a novel technique, the researchers generated up to 14 entangled photons in an optical resonator, which can be prepared into specific quantum physical states in a targeted and very efficient manner. The new method could facilitate the construction of powerful and robust quantum computers, and serve the secure transmission of data in the future.
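The multiphoton states reported in this line of work are of the Greenberger–Horne–Zeilinger (GHZ) type (my gloss; the excerpt above does not name them). As a minimal sketch, with each photon idealized as a single qubit, the target 14-photon state can be written down directly:

```python
import numpy as np

def ghz_state(n: int) -> np.ndarray:
    """Statevector of the n-qubit GHZ state (|00...0> + |11...1>)/sqrt(2)."""
    psi = np.zeros(2 ** n)
    psi[0] = psi[-1] = 1 / np.sqrt(2)  # equal weight on all-zeros and all-ones
    return psi

# Fourteen photons, each idealized here as a single qubit.
psi = ghz_state(14)
print(psi.shape)  # (16384,): 2**14 amplitudes, only two of them nonzero
```

In the lab the state is built up photon by photon: each emission from the atom in the resonator is entangled with the atom itself, so the chain grows deterministically rather than relying on chance coincidences, which is where the efficiency claim comes from.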
From my understanding, inertia is typically taken as an axiom rather than something that can be explained by some deeper phenomenon. However, it’s also my understanding that quantum mechanics must reduce to classical, Newtonian mechanics in the macroscopic limit.
By inertia, I mean the resistance to changes in velocity: the fact that more massive objects (or particles, let's say) accelerate more slowly given the same force.
What is the quantum mechanical mechanism that, in its limit, leads to Newtonian inertia? Is there some concept of axiomatic inertia that applies to the quantum mechanical equations and explains Newtonian inertia, even if it remains a fundamental assumption of quantum theory?
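For reference, the closest bridge I have found is Ehrenfest's theorem, which for a Hamiltonian $\hat{H} = \hat{p}^{2}/2m + V(\hat{x})$ gives

$$ \frac{d\langle \hat{x} \rangle}{dt} = \frac{\langle \hat{p} \rangle}{m}, \qquad \frac{d\langle \hat{p} \rangle}{dt} = -\left\langle \frac{\partial V}{\partial \hat{x}} \right\rangle, $$

so that for a narrow wave packet $m \, d^{2}\langle \hat{x} \rangle / dt^{2} \approx F(\langle \hat{x} \rangle)$, which is Newton's second law. But notice that $m$ simply appears as a parameter in the kinetic term $\hat{p}^{2}/2m$, so this seems to carry inertia into the classical limit rather than explain it, which is what prompts my question.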
Watch a movie backwards and you’ll likely get confused—but a quantum computer wouldn’t. That’s the conclusion of researcher Mile Gu at the Centre for Quantum Technologies (CQT) at the National University of Singapore and Nanyang Technological University and collaborators.
In research published 18 July in Physical Review X, the international team shows that a quantum computer is less in thrall to the arrow of time than a classical computer. In some cases, it’s as if the quantum computer doesn’t need to distinguish between cause and effect at all.
The new work is inspired by an influential discovery made almost 10 years ago by complexity scientists James Crutchfield and John Mahoney at the University of California, Davis. They showed that many statistical data sequences will have a built-in arrow of time. An observer who sees the data played from beginning to end, like the frames of a movie, can model what comes next using only a modest amount of memory about what occurred before. An observer who tries to model the system in reverse has a much harder task—potentially needing to track orders of magnitude more information.
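To make "amount of memory" concrete, here is a toy proxy (my sketch, not the epsilon-machine construction Crutchfield and Mahoney actually use): group length-k histories by their empirical next-symbol distribution and count the groups, once on the raw sequence and once on its reversal. For the genuinely time-asymmetric processes they studied, the reverse count can be dramatically larger; the simple Markov toy below is nearly symmetric and just illustrates the bookkeeping.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

def memory_proxy(seq, k=3):
    # Group length-k histories by their (coarsened) empirical next-symbol
    # distribution; the number of groups is a crude proxy for how many
    # predictive states a model running in that direction must track.
    counts = defaultdict(lambda: [0, 0])
    for i in range(len(seq) - k):
        counts[tuple(seq[i:i + k])][seq[i + k]] += 1
    dists = {tuple(round(c / sum(pair), 1) for c in pair)
             for pair in counts.values()}
    return len(dists)

# Toy binary source: a plain first-order Markov chain (time-symmetric,
# so the two counts below come out similar; asymmetric processes would
# make the reverse count much larger).
state, seq = 0, []
for _ in range(200_000):
    bit = int(rng.random() < (0.1 if state == 0 else 0.9))
    seq.append(bit)
    state = bit

print("forward memory proxy:", memory_proxy(seq))
print("reverse memory proxy:", memory_proxy(seq[::-1]))
```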