
Quasiprobabilities shed light on quantum advantage

Given the importance of the Kirkwood–Dirac quasiprobability’s nonclassical values, two natural questions arise: Under what conditions does this quasiprobability behave anomalously? And how anomalous can its behaviour get? That’s what we wanted to explore.

What did you do in the paper?

We pinned down conditions under which the Kirkwood–Dirac quasiprobability assumes nonclassical values. Using these conditions, one can calculate which experiments can exhibit certain types of quantum advantages. We also put a “ceiling” on how much nonclassicality one Kirkwood–Dirac quasiprobability distribution can contain.
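As a concrete illustration of how Kirkwood–Dirac nonclassicality arises (a minimal sketch, not the paper's own calculation), the KD distribution of even a single qubit can take negative values when the state and the two measurement bases are chosen appropriately. Here the entry Q[i, j] = ⟨b_j|a_i⟩⟨a_i|ψ⟩⟨ψ|b_j⟩ is computed with the Z eigenbasis as {|a_i⟩} and the X eigenbasis as {|b_j⟩}; the state and bases are illustrative choices:

```python
import numpy as np

# Kirkwood-Dirac quasiprobability Q[i, j] = <b_j|a_i><a_i|psi><psi|b_j>
# for a pure qubit state |psi> = cos(theta)|0> + sin(theta)|1>.
theta = np.pi / 8  # illustrative choice that produces a negative entry
psi = np.array([np.cos(theta), np.sin(theta)])

a_basis = [np.array([1, 0]), np.array([0, 1])]   # Z eigenstates |0>, |1>
b_basis = [np.array([1, 1]) / np.sqrt(2),        # X eigenstate |+>
           np.array([1, -1]) / np.sqrt(2)]       # X eigenstate |->

# np.vdot(x, y) conjugates its first argument, i.e. computes <x|y>.
Q = np.array([[np.vdot(b, a) * np.vdot(a, psi) * np.vdot(psi, b)
               for b in b_basis] for a in a_basis])

print(np.real_if_close(Q))
print("total:", Q.sum().real)            # sums to 1, like a probability distribution
print("negative entry:", Q[1, 1].real)   # < 0: a nonclassical KD value
```

Summing Q over either index recovers the ordinary Born probabilities in the other basis, which is why negative or non-real entries, rather than the marginals, are the signature of nonclassicality.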

Outgrowing Einstein: A critical mass of cosmological discrepancies makes us reinterpret relativity

In the search for a unifying quantum gravity theory that would reconcile general relativity with quantum theory, it turns out that quantum theory is the more fundamental of the two after all. Quantum mechanical principles, some physicists argue, apply to all of reality, not only to the realm of the ultra-tiny, and numerous experiments support that assumption. After a century of Einsteinian relativistic physics going largely unchallenged, a new kid on the block, Computational Physics, one of the frontrunners for quantum gravity, states that spacetime is a flat-out illusion and that what we call physical reality is actually a construct of information within [quantum neural] networks of conscious agents. In light of the physics of information, computational physicists envision a new theory as an “It from Qubit” offspring, one that necessarily incorporates consciousness in its theoretical models and treats spacetime, mass-energy and gravity as emergent from information processing.

In fact, I expand on the foundations of this new physics of information, also referred to as [Quantum] Computational Physics, Quantum Informatics, Digital Physics, and Pancomputationalism, in my recent book The Syntellect Hypothesis: Five Paradigms of the Mind’s Evolution. The Cybernetic Theory of Mind I’m currently developing is based on reversible quantum computing and projective geometry at large. This ontological model, my “theory of everything”, agrees with certain quantum gravity contenders, such as M-Theory on fractal dimensionality and Emergence Theory on code-theoretic ontology, but admittedly goes beyond all current models by treating spacetime, mass-energy and gravity as emergent from information processing within a holographic, multidimensional matrix with the Omega Singularity as the source.

There are plenty of cosmological anomalies of late that make us question the traditional interpretation of relativity. First off, the cosmological constant, which Albert Einstein (1879–1955) himself called “the biggest blunder” of his scientific career, is closely tied to the rate of the expansion of our Universe, the Hubble constant, and that rate is the subject of a very important discrepancy: its value changes based on how scientists try to measure it. New results from the Hubble Space Telescope have now “raised the discrepancy beyond a plausible level of chance,” according to one of the latest papers published in the Astrophysical Journal. We are stumbling more and more often on all kinds of discrepancies in relativistic physics and the standard cosmological model. Not only is the Hubble constant “constantly” called into question, but even the speed of light, on which Einsteinian theories are based, shows such discrepancies when measured by different methods and turns out not to be really “constant.”

Manufacturing silicon qubits at scale

Circa 2019


As quantum computing enters the industrial sphere, questions about how to manufacture qubits at scale are becoming more pressing. Here, Fernando Gonzalez-Zalba, Tsung-Yeh Yang and Alessandro Rossi explain why decades of engineering may give silicon the edge.

In the past two decades, quantum computing has evolved from a speculative playground into an experimental race. The drive to build real machines that exploit the laws of quantum mechanics, and to use such machines to solve certain problems much faster than is possible with traditional computers, will have a major impact in several fields. These include speeding up drug discovery by efficiently simulating chemical reactions; better uses of “big data” thanks to faster searches in unstructured databases; and improved weather and financial-market forecasts via smart optimization protocols.

We are still in the early stages of building these quantum information processors. Recently, a team at Google has reportedly demonstrated a quantum machine that outperforms classical supercomputers, although this so-called “quantum supremacy” is expected to be too limited for useful applications. However, this is an important milestone in the field, testament to the fact that progress has become substantial and fast paced. The prospect of significant commercial revenues has now attracted the attention of large computing corporations. By channelling their resources into collaborations with academic groups, these firms aim to push research forward at a faster pace than either sector could accomplish alone.

New record distance for quantum communications

Toshiba’s Cambridge Research Laboratory has achieved quantum communications over optical fibres exceeding 600 km in length, three times further than the previous world record distance.

The breakthrough will enable long-distance, quantum-secured information transfer between metropolitan areas and is a major advance towards building a future Quantum Internet.

The term “Quantum Internet” describes a global network of quantum computers, connected by long-distance quantum communication links. This technology will improve the current Internet by offering several major benefits – such as the ultra-fast solving of complex optimisation problems in the cloud, a more accurate global timing system, and ultra-secure communications. Personal data, medical records, bank details, and other information will be physically impossible for hackers to intercept. Several large government initiatives to build a Quantum Internet have been announced in China, the EU and the USA.

Optical cryostat proves a game-changer in quantum communication studies

German nanotechnology specialist attocube says its attoDRY800 cryostat enables quantum scientists to “reclaim the optical table” and focus on their research not the experimental set-up.

Twin-track innovations in cryogenic cooling and optical table design are “creating the space” for fundamental scientific breakthroughs in quantum communications, allowing researchers to optimize the performance of secure, long-distance quantum key distribution (QKD) using engineered single-photon-emitting light sources.

In a proof-of-concept study last year, Tobias Heindel and colleagues in the Institute of Solid State Physics at the Technische Universität (TU) Berlin, Germany, implemented a basic QKD testbed in their laboratory. The experimental set-up uses a semiconductor quantum-dot emitter to send single-photon pulses along an optical fibre to a four-port receiver that analyses the polarization state of the transmitted qubits.
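Polarization-encoded single photons of the kind described above are the natural carriers for BB84-style QKD. As a hedged sketch only (an idealized, noiseless simulation of basis choice and key sifting, not the TU Berlin group's actual protocol or data), the core logic looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of single-photon pulses in this toy run

# Sender: random bits, each encoded in a randomly chosen basis
# (0 = rectilinear H/V, 1 = diagonal +45/-45 polarization).
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# Receiver: a random measurement basis per pulse (a four-port
# receiver passively splits photons between the two bases).
bob_bases = rng.integers(0, 2, n)

# Ideal channel: when bases match, Bob reads Alice's bit exactly;
# when they differ, his outcome is random and later discarded.
match = alice_bases == bob_bases
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# "Sifting": both parties keep only the matched-basis rounds.
sifted_key = alice_bits[match]
print("sifted key length:", sifted_key.size)  # about n/2 on average
```

In a real experiment, noise and eavesdropping show up as disagreements within the sifted key, which is what the quantum bit error rate measures.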

New quantum entanglement verification method cuts through the noise

“Conditional witnessing” technique makes many-body entangled states easier to measure.


Quantum error correction – a crucial ingredient in bringing quantum computers into the mainstream – relies on sharing entanglement between many particles at once. Thanks to researchers in the UK, Spain and Germany, measuring those entangled states just got a lot easier. The new measurement procedure, which the researchers term “conditional witnessing”, is more robust to noise than previous techniques and minimizes the number of measurements required, making it a valuable method for testing imperfect real-life quantum systems.

Quantum computers run their algorithms on quantum bits, or qubits. These physical two-level quantum systems play an analogous role to classical bits, except that instead of being restricted to just “0” or “1” states, a single qubit can be in any combination of the two. This extra information capacity, combined with the ability to manipulate quantum entanglement between qubits (thus allowing multiple calculations to be performed simultaneously), is a key advantage of quantum computers.
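The point above can be made concrete in a few lines: a qubit is a normalized complex 2-vector, and measurement probabilities follow the Born rule. The particular state below is an arbitrary illustrative choice:

```python
import numpy as np

# A qubit state is a normalized complex 2-vector: alpha|0> + beta|1>.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
qubit = np.array([alpha, beta])

# Born rule: measuring in the {|0>, |1>} basis yields outcome 0 or 1
# with probabilities |alpha|^2 and |beta|^2, which must sum to 1.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5]
```

A classical bit would be restricted to the two vectors [1, 0] and [0, 1]; everything in between is the "extra information capacity" the text refers to.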

The problem with qubits

However, qubits are fragile. Virtually any interaction with their environment can cause them to collapse like a house of cards and lose their quantum correlations – a process called decoherence. If this happens before an algorithm finishes running, the result is a mess, not an answer. (You would not get much work done on a laptop that had to restart every second.) In general, the more qubits a quantum computer has, the harder they are to keep quantum; even today’s most advanced quantum processors still have fewer than 100 physical qubits.
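The loss of quantum correlations can be sketched with a toy pure-dephasing model (an illustrative simplification, not a model of any specific device): the off-diagonal elements of the qubit's density matrix, which encode its "quantumness", decay exponentially with a coherence time T2, while the populations on the diagonal stay fixed.

```python
import numpy as np

# Start in the superposition |+><+|, whose off-diagonal coherences are maximal.
rho0 = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)
T2 = 1.0  # arbitrary coherence time in this toy model

def dephase(rho, t, T2):
    """Pure dephasing: coherences decay as exp(-t/T2); populations unchanged."""
    rho_t = rho.copy()
    rho_t[0, 1] *= np.exp(-t / T2)
    rho_t[1, 0] *= np.exp(-t / T2)
    return rho_t

for t in [0.0, 1.0, 5.0]:
    coherence = abs(dephase(rho0, t, T2)[0, 1])
    print(f"t={t}: |rho01|={coherence:.3f}")  # shrinks toward 0 as t grows
```

Once the coherences reach zero the state is an ordinary classical mixture, which is why an algorithm must finish well within the coherence time.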

Electron’s dual nature appears in a quantum spin liquid

Physics World


Quantum mechanics describes this frustration by suggesting that the orientation of the spins is not rigid. Instead, the spins constantly change direction in a fluid-like way to produce an entangled ensemble of spin-ups and spin-downs. Thanks to this behaviour, a spin liquid will remain in a liquid state even at temperatures near absolute zero, where most materials usually freeze solid.

The holon and the spinon

To describe this behaviour in mathematical terms, the late Nobel laureate Philip W Anderson, who predicted the existence of spin liquids in 1973, proposed that in the quantum regime, an electron might in fact be composed of two distinct particles. The first, known as a “holon”, would bear the electron’s negative charge, while the second “spinon” particle would carry its spin. Anderson later suggested that this spin-charge separation might provide a microscopic mechanism to explain the high superconducting transition temperatures (Tc) that were observed in copper oxides, or cuprates, beginning in the late 1980s.

Exotic quantum state could make smallest-ever laser

When particles are cooled down to temperatures just above absolute zero, they form a BEC – a state of matter in which all the particles occupy the same quantum state and thus act in unison, like a superfluid. A BEC made up of tens of thousands of particles therefore behaves as if it were just one single giant quantum particle.

An international team of researchers led by Carlos Anton-Solanas and Christian Schneider from the University of Oldenburg, Germany; Sven Höfling of the University of Würzburg, Germany; Sefaattin Tongay at Arizona State University, US; and Alexey Kavokin of Westlake University in China, has now generated a BEC from quasiparticles known as exciton-polaritons in atomically thin crystals. These quasiparticles form when excited electrons in solids couple strongly with photons.

“Devices that can control these novel light-matter states hold the promise of a technological leap in comparison with current electronic circuits,” explains Anton-Solanas, who is in the quantum materials group at Oldenburg’s Institute of Physics. “Such optoelectronic circuits, which operate using light instead of electric current, could be better and faster at processing information than today’s processors.”
