The novel 3D wiring architecture and chip fabrication method enable quantum processing units containing 10,000 qubits to fit in a smaller space than today’s 100-qubit chips.
This was a monumental breakthrough in the philosophy and foundations of quantum mechanics. Bell derived a mathematical inequality showing that if any local “hidden variables” (underlying, deterministic factors) explained the “spooky” correlations in quantum entanglement, those correlations would have to obey certain limits. Experiments inspired by his theorem (starting with Alain Aspect in the early 1980s) have repeatedly shown that these limits are violated, confirming that quantum entanglement is real and non-local, and that nature fundamentally disagrees with Einstein’s idea of “local realism.”
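For readers who want the quantitative form of those limits, one widely used statement of Bell’s result is the CHSH inequality (a standard formulation in the field, not spelled out in the passage above):

```latex
% CHSH form of Bell's inequality. E(a,b) denotes the measured
% correlation between outcomes at two distant detectors with
% settings a and b. Any local hidden-variable theory must satisfy
\[
  S = \bigl|\, E(a,b) - E(a,b') + E(a',b) + E(a',b') \,\bigr| \le 2,
\]
% whereas quantum mechanics allows, and experiments such as Aspect's
% observe, values up to the Tsirelson bound
\[
  S_{\max} = 2\sqrt{2} \approx 2.83 .
\]
```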
John Bell, with whom I had a fruitful collaboration and warm friendship, is best known for his seminal work on the foundations of quantum physics, but he also made outstanding contributions to particle physics and accelerator physics.
If quantum computing is going to become an everyday reality, we need better superconducting thin films, the hardware that enables storage and processing of quantum information. Too often, these thin films have impurities or other defects that make them useless for real quantum computer chips.
Now, Yuki Sato and colleagues at the RIKEN Center for Emergent Matter Science (CEMS) in Japan have discovered a way to make a superconducting thin film from iron telluride, a surprising result because the material is not normally superconducting.
The fabrication process reduces distortion in the crystal structure, making it superconducting at very low temperatures, and thus suitable for use in quantum chips. This study was published in Nature Communications.
Imagine a future where quantum computers supercharge machine learning—training models in seconds, extracting insights from massive datasets and powering next-gen AI. That future might be closer than you think, thanks to a breakthrough from researchers at Australia’s national research agency, CSIRO, and The University of Melbourne.
Until now, one big roadblock stood in the way: errors. Quantum processors are noisy, and quantum machine learning (QML) models need deep circuits with hundreds of gates. Even tiny errors pile up fast, wrecking accuracy. The usual fix—quantum error correction—may work, but it’s expensive. We’re talking millions of qubits just to run one model. That’s way beyond today’s hardware.
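As a rough back-of-the-envelope illustration of that pile-up (the error rate and gate counts below are hypothetical examples, not figures from the CSIRO study):

```python
# Hypothetical illustration of error accumulation in a deep quantum circuit.
# per_gate_error and the gate counts are made-up example values, not numbers
# from the CSIRO / University of Melbourne work.

per_gate_error = 0.001          # 0.1% chance each gate introduces an error

for n_gates in (10, 100, 500, 1000):
    # Probability that no gate errs, assuming independent gate errors
    p_clean = (1 - per_gate_error) ** n_gates
    print(f"{n_gates:5d} gates -> circuit runs error-free "
          f"with probability {p_clean:.2f}")

# 10 gates gives ~0.99, but 1000 gates gives ~0.37:
# even tiny per-gate errors pile up fast in deep circuits.
```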
So, what’s the game-changer? The team discovered that you don’t need to correct everything.
University of Iowa researchers have discovered a method to “purify” photons, an advance that could make optical quantum technologies more efficient and more secure.
The work is published in the journal Optica Quantum.
The researchers investigated two nagging challenges to creating a steady stream of single photons, the gold standard for realizing photonic quantum computers and secure communication networks. One obstacle is laser scatter. It arises in a common technique in which a laser beam is directed at an atom, causing it to emit a photon, a single unit of light. While effective, the technique can also yield extra, redundant photons from scattered laser light, which hampers the optical circuit’s efficiency, much like a wayward current in an electrical circuit.
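For context, a standard way the field quantifies how close a source comes to emitting exactly one photon at a time is the second-order correlation g²(0); the article does not name this metric, so the sketch below is only an illustrative aside with made-up numbers:

```python
# Minimal sketch of estimating single-photon purity via g2(0) from a
# Hanbury Brown-Twiss measurement (two detectors behind a beamsplitter).
# The counts below are hypothetical example values, not data from the
# University of Iowa experiment.

def g2_zero(coincidences, counts_a, counts_b, n_bins):
    """Standard estimator: g2(0) = C * N / (N_a * N_b).

    coincidences -- simultaneous clicks on both detectors
    counts_a/b   -- total clicks on each detector
    n_bins       -- number of time bins in the measurement
    """
    return coincidences * n_bins / (counts_a * counts_b)

# An ideal single-photon stream gives g2(0) = 0 (one photon cannot hit
# both detectors at once); extra, redundant photons from laser scatter
# push g2(0) up toward 1.
print(g2_zero(coincidences=12, counts_a=50_000, counts_b=48_000, n_bins=1_000_000))
```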
What if your conscious experiences were not just the chatter of neurons, but were connected to the hum of the universe? In a paper published in Frontiers in Human Neuroscience, I present new evidence indicating that conscious states may arise from the brain’s capacity to resonate with the quantum vacuum—the zero-point field that permeates all of space.
More specifically, I argue that macroscopic quantum effects are at play inside our heads. This insight results from a synthesis of brain architectural and neurophysiological findings supplemented with quantitative model calculations. The novel synthesis suggests that the brain’s basic functional building blocks, cortical microcolumns, couple directly to the zero-point field, igniting the complex dynamics characteristic of conscious processes.
Researchers at HZDR have partnered with the Norwegian University of Science and Technology in Trondheim and the Institute of Nuclear Physics of the Polish Academy of Sciences to develop a method that facilitates the manufacture of particularly efficient magnetic nanomaterials in a relatively simple process based on inexpensive raw materials.
Using a highly focused ion beam, they imprint magnetic nanostrips consisting of tiny, vertically aligned nanomagnets onto the materials. As the researchers have reported in the journal Advanced Functional Materials, this geometry makes the material highly sensitive to external magnetic fields and current pulses.
Nanomagnets play a key role in modern information technologies. They facilitate fast data storage, precise magnetic sensors, novel developments in spintronics, and, in the future, quantum computing. The foundations of all these applications are functional materials with particular magnetic structures that can be customized on the nanoscale and precisely controlled.
Scientists from the Indian Institute of Technology Bombay have found a way to use light to control and read tiny quantum states inside atom-thin materials. The simple technique could pave the way for computers that are dramatically faster and consume far less power than today’s electronics.
The materials studied are just one atom thick—far thinner than a human hair—and are known as two-dimensional (2D) semiconductors. Inside these materials, electrons can sit in one of two distinct quantum states, called valleys. These valleys, named K and K′, can be thought of as two different “locations” that an electron can choose between. Because there are two options, researchers have long imagined using them like the 0 and 1 of digital computing, but on a quantum level. This idea is the foundation of a rapidly growing research field called valleytronics.
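To make the “quantum 0 and 1” analogy concrete, the two valleys can be written as the basis states of a two-level system; the notation below is the standard textbook form, not something quoted from the paper:

```latex
% Valley states as a qubit-like two-level system:
% |K> and |K'> are the two valley basis states.
\[
  |\psi\rangle = \alpha\,|K\rangle + \beta\,|K'\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
% where |alpha|^2 and |beta|^2 give the probabilities of finding the
% electron in valley K or K', respectively.
```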
However, being able to reliably control which valley electrons occupy—and to switch between them quickly and on demand—has been a major challenge. “Previous methods required complicated experimental setups with carefully tuned circularly polarized lasers and often multiple laser pulses, and they only worked under specific conditions,” said Prof. Gopal Dixit.