Quantum thermal machines are devices that leverage quantum mechanical effects to convert energy into useful work or cooling, much like traditional heat engines or refrigerators. Thermodynamic theory suggests that making any thermal machine reproduce the same process more reliably over time comes at a cost, such as wasted heat or the need for extra energy.
Drawing from theories and concepts rooted in thermodynamics, physicist Yoshihiko Hasegawa at the University of Tokyo recently set out to pinpoint the limits that would constrain the precision of finite-dimensional quantum thermal machines. In a recent paper, published in Physical Review Letters, he delineates these limits and shows that quantum coherence could reduce fluctuations, improving the accuracy of quantum thermal machines.
“Thermodynamic uncertainty relations have clarified an important ‘no free lunch’ principle: if you want an operation to be more precise, you must pay more thermodynamic cost, i.e., entropy production,” Hasegawa told Phys.org. “However, those thermodynamic uncertainty relations do not forbid, in principle, pushing entropy production arbitrarily high.”
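For background, the kind of bound the quote refers to is often written, in its classical form, as a thermodynamic uncertainty relation for a current $J$ with total entropy production $\Sigma$ (generic textbook notation, not taken from Hasegawa’s paper):

\[
\frac{\operatorname{Var}(J)}{\langle J \rangle^{2}} \;\ge\; \frac{2 k_{\mathrm{B}}}{\Sigma},
\]

so pushing the relative fluctuation of the output toward zero requires the entropy production to grow without bound. Hasegawa’s question is what additionally constrains this trade-off when the machine lives in a finite-dimensional Hilbert space.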
In spaces smaller than a wavelength of light, electric currents jump from point to point and magnetic fields corkscrew through atomic lattices in ways that defy intuition. Scientists have only ever dreamed of observing these marvels directly.
Now Princeton researchers have developed a diamond-based quantum sensor that reveals rich new information about magnetic phenomena at this minute scale. The technique uncovers fluctuations that are beyond the reach of existing instruments and provides key insight into materials such as graphene and superconductors. Superconductors have enabled today’s most advanced medical imaging tools and form the basis of hoped-for technologies like lossless powerlines and levitating trains.
The underlying diamond-based sensing methods have been under development for half a decade. But in a Nov. 27 paper in Nature, the team reported roughly 40 times greater sensitivity than previous techniques.
For decades, physicists have dreamed of a quantum internet: a planetary web of ultrasecure communications and super-powered computation built not from electrical signals, but from the ghostly connections between particles of light.
Now, scientists in Edinburgh say they’ve taken a major step toward turning that vision into something real.
Researchers at Heriot-Watt University have unveiled a prototype quantum network that links two smaller networks into one reconfigurable, eight-user system capable of routing and even teleporting entanglement on demand.
Hybrid materials made of magnets and superconductors give rise to fascinating quantum phenomena, which are so sensitive that it is crucial to measure them with minimal interference. Researchers at the University of Hamburg and the University of Illinois Chicago have now demonstrated, both experimentally and theoretically, how these quantum phenomena can be detected and controlled over longer distances using special techniques with a scanning tunneling microscope.
Their findings, which could be important for topological quantum computers, were published in the journal Nature Physics.
When a magnetic atom sits inside a superconductor, so-called Yu-Shiba-Rusinov quasiparticles are created. Normally, they can be detected with high probability only directly at the location of the atom, using the tip of a scanning tunneling microscope.
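As background (a textbook simplification, not a result from the Nature Physics paper): for a classical impurity spin $S$ coupled with exchange strength $J$ to a superconductor with gap $\Delta$ and normal-state density of states $\rho_0$, the Yu-Shiba-Rusinov bound states appear inside the gap at energies

\[
E_{\pm} = \pm\,\Delta\,\frac{1-\alpha^{2}}{1+\alpha^{2}}, \qquad \alpha = \pi \rho_{0} J S,
\]

which is why they show up in scanning tunneling spectroscopy as sharp sub-gap peaks that are strongest directly above the magnetic atom.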
In this episode of The Quantum Spin by HKA, host Veronica Combs discusses the intersections of quantum technology and cybersecurity with Chuck Brooks, an adjunct professor at Georgetown University and the president of Brooks Consulting International. Chuck discusses how the evolution of technology, particularly AI and quantum computing, has dramatically transformed cybersecurity. The conversation also touches on the role of CISOs, the integration of new technologies, and the importance of ongoing education and adaptation in the face of rapidly changing technologies.
00:00 Introduction to Quantum Spin Podcast 00:34 Guest Introduction: Chuck Brooks 00:46 Chuck Brooks’ Career Journey 02:09 Evolution of Cybersecurity 02:47 Challenges for CISOs 04:27 Quantum Computing and Cybersecurity 07:43 Future of Quantum and AI 10:51 Disruptive Technologies in Organizations 15:15 AI in Academia and Professional Use 17:06 Effective Communication on LinkedIn 18:23 Conclusion and Podcast Information.
Chuck Brooks serves as President of Brooks Consulting International with over 25 years of experience in cybersecurity, emerging technologies, marketing, business development, and government relations. He is also an Adjunct Professor at Georgetown University in the Cyber Risk Management Program, where he teaches graduate courses on risk management, homeland security, and cybersecurity.
The JUPITER supercomputer has set a new milestone by simulating 50 qubits, a feat made possible by new memory and compression techniques. A team from the Jülich Supercomputing Centre, working with NVIDIA specialists, simulated a universal quantum computer with 50 qubits for the first time, using JUPITER, Europe’s first exascale supercomputer, which began operation at Forschungszentrum Jülich in September.

This accomplishment breaks the previous record of 48 qubits, set by Jülich scientists in 2019 on Japan’s K computer. The new result highlights the extraordinary capabilities of JUPITER and provides a powerful testbed for exploring and validating quantum algorithms.
Simulating quantum computers is essential for advancing future quantum technologies. These simulations let researchers check experimental findings and experiment with new algorithmic approaches long before quantum hardware becomes advanced enough to run them directly. Key examples include the Variational Quantum Eigensolver (VQE), which can analyze molecules and materials, and the Quantum Approximate Optimization Algorithm (QAOA), used to improve decision-making in fields such as logistics, finance, and artificial intelligence.
Recreating a quantum computer on conventional systems is extremely demanding. As the number of qubits grows, the number of possible quantum states rises at an exponential rate. Each added qubit doubles the amount of computing power and memory required.
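The doubling is easy to make concrete. The sketch below (illustrative arithmetic only, not the Jülich simulation code) counts the amplitudes in a full n-qubit state vector; each complex amplitude is stored as two real numbers:

```python
# Count the values a full state-vector simulator has to keep for n qubits.
# Each complex amplitude is stored as two real numbers (real and imaginary parts).
for n in (30, 40, 48, 50):
    amplitudes = 2 ** n              # complex amplitudes in the state vector
    stored_values = 2 * amplitudes   # real numbers actually held in memory
    print(f"{n} qubits: {amplitudes:.3e} amplitudes, {stored_values:.3e} stored values")

# Every additional qubit doubles both figures, which is why a laptop tops out
# near 30 qubits while 50 qubits calls for an exascale supercomputer.
```

At 50 qubits this comes to roughly 1.1 quadrillion complex amplitudes, or over 2 quadrillion stored real values.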
Although a typical laptop can still simulate around 30 qubits, reaching 50 qubits requires about 2 petabytes of memory, which is roughly two million gigabytes. ‘Only the world’s largest supercomputers currently offer that much,’ says Prof. Kristel Michielsen, Director at the Jülich Supercomputing Centre. ‘This use case illustrates how closely progress in high-performance computing and quantum research are intertwined today.’
The simulation replicates the intricate quantum physics of a real processor in full detail. Every operation – such as applying a quantum gate – touches more than 2 quadrillion stored numerical values (a 2 followed by 15 zeros). These values must be kept synchronized across thousands of computing nodes in order to replicate the functioning of a real quantum processor precisely.
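To see why a single gate touches the whole vector, here is a minimal, toy-scale NumPy sketch (a generic state-vector update, not the code run on JUPITER): applying a 2x2 gate to one qubit mixes every pair of amplitudes that differ only in that qubit, so the entire vector is rewritten.

```python
import numpy as np

def apply_single_qubit_gate(state: np.ndarray, gate: np.ndarray,
                            target: int, n_qubits: int) -> np.ndarray:
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    psi = state.reshape([2] * n_qubits)                  # expose each qubit as an axis
    psi = np.tensordot(gate, psi, axes=([1], [target]))  # contract gate with the target axis
    psi = np.moveaxis(psi, 0, target)                    # restore the original axis order
    return psi.reshape(-1)                               # back to a flat 2**n vector

# Toy demonstration: a Hadamard gate on qubit 0 of a 3-qubit register in |000>.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
print(apply_single_qubit_gate(state, hadamard, target=0, n_qubits=n))
```

In a distributed simulation the same update has to run over amplitudes spread across thousands of nodes, which is where the synchronization cost comes from.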