Quantum physics just keeps getting weirder, even as it gets more fascinating.

An international leader in quantum computing, architect of the U.S. National Quantum Initiative, and member of the National Academy of Sciences, Chris Monroe will join longtime long-distance collaborators at Duke to build practical quantum computers for use in fields from finance to pharmaceuticals.
Researchers at Oxford University, in collaboration with DeepMind, the University of Basel and Lancaster University, have created a machine learning algorithm that interfaces with a quantum device and ‘tunes’ it faster than human experts, without any human input. They are dubbing it a “Minecraft explorer for quantum devices.”
Classical computers are composed of billions of transistors, which together can perform complex calculations. Small imperfections in these transistors arise during manufacturing, but do not usually affect the operation of the computer. However, in a quantum computer similar imperfections can strongly affect its behavior.
In prototype semiconductor quantum computers, the standard way to correct these imperfections is to adjust input voltages so that they cancel out. This process is known as tuning. However, identifying the right combination of voltage adjustments takes a long time even for a single quantum device, which makes manual tuning virtually impossible for the billions of devices required to build a useful general-purpose quantum computer.
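To see why tuning is a search problem at all, the toy sketch below mimics the setup: several gate voltages must be set jointly so that a device-quality score is maximised, and each evaluation stands in for a slow cryogenic measurement. Everything here (the `measure_quality` objective, the voltage ranges, the plain random search) is an illustrative assumption rather than the Oxford/DeepMind algorithm, which replaces this kind of blind sampling with a learned model and so needs far fewer measurements.

```python
import random

NUM_GATES = 8              # a prototype device may expose several gate electrodes
V_MIN, V_MAX = -1.0, 0.5   # illustrative voltage range in volts

def measure_quality(voltages):
    """Hypothetical stand-in for a real measurement of device quality.
    In practice each call means a slow measurement on cryogenic hardware."""
    # Toy objective: pretend the "ideal" setting is an unknown offset per gate.
    ideal = [-0.3, 0.1, -0.5, 0.2, -0.1, 0.0, -0.4, 0.3]
    return -sum((v - i) ** 2 for v, i in zip(voltages, ideal))

def random_search(n_iters=500):
    """Brute-force tuning: sample voltage settings and keep the best one."""
    best_v, best_q = None, float("-inf")
    for _ in range(n_iters):
        candidate = [random.uniform(V_MIN, V_MAX) for _ in range(NUM_GATES)]
        q = measure_quality(candidate)
        if q > best_q:
            best_v, best_q = candidate, q
    return best_v, best_q

if __name__ == "__main__":
    voltages, quality = random_search()
    print("best voltages:", [round(v, 2) for v in voltages], "score:", round(quality, 3))
```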
Light is notoriously fast. Its speed is crucial for rapid information exchange, but as light zips through materials, its chances of interacting and exciting atoms and molecules can become very small. If scientists can put the brakes on light particles, or photons, it would open the door to a host of new technology applications.
Now, in a paper published on Aug. 17 in Nature Nanotechnology, Stanford scientists demonstrate a new approach to slow light significantly, much like an echo chamber holds onto sound, and to direct it at will. Researchers in the lab of Jennifer Dionne, associate professor of materials science and engineering at Stanford, structured ultrathin silicon chips into nanoscale bars to resonantly trap light and then release or redirect it later. These “high-quality-factor” or “high-Q” resonators could lead to novel ways of manipulating and using light, including new applications for quantum computing, virtual reality and augmented reality; light-based WiFi; and even the detection of viruses like SARS-CoV-2.
“We’re essentially trying to trap light in a tiny box that still allows the light to come and go from many different directions,” said postdoctoral fellow Mark Lawrence, who is also lead author of the paper. “It’s easy to trap light in a box with many sides, but not so easy if the sides are transparent—as is the case with many silicon-based applications.”
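A resonator’s quality factor sets how long the trapped light lingers: roughly Q optical cycles, or a photon lifetime of τ = Q/ω. The back-of-the-envelope calculation below uses made-up numbers (the wavelength and Q value are assumptions, not figures from the Nature Nanotechnology paper) just to show the timescales involved.

```python
import math

# Illustrative values only: a near-infrared wavelength and a high-Q resonance.
wavelength_m = 1.55e-6   # assumed operating wavelength (metres)
Q = 2500                 # assumed quality factor

c = 2.998e8                              # speed of light (m/s)
omega = 2 * math.pi * c / wavelength_m   # angular frequency (rad/s)

# Photon lifetime in the resonator: tau = Q / omega
tau_s = Q / omega
print(f"photon lifetime ≈ {tau_s * 1e12:.2f} ps")   # light circulates for ~Q optical cycles
```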
A theory of quantum gravity that describes the universe as beginning in a “Big Bounce” …
“For the first time ever, we have direct experimental evidence that an external quantum efficiency above 100% is possible in a single photodiode without any external antireflection,” says Hele Savin, associate professor of Micro and Nanoelectronics at Aalto University in Finland. The results come just a few years after Savin and colleagues at Aalto University demonstrated almost unity efficiency over the wavelength range 250–950 nm in photodiodes made with black silicon, where the silicon surface is nanostructured and coated to suppress losses.
Noticing some curious effects in the UV region, Savin’s group extended their study of the devices to focus on this region of the electromagnetic spectrum. UV sensing has multiple applications, including spectroscopy and imaging, flame detection, water purification and biotechnology. While annual market demand for UV photodiodes is expected to increase by 30%, the efficiency of these devices has been limited to 80% at best. To Savin’s surprise, closer analysis of their device’s response to UV light revealed that the external quantum efficiency could exceed 130%. Independent measurements at Physikalisch-Technische Bundesanstalt (PTB) verified the results.
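External quantum efficiency is simply the number of charge carriers collected per incident photon, so a value above 100% means each UV photon produces, on average, more than one electron (carrier multiplication). The sketch below shows the standard way EQE is computed from a measured photocurrent and incident optical power; the numerical values are illustrative assumptions, not data from the Aalto or PTB measurements.

```python
# EQE = (electrons collected per second) / (photons arriving per second)
# Illustrative numbers only; not measurement data from the Aalto/PTB study.

h = 6.626e-34      # Planck constant (J*s)
c = 2.998e8        # speed of light (m/s)
e = 1.602e-19      # elementary charge (C)

wavelength_m = 200e-9      # deep-UV illumination (assumed)
optical_power_W = 1e-9     # incident optical power (assumed)
photocurrent_A = 2.1e-10   # measured photocurrent (assumed)

photon_energy_J = h * c / wavelength_m
photons_per_s = optical_power_W / photon_energy_J
electrons_per_s = photocurrent_A / e

eqe = electrons_per_s / photons_per_s
print(f"external quantum efficiency ≈ {eqe * 100:.0f}%")   # > 100% implies carrier multiplication
```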
For the first time ever, scientists have witnessed the interaction of a new phase of matter known as “time crystals”.
The discovery, published in Nature Materials, may lead to applications in quantum information processing because time crystals automatically remain intact—coherent—in varying conditions. Protecting coherence is the main difficulty hindering the development of powerful quantum computers.
Dr. Samuli Autti, lead author from Lancaster University, said: “Controlling the interaction of two time crystals is a major achievement. Before this, nobody had observed two time crystals in the same system, let alone seen them interact.”
- Quantum computing requires meticulously prepared hardware and big budgets, but cloud-based solutions could make the technology available to broader business audiences.
- Several tech giants are racing to achieve “quantum supremacy”, but reliability and consistency in quantum output is no simple trick.
- Covid-19 has prompted some researchers to look at how quantum computing could mitigate future pandemics with scientific precision and speed.

Quantum computing (QC) has been theorized for decades and has evolved rapidly over the last few years. An escalation in spend and development has seen powerhouses IBM, Microsoft, and Google race for ‘quantum supremacy’ — whereby quantum reliably and consistently outperforms existing computers. But do quantum computers remain a sort of elitist vision of the future, or are we on course for more financially and infrastructurally viable applications across industries?
Getting to grips with qubits

How much do you know? Ordinary computers (even supercomputers) work with bits, the 0s and 1s of traditional binary code, and every computer process, code included, is built from countless combinations of them. Quantum computers instead use qubits. Qubits are capable of ‘superposition’: effectively adopting both 1 and 0 simultaneously, or any point on the spectrum between those two formerly binary values. The key to a powerful, robust, and reliable quantum computer is more qubits: every qubit added doubles the dimension of the machine’s state space, so its processing capacity grows exponentially.
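That exponential growth becomes concrete once you write a quantum state down: an n-qubit register is described by 2^n complex amplitudes, so every extra qubit doubles the size of the description. A small sketch (the memory estimate assumes 16-byte complex numbers) of how quickly that blows up for a classical computer trying to keep track:

```python
import numpy as np

def zero_state(n_qubits):
    """Return the |00...0> state vector for n qubits: 2**n complex amplitudes."""
    state = np.zeros(2 ** n_qubits, dtype=np.complex128)
    state[0] = 1.0
    return state

for n in (1, 10, 20, 30):
    amplitudes = 2 ** n
    memory_gb = amplitudes * 16 / 1e9   # 16 bytes per complex128 amplitude
    print(f"{n:>2} qubits -> {amplitudes:>13,} amplitudes (~{memory_gb:.3f} GB to store classically)")

print(zero_state(2))   # [1.+0.j 0.+0.j 0.+0.j 0.+0.j]
```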
Qubits and the impact of the superposition give quantum computers the ability to process large datasets within seconds, doing what it would take humans decades to do. They can decode and deconstruct, hypothesize and validate, tackling problems of absurd complexity and dizzying magnitude — and can do so across many different industries.
Where does the issue lie, then? Quantum computing for everybody is still a way off: the general consensus is that it will be at least five years before this next big wave of computing is seen widely across industries and use cases, unless your business commands the budget of a tech giant like Google or IBM. But expense isn’t the only challenge.
Frail and demanding — the quantum hardware

Quantum computers are immensely intricate machines. It doesn’t take much at all to knock a qubit out of its delicate state of superposition. They’re powerful, but not reliable: the slightest interference or frailty leads to high error rates in quantum processing, slowing the path to more widespread use and rendering ‘quantum supremacy’ a touch on the dubious side.
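Why those error rates bite is simple arithmetic: if each gate fails independently with probability p, a circuit of n gates finishes cleanly only with probability (1 − p)^n. The figures below are illustrative error rates, not vendor specifications.

```python
# Probability that a circuit finishes without a single gate error,
# assuming independent errors: (1 - p) ** n_gates. Illustrative values only.

for error_rate in (1e-2, 1e-3, 1e-4):
    for n_gates in (100, 1_000, 10_000):
        p_success = (1 - error_rate) ** n_gates
        print(f"per-gate error {error_rate:.0e}, {n_gates:>6} gates "
              f"-> success probability {p_success:.3f}")
```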