An autonomous drone carrying water to help extinguish a wildfire in the Sierra Nevada might encounter swirling Santa Ana winds that threaten to push it off course. Rapidly adapting to these unknown disturbances in flight presents an enormous challenge for the drone's flight control system.
To help such a drone stay on target, MIT researchers developed a new, machine learning-based adaptive control algorithm that could minimize its deviation from its intended trajectory in the face of unpredictable forces like gusty winds.
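The article doesn't include code, but the core idea of adaptive disturbance rejection can be sketched in a toy 1D setting. Everything below (the gains, the adaptation law, the constant-wind model) is illustrative, not the MIT algorithm: the controller maintains an online estimate `d_hat` of an unknown wind force and cancels it, so the tracking error shrinks even though the wind is never measured directly.

```python
# Toy 1D illustration of adaptive disturbance rejection (an illustrative
# sketch, not the MIT algorithm): a PD controller augmented with an online
# estimate d_hat of an unknown constant wind force.

def simulate(steps=500, dt=0.01, wind=2.0, target=1.0):
    x, v = 0.0, 0.0                   # position and velocity
    d_hat = 0.0                       # current estimate of the wind force
    kp, kd, gamma = 40.0, 12.0, 50.0  # PD gains and adaptation rate
    for _ in range(steps):
        e = target - x
        u = kp * e - kd * v - d_hat   # PD control minus estimated wind
        a = u + wind                  # true dynamics include the real wind
        v += a * dt                   # forward-Euler integration
        x += v * dt
        d_hat -= gamma * e * dt       # adapt: overshoot (e < 0) raises d_hat
    return target - x, d_hat

final_error, estimate = simulate()
print(final_error, estimate)  # error near zero, estimate near the true wind
```

With the adaptation rate `gamma` set to zero this reduces to plain PD control, which leaves a steady-state offset proportional to the wind; the adaptive term is what drives the deviation toward zero.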
The study is published on the arXiv preprint server.
Using an algorithm they call the Krakencoder, researchers at Weill Cornell Medicine are a step closer to unraveling how the brain’s wiring supports the way we think and act. The study, published June 5 in Nature Methods, used imaging data from the Human Connectome Project to align neural activity with its underlying circuitry.
Mapping how the brain’s anatomical connections and activity patterns relate to behavior is crucial not only for understanding how the brain works generally but also for identifying biomarkers of disease, predicting outcomes in neurological disorders and designing personalized interventions.
The brain consists of a complex network of interconnected neurons whose collective activity drives our behavior. The structural connectome represents the physical wiring of the brain, the map of how different regions are anatomically connected.
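As a toy illustration of that definition (the regions and weights below are made up, not data from the study), a structural connectome can be stored as a symmetric matrix whose entry [i][j] gives the anatomical connection strength between regions i and j:

```python
# Toy structural connectome: a symmetric matrix where entry [i][j] is the
# anatomical connection strength between brain regions i and j.
# Region labels and weights are invented for illustration.
regions = ["V1", "M1", "PFC", "HC"]
connectome = [
    [0.0, 0.2, 0.1, 0.0],
    [0.2, 0.0, 0.5, 0.1],
    [0.1, 0.5, 0.0, 0.3],
    [0.0, 0.1, 0.3, 0.0],
]

# A region's "strength" (its summed connectivity) is one simple graph measure.
def strength(i):
    return sum(connectome[i])

strongest = max(range(len(regions)), key=strength)
print(regions[strongest])  # prints "PFC", the most connected toy region
```

Real connectomes have hundreds of regions rather than four, but the representation is the same: a weighted graph on which measures like node strength can be computed.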
IBM has just unveiled its boldest quantum computing roadmap yet: Starling, the first large-scale, fault-tolerant quantum computer—coming in 2029. Capable of running 20,000X more operations than today’s quantum machines, Starling could unlock breakthroughs in chemistry, materials science, and optimization.
According to IBM, this is not just a pie-in-the-sky roadmap: they actually have the ability to make Starling happen.
In this exclusive conversation, I speak with Jerry Chow, IBM Fellow and Director of Quantum Systems, about the engineering breakthroughs that are making this possible… especially a radically more efficient error correction code and new multi-layered qubit architectures.
We cover:
- The shift from millions of physical qubits to manageable logical qubits.
- Why IBM is using quantum low-density parity check (qLDPC) codes.
- How modular quantum systems (like Kookaburra and Cockatoo) will scale the technology.
- Real-world quantum-classical hybrid applications already happening today.
- Why now is the time for developers to start building quantum-native algorithms.
00:00 Introduction to the Future of Computing
01:04 IBM's Jerry Chow
01:49 Quantum Supremacy
02:47 IBM's Quantum Roadmap
04:03 Technological Innovations in Quantum Computing
05:59 Challenges and Solutions in Quantum Computing
09:40 Quantum Processor Development
14:04 Quantum Computing Applications and Future Prospects
20:41 Personal Journey in Quantum Computing
24:03 Conclusion and Final Thoughts
String theory has long been touted as physicists’ best candidate for describing the fundamental nature of the universe, with elementary particles and forces described as vibrations of tiny threads of energy. But in the early 21st century, it was realized that most of the versions of reality described by string theory’s equations cannot match up with observations of our own universe.
In particular, conventional string theory's predictions are incompatible with observations of dark energy, which appears to be causing our universe's expansion to speed up, and with viable theories of quantum gravity; instead, the equations predict a vast "swampland" of impossible universes.
Now, a new analysis by FQxI physicist Eduardo Guendelman, of Ben-Gurion University of the Negev, in Israel, shows that an exotic subset of string models—in which the tension of strings is generated dynamically—could provide an escape route out of the string theory swampland.
One of today's hottest research topics combines two recent technological breakthroughs: machine learning and quantum computing.
An experimental study shows that even small-scale quantum computers can already boost the performance of machine learning algorithms.
This was demonstrated on a photonic quantum processor by an international team of researchers at the University of Vienna. The work, published in Nature Photonics, shows promising new applications for optical quantum computers.
Empathy, the ability to understand what others are feeling and emotionally connect with their experiences, can be highly advantageous for humans, as it allows them to strengthen relationships and thrive in some professional settings. The development of tools for reliably measuring people’s empathy has thus been a key objective of many past psychology studies.
Most existing methods for measuring empathy rely on self-reports and questionnaires, such as the interpersonal reactivity index (IRI), the Empathy Quotient (EQ) test and the Toronto Empathy Questionnaire (TEQ). Over the past few years, however, some scientists have been trying to develop alternative techniques for measuring empathy, some of which rely on machine learning algorithms or other computational models.
Researchers at Hong Kong Polytechnic University have recently introduced a new machine learning-based video analytics framework that could be used to predict the empathy of people captured in video footage. Their framework, introduced in a preprint paper posted on SSRN, could prove to be a valuable tool for conducting organizational psychology research, as well as other empathy-related studies.
The Goldman-Hodgkin-Katz model has long guided transport analysis in nanopores and ion channels. This paper (with a companion paper in Physical Review Letters) revisits the model, showing that its constant electric field assumption leads to inconsistencies. A new self-consistent theory, inspired by reverse electrodialysis, offers a unified framework for ion transport.
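For context, the standard textbook form of the GHK current equation for an ion species $S$ (not reproduced from the paper) shows where the constant-field assumption enters, since the exponential terms come from assuming the electric field is uniform across the membrane:

```latex
I_S = P_S z_S^2 \frac{F^2 V_m}{RT}\,
      \frac{[S]_{\mathrm{in}} - [S]_{\mathrm{out}}\, e^{-z_S F V_m / RT}}
           {1 - e^{-z_S F V_m / RT}}
```

Here $P_S$ is the membrane permeability to $S$, $z_S$ its valence, $V_m$ the membrane potential, and $F$, $R$, $T$ the Faraday constant, gas constant, and temperature.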
A research team led by Prof. Yong Gaochan from the Institute of Modern Physics (IMP) of the Chinese Academy of Sciences has proposed a novel experimental method to probe the hyperon potential, offering new insights into resolving the longstanding “hyperon puzzle” in neutron stars. These findings were published in Physics Letters B and Physical Review C.
According to conventional theories, the extreme densities within neutron stars lead to the production of hyperons containing strange quarks (e.g., Λ particles). These hyperons significantly soften the equation of state (EoS) and reduce the maximum mass of neutron stars. However, astronomical observations have discovered neutron stars with masses approaching or even exceeding twice that of the sun, contradicting theoretical predictions.
The hyperon potential refers to the interaction potential between a hyperon and a nucleon. Aimed at resolving the neutron star hyperon puzzle, the study of hyperon potentials has emerged as a frontier topic at the intersection of nuclear physics and astrophysics. It is currently believed that if hyperon potentials exhibit stronger repulsion at high densities, they could counteract the softening of the EoS, thereby allowing massive neutron stars to exist.
An algorithm is a set of instructions for accomplishing a task. Every piece of code could be called an algorithm, but this book covers the more interesting bits. I chose the algorithms in this book because they're fast, they solve interesting problems, or both. Here are some highlights:
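As a quick taste of what "fast" means here, consider binary search, a classic algorithm that finds an item in a sorted list by halving the search range at every step (this sketch is illustrative, not necessarily the book's own code):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or None if absent.

    Runs in O(log n) steps: each comparison discards half the range.
    """
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1   # target is in the upper half
        else:
            high = mid - 1  # target is in the lower half
    return None

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3
```

For a million-item list, simple scanning might take a million steps; binary search needs at most about twenty.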