Systemic candidiasis is an opportunistic fungal infection that has been difficult to treat effectively. Research published in the April edition of Cell Host & Microbe suggests that reprogramming immune cell metabolism, rather than developing yet another specific antifungal medication, could be a new strategy to fight the infection.
The fungus Candida albicans causes infections that range from superficial on the skin and nails to invasive into organs and the bloodstream. In recent decades, systemic candidiasis has increased due to more patients with immunosuppression from disease or treatments, prolonged antibiotic exposure, and certain conditions such as kidney disease. Management of systemic candidiasis has become more difficult because of antifungal drug resistance, limited early diagnostic tools, and absence of approved fungal vaccines.
According to Partha Biswas, DVM, Ph.D., lead author of the paper, and a Professor in the Department of Microbiology and Immunology in the Renaissance School of Medicine (RSOM) at Stony Brook University, these challenges have become roadblocks to treating systemic candidiasis and illustrate the need for new and different therapeutic strategies.
Imagine you’re trying to build a very long, complicated chain of dominoes. The aim is that each domino hits the next one perfectly, all the way down the line, producing an amazing result at the end. A quantum circuit is like a domino chain: a long sequence of tiny steps (“operations”) that work together to process information in a powerful way.
Now imagine that every domino is a little wobbly. In a quantum circuit, that wobble is called “noise.” It might not look like much (after all, every real-world system is subject to some kind of noise), but in quantum circuits small errors accumulate step by step, compounding until the computation fails.
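How quickly small wobbles compound can be seen with a toy calculation. This is a simplified model that assumes each operation fails independently with the same probability; real quantum noise (coherent errors, crosstalk) is more subtle, but the trend is the same.

```python
# Simplified illustration: independent per-gate errors compound
# multiplicatively across a circuit.

def circuit_fidelity(gate_fidelity: float, n_gates: int) -> float:
    """Approximate probability that an n-gate circuit runs error-free,
    assuming each gate succeeds independently with gate_fidelity."""
    return gate_fidelity ** n_gates

# A 99.9%-accurate gate sounds excellent in isolation...
print(circuit_fidelity(0.999, 1))      # 0.999
# ...but after a thousand gates most runs contain an error...
print(circuit_fidelity(0.999, 1000))   # ~0.37
# ...and after five thousand, almost all of them do.
print(circuit_fidelity(0.999, 5000))   # ~0.0067
```

This multiplicative decay is why quantum error correction, rather than merely better gates, is considered essential for long computations.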
For the first time, an international team of physicists has successfully harnessed a rare orbital transition in atoms of ytterbium to create a new type of atomic clock that is both highly precise and extremely sensitive to fundamental physical effects. Publishing their results in Nature Photonics, the researchers, led by Taiki Ishiyama at Kyoto University, say their approach could pave the way for some of the most stringent tests yet of predictions made by the Standard Model.
To measure the passing of time, an atomic clock excites an electron in confined atoms to a higher energy level, then interrogates the frequency of that transition. Because this frequency varies so little, atomic clocks are the most accurate timekeepers ever developed.
To date, the most precise devices involve atoms trapped in an optical lattice: a periodic array of light and darkness created by interfering laser beams. These clocks operate at optical frequencies with hundreds of trillions of oscillations per second—far surpassing the microwave frequencies used in previous atomic clock designs. Already, this extraordinary precision has enabled sensitive tests of fundamental physics, as described by the Standard Model.
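The frequency gap between the two clock types can be made concrete. In the sketch below, the cesium value is the SI definition of the second; the ytterbium figure is the approximate frequency of the well-known 578 nm lattice-clock transition (~518 THz), used here only for scale, not as the specific transition studied in the paper.

```python
# Rough comparison of clock "tick rates". The cesium hyperfine
# frequency is exact by SI definition; the ytterbium value is the
# approximate 578 nm optical lattice-clock frequency.

CS_MICROWAVE_HZ = 9_192_631_770   # SI definition of the second
YB_OPTICAL_HZ = 518.3e12          # ~518 THz, illustrative Yb clock line

ratio = YB_OPTICAL_HZ / CS_MICROWAVE_HZ
print(f"Optical ticks per microwave tick: {ratio:.0f}")  # ~56,000

# For the same absolute ability to resolve the transition line,
# fractional resolution improves by the same factor:
line_split_hz = 1.0  # hypothetical 1 Hz line-splitting resolution
print(f"Cs fractional resolution: {line_split_hz / CS_MICROWAVE_HZ:.1e}")
print(f"Yb fractional resolution: {line_split_hz / YB_OPTICAL_HZ:.1e}")
```

The factor of tens of thousands in oscillation rate is the basic reason optical lattice clocks outperform microwave standards.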
Researchers at IPhT (CEA, CNRS) and the Universitat Autònoma de Barcelona have shown that gravity—and with it, supersymmetry—emerge as logical necessities whenever a massive spin-3/2 particle exists in nature. Two principles are enough: causality, the fact that no signal can travel faster than light, and unitarity, the requirement that probabilities are conserved in quantum mechanics. The structure of supergravity is not assumed: it bootstraps itself.
In fundamental physics, gravity is usually thought of as an ingredient one adds to a theory. But could it instead be forced by the internal consistency of the quantum world? This is what a study published in the Journal of High Energy Physics demonstrates.
The starting point is disarmingly simple: a single massive spin-3/2 particle. The authors show that such a particle simply cannot exist in isolation within a consistent theory. Its scattering amplitudes grow too fast with energy, clashing with positivity inequalities—the mathematical encoding of causality (the speed of light as an absolute limit) and unitarity (the conservation of probabilities in every quantum process). The theory breaks down barely above the particle’s own mass.
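The energy-growth argument can be sketched in standard textbook form. The following is a schematic illustration of the general mechanism, not the paper's detailed derivation; the exponent n depends on the specific process.

```latex
% Longitudinal polarizations of a massive spin-3/2 field scale as
% E / m_{3/2}, so tree-level 2 -> 2 scattering amplitudes built
% from them grow with the center-of-mass energy:
\mathcal{M}(s) \;\sim\; \left(\frac{\sqrt{s}}{m_{3/2}}\right)^{\!n},
\qquad n > 0,
% while unitarity bounds every partial-wave amplitude:
|a_\ell(s)| \;\le\; 1 .
```

The clash between growth and bound sets in once √s exceeds roughly m₃/₂, unless additional states, the graviton in supergravity, cancel the leading growth.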
Australia’s bid to detect elusive dark matter has taken a major step forward, with new research confirming that cosmic radiation levels deep inside the Stawell Underground Physics Laboratory (SUPL) are low enough to support the world-class experiment that will commence later this year.
ARC Center of Excellence for Dark Matter Particle Physics researchers recorded muon (cosmic radiation) levels inside and outside the laboratory for more than a year. They detected 30,000 muons inside the underground laboratory, whereas about 8.4 billion muons would be expected to be detected on the surface of Earth.
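The quoted counts give a feel for how effectively the rock overburden screens out cosmic radiation. This is a back-of-envelope ratio that assumes both figures refer to the same detector and exposure period, as the article implies.

```python
# Order-of-magnitude illustration using the counts quoted above.

muons_underground = 30_000
muons_surface_expected = 8_400_000_000

suppression = muons_surface_expected / muons_underground
print(f"Rock overburden suppresses the muon flux by ~{suppression:,.0f}x")
# → roughly a 280,000-fold reduction
```

A suppression of this size is what makes the underground site quiet enough for a rare-event dark matter search.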
The SABRE Collaboration paper, published in Astroparticle Physics, is the first to use data collected in SUPL, marking a major achievement for Australian and international scientists involved in the project.
Most laser sources produce Gaussian beams that diverge as they propagate. This natural spreading limits their effectiveness in applications that require light to remain concentrated over long distances. To overcome this challenge, structured light beams have been developed, whose amplitude, phase, and polarization can be carefully controlled.
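The spreading of a Gaussian beam follows a simple formula: the far-field half-angle divergence is θ = λ/(πw₀) for wavelength λ and waist radius w₀. A short sketch with illustrative numbers (not values from the study):

```python
import math

def gaussian_divergence(wavelength_m: float, waist_m: float) -> float:
    """Far-field half-angle divergence of an ideal Gaussian beam,
    theta = lambda / (pi * w0)."""
    return wavelength_m / (math.pi * waist_m)

def beam_radius(wavelength_m: float, waist_m: float, z_m: float) -> float:
    """Beam radius w(z) = w0 * sqrt(1 + (z / z_R)^2),
    with Rayleigh range z_R = pi * w0^2 / lambda."""
    z_r = math.pi * waist_m**2 / wavelength_m
    return waist_m * math.sqrt(1 + (z_m / z_r) ** 2)

# Illustrative numbers: a 633 nm beam with a 1 mm waist radius.
theta = gaussian_divergence(633e-9, 1e-3)
print(f"Divergence: {theta * 1e3:.2f} mrad")  # ~0.20 mrad
# The radius grows by sqrt(2) after one Rayleigh range (~5 m here),
# and keeps spreading linearly beyond that:
print(f"Radius at 10 m: {beam_radius(633e-9, 1e-3, 10.0) * 1e3:.2f} mm")
```

Even this modest divergence more than doubles the beam radius over 10 m, which is the limitation that nondiffracting structured beams aim to avoid.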
Among these are Bessel beams, which are generated by the self-interference of laser beams as they propagate through space. However, ideal Bessel beams possess complex ring structures that complicate their practical use. Additionally, existing methods for generating advanced beam shapes, such as optical bottle beams, often involve complex and expensive setups that necessitate precise alignment.
Now, researchers at Chiba University, Japan, have developed a simple and compact method to generate a laser chain beam that remains nondiffracting during free-space propagation.
A University of Sydney quantum physicist has developed a new approach to quantum error correction that could significantly reduce the number of physical qubits required to build large-scale, fault-tolerant quantum computers. The study, co-authored by Dr. Dominic Williamson from the School of Physics, is titled “Low-overhead fault-tolerant quantum computation by gauging logical operators” and published in Nature Physics.
The work was done while Dr. Williamson was on sabbatical at global technology firm IBM in California. Elements of the new design have been integrated into IBM’s roadmap for building large-scale quantum computers.
“We’re at a point where theory and experiment are beginning to align,” Dr. Williamson said. “The big question now is how to design quantum computers that can be scaled efficiently to solve useful problems. Our work provides a promising blueprint.”
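For context on the overhead problem this work targets: a conventional surface code needs on the order of 2d² physical qubits per logical qubit at code distance d, with the logical error rate falling roughly as a power of the distance. The sketch below shows that standard scaling with illustrative figures; it is not Dr. Williamson's new construction, whose point is precisely to reduce this overhead.

```python
# Textbook surface-code scaling (illustrative, simplified model).

def surface_code_overhead(distance: int) -> int:
    """Approximate physical qubits per logical qubit for a
    distance-d surface code: d^2 data qubits + (d^2 - 1) ancillas."""
    return 2 * distance**2 - 1

def logical_error_rate(p: float, p_threshold: float, distance: int) -> float:
    """Rough scaling of the logical error rate,
    ~ (p / p_th) ** ((d + 1) / 2)."""
    return (p / p_threshold) ** ((distance + 1) / 2)

# With a physical error rate of 1e-3 and an assumed threshold of
# ~1e-2, reaching a 1e-12 logical error rate takes distance ~23,
# i.e. over a thousand physical qubits per logical qubit:
for d in (11, 17, 23):
    print(d, surface_code_overhead(d), logical_error_rate(1e-3, 1e-2, d))
```

Multiplying that thousand-fold overhead by the thousands of logical qubits useful algorithms need is what makes lower-overhead codes so valuable.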